
State Farm · 2024 · AI Exploration

What I learned when I asked AI to do my job.

I was asked to explore whether AI could replace the manual process of writing help articles. I built a test, ran it, documented the results honestly, and came back with findings more complicated than a yes or a no.

[Images: workflow notes and AI prompt documentation; AI-generated help article results]

Problem

The ask was simple: could writers use AI to draft help center articles faster, with less manual effort? I was given room to explore the question and come back with a recommendation.

Solution

Built a test process. Drafted prompts designed to produce on-brand help content. Ran multiple articles through the system and evaluated the output against our quality bar: accuracy, tone, scannability, and user trust.

Then I wrote up my honest critique of what it got right, what it consistently got wrong, and what a realistic human-plus-AI workflow would actually look like.

"The best AI outputs I've seen still needed a writer to finish them."

Result

AI can produce structurally sound help content faster than a human starting from scratch. The bones are often fine. But the voice drifts, the specificity gets soft, and anything requiring institutional knowledge falls flat without a subject matter expert in the loop.

The honest answer: AI is a first-draft accelerator, not a replacement. A writer who knows how to work with AI output can move faster. A team that tries to remove the writer entirely produces articles that feel like they were written by nobody.

A "we didn't ship it" story tells you more about judgment than a success story does. That's the work.

Want to talk about AI and content?

josiah.goodrum@gmail.com

I have thoughts. Many of them.