AI is accelerating my technical writing output, and other observations
- Observation 1: AI is accelerating me
- Observation 2: Prompt engineering isn’t an obvious skill
- Observation 3: Documentation is more within AI’s sights than creative content
- Observation 4: I can work through many problems using AI
- Conclusion
Note that these observations are personal ones, based on my experiences with my own projects, team, and tools. These observations might be entirely different in another environment.
Observation 1: AI is accelerating me
First observation: AI is accelerating my doc output. I’m able to write similar documentation in half the time or less now. Each week at work, I submit a writeup of docs delivered and docs in progress. This weekly writeup isn’t something I was doing before February, so I don’t have a long historical record for more exact comparisons. But I think I’m about twice as productive as before. I simply get a lot of stuff done at work. I’d even say I’m a hyperproductive tech writer. If you’ve seen the output on my blog over the years, this is how I am at work too. I produce a lot of content, regularly and steadily. AI has ratcheted up my productivity levels.
For example, last week, on Tuesday morning, a team included me in their weekly meeting, and one team member noted that partners were eager for an early draft of docs for an upcoming feature. Apparently, the partners had been promised some documentation for an early Thursday morning meeting that same week. (To be fair, the team had reached out a month ago asking for docs, but without a firm deadline, I deprioritized the request.) I was up to the challenge of getting them something by Thursday.
I gathered all the internal documentation I could find about the project and plugged it into Gemini. My first step: Learn about the product. After an hour of having AI distill the essential info from 100+ pages of documents, I felt I had an acceptable level of knowledge about the product, so I started on the second step: producing an outline of important points for a user guide. After this outline, I steered Gemini section by section through writing the draft.
First drafts always need work, even with AI. So I worked through this initial draft to bring it into an acceptable state, sometimes doing the writing myself, other times having AI do the rewriting.
After some fact-checking and other cleanup, including wiring up the reference documentation and output, I had a first draft that I floated to half a dozen SMEs for review. More edits and improvements based on their feedback followed.
During this project, I also did something somewhat superhuman. I needed to compile lists of hundreds of different items for various hierarchical groups (I’m being general here). These items were listed in a Google Sheet and a large tree diagram, with links to each item in another file and with somewhat different names. I fed all of this into AI, and it sorted everything out and created an accurate table of the items. My writeup here doesn’t do justice to the complexity of the task, and due to information confidentiality, I can’t go into more details. Needless to say, this task would have taken me a week to do manually, and it would have fried my brain in the process.
By Thursday morning, I had a draft of the documentation—about 6 pages of content plus reference material. It wasn’t perfect, but this was for an early release, long before product launch. The partners received the content well and had a good meeting with the team. Everyone thanked me for producing the documentation in such a short time.
At this point, some may object that AI can quickly generate poor quality content, so this isn’t surprising. Absolutely, it’s easy to ask AI to generate many pages of problematic, inaccurate, badly written content. So far, though, this doesn’t seem to be the case here. The content I’ve generated with AI is good enough, apparently, for SMEs and stakeholders to be satisfied with it.
Consider that in many ways, tech writers might have a higher bar for content quality than the engineers we write for. (For example, had I used “may” instead of “might” in that last sentence, some tech writers would have cringed, while most engineers would have no clue what the difference was.) Also, some documentation is harder to write than other types. Try writing an overview for a large SDK versus a how-to topic—the former is much more difficult. I’ll often go several rounds between AI output and my own manual writing and editing before arriving at an acceptable draft. Relying too heavily on AI without human oversight and editing could indeed lead to a drop in quality.
Overall, despite the challenges, AI empowers me to do things I previously couldn’t. Before, if someone had asked me for a user guide in two days, I would have pushed back and demanded at least a week. But now I’m like, sure, let’s see what we can do. It really depends on how much internal documentation you have at your disposal. Gather the right content, and you can work wonders with the right AI tools.
Observation 2: Prompt engineering isn’t an obvious skill
Second observation: Prompt engineering isn’t an obvious skill. Although sometimes I think working with AI tools is mostly common sense, I receive lots of feedback from other tech writers that they don’t get useful outputs. Very few technical writers seem to even be using AI in their writing workflows. Many are still searching for those use cases where AI will unlock some gain. I find the non-use of AI baffling. I’ve almost stopped writing docs myself, allowing AI to write for me (for technical documentation at least).
Really, I’m starting to think that maybe writing by hand will soon become a manual, antiquated activity, like doing math on paper instead of using a calculator. That’s been my experience for the past month or so.
Many tech writers have a constant fear that AI will take our jobs. I often think that what I’m doing isn’t rocket science. Any person with some education can do it. And yet, just as engineers struggle to write, tech writers frequently struggle with AI tools. They don’t understand how to use them effectively. Even though “prompt engineering” is often a ridiculed term online, again and again I hear feedback from TWs that AI isn’t useful to them, or that they simply have no interest in AI, as if it’s irrelevant to their work. This blows me away. When I can ramp up on a product in an hour, write a user guide in a couple of days, and code a doc publishing script that automates even more tasks, how can AI not be useful? How can it not be essential?
An often repeated saying is that AI tools won’t replace us; rather, we’ll be replaced by those who know how to use AI tools. I feel like this is more and more true. Consider this scenario: You hire a roofer to install a new roof, which mainly involves removing the old shingles and installing new ones. One roofer arrives with a hammer. It will take this roofer 2 weeks to do the job. Another roofer arrives with a pneumatic roofing nailer. It will take this roofer 3 days to do the job. The cost of the first roofer is 4 times that of the second. The output is pretty much the same. Which roofer do you hire?
It’s the same with tech writers. Suppose you have a large project. One tech writer can create the documentation using AI tools in a quarter of the time; the other takes four times as long. Which tech writer do you hire?
Fortunately, I think tech writers can learn how to use AI tools as power tools. Especially with more awareness and knowledge about effective prompting techniques, tech writers can become much more productive using AI. This is why my prompt engineering series contains specific prompts that you can plug in. Combine these with some well-developed documentation templates, and these tools become easier to work with.
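To make this concrete, here’s a hypothetical example of the kind of plug-in prompt I mean. The wording is invented for illustration, not copied from my series, but it shows the shape: a role, a constrained source of truth, and an explicit output format.

```
You are a technical writer drafting a user guide. Using only the attached
internal design docs, produce an outline of the guide's main sections.
For each section, list the key tasks a user needs to accomplish and any
prerequisites. Flag anything in the source material that seems ambiguous
or contradictory rather than guessing.
```

Pair a prompt like this with a documentation template for the output, and the AI’s draft lands much closer to the structure you actually need.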
Tech writers will become skilled prompt engineers for producing docs in ways that engineers won’t have the patience for. Engineers most likely won’t start using AI tools to write docs if TWs will do it for them. I might use 20 different AI prompts and interactions working my way through a single page of documentation, whereas an engineer might only have the patience for a single attempt and output.
Observation 3: Documentation is more within AI’s sights than creative content
Another observation I’ve had is that documentation is much more within AI’s sights than creative content. I’m writing the first draft of this blog post manually, not yet using AI tools. This is partly because it can sometimes take longer to fix the creative output from Claude, Gemini, or ChatGPT than to simply write the blog post myself. Experience-driven, opinion-based blog posts like this one are much harder to pull off using an AI tool (but not impossible, as I explained here). But documentation is in a totally different category.
Documentation doesn’t usually have a byline. It doesn’t typically use first person or draw upon personal experiences. Documentation (the kind produced by paying corporations) is usually voiceless and explanatory. AI tools do a much better job with documentation than with creative content.
At the end of the day, I don’t care nearly as much about the documentation I create at work as I do about the creative content I’m writing myself. Documentation is more like tax code content—people are interested in clear, accurate, and informative material. AI is pretty good at producing this. As such, AI tools have a much greater capability to disrupt the tech comm industry than the creative industry. In the creative industry, AI-written content is often categorized as junk or spam content. I don’t think the same classification will be applied to AI-produced documentation.
Although AI tools are good at writing documentation, documentation still requires a lot of judgment, insight, and contextual decision making. Years of experience go into my evaluation of each AI output. Even without being a content SME for a product, I know when content looks right or needs work. For example, I know that stripping adjectives from the output, as well as using shorter sentences and paragraphs, puts the content more squarely within the tech comm genre. Engineers might not immediately see this and could struggle to calibrate the AI outputs in effective ways.
In short, while AI is well-suited for certain types of documentation, it’s not infallible. There’s still a need for human expertise and creativity in producing high-quality docs. AI-generated content can contain inaccuracies or lack the nuance that comes from human insight.
Observation 4: I can work through many problems using AI
My last observation is more of a tangential comment. I’ve noticed that I have a certain quality that’s hard to describe: I can sit down at a computer and work my way through a complex, tedious problem for hours. Whereas my other family members lose patience quickly, I’ve got more of a knack for detailed problem-solving and tedious information analysis. I don’t know if this is a learned characteristic from working for 20 years as a technical writer, but AI tools are helping with this. They’ve given me confidence that I can find answers.
For example, suppose I need to develop a doc build script. I can use AI to learn what I need to figure it out. I can make my way from A to Z even without strong prior knowledge of something. It’s the same confidence many have when driving into unknown areas with Google Maps. If you’ve got Google Maps, then unless you go offline, you’ll probably be able to route your way out of any remote area and make your way back home. AI tools provide similar capabilities. They are tremendous for finding and learning information to complete tasks. Just give me some coffee and access to powerful AI tools, and I can figure out many things. I can sit in front of a computer for hours or days doing this.
In fact, if I can’t figure something out, it rattles around in my brain until I get it sorted out. I sometimes wake up at 4 am thinking about problems I haven’t yet solved. The other week someone needed me to convert hundreds of latitudes and longitudes into a documentation page for users. I ended up converting the initial content into JSON and then developing a Jinja script to render the JSON as a filterable table. It was pretty easy wiring this up, especially with AI, but I still had to work through various technical issues, and yes, I woke up at 4 am thinking about it and sat down at my computer to finish the implementation.
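A rough sketch of that kind of pipeline looks like this. The data and names below are invented for illustration, and plain Python string formatting stands in for the Jinja template I actually used—but the two steps (normalize to JSON, then render rows into an HTML table) are the same.

```python
import json

# Hypothetical sample of the source data: names and coordinates
# pulled from the original spreadsheet.
locations = [
    {"name": "Station A", "lat": 47.6062, "lon": -122.3321},
    {"name": "Station B", "lat": 47.4502, "lon": -122.3088},
]

# Step 1: serialize the cleaned-up records as JSON, the intermediate
# format that the rendering step consumes.
data_json = json.dumps(locations, indent=2)

# Step 2: render each record as an HTML table row. A Jinja template
# would loop over the records the same way; string formatting stands
# in for it here.
rows = "\n".join(
    "<tr><td>{name}</td><td>{lat:.4f}</td><td>{lon:.4f}</td></tr>".format(**loc)
    for loc in json.loads(data_json)
)
table_html = f"<table>\n{rows}\n</table>"
print(table_html)
```

From there, a bit of JavaScript on the documentation page can filter the table rows client-side.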
The patience factor is something I didn’t fully appreciate before. Whereas a normal person might throw up their hands when they don’t understand something and quickly move on, I can sink time into the problem. I have the patience to try to understand it, to carefully piece together how it works and make my way through to the end result needed. I think many technical people have this skill—I just didn’t realize it was such a key trait I’d developed.
Despite the usefulness of AI, I know there are still many complex problems AI can’t solve satisfactorily. For instance, I’m trying to figure out a review process for automating release notes gathering from 8+ teams for a biweekly release. I’m also struggling with best practices for being a tech lead who manages other writers. AI doesn’t seem all that helpful in these and many other strategic scenarios. AI is a tool that can augment human intelligence but not necessarily replace it entirely. At least it’s not straightforward how to use AI to accelerate doc work in many scenarios.
Conclusion
In summary, here are my observations in this post:
- AI is accelerating me
- Prompt engineering isn’t an obvious skill
- Documentation is more within AI’s sights than creative content
- I can work through many problems using AI
I should note that using AI tools with large token contexts (for example, 700,000 words) has been a game-changer for documentation tasks. This enables me to feed AI tools massive amounts of information and get outputs that are much more accurate, detailed, and relevant. If you don’t have access to these tools, you might not have the same experiences and observations as me.
Interestingly, despite having access to powerful AI tools, I’ve been blogging less lately. I’m not entirely sure why. Partly, I’ve been incredibly busy at work. The documentation work in my group has been accelerating with more products and partners. I’m also leading a biweekly prompt engineering study group at work (which is more like a training session). This all takes time.
Perhaps I haven’t carved out dedicated writing time like I used to. My mornings usually involve a long bike ride to work (I drive to the train, ride the train downtown, and bike the remaining 6.5 miles). It takes me 1 hour and 15 minutes, then I’m fully focused on work. By the time I get home in the evenings, I’m usually out of energy to write. I’m at my best when writing first thing in the morning.
For example, I’m writing this post on a Saturday morning at Panera Bread while my youngest daughter plays volleyball at a nearby park. It seems I haven’t figured out how to use AI to expand the hours of my day. AI might be accelerating my documentation work, but the number of documentation tasks is accelerating at the same rate.
Some content in this post was AI-assisted. Image is from ChatGPT.
About Tom Johnson
I'm an API technical writer based in the Seattle area. On this blog, I write about topics related to technical writing and communication — such as software documentation, API documentation, AI, information architecture, content strategy, writing processes, plain language, tech comm careers, and more. Check out my API documentation course if you're looking for more info about documenting APIs. Or see my posts on AI and AI course section for more on the latest in AI and tech comm.
If you're a technical writer and want to keep on top of the latest trends in tech comm, be sure to subscribe to email updates below. You can also learn more about me or contact me. Finally, note that the opinions I express on my blog are my own points of view, not those of my employer.