This is the script and slides for what I prepared as a lightning talk for the Write the Docs 2023 Portland conference. Sadly, I did not get a slot, so it was not presented.
Slides
Intro
Hi everyone, hope you’re enjoying the conference so far!
For those of you who don’t know, my name is Cynthia Ng, better known, especially on Slack, as Arty-chan. I’m currently a Staff Support Engineer at GitLab, and I work closely with the Technical Writing team.
A quick note before I start: since this is the 5-minute version of the presentation I was hoping to do, I haven’t generalized it as much as I normally would have. I’ll be talking a bit more about how we do things at GitLab than I might normally have done, but hopefully it’s still general enough for you to consider how you might apply it to your own projects, especially if you already have “docs as code”.
What is Docs as Code?
The Write the Docs website describes this quite well.
Documentation as Code (Docs as Code) refers to a philosophy that you should be writing documentation with the same tools as code [and using] the same workflows as development teams. – Write the Docs
I won’t go into the benefits and all that here; instead, I recommend reading some of the articles and resources that the Write the Docs website refers to.
But for context, generally, you would expect the workflow to look something like this (see slide): you start with an issue, someone makes changes in a version-controlled repository, and the changes go through automated checks and manual review before they’re published; all within the tooling that your engineering team also uses.
What I wanted to focus on is the automation part.
Automated Checks
One way to do automated checks is to create a CI pipeline which runs a bunch of jobs where each job is a set of tests. The breakdown is a bit simplified, but…
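To make that concrete, a docs pipeline along these lines could be sketched in GitLab CI YAML. This is a simplified, hypothetical example: the job names, tools (markdownlint-cli2 and lychee), and the doc/ directory are illustrative choices, not GitLab’s actual configuration.

```yaml
# Hypothetical .gitlab-ci.yml fragment: one pipeline, one job per set of tests.
stages:
  - test

# Lint the Markdown syntax of changed content files.
docs-lint-markdown:
  stage: test
  image: node:20
  script:
    - npm install -g markdownlint-cli2
    - markdownlint-cli2 "doc/**/*.md"
  rules:
    - changes:
        - doc/**/*.md

# Check for broken links in the docs.
docs-check-links:
  stage: test
  image: alpine:latest
  script:
    - apk add lychee
    - lychee doc/
  rules:
    - changes:
        - doc/**/*.md
```

The `rules: changes:` clauses mean each job only runs when documentation files are actually touched, which keeps the pipeline fast for code-only changes.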
For example, we can have a job that helps the author follow the review process by prompting them to assign a Technical Writer for review if the changes include documentation. Then there’s one job that checks for broken links, and another that checks whether the docs changes cause linking issues in the product.
Furthermore, we have a job that checks adherence to the style guide. It covers syntax formatting for Markdown, which is what we use for our content files, but also things like capitalization and Vale rules, such as avoiding future tense. If there is a rule violation, it’s considered an “error”: the job fails, and the changes cannot be published or “merged”, even with approvals, until the violation is fixed.
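Vale rules themselves are small YAML files. A future-tense check might look roughly like the sketch below; the style name, file path, and regex are hypothetical simplifications, not GitLab’s actual rule.

```yaml
# .vale/styles/MyStyle/FutureTense.yml (hypothetical path and style name)
extends: existence
message: "Avoid future tense: rewrite '%s' in the present tense."
level: error        # 'error' fails the lint job; 'warning' only reports
ignorecase: true
raw:
  - \b(will|won't|going to)\s
```

The `level` field is what separates blocking checks from advisory ones: rules at `error` fail the pipeline, while rules at `warning` flow into the reporting described next.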
Then we use a feature called “code quality”, which can check the “quality” of the content. This job reports all the “warnings”: violations of rules that the Technical Writing team has decided shouldn’t block changes, but that the author should be notified about.
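In GitLab CI, a job surfaces these warnings in the merge request by emitting a Code Quality JSON artifact. The wiring looks roughly like this; the job name and the converter script that reshapes Vale’s JSON output into GitLab’s Code Quality format are hypothetical.

```yaml
# Hypothetical job: run Vale, then publish warnings as a Code Quality report.
docs-code-quality:
  stage: test
  script:
    - vale --output=JSON doc/ > vale-raw.json || true
    # Hypothetical converter: each Code Quality entry needs a description,
    # fingerprint, severity, and file location.
    - ./scripts/vale-to-codeclimate.py vale-raw.json > gl-code-quality-report.json
  artifacts:
    reports:
      codequality: gl-code-quality-report.json
```

With the `artifacts: reports: codequality:` declaration in place, GitLab renders the warnings inline on the merge request diff, which is what the report screenshots on the next slide show.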
I also want to call out accessibility testing. While it’s typically integrated into other tests (as ours is), basic accessibility of content can be tested, such as heading structure and readability. By the way, if you want to learn more about writing accessible content, consider checking out the last lightning talk I did back in 2017.
Here’s a quick example of what a report would look like, and what it looks like when you’re looking at your changes.
Benefits
Hopefully, you can see the benefits of automating the parts of the review that you can. Even if you’re a solo writer, it can help you catch mistakes.
Of course, automated checks can’t cover everything; some things just can’t be automated, and that’s where a Technical Writer is really needed.
Thanks
If you’re interested in learning more about how we do documentation at GitLab, or chatting about automation in general, please come find me, or anyone else from GitLab who is here today.