Forum → Wiki → Blog Workflow

One of the sites I’m working with lately at my job combines a forum (vBulletin), blog (Joomla), and wiki (Mediawiki) into one experience. Each of these tools does a great job at what it was designed to do. They’re three separate platforms skinned and linked together.

I used to think the site was a hodgepodge of software platforms, but now I see that these three resources can harmonize together in an amazing way.

A possible workflow of information from forum to wiki to blog.

Here’s the interaction in a little more detail:

  • Forum: Users contribute openly and regularly to the forum, creating several new threads and probably 15 new responses a day. Users feel comfortable posting and responding to forum questions because they aren’t making official remarks about any topic. They’re offering their thoughts or asking questions. It’s an informal medium that is inviting and comfortable.
  • Wiki: The conversations on the forum drive needs in the wiki. Answers and resolutions from the most popular forum threads should be transferred to the wiki as official articles. Transferring this content requires you to organize and articulate the information, which isn’t always easy. Admittedly, this transfer isn’t done very often on the site I mentioned, but if someone were assigned to this role, it could be powerful.
  • Blog: The blog showcases the information recently added to the wiki. The blog can also serve as a voice to energize a community, to call attention to needs on the wiki, or to bring other news to users.

I never considered how well these tools work together, but they do. The different mediums allow users to interact in ways that suit them. Of course it would be nice to have one tool with an incredibly powerful blog, wiki, and forum wrapped up into one package. Some wiki platforms, such as Tiki Wiki, provide all three. But Swiss Army knife tools almost invariably perform much like an on/off-road motorcycle: adequate everywhere, excellent nowhere.

The drawback of having three sources for content, however, is that content published on one source may never make it to the other sources. For example, if I write a blog article about a new application, shouldn’t that content also appear as an article on the wiki? If a forum thread clarifies a topic, shouldn’t that clarification be added to a wiki article? If I add a new wiki section, shouldn’t that section be announced and summarized, as well as explained, on the blog? Content overlap becomes a problem. So does search.

Regardless of the overlap problem, combining a forum with a wiki and blog has tangible benefits. It helps solve the participation problem with wikis. Users are more comfortable asking a question in a forum than editing the original content of an article. Wiki admins can harvest information from these forum threads to strengthen the wiki’s content. Significant new wiki information should then be announced to users on the blog.

21 thoughts on “Forum → Wiki → Blog Workflow”

  1. Kai

    Thanks, Tom, for this very insightful post! I had never considered how the different media might move content along to maturity, but it makes immediate sense to me! Could there be a fourth stage in the form of web help, if you need an authoritative deliverable…?

    1. Tom Johnson

      The wiki is seen as the authoritative help deliverable. If someone makes an update that invalidates some of the data or makes it inaccurate, we just correct it. That’s how it should work, right? From the wiki, we can then take that information and produce other formats.

  2. rick

    Instead of combining three different systems, did you consider using a single CMS that has all of these features? IMHO, you get a much tighter integration, without requiring any customization.

    1. Tom Johnson

      In an ideal world, I would be the one selecting and defining all the tools. But in the real world, I don’t get to make those decisions. Actually, the tools in place — at least the forum — have been there for years.

      I know Tiki Wiki offers all three features on the same platform. Setting aside integrated search, which can be approximated with Google Custom Search, what other benefits does an all-in-one platform provide?

      1. rick

        With an all-in-one you could have SSO, groupings/categorization of similar content across different types (e.g., a single page/dashboard that has all “Foo” items in wiki pages/blogs/forums), explicit 1:1 connections between different types (e.g., a single forum/thread to discuss a specific wiki page/blog post), a single editing paradigm (e.g., I would hate to have to use BBCode for the forum, Mediawiki syntax for wiki pages, and WordPress coding for the blog!), etc…

        1. rick

          …. and that doesn’t even begin to explore the back-end maintenance issues. Now you’ve got 3 different applications to track, patch, and upgrade.

        2. Noz

          You can have similar benefits with self-describing content. That way you avoid locking yourself into a single all-in-one solution. If you go all-in-one, you open yourself up to having to swap your all-in-one out when you discover your Swiss Army CMS has a killer flaw, or when the vendor stops updating the product because not enough people chose your all-in-one supplier.

          Making the content format- and application-neutral can also help unify the editing paradigm.

        3. Tom Johnson

          I see benefits to both solutions. Besides the interchangeability element that Noz mentioned, each of these tools (at least the forum and the wiki) is really good at what it does. And somehow our developers have rigged it so the forum and wiki use the same single sign-on that works across all our other org products.

          Anyway, as I said before, it would be a tough sell to pitch this migration. I readily admit that if I were starting over, I would have searched for a more encompassing, integrated tool. But things being as they are, we have three separate tools. What I’m trying to point out in this post is that users are sometimes more inclined to participate in forums than to add content directly to wikis. We always hear that only 1% of people participate in wikis. I’d guess probably 20% participate in forums. The forums give you the knowledge. You just have to transfer it to the wiki. And the blog is the news mechanism. Never mind the issue of three separate tools versus a three-in-one tool.

  3. Noz Urbina

    Hi Tom,

    This is of course fascinating for me. Do you remember the first time we met face to face, when I was presenting on the idea of technical communication being both a ‘mine’ for product direction and a self-perpetuating cycle from internal to external community and back again? The solution you have here is an example of what I was talking about.

    As an XML’r, I see the tool issues, but I raise you a ‘content engine’. I agree that a ‘does it all’ solution is usually not best. Look at some of the big-name vendors buying up ‘one of each’ so they have what they feel is a complete solution for their prospects: eventually those prospects will have to compromise in one area, because there’s no way to have the best of everything for everyone, every time.

    How do you escape that? I believe that XML is fundamental to the answer. The various tools should not really integrate with each other, but with a centralised repository that stores the content with sufficient metadata that it can be read by each of the systems and presented. This means a single post, let’s say living as a DITA topic, can start as a blog post but move to being a wiki page just by tweaking some metadata.

    Each app (blog, wiki, forum) then really becomes a ‘skin’ on the content, which is king. The content repository is where the content lives, application-neutral. If you then wanted to swap out one of the three, you would have less back-end recoding to do than if you had directly integrated the three tools.
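
    To make this concrete, here is a minimal sketch (Python, purely illustrative; the field names, channels, and render functions are invented for the example, not any particular product’s API) of content living application-neutral in one store, with metadata deciding which ‘skin’ presents it:

    ```python
    from dataclasses import dataclass, field

    @dataclass
    class ContentItem:
        """An application-neutral piece of content in the central repository."""
        title: str
        body: str                          # the content itself, e.g. stored as a DITA topic
        metadata: dict = field(default_factory=dict)

    def render(item: ContentItem) -> str:
        """Each app (blog, wiki, forum) is just a 'skin' over the same stored content."""
        channel = item.metadata.get("channel", "wiki")
        if channel == "blog":
            return f"<article><h1>{item.title}</h1>{item.body}</article>"
        if channel == "forum":
            return f"<div class='thread'><h2>{item.title}</h2>{item.body}</div>"
        return f"<div class='wiki-page'><h1>{item.title}</h1>{item.body}</div>"

    # A post starts life as a blog entry...
    post = ContentItem("Forum to wiki to blog", "<p>Workflow notes</p>",
                       {"channel": "blog", "status": "published"})
    print(render(post))

    # ...and "moves" to the wiki just by tweaking metadata; the content itself never changes.
    post.metadata["channel"] = "wiki"
    print(render(post))
    ```

    Swapping out one of the three front ends then means rewriting one render layer, not migrating the content.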

    On the fundamentals, of course, as I said years ago, I am TOTALLY in support of this approach: what can be called ‘content lifecycle’ solutions that support content’s progress through stages. The obvious next step was already mentioned – integrate with the CCMS to make externally created materials into ‘official’ materials once they have been internally validated.

    We’re going to be seeing more and more solutions like this – hopefully more based on XML. Mark my words.

    1. Tom Johnson

      Noz, thanks for sharing your insights. I believe you and Sarah O’Keefe and Richard Hamilton are on the same page with the XML solution. I need to learn how to create custom transforms to manipulate XML-structured content across different formats and platforms.

      But it’s not just a matter of moving the same content. The content itself is in a different format — for at least the forum and wiki. On the forum the content is interspersed in threads and half-expressed ideas. It consists of questions and responses, thoughts and complaints — from various people at different points in time. You can’t just push that content into the wiki. It requires a writer to shape and organize the information into a readable article.

      The wiki articles could be single-sourced into blog content more readily. But really, the blog should just summarize the new wiki article and point readers to it. That way, updating the content doesn’t require edits on multiple platforms.

      Here’s what I am interested in with XML. How can I extract pages from a Mediawiki wiki, package them up into selective guides, and then publish them in a book-like PDF format?
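
      A rough sketch of one way to start, assuming a standard Mediawiki install with the Action API enabled (the wiki URL and page titles below are placeholders, and the PDF step is handed off to an external converter):

      ```python
      import requests

      API = "https://wiki.example.com/w/api.php"          # placeholder wiki URL
      PAGES = ["Getting Started", "Installation", "FAQ"]  # hand-picked pages for one guide

      def page_html(title: str) -> str:
          """Fetch a page's rendered HTML via the MediaWiki Action API."""
          params = {"action": "parse", "page": title, "prop": "text", "format": "json"}
          data = requests.get(API, params=params).json()
          return data["parse"]["text"]["*"]

      # Stitch the selected pages into a single HTML "guide"...
      sections = [f"<h1>{title}</h1>\n{page_html(title)}" for title in PAGES]
      with open("guide.html", "w", encoding="utf-8") as f:
          f.write("<html><body>" + "\n".join(sections) + "</body></html>")

      # ...then convert guide.html to a book-like PDF with an external tool,
      # e.g. wkhtmltopdf guide.html guide.pdf
      ```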

      I’m surprised there isn’t already a tool for that. However, not all print experiences should be single sourced either. I used to single source my online help into a printed PDF — until I sat down and tried to read it. At that point I realized that the print experience is different. It’s linear, more concise, conceptual, visual, big-picture, etc.

      What I’m saying is that although it may seem ideal to push the same content across different formats and platforms, the content itself must also be restrategized to fit the new format or platform.

    2. Ketan Sevekari

      Hi Noz, I agree that almost every writer and every tool developer today is moving towards XML for content development and publishing. However, as far as DITA is concerned, my experience is that it is only feasible when your organization is large enough to generate a lot of content, with skilled resources to specialize in and maintain the system and provide continuous troubleshooting and technical support (maintenance, error correction, and tweaking become a continuous process if the system is implemented shabbily), and with resources to educate and train the writers to use the system and learn a new way of independent writing (independent writing here means that every topic is independent in itself, having a head and a tail, so it can be reused).

      Also, the only practical reasons that are advocated are the potential gains in reducing localization costs because of reuse of data. However, reuse seldom occurs, the critical break-even point in costs takes a long time to arrive, if it ever does, and the entire process ends up eating a lot of money.

      1. Noz Urbina

        Ok – lot in there!

        Both comments sort of allude to the same concept: Tom, your question (“How can I extract pages from a Mediawiki wiki, package them up into selective guides, and then publish them in a book-like PDF format?”) and Ketan’s comment about organisations requiring a certain size to make things tick over with DITA both relate to when and how to apply DITA.

        If you’re familiar with the DITA ‘maturity model’ (http://www.congility.com/site/dita_whitepaper_mm1-download/), even the purest of DITA-heads (of which I’m not one; I’m a 2nd-tier DITA-head) will still tell you that ‘doing DITA’ doesn’t mean doing everything – including reuse – from day (or year) one.

        So, small organisations and teams take on more knowledge themselves, but usually implement only a few of DITA’s features. And that’s how it should be.
        DITA in more full-on implementations is often a true ‘industrialisation’ of content processes. It sets up very different ways of working, and this is usually what makes or breaks the solution. Ketan, your comment that “the only practical reasons that are advocated are the potential gains in reducing localization costs because of reuse of data” is sadly not surprising. That is NOT the only practical benefit of DITA, and even on its own terms it’s incomplete, because the savings in localisation are sometimes over 50% *not* from reuse of content but simply from separating content from formatting.

        The content manager for Symbian (now Nokia) once said to us (Mekon), “We don’t really care about DITA, it’s just a good way of working”.

        I think fully addressing your comments is out of scope for a comment reply, so I’ll cut this off without saying everything I’d like to. But I can use this as an example to address what you said, Tom, about different formats across systems. I’m aware that the content in a format like this conversation is not the same as it would be in the wiki, but imagine this: my original comment, your and Ketan’s replies, and the reply I’m making now all form a branch of a hierarchy off the original post. That branch could be clipped off and moved over into a wiki environment in a certain status (“To be processed”) for an SME to digest and restructure into a proper wiki entry on CMS and XML in a content lifecycle solution. We could keep an audit trail of links from the wiki entry back to the original post and comments.
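
        As a toy illustration (Python, with invented field names rather than any real CMS schema), clipping that branch might amount to wrapping the comments in a draft record that carries a status flag and audit-trail links back to the source:

        ```python
        from datetime import datetime, timezone

        # A branch of the comment hierarchy: the original comment plus its replies.
        branch = [
            {"author": "Noz",   "url": "https://example.com/post#comment-1", "text": "..."},
            {"author": "Tom",   "url": "https://example.com/post#comment-2", "text": "..."},
            {"author": "Ketan", "url": "https://example.com/post#comment-3", "text": "..."},
        ]

        def clip_to_wiki_draft(branch, topic):
            """Wrap a comment branch as a draft wiki entry awaiting an SME rewrite."""
            return {
                "topic": topic,
                "status": "To be processed",                # an SME digests and restructures it later
                "body": "\n\n".join(c["text"] for c in branch),
                "audit_trail": [c["url"] for c in branch],  # links back to the source comments
                "clipped_at": datetime.now(timezone.utc).isoformat(),
            }

        draft = clip_to_wiki_draft(branch, "CMS and XML in a content lifecycle solution")
        print(draft["status"], "-", len(draft["audit_trail"]), "source links preserved")
        ```
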
        Your comment about single sourcing and Ketan’s both speak to the idea that different people understand ‘XML’ to be for certain things, but those are only facets – one perspective on the possibilities. It’s not that XML is just so friggin’ amazing; it’s not easy or trivial to implement. But the solution you originally described is pretty complex too, and setting it up and governing it is quite a process. Because XML is highly attractive but not completely intuitive to get off the ground, you get a staggering number of differing descriptions of what it’s for and what the ‘benefit’ is.

        I’d say probably the majority of XML projects fail, yet XML grows in popularity every year. Not only do the majority of CRM projects fail, they fail at a ratio of 5:1. The game is improving that ratio and standardising the skills required to avoid failure. The more popular it gets, the bigger the skills-and-tools snowball grows and the fewer failed (shabbily implemented) projects there will be.

        I don’t want to ‘sell’ in this post, but Ketan, we got a team at Schlumberger up and running on DITA with a relatively small implementation budget, one day’s ‘train-the-trainer’, and a good firm kick (http://bit.ly/gUFrfA). They didn’t do reuse across documents *at all*. They did DITA because they wanted to be able to manipulate the content in funky ways, because they wanted to manage it in a source-control system, and because they wanted tools that the users could get used to fast. They did a 3-hour online internal training course and then held weekly 1-hour teleconf meetings to cover authoring issues. After 5 weeks, there weren’t any more issues. The system was maintained by one guy who had some XML skills. In another (larger, better-funded) department they do reuse ‘hardcore’ and get 80% repurposing out of their content.

        The two authors who do DITA in the Nokia software department set themselves up completely independently, and they don’t use the (massive) central CMS.

        It can be done, but it does take some know-how, especially at the start.

  4. Shay

    Hey Tom,

    Interesting post. How did you create the image? Illustrator? I’m especially curious these days.

    In my programs, we also use a variety of tools – we have a forum, but we also communicate via email and sometimes use Google Docs. While these are different from the tools you mentioned, the principle is the same: some content overlaps, and so does search.

    Personally, I find that the best solution for me is to mention a topic on the forum, blog, or Google doc if it is fully explained somewhere else; this way the information stays in context, and you can still find it in other places.

    Using the same logic, if a piece of information belongs on two sources, it should be duplicated (and reformatted to fit the source). So, for example, if I were to describe how I developed an essay, I would describe it on the school’s forum as well as on my blog, but the writing would be somewhat different: the blog would give more generalized information that is presentable to the general public, while the post on the forum would be more accurate and to the point, possibly longer and with different parts (or different posts).

    1. Tom Johnson

      Shay, I used Adobe Illustrator to create the image.

      I agree with you about changing the content to fit the medium. At times it does need a bit of rewriting, or a different intro, or something.

  5. Ketan Sevekari

    I agree with what Rick says. I use Wikidot extensively, and I know it provides all three features: forum, wiki, and blog. It depends on how you set it up.

    As a technical writer I am aware of the ideal and the real situations, but you can try convincing the decision makers by giving them some factual data on how using one system can save resources, administration effort, and time, and also serve as a single source for all the content.

    Also, nowadays most of the newest services on the internet allow importing data from other services, just as you can import WordPress blogs into Posterous and vice versa.
    Wikidot has a very active community and you can try asking if there is any way you can import your existing forum conversations into the forum provided by Wikidot.

    1. Tom Johnson

      Thanks Ketan. I’ll have to check out Wikidot sometime. One benefit of Mediawiki is that it has a ton of extensions, tutorials, and possibilities. Regardless of whether you’re using three tools or one master tool, it still requires work to extract and recast/restructure content from forum threads into wiki articles. Even if you have one tool that offers both a forum and wiki, it doesn’t automatically migrate content from one to the other, or rewrite that content. That’s really what I’m getting at in this post — taking the information that bubbles up more freely from one medium, the forum, and recasting/rewriting/restructuring it into wiki articles.

  6. Pingback: Shift your perspective and learn « Kai's Tech Writing Blog

  7. Pingback: Forum → Wiki → Blog Workflow | I’d Rather Be Writing | Write A Blog

  8. Nate

    Just wanted to say this is a great discussion and post (I know I’m a bit late to the party). I’m currently struggling with a similar issue. We (our company) have a number of sites that run something akin to forums (actually Organic Groups in Drupal). We also have the ability to load content onto the site. Drupal is a CMS, after all. But content gets lost.

    I’ve been thinking of a way to integrate less structured content with more structured content and was thinking of integrating a wiki and a forum. Glad someone else is grappling with the same issue.

    Any updates?

    I’m not much of a developer, more of a business analyst type, but I can get by. I was able to cobble together a forum and a wiki on Drupal 7 – mostly to play with and grapple with the issue. With that said, it isn’t terrible. No, the two aren’t integrated other than being on the same site. But if you have a person moderating the wiki, that might be OK.

    Thoughts?

  9. Chris Banzet

    WOW, it’s wonderful to see airplanes flying at 10,000 feet over your head….. :) You admire their flight path, even if you don’t understand the more granular values of their comments. :)

    I’m taking on a very interesting project, and this article hits the nail right on the head of what I want to do. However, I’m a guru in office technology, not writing content, so somewhere out there, I’d sure love someone to offer me some advice and clarification, in crayon, about what the best solution is and why.

    As my site will show, I am currently using vBulletin, and if I’m being honest, I can’t even take the training wheels off this bike, much less ride two more…. :)

    I think the proper phrase for this moment is…. “HELP!!”

    Chris Banzet
    chrisban35@gmail.com
    301-300-8786

    1. Tom Johnson

      Chris, if you want to provide more details about the solution advice you need, go ahead. It wasn’t clear what you were asking about. Thanks.
