The Design Implications of Tool Choices

By Mark Baker | 2012/05/11

Every documentation tool has a built-in information design bias. When you choose a tool, be it FrameMaker, DITA, AuthorIt, a wiki, or SPFE, you are implicitly choosing an approach to information design. If you don’t understand and accept the design implications of your tool choice, as many people do not, you are setting yourself up for expense, frustration, and disappointment.

I suspect that most of us underestimate how much our tools affect our design. Design is always a matter of balancing the desirable with the practical. Want a car that can pull 1 G in the corners, do zero to 60 in 4 seconds, get 60 mpg, and carry 12 passengers in comfort? Well, you can’t have one. As desirable as all these characteristics might be, we don’t have the tools to build something that fulfills them all. We bend our design to the capability of our tools.

Since most tech pubs folk have been using desktop publishing and/or word processing tools for the last 25 years, they have naturally, if tacitly, adapted their information design, and their presentation and formatting design, to the strengths and weaknesses of those tools. We have been doing this for so long that we seldom recognize or think about the constraints as being imposed by the tool anymore; we simply design with a tacit understanding of what kinds of designs are feasible to execute and which are not.

The available tools shape our design practices.

We seldom recognize, therefore, how much our accustomed tools have influenced our design philosophy. It no longer appears to us as an adaptation to the limits of tools, but simply as an established best practice in design.

When we move to a new tool, therefore, we often expect to keep the same information design and the same presentation and formatting style as we had with the old tool. In many cases, maintenance of the current design is written into the business requirements that we take to market when we purchase a new system.

Herein lies much of the grief, anguish, and expense that so many companies experience when they switch tools.

I was talking the other day to a career tech writer whom I have known for many years. The company he works for has adopted DITA, but is refusing to adapt its information design to the DITA model, which is causing him much frustration. The company refuses to use DITA conventions such as the short description element in the way it was intended to be used because, they say, that is not the way they do things. Yet if you don’t do things the DITA way, you have to expend a lot of effort to make DITA behave differently.
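
To make that concrete: in DITA, the short description is meant to be a self-contained summary of the topic, something tooling can surface as a link preview rather than just an opening sentence. A minimal, hypothetical task topic that uses it as intended might look like this (the element names are standard DITA; the content is invented):

    <task id="replace-filter">
      <title>Replacing the air filter</title>
      <shortdesc>Replace the air filter every three months to maintain
        airflow and keep the fan running efficiently.</shortdesc>
      <taskbody>
        <steps>
          <step><cmd>Unlatch the front panel.</cmd></step>
          <step><cmd>Slide out the old filter and insert the new one.</cmd></step>
        </steps>
      </taskbody>
    </task>

Because the short description is written to stand alone, the tooling can pull it out and reuse it wherever the topic is previewed. Write it as a mere first sentence and that machinery starts working against you.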

Many people adopt DITA because they believe the promise of reuse it offers, or because they have become convinced that XML is the future and DITA appears to provide an easy route to XML (this is something of an illusion, as the DITA package makes it look easier than it really is). They often don’t realize that DITA has some very specific information design principles built into its DNA. (Not everyone misses this, of course; some people adopt DITA specifically because they like these principles.)

Two major factors influenced the design DNA of DITA. One is Horn’s Information Mapping, of which DITA implements a simplified variant; the other is the fact that DITA was originally created to develop online help systems. The result is that DITA strongly favors the kind of information design that I have dubbed Frankenbooks: deeply nested hierarchies of topics which may or may not make sense independently of the parent and sibling topics among which they are embedded.

By now it is possible to recognize the DITA look in a doc set. The surest sign is the brief topics, sparsely linked, with links at the bottom pointing you to the previous, next, and parent topics. Frankenbooks are in DITA’s DNA.
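
That navigation falls straight out of the DITA map. A map like the following (a hypothetical sketch in standard DITA map markup, with invented file names) nests topics into a hierarchy, and the standard toolchain generates the parent, previous, and next links from that nesting:

    <map>
      <title>Administration Guide</title>
      <topicref href="installing.dita" collection-type="sequence">
        <topicref href="prerequisites.dita"/>
        <topicref href="running-the-installer.dita"/>
        <topicref href="verifying-the-install.dita"/>
      </topicref>
    </map>

Each topic’s place in the book-like hierarchy, not anything in the topic itself, determines what it links to.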

Similarly, many people choose wikis because of their ease of use, and their ability to greatly expand the number of people who can contribute to the documentation set. But wikis have an information design bias in their DNA as well. Wikis favor Every Page is Page One topics: topics that are self-sufficient, stand-alone, and richly linked to related topics, but without any sense of next, previous, or parent. Every Page is Page One is in wiki DNA (and SPFE DNA) the way long documents are in Frame’s DNA, and Frankenbooks are in DITA’s DNA.

Of course, this does not mean that you cannot reproduce your existing design in any of these tools. You could write books in a wiki or Every Page is Page One topics in Frame. But you will be fighting the tool every step of the way.

Worse, in attempting to force your old design principles on the new tool, you may compromise its effectiveness and lose much of the efficiency you originally bought it for. This is important to understand, because the design principles built into a tool’s DNA, and the particular kinds of production efficiencies it provides, are not incidental to each other; they are intimately and inextricably tied together.

In SPFE, for instance, one of the principal production efficiencies it provides is soft linking. But soft linking works best and most naturally in an Every Page is Page One information model, because that model makes it easiest to identify link targets algorithmically. Another virtue of SPFE is that it allows a wide variety of authors to contribute structured content without requiring a detailed knowledge of the publishing system. But that works much better in an Every Page is Page One information design, in which authors do not have to be aware of the context in which the topic they are writing will be used.
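
To see why, here is a minimal sketch of the idea (my own illustration, with invented names; it is nothing like SPFE’s actual code). If every topic stands alone and declares the subjects it covers, a build script can resolve mentions of those subjects into links without the author ever creating a link by hand:

    import re

    # Hypothetical catalog assembled at build time:
    # subject phrase -> the one stand-alone topic that covers it.
    catalog = {
        "air filter": "replacing-the-air-filter",
        "front panel": "opening-the-front-panel",
    }

    def soft_link(text, current_topic):
        """Turn mentions of cataloged subjects into links,
        skipping the subject the current topic itself covers."""
        for phrase, target in catalog.items():
            if target == current_topic:
                continue  # a topic should not link to itself
            pattern = re.compile(r"\b" + re.escape(phrase) + r"\b", re.IGNORECASE)
            text = pattern.sub(
                lambda m: '<a href="%s.html">%s</a>' % (target, m.group(0)),
                text,
            )
        return text

    print(soft_link("Unlatch the front panel first.", "replacing-the-air-filter"))
    # -> Unlatch the <a href="opening-the-front-panel.html">front panel</a> first.

The mapping from subject to link target is only this simple because each subject has exactly one free-standing topic that covers it. In a deeply nested hierarchy, the right target for a mention depends on where the reader currently sits in the hierarchy, which is much harder to compute.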

Wikis, similarly, open up authoring to a wide audience without requiring much knowledge of the wiki system, but you give much of that up if you try to use the wiki to write a book. DITA uses its information types and maps as the underpinning of its reuse model. You can force DITA to create a traditional narrative flow or a set of Every Page is Page One topics, but both will make the reuse mechanism harder to use.

Each tool, then, should be looked at holistically. All the process efficiencies and design possibilities that each tool offers are a package, and you will cause yourself much grief and expense if you try to mix and match. While all these tools allow for a great deal of design flexibility within the overall design philosophy that is built into their DNA, none of them works well when your design forces them outside that envelope. Despite what sales folk may say to beguile you, none of them offers exactly what you are doing now, only better. A change of tool demands a change in design. Something is gained, and something is lost.

This is by no means easy to deal with, of course. Our design skills have been honed over the years in one kind of tool environment. Put a different kind of tool in front of us, and all our design instincts are suddenly at odds with the tool. Every design choice then becomes a forceful act of breaking the old mold, deliberately probing and testing the limits and possibilities of the new tool, and figuring out how to create a design that takes full advantage of the possibilities and is constrained as little as possible by the limits.

This will be emotionally hard, as well as intellectually hard, because it will mean giving up some things you have come to value highly over the many years of your professional practice. JoAnn Hackos recently pointed out an article that argues that fear of change can best be understood as fear of loss. Changing your information tools will mean losing some things you have been used to. It will also mean gaining some new things that you want, and presumably, if you have done your homework properly, the value of what is gained will outweigh the value of what is lost. But if fear of loss causes you to cling desperately to the things you should be letting go, you will certainly increase your costs, and you will probably lose some or all of the new things you hoped for. You could end up doing the same thing you were doing before but with tools less well suited for the job.

There is another pitfall to be aware of as well. The almost universal advice to people contemplating a move to any form of structured writing is to begin by modeling your content. This is good advice in many ways. The last thing you should do is rush out and buy a tool, then try to figure out what to do with it. You absolutely need to start with your business goals and your information designs.

But if you create your information models and your information designs without looking at the tools and at the design patterns that exist in their DNA, your information models and designs will actually be based on your existing design patterns — the ones that you absorbed over the years from your old tools. If you then build those models and designs into your requirements for a new system, you will, in fact, have produced a recipe for frustration, disappointment, and cost overruns.

The only way out of this is to step back and think about the different design patterns that exist for your type of content: Narrative books, Frankenbooks, Every Page is Page One, etc. Understand them all, their strengths and their weaknesses, and what the differences between them will mean for your readers. Then look at what you will need to do to move your content from its current design pattern to the new design pattern you have chosen, and what kind of development process will be most effective for producing content in the chosen design pattern.

Only when this is done are you ready to proceed either to the details of design or the details of tool selection.


8 thoughts on “The Design Implications of Tool Choices”

  1. tim

    Good advice, often hard for some to follow. This reminds me of Howard Roark’s critique of the Parthenon:

    “Your Greeks took marble and they made copies of their wooden structures out of it, because others had done it that way. Then your masters of the Renaissance came along and made copies in plaster of copies in marble of copies in wood. Now here we are making copies in steel and concrete of copies in plaster of copies in marble of copies in wood. Why?”

    1. Mark Baker (post author)

      Thanks for the comment, Tim. That is a wonderful quote and very apropos. It makes me worry though. I know that tech pubs has been slow to adapt to communicating on the web (as opposed to merely publishing to the web) but I hope we can make the adjustment in less than three millennia.

  2. Anne Gentle

    Mark, great read as usual. My only tenuous disconnect from the basic argument is that there are over 1200 web content management tools but not 1200 implications in choosing one web CMS. (1200 source: http://www.cmsmatrix.org/) This discomfort with your analysis and a need to discuss causes me to comment.

    Tech comm has had just a handful of tools to choose from for decades. Entering the web world, the sheer volume of tools to choose from means the analysis is not as simple as the three (plus etc.) models you’ve laid out here.

    At the heart of this analysis paralysis lies your last comment “adapt to communicating on the web (as opposed to merely publishing to the web)” – we’re in a much, much more complex business landscape and communication arena. Sorry to say, I call oversimplified here (unusual for me to conclude after reading your posts!) 🙂 The design patterns are not three but possibly thousands. However your basic guidance is true – that the tool selection will ultimately guide your design choices. Could explain why many content strategists choose to stay out of tools consulting and stick to requirements gathering, performing audits, creating models, metadata, workflow, and managing change.

    1. Mark Baker (post author)

      Anne, thanks for the comment.

      Certainly there aren’t 1200 different models of information design or development process for the web, so clearly many of those 1200 systems must be based on the same models, perhaps with small variations or omissions/simplifications. That’s not an uncommon part of technological development. At first, everyone jumps in with their own implementation of the same basic ideas.

      Over time, these multiple products with the same model get winnowed down to one, based, I suppose, on the quality of the implementation or other factors that help consolidate a market leading position. There used to be multiple competing word processors and multiple competing desktop publishing programs, and now there are just Word and Frame (as far as tech-doc oriented tools are concerned). On the other hand, there are currently dozens of tiny start-up DITA tools, all based on the same model (DITA), or trying to make DITA look like an existing model, or offering a simplified version of DITA. These too will doubtless be winnowed down over time.

      I’ll readily own up to oversimplification though — is that not the privilege of blogging? I certainly did not mean to suggest that there were only three design patterns. I doubt there are thousands — though there are certainly thousands of designs. We may have to wait a while longer for the design patterns to emerge clearly from those thousands of designs and their successes and failures.

      I don’t doubt that you are right about content strategists’ avoidance of tool selection, but I do see that as one of the biggest issues — a polite way of saying failings — of content strategy. The clients, and in many cases the content strategists themselves, approach requirements gathering, performing audits, creating models, metadata, workflow, and managing change with an outdated model of both the technical and social aspects of the web, a strategic vision formed in the book world by the economics, technologies, and social conventions of paper.

      One of the most tragic and bloody aspects of modern warfare has been the failure of generals, trained in the strategies of the previous war, to adapt their strategic thinking to the advances in technology over the intervening years. It has often taken years of war and untold casualties before the old generals are dismissed and new generals, who understand the strategic implications of technology, are able to form new battle plans and bring the wars to an end.

      I feel that we currently have a generation of content strategists advising people on their strategy for the web who are themselves completely steeped in the strategic conventions of paper. The clearest part of this disconnect, I think, is that they tend to look at a website, and to advise their clients to look at their website, as a publication. A website is not a publication; it is a storefront, an agora, a salon, a soapbox, a clubhouse, a coffee shop, an argument, a noise, a rubbish tip, an archaeological dig, a museum, a riot, a protest, a square, a temple, a highway, a conversation. In brief, it is a colloquium.

      Engaging with the tools issues may seem tactical or even operational, but it is necessary to free our strategic thinking from endlessly tracking in the ruts left by our old tools.

  3. Mike Sinotte

    Mark,

    I just discovered your blog and enjoyed your post.

    I found that many discussions focus on “How are we going to document this information?” and ignore the question “How is this information going to be used?”

    I would like to add that when choosing a documentation tool, make a distinction between the readers that are customers (outside your company) and the employees who are within the company. The tool functionality needed to provide procedures, policies and company information to employees is different than the tool functionality needed to publish information to customers.

    For the employee reader, the tool needs to handle the dynamic nature of the work environment and allow for changes to be made by a wider range of authors (business area managers and SMEs). The employee reader tool also needs to ensure that new information is properly communicated to the right employees. These tools are designed to handle faster publishing cycles to stay current with the incremental changes as business processes evolve.

    With customer reader tools, the focus is more on creating content where the information goes through a more thorough review cycle before it is published. These documents have a slower publication cycle and the publication release is often paired with new product offerings to customers.

    1. Mark Baker (post author)

      Hi Mike. Thanks for the comment. You are certainly correct that the employee reader has different needs from the customer reader, and it is important for the tech writer to understand the priority of the needs of each group.

      An interesting side-note to this is something I have observed in more than one company. In many cases, tech writers look to employees who have direct customer contact to try to gather feedback on the content. What sometimes happens is that the feedback they get is not really coming from the customers, but from the sales staff and sales engineers who are asking tech comm to write scripted demos for them.

      Tech comm people can easily mistake these requests as coming from customers, and even if they don’t, it is hard not to react to the only source of feedback you are getting.

  4. Paul Trotter

    Hi Mark,

    A great read, and I wish more of the organizations out there looking to change tools would take some of your advice.

    As a tool vendor, we have built a product that delivers certain benefits to our clients without involving the user too much in the underlying technology used to implement those benefits. Our goal is to get our clients not just using, but adopting our products and reaping the benefits they were seeking in the first place.

    However, as you elegantly point out, the biggest battle we have is when clients refuse to give up their old models and attempt to force our technology to implement a model it was not designed to deliver, and more importantly, by implementing that design they put at risk the very benefits they were looking for. They want that bus that does 0 to 60 in under 5 secs.

    To further complicate things, you have many “so-called” experts out there telling organizations they should design an information model first, or use one like DITA, then find a tool with the features that will implement that model.

    Ultimately any investment an organization makes must make a return within an acceptable time. The biggest trap we see organizations falling into with DITA and other XML technologies, is they don’t know what they don’t know. They believe the hype and don’t realize that for the most part it is a DIY solution that requires significant ongoing investment in tools and very specialist resources. If you have the skills and/or are prepared to pay for them then fine. We hear these stories every day and have migrated a heap of clients to Author-it from these technologies.

    Organizations need to remember that they buy tools and systems to provide business benefits like increasing author productivity, reducing the cost of localization, improving time-to-market, and improving quality, not to implement features or technologies. If more organizations focused on benefits when looking for a solution, they would make better choices and waste fewer resources. They must also be prepared to change how they work and what they deliver to realize those benefits. If not, they should stay doing what they are doing and keep using what they are using.

    Paul Trotter
    Founder and CEO
    Author-it Software Corporation
    @pstrotter

    1. Mark Baker (post author)

      Hi Paul,

      Thanks for the comment. To your last paragraph I can only say, Amen.

      DITA is an interesting case. Its greatest virtue is that it has taken the fear out of XML solutions. The problem is, it has not actually removed most of the tasks that people were afraid of. In one sense, that is a good thing, because removing the fear from the task genuinely does help people learn and perform that task. In another sense, it is a bad thing, because people are finding DITA to be more work than they first thought, and because it risks giving other XML solutions a black eye.

      In documentation, as in every industrial function, there are cases where your business requirements are best served by packaged off-the-shelf tools, and cases where they are best served by custom development. Every packaged off-the-shelf tool implements a specific model of a process. This is inevitable, since you can’t provide high-level tools without implementing a specific solution model. Indeed, the chief value of such tools is in the suitability of the model they implement to your business requirements.

      To my mind, the model and the tools are inseparable considerations. To choose a tool without understanding its model is foolish, and to choose a model without understanding its tool requirements and availability is equally foolish.

