Originally published at ribbonfarm.com
Work can be a delight when your tools and your environment are crafted in ways that enable you to focus on the task at hand. However, I suspect most people have only limited experience with this sort of situation; it’s rather common to see everyday tasks performed with sub-optimal tools. Software engineers are in a privileged position, parallel perhaps to blacksmiths of the past, in that the same skills used for their work may be deployed toward tool improvement. Correspondingly, they pride themselves on possessing and creating excellent tools. Unfortunately, most other roles in a given business are ill-equipped and poorly positioned to effect a similarly-scaled tool-chain improvement effort. Instead, they are reduced to requesting assistance from other departments or outside vendors, a relationship which Kevin Simler highlighted last year in a spectacular post entitled UX and the Civilizing Process. You should read the entire piece, but the salient portion for our purposes is the following paragraph:
You might think that enterprise software would be more demanding, UX-wise, since it costs more and people are using it for higher-stakes work — but then you’d be forgetting about the perversity of enterprise sales, specifically the disconnect between users and purchasers. A consumer who gets frustrated with a free iPhone app will switch to a competitor without batting an eyelash, but that just can’t happen in the enterprise world. As a rule of thumb, the less patient your users, the better-behaved your app needs to be.
Any given software project will be improved by increased usability. Nevertheless, we’ve all witnessed moments where “more cowbell” doesn’t seem to effect the desired improvements. Usability is an unalloyed good in its tautological form (better is better); it is in the specifics that we see it fetishized as a concept.
This isn’t an accident; in fact, there can be an inverse relationship between the best user experience for an individual or a small group and the best user experience for an organization or a network of organizations.
In poker, a tell is some sort of behavior which gives hints about the cards in a player’s hand. Poor usability is a tell which may indicate that an organization’s and a user’s needs are in conflict, and that the organization’s needs trump.
Most days, I spend my time providing software support, consultation and training to sysadmins and programmers working on the iSeries platform (aka AS/400, aka System i, aka i5, aka POWER i, aka whatever IBM marketing is calling it this week). When I train new employees on the system, they are often taken aback by the terminal emulation software with its “hacker classic” green-on-black text interfaces. “This system looks like it belongs in the 80s,” some say, while others are perplexed as to how non-technical users could ever be expected to use such a system (these comments betray just how much cultural conditioning plays into what is or is not considered usable).
The iSeries is neither the zeitgeist nor a high-stakes consumer app. It’s rather the epitome of Simler’s “rustic enterprise”, parallel in ways to how, in the lead-up to the industrial revolution, you could observe a technological downgrade on the order of a couple of centuries by moving inland from coastal Europe.
I have no qualms about complaining about a platform’s shortcomings, but playing city mouse/country mouse is only interesting when it combines clear-eyed critique with a nostalgia-free empathy for the alien and seemingly outdated.
In these introductions I typically begin by talking about the sorts of users who work on iSeries applications. Warehousing, logistics, manufacturing and accounting departments or companies are all well-represented, and we’ll pick one that the user is most familiar with to talk over: accounting, for instance. Software, at its best, functions as an extension of the mind, and becomes transparent to its user (Heidegger famously coined the term zuhanden, or ready-to-hand, for this sort of human/tool synergy). Csikszentmihalyi’s concept of flow doesn’t detail human relationships to tools, but flow is interruptible, and one of the primary sources of interruption is tool presence (vorhanden); that is, when a tool (by its inadequacy toward the task at hand, or simply by a shift in perception) is recognized as separate from the tool user.
If you ever watch a seasoned accounts payable clerk run invoices on a “green screen” application, they fly. Sometimes they outpace the screen refresh rate, and they’ve input instructions for the next two or three screens before they’ve loaded. The effect is akin to watching an experienced vi (or emacs) user write code, merge files, check emails and otherwise manipulate the universe through effective utilization of their beloved text editor. To identify with these body extensions is to provide kindling for flame wars between different tool devotees. Given stable enough tasks and users, tools will be refined and ultimately recede until they become invisible, as their capacities become absorbed and assumed, and the sunk costs of obtaining expertise are lost to time. The parallels to etiquette are good, but at peak the better analogy would perhaps be speaking in one’s native language.
Interestingly, a side-effect of this receding-out-of-view of our best tools is that we discount their positive effects, and this discount comes to bear when the advertised usability of new tools is measured against the status quo. These are extreme examples, but I’ve seen more than one company move from thick-client terminal emulation to a web application version of the same, only to discover that the penalties were significant and sustained. Not only were the refresh times between pages longer on the “more usable” web application version, but commands and data could not be entered onto new pages until they had loaded. This incurred a double penalty: first, the extra second or two multiplied by every page load, but more significantly the cognitive disruption of losing the flow and waiting on the application in order to proceed with one’s work. One user I spoke with gave a thoroughly unscientific estimate of a 20-30% performance penalty in moving from the old system on which she was experienced to the new. If the old method was akin to a native tongue, the change could be considered akin to having to conduct office calls in colloquial French, given a 4-hour training class and two weeks of practice.
There are both seemingly irrational and rational reasons for this state of affairs to come about. On the irrational side, novelty bias and a preference for beauty can distort technical comparisons, particularly when the benefits of the existing solution’s integration into your organization are difficult to quantify, and some of the costs of switching are necessarily hidden.
However, it’s the rational reasons for switching to a rhetorically “more usable” but quantitatively less efficient solution that are slightly evil and particularly interesting. We’re used to thinking at a human-level resolution, but if we zoom out to the larger agentic level, that of the organization (thank you, Mike Travers), what is perverse for the individual or the small group may conform to organizational reason.
In the example above we held constant the stability of the work and of the workforce employed. What if one or both of these is unstable? Or rather, what if you have other reasons for being willing to accept the costs of this instability?
The 2008 recession forced a lot of companies in the small-to-medium business (SMB) range out of business or into strategies for stemming blood loss. Layoffs, hiring freezes, and increased workloads for existing employees all helped stem losses, and in the longer term many companies invested in a strategy of increased automation, with new job growth generally bifurcating between very highly skilled work and deskilled work. This is the white-collar echo of the blue-collar automation of factories and warehouses. (For companies already flush with deskilled labour, such as McDonald’s and Walmart, you saw an explicit reliance upon social services to “make up the difference” imposed by the slowed economy.)
To return to the terminal emulator example: a company switching from a thick client terminal emulator to a browser-based alternative is a tell; the company sees its employee base and the labour market as more volatile and is anticipating a higher turnover rate, with the attendant devaluation of employee training and of technologies whose payoffs are slow initially but steady thereafter.
Similarly, in the world of business forms processing, user intervention is a given, but the types of use change considerably: instead of a department of AP clerks handling incoming invoices, for instance, you have some people who simply scan documents and others who handle whatever exceptions the software can’t automatically recognize or calculate. These sorts of solutions also impose a skills gap, a discontinuity that makes it more difficult for unskilled workers to work their way into skilled positions; no amount of scanning pieces of paper will, alone, prepare someone for a skilled accounting position. It is the workers in the middle who are disrupted by these sorts of technological interventions, as at the bottom labour is too cheap to be worth disrupting, while the top as yet requires too much independent judgment to replace.
Not all technological change falls into this category. There are technologies that are a boon to both the workers and the organizations as a whole. The point here is that when a technology is adopted that is “less usable” to workers than what it is replacing, this is simply an indication that worker and organizational interests are no longer in alignment, and the workers’ interests have been trumped.
I’ve focused on software technology in the examples above, but technology is a far more flexible concept than that. Rarely recognized as technologies are the social organization of a firm, its schedules, its incentive structures, its jargon and its bureaucratic processes. Next time we’ll explore the world of undead technologies: abortive attempts at innovation that created more new problems than they solved old ones, and which were not catastrophic enough to kill their host firm.