5 Software Design Strategies That Let Users Scale Their Brains
Human Hardware Challenges
When designing and developing software, it is critical to account for the limitations of the technology employed, especially hardware: computers, servers and other physical devices. But there is another piece of hardware that should be taken into account and is often overlooked: the user.
As far as computing power goes, human hardware is pretty limited. Don’t get me wrong: humans can do amazing things, but computationally we’re often the weak link in the chain. Don’t believe me? Check out Stuart K. Card, Thomas P. Moran and Allen Newell’s Model Human Processor. This model draws analogies between computer hardware and human cognitive and perceptual processing times, and it can serve as a useful yardstick when estimating the cognitive load and time on task of a particular design.
Diagram of the Model Human Processor concept (right) and a roughly analogous set of computer hardware (left)
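As a rough illustration of how this kind of model gets used in practice, Card, Moran and Newell’s companion Keystroke-Level Model assigns approximate times to primitive operators (keystroke, pointing, mental preparation) that can be summed to estimate time on task. The operator values below are ballpark figures from the published literature, and the helper function is a hypothetical sketch, not a calibrated tool:

```python
# Approximate Keystroke-Level Model operator times in seconds
# (ballpark values from Card, Moran & Newell; real users will vary).
KLM_SECONDS = {
    "K": 0.28,  # keystroke (average typist)
    "P": 1.10,  # point at a target with the mouse
    "B": 0.10,  # press or release a mouse button
    "H": 0.40,  # move hands between keyboard and mouse
    "M": 1.35,  # mental preparation before an action
}

def estimate_seconds(operators: str) -> float:
    """Sum operator times for a sequence like 'MPBB' (think, point, click)."""
    return round(sum(KLM_SECONDS[op] for op in operators), 2)

# Selecting a menu item: think, point, press and release the button.
print(estimate_seconds("MPBB"))  # 1.35 + 1.10 + 0.10 + 0.10 = 2.65
```

Even a toy estimate like this makes the tradeoffs tangible: every extra decision point adds a mental-preparation operator, which dwarfs the cost of a single click.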
We don’t always think of ourselves as links in the chain when technology is considered, but our brains, motor systems and perceptual systems all act a lot like hardware when we use software. The point of most software is to let users cognitively process information and interact with it, whether they’re paying bills online or piloting an airplane. Taking the limitations of our own human “hardware” into account should therefore be central to any design effort. To design a successful user experience, it is imperative to consider not only the technical limitations of the system itself, but also human limitations in perception, cognition and motor movement.
It is also important to note that our ability to transfer information from perception and working memory processing to cognitive processing, longer-term memory and/or motor activity gets significantly impaired as distractions arise. Basically, we’re bad at parallel-processing information. This is a key factor to consider, especially when users find themselves in a frenetic, interruption-rich environment. It can be easy to underestimate these limitations and overestimate a user’s appetite to overcome them. In this post I’ll identify some common human hardware pitfalls that arise when designing software and outline some strategies to mitigate those issues. In other words, let’s help users scale their brains.
Common Pitfalls and How to Avoid Them
1) Pitfall: Information overload
Overloading the user with information is the most common problem on this list, and for good reason: most users, particularly highly specialized ones, will consistently say they’d like to see more information and click (or tap) fewer times to get to it. Misreading these cues creates an apparent design paradox: there is already far too much information on screen, yet users are asking for more. The trick is not to show less information, but to figure out which information to show en masse.
Data plotted over 3 dimensions (left) and 2 dimensions (right). While the right-side data loses some information with the reduction in dimensions, it is more readily digestible and can direct the user to drill into more detail to progressively disclose more complexity.
Solution: Show meaningful information at every level, prioritize what’s important and progressively disclose the rest.
Showing large amounts of homogeneous information works because users can quickly digest a set of data that is organized and displayed consistently. If you need to go deeper and show more than one dimension, two dimensions is most often the sweet spot. There are techniques to show additional dimensions, but in most cases that information should be progressively disclosed by a user’s explicit action once they have digested the initial view provided to them.
Ouch, I really do have a headache! It is going to take a while to understand what is going on here… Image source.
Two dimensions are much easier for our brains to process, even when large amounts of data are presented. Image source.
What’s important here is to understand which two dimensions of the information are most vital to the user, then homogenize or categorize the available data into something that can be shown across two dimensions. If you limit the dimensions, the human brain can do what it does best—gather meaning from relationships through generalizations. This will act as a signpost to tell the user what information they should drill into. The user can then explicitly ask the system to progressively disclose second-tier information.
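As a sketch of this idea, the snippet below collapses hypothetical many-attribute records into counts across two chosen dimensions, then progressively discloses the full records for a single cell only when the user explicitly asks. All field names and data are illustrative, not from any real system:

```python
from collections import Counter

# Hypothetical order records with many attributes (dimensions).
orders = [
    {"region": "West", "status": "late", "sku": "A-1", "qty": 3, "carrier": "X"},
    {"region": "West", "status": "late", "sku": "B-7", "qty": 1, "carrier": "Y"},
    {"region": "East", "status": "on-time", "sku": "A-1", "qty": 5, "carrier": "X"},
]

def summarize(records, dim_a, dim_b):
    """Collapse many-dimensional records into counts over two chosen dimensions."""
    return Counter((r[dim_a], r[dim_b]) for r in records)

def drill_down(records, dim_a, value_a, dim_b, value_b):
    """Progressively disclose full detail for one cell of the 2-D summary."""
    return [r for r in records if r[dim_a] == value_a and r[dim_b] == value_b]

summary = summarize(orders, "region", "status")
# The (West, late) cell stands out as a signpost; the user drills in on demand.
detail = drill_down(orders, "region", "West", "status", "late")
```

The two-dimensional summary is what the user sees first; the sku, quantity and carrier dimensions only appear after an explicit drill-down, which is the essence of progressive disclosure.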
2) Pitfall: Ignoring environmental distractions and impediments that erode attention to your application
As Stuart Card, et al., have illustrated in the Model Human Processor, the window of opportunity to move something from working memory into cognitive processing is very brief, and it degrades significantly when the user is managing more than one task at a time. Users don’t typically have the luxury of tuning out the rest of the world to use your software. More and more, users access systems across devices, outside of relatively sedate settings such as a closed-door office or a quiet night at home. They may need to do something in a pinch while on the go, even if your primary use cases do not factor in mobile devices or small screens. Impediments such as accessibility challenges should also be considered. For example, does your interface use color alone to signify errors or indicate status? A significant segment of the population may not perceive those cues due to color blindness.
View of the Expero blog page: Normal (left) and simulation of deuteranopia, a form of green color blindness (right). Simulation generated using http://www.color-blindness.com/coblis-color-blindness-simulator/.
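Color-blindness simulators are one check; another quantifiable gut check is the WCAG contrast ratio, which does not depend on hue at all. A minimal sketch of the WCAG 2.x formula (relative luminance, and the 4.5:1 threshold the guidelines set for body text) might look like this:

```python
def relative_luminance(rgb):
    """WCAG 2.x relative luminance of an sRGB color given as 0-255 integers."""
    def linearize(c):
        c = c / 255
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (linearize(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """Contrast ratio of two colors; WCAG AA requires >= 4.5 for normal text."""
    lighter, darker = sorted(
        (relative_luminance(fg), relative_luminance(bg)), reverse=True
    )
    return (lighter + 0.05) / (darker + 0.05)

print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))  # black on white -> 21.0
```

A check like this belongs in an automated audit, not a one-off script, but it shows how an accessibility concern can be turned into a measurable, testable number.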
Solution: Know what your users do (outside your system) and design your application to fit within their environmental challenges.
If a user’s attention is dragged away from your software, upon returning, the application should provide enough context to bring them back to where they left off with as little activation energy as possible. Here’s a trick I often use as a gut check: look at your screen for 3 seconds, then look away and see what you remember. Now factor in that you were highly motivated to remember what you saw, and that you are more familiar with your application than a large percentage of your user base.
Make sure you experience your design in suboptimal conditions—look at it on an old monitor, take your phone outside and see what your app looks like in the glare of the sun, run your website through an accessibility audit system. You might think these conditions are edge cases and your users will make do, but you might be wrong.
Take the time to understand the environmental factors of your target users and experience them firsthand, if possible. If you are designing an app meant to be used by nurses, spend some time in a hospital near a busy nurses’ station. Chances are, after witnessing the constant din of alarms, phone calls and other interruptions, you’ll design your app with a new appreciation for just how often your users are forced to switch contexts.
3) Pitfall: Underestimating the time it takes to process information and perform a task
Even after many years of experience, I am consistently surprised by what trips up usability test participants during testing sessions. The parts of the application that I anticipate will be the easiest to comprehend often emerge as the problem areas, and vice versa. This underscores the innate bias we UX designers bring when evaluating our own designs. We know why something is designed the way it is. Users do not have the benefit of that reasoning, nor do they care about it. The result is that the cognitive load to contextualize and navigate an application is often much higher for users than we as designers anticipate. Over time users may become more familiar with task flows within a given application, and things will become faster and easier for them. But it is dangerous to bank on this assumption, especially since negative first impressions can hinder adoption. Revealing speedbumps early and addressing them should therefore be paramount during the design process.
Solution: Test, test, test.
The easiest way to mitigate design bias and uncover unanticipated issues is to watch target end-users attempt to use the solution early and often. Note the intentional emphasis on use; gathering feedback on a static mockup is not at all the same as watching someone actually use an application. What prospective users do and what they say they would do are sometimes very different. Testing early means an initial investment in interactive prototypes, but this investment pays dividends later by averting rework of a full feature or build. And with the advent of new tools and technologies, prototyping early is less of an investment than it has historically been. There are now leaner, and more cost-effective, ways to do this, as my colleague Valle Hansen has pointed out in other posts (see 4 Ways to Kick-Start Lean User Research for Agile Product Teams).
4) Pitfall: Overestimating the interest users have in using your software
There are obviously exceptions, but generally people use software to get things done. Often, the applications we design account for only a small part of a larger task, in a sea of tasks that need to get done on any given day. In these settings, the more efficient and transparent a user interface becomes to the end-user, the more usable it is. As designers we are trained to sweat the details, corner cases and first impressions of those who see our work. This can lead to a misconception, however, that users of the software we design are interested in and actively looking for those same details. It may even lead to a tendency to make our designs more overt and noticeable, demanding more attention to detail from the user in order to complete a designated task or fulfill a basic goal. As stated above, your solution is most likely competing with other environmental variables for the user’s attention. The most successful design is often the one that the user barely notices as they complete tasks in an application while attending to other tasks in parallel.
Solution: Understand what your users really want to do and how they want to do it, then optimize the UX to achieve those goals (aggressively) with as little friction as possible.
So how do you make sure you are designing an interface that attracts just enough attention? This skill requires a rigorous understanding of how and why users do what they do, both inside and outside the system. Once you understand what role your solution plays in the larger picture of what the user intends to accomplish, tailoring your solution becomes a lot easier (and probably humbling). The truth is that a user typically intends to spend far less time in your system than you imagine, and some of the tasks you imagined those users performing in your solution will happen, at least in part, outside of it. Your focus should be on finding the core value to your users and eliminating distraction whenever possible, either by deprioritizing or removing extraneous information.
5) Pitfall: Being lulled into complacency because your software is not designed to be “walk-up-and-use”
Next-generation cockpit (left) and a 747 cockpit from the 1970s (right). Both interfaces are designed for highly specialized, trained pilots, which illustrates that simplifying as much as possible allows even the most expert user to reserve brain power for the tasks that matter.
While it’s important to include the right amount of complexity and flexibility for the type of user a given solution targets, it is equally important not to confuse necessary complexity with unwarranted complication. Complexity brings nuances that users will often miss if those details are not at the forefront of how they think about completing a task. Unbridled flexibility and customization lead to ambiguity and conditional states that the user has to spend time untangling. These undesirable outcomes often get ushered in under the guise that they’ll be handled in training. But that assumes that 1) the training will be accessible at the right time; 2) the user is willing to invest the time it takes to learn the system; and 3) the frequency of use will counteract any atrophy of the knowledge the user gained during a training session. The reality is that even a user with esoteric knowledge who sometimes needs complicated interactions usually wants that complexity masked behind something simple until a complicated use case actually arises. This is often referred to as the myth of the power (or sophisticated) user: while user personae will often reveal sophisticated goals that require sophisticated solutions, that seldom means the solution must be difficult to use.
Solution: Assume the worst (even if it is not realistic).
Start with assuming the worst-case scenario—someone is going to use the system you’re designing who has not been trained to use it. This might be unrealistic but it sets a defensive posture for every design decision that you make. As stated above, also keep in mind that even a sophisticated, longtime user can be hampered by multitasking and distractions. The burden of unnecessary cognitive load is frustrating for any user, new or seasoned. As with anything else on this list, testing your assumptions with direct observation of usage is the best way to mitigate missing the mark.
To Sum Up
Always remember that human cognition is remarkable but also limited. Keeping the constraints of human hardware in mind as we design allows us to build tools that accentuate our strengths without being hindered by our weaknesses. Heterogeneous information, distractions and competing stimuli are the chief factors preventing our cognitive capabilities from scaling with an increase in information. Treating these factors just like technical limitations in any other system means we can measure their effects and use design to overcome them.