You Already Know How To Use It


In the first television advertisement for the iPad, the narrator intoned, “It’s crazy powerful. It’s magical. You already know how to use it.” This was an astonishing claim. Here was a new, market-defining, revolutionary device, unlike anything we had seen before, and we already knew how to use it. And yet, for the most part, the claim was true. How does a company like Apple make such great new things that people already know how to use?

One answer lies in the ability of Apple designers to draw upon patterns that people are familiar with. The interaction medium might be completely new: before the iPhone, few people had used a multitouch screen. But everyone knew how to pinch or stretch something, and this interaction pattern was easily transferrable to the small screen after seeing it done just once. As Alan Cooper writes in About Face, “All idioms must be learned; good idioms need to be learned only once.”

The Role Of Dopamine In Pattern Recognition

Our brains like to find such patterns. We are wired to search for patterns that our past experiences have shown will lead to successful interactions (in love, war, gambling, investing, etc.). Jonah Lehrer, in How We Decide, writes that our brain produces a pleasure-inducing neurochemical, dopamine, when we recognize familiar patterns in the world around us. When we act on these patterns and are successful in whatever we are trying to do, we get an additional burst of this pleasing chemical.

If we think we recognize a pattern but are mistaken, or if the pattern doesn’t behave in the way we expect it to, then we do not get that second infusion of the neurochemical, and we readjust our expectations. Many neuroscientists believe this reward system is one way in which learning takes place. The process creates a self-reinforcing, pleasure-based cycle that encourages us to learn from our mistakes and to become better interpreters of the world around us.

The dopamine reward system produces positive or negative emotions based on our experiences in the world. Lehrer argues that this reverses our age-old understanding of the role of emotions in the decision-making process. Since Plato, the rational mind has been depicted as the charioteer holding the reins on our unruly emotions. What makes humans unique, according to this metaphor, is our ability to use logic and rationality to control our emotions and make rational decisions. Lehrer’s book details recent research in neuroscience that upends this reason-based model of decision-making. Emotions, some of them caused by the dopamine-based reward system, play a central role in our decision-making processes.

Brain
(Image: Sue Clark)

These discoveries in neuroscience provide a strong argument for using design patterns in interaction design. Take the carousel pattern, which is prevalent on desktop, tablet and handheld devices. The Yahoo Design Pattern Library has a useful illustration of this design pattern. Content appears to slide in from one side of the panel; items at each end are partially obscured to indicate that more virtual space, and more content, lies outside the carousel pane; arrows appear, when appropriate, to indicate how to get to that additional content. This is a very simple pattern that people can learn after using the feature just once.
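To make the mechanics concrete, here is a minimal sketch of the carousel’s core logic in TypeScript. The names and structure are hypothetical, not taken from the Yahoo library or any particular implementation; the point is simply that a small, consistent set of rules (a sliding window of items, plus arrows that appear only when more content exists) is what makes the pattern learnable in a single encounter.

```typescript
// A minimal, hypothetical sketch of the carousel pattern's core logic:
// a fixed-size window over a longer list, with arrows shown only when
// more content exists in that direction.
class Carousel<T> {
  private start = 0;

  constructor(private items: T[], private visibleCount: number) {}

  // Items currently in view; rendering code would partially clip the
  // neighbors at each edge to hint that more content lies off-screen.
  visibleItems(): T[] {
    return this.items.slice(this.start, this.start + this.visibleCount);
  }

  canScrollLeft(): boolean {
    return this.start > 0;
  }

  canScrollRight(): boolean {
    return this.start + this.visibleCount < this.items.length;
  }

  scrollRight(): void {
    if (this.canScrollRight()) this.start += 1;
  }

  scrollLeft(): void {
    if (this.canScrollLeft()) this.start -= 1;
  }
}

// Usage: show each arrow only when the corresponding check returns true.
const stations = new Carousel(["A", "B", "C", "D", "E", "F"], 3);
console.log(stations.visibleItems(), stations.canScrollRight()); // ["A","B","C"] true
```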

New users of Pandora will encounter this carousel pattern almost to the letter, and even if they are encountering it for the first time, they will learn it almost immediately. Then, when they encounter versions of the carousel pattern in other designs, they will recognize it before they even begin to interact with it. Their recognition of the pattern will produce pleasure as the dopamine neurons begin firing. When the user then interacts with the pattern — by clicking the arrow on either end to reveal additional content, for instance — and is successful, then more dopamine is produced, leading to additional feelings of pleasure.

Carousel
Carousel design pattern, via Yahoo Developer Network

Admittedly, neuroscientists have not yet hooked users up to functional magnetic resonance imaging machines to measure their brains’ dopamine production as they experience the carousel (or any other) interaction design pattern. To date, our insight into the brain’s responses to the patterns we encounter in the world is limited to what we can extrapolate to humans from experiments conducted on monkeys and to inferences we can draw from the work of psychologists.

Lehrer’s Radar Technician Story

Lehrer tells the story of a radar technician during the first Gulf War who spent several days watching blips that represented fighter aircraft returning to ship from a certain point on the coast of Kuwait. One set of blips in the early morning made the technician feel nervous, and he couldn’t explain why. They looked to him to be just the same as those he had observed hour after hour in days past, but his emotional response to this particular set of early morning blips told him that something was wrong. Acting on little more than this emotional response, he ordered the blips to be shot down — thus saving countless lives: the blips turned out to be enemy missiles en route to destroy Allied ships in the Gulf.

The technician could not explain how he knew they were not just another pair of fighter jets. Only after much review, and the work of a cognitive psychologist brought in to examine the case, did investigators determine what was different about those blips: where they first appeared on his screen, a little farther from shore than all of the others. He couldn’t tell at the time that this was what made them different, but subconsciously his brain detected a change in the pattern he had been observing for hours. That change in pattern caused an emotional and somatic response of panic and anxiety and led him, despite his reason, to order the blips destroyed.

The radar technician’s story (and many others recounted in Lehrer’s book) suggests that our brains observe and act on patterns without our being conscious of it. Recognizable patterns appear, our dopamine neurons fire, our learning is reinforced, and we remain in a state of “flow.” But when a pattern is broken or behaves unexpectedly, all hell breaks loose. Our brain sends out a “prediction error signal.” An area of the brain called the anterior cingulate cortex (ACC) monitors the activity of the prefrontal cortex, and when the ACC detects the absence of dopamine-neuron activity that results from the predicted event not occurring, it sends out this error signal. Other chemicals are then produced by the amygdala and the hypothalamus, among other areas, causing the feelings of panic and anxiety: the heart races, the muscles tense, we become short of breath.
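One informal way to picture this mechanism, offered here as my own simplification rather than a model from Lehrer’s book, is as a running comparison between what was expected and what actually happened. The toy TypeScript sketch below (all names hypothetical) shows the shape of the idea: a recognized pattern that behaves as predicted keeps the error near zero, while a broken pattern produces a large error and forces the expectation to be revised.

```typescript
// A toy sketch of the "prediction error" idea described above: not a
// neuroscientific model, just the shape of the comparison. All names
// are hypothetical.
function predictionError(expectedReward: number, actualReward: number): number {
  // Positive: a pleasant surprise. Negative: the predicted event failed
  // to occur, the analogue of the error signal described above.
  return actualReward - expectedReward;
}

function updateExpectation(
  expected: number,
  actual: number,
  learningRate = 0.1
): number {
  // Nudge the expectation toward what actually happened.
  return expected + learningRate * predictionError(expected, actual);
}

// A user expects the carousel arrow to reveal more content (expectation = 1).
// When it does, the error is 0 and the expectation holds; when it doesn't,
// the error is -1 and the expectation is revised downward.
let expectation = 1.0;
expectation = updateExpectation(expectation, 0); // the pattern broke
console.log(expectation); // 0.9
```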

Broken Patterns Cause Panic And Anxiety

Ordinarily, we do not want our users to experience these feelings of panic and anxiety when they use our systems. Yet we know it happens frequently. One reason is that we often present users with interfaces that lack the visual cues to indicate what patterns are being employed. Consider Roku’s Channel Store. When users visit the interface to add a channel to their system, they are confronted with what appears to be a static table of contents. Without prior experience of the carousel pattern, users might interpret this 3 × 4 tabular interface as offering only 12 channels.

In fact, this table does behave according to the carousel pattern. Additional content does lie to the right and left of each row. The content scrolls vertically as well, but users would never know this from the visual display of the information. Even worse, a new user will learn little about the carousel pattern to apply to their next encounter with it. Ironically, Roku is best known as a Netflix streamer, and Netflix itself applies the carousel pattern expertly to its similar table of contents in its streaming interfaces on game devices such as the Wii. Way back in The Design of Everyday Things, Donald Norman defined “visibility” as meaning that “the correct controls are visible, and they convey the correct information.” Neither is the case in the design of Roku’s Channel Store, so users have no way of knowing, without extensive exploration, that the carousel pattern is being employed.

Roku Channel Store
Roku’s Channel Store.

Sometimes, the problem is the reverse: users will think a design pattern is being used when it isn’t. What we recognize as a pattern doesn’t function as we expect; our brains think that something has gone wrong, and the result, again, is anxiety and panic. Take the basic design of a list of items on a smartphone. Users of iOS know this pattern well; it is famously illustrated in Josh Clark’s Tapworthy: Designing Great iPhone Apps. A left-to-right swipe gesture opens a control for deletion, prompting the user to confirm the delete action. This design pattern is easy to learn, but its implementation in other smartphone applications is sporadic and unpredictable. Palm’s webOS email system, for example, uses the swipe gesture for deletion but offers no “Delete” button to confirm the gesture. The email item simply vanishes off the screen. In webOS’s messaging application, on the other hand, the system does present a deletion control.

iOS Swipe
Swiping left to right to delete in iOS. (Image: Josh Clark)
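What makes the iOS version of this idiom forgiving is its two-step structure: the swipe only reveals a control, and nothing is removed until the user confirms. The TypeScript sketch below is a hypothetical illustration of that structure, not Apple’s or Palm’s actual implementation; webOS’s email behavior, as described above, amounts to skipping the second step.

```typescript
// A hypothetical sketch of the two-step swipe-to-delete idiom: the swipe only
// reveals a Delete control; a separate, deliberate tap removes the item.
type SwipeDirection = "leftToRight" | "rightToLeft";

interface ListItem {
  id: string;
  deleteButtonVisible: boolean;
}

function onSwipe(item: ListItem, direction: SwipeDirection): void {
  // Step 1: reveal the confirmation control rather than deleting immediately,
  // so an accidental swipe costs the user nothing.
  if (direction === "leftToRight") {
    item.deleteButtonVisible = true;
  }
}

function onConfirmDelete(items: ListItem[], item: ListItem): ListItem[] {
  // Step 2: only a tap on the revealed Delete button removes the item.
  return item.deleteButtonVisible ? items.filter(i => i.id !== item.id) : items;
}
```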

Early versions of the Android OS didn’t accept the swipe gesture for deletion at all, usually interpreting the gesture as a tap and opening the “Edit Item” page. The Gingerbread update introduced even more inconsistency to the user experience: a right-to-left swipe over a contact, for instance, opens the instant messaging app, and a left-to-right swipe opens the phone app — and initiates a phone call! A user who would naturally expect this gesture to trigger a prompt to delete the contact suddenly finds themselves calling that contact. Talk about panic!

Pattern-Matching Is Harder Than It Sounds

All of us have experienced this feeling of panic to one degree or another. I still feel it when I instinctively move my mouse (in Windows) to the task bar to return to a Web page that I thought I had minimized, when in fact (and for at least three years now) the page I am looking for is open in a different tab, rather than in a minimized, separate window. Interaction habits of mind do not change quickly. And because I use three different Web browsers on at least four different computers, I am constantly unsettled in my search for the “Home” button, which used to be to the left of the URL window in most browsers, but now is all the way on the right in the standard installation of Firefox 12 on both Windows and Mac and doesn’t exist at all in a standard installation of Safari. There is no longer a reliable pattern to determine where I will find the “Home” button on a Web browser. But my brain wants one, feels good when it finds one and rebels (chemically) when it doesn’t.

To be sure, inconsistencies across platforms, browsers and software can have many causes, from patent issues to design legacies. And it is inevitable that interaction designs will change and improve over time. We should not be held to existing patterns just because the human brain prefers them. But we can design according to our developing understanding of how the brain functions. We can employ idioms, such as “pinch,” that are not obvious but are quickly learned. We can progress gradually, building on fundamental elements of existing designs so that new interaction designs retain enough of the old that our brains still recognize them. We can also cautiously introduce new schemes as redundant elements: one doesn’t have to use three- and four-finger swipe gestures on the MacBook Pro’s trackpad, but once one discovers these gestures, they are easy to adopt as natural improvements to the pointer controls and buttons in application interfaces.
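As a rough illustration of that last, redundant-element approach (hypothetical names, not any platform’s real API), the new gesture and the existing control can simply be bound to the same action, so users who never discover the gesture lose nothing, while those who do gain a shortcut:

```typescript
// A hypothetical sketch of introducing a new input scheme as a redundant
// element: the gesture and the legacy button both trigger the same action.
type Action = () => void;

class InputBindings {
  private bindings = new Map<string, Action>();

  bind(trigger: string, action: Action): void {
    this.bindings.set(trigger, action);
  }

  fire(trigger: string): void {
    this.bindings.get(trigger)?.();
  }
}

const showAllWindows: Action = () => console.log("Show the window overview");

const inputs = new InputBindings();
// The existing, always-visible control keeps working as before.
inputs.bind("click:overview-button", showAllWindows);
// The new gesture is purely additive; discovering it is a pleasant surprise,
// not a prerequisite for using the feature.
inputs.bind("gesture:three-finger-swipe-up", showAllWindows);
```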

In fact, this last approach takes advantage of the brain’s chemistry. The prediction error signal is sent when an expected event does not occur and the result is disappointment or failure. But sometimes, the result of a prediction error is delight, not panic. The expected result did not occur, but something better did. David Rock, in Your Brain at Work, observes that this experience of delight or novelty also produces dopamine and feelings of pleasure. The experience is similar to that of humor: jokes often work because the punchline presents an unexpected twist, a novel outcome. More importantly, jokes work best when the stakes are minimal: no one really gets hurt in a pratfall. When jokes cut too close to the bone, they are painful. We cringe.

New interaction designs can be introduced according to the same principle: if they cause unexpected delight, and no one (and no one’s data) gets hurt, they will induce unexpected pleasure and will be quickly adopted over the legacy designs they are meant to replace.



  1. Bold claims in this article. Apparently, Jakob Nielsen seems to disagree about iPad usability:

     http://www.useit.com/alertbox/ipad.html

     • To be fair, many of the problems that the author of the linked article encounters are inside third-party apps or websites, so they are not really a fault of the iPad itself or of Apple. He could have applied the same points to many Android and Windows Phone apps too.

       • Well, what is an iPad without the apps? As Apple is notoriously keen on control, it could enforce stricter requirements on third-party developers to use more reliable design patterns on the iPad. This article states that “You already know how to use it,” but user testing (by Jakob Nielsen and others) clearly shows that isn’t always the case – even for more experienced users.
  2. Granted, the iPhone works great but is far from perfect. The construction of the iPhone seems all style and no function: I have seen more shattered cases on the iPhone, both front and back, than on any other phone. It is the worst example, but many other manufacturers produce similarly non-robust products that do not fit in with people’s lives in the real world; they make people have to adapt, to use kid gloves on these fragile items. So how does it work in the Apple design team? “Yeah, let’s make this uber-cool tech, let’s make it so cool that the first thing anyone does on purchase is cover it up.”
  3. An awesome case of my degree in psychology being useful for once (even if only to understand what Charles means here and to reminisce about hours of cramming for exams). Indeed, the feeling of success is very important – as we know, emotions play a big role in the decisions our users make. I still remember my dad excitedly showing me how he could pinch-zoom on his first iPhone.

     One thing I’d like to add, though, is priming. Although pinch-zooming and scrolling on a touchscreen seem intuitive, there is a chance that we actually know how to do it because of the vast amount of ads and tutorials that we have seen demonstrating this. In fact, many of us bought the device for that function. My point is that it is hard to tell whether something is a learned interaction or whether it has just been successfully adopted from the real world. It might be a mistake to expect real-life objects and interactions to be easily adoptable on other devices. It might take a combination of many more factors and effort to make it work.

     • Hi Dmitri,
       Thanks for the comment. You write: “there is a chance that we actually know how to do it because of the vast amount of ads and tutorials that we have seen demonstrating this.” I completely agree and attribute some of the quick adoption of the iPad to the advertisements, which show the device from the user’s perspective. We usually just see hands interacting with the screen, which helps us identify as users and adopt the new interaction idioms that much more quickly.

     • I think a lot of it was learned via older devices such as the Palm One when it was around. Although it used a stylus, some of the UI concepts are the same. We have also all learned on computers, which makes it easier for us to quickly figure out interfaces.
  4. One problem with some patterns (like swipe to unlock, the home button, etc.) is that, as they become popular and users expect them to exist, some teams might end up not being allowed to implement them due to patent restrictions… unfortunately.

     • Patents on UI patterns are the bane of UX.

       • I do allude to this briefly: “To be sure, inconsistencies across platforms, browsers and software can have many causes, from patent issues to design legacies.” I actually think the effect of patents on usability will make a good next essay.
  5. While your explanation of design and usage patterns sounds nice to read and good to have, the computing world today has a habit of patenting such gestures, features and behaviors so that competitors are locked out from offering the same experience.

     You’re called “inconsistent” if you do not follow the features introduced by the game-changer and a “shameless copier” if you do.

     • Patenting and patent trolling is a definite problem, but mostly for hardware manufacturers. As developers and designers, we are still in a good place: manufacturers fight for these patents so that they can make an API available to the devs who make apps. Nonetheless, it’s still a problem – just clarifying that we’re not under fire on this one yet ;)
  6. Come on, it doesn’t take a stroke of genius to slightly modify the iPhone 4 UI and voilà! Previous users know how to use this one too. Genius. Magic.

  7. It’s true, we technically did know how to use it. However, I found that most non-techies I introduced to the iPad suffered from a mental block stemming from the fact that it was technology. A sort of pseudo-fear of the unknown.
     When the iPad was placed in front of them, they instantly switched into “I don’t know how to use this” mode, in which even a simple press becomes a desperately long pause followed by a shaky extended finger.
  8. Your comments on Android’s deletion behaviour are a little misleading. Android has never implemented swiping to delete as any part of the stock UI, outside of manufacturer skinning. I think you’re talking about Samsung’s TouchWiz skin here. The default deletion behaviour on Android is to long-press the item, which brings up a menu containing relevant actions such as delete. To be honest, I prefer this to Apple’s implementation, as it’s harder to trigger accidentally and offers more interaction options (such as mark as read, label email, etc.).

     • This is a good point, thanks for distinguishing between Android and the TouchWiz skin from Samsung. My point about system behavior inconsistency still applies, but it shouldn’t be directed at Android. The fact that Samsung and others can apply their own interaction layer just makes the problem worse.

  9. “You already know how to use it” just meant “It has the same OS as your iPhone and we made a larger screen.” It takes no design patterns or anything to achieve that effect. It wasn’t directed at non-iPhone users. The iPad is thus a very bad example here, in my opinion.
  10. I have to disagree with some of your points here. A design pattern is functional only if it is intuitive. You use the iOS swipe-to-delete as an example, while it can be argued to be very unintuitive and not even a very good pattern. I also fail to see why you hold it up as a standard and expect other operating systems to blindly follow it. Android reserves swiping for moving between screens, which, in my opinion, is a natural gesture and doesn’t require chance discovery or learning like the swipe to uncover the delete button. I find it sad that our generation’s designers have a very uncritical view of Apple’s design and consider it perfect at face value. I’ve seen this in companies I’ve worked for. It seems to lead to bad designs very easily.

      • I don’t claim that these design patterns are “intuitive.” In fact, I agree with Jef Raskin in The Humane Interface that we too often call something “intuitive” when we really just mean it is easy to learn. I also don’t really apply a value judgment in the essay – I don’t claim (or think) that Apple’s gestures are necessarily better than others. My only point in the section you refer to is that inconsistency between platforms is disconcerting and prevents the process of pattern recognition that I am interested in.
  11. It’s a pity such a good article almost immediately gets bashed because it’s (partly) about iOS. It’s just an example, people; Android is awesome too, have a cookie.

      Anyway, I think most of the “you already know how to use it” effect comes from the fact that most UI patterns rely heavily on skeuomorphism, visualizing elements that mimic real-world interface elements. It’s not that far a stretch from the carousel pattern to something simple like a numeric combination lock with dials on it from 0 to 9. You can only see three numbers at a time, but the rest is just a turn away.

      Hell, even HTML radio buttons are called that because they mimic old-school car radios, where you can only select one option/radio station at a time…
  12. “The Gingerbread update introduced even more inconsistency to the user experience: a right-to-left swipe over a contact, for instance, opens the instant messaging app, and a left-to-right swipe opens the phone app — and initiates a phone call! A user who would naturally expect this gesture to trigger a prompt to delete the contact suddenly finds themselves calling that contact. Talk about panic!”

      A user who would naturally expect this gesture is not an Android user, because this is not a design pattern in Android at all. This gesture belongs to iOS. Also, the user interface shows what action will be executed when the user finishes the swipe by lifting his finger off the screen.

      The most common way to delete an item in Android is to long-press it and choose the option to delete it, or, in bulk-selection mode, to press the trash can at the top of the bar after selecting multiple items.
  13. Some points on the “swipe right to delete” pattern:

      You say, “This design pattern is easy to learn.” I emphatically disagree. I’ve used iOS only a little, but I never knew about it. Take a look at the screen: there is absolutely nothing there to suggest that such an action is possible. You would only find out by accident or by trying various gestures one by one in desperation.

      The pattern may be easy to remember, but the problem is that it’s undiscoverable. For better or worse, this is an approach that Apple take: they prioritise the absence of clutter and the number of interactions over discoverability. Balancing these things is somewhat subjective, and there is no one right answer. Apple’s approach works for Apple because of their dogma: without the consistency (and let’s ignore some things like the iOS 5 iPad App Store), the “clean but undiscoverable” ethos would fail entirely.

      Apple’s UI works very well once you’ve learned it, but learning it is not something that it assists you with at all. Many of the UI operations on my iPad I had to be told about by other people, and I’m sure there are others I have no idea about.

      So whilst Apple’s UIs have many positive traits, let’s please recognise that a number of those traits are achieved through certain necessary tradeoffs, rather than believe that there exists The Perfect UI and it happens to be Apple-flavoured.

      • I’d echo this. When I got my iPhone a few years ago, it took me several days to realise that I could swipe to delete, and I discovered it entirely by accident.

      • I still remember teaching students how to use a mouse when my college at the time got its first Windows computer lab. For them, it wasn’t natural or intuitive, but it was “easy to learn” once I showed them. I agree with what you write above; I just wanted to explain better what I meant by “easy to learn.” Discoverability is important and, as you say, there are tradeoffs.
  14. Always challenge received wisdom on existing patterns. Granted, we shouldn’t spend time trying to fix what is not broken, but without keeping this principle at the forefront we will never innovate on patterns that have become antiquated in our ever-changing UX landscape.
  15. “A user who would naturally expect this gesture to trigger a prompt to delete”
      Why should the iOS way of interpreting this gesture be any more “natural” than the other interpretations?

      • I probably should not have used the word “naturally.” And I meant this as a restrictive clause, i.e., not all users, just users who have learned the pattern previously. So “naturally” refers more to the brain’s processes of looking for and learning patterns than to some implicit value claim, as you interpreted it.
  16. I completely disagree with the bold statements made in the advertisement. I still have friends and colleagues showing me how to use things I didn’t know existed. I would argue that the user interface and controls definitely follow patterns similar to other Apple products, but if you have not used Apple products before, then it certainly isn’t 100% intuitive.

      Still, good article though!

  17. Great article, and I agree with many of your points except where it comes to inconsistency between platforms. How can you define what the “right” pattern is that is “simple” for others to use, and how do you know that convention is the best it can be? For example, in your anecdote about the delete-message swipe gesture on iOS, where you cite other platforms as causing “panic” among users – it would only cause panic for those who were migrating from iOS and had LEARNED that gesture. If iOS was not your first platform, then you would have no problem picking up the other gesture; case in point, I had no idea about this message gesture and didn’t find it intuitive at all until someone showed it to me. In the context of an Android phone, the swipe left/right to open functions is a convenient shortcut, while tap-and-hold to open the menu is an idiom pervasive throughout the platform. Not to say that Android is best, or Apple is best; I’m just trying to be non-partisan here and point out that it takes equal amounts of learning for the same function in this case. Your article seems to imply a first-is-best mentality, when often human beings don’t get it right the first time. I’d hate for all of the industry to follow such thinking.
  18. I kind of liked the way the swipe gesture worked in webOS to quickly clean up my emails. I remember that webOS (and most apps) did have the delete confirmation described here (http://www.webosnation.com/just-type-remove-items-just-type).

      • The only phone I’ve stood in line for was the Palm Pre and I’m still a webOS fan. As I write above, the email application didn’t present a delete confirmation while other applications on the phone did.
  19. “You already know how to use it” is going much too far. When I first got an iPhone, I wanted to call a contact from my address book. I opened the contact and looked for a “call” button. “What the heck?” I said, and one of my friends (who’d had an iPhone for a while) said, “Oh, you just need to tap the number. It’s so intuitive,” he said, “ONCE YOU FIGURE OUT HOW TO USE IT.” This is an example of an iPhone design pattern that I did not find intuitive.

      Another example is the “swipe to delete” you mentioned earlier. This pattern isn’t particularly intuitive, and I don’t see why other mobile apps would necessarily emulate it. In my experience, the Android address book still does not behave this way, as this article suggests. And that’s fine, because it’s not a great design pattern just because Apple does it, IMO.

  20. The “swipe to delete” interaction sounds like a UI nightmare to me, personally. It can easily be performed accidentally, whereas the long press to bring up additional options, including Delete, seems much more practical. Android and Windows Phone use this interaction to great effect.
  21. Interesting and thought-provoking article!

      My one concern is that it draws heavily on the writings of Jonah Lehrer, whose credibility was blemished by the exposure of fabrication in his book _Imagine_.

      Since this news broke just around the time of this article’s publication, neither Smashing Magazine nor Charles Hannon can be held at fault.

      Still, having come to this article only recently (I do a lot of “save for later”-ing), I think an update or mention might be in order acknowledging the potentially dubious credibility of some of Mr. Lehrer’s claims.
  22. Such debates could, of course, be resolved (and are, to a certain degree) through research with people not previously exposed to the technologies in question. I think you will find that it is neither Apple- nor iPad-specific. No, pinching is not a “natural” way to zoom in on a picture; bringing it closer to your face is!
  23. Very good!!

  24. “You already know how to use it”

      And yet someone else had to show me where the Mail button was hiding and how to install an app…

      Apple devices also don’t seem to listen to my fingers. I don’t blame them specifically for this – I seem to have Apple-ignored fingers.
  25. Very good… Just wondering (and suggesting!): why not have a “share on LinkedIn” link here?!
