
Beyond The Button: Embracing The Gesture-Driven Interface

As a mobile UI or UX designer, you probably remember the launch of Apple’s first iPhone as if it were yesterday. Among other things, it introduced completely touchscreen-centered interaction to an individual’s most private and personal device. It was a game-changer.

Today, kids grow up with touchscreen experiences like it’s the most natural thing. Parents are amazed by how fast their children understand how a tablet or smartphone works. This shows that touch and gesture interactions have a lot of potential to make mobile experiences easier and more fun to use.


Challenging Bars And Buttons

The introduction of the “Human Interface Guidelines” and Apple’s App Review Board had a great impact on the quality of mobile applications. It helped a lot of designers and developers understand the core mobile UI elements and interactions. One of Apple’s popular suggestions, for instance, is to use the UITabBar and UINavigationBar components — a guideline that many of us have followed, including me.

In fact, if you can honestly say that the first iPhone application you designed didn’t have any top or bottom bar elements, get in touch and send over a screenshot. I will buy you a beer and gladly tweet that you were ahead of your time.

My issue with the top and bottom bars is that they fill almost 20% of the screen. When designing for a tiny canvas, we should use every available pixel to focus on the content. In the end, that’s what really matters.

In this innovative industry, mobile designers need some time to explore how to design more creative and original interfaces. Add to that Apple’s frustrating rejection of apps that “think outside the box,” and it is no surprise that experimental UI and UX designs such as Clear and Rise took a while to see the light of day. But they are here now. And while they might be quite extreme and focused on high-brow users and early adopters, they show us the great creative potential of gesture-driven interfaces.

Rise and Clear
Pulling to refresh feels very intuitive.

The Power Of Gesture-Driven Interfaces

For over two years now, I’ve been exploring the ways in which gestures add value to the user experience of a mobile application. The most important criterion for me is that these interactions feel very intuitive. This is why creative interactions such as Loren Brichter’s “Pull to Refresh” have become a standard in no time. Brichter’s interaction, introduced in Tweetie for iPhone, feels so intuitive that countless list-based applications suddenly adopted the gesture upon its appearance.
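The trigger logic behind the gesture is simple enough to sketch in a few lines of framework-free TypeScript. This is only an illustration of the idea; the 60-pixel threshold is an assumption for the example, not Tweetie’s actual value:

```typescript
// Minimal sketch of pull-to-refresh trigger logic, independent of any
// UI framework. The 60-pixel threshold is an illustrative assumption.
class PullToRefresh {
  readonly threshold = 60;
  refreshing = false;

  // Called while the user drags the list beyond its top edge:
  // true once the pull is far enough to arm a refresh.
  isArmed(pullDistance: number): boolean {
    return pullDistance >= this.threshold;
  }

  // Called when the user lets go; a refresh starts only if armed.
  release(pullDistance: number): void {
    if (this.isArmed(pullDistance)) {
      this.refreshing = true;
    }
  }
}
```

The point of the pattern is that the same drag the user already performs to scroll a list doubles as the refresh command, so no extra button is needed.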

Removing UI Clutter

A great way to start designing a more gesture-driven interface is to use your main screen only as a viewport to the main content. Don’t feel obliged to make important navigation always visible on the main screen. Rather, consider giving it a place of its own. Speaking in terms of a virtual 2-D or 3-D environment, you could design the navigation somewhere next to, below, behind, in front of, above or hidden on top of the main view. A dragging or swiping gesture is a great way to lead the user to this UI element. It’s up to you to define and design the app.

What I like about Facebook and Gmail on iOS, for instance, is their implementation of a “side-swiping” menu. This trending UI concept is very easy to use. Users swipe the viewport to the right to reveal navigation elements. Not only does this make the app very content-focused, but accessing any section of the application takes only two to three touch interactions. A lot of apps do far worse than that!

Sideswipe Menu
Facebook and Gmail’s side-swiping menu
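The mechanics of such a menu can be sketched in framework-free TypeScript. The 260-pixel menu width and the snap-at-half-width rule below are illustrative assumptions, not Facebook’s or Gmail’s actual values:

```typescript
// Sketch of side-swipe menu mechanics: dragging right reveals a menu
// behind the viewport, and the view snaps open or closed on release.
// The menu width and snap rule are illustrative assumptions.
class SideMenu {
  readonly menuWidth = 260;
  offset = 0;    // how far the main viewport has been pushed right
  open = false;

  // Track the finger while dragging; clamp between closed and fully open.
  drag(dx: number): void {
    this.offset = Math.min(this.menuWidth, Math.max(0, this.offset + dx));
  }

  // On release, snap to whichever state is closer.
  release(): void {
    this.open = this.offset > this.menuWidth / 2;
    this.offset = this.open ? this.menuWidth : 0;
  }
}
```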

In addition to the UI navigation, your app probably supports contextual interactions, too. Adding the same two or three buttons below every content item will certainly clutter the UI! While buttons might seem to be useful triggers, gestures have great potential to make interaction with content more intuitive and fun. Don’t hesitate to integrate simple gestures such as tapping, double-tapping and tapping-and-holding to trigger important interactions. Instagram supports a simple double-tap to perform one of its key features: liking and unliking a content item. I would not be surprised to see other apps integrate this shortcut in the near future.
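As a rough illustration of how such a shortcut works under the hood, here is a framework-free TypeScript sketch of double-tap detection. The 300 ms pairing window is an assumption for the example, not a documented platform value:

```typescript
// Illustrative sketch of double-tap detection: two taps within a short
// window count as one "like" toggle. The 300 ms window is an assumption.
class DoubleTapDetector {
  private readonly windowMs = 300;
  private lastTapAt: number | null = null;

  // Feed in each tap's timestamp (ms); returns true on the second tap
  // of a pair, i.e. when the double-tap action should fire.
  registerTap(timeMs: number): boolean {
    if (this.lastTapAt !== null && timeMs - this.lastTapAt <= this.windowMs) {
      this.lastTapAt = null; // consume the pair so a third tap starts fresh
      return true;
    }
    this.lastTapAt = timeMs;
    return false;
  }
}
```

Consuming the pair after it fires is the design choice that keeps a rapid triple tap from triggering the action twice.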

An Interface That Fits

When designing an innovative mobile product, predicting user behavior can be very difficult. When we worked with Belgium’s Public Radio, my team really struggled with the UI balance between music visualization and real-time news. The sheer number of contextual scenarios and preferences made it very hard to come up with the perfect UI. So, we decided to integrate a simple dragging gesture to enable users to adjust the balance themselves.

By dragging, users can balance music-related content and live news.

This gesture adds a creative contextual dimension to the application. The dragging gesture does not take the user from one section (news or music) to another. Rather, it enables the user to focus on the type of content they are most interested in, without missing out on the other.
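The underlying mapping can be sketched as a single pure function: the drag offset becomes a ratio between the two content types rather than a jump between screens. The 300-pixel track length below is an illustrative assumption, not the value used in the actual app:

```typescript
// Sketch of the drag-to-balance idea: a drag offset adjusts the mix
// between two content types (0 = all music, 1 = all news) instead of
// switching screens. The 300-pixel track length is an assumption.
function contentBalance(dragOffset: number, trackLength = 300): number {
  // Clamp so overscrolling past either end of the track has no effect.
  return Math.min(1, Math.max(0, dragOffset / trackLength));
}
```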

Think in Terms of Time, Dimension and Animation

What action is triggered when the user taps an item? And how do you visualize that it has actually happened? How fast does a particular UI element animate into the viewport? Does it automatically go off-screen after five seconds of no interaction?

The rise of touch and gesture-driven devices dramatically changes the way we design interaction. Instead of thinking in terms of screens and pages, we are thinking more in terms of time, dimension and animation. You’ve probably noticed that fine-tuning user interactions and demonstrating them to colleagues and clients with static wireframe screenshots is not easy. You don’t fully see, understand and feel what will happen when you touch, hold, drag and swipe items.

Certain prototyping tools, including Pop and Invision, can help bring wireframes to life. They are very useful for testing an application’s flow and for pinpointing where and when a user might get stuck. Your application has a lot more going on than simple back-and-forth navigation, so you need to detect interface bugs and potential sources of confusion as soon as possible. You wouldn’t want your development team to point them out to you now, would you?

Invision enables you to import and link your digital wireframes.

To be more innovative and experimental, get together with your client first and explain that a traditional wireframe is not the UX deliverable that they need. Show the value of interactive wireframes and encourage them to include this in the process. It might increase the timeline and budget, but if they are expecting you to go the extra mile, it shouldn’t be a problem.

I even offer to produce a conceptual interface video for my clients, because once they’ve worked with the interactive wireframes and sorted out the details, they will often need something sexier to present to their internal stakeholders.

The Learning Curve Link

When designing gesture-based interactions, be aware that every time you remove UI clutter, the application’s learning curve goes up. Without visual cues, users could get confused about how to interact with the application. A bit of exploration is no problem, but users should know where to begin. Many apps show a UI walkthrough when first launched, and I agree with Max Rudberg that walkthroughs should explain only the most important interactions. Don’t explain everything at once. If it’s too explicit and long, users will skip it.

Why not challenge yourself and gradually introduce creative UI hints as the user uses the application? This pattern is often referred to as progressive disclosure and is a great way to show only the information that is relevant to the user’s current activity. YouTube’s Capture application, for instance, tells the user to rotate the device to landscape orientation just as the user is about to open the camera for the first time.

Visual Hints
Fight the learning curve with a UI walkthrough and/or visual hints.

Adding visual cues to the UI is not the only option. In the Sparrow app, the search bar appears for a few seconds before animating upwards and going off-screen, a subtle way to signal that it’s waiting to be pulled down.

Stop Talking, Start Making

The iPhone ushered in a revolution in interactive communication. Only five years later, touchscreen devices are all around us, and interaction designers are redefining the ways people use digital content.

We need to explore and understand the potential of touch and gesture-based interfaces and start thinking more in terms of time, dimension and animation. As demonstrated by several innovative applications, gestures are a great way to make an app more content-focused, original and fun. And many gesture-based interactions that seem too experimental at first come to be seen as very intuitive.

For a complete overview of the opportunities for gestures on all major mobile platforms, check out Luke Wroblewski’s “Touch Gesture Reference Overview.” I hope you’re inspired to explore gesture-based interaction and intensify your adventures in mobile interfaces. Don’t be afraid to go the extra mile. With interactive wireframes, you can iterate your way to the best possible experience. So, let’s stop talking and start making.

(al) (il)



Thomas Joos is managing partner at Little Miss Robot. He is passionate about transforming innovative business vision into digital products and services. Thomas is also very active in the design community as an inspirational speaker, sharing thoughts and insights on innovative digital creation.

  1.

    I’ve found little to no real insight in this article. Not only do you forget to include any significant actual gesture-driven interfaces besides Clear and Rise, you blatantly include your own firm’s projects without any real insight about the topic itself.
    I’m just disappointed, because there’s so much to say about this topic, but not many people are actually going in depth.

  2.

    Charles Worthington

    May 24, 2013 4:31 am

    While this article is interesting on its own (though quite a bit too short), what really strikes me as interesting is what it represents: designers too ensconced in their own world to see what else is happening.

    An article on gesture driven interfaces and not a single mention about what Matias Duarte has done with Android, where gestures have become the forefront of how you interact with the OS? Android now has far more than half of the mobile OS market. If designers continue to only look at iOS, pretty soon they’re going to look around and say, “Wait, where are my users?”

    Thomas and other designers would be well served to realize this.

    •

      I agree with your comment, Charles. It seems funny to me that this is presented as new when I’ve long been accustomed to it on my Galaxy Note II. I switched from Apple to Android for a reason. I’m not anti-Apple, but it does seem strange that people don’t realize how far behind Apple is when it comes to innovation, at least as it pertains to smartphones.

  3.

    Nice read! I love gesture-based apps. Need to mention one more great gesture-based app, the Beat music player. Really nicely done!

  4.

    Facebook just removed the swipe to the right on my iPhone 5.

  5.

    Samuel Petit

    May 24, 2013 7:41 am

    Hi! Awesome article Thomas.

    I’m working for a French startup that is designing a gesture-driven search engine. Basically, users can search anything without ever typing text and, the truly awesome thing, combine/subtract images with just one touch.

    I think we’ll release a closed beta version soon (well… at least before the end of the year). Would you like to be part of it and give us your insight? That would be great!

    (BTW, any UI/UX expert enjoying Smashing Magazine as much as I do is welcome too ^^ )


    P.S. This part is not UX related but, just so you know, another great feature of the search engine we are designing is that it works with “concepts” and not “keywords”.

    How is that different? 2 short examples:

    1) The user “thumbs up” a picture with an aircraft, “thumbs up” a picture with a red plane, and “thumbs down” a picture of a plane with a propeller. Our search engine will combine these three pieces of information to display mainly red jet aircraft.

    2) If you search “Arsenal” on Google, it shows only Arsenal soccer team pictures. Even if the user tries “Arsenal -Sport”, Google doesn’t understand the concept of “Sport”. When a user does the same on our search engine, it removes all sport-related pictures from the results (and displays all kinds of arsenals, including armories and boat-related ones).

  6.

    I really like gesture-driven interfaces, but they have one big problem: they are not all obvious and intuitive. Only a handful of gestures (e.g. the zoom and rotate gestures) are widely learnt and can be used without any doubt. After my research on gestural interfaces for screen-based devices, I was a little disappointed by how much the design of gestures differs. A bigger set of gestures, or more complex gestures, are very difficult to memorize. Marking menus could be a solution for this problem, but I have not seen many good examples, especially for multitouch screens… However, gesture-based text input via Swype is really cool and proves the power of gestural interfaces.

  8.

    Peter Winnberg

    May 25, 2013 9:45 pm

    Interesting article, but I think that it misses the usability problems that gesture-driven user interfaces sometimes have.

    When we are designing mobile applications we sometimes incorrectly assume that the gestures we use in our applications are so natural that we don’t need to follow basic usability principles. But visibility is still a very important principle to consider when designing mobile apps.

    Consider this: if an application uses 100% of the screen and no elements on the screen indicate that more content or navigation is available, why should a user try to explore the application further? But let’s say that the user decides to do that anyway; which gesture should they use? Should they guess? Some sort of hint is needed. An invisible user interface can be seductively beautiful, but because it is invisible it is likely to have many issues like this.

    But maybe after learning more about how the application works the user could turn off some navigational aids, like hints about which gesture to use, in the preferences. But this could still lead to the same problems again if the user forgets details about the app.

    Another important usability principle is consistency, and having the same UINavigationBar means that there will be some consistency between different apps. But it also uses screen space, and on a small screen this can be a problem. If you look at websites instead of apps, moving from one website to another can change the user experience completely. Because of this, maybe users could get used to losing some consistency in exchange for that extra screen space. But there could be other ways to solve this. One would be to redesign the UINavigationBar so that it auto-hides, leaving only a small button in the left or right corner of the screen to make it reappear. That would make most of the screen space available to the app most of the time.

  9.

    Paul Thompson

    May 26, 2013 3:55 pm

    Thanks… good points to ponder for sure.

  10.

    Jon Harthun

    May 26, 2013 8:15 pm

    Seriously, what is next?
    -A thought driven interface? -_- …

  11.

    I have an app, Zomato, in which you have to shake the phone to view the next listing.

  12.

    Justin Megahan

    May 28, 2013 12:25 am

    Good thoughts. I think the learning-curve problem will take care of itself as more and more designs adopt gesture-based interaction. Already the side-swiping menu and pull-to-refresh seem completely natural.

  13.

    This is really useful! Thank you for the inspiring article.

  14.


    May 30, 2013 8:49 am

    Very inspiring article. It is almost impossible though to find map-based apps that do NOT rely on the top or bottom bar to access the secondary contents – that is because any gesture would interfere with the interaction possibilities on the map (zooming, panning, and so on). Do you think there is any chance to bring innovation into this field?

  15.

    What are the results of usability testing on gesture driven interfaces?

  16.

    Jon-Eric Steinbomer

    June 28, 2013 12:54 pm

    “Today, kids grow up with touchscreen experiences like it’s the most natural thing.” We see this time and again with our user research studies – kids approach a screen or device and start pressing, swiping and pinching away.

    I’d argue that touch is currently a more natural interaction than NUI/Kinect-driven experiences, simply due to the tactile feedback. It will be interesting to see how this evolves as more Glass-like UI comes online.

    Good discussion!

  17.

    Gestures are interesting when thinking of testing. We are almost going to want to balance in-lab testing with field research ‘out in the wild’ to know how users are interacting with their devices in real settings.

