
How To Build Honest UIs And Help Users Make Better Decisions

Many apps today, such as Google Now, Spotify and Amazon, make assumptions about user preferences based on personal data. They may even use this information to make decisions on our behalf, without any direct input from us. For example, Facebook tailors your news feed and Amazon recommends products — both hiding “irrelevant” information and only showing what they think you will like.

This type of design pattern, where user choice is removed, has recently been coined “anticipatory design”. Its aim is to leverage data on user behavior to automate decision-making in user interfaces, cutting down the excessive number of decisions people currently make and, in turn, reducing decision fatigue and improving decisions overall.

Further Reading on SmashingMag: “Why User Experience Cannot Be Designed” [1], “Effectively Planning UX Design Projects” [2], “25 User Experience Videos That Are Worth Your Time” [3], “Better User Experience Using Storytelling” [4] and “10 Principles Of Effective Web Design” [5]

Despite the good intentions behind anticipatory design, though, automating decisions raises trust issues, especially at a time when trust has already been eroded by the use of dark patterns [6] in user interfaces. So, in contrast to deceitful dark patterns that are meant to trick users, this article looks at how we can give people confidence in the decisions made for them by using “light patterns”: patterns that keep user interfaces honest and transparent, and that can even nudge users to make better decisions for themselves.

First, Why Decide For Your User?

In today’s online world, consumers face more options than ever before. For example, consider shopping in marketplaces such as Amazon and eBay. Even when we know exactly what we want (replacement Apple headphones, for example), the choice can still be overwhelming:

The overwhelming number of options for the exact same product on Amazon and eBay (Images: Amazon [8] and eBay [9])

Another example is all-you-can-eat music services, such as Spotify, which put huge amounts of music at our fingertips, with no extra cost to listen to more. The additional choices quickly add up:


The overwhelming choice of music on Spotify (Image: Spotify [11])

While more choice is welcome, too much of it can create a daunting experience for the user, because actually making a decision becomes difficult. This problem has been highlighted extensively before, most notably by Barry Schwartz’s paradox of choice [12] and Hick’s Law [13].

Both suggest that by reducing the amount of choice in a user interface, we can improve a user’s ability to make decisions, thereby reducing frustration and improving the overall experience.
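
Hick’s Law is commonly expressed as T = b · log2(n + 1): decision time grows with the logarithm of the number of equally likely options. A minimal TypeScript sketch (the coefficient b here is an arbitrary illustrative value, not an empirical constant):

// Hick's Law: decision time grows logarithmically with the number of choices.
function decisionTime(optionCount: number, b = 0.2): number {
  return b * Math.log2(optionCount + 1);
}

console.log(decisionTime(5).toFixed(2));   // roughly 0.52 seconds for 5 options
console.log(decisionTime(500).toFixed(2)); // roughly 1.79 seconds for 500 options

Even with logarithmic growth, every extra option adds to the time and effort a decision takes.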

Articles about “decision fatigue” back this up, stating that making a large number of decisions can cause people to be less effective at making the important decisions in life. That’s why Mark Zuckerberg wears the same style of clothes [14] every day:

I really want to clear my life to make it so that I have to make as few decisions as possible about anything except how to best serve this community.

How To Reduce Choice

Reducing the number of choices for a user has, therefore, become the focus for many of today’s apps. This has been done in a number of ways, two of which we’ll discuss now.

Make Options More Relevant

Many products are personalized to individual preferences, limiting options only to those deemed relevant to the current user. This has been done famously by Amazon, both on its website and through tailored email recommendations based on data collected on customers:

Tailored recommendations by Amazon (Image: Amazon [16])
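
To illustrate the general idea (a hypothetical sketch, not Amazon’s actual system), a simple content-based filter might score catalogue items against a profile built from the user’s purchase history and surface only the top few:

// Hypothetical content-based recommender: score items by how many category tags
// they share with the user's purchase history, then keep the top few.
interface Item { id: string; tags: string[]; }

function recommend(history: Item[], catalogue: Item[], limit = 5): Item[] {
  // Count how often each tag appears in the user's purchase history.
  const interest = new Map<string, number>();
  for (const item of history) {
    for (const tag of item.tags) {
      interest.set(tag, (interest.get(tag) ?? 0) + 1);
    }
  }

  // Score unseen items by how familiar their tags are, and return the best matches.
  const score = (item: Item) =>
    item.tags.reduce((sum, tag) => sum + (interest.get(tag) ?? 0), 0);

  return catalogue
    .filter((item) => !history.some((h) => h.id === item.id))
    .map((item) => ({ item, relevance: score(item) }))
    .sort((a, b) => b.relevance - a.relevance)
    .slice(0, limit)
    .map((entry) => entry.item);
}

Everything the user sees then becomes a function of what they have already bought, a property that will matter again later when we look at limiting information.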

Anticipate Decisions

Recommendations such as those above might not be enough to reduce the difficulty of choice, because users are still faced with the many relevant options that have been filtered through. This is where products can go a step further by making decisions on the user’s behalf, totally removing the burden of choice.

For example, apps such as Google Now [18] are increasingly carrying out actions for users, without any direct user input:

Examples of anticipatory design in Google Now (Image: Google [20])

Google Now makes a lot of decisions in the background, from finding out where you’ve parked your car to searching for football scores — and notifying you at the right time without even having to be asked:

Google Now: “What you need, before you ask.” (Image: Google [23])

Spotify takes a similarly assumptive approach by creating playlists for users before they even think to. As stated in the announcement [25]:

It’s like having your best friend make you a personalised mixtape every single week.

Spotify Discover Weekly’s personalized playlists (Image: Spotify [27])

The tasks of searching for new music and deciding which tracks to add to a playlist are carried out for you.

This notion of making decisions for users has been called “anticipatory design” [29] and has become a topic of debate because of the ethics involved in deciding on users’ behalf.

Creating Trust In Anticipatory Design

In the process of reducing choices and making decisions for people using the approaches outlined above, one could be accused of being presumptuous about what users want. This can create distrust if the app doesn’t do what the user expects, especially at a time when many apps have been exposed for exhibiting dark patterns [30], tricking users into doing things they don’t want to do.

Consequently, the more decisions an app makes for users, the more transparent it needs to be in order to maintain trust. This means avoiding certain dark behaviors and instead favoring “light patterns,” which keep users informed and in control, even when anticipatory design is used. Let’s explore a few of these light patterns.

Avoid Limiting Information

When options are filtered away to show users more of what they might like (through app personalization and recommendation systems), an inherent problem emerges: users begin to see more and more of the same type of content:

Amazon recommendations based on browsing history (Image: Amazon [32])

This can make the discovery of new things tricky. It is evident not only on e-commerce websites such as Amazon, but also on social media websites such as Facebook. As Time magazine states [34]:

Facebook hopes to show more links to people who click lots of links, more videos to people who watch lots of videos and so forth.

Many users might not be happy with this because they don’t want brands to determine what they see. For instance, Joel Spolsky, CEO of Stack Overflow, accuses Facebook of hiding information [35]:

Facebook is not showing all posts. It is choosing what to show you. An interesting question is to what extent does the Facebook algorithm tend to reinforce your preconceptions? Because that’s what it has been trained to do.
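
To see how that reinforcement can emerge even without any intent to deceive, consider a hypothetical feed ranker that simply orders stories by the user’s past engagement with each topic:

// Hypothetical engagement-based ranker. Every click raises the weight of that
// story's topic, so the feed drifts towards what the user already engages with.
interface Story { id: string; topic: string; }

const engagement = new Map<string, number>();

function recordClick(story: Story): void {
  engagement.set(story.topic, (engagement.get(story.topic) ?? 0) + 1);
}

function rankFeed(candidates: Story[]): Story[] {
  // Stories from frequently clicked topics float to the top; unfamiliar topics sink,
  // get fewer clicks, and sink further: a self-reinforcing loop.
  return [...candidates].sort(
    (a, b) => (engagement.get(b.topic) ?? 0) - (engagement.get(a.topic) ?? 0)
  );
}

Nothing in this sketch is malicious, yet the loop steadily narrows what the user sees.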

Give the User Control

One way to avoid limiting information is to make it easier for users to improve the assumptions that are made about them, through feedback mechanisms.

This can be done in different ways, from obvious (and, therefore, easier) mechanisms to less obvious ones:

Feedback mechanisms used by Google, Facebook and Amazon
  • Google Now (top left) prompts users directly underneath its Now cards to check that the information shown is relevant.
  • Facebook (top right) is slightly less obvious, employing a dropdown caret in the top right of each news item. Clicking the caret reveals options to hide news you don’t want to see.
  • Amazon (bottom) makes it even more difficult to tailor recommendations. You need to navigate to “Your Account” → “Your Recommendations” → “Improve Recommendations” to adjust what it shows you.

Of these three examples, Google offers the most transparent feedback mechanisms, giving multiple obvious interactions for users to provide feedback on cards, ensuring that the user is in control:

Google Now: “You’re in control.” (Image: Google [39])

As well as swiping cards, you can also access customization settings from the menu icon on each card:

Customizing Google Now (Image: Google [42])

In the case of Facebook and Amazon, even though users can provide feedback to tailor what they see, the underlying news feed and recommendation algorithms have greater control, as outlined by Joel Spolsky [43].
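
As a sketch of the underlying idea (hypothetical, not any of these products’ actual APIs), a “hide this” or “not interested” action can feed straight back into the interest profile that the recommender reads from:

// Hypothetical "not interested" feedback loop: the user's explicit signal
// down-weights a topic in the same profile the recommender reads from.
type InterestProfile = Map<string, number>;

function notInterested(profile: InterestProfile, topic: string): void {
  // Halve the topic's weight rather than zeroing it, so a single tap
  // doesn't permanently hide an entire category.
  profile.set(topic, (profile.get(topic) ?? 1) * 0.5);
}

function weightFor(profile: InterestProfile, topic: string): number {
  return profile.get(topic) ?? 1; // unseen topics keep a neutral weight
}

const profile: InterestProfile = new Map([["lamps", 4]]);
notInterested(profile, "lamps"); // the weight for "lamps" drops from 4 to 2

The more directly such a signal reaches the underlying model, the more the user, rather than the algorithm, stays in control.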

Avoid Disguising Ads as Content

Disguising ads as content is a common dark pattern [44], and it can happen when actions are carried out without the user’s explicit consent.

As an example, Google Now recently partnered with brands such as Lyft, Airbnb, Uber and Instacart to prompt users with services from those apps at the times it thinks they are needed. While cards from third-party services can be useful, when the cards promote paid services they can start to look like another form of advertising:

Google Now partner services (Image: Google [46])

When similar dark shades of design can be seen in related products, the motivation behind anticipatory decisions becomes more suspect. Google Maps is a good example of this, appearing to disguise ads as pins [48] on map search results:

Google Maps disguises ads as pins (Image: The Next Web [50])

Make Use of Existing User Input

When making assumptions about users, it’s important that they be accurate. A tried and tested way to do this is to make use of previous user input, as seen in web browser features such as pre-populated forms, or in remembering credit-card details and passwords in anticipation of future use:

Google Chrome’s pre-populated forms

This saves users from having to repeat the same tasks. The same principle can be applied when making more complex assumptions that combine multiple streams of data. Campaign Live [53] highlights an example of this in discussing how the taxi service Hailo’s “Now card” combines time, geolocation and previous user input to rebook taxis in Google Now:

Let’s say you come into London and you book a Hailo cab, and you go into a particular area between 7am and 10am. If you’re still there at 5pm, there’s an assumption you may want to leave and that’s when the Google Now card would prompt you to book a cab.

The assumption is likely to be more accurate in this case (and will appear less like a disguised ad) because the offer is based on a previous booking made by the user via the same service, within a reasonable time period:

Hailo card prompts the user based on a previous action (Image: Google [55])
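
A minimal sketch of the kind of rule described above, with hypothetical types and thresholds rather than Hailo’s or Google’s real logic: prompt a return booking only when there is a recent booking, the user is still near the drop-off point, and enough of the day has passed:

// Hypothetical rule for a "book a cab home?" prompt, combining previous input
// (an earlier booking), geolocation and time of day.
interface Booking { dropOff: { lat: number; lng: number }; at: Date; }
interface Location { lat: number; lng: number; }

// Rough distance in kilometres; good enough for a "still in the same area" check.
function kmBetween(a: Location, b: Location): number {
  const dLat = (a.lat - b.lat) * 111;
  const dLng = (a.lng - b.lng) * 111 * Math.cos((a.lat * Math.PI) / 180);
  return Math.hypot(dLat, dLng);
}

function shouldPromptReturnCab(last: Booking | undefined, here: Location, now: Date): boolean {
  if (!last) return false; // no previous input, so make no assumption
  const hoursSince = (now.getTime() - last.at.getTime()) / 3.6e6;
  const laterSameDay = hoursSince > 6 && hoursSince < 14; // morning trip, evening prompt
  const stillNearDropOff = kmBetween(here, last.dropOff) < 1;
  return laterSameDay && stillNearDropOff;
}

Because every condition is grounded in something the user actually did, the resulting prompt feels more like a service than an ad.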

Let Users Opt Out

Even when they can customize the recommendations given to them, some people don’t want apps to make decisions for them at all. In this case, opting out must be easy. Even though you can’t delete the Google Now app, you can disable Now cards in the settings:

Google Now lets users disable Now cards. (Image: Google [57])

In contrast, there is no way to turn off Amazon recommendations, unless you log out completely — which makes sense (for Amazon) because 35% of product sales [58] are a result of recommendations, according to VentureBeat.
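
Making the opt-out easy can be as simple as one explicit preference that the personalization layer respects everywhere. A hypothetical sketch:

// Hypothetical: a single explicit preference that gates all personalization.
interface Preferences { personalizationEnabled: boolean; }

function feedFor(prefs: Preferences, personalized: string[], generic: string[]): string[] {
  // If the user has opted out, fall back to an unpersonalized view
  // instead of silently personalizing anyway.
  return prefs.personalizationEnabled ? personalized : generic;
}

The difficult part is rarely technical; as the Amazon example shows, it is whether the product is willing to honor such a flag at all.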

A question remains, therefore, as to whether features that record and make use of user data in these ways should require an explicit opt-in, rather than being switched on by default. There is a big difference between opt-in by choice and presumed consent, as shown in this example of organ donors from Dark Patterns [59]:

The difference between opt-in by choice and presumed consent for organ donors (Image: Dark Patterns [59])

Basically, where consent is presumed by default (and people must actively opt out), nearly 100% of people are registered as organ donors, whereas where people must actively opt in, the percentage is very low.

Use Dark Patterns To Help People

It’s evident that companies use dark patterns to advance their own agendas, and it’s even easier today with the tools that help companies make decisions on behalf of users. However, what if similar methods could be used to help people make better decisions for themselves? Currently, many of us make poor decisions due to human frailties such as lack of self-control or a focus on short-term gains.

Nudge People to the Right Option

In their book Nudge: Improving Decisions About Health, Wealth, and Happiness [62], Richard Thaler and Cass Sunstein suggest creating an ethical “choice architecture” [63] that nudges the user towards choosing the best overall option in the long run.

Nudge: Improving Decisions About Health, Wealth, and Happiness (Image: Wikipedia [65])

In this vein, the techniques we have seen being used to create dark patterns can also be used to form light patterns that nudge users to make better choices.

Auto-Enrolment

For example, as life expectancy has increased, it has become important for people to save for old age through pension plans such as the US 401(k) [66]. However, as explained by Thaler and Sunstein, even though these plans offer “free money,” many people still don’t choose to enroll. Possible solutions they suggest to help people save for old age include:

  • automatically enrolling people (similar to the organ donor example),
  • forcing people to make a simple yes or no decision on whether to enroll.

These approaches are examples of light patterns because they serve to benefit users, pushing people to take action and to make good long-term decisions. Even though the latter approach forces people to decide, it simplifies the decision to an easy binary choice, which encourages people to participate.
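
To make the “free money” point concrete, here is a small sketch with illustrative numbers (a hypothetical 50% employer match on a 5% contribution, not figures from Thaler and Sunstein):

// Hypothetical illustration of why skipping an employer-matched plan leaves
// "free money" on the table: a 50% match is an instant 50% return on each contribution.
function yearlyContribution(salary: number, contributionRate: number, matchRate: number) {
  const employee = salary * contributionRate;
  const employer = employee * matchRate; // the "free money"
  return { employee, employer, total: employee + employer };
}

console.log(yearlyContribution(40000, 0.05, 0.5));
// { employee: 2000, employer: 1000, total: 3000 }, i.e. 1000 gained just by enrolling

Defaults do the rest: when enrollment is automatic, inertia works for the saver instead of against them.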

Create Good Behavioral Patterns

Alan Shapiro suggests [67] that anticipatory apps could actually encourage behavioral patterns in users. By being constantly advised where to go and what to buy, people could become conditioned by app notifications and the decisions made on their behalf.

This could make for some scary scenarios when a company is primarily interested in selling you products, because it’s more likely to instill behavior that favors impulse purchases and the use of its services. For instance, Amazon’s new Prime Pantry service is full of shady patterns, starting with its Pantry Boxes, which encourage people to buy more than they intended:

Amazon’s Pantry Box encourages purchasing habits (Image: Redefined Mom [69])

As put by Matt Crowley [71], head of product at Circadia:

Amazon has shifted the conversation away from “do I need this?” to “what else do I need to fill up this box?”

Amazon has even gone as far as filing patents for a system that leverages user data to predict and deliver products before the customer has placed an order, which it calls “anticipatory shipping” [72]:

Amazon’s anticipatory shipping patent diagram (Image: TechCrunch [74])

Putting these motives aside, what if the same tactics could be used to help people form good behaviors and habits? There are plenty of examples of this today, with the emergence of many self-improvement and habit-forming apps.

For example, stickK [76, 77] helps you kick bad habits by using “the psychological power of loss aversion and accountability to drive behavior change.”

stickK helps you kick bad habits. (Image: stickK [79])

Duolingo [81] reminds you to practice your new language every day, helping you to form a beneficial habit.

Duolingo helps you form language-learning habits. (Image: Upquire.com [83])

From what we see above, the benefits people get from decisions being made on their behalf in anticipatory design are largely determined by the ethics of the company behind the app. How willing is a company to exploit customer data for its own purposes, and how much data are users willing to trade for convenience?

As explained throughout, giving users control and staying transparent are key to maintaining trust. What do you think about dark patterns used in anticipatory design? Do light patterns really exist, and who is in control when design assumptions are made?

(il, yk, al)

Footnotes

  1. https://www.smashingmagazine.com/2011/03/why-user-experience-cannot-be-designed/
  2. https://www.smashingmagazine.com/2013/01/effectively-planning-ux-design-projects/
  3. https://www.smashingmagazine.com/2010/01/25-user-experience-videos-that-are-worth-your-time/
  4. https://www.smashingmagazine.com/2010/01/better-user-experience-using-storytelling-part-one/
  5. https://www.smashingmagazine.com/2008/01/10-principles-of-effective-web-design/
  6. http://www.darkpatterns.org
  7. https://www.smashingmagazine.com/wp-content/uploads/2016/09/1-A9vSF2GIo4k005KHHKOOqg-opt.png
  8. https://amazon.co.uk
  9. https://ebay.co.uk
  10. https://www.smashingmagazine.com/wp-content/uploads/2016/09/1-A9vSF2GIo4k005KHHKOOqg-opt.png
  11. https://spotify.com
  12. http://www.ft.com/cms/s/9cebd444-cd9c-11de-8162-00144feabdc0,Authorised=false.html?siteedition=uk&_i_location=http%3A%2F%2Fwww.ft.com%2Fcms%2Fs%2F0%2F9cebd444-cd9c-11de-8162-00144feabdc0.html%3Fsiteedition%3Duk&_i_referer=&classification=conditional_standard&iab=barrier-app
  13. https://www.smashingmagazine.com/2012/02/redefining-hicks-law/
  14. http://www.independent.co.uk/news/people/why-mark-zuckerberg-wears-the-same-clothes-to-work-everyday-a6834161.html
  15. https://www.smashingmagazine.com/wp-content/uploads/2016/09/amazon-choice-opt.png
  16. https://amazon.co.uk
  17. https://www.smashingmagazine.com/wp-content/uploads/2016/09/amazon-choice-opt.png
  18. https://www.google.co.uk/landing/now/
  19. https://www.smashingmagazine.com/wp-content/uploads/2016/09/google-now-opt.png
  20. https://google.co.uk
  21. https://www.smashingmagazine.com/wp-content/uploads/2016/09/google-now-opt.png
  22. https://www.smashingmagazine.com/wp-content/uploads/2016/09/google-now-overview-opt.png
  23. https://www.google.co.uk/landing/now/
  24. https://www.smashingmagazine.com/wp-content/uploads/2016/09/google-now-overview-opt.png
  25. https://press.spotify.com/uk/2015/07/20/introducing-discover-weekly-your-ultimate-personalised-playlist/
  26. https://www.smashingmagazine.com/wp-content/uploads/2016/09/spotify-discover-weekly-opt.png
  27. http://spotify.com
  28. https://www.smashingmagazine.com/wp-content/uploads/2016/09/spotify-discover-weekly-opt.png
  29. https://www.smashingmagazine.com/2015/09/anticipatory-design/
  30. http://alistapart.com/article/dark-patterns-deception-vs.-honesty-in-ui-design
  31. https://www.smashingmagazine.com/wp-content/uploads/2016/09/amazon-lamps-opt.png
  32. http://amazon.co.uk
  33. https://www.smashingmagazine.com/wp-content/uploads/2016/09/amazon-lamps-opt.png
  34. http://time.com/3950525/facebook-news-feed-algorithm/
  35. http://www.techworld.com/social-media/facebook-makes-me-angry-stack-overflows-ceo-joel-spolsky-on-developers-ethical-choices-3644318/
  36. https://www.smashingmagazine.com/wp-content/uploads/2016/09/feedback-mechanisms-opt.png
  37. https://www.smashingmagazine.com/wp-content/uploads/2016/09/feedback-mechanisms-opt.png
  38. https://www.smashingmagazine.com/wp-content/uploads/2016/09/google-control-opt.png
  39. https://www.google.com/search/about/learn-more/now/#cards
  40. https://www.smashingmagazine.com/wp-content/uploads/2016/09/google-control-opt.png
  41. https://www.smashingmagazine.com/wp-content/uploads/2016/09/google-control-animated.gif
  42. https://www.google.com/search/about/learn-more/now/#cards
  43. http://www.techworld.com/social-media/facebook-makes-me-angry-stack-overflows-ceo-joel-spolsky-on-developers-ethical-choices-3644318/
  44. http://darkpatterns.org/disguised-ads/
  45. https://www.smashingmagazine.com/wp-content/uploads/2016/09/google-now-services-opt.png
  46. https://www.google.com/search/about/learn-more/now/#cards
  47. https://www.smashingmagazine.com/wp-content/uploads/2016/09/google-now-services-opt.png
  48. http://thenextweb.com/google/2016/05/24/youll-soon-see-promoted-pins-google-maps-searches/#gref
  49. https://www.smashingmagazine.com/wp-content/uploads/2016/09/google-disguised-ads-opt.png
  50. http://thenextweb.com/google/2016/05/24/youll-soon-see-promoted-pins-google-maps-searches/#gref
  51. https://www.smashingmagazine.com/wp-content/uploads/2016/09/google-disguised-ads-opt.png
  52. https://www.smashingmagazine.com/wp-content/uploads/2016/09/browser-autocomplete-opt.png
  53. http://www.campaignlive.com/article/brands-show-cards-google/1332234#wMhCR4FsQU1PWOG7.99
  54. https://www.smashingmagazine.com/wp-content/uploads/2016/09/hailo-card-opt.png
  55. https://www.google.co.uk/landing/now/
  56. https://www.smashingmagazine.com/wp-content/uploads/2016/09/google-turn-off-opt.png
  57. https://www.google.co.uk/landing/now/
  58. http://venturebeat.com/2006/12/10/aggregate-knowledge-raises-5m-from-kleiner-on-a-roll/
  59. http://darkpatterns.org/
  60. https://www.smashingmagazine.com/wp-content/uploads/2016/09/dark-patterns-opt.jpeg
  61. http://darkpatterns.org/
  62. https://en.wikipedia.org/wiki/Nudge_(book)
  63. https://en.wikipedia.org/wiki/Choice_architecture
  64. https://www.smashingmagazine.com/wp-content/uploads/2016/09/nudge-opt.jpg
  65. https://en.wikipedia.org/wiki/Nudge_(book)
  66. https://en.wikipedia.org/wiki/401(k)
  67. http://www.fastcodesign.com/3045039/the-next-big-thing-in-design-fewer-choices
  68. https://www.smashingmagazine.com/wp-content/uploads/2016/09/amazon-pantry-opt.png
  69. http://redefinedmom.com/how-does-amazon-prime-pantry-work/
  70. https://www.smashingmagazine.com/wp-content/uploads/2016/09/amazon-pantry-opt.png
  71. https://medium.com/@mwcrowley/the-sketchy-ux-decisions-behind-amazons-prime-pantry-f7cf12878c17#.8lhlf42sv
  72. http://www.forbes.com/forbes/welcome/?toURL=http://www.forbes.com/sites/onmarketing/2014/01/28/why-amazons-anticipatory-shipping-is-pure-genius/&refURL=&referrer=#14fea58d2fac
  73. https://www.smashingmagazine.com/wp-content/uploads/2016/09/amazon-anticipatory-shipping-opt.png
  74. https://techcrunch.com/2014/01/18/amazon-pre-ships/
  75. https://www.smashingmagazine.com/wp-content/uploads/2016/09/amazon-anticipatory-shipping-opt.png
  76. http://www.stickk.com/
  77. http://www.stickk.com/tour
  78. https://www.smashingmagazine.com/wp-content/uploads/2016/09/stick-opt.png
  79. https://stickk.com/
  80. https://www.smashingmagazine.com/wp-content/uploads/2016/09/stick-opt.png
  81. https://www.duolingo.com/
  82. https://www.smashingmagazine.com/wp-content/uploads/2016/09/duolingo-opt.png
  83. http://upquire.com


Graeme is the creator of a popular UX Prototyping Tools site, where he does everything from content strategy and design to front-end development. He’s always making new stuff and enjoys sharing what he’s learning on his blog. He’s also employed by IBM as a designer on the IBM Watson Internet of Things.

Comments

  1.

    Sort of amusing that an article about reducing user fatigue features images that, when clicked to view larger, simply load the image and force the user to click back, reloading the entire page, just to continue reading.

    How is it that no one thought to include modal-viewing of larger images?

    • I was recently faced with the same problem: using an overlay seemed like a solution, but having it cover (and block) the entire content didn’t help either.

      I finally arrived at the concept of showing the overlay within the content, being able to direct user focus without sacrificing the accessibility and navigation of the page.

      Here’s a demo of the concept, which can be taken to many directions http://codepen.io/indrekpaas/pen/vXjrpo

  2.

    Auldyn Matthews

    October 19, 2016 5:02 pm

    I appreciate the discussions that the author brings up here, especially as more and more data is being collected. How do we create great experiences with users and not break their trust? How do we not annoy users?

    One note on the Duolingo piece: while I like the idea of reminders, often the implementation of a daily reminder gets annoying quickly. I’ve noticed with many apps I quickly end up ignoring them or turning them off. Changing behavior can take anywhere from 30-90 days, so I wonder how we, as designers, can engage users for that duration without just pushing out notifications.

    • Graeme Fulton

      October 20, 2016 9:28 am

      I agree with your point on Duolingo and receiving too many notifications. When many apps are doing the same thing, the notifications become useless and annoying to me.

      I think it’s good, though, that they actually stop sending you notifications themselves if you don’t act upon them.

      It still doesn’t quite solve the notification overload, but at least they’ve thought about it.

      • Graeme Fulton

        October 20, 2016 9:30 am

        *Good how Duolingo stops sending the notifications; they say, “These reminders don’t seem to be working. We’ll stop sending them for now.”

  3.

    I for one hate ‘anticipatory’ / customised design – it should be burned with fire, anybody that uses it should be sold into slavery and the earth where it was built salted, as it is a LAZY way of allowing users to explore, and it is invariably done badly.

    Eli Pariser, in his famous TED talk, talked about the danger of info bubbles (e.g. if Facepalm decides you are a liberal, you will only see posts and promoted stories from liberals), which can damage social cohesion – some people are actually shocked that literally nobody they know holds a different opinion to them, and have an existential crisis when they realise that others hold a contrary position (Brexit is a great example). Also the social media platforms trying to “nudge” people’s opinions towards one or other of the US presidential candidates, including shadowbanning and manual intervention to suppress positive stories and trending #tags that support one or other (I cannot vote in the election, I am an interested observer).

    Another case in point: YouTube has decided that because I am interested in medieval church music (subscriptions), I am interested in Vevo channels featuring pop stars or heavy metal… and it has also worked out that john.doe@gmail is the same as fred.dagg@….edu (the former account reflects my true interests, whilst the latter has a different subscription set that reflects what I use in class) and keeps trying to pollute, i.e. make recommendations based on one account, with the other. Do I really need a NSFClassroom Miley Cyrus recommendation when I have a video of data flow diagrams up, or an advertisement for Weebly, or the bit I really want the students to focus on obscured by a pop-up for life insurance? Other teachers are finding the same, and instead of streaming video from YouTube, are pre-downloading the video.

    One danger of anticipatory design is that it requires lots and lots and lots and lots of personal information; for me there is nothing in Google Now that makes a compelling case for the data slurp (or battery drain) that needs to happen for it to work out that I might need to book a cab. A colleague just had this happen: a week ago he opened a PDF in Gmail that had an invoice in it for his child’s school fees, and today he got a reminder from Gmail that they are due tomorrow (he paid them yesterday)! He has not asked for the reminder, so as a result he is switching to another email provider that is not, as he puts it, as ‘creepy’ as Gmail, and has decided that next year he will not use any Google products in his teaching practice.

    As for nudging – there is nothing less ethical. WHO decides what is good? The same fachidioten that say I should eat xyz to prevent heart disease, but xyz is then found to cause diabetes or cancer or alzheimers; or if I eat abc then I double my chances of getting mno – when my chance of getting mno is 1 in a million anyway.

    I am not a slave to my device(s) and do not suffer FOMO (fear of missing out), so all notifications for ALL software on my devices are turned off; when I want something, I decide to look for it. I run my device; my device does not run me.

    I realise that I am probably not your typical use case, but there are many like me out there.

    • Great examples here!

      What you say about ‘nudging’ reminded me of a documentary on the BBC by Dr Chris van Tulleken, called “The Doctor Who Gave Up Drugs”. He showed how pharmaceutical companies push doctors into prescribing drugs to patients for almost any problem, such as depression and diabetes. This happened so much that prescribing drugs became the norm and the ‘correct’ thing to do.
      He carried out an experiment to cure patients without drugs, and found that most problems could be solved with lifestyle changes such as introducing exercise.

      Who’s to say what’s right – there seem to be a lot of influences at play, with money and corporate greed being among the biggest, in my opinion.

