While the digital detox has become our primary strategy for dealing with the challenges of digital life, app creator and writer Rohan Gunatillake thinks that our urge to unplug is unfortunate and unnecessary. Like it or not, he says, technology is here to stay and pathologizing our relationship with it is not a sustainable path forward. Instead we can use our relationship with technology as an opportunity to practice mindfulness.
However, despite his optimistic view (and despite being one of the few people on the planet who makes a living from making mindfulness apps), Gunatillake remains a vocal critic of the technology industry. He says that developers and designers “need to grow up” by making products that aren’t created to “make users feel bad about themselves.”
Gunatillake is the creator of the buddhify app and author of Modern Mindfulness: How to Be More Relaxed, Focused, and Kind While Living in a Fast, Digital, Always-On World. We recently spoke with him at a coffee shop in Brooklyn.
You’ve written that you “intensely dislike the term digital detox.” Why?
First I’d like to mention that I think the practice of spending time away from our devices and making sure we’re not overly dependent on them is a valuable thing to do. But in the current conversation about well-being and technology, this seems to be the main strategy, and I think that’s a problem.
When some people discover that I’m a maker of mindfulness apps, they ask, “Isn’t that an oxymoron? Aren’t mindfulness and technology the opposite of each other?” They think the only way to practice mindfulness with technology is to turn the technology off and throw it away. And I just think that’s a really limiting, highly unsustainable way to go about it. If our only way to mindfully work with screen-based and content-based technologies—which is what people are referring to when they say “technology”—is to not use them, that’s not going to work because they’re not going away. They’re increasingly underpinning our society, economy, and individual lifestyles.
So do you want to live in a world where we pathologize this force that is fundamental to our lives? Or do you want to live in a world where we can actually use these technologies in support of our practice?
The term “digital detox” is literally saying these technologies are toxic. Yes, it’s true that a lot of these technologies are designed to be addictive and the technology industry has a lot of growing up to do. But, while that happens, we have two strategies. One is to hide our heads in the sand and the other is to learn to include technology in our mindfulness practice. I believe that you can practice wherever you are and while doing whatever you’re doing. That includes when you’re using technology. Technology is just another domain of our lives. It’s no different than practicing mindfulness in our relationships, with our children, while on the subway or at the gym.
I agree that it’s problematic to say that the technologies themselves are toxic, but it does seem like there’s often something toxic about the way that we relate to technology.
Absolutely. The danger of that reality is that we enter into a binary relationship with technology, where we associate our well-being with turning it off. We don’t even give ourselves a chance to have a relationship with technology that can be supportive of our well-being. This reminds me of something that happens at meditation retreats. At the end of every retreat, there’s always this question: How do we take the state of mind that we’ve cultivated here back into our lives?
When we ask that, we’ve already made this division between retreat life and the “real world.” We use this language in retreat culture. We make these divisions between our practice and the real world, which is dangerous because we’re imprinting boundaries that I don’t think need to be there. It is possible to use meditative techniques in all situations, including when we’re using technology. We can learn to develop the qualities that come from meditation in the messiness of our lives. And then, when we’re ready and able to find time to deepen our insights through formal practice, we can do that.
Do you think that a person can really cultivate concentration and mindfulness while scrolling through their Twitter feed?
A lovely thing that happens when you do take a “practice everywhere” approach to meditation is that you have a much better chance of mindfulness being your default state of awareness. If you’re used to training only in a formal setting, then it’s hard to have that awareness when you’re in a different setting. Also, I know people who are incredibly gifted formal practitioners but are social basket cases. We all know those people. I think that’s an extreme result of the emphasis on training only in a formal setting.
Getting back to our sometimes toxic relationship with technology, earlier you mentioned that people in the technology industry have some “growing up to do,” which is an idea we’ve touched on before on the Garrison Institute blog. For example, developers might not put all of their energy into creating addictive products. Can you say more about this?
This is a massive issue. The tech industry is currently trying to trick us into thinking it’s our problem. The whole conversation around tech stressors is that we should learn to do a digital detox. The more structural issue, however, is that most screen-based technologies are attention-based products and—as mindfulness practitioners know well—the quality of our attention has a direct impact on our well-being.
If we go with that idea, that means these technology companies are impacting our well-being by impacting the quality of our attention. What’s worse is that they don’t actually care about our well-being. If you think about advertising-based revenue companies, like Facebook or Twitter, they’re massively incentivized to do all they can to keep our attention. Google is an empire built on the carcasses of our broken awareness. It’s awful, but we valorize these companies. They’re our heroes of the 21st century.
Companies are going to have to take responsibility, I think, for what they’re doing with our attention and awareness. At the very least, they can start making some changes to make their products a little more mind neutral. They won’t all become mind positive. You shouldn’t expect to start going into a spontaneous compassionate state by simply using a mobile game. But, at the very least, they can make it so you don’t feel terrible or bad about yourself for using their products.
Would you say that there are products that are purposefully designed to make us feel bad about ourselves?
So many products are based on us feeling bad about ourselves! They make it so that we judge ourselves against others, or feel like we’re missing out on stuff going on. They bring about all these emotional states that we work so hard to mitigate during meditation. They create a lot of anxiety.
But I’m optimistic when I think about how popular mindfulness practice has become within the technology world, from the big giants down to the small startups. They all meditate. I trust in the practice, so I trust that there’ll be a point when product designers go to these mindfulness courses and start to think, “I need to incorporate this into how I design my product.” Because if the technology world doesn’t truly utilize mindfulness as a well-being tool, then we’ll have missed a world-changing opportunity.
Can you give us some specific examples of the ways that developers design products that manipulate our attention?
Notification systems are always used to drag you back into a product. “Hey, 12 people looked at your Instagram post.” They play on your sense of self-involvement. The most fundamental trick is what’s called “variable reward,” which comes straight out of experiments with rats in the ’50s. There were two groups of rats: one got food every time it tapped a button, and the other was given food in a more chaotic, random way when it tapped the button. With the first group, because the process was entirely predictable, the rats would tap the button until they were no longer hungry and then just go away. With the second group, however, the rats would hammer the button repeatedly because they never knew what they were going to get. That principle dictates how social media feeds are designed. It’s how Candy Crush and Clash of Clans work.
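To make that contrast concrete, here is a minimal sketch of the two reward schedules he describes. It is purely illustrative and not taken from Gunatillake or any real app; the function names and payout values are assumptions for the sketch.

```python
import random

def fixed_schedule(taps: int) -> list[int]:
    # Fixed reward: every tap pays out the same amount, so the outcome is
    # entirely predictable and the subject stops once the need is met.
    return [1 for _ in range(taps)]

def variable_schedule(taps: int) -> list[int]:
    # Variable reward: each tap pays out nothing, a little, or a lot at
    # random; the unpredictability itself is what keeps the tapping going.
    return [random.choice([0, 0, 0, 1, 5]) for _ in range(taps)]

if __name__ == "__main__":
    random.seed(1)
    print("fixed:   ", fixed_schedule(10))
    print("variable:", variable_schedule(10))
```

A feed that occasionally serves up something delightful and usually serves up nothing special is running the second schedule on its users.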
There’s an amazing book about this called Addiction by Design by Natasha Dow Schüll. She talks about machine gambling in Las Vegas. The mechanics of machine gambling have been directly imported into app design. They trap the user within the system, so that users are more likely to pay for the product or see the ads being sold.
Do you have ideas about how they might do things differently if they kept users’ well-being in mind?
The first thing is obviously to just start this conversation so that organizations actually recognize that this is happening. Honesty is step one.
Step two is that there are different ways that products might be designed. Recently I’ve been thinking of ways to design a notification system that doesn’t make you feel bad about yourself. They can be much subtler than the red dot in your face that says, “There’s a message you haven’t seen!” I think some design thinking could be applied to make a calmer, more ambient notification system.
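As one hypothetical illustration of that kind of design thinking (this is a sketch of my own, not a feature of buddhify or any real product), a calmer notification policy might suppress per-item badges entirely and hold non-urgent items for a single daily digest. The class name, digest hour, and thresholds below are assumptions.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta
from typing import List, Optional

@dataclass
class AmbientNotifier:
    # Hypothetical "ambient" policy: no red dots, no per-item interruptions.
    # Non-urgent items are held and released as one quiet daily digest.
    digest_hour: int = 18                      # deliver once, in the early evening
    pending: List[str] = field(default_factory=list)
    last_digest: Optional[datetime] = None

    def notify(self, message: str, urgent: bool = False) -> Optional[str]:
        if urgent:
            return message                     # only genuinely urgent items interrupt
        self.pending.append(message)           # everything else waits quietly
        return None

    def maybe_digest(self, now: datetime) -> List[str]:
        # Release the batch at most once a day, after the digest hour.
        day_elapsed = (self.last_digest is None
                       or now - self.last_digest >= timedelta(hours=23))
        if now.hour >= self.digest_hour and day_elapsed and self.pending:
            batch, self.pending = self.pending, []
            self.last_digest = now
            return batch
        return []
```

The design choice is simply to decouple “something happened” from “you must look at it right now.”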
Another way of thinking about all of this is that the technology world has largely been built by engineers. It’s a data-led world that doesn’t always care about human emotions. I think that’s a way the technology world can mature, by making design decisions based on how people feel in addition to the data they need to have.
As a maker of mindfulness apps, are you making design decisions based on how people feel while using your products?
I think it’s important that those making products meant for increasing well-being start being leaders in this conversation. Can every interaction with a product be supportive of well-being? If you’re onboarding a user and you’re making them feel bad about themselves—perhaps by saying “oh, you haven’t meditated today”—then that is sabotaging the whole enterprise. Maybe they’ll have to go and do some practice to deal with the anxiety that you’ve caused them. So you create anxiety and then you sell the service to deal with it.
I’m trying to walk the walk. Currently, one of the structural issues is that a lot of the people who use mindfulness apps and get a lot of benefit out of them will become dependent on the product as well. I get emails from people telling me things like, “I use this one track every time before I go to bed because it helps me go to sleep.” It makes me a little sad, actually, because my hope is that they will be able to internalize some of the techniques. We want our app to show that there’s more to meditation than just plugging in some headphones and being a consumer of content. Mindfulness has become a content business rather than a wisdom business. I think we’re doing users and the mindfulness tradition a disservice.