Screen time and social media may not be as bad for mental health as people think
Photo by Luke Porter on Unsplash

Even a casual follower of the news over the last few years is likely to have encountered stories about research showing that digital technologies like social media and smartphones are harming young people's mental health. Rates of depression and suicide among young people have risen steadily since the mid-2000s, around the time that the first smartphones and social media platforms were being released. These technologies have become ubiquitous, and young people's distress has continued to increase since then.

Many articles in the popular and academic press assert that digital technology is to blame. Some experts, including those recently featured in stories by major news outlets, state that excessive use of digital technology is clearly linked to psychological distress in young people. To deny this connection, according to a prominent proponent of the link, is akin to denying the link between human activity and climate change.

In an effort to protect young people from the harms of digital tech, some politicians have introduced legislation that would, among other things, automatically limit users' time spent on a social media platform to 30 minutes a day. If the evidence is so definitive that digital technology is harming America's youth in such substantial ways, then reducing young people's use of these devices could be one of the most important public health interventions in American history.

There's just one problem: The evidence for a link between time spent using technology and mental health is fatally flawed.


Know thyself – easier said than done

Absent from the discussion about the putative harms of digital tech is the fact that practically all academic studies in this area have used highly flawed self-report measures. These measures typically ask people to give their best guesses about how often they used digital technologies over the past week or month or even year. The problem is that people are terrible at estimating their digital technology use, and there's evidence that people who are psychologically distressed are even worse at it. This is understandable because it's very hard to pay attention to and accurately recall something that you do frequently and habitually.

Researchers have recently begun to expose the discrepancy between self-reported and actual technology use, including for Facebook, smartphones and the internet. My colleagues and I carried out a systematic review and meta-analysis of discrepancies between actual and self-reported digital media use and found that self-reported use is rarely an accurate reflection of actual use.

This has enormous implications. Although measurement isn't a sexy topic, it forms the foundation of scientific research. Simply put, to make conclusions – and subsequent recommendations – about something you're studying, you must ensure you're measuring the thing you're intending to measure. If your measures are defective, then your data is untrustworthy. And if the measures are more inaccurate for certain people – like young people or those with depression – then the data is even more untrustworthy. This is the case for the majority of research into the effects of technology use over the past 15 years.
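To make the measurement problem concrete, here is a minimal simulation sketch in Python, using made-up numbers rather than data from any real study. It assumes that true screen time is unrelated to distress but that more distressed people over-report their use; under that assumption alone, an apparent "link" shows up in the self-report data.

import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Hypothetical scenario: true daily screen time (hours) has no relationship to distress.
true_use = rng.normal(4, 1.5, n).clip(min=0)
distress = rng.normal(50, 10, n)  # distress score, generated independently of true use

# Self-reports are noisy, and people with higher distress over-report more.
bias = 0.05 * (distress - 50)
self_report = true_use + bias + rng.normal(0, 1.0, n)

print(np.corrcoef(true_use, distress)[0, 1])     # near zero: no real link
print(np.corrcoef(self_report, distress)[0, 1])  # clearly positive: a link created by reporting bias

The point is not that this is what happens in real data, only that when reporting error tracks the very outcome being studied, self-report measures can manufacture exactly the kind of association these studies set out to detect.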

Imagine that everything known about the COVID-19 pandemic was based on people giving their best guesses about whether they have the virus, instead of highly reliable medical tests. Now imagine that people who actually have the virus are more likely to misdiagnose themselves. The consequences of relying on this unreliable measure would be far-reaching. The health effects of the virus, how it's spreading, how to combat it – practically every bit of information gathered about the virus would be tainted. And the resources expended based on this flawed information would be largely wasted.

The uncomfortable truth is that shoddy measurement is widespread, along with other methodological problems such as inconsistent definitions of different types of digital tech use and research designs that fall short of establishing a causal connection. This means that the putative link between digital technology and psychological distress remains inconclusive.

Social media has a lot to answer for, but in terms of time spent on it, harming the mental health of young people might not belong on the list.

In my own research as a doctoral student in social work, I found that the link between digital technology use and mental health was stronger when self-report measures were used than when objective measures were used. An example of an objective measure is Apple's "Screen Time" application, which automatically tracks device use. And when I used these objective measures to track digital technology use among young adults over time, I found that increased use was not associated with increased depression, anxiety or suicidal thoughts. In fact, those who used their smartphones more frequently reported lower levels of depression and anxiety.

From believer to skeptic

That the link between digital tech use and psychological distress is inconclusive would have come as a big surprise to me five years ago. I was shocked by the levels of depression and thoughts of suicide among the students I treated when I worked as a mental health therapist at a college counseling center. I, like most people, accepted the conventional narrative that all these smartphones and social media were harming young people.

Wanting to investigate this further, I left clinical practice for a Ph.D. program so I could research why these technologies were harmful and what could be done to prevent these harms. As I dove into the scientific literature and conducted studies of my own, I came to realize that the link between digital technology and well-being was much more convoluted than the typical narrative portrayed by popular media. The scientific literature was a mess of contradiction: Some studies found harmful effects, others found beneficial effects and still others found no effects. The reasons for this inconsistency are many, but flawed measurement is at the top of the list.

This is unfortunate, not just because it represents a huge waste of time and resources, or because the narrative that these technologies are harmful to young people has been widely popularized and it's hard to put that genie back in the bottle, but also because it forces me to agree with Mark Zuckerberg.

Getting at the truth

Now, this doesn't mean that any amount or kind of digital technology use is fine. It's fairly clear that certain aspects, such as cyber-victimization and exposure to harmful online content, can be damaging to young people. But simply taking tech away from them may not fix the problem, and some researchers suggest it may actually do more harm than good.

Whether, how and for whom digital tech use is harmful is likely much more complicated than the picture often presented in popular media. However, the reality is likely to remain unclear until more reliable evidence comes in.

Craig J.R. Sewall is a Postdoctoral Scholar of Child and Adolescent Mental Health at the University of Pittsburgh.

This article first appeared on The Conversation. You can read it here.



Photo by Louis Hansel on Unsplash

This story was originally shared on Capital One.

Inside the walls of her kitchen at her childhood home in Guatemala, Evelyn Klohr, the founder of a Washington, D.C.-area bakery called Kakeshionista, was taught a lesson that remains central to her business operations today.

"Baking cakes gave me the confidence to believe in my own brand and now I put my heart into giving my customers something they'll enjoy eating," Klohr said.

While driven to launch her own baking business, pursuing a dream in the culinary arts was economically challenging for Klohr. In the United States, culinary schools can open doors to future careers, but the cost of entry can be upwards of $36,000 a year.

Through a friend, Klohr learned about La Cocina VA, a nonprofit dedicated to providing job training and entrepreneurship development services at a training facility in the Washington, D.C., area.

La Cocina VA, whose name translates to "the kitchen" in Spanish, offers a Bilingual Culinary Training program to prepare low- and moderate-income individuals from diverse backgrounds to launch careers in the food industry.

That program gave Klohr the ability to fully immerse herself in the baking industry within a professional kitchen facility and receive training in an array of subjects including culinary skills, food safety, career development and English language classes.

Gage Skidmore/Wikimedia Commons

Wil Wheaton speaking to an audience at 2019 Wondercon.

In an era of debates over cancel culture and increased accountability for people with horrendous views and behaviors, the question of art vs. artist is a tricky one. When you find out an actor whose work you enjoy is blatantly racist and anti-semitic in real life, does that realization ruin every movie they've been a part of? What about an author who has expressed harmful opinions about a marginalized group? What about a smart, witty comedian who turns out to be a serial sexual assaulter? Where do you draw the line between a creator and their creation?

As someone with his feet in both worlds, actor Wil Wheaton weighed in on that question and offered a refreshingly reasonable perspective.

A reader who goes by @avinlander asked Wheaton on Tumblr:

"Question: I have more of an opinion question for you. When fans of things hear about misconduct happening on sets/behind-the-scenes are they allowed to still enjoy the thing? Or should it be boycotted completely? Example: I've been a major fan of Buffy the Vampire Slayer since I was a teenager and it was currently airing. I really nerded out on it and when I lost my Dad at age 16 'The Body' episode had me in such cathartic tears. Now we know about Joss Whedon. I haven't rewatched a single episode since his behavior came to light. As a fan, do I respectfully have to just box that away? Is it disrespectful of the actors that went through it to knowingly keep watching?"

And Wheaton offered this response, which he shared on Facebook:


When a pet is admitted to a shelter, it can be a traumatizing experience. Many are afraid of their new surroundings and are far from comfortable showing off their unique personalities. The problem is that this is often when their photos are taken to appear in online searches.

Chewy, the pet retailer that has dedicated itself to supporting shelters and rescues throughout the country, recognized the important work of a couple in Tampa, Florida, who have been taking professional photos of shelter pets to help get them adopted.

"If it's a photo of a scared animal, most people, subconsciously or even consciously, are going to skip over it," pet photographer Adam Goldberg says. "They can't visualize that dog in their home."

Adam realized the importance of quality shelter photos while working as a social media specialist for the Humane Society of Broward County in Fort Lauderdale, Florida.

"The photos were taken top-down so you couldn't see the size of the pet, and the flash would create these red eyes," he recalls. "Sometimes [volunteers] would shoot the photos through the chain-link fences."

That's why Adam and his wife, Mary, have spent much of their free time over the past five years photographing over 1,200 shelter animals to show off their unique personalities to potential adoptive families. The Goldbergs' wonderful work was recently profiled by Chewy in the video above, "A Day in the Life of a Shelter Pet Photographer."