An iPhone charging at night

Apple has just issued a service announcement warning people that sleeping on a charging device could lead to fire, electric shock, injury, or damage to the phone or property. This is big news: nearly 1.5 billion people worldwide are iPhone users, and according to a survey, 64% of those who live in America charge their phones while they sleep.

That means every night, somewhere around a billion people are at some risk, however statistically small, of starting a fire.

Apple says that its phones and USB power adapters can become hot while charging, which may lead to discomfort or injury. “Use common sense to avoid situations where your skin is in contact with a device, its power adapter, or a wireless charger when it’s operating or connected to a power source for long periods of time,” the statement reads. “For example, don’t sleep on a device, power adapter, or wireless charger, or place them under a blanket, pillow, or your body, when it’s connected to a power source. Keep your iPhone, the power adapter, and any wireless charger in a well-ventilated area when in use or charging.”


Apple also warns against charging a phone near liquids and asks users to discard damaged chargers immediately. "Using damaged cables or chargers, or charging when moisture is present, can cause fire, electric shock, injury, or damage to iPhone or other property," the company said in a statement.

Additionally, Apple cautioned users about the heightened risk of fire when using third-party chargers, noting that some cheaper chargers might not meet the safety standards of Apple's official products. The company advises using chargers paired with “Made for iPhone” cables that adhere to global safety norms.

The tech giant isn’t the only one sounding the alarm about overnight charging dangers. A fire department in Kent, England, has also warned iPhone users about the risks of sleep-charging.

"We get a lot of questions here at Kent Fire & Rescue about why you shouldn't charge phones overnight," the fire expert warned in a viral TikTok video. ”So here are the reasons why. Number one: you can't smell anything when you're asleep, so if it starts to burn, the fire won't wake you up. Number two: it only takes three breaths of smoke to knock you unconscious. Number three: lots of people have cheap or faulty phone chargers, but even genuine ones have been known to start fires.”

Even though Apple and a fire department have warned about charging at night, changing the public’s habits will be hard. Those who want to see the real dangers of charging a phone overnight need look no further than this video of an iPhone 4, all by itself, bursting into flames at a home in Green Township, Ohio.

Older iPhones are more likely to catch fire because their lithium batteries can swell with age. The chemical reaction inside the battery is what provides power, but over time that reaction can break down and produce gas, causing the battery to expand. "We were extremely lucky to avoid a house fire," Brian Leisgang told WCPO. "Luckily we had just cleaned off the counter."

For over a decade, Apple's done everything in its power to keep your eyes, ears, and fingers glued to your cellphone. This makes their latest feature a little puzzling.

Tucked away in iOS 12, the mid-2018 iteration of Apple's mobile operating system, is a feature called Screen Time. It monitors app usage, time spent on the device, and more, and it lets people set limits for themselves. Parental controls are nothing new in tech, but Screen Time is a little different in that it's not necessarily for children.

"With Screen Time, these new tools are empowering users who want help managing their device time and balancing the many things that are important to them," Craig Federighi, Apple's senior vice president of software engineering, said during the product announcement. In effect, Apple is giving users the option to limit themselves and the time spent on their devices.


A look at Apple's Screen Time feature on iPhone. Image from Apple.

The need for Screen Time illustrates a growing consciousness around the issue of tech addiction.

It may sound silly, but people are becoming increasingly dependent on mobile devices. Figures vary, but it's estimated that the average U.S. adult spends somewhere around four hours on their phones and tablets each day, a number that's climbed higher in recent years. Whether it's actually an "addiction" is up for debate (it's not currently listed in the American Psychiatric Association's Diagnostic and Statistical Manual of Mental Disorders), but it and similar technology-related issues are being studied.

Whether or not tech can actually be addictive, there's a lot of data to suggest that it's just simply not great for our health in large doses.

If tech addiction doesn't exist, it's not for a lack of trying.

In a November interview with Axios, Sean Parker, an early investor in Facebook and its first president, explained the driving question behind the company's development: "How do we consume as much of your time and conscious attention as possible?"

"That means that we need to sort of give you a little dopamine hit every once in a while, because someone liked or commented on a photo or a post or whatever. And that's going to get you to contribute more content, and that's going to get you ... more likes and comments. It's a social-validation feedback loop ... exactly the kind of thing that a hacker like myself would come up with, because you're exploiting a vulnerability in human psychology."

To be fair, getting people to use a product as much as possible isn't exactly a remarkable goal for any company. Facebook just succeeded in ways other businesses haven't.

Sean Parker addresses a conference in 2017. Photo by Theo Wargo/Getty Images for Global Citizen.

Some in the tech industry are finally asking questions and drawing conclusions about the long-term effects of dependency on technology.

Former Facebook vice president of user growth Chamath Palihapitiya told an audience at Stanford University that the "short-term, dopamine-driven feedback loops we've created" pose a threat to society as a whole. "No civil discourse, no cooperation; misinformation, mistruth. And it's not an American problem — this is not about Russian ads. This is a global problem."

During its 2018 I/O conference, Google acknowledged that technology as we currently know it comes with some downsides. "Great technology should improve life, not distract from it," the company's Digital Wellbeing website proclaims. This new suite of tools, similar to Apple's Screen Time, comes with a simple goal: Ensure that "life, not the technology in it, stays front and center."

Without a doubt, tools like those in Google's Digital Wellbeing and Apple's Screen Time are a good thing. But they're probably not enough.

In his 2016 TED Talk on how "better tech could protect us from distraction," former Google design ethicist Tristan Harris laid out a plan to "restore choice" in the relationship we have with technology. The goal is to convince companies to pursue a metric of "time well spent" rather than simply time spent. Harris called on companies to judge their success on the company's "net positive contribution to human life," on designers to resist the urge to simply create unproductive time-sucks, and on consumers to "demand technology that works this way."

A healthier relationship with technology requires companies to rethink their businesses as a whole. Tools like Digital Wellbeing and Screen Time on their own don't address the underlying issue.

If you feel like you're having a tough time reducing your time on your mobile devices and you want to cut back, there are simple things you can do right now.

As co-founder and executive director at the Center for Humane Technology, Harris advocates for better design. The organization's website is full of great resources, but none better and more instantly applicable than its list of ways to "live more intentionally with your devices." Here are five suggestions for ways you can cut back on mobile device dependence:

1. Manage your notifications.

CHT recommends turning off all notifications for everything except messaging apps, text, and email.

2. Change your display to black, white, and gray.

Did you know that you can make your iPhone display grayscale? CHT outlines how to do that, removing some of the bright colors that demand our attention.

3. Sleep with your phone in a different room.

Not only do phones have a nasty habit of keeping us up late when we're trying to sleep, but waking up next to one reinforces a habit that starts the day diving headfirst into technology.

4. Reorganize your home screen.

Think about what apps you spend a lot of time mindlessly browsing. Now move them to the second screen. CHT suggests using the home screen for "apps you use for quick in-and-out tasks."

5. Use available tools and apps to help you.

Tools like Digital Wellbeing, Screen Time, and third-party apps are designed to reduce distraction. Did you know that there's an app you can download that temporarily locks you out of other apps? How about an extension that blocks out Facebook's newsfeed? There are loads of productivity apps that make your phone usage a bit more deliberate without having to cut yourself off from technology entirely (a rough sketch of how such blockers work follows below).
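To make the idea concrete, here's a minimal, hypothetical sketch of the trick many of these blockers rely on: stop distracting domains from resolving at all. This isn't the implementation of any particular app; the file path and the domain list are illustrative assumptions, and real tools add schedules, allow lists, and an easy off switch on top of this.

```python
# Hypothetical sketch of a "distraction blocker": point distracting domains
# at the local machine in the hosts file so they stop loading.
# Requires administrator/root privileges; the domains below are only examples.

HOSTS_PATH = "/etc/hosts"  # on Windows this would be the hosts file under System32\drivers\etc
BLOCKED_DOMAINS = ["facebook.com", "www.facebook.com", "twitter.com"]

def block_distractions() -> None:
    """Append a blocking entry for each distracting domain, skipping duplicates."""
    with open(HOSTS_PATH, "r+", encoding="utf-8") as hosts_file:
        current = hosts_file.read()
        for domain in BLOCKED_DOMAINS:
            entry = f"127.0.0.1 {domain}"
            if entry not in current:
                hosts_file.write(f"\n{entry}")

if __name__ == "__main__":
    block_distractions()
    print("Distracting domains now resolve to 127.0.0.1. Edit the hosts file to undo.")
```

The point isn't the specific mechanism; it's that putting even a small speed bump between you and the distraction makes the mindless check a little less automatic.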

Apple CEO Tim Cook appears at Apple's 2018 Worldwide Developers Conference. Photo by Apple.

Technology can be wonderful, and social media can connect us in powerful new ways, but remember that too much of a good thing can have its downsides.

No one is saying that you shouldn't use the internet or your smartphone. Those things are simply a part of people's lives now. What you should do, if you want to, is set boundaries for yourself. If even the companies whose profits depend on getting people hooked on the use of their products are taking steps to help you dial things back, it's probably worth a shot.

Starting in September, Apple will make another update to its iconic and useful emojis.

As part of the update, the company is getting rid of the pistol emoji and replacing it with a green water gun.


While Apple hasn't officially addressed the reasons for the swap, it seems pretty clear that, after another year filled with horrific gun violence, the company is responding in some small way to America's frustration with gun culture.


Before I continue, let's get one thing straight:

No, of course swapping the pistol emoji for a water gun is not going to solve America's gun problem.

Obviously.

You will never see a news story with the headline: "New Water Gun Emoji Directly Responsible for Decline in Gun Violence."

HOWEVER...

Ask yourself another question: "Is one person recycling water bottles going to solve global warming?" No, of course not.

Is recycling those water bottles still the right thing to do? Will it still help make a small dent of progress in the face of an overwhelming challenge? Yes.

Like it or not — emojis are a big part of our cultural lingo.

They're not the biggest, most important, or most central part of our culture, but millions of people use them regularly to communicate, laugh, make plans, and occasionally to represent body parts ("Peach and eggplant emoji" to you as well, good sir).

Photo by Miguel Medina/AFP/Getty Images.

Which is why emojis have been updated on multiple occasions to better represent the times we're in.

In 2015, a variety of skin tones were added to help represent people of different races, same-sex couples and families were added to help represent people of different sexual orientations, and this latest update will also include a pride flag and a more diverse array of female emojis, after an official bid from Google.

There's nothing wrong with adding and changing emojis to be more representative of the things we talk and care about, while also acknowledging that the cartoon keyboard in our phones is not the axis on which the most critical conversations of our culture turn.

But I digress. Back to the gun emoji, which already has a pretty troubling track record.

Aside from being yet another byproduct of our gun-obsessed culture, the gun emoji has been a key factor in a few real-life incidents in which the police got involved.

In February 2016, a 12-year-old got in trouble with the police after posting a message on Instagram containing the gun emoji along with the bomb and knife emoji. In Brooklyn, New York, a teenager was arrested on terror charges after making a perceived threat to police officers using emojis. His charges were eventually dropped.

No word yet on the swords or the clearly dangerous chemistry set. Screengrab of iPhone emojis taken on my phone.

As far as people being frustrated with gun culture goes, though, you probably don't need reminding that 2016 has been as close to a tipping point as any year in recent memory.

Multiple police-involved shootings, a horrific massacre at a nightclub in Orlando, an outright attack on the Dallas Police Department, and hundreds of mass-shooting deaths have created an environment where lawmakers are (finally, maybe, possibly, hopefully) ready to step up and do something.

Rep. John Lewis speaking to the press during his gun control protest in June 2016. Photo by Pete Marovich/Getty Images.

There was a 15-hour filibuster on gun control after the Orlando shooting as well as a congressional sit-in led by civil rights activist Rep. John Lewis.

People have also been taking out their frustration toward the lack of action on gun control in little ways, like defacing posters for the film "Jason Bourne," which prominently feature actor Matt Damon holding a gun.


People have had it with a culture that consistently fetishizes and glorifies guns, and replacing the gun emoji with a water pistol is a small way to lessen the presence of guns in daily conversation.

No, the water pistol emoji isn't going to solve America's problem with gun violence, or make you dinner, or tie your shoes for you, or make "True Detective" great again, or anything else.

We still need to work on gun control. We still need to stand up to gun lobbyists and politicians and others who stand by, complicit, as gun violence continues to claim more lives in America than in any other developed nation.

Photo by Mark Wilson/Getty Images.

In the meantime, we can also appreciate that the revolver emoji is now a more fun and less deadly water pistol.

It's a small gesture that shows that we, as a people, with our incredible technology and advanced methods of communication, don't need a little cartoon gun to live our lives or communicate with each other.

The FBI recently tried to pressure Apple into creating a special iPhone security override for them — and Apple very politely told them to screw off.

The TL;DR version is that the FBI is having trouble breaking into the iPhone formerly belonging to Syed Farook, one of the shooters involved in the tragic massacre in San Bernardino, California. Apple agreed to help ... but the FBI took this a step further and obtained a court order for Apple to provide a way to bypass several security features on the phone without erasing its data. Apple claims this would involve creating a new version of iOS (which some have dubbed "FBiOS") with a back door that has serious privacy and security implications.

It's not that Apple can't do what the FBI is asking of them; it's that they shouldn't. The company did cooperate by providing the data that was already in their possession. But they were less comfortable with the potential slippery slope of the FBI's override request and the precedent that kind of government overreach would establish for the future.


Apple CEO Tim Cook told the feds as much.

"You want master access to every Apple device? Nah-uh. Not on my watch, pal." Photo by Astrid Stawiarz/Stringer/Getty Images.

Regardless of how you feel about the FBI's request, you have to wonder: Why is this only coming up now?

Actually, it's not.

While the iPhone itself has been around since 2007, this specific issue has to do with the new encryption policies Apple introduced with iOS 8 in 2014. "Apple cannot bypass your passcode and therefore cannot access [your personal] data," the company said at the time. "So it's not technically feasible for us to respond to government warrants for the extraction of this data from devices in their possession running iOS 8."
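To see why that claim holds, here's a toy illustration of the general idea behind passcode-based encryption: the key that unlocks the data is derived from the passcode itself, so without the passcode there is nothing for Apple, or anyone else, to hand over. This is emphatically not Apple's actual implementation (which involves dedicated hardware and per-file keys); the passcodes, iteration count, and library choice below are illustrative assumptions.

```python
# Toy sketch of passcode-derived encryption (not Apple's real scheme).
# The encryption key is derived from the passcode, so the stored data is
# unreadable to anyone who doesn't know the passcode, including the vendor.
import base64
import hashlib
import os

from cryptography.fernet import Fernet, InvalidToken  # pip install cryptography

def key_from_passcode(passcode: str, salt: bytes) -> bytes:
    # PBKDF2 stretches the passcode; the salt and high iteration count make
    # each brute-force guess expensive.
    raw_key = hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, 600_000)
    return base64.urlsafe_b64encode(raw_key)  # Fernet expects a base64 key

salt = os.urandom(16)
device_key = key_from_passcode("1234", salt)
ciphertext = Fernet(device_key).encrypt(b"messages, photos, contacts ...")

# Knowing the algorithm and the salt is not enough: a wrong passcode
# derives a different key, and decryption simply fails.
try:
    Fernet(key_from_passcode("0000", salt)).decrypt(ciphertext)
except InvalidToken:
    print("wrong passcode: data stays encrypted")

print(Fernet(device_key).decrypt(ciphertext))  # only the right passcode works
```

Seen this way, the FBI's court order was essentially about weakening the protections around guessing that passcode (the kind of security features, like auto-erase, mentioned above), which is why Apple characterized the request as building a back door.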

It's almost like they saw this coming.

And the FBI wasn't happy about it back then, either. As FBI Director James Comey said after Apple's encryption started catching on, "This disconnect has created a significant public safety problem. ... Uploading to the cloud doesn't include all of the stored data on a bad guy's phone, which has the potential to create a black hole for law enforcement."

It's almost like they saw this coming, too. And they've been asking for access ever since.

Photo by Carrrrrlos/Flickr.

Since then, the FBI has tried repeatedly to inch Apple toward their big ask.

This all came to a head in the fall of 2015, when the Justice Department asked for Apple's help to crack the iPhone (running iOS 7) of a drug dealer named Jun Feng.

"Apple has repeatedly assisted law enforcement officers in federal criminal cases by extracting data from passcode-locked iPhones pursuant to court orders," the government argued. "Apple has acknowledged that it has the technical capability to do so again in this case."

It's a classic method of manipulation. "Just one more tiny favor, that's all! Just this once!"

This time, it was a drug dealer; but next time, it could just be a kid who illegally downloaded the new Kanye record.

So Apple drew a line in the sand.

Photo by Robyn Beck/Getty Images.

Then the San Bernardino shooting happened.

On one hand, it was the deadliest mass shooting in the U.S. since the Sandy Hook Elementary School massacre three years prior, and an absolute tragedy.

But mass shootings aren't hard to come by in this country, even if there is some debate about what exactly qualifies as a "mass shooting."

There was Dylann Roof, for example, the radical white supremacist who killed nine people at a historic black church in South Carolina. There was Robert Lewis Dear, an anti-abortion radical who killed three people, including a university police officer, and injured nine more at a Planned Parenthood in Colorado.

Elliot Rodger was spurred on by radical misogyny and killed six people and wounded seven others in Isla Vista, California.

And who can forget Wade Michael Page, another radical white supremacist who killed six people at a Sikh temple in Oak Creek, Wisconsin? Or Jared Loughner, whose radical right-wing anti-government ideology led him to kill six people and injure 13 more, including a Congresswoman, at a supermarket in Tucson, Arizona?

The difference between these and the San Bernardino shootings? Syed Farook represents a unique opportunity for the FBI that the other shooters didn't.

Photo by Robyn Beck/Getty Images.

In case you didn't notice the pattern: the majority of mass shooters in the United States are white extremists.

And the people who were allegedly responsible for the San Bernardino massacre? They were Muslims.

So why did the FBI decide that it was finally the right time to ask for that super-special secret master key that they've been after for years?

Because they could.

Because Islamophobia is on the rise, which makes it easier for them to get the unrestricted access they've been after so they can use it in the future whenever they want, regardless of the "who" or the "why."

Photo by Patrick T. Fallon/Stringer/Getty Images.

So while Apple should be applauded for standing up to the FBI and defending our right to privacy, there's another deeply concerning issue lurking in the foreground.

As the face of the anti-surveillance movement has himself pointed out:

We know the U.S. government already spends a lot of time and resources spying on Muslims, even without an Apple master key. They do the same to "black extremists" and other left-wing "radical" movements such as Occupy as well.

And for all that spending (by one estimate, $500 million for every victim of terrorism), roughly 90% of the people caught up in this snooping are normal people like you and me.

The government's desire to compromise the privacy of its people under the auspices of "safety" is incredibly dangerous.

Let's refuse to perpetuate the racial fears that make this kind of subtle attack on our privacy possible.