Apple issued a rare apology after a report revealed contractors regularly listened to private conversations
Photo by Tyler Lastovich on Unsplash

The hardest words to say are "I'm sorry," but Apple (surprisingly) didn't have a problem saying them after a whistleblower revealed that strangers had been listening to your private conversations. Apple commendably went a step further and actually fixed the issue that makes it feel like your phone is eavesdropping on you.

The unnamed whistleblower told The Guardian that Siri records conversations as a form of quality control called "grading." The purpose was to allow Apple to improve Siri, but it ended up feeling like one huge privacy violation.

It turns out, Apple's voice assistant could be triggered accidentally, even by muffled background noises or zippers. Once triggered, Siri made audio recordings, some of which included personal discussions about medical information, business deals, and even people having sex. The percentage of people yelling out, "Hey Siri!" while getting it on is probably very small.

Apple assured users that these recordings wouldn't be linked to data that could identify you, but the recordings were accompanied by user data showing location, app details, and contact information. So, yeah, they could actually identify you.

To make things worse, the recordings were listened to by third-party contractors, not Apple employees. "[T]here's a high turnover. It's not like people are being encouraged to have consideration for people's privacy, or even consider it. If there were someone with nefarious intentions, it wouldn't be hard to identify [people on the recordings]," the whistleblower told The Guardian.

Apple did the right thing and apologized for the practice. "We know that customers have been concerned by recent reports of people listening to audio Siri recordings as part of our Siri quality evaluation process. We realize we haven't been fully living up to our high ideals, and for that we apologize," Apple said in a post.

Not only that, Apple changed its policy to address the concerns raised in The Guardian's report. Apple will no longer retain recordings of conversations by default. If you want to share your conversations with Apple so it can improve Siri, you have to specifically opt in. Apple will also stop using third-party contractors to listen to the recordings; quality control will be left to Apple employees, who will review computer-generated transcripts instead of audio. Any accidental recordings will be deleted.

Technology has made our lives easier, but it's also ushered in a whole slew of privacy concerns. It's hard not to feel like your phone is your own personal telescreen from "1984," but worse because at least telescreens didn't have addictive Snapchat filters. Why should privacy be the trade-off just because we want the convenience of being able to say, "Hey Siri, what's the difference between a dolphin and a whale?" It's nice to have the peace of mind that we can make robots do our bidding without feeling like they're spying on us – at least when it comes to our iPhones.

Nurse turns inappropriate things men say in the delivery room into ‘inspirational’ art

"Can you move to the birthing ball so I can sleep in the bed?"

Holly the delivery nurse.

After working six years as a labor and delivery nurse, Holly, 30, has heard a lot of inappropriate remarks made by men while their partners are in labor. “Sometimes the moms think it’s funny—and if they think it’s funny, then I’ll laugh with them,” Holly told TODAY Parents. “But if they get upset, I’ll try to be the buffer. I’ll change the subject.”

Some of the comments were so wrong that she decided to get creative with them, turning them into “inspirational” quotes set to “A Thousand Miles” by Vanessa Carlton on TikTok.

“Some partners are hard to live up to!” she jokingly captioned the video.

All images provided by Adewole Adamson

It begins with more inclusive conversations at a patient level

Adewole Adamson, MD, of the University of Texas, Austin, aims to create more equity in health care by gathering data from more diverse populations using machine learning, a type of artificial intelligence (AI). Dr. Adamson’s work is funded by the American Cancer Society (ACS), an organization committed to advancing health equity through research priorities, programs and services for groups who have been marginalized.

Melanoma became a particular focus for Dr. Adamson after meeting Avery Smith, who lost his wife—a Black woman—to the deadly disease.

Avery Smith (left) and Dr. Adamson.

This personal encounter, coupled with multiple conversations with Black dermatology patients, drove Dr. Adamson to a concerning discovery: as advanced as AI is at detecting possible skin cancers, it is heavily biased.

To understand this bias, it helps to first know how AI works in the early detection of skin cancer, which Dr. Adamson explains in his paper for the New England Journal of Medicine (paywall). The process uses computers that rely on sets of accumulated data to learn what healthy or unhealthy skin looks like and then create an algorithm to predict diagnoses based on those data sets.

This process, known as supervised learning, could lead to huge benefits in preventive care.

After all, early detection is key to better outcomes. The problem is that the data sets don’t include enough information about darker skin tones. As Adamson put it, “everything is viewed through a ‘white lens.’”

“If you don’t teach the algorithm with a diverse set of images, then that algorithm won’t work out in the public that is diverse,” writes Adamson in a study he co-wrote with Smith (according to a story in The Atlantic). “So there’s risk, then, for people with skin of color to fall through the cracks.”
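
The dynamic Adamson describes is easy to reproduce in miniature. Below is a toy sketch (purely hypothetical, with made-up numbers, and not Adamson's actual model): a supervised classifier learns a decision threshold from labeled examples drawn almost entirely from lighter skin tones, where a hypothetical "lesion contrast" feature is strong, and then misses most malignant cases on darker skin, where the same lesion shows less contrast.

```python
# Toy illustration of data-set bias in supervised learning.
# All numbers are made up; this is not a real melanoma model.
import random

random.seed(0)

def make_example(skin_tone, malignant):
    # Hypothetical "lesion contrast" feature: on darker skin the same
    # lesion produces less contrast, so the learned signal shifts.
    base = 0.8 if skin_tone == "light" else 0.3
    signal = 0.4 if malignant else 0.0
    return base * signal + random.gauss(0, 0.05), malignant

# "Training": fit a threshold midway between the two class means,
# using examples drawn only from lighter skin tones.
train = [make_example("light", m) for m in (True, False) for _ in range(500)]
mean_malignant = sum(x for x, m in train if m) / 500
mean_benign = sum(x for x, m in train if not m) / 500
threshold = (mean_malignant + mean_benign) / 2

def accuracy(examples):
    # Predict "malignant" whenever the feature exceeds the threshold.
    return sum((x > threshold) == m for x, m in examples) / len(examples)

light_test = [make_example("light", m) for m in (True, False) for _ in range(500)]
dark_test = [make_example("dark", m) for m in (True, False) for _ in range(500)]
print(f"accuracy on light skin: {accuracy(light_test):.0%}")  # near-perfect
print(f"accuracy on dark skin:  {accuracy(dark_test):.0%}")   # much lower
```

Run as written, the toy model scores near-perfectly on the population it was trained on and misses the majority of malignant cases in the one it wasn't: a small-scale version of the "falling through the cracks" Adamson warns about.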

Tragically, Smith’s wife was diagnosed with melanoma too late and paid the ultimate price for it. And she was not an anomaly—though the disease is more common among white patients, Black cancer patients are far more likely to be diagnosed at later stages, resulting in a notable disparity in survival rates between non-Hispanic white patients (90%) and non-Hispanic Black patients (66%).

As a computer scientist, Smith suspected this racial bias and reached out to Adamson, hoping a Black dermatologist would have more diverse data sets. Though Adamson didn’t have what Smith was initially looking for, this realization ignited a personal mission to investigate and reduce disparities.

Now, Adamson uses the knowledge gained through his years of research to help advance the fight for health equity. To him, that means not only gaining a wider array of data sets, but also having more conversations with patients to understand how socioeconomic status impacts the level and efficiency of care.

“At the end of the day, what matters most is how we help patients at the patient level,” Adamson told Upworthy. “And how can you do that without knowing exactly what barriers they face?”

“What matters most is how we help patients at the patient level.” Photo: https://www.kellydavidsonstudio.com/

The American Cancer Society believes everyone deserves a fair and just opportunity to prevent, find, treat, and survive cancer—regardless of how much money they make, the color of their skin, their sexual orientation, gender identity, their disability status, or where they live. Inclusive tools and resources can be found in the Health Equity section of their website. For more information about skin cancer, visit cancer.org/skincancer.

The mesmerizing lost art of darning knit fabric.

For most of human history, people had to make their own clothing by hand, and sewing skills were passed down from generation to generation. Because clothing was so time-consuming and labor-intensive to make, people also had to know how to repair items that got torn or damaged.

The invention of sewing and knitting machines changed the way we acquire clothing, and the skills people used to possess have largely gone by the wayside. If we get a hole in a sock nowadays, we toss it and replace it. Most of us have no idea how to darn a sock or fix a hole in any knit fabric. It's far easier for us to replace than to repair.

But there are still some among us who do have the skills to repair clothing in a way that makes it look like the rip, tear or hole never happened, and to watch them do it is mesmerizing.

Artist uses AI to create ultra-realistic portraits of celebrities who left us too soon

What would certain icons look like if nothing had happened to them?

Mercury would be 76 today.

Some icons have truly left this world too early. It’s a tragedy when anyone doesn’t make it to see old age, but when it happens to a well-known public figure, it’s like a bit of their art and legacy dies with them. What might Freddie Mercury have created if he were granted the gift of long life? Bruce Lee? Princess Diana?

Their futures might be mere musings of our imagination, but thanks to a lot of creativity (and a little tech) we can now get a glimpse into what these celebrities might have looked like when they were older.

Alper Yesiltas, an Istanbul-based lawyer and photographer, created a photography series titled “As If Nothing Happened,” which features eerily realistic portraits of long-gone celebrities in their golden years. To make the images look as real as possible, Yesiltas combined various photo-editing programs such as Adobe Lightroom and VSCO with the AI photo-enhancing software Remini.

“The hardest part of the creative process for me is making the image feel ‘real’ to me,” Yesiltas wrote about his passion project. “The moment I like the most is when I think the image in front of me looks as if it was taken by a photographer.”

Yesiltas’ meticulousness paid off, because the results are uncanny.

Along with each photo, Yesiltas writes a bittersweet message “wishing” how things might have gone differently … as if nothing happened.