
Professor warns students about exactly how much private information is available to them

"I know many students don't understand all of the ways they're being tracked."


College is generally a time when kids feel much more freedom than they're used to. They're finally able to come and go as they please, stay out as late as they want and sleep in as long as they like without immediately having to answer to a parent. It's a mix of freedom and privacy that nearly every adolescent craves, and as long as they do well in classes, no one asks too many questions.

Well, about that privacy: professors are saying not so fast. Apparently the software used on many college campuses isn't just for submitting assignments or downloading readings. After The Markup published an article by Tara García Mathewson titled "He Wanted Privacy. His College Gave Him None," college professor and doctoral candidate Victoria Alexander took to social media to help sound the alarm.


There's some level of privacy expected while attending college, but according to Alexander, that privacy is largely an illusion. The educational software schools use doesn't just track when you log on and off and how long you stay logged in; it reveals much more. It lets professors see where you logged on from, what materials you accessed and how long you spent on them. And according to Alexander and Mathewson, the lack of privacy doesn't stop there.

"Those are just what I can see as a professor. The general university surveillance can see many other things," Alexander explains. "If your phone's connected to university WiFi they can tell where you're going on the internet and where you're physically going on campus. Many universities also use facial recognition through their security cameras so they know where you are in person and they know where you are online and if you've logged into your social media they can also know what you're up to on there and what your friends are sharing."

This information gathering isn't just for the universities' own use; some may also sell the data. And it isn't something college campuses are forthcoming about, or something students can realistically opt out of.

"Still, whether living on campus or off, taking classes in person or remotely, students simply cannot opt out of most data collection and still pursue a degree," Mathewson says in The Markup.


Some professors let it be known in the comments on Alexander's video that they don't use the extra information the learning management systems provide.

"I make it a point not to use this information against my students. They're adults, they can lie, I will judge their work and participation," one professor writes.

"I choose not to access ANY of this info about my students. They're adults, I'm not their keeper," another professor says.

While it's reassuring that most of the professors who identified themselves in the comments aren't using this private information, the average commenter was flabbergasted, and a bit upset, that it's available at all.

"There's actually zero justification for my professor knowing my location...like ever. I'm an adult who's PAYING to be there," one person says.

"Help me out, why would any college or university need access to that kind of information? How is that not an invasion of privacy," someone asks.

"So you're telling me universities are practically FBI agents," another person questions.

Certainly this isn't something that's advertised on college tours, and if students knew how much they were being monitored, many tech-savvy kids would surely find a way around some of it. With facial recognition and location tracking via student IDs in the mix, though, that may be trickier. Either way, the concerns raised seem valid and worth discussing before sending your student off to college in the fall.


The hardest words to say are, "I'm sorry," but Apple (surprisingly) doesn't have a problem saying them after a whistleblower revealed that human strangers were listening to your private conversations. Apple commendably went a step further and actually fixed the issue that makes it feel like your phone is eavesdropping on you.

The unnamed whistleblower told The Guardian that Siri records conversations as a form of quality control called "grading." The purpose was to allow Apple to improve Siri, but it ended up feeling like one huge privacy violation.

It turns out, Apple's voice assistant could be triggered accidentally, even by muffled background noises or zippers. Once triggered, Siri made audio recordings, some of which included personal discussions about medical information, business deals, and even people having sex. The percentage of people yelling out, "Hey Siri!" while getting it on is probably very small.



Apple insisted that these recordings weren't linked to data that could identify you, but they were accompanied by user data showing location, app usage, and contact details. So, yeah, they could actually identify you.

To make things worse, the recordings were listened to by third-party contractors, not Apple employees. "[T]here's a high turnover. It's not like people are being encouraged to have consideration for people's privacy, or even consider it. If there were someone with nefarious intentions, it wouldn't be hard to identify [people on the recordings]," the whistleblower told The Guardian.

Apple did the right thing and apologized for the practice. "We know that customers have been concerned by recent reports of people listening to audio Siri recordings as part of our Siri quality evaluation process. We realize we haven't been fully living up to our high ideals, and for that we apologize," Apple said in a post.

Not only that, they changed their policy to address the concerns revealed in The Guardian article. Now, Apple will no longer record conversations by default. If you want to share your conversations with Apple so it can improve Siri, you have to specifically opt in. Apple will also stop using third-party contractors to listen to the recordings. Quality control will be left to Apple employees, who will review computer-generated transcripts instead of recordings. Any accidental recordings will be deleted.


Technology has made our lives easier, but it's also ushered in a whole slew of privacy concerns. It's hard not to feel like your phone is your own personal telescreen from "1984," but worse because at least telescreens didn't have addictive Snapchat filters. Why should privacy be the trade-off just because we want the convenience of being able to say, "Hey Siri, what's the difference between a dolphin and a whale?" It's nice to have the peace of mind that we can make robots do our bidding without feeling like they're spying on us – at least when it comes to our iPhones.



She's far from the first celebrity to have nude photos leaked online, but Sia's response is certainly unique — and kind of awesome.

On Nov. 6, the pop singer tweeted out a blurred photo of her naked backside with a message: "Someone is apparently trying to sell naked photos of me to my fans. Save your money, here it is for free. Everyday is Christmas!"

It was a brilliant tweet, at once diminishing the photo's value (hard to make money on something that's been sent out for free to 3.2 million people on Twitter) and embracing herself for who she is.


You could even say that she turned the photographer into the butt of the joke.

When it comes to what people feel entitled to from celebrities, society's expectations need a revamp in a big way.

A person — a fan, even — might enjoy the music, acting, or art created by a celebrity. That doesn't mean they're entitled to nude photos or other breaches of personal privacy. Sia, for all her fame and success, is a person just like the rest of us. Just because she's not ashamed of her body (which is a good thing, obviously), that doesn't mean any of us are entitled to see it — especially without her consent.

While this sort of blackmail is never OK, it seems especially cruel to target her like this given her very publicly stated feelings about the concept of fame: "If anyone besides famous people knew what it was like to be a famous person, they would never want to be famous," she wrote in her 2013 "Anti-Fame Manifesto." She has made a point of obscuring her face during public performances ever since.

So just be cool. Treat celebrities like people. Is that really so hard?

This article was originally published on November 7, 2017.

The future is now! And it's kind of scary.

Amazon's come a long way from being the little online bookstore that could. Now, in addition to delivering your packages, running your smart home features, and telling you what to wear, it may also soon be helping the government track every move you make.

A few items on that list are a little creepy, but it's really that last one that's raising red flags for the ACLU and others concerned with civil liberties.


In 2016, the company launched Amazon Rekognition, its flagship image recognition software. The basic premise was that you could take a picture, run it through the software, and it'd respond by telling you what the picture was. The example used in the rollout was a photo of a dog. Awww!
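For a sense of what that workflow looks like, here's a minimal sketch of a label-detection call using Amazon's boto3 SDK for Python. The file name, region, and thresholds are illustrative assumptions, not details from Amazon's rollout.

```python
# Minimal sketch: ask Rekognition what's in a photo (assumed filename/region).
import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("dog.jpg", "rb") as f:  # hypothetical photo, like the dog in Amazon's demo
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=5,         # return at most five labels
    MinConfidence=80.0,  # skip low-confidence guesses
)

for label in response["Labels"]:
    print(f'{label["Name"]}: {label["Confidence"]:.1f}%')  # e.g. "Dog: 98.5%"
```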

Fast forward to 2018, and Rekognition has gotten a few upgrades. It's even being tested out by a handful of police departments. Amazon boasts about the technology's ability to detect, track, and analyze photos or videos of people, referring to it as "high-quality person tracking" and "activity detection."

"Activating a city-wide facial recognition system could be as easy as flipping a switch," the ACLU's Matt Cagle warns in a YouTube video. "Body cams were designed to keep officers accountable to the public, but facial recognition turns these devices into surveillance machines."

The ACLU has been trying to sound the alarm about the dangers of facial recognition, and it might have just found a way to get the attention of people who can help: Congress.

It's unlikely a profit-driven company like Amazon will simply choose to abandon this admittedly impressive and lucrative tech on its own. Even if it did, another company would surely swoop in with its own version. To protect people from the obvious abuses that can come with far-reaching surveillance, it's going to take an act of Congress to put restrictions on how this technology can be used.

To prove a point, the ACLU ran photos of every member of Congress through the Rekognition software, comparing it with criminal databases. What they found was shocking.  

The analysis incorrectly matched the faces of 28 members of Congress with mugshots.
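The ACLU hasn't published its pipeline here, but a one-to-one comparison of the kind underlying the test maps onto Rekognition's CompareFaces API. This is a hedged sketch, not the ACLU's actual code: the file names are hypothetical, and the 80% similarity threshold is simply the service's default.

```python
# Sketch: compare a portrait against a mugshot with Rekognition's CompareFaces.
import boto3

client = boto3.client("rekognition", region_name="us-east-1")

def load(path):
    with open(path, "rb") as f:
        return f.read()

response = client.compare_faces(
    SourceImage={"Bytes": load("member_portrait.jpg")},  # hypothetical filenames
    TargetImage={"Bytes": load("mugshot.jpg")},
    SimilarityThreshold=80.0,  # the service's default threshold
)

for match in response["FaceMatches"]:
    print(f'Possible match, similarity {match["Similarity"]:.1f}%')
if not response["FaceMatches"]:
    print("No match above threshold")
```

A test like the ACLU's would run comparisons like this for every member of Congress against an entire database of arrest photos, which is why even a small false-match rate adds up quickly.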

In other words, not only might this new software be used as the backbone of a new surveillance state, but it also might flag you as a criminal. That's not ideal! Thankfully, it caught congresspeople's attention, with a number of senators and representatives issuing statements about the experiment.

The ACLU's study also revealed another issue with the technology: People of color are disproportionately likely to get a false match.

Six members of the Congressional Black Caucus were falsely matched to mugshots. And although just 20% of members of Congress are people of color, 39% of the false matches were.
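Those figures line up with a quick back-of-the-envelope check (the count of 11 below is inferred from 39% of the 28 false matches, and includes the six CBC members):

```python
# Sanity check of the disparity in the ACLU's reported numbers.
total_false_matches = 28       # members of Congress falsely matched
poc_false_matches = 11         # inferred from the reported 39% of 28
poc_share_of_congress = 0.20   # roughly 20% of Congress are people of color

poc_share_of_matches = poc_false_matches / total_false_matches
print(f"{poc_share_of_matches:.0%} of false matches")               # -> 39%
print(f"{poc_share_of_matches / poc_share_of_congress:.1f}x rate")  # -> ~2.0x overrepresentation
```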

Rep. Luis Gutiérrez (D-Illinois), who was one of the politicians wrongly matched by Rekognition, signed a letter with other congresspeople to Amazon CEO Jeff Bezos expressing concerns.

"It’s not hard to imagine a police officer getting a 'match' indicating that a person has a previous concealed-weapon arrest, biasing the officer before an encounter even begins," the ACLU's Jacob Snow wrote on the group's blog. "Or an individual getting a knock on the door from law enforcement, and being questioned or having their home searched, based on a false identification."

Snow continued:

"An identification — whether accurate or not — could cost people their freedom or even their lives. People of color are already disproportionately harmed by police practices, and it’s easy to see how Rekognition could exacerbate that. A recent incident in San Francisco provides a disturbing illustration of that risk. Police stopped a car, handcuffed an elderly Black woman and forced her to kneel at gunpoint — all because an automatic license plate reader improperly identified her car as a stolen vehicle."

But there are some simple things you can do to prevent facial recognition software from being used the wrong way.

For one, you can join the ACLU's efforts to petition Amazon to do the right thing and stop selling surveillance equipment to the government. You can also donate to the ACLU to help fund its efforts to fight back against government overreach and threats to our privacy.

The most important thing you can do is to call up your representatives at the federal, state, and local levels. Let them know that this is something that concerns you and that you'd like to see action taken to make sure this technology doesn't get misused.