
“Am I disabled?” For millions, there’s not always a clear answer

At some point in my childhood, my hands began to shake.

Not badly at first—I couldn’t draw a straight line, but I didn’t mind much since I’d never had any inclination towards art. As I entered my teenage years, though, the tremors got worse and started spreading beyond my hands.

Although I tried to control the spasms in my face with ever-increasing doses of beta-blockers and anti-epileptics, by middle school I’d acquired the nickname “Twitchy.” In high school, my handwriting had gotten so atrocious it’d become a running joke among the teachers. And by the time I reached college, I finally admitted to myself that my neurological condition made a few things in life definitively harder.


Yet the first time a classmate recommended, after watching me take notes by hand, that I ask for an accommodation to help with written exams, I balked.

I told him that anything they could offer felt unfair, since extra time would also let me think over questions longer than other students could. Besides, I type faster than average. My tremors, which this friend pointed out are explicitly covered under the Americans with Disabilities Act (or ADA), are kinetic. That means that most of the time, they’re mild, especially when I’m at rest, though they spike drastically under certain conditions. But I knew in the back of my head that the real reason I didn’t want an accommodation was because it’d mean officially claiming a disability.

It wasn’t that I was ashamed of the classification. Instead, I felt like calling my tremors a “disability” would cheapen the word.

I’d grown up around adults with tremors who made no bones about them, and around people with far more obvious disabilities—like my hemiplegic mother. To me, a disability was something that interfered with pretty much every aspect of your life.

I felt like calling my tremors a ‘disability’ would cheapen the word.

I’m not alone in this. In forums and blogs, and among my friends with brain damage, serious mental illness, or other less-apparent issues, plenty of people want to know whether it’s accurate, acceptable, or functionally vital to call themselves disabled. But there just aren’t that many guidelines out there.

Yet after sustaining a series of minor injuries in recent years with lasting effects on my chest and shoulder, each making certain tasks incrementally harder, I've found myself questioning my status more and more.

So after scalding myself with hot tea and feeling something go pop in my ribcage while working out for the umpteenth time, I decided it was time to try to resolve this debate within myself once and for all by reaching out to as many experts on disability identity and politics as I could.

"The first thing to know is that there's a difference between disability and impairment," says Professor Lex Frieden, a wheelchair-bound spinal cord injury victim and architect of the ADA.

“Impairment relates specifically to the kind of condition that somebody might have,” explains Frieden. “It’s kind of a general description” of whatever affects you physically or mentally.

Disability, on the other hand, is harder to define. There are legal definitions, the best known of which is the ADA’s, which counts any physical or mental impairment that limits at least one major life activity as a disability. But other regulations differ—under the Social Security Disability Insurance program, you need to be limited to the point of being unable to work. Different states have different barriers and metrics when it comes to measuring disability for parking permits or accommodations. And what counts as a “reasonable accommodation” by an employer is surprisingly hazy, which makes it a challenge to effectively argue an ADA discrimination case.

Frieden says that disabilities really come down to context and interpretation. “If you have a vision problem, for example, that doesn’t prevent you from driving, then you don’t have a disability when you’re driving,” he says. “But if that vision problem prevents you from being a professional target shooter, then you do have a disability” if that’s your dream vocation.

This conditional view of disability is a fine way of thinking about it as a legal or physical state. It helps us determine who is eligible for which benefits. But a conditional view of disability doesn’t do much for someone who’s just trying to figure out whether it’s okay to think of himself or herself as disabled.

For example, says Carrie Sandahl, the head of the University of Illinois at Chicago’s Program on Disability Art, Culture, and Humanities, if someone has a disfigured hand that doesn’t actually limit his or her functional abilities, someone else can still see the difference and treat that person differently. The ADA has a provision protecting people who experience disability discrimination when they might not actually have a disability, so that individual might be entitled to some form of "disability" benefit or consideration. But even if the individual claims that benefit, he or she might still not think it's okay to identify as disabled if he or she is only perceived as such, but not physically prevented from doing anything.

The fuzziness of who’s roped into disability as a concept or identity, versus who’s entitled to a benefit, can cause serious issues—especially when people with disabilities police each other over discrepancies between one’s apparent identity and claimed community or benefits.

“You witness people with some impairments who are able to have more access to things than people with other kinds of impairments,” says Sandahl of her own experience. “You can call it a disability hierarchy… We’re able to come up with a bunch of ways to differentiate amongst us.”

Cheryl Green, an artist working with brain injury victims whose own brain injury is not readily apparent to many, describes a double-whammy experience of what’s seen by some as a “lesser” disability: non-disabled people discredit or disbelieve the impairment’s impact, sometimes going so far as to deny accommodations, while more obviously disabled people may decide not to welcome someone like Green into the impaired community.

“[This] might help people get things that they need and keep other people from getting things that they don’t really need,” says Green, who argues that even though she’s able to pass as fully abled in certain contexts, that shouldn’t affect her commitment to or identification with disabled communities. “It also makes everybody into a vigilante instead of focusing [on] what the actual problem is—the idea that we don’t need to make the world as accessible as possible.”

To someone like me who’s dead terrified of overstepping and getting rebuked for claiming something beyond my need or (in)ability, this isn’t a reassuring status quo.

Many of those who'd be considered disabled by most people, at least from the outside, choose to shun the label—even if they might be able to benefit from it.

That’s not always because internally they decided their impairments didn’t affect them enough to warrant the identity; often it has to do with judgments from the outside world.

That said, Frieden believes that if Franklin Roosevelt were politicking today, he might not feel the need to hide his disability for fear of losing standing with world leaders or the electorate. “I think we’ve probably made a lot of progress in our culture by having people who don’t have disabilities willing to claim they do in order to get some modest benefit”—as with those feigning some type of impairment to get a better parking spot.

But both Frieden and Green think we’ve got a long way to go before we reach full acceptance, especially for people whose disabilities are invisible. As Green puts it: “Wheelchair is cool. Blind is cool. Deaf is cool because you get those cool interpreters at concerts who are fun to watch. The little things that people can latch onto that they kind of like. But cognitive and intellectual disabilities—that is a group that is really, really kind of hated by society.”

Self-righteous able-bodied people have taken to policing what they see as fraudulent claims to disabled status. These folks, points out Frieden, are often making a slew of wrong assumptions about those with truly debilitating heart conditions or respiratory problems. Occasionally, adds Sandahl, individuals with less outwardly conspicuous disabilities might feel as if they need to use a symbol of their impairment, like a cane. But if that cane user’s balance issues are only intermittent, then as soon as someone catches them walking cane-free, they’re sure to face an undue amount of scorn.

“That’s the reason why I think a lot of people with non-apparent disabilities have a hard time coming out,” she says, “because it’s difficult to signal to people, and then people are suspicious.”

Admitting we need support can feel like an admission of deep personal failure.

It’s all summed up, says Sandahl, in that regrettable pejorative idiom, “Use it as a crutch,” which has turned a vital mobility tool into something unnecessary to be transcended.  

“The medical model tells us that we should all be striving for normalcy,” says Sandahl. “That if we’re not [normal], we need to be corrected. We need to be rehabbed. We need special education. We need radiation. We should put our energy into approximating normal. Not only is it something we should strive for, but it’s our personal responsibility. So if we don’t try to ameliorate our conditions, then it’s [our] fault that [we] are impaired—that [we] haven’t tried hard enough… It’s just a perverse cultural construct.”

For Sandahl, who has mobility issues, this internal stigma forced her, against her best knowledge and logic, to avoid using a walking aid for a long time—until she blew out a hip. Then she opted to use a cane, only to develop scoliosis from relying too much on it. This led her to crutches, but she could only support her weight on her arms for so long, limiting her mobility. So eventually she decided to start using a wheelchair, which she admits made her feel like a lesser human.

“People will hurt themselves by not using an aid before they need it,” says Sandahl. I know from the pain in my joints and chest and twinge in my hands that I’ve done just that for years. And the twisted part is that, even as I write this, I worry that becoming disabled in my own mind will still feel wrong. As if I’d be an interloper, a fraud, rather than someone with a real need.

I've come to believe that if you know in your aching bones that you need an accommodation or a community to identify with, then you ought to claim your disability.

The problem becomes knowing the difference between what we need and what we want. I’m not entirely sure that I need people to help me carry a hot beverage every now and then, or if I just want one damn thing to go a little easier next time. (It sure would be nice to avoid tremor-splashing coffee on my next first date, though.)

I worry about making the wrong call. If I claim more for myself than I should, I’ll be one more person contributing to a diminished perception of others with more severe impairments. If I claim less than I should, my complacency might lead to more vigilantism against people with less evident but still impactful conditions. I also still worry about the way others will judge me for either identifying as disabled, or claiming an unnecessary benefit—or both. This continuing murkiness will probably drive me to claim a disability benefit on occasion and only situationally refer to myself as “minorly” disabled.

But that's just my personal calculation. If, like me, you’ve found yourself asking whether something's bad enough to merit help, or just to call yourself disabled, I hope you’ll give yourself permission to answer that question honestly.

This article originally appeared on GOOD.

A waiter talking with his hands.

One of the great things about America is that we have a relatively young culture, so many of the foods that we eat were brought over from other countries. That makes America a great place to try out all the different types of food from around the world.

However, we also like to put our own stamp on staples from around the globe that give the American version its own unique flair. Some foods that we claim originated overseas were actually first made right here in the U.S. of A. For example, chimichangas, which can be found in many Mexican restaurants, actually originated in the state of Arizona. Crab Rangoon, a popular “Chinese” dish, was actually invented in San Francisco, and spaghetti and meatballs were never a thing in Italy.

TikTok creator Gabby Donahue posted a video that’s the perfect example of how some ethnic foods get remixed once they become popular in the States. In a video with over 7 million views, her father shows a waiter in Italy a photo of chicken parmesan from Olive Garden so he can order it at the restaurant. The waiter’s reaction is an excellent example of someone trying to be polite while struggling to believe what he is seeing.

“My Boston Irish father trying to order a Google image of the Olive Garden chicken parm in Italy,” Donahue wrote in the text overlay.



When the father showed the picture to the waiter, the waiter seemed a bit confused by the image. “Only in the States,” he said. “It doesn’t exist in Italy.” The father couldn’t believe what he was hearing: “It doesn’t exist in Italy?”

“I don’t know what it is…on the pasta?” the waiter said, trying to make sense of the chicken breast smothered in cheese and sauce. The waiter gave his final verdict while holding his chin: “No. That’s horrible.”

“Horrible? Wow. Look at that. That doesn’t,” the father laughed. “That looks good… but,” the waiter said, shrugging him off. “It does look good,” the father continued. “It tastes good. I’ll tell you what, I’m gonna mail you some. I’ll send it to you.”

“Okay? Olive Garden chicken, I’m gonna search,” the waiter said, walking away from the table.


The commenters had a field day analyzing the waiter’s body language. “‘No, that looks good’ while looking completely disgusted was the most Italian reaction ever,” one commenter wrote. “Bro remembered halfway through his disgust that he’s at work,” another added.

It’s not crazy that an American would think that chicken parmesan is an Italian dish; after all, it’s served in most Italian-American restaurants. However, according to Paesana, it was created in America by the Italian diaspora.

“In the Old World, that’s Italy prior to the Italian diaspora—the large-scale emigration of Italians from Italy to America—proteins like chicken were not widely available," according to an article on the site. "As such, the prototypical chicken parmigiana was actually made with breaded, fried slices of eggplant in place of chicken for a dish called melanzane alla Parmigiana."


Even though chicken parmesan didn’t originate in the old country, Pasquale Sciarappa, a popular Italian-born food influencer living in America, has no problem cooking the dish.

"'That’s not Italian!’ I hear this every time I share a dish like Chicken Parmigiana. And you know what? They’re right — it’s not something you’d traditionally find in Italy. But you know what else is true? It’s Italian-American. It was born in immigrant kitchens — from people who left Italy, landed in the U.S., and made do with what they had. They took inspiration from dishes like melanzane alla parmigiana and recreated comfort from memory using what was available,” he wrote.

It’s understandable that an American could go to Italy without knowing that something he’d had in Italian restaurants wasn’t actually from Italy. It’s equally understandable for an Italian server to balk at a photo of a dish served at the kind of American restaurant you’d find in a shopping mall.

But we should all agree that one of the wonderful things about American culture is that it's an amalgamation of different cultures stirred around in the same pot, and if that means we get a fresh variation on the burrito, a new way to eat Chinese crab, or a tasty piece of chicken where eggplant used to be, so much the better.


Stop struggling with small talk by using the easy 'COST method'

This simple acronym will make your next social gathering a lot more enjoyable.

A man and woman chatting at a dinner party.

There are several reasons why people are hesitant to engage in small talk at a party or around the water cooler at work. Some people simply avoid it because they don’t find chatting about the weather, sports, or what they saw on television the night before very interesting.

Others are afraid that they may run out of things to say or that there will be an awkward pause that makes them want to bury their head in the sand, like an ostrich. Mary, a friendship educator with a degree in interpersonal communication, has a solution for those of us who want to be friendly and meet people but abhor small talk; she calls it the COST method.

What is the COST method for making small talk?

According to Mary, who goes by @better.social.skills on TikTok, COST stands for Compliment, Observation, Story, and Tip. These are four options you can turn to when you're in need of a conversation topic.


1. Compliment

“Tell somebody you like their shirt or shoes, for example, and see where the conversation leads,” Mary says.

“Oh, I like your shoes.”

“I like your shirt.”

“You have such a soothing voice.”

2. Observation

“Remark on something happening around you,” Mary says.

“This song is amazing.”

“I really love how Jeanie decorated this room.”

“There’s a lot more people here than last night.”

3. Story

“Share a little anecdote about yourself. For example, maybe you were late to the party for some reason, or you’re excited to get home and watch a show you’re loving,” she said.

4. Tip

“Give a small recommendation to someone. For example, where the shortest bathroom lines are, which food is particularly delicious, or point out an interesting person they might want to talk to,” Mary said.

“I don’t know if you’ve tried the new Mexican place on South Street yet…”

“I’d have one of Jeanie’s margaritas now, before they are all gone.”

“Be careful if you talk to Brian. He can get a bit long-winded.”

Three people chatting before a movie. via Canva/Photos

The great thing, if you’re a little shy about making small talk, is that studies show that you definitely don’t need to do all the heavy lifting in the conversation. In fact, a Gong.io study found that the best way to make a connection with someone is to speak 43% of the time and let your new friend talk for the other 57% of the conversation.

Further proof that question-asking makes a great first impression comes from a study published in the Journal of Personality and Social Psychology. It found that when meeting someone for the first time, asking a question and then at least two follow-up questions before discussing yourself dramatically increases your likability.

“We identify a robust and consistent relationship between question-asking and liking,” the authors of the study write. “People who ask more questions, particularly follow-up questions, are better liked by their conversation partners.”

For those of you who have always felt that you're bad at making small talk while others seem to do it naturally, realize that people aren’t born great communicators; it’s a skill that can be learned just like anything else. With a few tips from the experts, you can go from dreading small talk to enjoying striking up a conversation with just about anyone.

Harvard researcher Arthur C. Brooks studies what leads to human happiness.

We live in a society that prizes ambition, celebrating goal-setting and hustle culture as praiseworthy vehicles on the road to success. We also live in a society that associates successfully getting whatever our hearts desire with happiness. The formula we internalize from an early age is that desire + ambition + goal-setting + doing what it takes = a successful, happy life.

But as Harvard University happiness researcher Arthur C. Brooks has found, in his studies as well as his own experience, happiness doesn't follow that formula. "It took me too long to figure this one out," Brooks told podcast host Tim Ferriss, explaining why he uses a "reverse bucket list" to live a happier life.

Many people make bucket lists of things they want in life. Giphy

Brooks shared that on his birthday, he would always make a list of his desires, ambitions, and things he wanted to accomplish—a bucket list. But when he was 50, he found his bucket list from when he was 40 and had an epiphany: "I looked at that list from when I was 40, and I'd checked everything off that list. And I was less happy at 50 than I was at 40."

As a social scientist, he recognized that he was doing something wrong and analyzed it.

"This is a neurophysiological problem and a psychological problem all rolled into one handy package," he said. "I was making the mistake of thinking that my satisfaction would come from having more. And the truth of the matter is that lasting and stable satisfaction, which doesn't wear off in a minute, comes when you understand that your satisfaction is your haves divided by your wants…You can increase your satisfaction temporarily and inefficiently by having more, or permanently and securely by wanting less."

Brooks concluded that he needed a "reverse bucket list" that would help him "consciously detach" from his worldly wants and desires by simply writing them down and crossing them off.

"I know that these things are going to occur to me as natural goals," Brooks said, citing human evolutionary psychology. "But I do not want to be owned by them. I want to manage them." He discussed moving those desires from the instinctual limbic system to the conscious pre-frontal cortex by examining each one and saying, "Maybe I get it, maybe I don't," but crossing them off as attachments. "And I'm free…it works," he said.


"When I write them down, I acknowledge that I have the desire," he explained on X. "When I cross them out, I acknowledge that I will not be attached to this goal."

The idea that attachment itself causes unhappiness is a concept found in many spiritual traditions, but it is most closely associated with Buddhism. Mike Brooks, PhD, explains that humans need healthy attachments, such as an attachment to staying alive and attachments to loved ones, to avoid suffering. But many things to which we are attached are not necessarily healthy, either by degree (over-attachment) or by nature (being attached to things that are impermanent).

"We should strive for flexibility in our attachments because the objects of our attachment are inherently in flux," Brooks writes in Psychology Today. "In this way, we suffer unnecessarily when we don't accept their impermanent nature."

Arthur C. Brooks suggests that we strive to detach ourselves from our wants and desires because the simplest way to solve the ‘haves/wants = happiness’ formula is to reduce the denominator. The reverse bucket list, in which you cross off desires before you fulfill them, can help free you from attachment and lead to a happier overall existence.

Did Julius Caesar have his armpits plucked? Probably.

Modern life may have us shaving, waxing, microblading, laser treating, Botoxing, and altering our natural appearance in all manner of ways in the name of beauty, but the idea of grooming to specific societal standards is nothing new. In cultures all around the world and throughout history, humans have found countless creative ways to make ourselves (ostensibly) look better.

Of course, what looks better is subjective and always has been. Take, for example, the ancient Romans. If you wanted to be seen as a studly man 2,000 years ago in the Roman Empire, you'd remove as much of your body hair as possible. That meant tweezing—or being tweezed by someone else, most likely an enslaved person.

Armpit hair wasn't cool in ancient Rome. Giphy

The Romans, in general, weren't big on body hair for men or women.

"You had to have the look,” Cameron Moffett, English Heritage’s curator at the Wroxeter Roman City museum in Shropshire, U.K., told The Times. “And the look was hairlessness, particularly the underarms.” A collection of 50 tweezers on display at the museum, recovered from the archeological site that was once the Roman city of Viriconium, speaks to Roman tweezing habits, but that's not the only evidence we have.

Stoic philosopher Seneca once wrote in a letter lamenting how the noise from the Roman baths was disrupting his work: "Besides those who just have loud voices, imagine the skinny armpit-hair plucker whose cries are shrill to draw people's attention and never stop except when he's doing his job and making someone else shriek for him."

When we picture the ancient Romans, "skinny armpit-hair plucker" may not be the image that comes to mind, yet here we are.


They brushed with what now? Giphy

While we fret over fluoride, the Romans brushed their teeth with pee and mouse brains.

Toothpastes of the past were made with all kinds of things—herbs, spices, salts, crushed bone, and more. For the ancient Romans, that "more" included mouse brains and human urine, according to Decisions in Dentistry. Mouse brains were believed to enhance the effectiveness of toothpaste, and urine, imported in large quantities from Portugal, was utilized for its ammonia content and whitening properties. A standard Roman toothpaste would be a mixture of herbs, mouse brains, urine, and a binder such as honey. Oddly enough, it appeared to be somewhat effective, with archeological findings showing a relatively low number of cavities and tooth decay.


Meanwhile, in ancient Japan, women tried to blacken their teeth.

Teeth whitening is all the rage in modern times, but in the distant past in parts of Asia, making your teeth black was considered beautiful. The practice, known as ohaguro, was a Japanese tradition that, ironically, was intended to prevent tooth decay.

According to a letter in the British Dental Journal, women in ancient Japan would paint their teeth with a solution of ferric acetate (from iron filings), vinegar, and tannin from tea or vegetables. The solution, called kanemizu, made the teeth appear black. The practice has made a comeback in some rural areas of Southeast Asia, and the Vietnamese-American singer Sailorr has made waves with her blackened teeth as well.

An ornate ear picker. The Swedish History Museum, Stockholm/Wikimedia Commons

Ear pickers were much prettier than Q-tips. In fact, they were an accessory.

The old saying, "Don't put anything in your ear except your elbow," may not be as old as it seems, as people have been inserting objects into their ears to remove wax for a long time.

In the 16th and 17th centuries, it was common to see beautiful, ornate "ear pickers"—small metal tools with a small scoop at the end for cleaning ears as well as teeth and fingernails. According to Jamestown Rediscovery, it was fashionable to wear gold and silver toiletry tools, such as ear pickers or toothpicks, as accessories. It's hard to imagine wearing Q-tips and toothpicks around. Also, ew. But if you look up "ear pickers," you'll find ornate examples from various parts of the world.

At the very least, it's nice to know that modern humans are not the first ones to go to great—and sometimes interesting—lengths to meet an arbitrary social standard of beauty. (And three cheers for modern toothpaste. Seriously.)