

3 moments that might convince you Edgar Allan Poe was a time traveler.

In the case of Poe, it was his fiction that was, well, stranger than fiction.


I'm pretty positive that Edgar Allan Poe had (has?) the power to travel through time. Hear me out on this one.

It's not just the well-known circumstances of his life — orphaned at a young age, father of the detective story, master of cryptology, maestro of the macabre. Nor am I referring to the head-scratching details of the days leading up to his death: how he was found on the street near a polling place wearing someone else's clothes, and how, during his subsequent hospitalization, he allegedly babbled incoherently about an unidentified person named “Reynolds."

And I won't even get into the confounding reports of a nameless figure who, for seven decades, would show up to Poe's gravesite in the early hours of his birthday with a glass of cognac and three roses.



Tragic and curious, yes, but hardly evidence that the acclaimed horror writer could transcend the limits of space and time. No, my time travel theory concerns the author's creative output, which you'll soon see is so flukishly prophetic as to make my outlandish claim seem plausible — nay, probable!

The proof is in the pudding, and the pudding is a loosely linked map of flesh-eating floaters, crunched skull survivors, and primordial particles. OK, here we go…

Photo by Albert Sterner/Wikimedia Commons.

Exhibit A: "The Narrative of Arthur Gordon Pym of Nantucket"

Published in 1838, Poe's only completed novel details a mutiny on a whaling ship lost at sea. Out of supplies, the men revert to cannibalism, drawing straws to elect a sacrifice. A boy named Richard Parker draws the shortest straw and is subsequently eaten.

Now here's where it gets weird(er): In 1884, 46 years after the novel's publication, four men would be set adrift following the sinking of their yacht. Shipwrecked and without food, they too would go the survival cannibalism route, electing to kill and eat a 17-year-old cabin boy. The boy's name: Richard Parker.

The extraordinary parallel went unnoticed for nearly a century, until a widely circulated letter from a descendant of the real Parker outlined the similarities between the novel's scene and the actual event. The letter was selected for publication in The Sunday Times after journalist Arthur Koestler put out a call for tales of “striking coincidence." Striking indeed.

Image from the collection of Jack and Beverly Wilgus/Wikimedia Commons.

Exhibit B: "The Businessman"

In 1848, a railroad foreman named Phineas Gage suffered a traumatic brain injury when a tamping iron was driven through his skull. Somehow he survived, though his personality changed drastically. These behavioral changes were closely studied, allowing the medical community to develop its first understanding of the frontal lobe's role in social cognition.

Except for Poe, who'd inexplicably understood the profound personality changes caused by frontal lobe syndrome nearly a decade earlier. In 1840, he penned a characteristically gruesome story called “The Businessman" about an unnamed narrator who suffers a traumatic head injury as a young boy, leading to a life of obsessive regularity and violent, sociopathic outbursts.

Poe's grasp of frontal lobe syndrome is so precise that neurologist Eric Altshuler wrote, “There's a dozen symptoms and he knows every single one… There's everything in that story, we've hardly learned anything more." Altshuler, who, to reiterate, is a medically licensed neurologist and not at all a crackpot, went on to say, “It's so exact that it's just weird, it's like he had a time machine."

Photo via NASA/Wikimedia Commons.

Exhibit C: "Eureka"

Still unconvinced? What if I told you that Poe predicted the origins of the universe 80 years before modern science would begin to formulate the Big Bang theory? Surely, an amateur stargazer with no formal training in cosmology could not accurately describe the machinery of the universe, rejecting widely held inaccuracies while solving a theoretical paradox that had bewildered astronomers since Kepler. Except that's exactly what happened.

The prophetic vision came in the form of "Eureka," a 150-page prose poem critically panned for its complexity and regarded by many as the work of a madman. Written in the final year of Poe's life, "Eureka" describes an expanding universe that began in “one instantaneous flash" derived from a single “primordial particle."

Poe goes on to put forth the first legitimate solution to Olbers' paradox — the question of why, given the vast number of stars in the universe, the night sky is dark — by explaining that light from the expanding universe had not yet reached our solar system. When Edward Robert Harrison published "Darkness at Night" in 1987, he credited "Eureka" as having anticipated his findings.

In an interview with Nautilus, Italian astronomer Alberto Cappi speaks of Poe's prescience, admitting, “It's surprising that Poe arrived at his dynamically evolving universe because there was no observational or theoretical evidence suggesting such a possibility. No astronomer in Poe's day could imagine a non-static universe."

Photo from Dodd, Mead and Company/Wikimedia Commons.

But what if Poe wasn't of a day at all, but of all the days?

What if his written prophecies — on the cannibalistic demise of Richard Parker, the symptoms of frontal lobe syndrome, and the Big Bang theory — were merely reportage from his journey through the extratemporal continuum?

Surely I sound like a tinfoil-capped loon, but maybe, maybe, there are many more prophecies scattered throughout the author's work, a possibility made all the more likely by the fact that, as The New York Times notes, “Poe was so undervalued for so long, there is not a lot of Poe-related material around."

I'll leave you with this quote, taken from a letter that Poe wrote to James Russell Lowell in 1844, in which he apologizes for his absence and slothfulness:

"I live continually in a reverie of the future. I have no faith in human perfectibility. I think that human exertion will have no appreciable effect upon humanity. Man is now only more active — not more happy — nor more wise, than he was 6000 years ago. The result will never vary — and to suppose that it will, is to suppose that the foregone man has lived in vain — that the foregone time is but the rudiment of the future — that the myriads who have perished have not been upon equal footing with ourselves — nor are we with our posterity. I cannot agree to lose sight of man the individual, in man the mass… You speak of “an estimate of my life" — and, from what I have already said, you will see that I have none to give. I have been too deeply conscious of the mutability and evanescence of temporal things, to give any continuous effort to anything — to be consistent in anything. My life has been whim — impulse — passion — a longing for solitude — a scorn of all things present, in an earnest desire for the future."


This story was originally published on HistoryBuff and first appeared on 8.16.16




You may not know Gladys West, but her calculations revolutionized navigation.

She couldn't have imagined how much her calculations would affect the world.

US Air Force/Wikimedia Commons.

Dr. Gladys West is inducted into the Air Force Space and Missile Pioneers Hall of Fame, 2018.

This article originally appeared on 02.08.18


If you've never driven your car into a lake, thank Gladys West.

She is one of the mathematicians responsible for developing the global positioning system, better known as GPS.

Like many of the black women responsible for American achievements in math and science, West isn't exactly a household name. But after she mentioned her contribution in a biography she wrote for a sorority function, her community turned their attention to this local "hidden figure."


West was one of only four black employees at the Naval Proving Ground in 1956.

She accepted a position at the Dahlgren, Virginia, facility doing calculations, with her early work focusing on satellites. West also programmed early computers and examined the information that determined the precise location and elevation of satellites in space. Her data collection and calculations would ultimately aid in the development of GPS.

Employee testing the circuits on a supercomputer, 1950s.

U.S. Census Bureau employees/Wikimedia Commons.

West and her colleagues back then probably could not have imagined just how much their calculations would affect the world.

Pretty much every "smart" device — from cellphones to fridges to dog collars — has GPS capabilities these days. The technology has changed the way we play, work, navigate, and explore our communities.

"When you're working every day, you're not thinking, 'What impact is this going to have on the world?' You're thinking, 'I've got to get this right,'" West once said in an interview with The Associated Press.


GPS has been integrated into many of the devices we use today.

Photo by Psk Slayer on Unsplash

West would continue her work until her retirement in 1998.

After more than 40 years of calculations and complex data analysis, West retired. Following a well-earned vacation with her husband, she suffered a major stroke. But during her recovery, she worked toward returning to school and earned a doctorate. Her determination led her to regain most of her mobility, and she even survived heart surgery and cancer years later.

While she may not be as well known as other women in STEM fields, West's contribution is undeniable.

At 87, West is working on her memoir and spending time with her husband, children, and grandchildren. And according to her oldest daughter, West — despite the advent of GPS — still likes to have a paper map on hand.

Who are we to argue with greatness?

Unless you're a child, New York City resident, or UPS driver, chances are you've made a left turn in your car at least once this week.

Chances are, you didn't think too much about how you did it or why you did it that way.

You just clicked on your turn signal...

...and turned left.

GIF from United States Auto Club.


The New York State Department of Motor Vehicles instructs drivers to "try to use the left side of the intersection to help make sure that you do not interfere with traffic headed toward you that wants to turn left," as depicted in this thrilling official state government animation:

GIF from New York Department of Motor Vehicles.

Slick, smooth, and — in theory — as safe as can be.

Your Driver's Ed teacher would give you full marks for that beautifully executed maneuver.

GIF from "Baywatch"/NBC.

Your great-grandfather, on the other hand, would be horrified.

GIF from "Are You Afraid of the Dark"/Nickelodeon.

Before 1930, if you wanted to hang a left in a medium-to-large American city, you most likely did it like so:

Photo via Fighting Traffic/Facebook.

Instead of proceeding in an arc across the intersection, drivers carefully proceeded straight out across the center line of the road they were turning on and turned at a near-90-degree angle.

Often, there was a giant cast-iron tower in the middle of the road to make sure drivers didn't cheat.

Some were pretty big. Photo by Topical Press Agency/Getty Images.

These old-timey driving rules transformed busy intersections into informal roundabouts, forcing cars to slow down so that they didn't hit pedestrians from behind.

GIF from "Time After Time"/Warner Bros.

Or so that, if they did, it wasn't too painful.

"There was a real struggle first of all by the urban majority against cars taking over the street, and then a sort of counter-struggle by the people who wanted to sell cars," explains Peter Norton, Associate Professor of History at the University of Virginia and author of "Fighting Traffic: The Dawn of the Motor Age in the American City."

Norton posted the vintage left-turn instructional image, originally published in a 1919 St. Louis drivers' manual, to Facebook on July 9. While regulations were laxer in suburban and rural areas, he explains, the sharp right-angle turn was standard in nearly every major American city through the late '20s.

“That left turn rule was a real nuisance if you were a driver, but it was a real blessing if you were a walker," he says.

Early traffic laws focused mainly on protecting pedestrians from cars, which were considered a public menace.

Pedestrians on the Bowery in New York City, 1900. Photo by Hulton Archive/Getty Images.

For a few blissful decades after the automobile was invented, the question of how to prevent drivers from mowing down all of midtown every day was front-of-mind for many urban policymakers.

Pedestrians, Norton explains, accounted for a whopping 75 percent of road deaths back then. City-dwellers who, unlike their country counterparts, often walked on streets were predictably pretty pissed about that.

In 1903, New York City implemented one of the first traffic ordinances in the country, which codified the right-angle left. Initially, no one knew or cared, so the following year, the city stuck a bunch of big metal towers in the middle of the intersections, which pretty well spelled things out.

A Traffic Tower keeps watch at the intersection of 42nd Street and 5th Avenue in New York City in 1925. Photo by Topical Press Agency/Getty Images.

Some cities installed unmanned versions, dubbed "silent policemen," which instructed motorists to "keep to the right."

Drivers finally got the message, and soon, the right-angle left turn spread to virtually every city in America.

Things were pretty good for pedestrians — for a while.

In the 1920s, that changed when automobile groups banded together to impose a shiny new left turn on America's drivers.

According to Norton, a sales slump in 1922 and 1923 convinced many automakers that they'd maxed out their market potential in big cities. Few people, it seemed, wanted to drive in urban America. Parking spaces were nonexistent, traffic was slow-moving, and turning left was a time-consuming hassle. Most importantly, there were too many people on the road.

In order to attract more customers, they needed to make cities more hospitable to cars.

Thus began an effort to shift the presumed owner of the road, "from the pedestrian to the driver."

FDR Drive off-ramps in 1955. Photo by Three Lions/Getty Images.

"It was a multi-front campaign," Norton says.

The lobbying started with local groups — taxi cab companies, truck fleet operators, car dealers associations — and eventually grew to include groups like the National Automobile Chamber of Commerce, which represented most major U.S. automakers.

Car advocates initially worked to take control of the traffic engineering profession. The first national firm of its kind, the Albert Erskine Bureau for Street Traffic Research, was founded at Harvard University in 1925 with funds from Studebaker to make recommendations to cities on how to design streets.

Driving fast, they argued, was not inherently dangerous, but something that could be safe with proper road design.

Drivers weren't responsible for road collisions. Pedestrians were.

Therefore, impeding traffic flow to give walkers an advantage at the expense of motor vehicle operators, they argued, was wasteful, inconvenient, and inefficient.

Out went the right-angle left turn.

Industry-led automotive interest groups began producing off-the-shelf traffic ordinances modeled on Los Angeles' driver-friendly 1925 traffic code, including our modern-day left turn, which was adopted by municipalities across the country.

The towering silent policemen were replaced by dome-shaped bumps called "traffic mushrooms," which could be driven over.

A modern "traffic mushroom" in Forbes, New South Wales. Photo by Mattinbgn/Wikimedia Commons.

Eventually the bumps were removed altogether. Barriers and double yellow lines that ended at the beginning of an intersection encouraged drivers to begin their left turns immediately.

The old way of hanging a left was mostly extinct by 1930 as the new, auto-friendly ordinances proved durable.

So ... is the new left turn better?

Yes. Also, no.

It's complicated.

The shift to a "car-dominant status quo," Norton explains, wasn't completely manufactured — nor entirely negative.

An L.A. motorway in 1953. Photo by L.J. Willinger/Getty Images.

As more Americans bought cars, public opinion of who should run the road really did change. The current left turn model is better and more efficient for drivers — who have to cross fewer lanes of traffic — and streets are less chaotic than they were in the early part of the 20th century.

Meanwhile, pedestrian deaths have declined markedly over the years. While walkers made up 75% of all traffic fatalities in the 1920s in some cities, by 2015, just over 5,000 pedestrians were killed by cars on the street, roughly 15% of all vehicle-related deaths.

There's a catch, of course.

While no one factor fully accounts for the decrease in pedestrian deaths, Norton believes the industry's success in making roadways completely inhospitable to walkers helps explain the trend.

Simply put, fewer people are hit because fewer people are crossing the street (or walking at all). The explosion of auto-friendly city ordinances — which, among other things, allowed drivers to make faster, more aggressive left turns — pushed people off the sidewalks and into their own vehicles.

When that happened, the nature of traffic accidents changed.

A man fixes a bent fender, 1953. Photo by Sherman/Three Lions/Getty Images.

"Very often, a person killed in a car in 1960 would have been a pedestrian a couple of decades earlier," Norton says.

We still live with that car-dominant model and the challenges that arise from it. Urban design that prioritizes drivers over walkers contributes to sprawl and, ultimately, to carbon emissions. A system engineered to facilitate auto movement also allows motor vehicle operators to avoid responsibility for sharing the street in subtle ways. The Centers for Disease Control and Prevention lists three tips to prevent injuries and deaths from car-human collisions — all for pedestrians, including "carrying a flashlight when walking," and "wearing retro-reflective clothing."

A Minneapolis Star-Tribune analysis found that, of over 3,000 total collisions with pedestrians (including 95 fatalities) in the Twin Cities area between 2010 and 2014, only 28 drivers were charged with and convicted of a crime — mostly misdemeanors.

Norton says he's encouraged, however, by recent efforts to reclaim city streets and make them safe for walkers.

Pedestrians walk through New York's Times Square, 2015. Photo by Spencer Platt/Getty Images.

That includes a push by groups like Transportation Alternatives to install pedestrian plazas and bike lanes and to promote bus rapid transit. It also includes Vision Zero, a safety initiative in cities across America, which aims to end traffic fatalities by upgrading road signage, lowering speed limits, and installing more traffic circles, among other things.

As a historian, Norton hopes Americans come to understand that the way we behave on the road isn't static or, necessarily, what we naturally prefer. Often, he explains, it results from hundreds of conscious decisions made over decades.

"We're surrounded by assumptions that are affecting our choices, and we don't know where those assumptions come from because we don't know our own history," he says.

Even something as mindless as hanging a left.

This article was originally published on July 14, 2017.

In the years following the bitter Civil War, a former Union general took a holiday originated by former Confederates and helped spread it across the entire country.

The holiday was Memorial Day, and the 2018 commemoration on May 28 marks the 150th anniversary of its official nationwide observance. The annual commemoration was born in the former Confederate States in 1866 and adopted by the United States in 1868. It is a holiday in which the nation honors its military dead.

Gen. John A. Logan, who headed the largest Union veterans fraternity at that time, the Grand Army of the Republic, is usually credited as being the originator of the holiday.


Civil War Union Gen. John A. Logan. Photo via the Library of Congress.

Yet when Logan established the holiday, he acknowledged its genesis among the Union's former enemies, saying, "It was not too late for the Union men of the nation to follow the example of the people of the South."

Cities and towns across America have for more than a century claimed to be the birthplace of Memorial Day.

But I and my co-author Daniel Bellware have sifted through the myths and half-truths and uncovered the authentic story of how this holiday came into being.

During 1866, the first year Memorial Day was observed in the South, a feature of the holiday emerged that made awareness, admiration and eventually imitation of it spread quickly to the North.

During the inaugural Memorial Day observances in Columbus, Georgia, many Southern participants — especially women — decorated graves of Confederate soldiers as well as those of their former enemies who fought for the Union.

Shortly after those first Memorial Day observances all across the South, newspaper coverage in the North was highly favorable to the ex-Confederates.

"The action of the ladies on this occasion, in burying whatever animosities or ill-feeling may have been engendered in the late war towards those who fought against them, is worthy of all praise and commendation," wrote one paper.

On May 9, 1866, the Cleveland Daily Leader lauded the Southern women during their first Memorial Day.

"The act was as beautiful as it was unselfish, and will be appreciated in the North."

The New York Commercial Advertiser, recognizing the magnanimous deeds of the women of Columbus, echoed the sentiment. "Let this incident, touching and beautiful as it is, impart to our Washington authorities a lesson in conciliation."

To be sure, this sentiment was not unanimous. There were many in both parts of the U.S. who had no interest in conciliation.

But as a result of one of these news reports, Francis Miles Finch, a Northern judge, academic and poet, wrote a poem titled "The Blue and the Gray." Finch's poem quickly became part of the American literary canon. He explained what inspired him to write it:

"It struck me that the South was holding out a friendly hand, and that it was our duty, not only as conquerors, but as men and their fellow citizens of the nation, to grasp it."

Finch's poem seemed to extend a full pardon to the South: "They banish our anger forever when they laurel the graves of our dead" was one of the lines.

Almost immediately, the poem circulated across America in books, magazines and newspapers. By the end of the 19th century, school children everywhere were required to memorize Finch's poem.

Not just poems: Sheet music written to commemorate Memorial Day in 1870. Image via the Library of Congress.

As Finch's poem circulated the country, the Southern Memorial Day holiday became a familiar phenomenon throughout America.

Logan was aware of the forgiving sentiments of people like Finch. When Logan's order establishing Memorial Day was published in various newspapers in May 1868, Finch's poem was sometimes appended to the order.

It was not long before Northerners decided that they would not only adopt the Southern custom of Memorial Day, but also the Southern custom of "burying the hatchet."

A group of Union veterans explained their intentions in a letter to the Philadelphia Evening Telegraph on May 28, 1869:

"Wishing to bury forever the harsh feelings engendered by the war, Post 19 has decided not to pass by the graves of the Confederates sleeping in our lines, but divide each year between the blue and the grey the first floral offerings of a common country. We have no powerless foes. Post 19 thinks of the Southern dead only as brave men."

Other reports of reciprocal magnanimity circulated in the North, including the gesture of a 10-year-old who made a wreath of flowers and sent it to the overseer of the holiday, a Col. Leaming in Lafayette, Indiana, with the following note attached, published in The New Hampshire Patriot on July 15, 1868:

"Will you please put this wreath upon some rebel soldier’s grave? My dear papa is buried at Andersonville, (Georgia) and perhaps some little girl will be kind enough to put a few flowers upon his grave."

Although not known by many today, the early evolution of the Memorial Day holiday was a manifestation of Abraham Lincoln's hope for reconciliation between North and South.

Lincoln's wish was that there be "malice toward none" and "charity for all." These wishes were clearly fulfilled in the magnanimous actions of citizens on both sides, who extended an olive branch during those very first Memorial Day observances.

This story originally appeared on The Conversation and is printed here with permission.