3 moments that might convince you Edgar Allan Poe was a time traveler.

In the case of Poe, it was his fiction that was, well, stranger than fiction.


I'm pretty positive that Edgar Allan Poe had (has?) the power to travel through time. Hear me out on this one.

It's not just the well-known circumstances of his life — orphaned at a young age, father of the detective story, master of cryptology, maestro of the macabre. Nor am I referring to the head-scratching details of the days leading up to his death: how he was found on the street near a polling place wearing someone else's clothes, and how, during his subsequent hospitalization, he allegedly babbled incoherently about an unidentified person named “Reynolds.”

And I won't even get into the confounding reports of a nameless figure who, for seven decades, would show up to Poe's gravesite in the early hours of his birthday with a glass of cognac and three roses.



Tragic and curious, yes, but hardly evidence that the acclaimed horror writer could transcend the limits of space and time. No, my time travel theory concerns the author's creative output, which you'll soon see is so flukishly prophetic as to make my outlandish claim seem plausible — nay, probable!

The proof is in the pudding, and the pudding is a loosely linked map of flesh-eating floaters, crunched skull survivors, and primordial particles. OK, here we go…

Photo by Albert Sterner/Wikimedia Commons.

Exhibit A: "The Narrative of Arthur Gordon Pym of Nantucket"

Published in 1838, Poe's only completed novel details a mutiny on a whaling ship lost at sea. Out of supplies, the men revert to cannibalism, drawing straws to elect a sacrifice. A boy named Richard Parker draws the shortest straw and is subsequently eaten.

Now here's where it gets weird(er): In 1884, 46 years after the novel's publication, four men would be set adrift following the sinking of their yacht. Shipwrecked and without food, they too would go the survival cannibalism route, electing to kill and eat a 17-year-old cabin boy. The boy's name: Richard Parker.

The extraordinary parallel went unnoticed for nearly a century, until a widely circulated letter from a descendant of the real Parker outlined the similarities between the novel's scene and the actual event. The letter was selected for publication in The Sunday Times after journalist Arthur Koestler put out a call for tales of “striking coincidence.” Striking indeed.

Image from the collection of Jack and Beverly Wilgus/Wikimedia Commons.

Exhibit B: "The Businessman"

In 1848, a railroad worker named Phineas Gage suffered a traumatic brain injury when an iron rod was driven through his skull. Somehow he survived, though his personality changed drastically. These behavioral changes were closely studied, allowing the medical community to develop its first understanding of the role the frontal lobe plays in social cognition.

Except Poe had inexplicably understood the profound personality changes caused by frontal lobe syndrome nearly a decade earlier. In 1840, he penned a characteristically gruesome story called “The Businessman” about an unnamed narrator who suffers a traumatic head injury as a young boy, leading to a life of obsessive regularity and violent, sociopathic outbursts.

Poe's grasp of frontal lobe syndrome is so precise that neurologist Eric Altshuler wrote, “There's a dozen symptoms and he knows every single one… There's everything in that story, we've hardly learned anything more.” Altshuler, who, to reiterate, is a medically licensed neurologist and not at all a crackpot, went on to say, “It's so exact that it's just weird, it's like he had a time machine.”

Photo via NASA/Wikimedia Commons.

Exhibit C: "Eureka"

Still unconvinced? What if I told you that Poe predicted the origins of the universe 80 years before modern science would begin to formulate the Big Bang theory? Surely, an amateur stargazer with no formal training in cosmology could not accurately describe the machinery of the universe, rejecting widely held inaccuracies while solving a theoretical paradox that had bewildered astronomers since Kepler. Except that's exactly what happened.

The prophetic vision came in the form of "Eureka," a 150-page prose poem critically panned for its complexity and regarded by many as the work of a madman. Written in the final year of Poe's life, "Eureka" describes an expanding universe that began in “one instantaneous flash” derived from a single “primordial particle.”

Poe goes on to put forth the first legitimate solution to Olbers' paradox — the question of why, given the vast number of stars in the universe, the night sky is dark — by explaining that light from the expanding universe had not yet reached our solar system. When Edward Robert Harrison published "Darkness at Night" in 1987, he credited "Eureka" as having anticipated his findings.

In an interview with Nautilus, Italian astronomer Alberto Cappi speaks of Poe's prescience, admitting, “It's surprising that Poe arrived at his dynamically evolving universe because there was no observational or theoretical evidence suggesting such a possibility. No astronomer in Poe's day could imagine a non-static universe.”

Photo from Dodd, Mead and Company/Wikimedia Commons.

But what if Poe wasn't of a day at all, but of all the days?

What if his written prophecies — on the cannibalistic demise of Richard Parker, the symptoms of frontal lobe syndrome, and the Big Bang theory — were merely reportage from his journey through the extratemporal continuum?

Surely I sound like a tinfoil-capped loon, but maybe, maybe, there are many more prophecies scattered throughout the author's work, a possibility made all the more likely by the fact that, as The New York Times notes, “Poe was so undervalued for so long, there is not a lot of Poe-related material around.”

I'll leave you with this quote, taken from a letter that Poe wrote to James Russell Lowell in 1844, in which he apologizes for his absence and slothfulness:

"I live continually in a reverie of the future. I have no faith in human perfectibility. I think that human exertion will have no appreciable effect upon humanity. Man is now only more active — not more happy — nor more wise, than he was 6000 years ago. The result will never vary — and to suppose that it will, is to suppose that the foregone man has lived in vain — that the foregone time is but the rudiment of the future — that the myriads who have perished have not been upon equal footing with ourselves — nor are we with our posterity. I cannot agree to lose sight of man the individual, in man the mass… You speak of “an estimate of my life" — and, from what I have already said, you will see that I have none to give. I have been too deeply conscious of the mutability and evanescence of temporal things, to give any continuous effort to anything — to be consistent in anything. My life has been whim — impulse — passion — a longing for solitude — a scorn of all things present, in an earnest desire for the future."


This story was originally published on HistoryBuff and first appeared on 8.16.16



Unless you're a child, New York City resident, or UPS driver, chances are you've made a left turn in your car at least once this week.

Chances are, you didn't think too much about how you did it or why you did it that way.

You just clicked on your turn signal...

...and turned left.

GIF from United States Auto Club.


The New York State Department of Motor Vehicles instructs drivers to "try to use the left side of the intersection to help make sure that you do not interfere with traffic headed toward you that wants to turn left," as depicted in this thrilling official state government animation:

GIF from New York Department of Motor Vehicles.

Slick, smooth, and — in theory — as safe as can be.

Your Driver's Ed teacher would give you full marks for that beautifully executed maneuver.

GIF from "Baywatch"/NBC.

Your great-grandfather, on the other hand, would be horrified.

GIF from "Are You Afraid of the Dark"/Nickelodeon.

Before 1930, if you wanted to hang a left in a medium-to-large American city, you most likely did it like so:

Photo via Fighting Traffic/Facebook.

Instead of sweeping through the intersection in an arc, drivers carefully pulled straight out past the center line of the road they were turning onto and made a near-90-degree turn.

Often, there was a giant cast-iron tower in the middle of the road to make sure drivers didn't cheat.

Some were pretty big. Photo by Topical Press Agency/Getty Images.

These old-timey driving rules transformed busy intersections into informal roundabouts, forcing cars to slow down so that they didn't hit pedestrians from behind.

GIF from "Time After Time"/Warner Bros.

Or so that, if they did, it wasn't too painful.

"There was a real struggle first of all by the urban majority against cars taking over the street, and then a sort of counter-struggle by the people who wanted to sell cars," explains Peter Norton, Associate Professor of History at the University of Virginia and author of "Fighting Traffic: The Dawn of the Motor Age in the American City."

Norton posted the vintage left-turn instructional image, originally published in a 1919 St. Louis drivers' manual, to Facebook on July 9. While regulations were laxer in suburban and rural areas, he explains, the sharp right-angle turn was standard in nearly every major American city through the late '20s.

“That left turn rule was a real nuisance if you were a driver, but it was a real blessing if you were a walker,” he says.

Early traffic laws focused mainly on protecting pedestrians from cars, which were considered a public menace.

Pedestrians on the Bowery in New York City, 1900. Photo by Hulton Archive/Getty Images.

For a few blissful decades after the automobile was invented, the question of how to prevent drivers from mowing down all of midtown every day was front-of-mind for many urban policymakers.

Pedestrians, Norton explains, accounted for a whopping 75 percent of road deaths back then. City-dwellers, who, unlike their country counterparts, often walked on the streets, were predictably pretty pissed about that.

In 1903, New York City implemented one of the first traffic ordinances in the country, which codified the right-angle left. Initially, no one knew or cared, so the following year, the city stuck a bunch of big metal towers in the middle of the intersections, which pretty well spelled things out.

A Traffic Tower keeps watch at the intersection of 42nd Street and 5th Avenue in New York City in 1925. Photo by Topical Press Agency/Getty Images.

Some cities installed unmanned versions, dubbed "silent policemen," which instructed motorists to "keep to the right."

Drivers finally got the message, and soon, the right-angle left turn spread to virtually every city in America.

Things were pretty good for pedestrians — for a while.

In the 1920s, that changed when automobile groups banded together to impose a shiny new left turn on America's drivers.

According to Norton, a sales slump from 1922 to 1923 convinced many automakers that they'd maxed out their market potential in big cities. Few people, it seemed, wanted to drive in urban America. Parking spaces were nonexistent, traffic was slow-moving, and turning left was a time-consuming hassle. Most importantly, there were too many people on the road.

In order to attract more customers, they needed to make cities more hospitable to cars.

Thus began an effort to shift the presumed owner of the road, "from the pedestrian to the driver."

FDR Drive off-ramps in 1955. Photo by Three Lions/Getty Images.

"It was a multi-front campaign," Norton says.

The lobbying started with local groups — taxi cab companies, truck fleet operators, car dealers associations — and eventually grew to include groups like the National Automobile Chamber of Commerce, which represented most major U.S. automakers.

Car advocates initially worked to take control of the traffic engineering profession. The first national firm, the Albert Erskine Bureau for Street Traffic Research, was founded at Harvard University in 1925 with funds from Studebaker to make recommendations to cities on how to design streets.

Driving fast, they argued, was not inherently dangerous, but something that could be safe with proper road design.

Drivers weren't responsible for road collisions. Pedestrians were.

Therefore, impeding traffic flow to give walkers an advantage at the expense of motor vehicle operators was, they argued, wasteful, inconvenient, and inefficient.

Out went the right-angle left turn.

Industry-led automotive interest groups began producing off-the-shelf traffic ordinances modeled on Los Angeles' driver-friendly 1925 traffic code, including our modern-day left turn, which was adopted by municipalities across the country.

The towering silent policemen were replaced by dome-shaped bumps called "traffic mushrooms," which could be driven over.

A modern "traffic mushroom" in Forbes, New South Wales. Photo by Mattinbgn/Wikimedia Commons.

Eventually the bumps were removed altogether. Barriers and double yellow lines that ended at the beginning of an intersection encouraged drivers to begin their left turns immediately.

The old way of hanging a left was mostly extinct by 1930 as the new, auto-friendly ordinances proved durable.

So ... is the new left turn better?

Yes. Also, no.

It's complicated.

The shift to a "car-dominant status quo," Norton explains, wasn't completely manufactured — nor entirely negative.

An L.A. motorway in 1953. Photo by L.J. Willinger/Getty Images.

As more Americans bought cars, public opinion of who should run the road really did change. The current left turn model is better and more efficient for drivers — who have to cross fewer lanes of traffic — and streets are less chaotic than they were in the early part of the 20th century.

Meanwhile, pedestrian deaths have declined markedly over the years. While walkers made up 75% of all traffic fatalities in the 1920s in some cities, by 2015, just over 5,000 pedestrians were killed by cars on the street, roughly 15% of all vehicle-related deaths.

There's a catch, of course.

While no one factor fully accounts for the decrease in pedestrian deaths, Norton believes the industry's success in making roadways completely inhospitable to walkers helps explain the trend.

Simply put, fewer people are hit because fewer people are crossing the street (or walking at all). The explosion of auto-friendly city ordinances — which, among other things, allowed drivers to make faster, more aggressive left turns — pushed people off the sidewalks and into their own vehicles.

When that happened, the nature of traffic accidents changed.

A man fixes a bent fender, 1953. Photo by Sherman/Three Lions/Getty Images.

"Very often, a person killed in a car in 1960 would have been a pedestrian a couple of decades earlier," Norton says.

We still live with that car-dominant model and the challenges that arise from it. Urban design that prioritizes drivers over walkers contributes to sprawl and, ultimately, to carbon emissions. A system engineered to facilitate auto movement also allows motor vehicle operators to avoid responsibility for sharing the street in subtle ways. The Centers for Disease Control and Prevention lists three tips to prevent injuries and deaths from car-human collisions — all for pedestrians, including "carrying a flashlight when walking," and "wearing retro-reflective clothing."

A Minneapolis Star-Tribune analysis found that, of over 3,000 total collisions with pedestrians (including 95 fatalities) in the Twin Cities area between 2010 and 2014, only 28 drivers were charged with and convicted of a crime — mostly misdemeanors.

Norton says he's encouraged, however, by recent efforts to reclaim city streets and make them safe for walkers.

Pedestrians walk through New York's Times Square, 2015. Photo by Spencer Platt/Getty Images.

That includes a push by groups like Transportation Alternatives to install pedestrian plazas and bike lanes and to promote bus rapid transit. It also includes Vision Zero, a safety initiative in cities across America, which aims to end traffic fatalities by upgrading road signage, lowering speed limits, and installing more traffic circles, among other things.

As a historian, Norton hopes Americans come to understand that the way we behave on the road isn't static or, necessarily, what we naturally prefer. Often, he explains, it results from hundreds of conscious decisions made over decades.

"We're surrounded by assumptions that are affecting our choices, and we don't know where those assumptions come from because we don't know our own history," he says.

Even something as mindless as hanging a left.

This article was originally published on July 14, 2017.

In the years following the bitter Civil War, a former Union general took a holiday originated by former Confederates and helped spread it across the entire country.

The holiday was Memorial Day, and the 2018 commemoration on May 28 marks the 150th anniversary of its official nationwide observance. The annual commemoration was born in the former Confederate States in 1866 and adopted by the United States in 1868. It is a holiday in which the nation honors its military dead.

Gen. John A. Logan, who headed the largest Union veterans fraternity at that time, the Grand Army of the Republic, is usually credited as being the originator of the holiday.


Civil War Union Gen. John A. Logan. Photo via the Library of Congress.

Yet when Logan established the holiday, he acknowledged its genesis among the Union's former enemies, saying, "It was not too late for the Union men of the nation to follow the example of the people of the South."

Cities and towns across America have for more than a century claimed to be the birthplace of Memorial Day.

But my co-author Daniel Bellware and I have sifted through the myths and half-truths and uncovered the authentic story of how this holiday came into being.

During 1866, the first year Memorial Day was observed in the South, a feature of the holiday emerged that made awareness, admiration and eventually imitation of it spread quickly to the North.

During the inaugural Memorial Day observances in Columbus, Georgia, many Southern participants — especially women — decorated graves of Confederate soldiers as well as those of their former enemies who fought for the Union.

Shortly after those first Memorial Day observances all across the South, newspaper coverage in the North was highly favorable to the ex-Confederates.

"The action of the ladies on this occasion, in burying whatever animosities or ill-feeling may have been engendered in the late war towards those who fought against them, is worthy of all praise and commendation," wrote one paper.

On May 9, 1866, the Cleveland Daily Leader lauded the Southern women during their first Memorial Day.

"The act was as beautiful as it was unselfish, and will be appreciated in the North."

The New York Commercial Advertiser, recognizing the magnanimous deeds of the women of Columbus, echoed the sentiment. "Let this incident, touching and beautiful as it is, impart to our Washington authorities a lesson in conciliation."

To be sure, this sentiment was not unanimous. There were many in both parts of the U.S. who had no interest in conciliation.

But as a result of one of these news reports, Francis Miles Finch, a Northern judge, academic and poet, wrote a poem titled "The Blue and the Gray." Finch's poem quickly became part of the American literary canon. He explained what inspired him to write it:

"It struck me that the South was holding out a friendly hand, and that it was our duty, not only as conquerors, but as men and their fellow citizens of the nation, to grasp it."

Finch's poem seemed to extend a full pardon to the South: "They banish our anger forever when they laurel the graves of our dead" was one of the lines.

Almost immediately, the poem circulated across America in books, magazines and newspapers. By the end of the 19th century, school children everywhere were required to memorize Finch's poem.

Not just poems: Sheet music written to commemorate Memorial Day in 1870. Image via the Library of Congress.

As Finch's poem circulated the country, the Southern Memorial Day holiday became a familiar phenomenon throughout America.

Logan was aware of the forgiving sentiments of people like Finch. When Logan's order establishing Memorial Day was published in various newspapers in May 1868, Finch's poem was sometimes appended to the order.

It was not long before Northerners decided that they would not only adopt the Southern custom of Memorial Day, but also the Southern custom of "burying the hatchet."

A group of Union veterans explained their intentions in a letter to the Philadelphia Evening Telegraph on May 28, 1869:

"Wishing to bury forever the harsh feelings engendered by the war, Post 19 has decided not to pass by the graves of the Confederates sleeping in our lines, but divide each year between the blue and the grey the first floral offerings of a common country. We have no powerless foes. Post 19 thinks of the Southern dead only as brave men."

Other reports of reciprocal magnanimity circulated in the North, including the gesture of a 10-year-old who made a wreath of flowers and sent it to the overseer of the holiday, a Col. Leaming in Lafayette, Indiana, with the following note attached, published in The New Hampshire Patriot on July 15, 1868:

"Will you please put this wreath upon some rebel soldier’s grave? My dear papa is buried at Andersonville, (Georgia) and perhaps some little girl will be kind enough to put a few flowers upon his grave."

Although not known by many today, the early evolution of the Memorial Day holiday was a manifestation of Abraham Lincoln's hope for reconciliation between North and South.

Lincoln's wish was that there be "malice toward none" and "charity for all." These wishes were clearly fulfilled in the magnanimous actions of citizens on both sides, who extended an olive branch during those very first Memorial Day observances.

This story originally appeared on The Conversation and is printed here with permission.


In 1951, the University of San Francisco football team was living out a Cinderella season unlike any the school had seen in its history.

This small all-male school’s success was as unlikely as it was unexpected. Finishing the regular season undefeated, the team was poised to make a run at the national championship.

But despite its success on the field, the football team struggled to cover its mounting expenses. Keeping up with teams from bigger schools wasn’t cheap — USF’s football team had tallied a $70,000 deficit that year alone. USF’s ability to save not just its season, but the team’s future, hinged on the type of financial windfall that only a bowl game provides.


Shortly after their last game, the USF team got their bowl game invitation, but with it came a caveat:

In order for USF to play in the Orange Bowl, and collect the game’s $50,000, two players — Ollie Matson and Burl Toler — would be excluded for no other reason than the color of their skin.

The USF 1951 team. Image via San Francisco Foghorn/Flickr.

As one of the few desegregated teams in the top tier of college football, this wasn’t the first time that the 1951 USF team had experienced racism.

"Our first inkling of it came a year before when we went to play in Tulsa," Bob Weibel, a white player on the team, told the NFL. "They wouldn't let Burl and Ollie stay in the team hotel because it was white-only. They had to go across town and stay at the other one."

Racial slurs from opposing teams were also common.

But the players from San Francisco weren’t aware of how embedded racism truly was throughout institutions — like college football — until they got their ultimatum.

"These guys were very naïve," Kristine Setting Clark, author of the 2002 account "Undefeated, Untied, and Uninvited,” told the NFL. “To them, there was no color barrier. They looked at things like what happened in the Orange Bowl and thought, What's wrong with these people?"

Despite Jackie Robinson's success in knocking down the color barrier in baseball, systemic racism in sports was the rule, rather than the exception, in 1951. And the expectation at the time was that USF would need to bend to the will of the bowl organizers to continue their season and save their program.

The ultimatum could have driven the team apart, but instead, it brought the players closer together.

The team's response didn't take long. Within minutes of arriving home and learning of the unacceptable stipulation attached to the bowl game, the players' answer was singular and resounding:

"We told them to go to hell," said Bill Henneberry, USF's backup quarterback that season. "If Ollie and Burl didn't go, none of us were going. We walked out, and that was the end of it."

At the time, the team’s act of solidarity didn’t serve as a springboard for change or progress. In fact, the entire series of events went unreported and undiscussed for decades.

With their unequivocal response, the USF football team had sealed its fate — the program, now out of money, was shuttered before the next season.

The Orange Bowl announced that it was the team’s “soft schedule” that cost USF a chance to play in the game.

And San Francisco sports journalist Ira Blue, who had been covering the Dons’ season, reported from his sources that the Gator Bowl, Sugar Bowl, and Orange Bowl — all steeped in “Southern tradition” — mutually agreed to overlook teams with black players.

It wasn’t until 1990 that the vast majority of sports fans learned the true story.

Thirty-nine years after the team made its decision, Sports Illustrated chronicled USF's season and the fallout from that trying choice in “Best Team You've Never Met,” the first account of the events shared with a broad audience. The players and their courage were finally getting their due, albeit long after media coverage could have effected any real change in the sports world.

That feature proved to be the first of many later celebrations of the team’s quiet integrity in the face of grave consequences.

In 2006, the surviving team members were flown out by the Emerald Bowl so they could be honored at halftime. Two years later, Fiesta Bowl officials extended the same invitation.

Both the NFL and ESPN raced to turn the Dons’ story into a film. ESPN won, putting out “The ‘51 Dons” in 2014.

Many USF players from the 1951 team didn’t live long enough to enjoy the belated honors bestowed upon the team, but their mark on sports history remains nonetheless.

Their story conjures the very definition of integrity: doing what’s right regardless of the consequences. At the time, few would have faulted the team for accepting the conditions and taking the bowl bid. But the 1951 San Francisco Dons football players made their choice — because to them, the alternative was worse than the end of their football team.

“When you look back on it, I guess you could say we really were a team that was ahead of our time,” said USF's Henneberry in an NFL interview.

This story was produced as part of a campaign called "17 Days" with DICK'S Sporting Goods. These stories aim to shine a light on real occurrences of sports bringing people together.