7 ABSURD IDEAS THAT BECAME REALITY
Have you ever told a friend an idea of yours and been told, “That’s absurd! It will never work!” Or maybe your idea made people say, “That’ll never happen.” Well, these 7 Absurd Ideas became reality – some of them for an individual, and for others, they changed the way we look at the world and operate in it. All of them took time to come to fruition, and some took even longer to create lasting change. Check out these 7 Absurd Ideas that became reality:
The Bicycle

Karl von Drais, a German inventor, is generally credited with inventing the two-wheeled, human-powered vehicle variously called a swift walker, velocipede, Laufmaschine, or draisine. The first velocipedes had no pedals; riders powered them by running, pushing the two-wheeled contraption along, and then gliding with it, much like today’s skateboard.
Many initially considered the bicycle a novelty of sorts, something hobbyists would indulge in for fun. While the novel idea had its merits, some thought the new invention a fad and a nuisance. On August 9, 1819, New York City banned the velocipede from parks and sidewalks, with a $5 fine for breaking the ordinance.
Today, however, the bicycle isn’t a fad. It serves as an economical and environmentally friendly way to get from point A to point B. The bicycle is also a sport and one of the first rites of passage in childhood – removing the training wheels and riding on two wheels.
Handwashing

Handwashing is a hot topic from time to time, but most agree that it improves hygiene and prevents the spread of germs. In 1846, however, the medical community had not yet discovered microorganisms or embraced what would become germ theory.
Before germ theory, many practitioners in the 1800s snubbed handwashing, a practice not generally believed to thwart disease. Others strongly supported the idea, including Dr. Ignaz Semmelweis. The Hungarian doctor joined an obstetric clinic in Vienna in 1844, and it wasn’t long before he noticed stark differences between its maternity wards. Women died at an alarming rate after giving birth – up to a 30 percent mortality rate. However, the ward overseen by midwives recorded dramatically fewer deaths than the ward overseen by doctors. After studying both wards scientifically over two years, Semmelweis discovered one significant difference between them: doctors performed autopsies; midwives did not.
Semmelweis theorized that the doctors’ hands became contaminated during autopsies, spreading infection to women during delivery. He recommended washing hands and equipment with a chlorine solution. After the practice was implemented, mortality rates dropped, but resistance won out: Semmelweis was dismissed, and doctors returned to their previous routines.
Another proponent of handwashing was Florence Nightingale. The British nurse is noted for requiring various sanitary practices during the Crimean War, including handwashing. She wrote extensively about her experiences and protocols in Notes on Nursing. Unfortunately, while she promoted frequent handwashing to prevent disease, she didn’t fully understand the cause of disease and was slow to accept germ theory.
Yet another proponent was British surgeon Dr. Joseph Lister. In the operating room, sepsis was a deadly threat. As doctors became more familiar with human anatomy and how to repair it, more surgeries were performed. However, mortality rates among surgical patients kept climbing. In 1865, Lister implemented carbolic acid washes to sterilize operating rooms, equipment, linens, wounds, and surgeons’ hands between and during operations. These protocols decreased the number of surgery-related infections and the deaths associated with them.
Today, medical facilities worldwide follow protocols to ensure sterile environments to prevent the spread of infection.
Women in the Workplace
Today, more than half of the women in the United States are part of the labor force. While society continues to work toward more equality, there was a time when a woman’s primary place was in the home. Anyone who suggested otherwise received a wake-up call from society – that is, until World War II, when that impression began to change. Men exchanged their tools and ledgers for uniforms and weapons to join the war effort. With a shortage of workers and increased demand for labor, women were called upon to fill the vacancies. They worked on assembly lines, drove trucks, and even delivered the mail. Women also filled military roles – shuttling planes overseas, providing medical care, and much more. By 1946, with the war over and the troops home, the men wanted their jobs back.
Before World War II, approximately 27 percent of women were in the workforce. By the end of World War II, that number had risen by nearly 10 percentage points. After the war, the numbers dipped slightly, then held steady until the 1960s, when they gradually climbed. Today, women in the workplace – just about any workplace – are commonplace and rarely seen as absurd in the United States.
Computer Animated Films
Many animated films today are completely computer-generated, going from computer creation to the big screen from beginning to end. In its early days, though, full-length computer animation seemed an absurd idea. The time and labor involved made large projects cost-prohibitive, and the technology to shorten timelines and cut costs didn’t exist yet. A full-length animated film was a long time coming.
It might come as a surprise, but animators have been dabbling with computer illustrations and graphics since the 1940s. In 1960, the Swedish Royal Institute of Technology created a 49-second animated film. The vector animation displayed a driver’s view of traveling down a planned highway. The 1960s introduced a wave of animation pioneers – Bell Laboratories, Ivan Sutherland, Lee Harrison, IBM, to name a few. These pioneers not only advanced the opportunities for computer animation in film but also advanced the video game industry.
A full-length feature film was still decades away. By the 1990s, computer animation had come a long way. Enter John Lasseter. His career as an animator began at The Walt Disney Company, where his interest in computer animation got him fired. In 1986, he joined forces with Steve Jobs and founded Pixar. Two years later, their short film Tin Toy won the Academy Award for Best Animated Short Film. Disney then approached Pixar about a collaboration – Toy Story. After some fits and starts, the full-length, fully animated film wowed audiences, earned the team three Oscar nods, and created a franchise.
Light as Electromagnetic Radiation

Today we know that electromagnetic (EM) radiation comes in many forms, including radio waves, microwaves, X-rays, and visible light – though only a narrow band of these wavelengths is visible to the human eye. However, there was a time when scientists thought electricity, magnetism, and light were separate phenomena. One English scientist, Michael Faraday, helped bring the three together. His name conjures images of spools of electrical wire, electric motors, and large metal cages that distribute electrical charges. In 1846, Faraday hypothesized that visible light was a form of EM radiation. Most scientists dismissed the idea. Years later, however, Scottish physicist James Clerk Maxwell backed up the hypothesis.
The Bikini

In 1946, French designer Louis Réard had an itsy, bitsy, teeny, tiny idea. Like other designers of the era, he wanted to revolutionize the two-piece bathing suit – and he shocked the fashion world. Wartime fabric shortages spurred the idea of a suit made from mere scraps of rationed material. When the tiny bathing suit hit U.S. beaches two years later, some called the fashion “the Molotov cocktail of the summer” and “bra and diaper,” among other descriptions. Americans shunned the bikini until the 1960s, when film idols fueled the fashion.
Which of today’s absurd ideas will be commonplace in the future? Only time will tell.
There are over 1,500 national days. Don’t miss a single one. Celebrate Every Day® with National Day Calendar®!