Exploring the Goldilocks Zone of Invention and Creativity
Transformative ideas often need to emerge within a specific "Goldilocks Zone," a window where the timing is just right.
Recently, I shared an article on Medium that gained considerable traction, titled "Simultaneous Invention: Why You Shouldn’t Wait to Create." This piece delves into the phenomenon of simultaneous invention—when multiple individuals independently conceive the same idea around the same time without influence from one another.
The annals of history are filled with instances of simultaneous invention. For example, Louis-Jacques-Mandé Daguerre and William Henry Fox Talbot developed photography concurrently. Similarly, Charles Darwin and Alfred Russel Wallace formulated the Theory of Evolution independently and at the same time. Joseph Priestley discovered oxygen in 1774, while Carl Wilhelm Scheele had done so a year earlier, in 1773. These examples illustrate how the same idea can sprout in several minds at once, often with neither party aware of the other.
In my article, I referenced the research of Dean Simonton, a social psychologist from the University of California, Davis, who examined numerous cases of simultaneous invention. He posited three key factors that may explain this occurrence: genius, chance, and zeitgeist.
The feedback I received was incredibly engaging. Many readers recounted their own experiences with simultaneous invention, while others shared historical examples I had not encountered before. Some expressed gratitude for the encouragement to overcome procrastination and share their creative endeavors. A few even suggested a fourth factor, which might shed light on the peculiarity of simultaneous invention.
Gordon Hart, a writer and rocket scientist, commented on the necessity of technological infrastructure for an invention to materialize. This perspective resonated with me; technological advancements often lay the groundwork for what we can create. As Stuart Kauffman, a physician and theoretical biologist, described, technology opens up new avenues to what he refers to as “the adjacent possible.” Many visionary concepts have simply awaited the right technological advancements to be realized.
Ada Lovelace, born on December 10, 1815, in what is now London, was the daughter of the poet Lord Byron and Annabella Milbanke Byron. Following her parents' separation shortly after her birth, Byron left England and never saw his daughter again. Ada was educated by private tutors and quickly outgrew their teachings, becoming largely self-taught until she studied advanced mathematics under Augustus De Morgan, the first professor of mathematics at the University of London.
Lovelace was a brilliant mathematician and insightful writer. She famously remarked, "The more I study, the more insatiable do I feel my genius for it to be." In 1833, she met Charles Babbage, a fellow mathematician who would soon begin designing what he termed the "Analytical Engine."
The Analytical Engine is widely regarded as the first design for a general-purpose computer, though it was never fully built. On paper, it looks quite different from today's computers: a complex assembly of wood, metal rods, levers, pulleys, and slides. Babbage envisioned a machine capable of executing any calculation humans requested.
As Lovelace herself would later write of the machine, "The Analytical Engine has no pretensions whatever to originate anything. It can do whatever we know how to order it to perform."
The revolutionary nature of this invention cannot be overstated. Even the most inventive science fiction writers of the time could not have conceived of such a device, and no inventor had attempted to construct one.
Babbage believed that once the Analytical Engine was realized, it would inevitably guide the future course of science. Whenever a result was sought by its aid, he predicted, the question would become by what course of calculation the machine could arrive at it in the shortest time.
While Babbage recognized the immense potential of his invention, it was Ada Lovelace who foresaw its profound implications. As Babbage's collaborator, she grasped how this machine could transform human knowledge and society itself. Lovelace saw the Analytical Engine's capabilities extending far beyond mere arithmetic.
"The Analytical Engine might act upon other things besides number," she posited, suggesting it could potentially compose intricate musical pieces if the relationships of sound could be articulated mathematically.
In one eloquent passage, Lovelace anticipated the Analytical Engine's applications across various scientific fields, including areas like bioacoustics, which examines the sounds produced by organisms within ecosystems. She envisioned a future in which mathematics would illuminate the complexities of the world.
"Imagination is the Discovering Faculty, pre-eminently," Lovelace asserted. She believed that any quantifiable phenomenon could be analyzed by the Analytical Engine or its successors. Unfortunately, it would take over a century for technology to align with her visionary ideas.
Lovelace's groundbreaking concepts might have been too innovative for her time. She often faced skepticism from her contemporaries, who were left to wonder whether she was a genius or simply eccentric. "That brain of mine is something more than merely mortal; as time will show," she claimed, and her foresight proved correct. The programming language Ada was named in her honor, yet she did not live to witness the realization of many of her predictions. Today, her contributions to science and technology remain largely unknown to the public.
History showcases numerous instances of ideas that were ahead of their time. Leonardo da Vinci sketched plans for a helicopter in the 15th century, yet no helicopter would take flight until the 20th century. Charles Fritts built the first solar panel in 1883, but it wasn't until the 1950s that solar panels became commercially available, and it took another 50 years for them to gain widespread use. Andreas Flocken designed the first electric car in 1888, yet electric vehicles only gained mainstream popularity with the advent of Tesla's Model S in 2012. Such inventions were stymied by a lack of the necessary technological infrastructure, as Gordon Hart pointed out.
For an idea to effect meaningful change, it must emerge at a moment when the technology is ripe for implementation and before anyone else has conceived it. If it arrives too early, it seems impractical; too late, and it is merely commonplace. This principle applies to both scientific and creative endeavors.
Consider the book Sex at Dawn, authored by Christopher Ryan, PhD, and Dr. Cacilda Jetha. It boldly argues that humans did not evolve to be monogamous, challenging the long-held belief that lifelong pair-bonding is innate. The book stirred controversy and ignited debate, yet it represented a pivotal moment in societal discourse regarding love and relationships.
Published in 2010, Sex at Dawn would likely have been dismissed a decade earlier, while a decade later, it might have simply added to an already ongoing conversation. The ideas that reshape our world are born from creativity and ingenuity but also hinge on the serendipitous timing of their emergence.
Reflecting on the notion of "changing the world" through creativity raises the question: is that truly the ultimate goal?
A reader's comment on my piece about simultaneous invention resonated with me:
> How can we best account for inventing and creating plenty of brilliant and novel things in their entirety, yet never having anyone find our work? I create immediately, but then the creation, invention, or work lands in the void somehow. It feels very discouraging.
At the core, many creative individuals wish for their work to resonate with others. Aspiring to have your art or writing “change the world” may be overly ambitious, yet every creator desires their efforts to make an impact, however small. The disappointment of sharing a creative project only to have it fade into obscurity is a familiar struggle, leaving many to ponder its purpose.
The influence of a particular idea on the world is shaped by both serendipity and the dedication required to bring it to fruition. The world's response—whether enthusiastic, indifferent, or dismissive—often lies beyond our control. What we can control is the time and energy we invest in our creative pursuits.
For me, the essence of creativity is not merely to change the world; it is to foster personal growth. The creative process is a journey toward self-fulfillment, allowing us to evolve into the individuals we aspire to be. If we measure success by external validation—likes, accolades, or financial success—we set ourselves up for disappointment, waiting for external approval that may never arrive.
The stark reality is that only a small fraction of artists earn a living solely from their creative endeavors. Jackson Pollock worked as a custodian at the Guggenheim Museum until his art gained recognition. Harper Lee held a job as an airline reservation clerk while crafting To Kill a Mockingbird. Even the acclaimed writer Terry Pratchett juggled a job as a press officer while publishing five novels.
Thus, it is wise to redefine success in artistic terms, focusing on internal criteria rather than external validation. Personally, I find success in those quiet moments of satisfaction when I reflect on my writing, feeling that I have crafted something true to myself—beautiful and original. Any recognition that follows is merely a bonus.
I strive to concentrate on what lies within my control: the act of creating.
As Kurt Vonnegut aptly stated, "Nobody will stop you from creating. Do it tonight. Do it tomorrow. That is the way to make your soul grow—whether there is a market for it or not! The kick of creation is the act of creating, not anything that happens afterward."