Full disclosure: I’m well aware that this post has much more to do with engineering than with Christianity. Tough. It’s my blog, and this is what’s on my mind. Enjoy the end, where I will awkwardly try to twist it into a spiritual discussion to validate posting it here.
In ancient Greek mythology, there was a legendary inventor named Daedalus. Among his inventions was the Labyrinth, which played a central role in the myth of Theseus. Later, in Ovid’s Metamorphoses, Daedalus is shown imprisoned in a tower with his son Icarus, to keep his knowledge of the Labyrinth from escaping. Daedalus built them each a set of wings and taught his son to fly. He warned that if Icarus flew too high, the wax would melt and the wings would fall apart; if he flew too low, the feathers would become soaked in seawater and fail. As they were escaping, Icarus forgot the warning and flew too high, melting the wax. He fell into the sea and drowned.
I could not help but think of this myth while in Denmark last week on a business trip. As I may have mentioned, my new role at work is to oversee the launch of new products on the manufacturing side. As such, I work closely with R&D; I also interacted with R&D quite a bit at my past two companies (in fact, I interned in R&D at an automotive company while in college).
One consistent theme that I have seen in engineering—often leading to engineering disasters—is related to the interaction between engineers at the R&D level and at the factory level. When it comes to manufacturing new product designs, I have found that disasters routinely occur because of two tendencies which I shall call the Daedalus Effect and the Icarus Effect.
The Daedalus Effect is the tendency of the R&D engineer to knowingly release an inferior product, relying on communication and watchfulness to cover the flaws in the design. When I read the Daedalus myth, I do not take away the cautionary tale that most people do; instead, I see Daedalus as a failed designer. He was creating wings and knew that his design had two fundamental and major quality flaws: one, that the wax was subject to melting at high altitude, and two, that the feathers were useless once soaked in seawater.
So how did Daedalus respond to his design’s known failure modes? Rather than try to fix them (a stronger wax mixture, perhaps, or a coating to protect the feathers from seawater), he simply declared the design a success. Instead of fixing the problem, he communicated the concern to his son and hoped that his son would both remember it and understand its importance; and, having done so, out the window they jump. He bet his son’s life on whether Icarus, in the rush of the moment, could remember the cautionary tales of his father.
This is a very common failure of R&D engineers. When designing a part, development engineers are typically under intense pressure on “time to market”—the amount of time it takes to get from the design phase to rollout into production. When a problem is discovered at the testing phase (as Daedalus’s was), there is immense pressure to simply put a ‘containment action’ in place: a short-term fix that attempts to avoid the symptoms of the problem but does nothing to address its root cause.
Because of this, a product gets rolled out into manufacturing with known flaws, and it is hoped that an alarm system, a series of quality checks, or diligent manufacturers will be able to handle the issue, just as Daedalus went ahead and used the faulty wings in the hope that his warning to Icarus was sufficient. Then the R&D engineer rests easy: I have provided documented evidence of the risk, he says, so if they fail to heed it, that is their fault rather than mine.
Unfortunately, those diligent manufacturers have their own problem: the Icarus Effect.
Icarus, you see, heard and understood the warning from Daedalus. But once they were in the air, he found himself distracted: the wind rushing through his hair, the thrill of success and freedom. And so he flew just a little higher than he should have, with no ill effects. Then a bit higher again. Then a bit higher still. Until, eventually, the wax melted and he died.
This is very common in manufacturing. The manufacturing group hears the warnings of R&D, but as they grow more and more familiar with the product, and produce parts at (or past) the approved safety limits with no ill effects, they gain a false sense of security and continue to push the product beyond its natural limits. By the end, production is running far beyond the allowable limits, and disaster occurs.
If you study engineering disasters, you see this pattern rather frequently. It is what Michael Crichton once called an “event cascade”: structural failure happens not because of one person’s mistake, but because of a series of mistakes beginning in design and continuing through usage. Take the BP oil disaster in the Gulf a couple of years ago. People are still arguing over who is “most” responsible, because BP, Halliburton, and dozens of others each took liberties with seemingly insignificant policies which, taken together, created an event cascade and destroyed the rig. The same is true of virtually every engineering disaster you can find: a combination of Daedalus and Icarus Effects, creating a string of very slight deviations from allowable conditions which accumulate into a massive failure and a risk to public safety.
In the end, the Daedalus and Icarus Effects both come down to a matter of pride and selfishness. When placed under pressure, our depraved natures have a tendency to come out. (I often tell my engineers that “Crisis reveals character”—it is only when we are under pressure that we find out whether we really have good engineering ethics or not.)
So, when the R&D engineer should be taking the entire line back to the drawing board, he fails to do so, because admitting failure on a potentially very expensive, months- or years-long design process could be a career-killer. So he releases a bad part into production out of either pride (refusing to admit he made a mistake) or selfishness (caring more about his career than about the company’s future). Then the part reaches manufacturing, and the manufacturing engineer, pressured to improve output, bends the rules slightly, then slightly more, then slightly more, either out of pride (believing he knows better than the designer) or selfishness (caring more about his career than about following ethical rules).
In the end, you see that a person’s spiritual condition, and his ethical approach to the job, has as much to do with an engineer’s or businessperson’s success as training and technical aptitude, if not more. Only through humility and a focus on Someone greater than yourself are you willing to do the thing that might end your career, but might also save thousands of lives or your company’s future. For only by denying our naturally depraved, selfish characters can we actually avoid the Daedalus and Icarus Effects.