An insightful warning about the many unforeseen disasters technology might someday bring

"A Dangerous Master" by Wendell Wallach. Basic Books

The following passage comes from "A Dangerous Master: How to Keep Technology from Slipping Beyond Our Control" by Wendell Wallach.

Social disruptions, public health and economic crises, environmental damage, and personal tragedies, all made possible by the adoption of new technologies, will increase dramatically over the next twenty years.

Some of these events will result in the death of many people. This prediction is not meant to be melodramatic or to generate fear. Nor am I interested in thwarting the progress of the many scientific paths of research that will improve our lives.

I offer this warning in the hope that, through a little foresight and planning, and the willingness to make some hard choices, many of the dangers will be addressed.

Unfortunately, there is little evidence that we or our governments have the will, intelligence, or intention to make those hard choices. Indeed, there are reasons to believe that such crises are inevitable, that the pace of calamities involving new technologies will accelerate, and that the opportunity to give direction to the future of humanity is slipping away.

Thalidomide babies, Chernobyl, the explosion at a Union Carbide chemical factory in Bhopal, India, the Challenger space shuttle, and the BP oil spill evoke images of tragedies in which technology was complicit. To this list add the use of harmful technologies consciously designed for destructive purposes: the gas chambers at Auschwitz, the firebombing of Dresden, Hiroshima and Nagasaki, use of Agent Orange in Vietnam, sarin gas attacks in the Tokyo subways, and the dangers posed by the proliferation of cruise missiles and biological weapons.

Many new risks are posed by technologies under development. A failure of, or a cyber attack upon, critical information systems will cause a major banking crisis or a sustained loss of electricity.

A nanomaterial used in many consumer products will be discovered to cause cancer. Students will suffer brain damage as they mix drugs, each of which is intended to give them a competitive edge in their studies. An enraged teenager will kill her father with a plastic gun produced on a $275 3D printer. In his home laboratory, a psychopath or terrorist will brew a designer pathogen capable of starting a worldwide flu pandemic. Autonomous weapon systems will kill civilians, and may even start new wars. An island nation, threatened by rising tides, will engineer local climate and cause a drought in neighboring regions.

Adderall pills on a magazine. Alex Dodd/Flickr

In our 2009 book, "Moral Machines: Teaching Robots Right From Wrong", my co-author Colin Allen and I made a similar prediction about a catastrophic event caused by a computer system taking actions independent of direct human oversight.

We sketched a fictionalized example of how a disaster caused by computers might unfold. The scenario entailed a series of plausible incidents based upon present-day or near-term computer technology. Collectively, these occurrences triggered a spike in oil prices, a failure in the electrical grid, a Homeland Security alert, and the unnecessary loss of lives.

A real-life incident created by computers occurred at 2:45 PM on May 6, 2010. The Dow Jones Industrial Average took a steep dive and then recovered in a matter of minutes. What has been named the flash crash is the biggest intraday point decline (998.5 points, about 9 percent) in the history of the Dow Jones Industrial Average. High-speed automated trading by computer systems that buy and sell shares was a key contributing factor. At the time of the crash, high-frequency traders tendered at least 60 percent of all transactions.

By some estimates roughly half of all trades today are made automatically by computers that tender buy and sell orders algorithmically once mathematically determined thresholds are crossed. The percentage exaggerates the overall importance of computerized market activity in that high-frequency trades take the form of two transactions—a buy and a sell order occurring within a few minutes or even a fraction of a second of each other.
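
As a rough sketch of how such threshold-based order logic works, consider the following schematic Python. The two-percent thresholds and the function name are invented for illustration only; they do not describe any actual trading system, which would operate on far finer margins and timescales.

    # Hypothetical illustration of threshold-triggered trading logic.
    # The thresholds and names below are invented for clarity, not drawn
    # from any real high-frequency trading platform.

    BUY_THRESHOLD = -0.02   # buy if the price falls 2% below a reference level
    SELL_THRESHOLD = 0.02   # sell if it rises 2% above that level

    def decide_order(reference_price: float, current_price: float) -> str:
        """Return 'buy', 'sell', or 'hold' once a mathematical threshold is crossed."""
        change = (current_price - reference_price) / reference_price
        if change <= BUY_THRESHOLD:
            return "buy"
        if change >= SELL_THRESHOLD:
            return "sell"
        return "hold"

    # Example: a 3% drop crosses the buy threshold and triggers an order.
    print(decide_order(100.0, 97.0))  # -> "buy"

The point of the sketch is simply that once such rules run automatically at machine speed, many systems reacting to one another can amplify a price move before any human intervenes.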

The final numbers of the day's trading are shown on a board on the floor of the New York Stock Exchange in New York in this May 6, 2010 file photo. REUTERS/Lucas Jackson

The flash crash unduly robbed some investors while rewarding others, and undermined confidence in the operations of the stock exchanges. To avert a future flash crash, additional circuit breakers that automatically kick in when errant trades are detected were built into the stock markets.
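
A circuit breaker of this kind can be pictured, very roughly, as a rule that pauses trading when a price moves too far, too fast. The sketch below uses an invented ten-percent-in-five-minutes limit and a hypothetical function name; it is not the exchanges' actual rule, only an illustration of the idea.

    # Hypothetical sketch of a single-stock circuit breaker.
    # The 10%-in-five-minutes limit is an invented example, not an
    # exchange's actual parameters.

    def should_halt_trading(price_five_minutes_ago: float,
                            current_price: float,
                            max_move: float = 0.10) -> bool:
        """Halt trading if the price has moved more than max_move within the window."""
        move = abs(current_price - price_five_minutes_ago) / price_five_minutes_ago
        return move > max_move

    # Example: a 12% plunge within the window trips the breaker.
    print(should_halt_trading(40.00, 35.20))  # -> True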

Nevertheless, on August 1, 2012, a “rogue algorithm” from Knight, a company that specializes in computer-driven trading, tendered buy and sell orders for millions of shares in 148 different companies before the circuit breakers halted trading. The major harm was to the trading company Knight and its clients, who lost $440 million in less than an hour. But this was just one more in a series of events that have reinforced an image in the minds of small investors that the robots are already in control of financial markets, and that the investment game is fixed.

The reliance on computers by financial markets goes well beyond systems responsible for high-frequency trading. They also played a role in the earlier real estate collapse of 2008. Computers enabled the assembly of complex derivatives that bundled bad loans together with good loans. Once the real estate market collapsed, it became impossible for anyone to evaluate the worth of the derivative shares held by banks.

Even heads of large banks could not determine the viability of their own institutions. What is interesting about the role of computers in the derivative crisis is that the machines did not fail. They functioned as critical infrastructure supporting a faulty banking system. Yet their very existence enabled the creation of the complex derivative market that helped cause the crisis, and certainly exacerbated its impact.

Traders work on the floor of the New York Stock Exchange in New York on November 25, 2008. Lucas Jackson/Reuters

The role computers played in the real estate collapse was noted by a few analysts, but the emphasis has been more upon assigning blame to greedy bankers making bad bets and uncovering fraudulent activity by crooks such as Bernie Madoff. The flash crash and the “rogue algorithm” were initially attributed to human error. Computers have escaped blame.

From our present vantage point, it is impossible to know all the potential dangers. Bad agents intent on using a technology to pursue destructive and illegal goals will be responsible for much of the risk. Those risks range from individual tragedies to public health crises; from the societal and ethical impact of technologies that enhance human capabilities to the loss of privacy, property, and liberty; and from the collapse of critical infrastructure to the onset of a dystopian society.

The promoters of a cutting-edge technology submerge its dangers beneath enthusiasm for the potential benefits of the research. They boast that the blind will see and the lame will walk with the help of cameras and mechanical limbs wired into the nervous system.

Wars will be won with the latest in high-tech weaponry. Deciphering an individual’s genome will lead to personalized medical treatments for inherited diseases. Autonomous cars will have fewer accidents and free drivers to text message while traveling to and from work. Each of us will be happier, smarter, and even more moral if we elect to take a morning cocktail of cognitive enhancers.

Of course, we know that investment bankers, venture capitalists, entrepreneurs, and even a few inventors will get rich along the way. Politicians will receive the patronage of the powerful, contracts for industries within their districts, and support from voters who receive jobs. Scientists and engineers will be rewarded with tenured professorships, well-financed laboratories, bright research assistants, and the occasional Nobel Prize or other prestigious award.

Short of one catastrophe that threatens human existence, the greatest challenge would be the convergence of many disasters from different quarters occurring within a short period of time. Technological development is intimately entangled with health care, environmental, and economic challenges.

Usually technology provides solutions to these problems, but it can also create public health crises, damage to the environment, or, as discussed, economic disruption. If a confluence of disasters should occur, technology may not be implicated in them all.

While many other scholars and reporters ably cover the consequences of failing to address challenges in these other spheres, this book addresses the technology side of the equation.


Social systems can manage only so much stress. Reacting to multiple concurrent disasters taxes and quickly overwhelms even robust institutions. The government may or may not bear direct responsibility for a crisis, but when it fails to effectively respond, it loses the confidence of citizens.

In democratic countries, small failures in governance often lead to a turnover in the ruling party. But large failures undermine the public’s faith in its governing system. One very large crisis or multiple crises could potentially bring on the collapse of major social institutions and a government.

Finding a resolution to each challenge as it arises provides the best method for staving off a future situation in which multiple crises arise simultaneously. This usually entails putting in place precautionary measures that can be costly.

However, short of a self-evident instance in which a disaster is prevented, there is no good way to determine the efficacy of precautionary measures precisely because the disasters averted never actually occur. Furthermore, precautionary measures in the form of regulations and governmental oversight can slow the development of research whose overall societal impact will be beneficial.

Thus, legislators are reluctant to ask businesses and citizens to make sacrifices. In recent years, politicians have gone one step further by pretending that problems from global climate change to high-frequency trading either do not exist or cannot be tamed.

Without precautionary measures we are left with the often unsatisfactory downstream attempt to solve a problem after a tragedy has occurred. Disasters do focus attention. Bovine Spongiform Encephalopathy, commonly known as mad cow disease, was a wake-up call for the European Union.

The meltdown of reactors at the Fukushima Nuclear Power Plant after the giant tsunami on March 11, 2011, alerted the Japanese people to failures that pervade their management of a potentially dangerous technology.

The aftermath of the Fukushima disaster. Reuters

Unfortunately, responses to a disaster tend to be more reactionary than well thought out. Four million four hundred thousand cattle were slaughtered in the UK in the attempt to stem variant Creutzfeldt–Jakob disease (vCJD), the human form of mad cow disease. Japan, which is heavily dependent on nuclear power generation, was forced to shut down all reactors by May 2012. After extensive testing, a few are slated to restart in 2015.

Governing in reaction to disasters is costly.

Whether the costs incurred from waiting until a tragedy happens are greater than the losses incurred by zealous upstream regulation is a matter on which policy planners disagree. Time, however, will answer that question.

If a convergence of multiple crises takes place, many of which result from unaddressed, foreseen problems, the answer could be the rapid onset of a dystopian future. 

Republished from "A Dangerous Master: How to Keep Technology from Slipping Beyond Our Control" by Wendell Wallach with permission of Basic Books. Copyright June 2, 2015.
