From the President: As Technology Evolves, the Law Must Keep Up
From Washington Lawyer, February 2011
By Ronald S. Flagg

New technologies and novel legal issues have been linked throughout our history. As cotton gins, railroads, automobiles, airplanes, computers, and the Internet have transformed society, they have also created myriad new legal issues. These historical ties between new technology and legal developments raise the question: What emerging technologies are likely to spawn the legal issues upon which governments and lawyers will focus over the next decade? In a presentation at the D.C. Circuit Judicial Conference, Arizona State University professor Gary Marchant identified multiple areas of technological development that could radically change our lives and the law.
Nanotechnology involves the study, manipulation, and design of materials on a molecular scale—a tiny fraction of the width of a human hair. Sales of products incorporating nanotechnology are projected to grow from about $300 billion in 2005 to $2.9 trillion in 2014. Nanotechnology is being used to develop “smart dusts”—clouds of tiny wireless sensors, so small they can be injected into the human body to detect early signs of disease, or scattered into the environment to detect pollutants, pathogens, bioweapons, illicit drugs, or people. As Professor Marchant commented: “This technology has the potential to provide both a medical and national security bonanza, and the ultimate Big Brother scenario of all-pervasive, invisible spybots.” In the United States, more than a half-dozen government agencies already are heavily involved in nanotechnology initiatives, giving rise to legal issues in areas such as intellectual property, privacy, and international regulatory harmonization.
Genetic profiling will become far more prevalent: within the next couple of years, it will be possible to sequence an entire human genome for less than $1,000. Because of the potential medical benefits of such information, people are likely to store their entire genetic sequence on a computer chip. Although the primary use of this information will be to personalize medical care based on our own unique genetic profile, it also can be used to predict a person’s future disease risks, behavioral tendencies, performance abilities in areas from music to athletics, and other traits that we may prefer not to know or that others could misuse.
Genetic profiling will raise scores of difficult legal and ethical questions, as two existing examples illustrate. A criminal defendant in Arizona, whose biological family includes four generations of violent criminals and who started committing crimes at age five, raised as a mitigating factor in his capital defense that his genome included the “murder gene.” Thousands of human babies already have been born from embryos selected for implantation in their mothers based on genetic testing using a technique called pre-implantation genetic diagnosis. The set of available genetic traits for which embryos can be tested is rapidly expanding beyond disease traits to include those related to appearance, abilities, and inclinations. If these “designer babies” selected for their optimal genetic predispositions indeed tend to be more successful in their lives, will having babies the old-fashioned way become increasingly obsolete (at least for those who can afford the new reproductive technologies)?
Surveillance technologies are becoming more sophisticated and ubiquitous. More and more products, including cell phones, cars, clothing, store merchandise, identification and credit cards, pets, and even some humans, are being implanted with computer chips that track their locations. The almost unlimited applications of this new gold mine of location data are just starting to be explored, but already law enforcement officials, divorce attorneys, parents, employers, advertisers, insurers, criminals, and cyber-stalkers are finding innovative ways to use (or in some cases, misuse) this information. Thorny privacy issues related to surveillance technologies already are being hashed out: How long will service providers retain location information? Can they sell that information to other companies? Under what conditions will it be disclosed to police or to parties in civil litigation such as divorce cases? And can federal agencies regulate the use of locational and tracking technologies, and if so, how?
New functional and structural brain-scanning technologies can reveal a predisposition to violence, just as genetic testing can show a predisposition to certain diseases or conditions. Again, these developments raise significant policy, legal, and ethical questions. How should society react to such foreknowledge? How might schools, employers, the military, and the criminal law system use this information? Europe already has seen the first case of an individual whose movements have been restricted “for the good of society” before he committed any crime. As techniques for interpreting neurological and physiological predispositions improve, there is likely to be growing pressure to remove potentially “dangerous” people from society before they harm others.
As Professor Marchant put it: “While techno-optimists believe that these developing technologies will lift humanity into a new Golden Age, pessimists fear we may lose what it means to be human, even destroying ourselves (perhaps with an artificially created pathogen).” Although technology creates its own momentum, it is not independent of human control. Would increased government oversight or regulation of these technologies prevent the potential adverse health, safety, environmental, and privacy effects associated with their development? Can we undertake this sort of regulation without chilling the development of valuable technology or putting us at a competitive disadvantage vis-à-vis other countries?
I do not profess to know the answers to these questions, but I am confident that many of them will become the legal issues with which our profession will have to grapple in the coming years.
Reach Ronald S. Flagg at firstname.lastname@example.org.