Two Gen-Tech entrepreneurs probed the ethical frontiers of privacy and consent for personal genomic data and genetic engineering at the Long Now Boston conversation this month: DNA – Our Past is Our Future. How can you safely and securely protect your own genetic data? Or, do you value the benefits of genetic research enough to give open access to your genetic data — knowing that individuals, corporations or governments could use it for malicious purposes? This conversation adds new information to factor into our previous considerations of technology and ethics.
Technology and Moral Norms
The moral norms of privacy and consent are baked deeply into the culture of individual rights and liberties that underpins most modern liberal democracies. Advances in science and technology have had profound effects on such moral norms, as Juan Enriquez explored at the Long Now Boston event in May: It’s Time for Hard Conversation in Ethics. As he pointed out, technological change can force the reassessment of moral norms, and this process is sometimes messy, giving rise to so-called culture wars. His question was: What ethical norms do we embrace today that will no longer work in a rapidly transforming society with rapidly changing technology? To put it another way, how will our grandchildren view our current behaviors and attitudes? Juan argues for honest, respectful conversation across the divides, with a strong dose of humility.
The Morality of Personal and Public Interest
I’ve also written previously on technology and privacy. In When Public Interest Clashes with Personal Privacy, I highlighted the Bentonville, Arkansas, murder case. The local police demanded that Amazon hand over any information it had from the home’s Alexa device, a request Amazon vociferously resisted. Ironically, Amazon finally complied after the defendant said it was OK. The murder case against him was recently thrown out, and there are reports that the Alexa data may have helped exonerate the accused.
In that post I also highlighted the more tragic San Bernardino terrorist attack, where the FBI asked Apple for help in cracking the perpetrator’s iPhone. Apple vehemently resisted. This case turned out differently: the FBI was able to hire outside expertise to open the phone, reportedly at a cost of nearly $1 million. More recent reports indicate that the FBI and other law enforcement agencies now routinely employ independent services for the same task at $50 per hack! I concluded by exploring the nuance between privacy and secrecy, and a powerful human personality trait: we all tend to behave more ethically when we think we are being watched.
The tension between public and private interest, as I wrote in When Tech Claims Clash With Government Requests, boils down in a practical sense to a single question: who do you trust more? In both examples, I argued that the tech industry was not simply protecting personal privacy; it was very much engaged in a battle to protect its corporate interests. Where you live makes a huge difference in how you answer this question. In the USA, we have significant legal and constitutional protections (notably the Fourth Amendment) against bad behavior by government. On the other hand, we seem willing to give away our most private information to corporate interests whose business premise is to make money selling the value of that data (in the form of advertising). In the context of possible misuse of private information, I trust our government institutions (and protections) more than private sector institutions. By the way, that was written more than a year ago, before the Facebook / Cambridge Analytica scandal broke. The tech industry, globally, is now in hot water with both consumers and government regulators.
DNA Science – the New Privacy Divide
Preston Estep and Dennis Grishin, the co-presenters at the Long Now Boston conversation DNA – Our Past is Our Future, split open the privacy issue along two fault lines. Clearly personal genomic data, including whole genome sequencing, which is now available for less than $1,000, is incredibly valuable. This value includes the personal value to the individual: the genome is ultimately responsible for many of a person’s traits and identifying features, including health, longevity, and physical and mental capacities. In some cases the genome sequence reveals a predisposition to a fatal disease and can serve as the basis for a cure. But the value also includes public value. Good research on genetic diseases and disorders requires access to large genomic databases, as does research on enhancing genetic attributes. Yet genomic data also offers considerable value to those who wish to misuse it. Insurance companies might discriminate, criminals might extort, medical and health providers might target manipulative ads, and governments or others might use it for intrusive personal tracing.
Dennis offered one solution. His company, Nebula Genomics, has developed a program to encrypt an individual’s DNA with multiparty keys held in different jurisdictions (thus foiling government access), while still serving as a secure repository for legitimate researchers wishing to access the data. In this system, an individual could maintain privacy while also enabling important research. That is, until some third party takes a tiny tissue sample (saliva, for example) and spends the $1,000 to get it sequenced, in which case your privacy efforts are all for naught.
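To make the multiparty-key idea concrete, here is a minimal sketch in Python. It is not Nebula Genomics’ actual implementation, just an illustration of the underlying principle: a symmetric key protecting a genome file is split into shares held by custodians in different jurisdictions, so that no single holder (including a government compelling one custodian) can decrypt the data on its own.

```python
import os
from functools import reduce

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def split_key(key: bytes, n_parties: int) -> list[bytes]:
    """Split a symmetric key into n_parties shares, ALL of which are required
    to reconstruct it. Any smaller subset reveals nothing about the key."""
    shares = [os.urandom(len(key)) for _ in range(n_parties - 1)]
    final_share = reduce(xor_bytes, shares, key)  # key XOR all random shares
    return shares + [final_share]

def reconstruct_key(shares: list[bytes]) -> bytes:
    """XOR all shares together to recover the original key."""
    return reduce(xor_bytes, shares)

if __name__ == "__main__":
    key = os.urandom(32)                   # e.g. a 256-bit key protecting a genome file (illustrative)
    shares = split_key(key, n_parties=3)   # one share per jurisdiction / custodian
    assert reconstruct_key(shares) == key  # decryption requires full cooperation of all custodians
```

A real system would use threshold schemes (so a lost share is not fatal) and cryptographic access controls for researchers, but the basic trade this sketch shows is the same: privacy is preserved only so long as the key shares, not the raw DNA, are what an adversary can reach.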
Preston, in the Q&A, offered another. He explained how he personally, after extensive conversation with his wife and relatives, agreed to voluntarily donate his genomic sequence data for research in a program at Harvard. The National Institutes of Health now offers a similar program, “All of Us.” As Preston said, nobody can steal your data if you give it away first.
Both Dennis and Preston put a high priority on the concept of consent. Nobody should be able to use your DNA for any purpose, including research, without your consent. However, the consent principle becomes more difficult when you move from research to therapeutic genomic intervention. Changing the genome is irreversible. Moreover, if the change is made to the germ line (an individual’s egg or sperm cells), or in an embryo, then the idea of consent is meaningless. It will be at least 18 years before the two genetically engineered baby girls in China will be able to give informed consent for the embryonic genetic engineering procedure completed in 2018. Extrapolate to the scenario Preston postulated, of genetically engineering a Humans 2.0 capable of surviving space travel, and it is hard to see how our ideas of either consent or the privacy of one’s genomic data could survive.
Conclusion
There is no simple solution to these tensions. As I said in When Tech Claims Clash With Government Requests, we need continued vigilance to maintain the protections we have against government over-reach. At the same time, we need to increase our vigilance against the manipulation and misuse of private information by private enterprise. As for personal genomic data, my personal view is that the public interest benefits should weigh more heavily than the private ones. This is not an unfamiliar or radical position to take. It is the balance customarily applied to the question of vaccination – the good of the whole outweighs the small risks to any individual.
The question of genetically engineering the human race as Humans 2.0 is a different matter. In this case we would be designing consequential outcomes for future individuals without their consent. One might argue that it would be for their benefit. But if one designs an individual for space, it would also demonstrate the feasibility of designing for war – or for slavery.
In addition to the ethical concerns, however, I would add a more practical concern. Our knowledge of the human body (indeed, of any living entity) is still extremely limited. The body, and its genetic trajectory, both in the individual sense and the evolutionary sense, is more complex than we can possibly understand at present. There are feedback loops and multiple communication and transmission pathways we have not mapped. Introducing a single gene replacement for a specific purpose may seem innocuous, but we do not know enough to be sure, as some of the early experiments in gene therapy for cancer have unfortunately shown. Introducing multiple genes for multiple purposes, without knowing how it all links together, would be foolhardy. Unfortunately, those potentially horrific consequences would be experienced by the subjects and not by the researchers responsible for them.
As a bottom line, I call for Rational Skepticism. We should accelerate the research but go slow on the engineering. To rephrase my conclusion on the GMO food debate:
“I have no argument with those who want to research Gen-tech or to apply it to address specific, isolated, documented problems in a transparent and controlled program. I have few concerns with synthetic biology in the laboratory. But please, do not make spurious claims about Gen-tech safety on the basis of current science and an apparent lack of evidence about harm. We just do not know enough. We may be looking at things wrong — or looking at the wrong things.”