The first nine months of 2020 have felt like a lifetime and like no time at all. In one quick moment, everything important became mediated through screens: the joy of weddings, the sorrow of funerals, conversations with our closest friends. Meanwhile, mom-and-pop stores shuttered, and big tech companies announced record profits. Technologies are lifelines connecting us to one another; technologies are increasingly pulling us apart.

Through the pandemic and the protests for racial justice, 2020 has made it ever more clear how critical it is to consider the public purpose and implications of the technologies we use. When developed and deployed in thoughtful ways, technology can be transformational and life-saving; it can affirm core values such as security, safety, privacy, transparency, accountability, and inclusion. But when these values are not at the center of the development process, technologies can cause more problems than they solve and fray the social fabric.

Ash Carter, a former US secretary of defense, is the director of the Belfer Center for Science and International Affairs at Harvard Kennedy School, where he leads the Technology and Public Purpose project. He’s also an Innovation Fellow at MIT. Nicholas Thompson (@nxthompson) is WIRED’s editor in chief.

The response to Covid-19 is emblematic of both the promises and the perils of emerging technologies. Genetic sequencing and synthesis tools allowed researchers to create diagnostics in a matter of days; clinical trials of a pathbreaking vaccine began in just over two months. Geolocation data was quickly repurposed for digital contact tracing applications to help prevent outbreaks of Covid-19, alerting close contacts to a positive test result and allowing for their quick isolation.

But public health requires the public’s consent: The technologies that will help us get past Covid-19 are only as valuable as the number of people who use them. Diagnostics, digital contact tracing apps, and vaccines require that the public trust them to be safe, effective, and protective of privacy. If people worry that their privacy or safety will be put at risk, they will refuse to use an app or get a vaccine. Public purpose is at the heart of public health.

In the midst of the pandemic, the killing of George Floyd at the hands of a law enforcement officer was recorded on a smartphone and went viral on social media. Floyd’s death, along with the recent killings of others, like Breonna Taylor and Ahmaud Arbery, led to protests against police violence and structural racism directed at Black people in America. Millions took to the streets to confront racism. Social media was a powerful tool in mobilizing this movement and documenting abuses of power.

However, the same social media platforms used to spread information about protests also served as a staging ground for extremist groups, and the platforms failed in their duty to quickly remove speech that incited violence. During the protests, law enforcement used surveillance tools with facial recognition technology to identify and track protesters, even though those tools have been shown to discriminate against non-white people. The use of a digital dragnet threatens a sacred right to protest and a reasonable expectation of privacy.

This is the dichotomous nature of technological progress. How that progress is shaped is ultimately up to us. It was with this understanding, last fall, that we launched the Tech Spotlight to recognize the technologists, activists, and policymakers who are thoughtfully creating and using technology in ways that protect the public good and help shape a better future. While we could not have predicted the events that have taken place since, they have only solidified our view that we must actively develop technology in accordance with the highest moral values.

We received over 200 submissions from 18 countries. We were inspired not only by the quantity of submissions we received across sectors but also by their quality. While many of the submissions are worthy of recognition, we chose three as our inaugural Tech Spotlight Finalists:

  • Thorn’s Spotlight is a machine learning tool that helps law enforcement in the United States and Canada identify and locate sex traffickers and the children they harm. In the past four years, Thorn reports, it has helped identify nearly 15,000 victims of child sex trafficking.

  • Google AI Model Cards are short documents that accompany trained machine-learning models, helping stakeholders to understand the model’s capabilities and limitations. As artificial intelligence systems become more complex, model cards will help users make careful use of the outputs they produce. Because machine learning tools will be increasingly used in health, justice, and myriad other sectors, Google’s goal is to make Model Cards an industry standard.

  • The International Society for Stem Cell Research released “Stem Cell Based Clinical Trials: Practical Advice,” a guide that offers physicians and institutional review boards a set of practical questions for evaluating early-stage, first-in-human stem cell-based clinical approaches. The guide protects patients and helps practicing physicians conduct safe and effective clinical trials.
