Privacy is very difficult to come by in today’s connected world, and this issue has only been exacerbated by the pandemic, which legitimized the collection of masses of personal data.
Governments have already begun to put Covid-related data to use in other contexts. And beyond medical data, the debate around the use of facial recognition and biometrics by state agencies rages on.
To find out more, TechRadar Pro spoke to Cindy Cohn, Executive Director at the Electronic Frontier Foundation, an organization taking a stand against censorship and surveillance.
What are some of the biggest issues that the EFF is currently engaged in?
We are doing work around creating competition in tech markets, especially through tools like interoperability. We also continue to worry about the telecommunications giants and their impact on society. We believe the lack of competition plays a key role in the fact that so many people have no good choices for broadband right now.
We’re continuing to work to support freedom of expression around the world, both in fighting bad legislative proposals like “reforming” Section 230 and helping creators on YouTube and elsewhere who have been censored by overbroad content filters.
And we are standing up for your security and privacy against so many attacks from so many directions that it’s hard to name just one or two, but our work tracking surveillance technologies acquired by law enforcement through our Atlas of Surveillance project is particularly exciting right now.
You’ve been tackling issues of censorship and surveillance for a long time. How have these issues evolved over the years, and what do they mean in this age of the remote workplace?
The biggest change is that everyone now gets what we have long been talking about. We used to have to convince people that the internet would mean that we all have the chance to make our voices heard, or that people’s jobs and culture would both be tremendously dependent on digital networks. Not anymore. Another change, and one that is welcome, is that people are more skeptical of technologies. They look for the privacy problems, they worry about where their data goes and who is using it. Now our next step is to empower them — governments and companies have convinced many people that the situation is hopeless. Our job is to convince them that a better future is possible and that we can get there.
We have long worried that the law of surveillance is based upon the presumption that we keep our most private information secret. That is, we keep our key documents in our homes, and our key relationships are not tracked or often even known by those outside of our circle of family and friends (and maybe not even there either). Today we all know that our most sensitive information is held not by one, but by many third parties, and Facebook, Amazon, Google, and others track and either infer (via machine learning or AI) or know all of our associations. This has implications for our rights to privacy against both governments and private companies.
On censorship, we started out worrying about governments, since they were really the key threats to online speech. But now we all see that big tech giants are the leading deciders about whether you get to speak online. It’s not a constitutional problem (in the lawyer sense) but as a practical matter, we need to get to a place where we don’t have just a few companies controlling what is said online.
But also one where we have better tools, whether digital or otherwise, to protect ourselves against hateful, harassing, and other harmful activities online. I think it’s a mistake to think that the tech giants are going to start addressing these issues in a way that satisfies everyone (or maybe even anyone) so we should focus instead on ways to promote competition and interoperability in their systems so that you have better choices.
Benjamin Franklin said “every problem is an opportunity in disguise”, and that seems true of governments and private entities alike in how they’ve used Covid-19 to exercise more control. Wouldn’t you agree?
Yes, exactly. We have been extremely worried about the opportunity for more pervasive surveillance, as well as the use of that surveillance to limit or gatekeep our access to goods, services, and benefits. We know that measures taken in response to crises often hang around long after the crisis passes, and then get put to further uses. We’ve already seen the government in Singapore start to use Covid tracking data for other purposes, and I wouldn’t be surprised if that was happening in some places in the U.S. as well.
We also worry that these measures are regressive – that is, people with resources can easily get things like Covid passports that are up to date and correct, while people without cannot. And since those are often the communities more at risk from the virus, the impact ends up being exactly backwards: the people who need the protection the most get it the least.
Going forward, what are some of the aspects of digital liberty and freedom that EFF will be working on in the near future?
We have issues that are evergreen: ensuring that you have rights and a voice when you go online, standing up for real security including strong, end-user controlled encryption, working to ensure that the Fourth Amendment protects us in the digital age, and making sure that everyone has real broadband access.
In 2021, we have identified three “challenge areas” where we expect to put special attention:
Law enforcement surveillance
This category includes face recognition and other biometric data collection, as well as warrantless data collection by federal agencies, such as invasive device and social media searches by Customs and Border Protection.
Disciplinary technologies
Disciplinary technologies are sold to corporations, schools, and individuals for the ostensible purpose of monitoring performance, confirming compliance with policy, or ensuring safety. In reality, they are non-consensual violations of a person’s autonomy and privacy that are at best vaguely connected to the stated goal of the system. Prime examples are student surveillance (especially proctoring) software and employee-monitoring bossware. Closely related, and often overlapping, are stalkerware, kidware, and other consumer spyware used to monitor and control intimate partners and household members.
Strengthening the policies and structures that support online speech and foster user expression
This includes promoting and defending intermediary models that protect users’ rights and interests, advocating for a human-rights framing for speech curation by online services, and promoting competitive compatibility that gives users control.
EFF has been a major advocate of decentralized technology and protocols. Can you suggest some ways regular folks can protect themselves from supervision and control?
There are a growing set of decentralized technologies that are available and worth checking out. They provide a glimpse into a better strategy and the more people use them, and contribute to them, the better they will get. For almost every service that a tech giant provides, there is a community building a decentralized version, and I look forward to the day when the next killer tool or service that will make your life better will come from that community. That day is coming, I think.
But I do think we need to clear more space for that world, and some of it will take legal and policy changes, not just individual choices by users and developers. We need to stop the blocking of interoperability by laws like the CFAA, DMCA and those click-wraps you don’t read that prevent reverse engineering and other steps we need to make new tools that let you interact with old ones. And we may need some affirmative requirements that companies allow and even foster interoperability.
In the meantime, there are some things that regular folks can do. EFF doesn’t endorse any specific tools or services, but folks who are interested should check out privacy-protecting tools like Signal and Tor, tracker blockers like EFF’s Privacy Badger, and privacy-protecting browsers and services like those offered by Firefox and DuckDuckGo. There are others, of course. And do use your privacy settings, even in services like Google and Facebook. They aren’t great, but they do send a signal that you care. I often debate company shills who maintain that people just don’t care about privacy and security, but when I ask how many people have adjusted their privacy settings, the answer points to the fact that people do care – we just need better options.
- Here’s our list of the best proxy services