So Kamara has worked on developing secure database schemes in which data can be audited and checked privately yet transparently, that do not allow data to be exported or duplicated, and that delete entries automatically after a given amount of time, without requiring special authorization from an authority like a judge.
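To illustrate just the time-based deletion property, here is a minimal sketch of an auto-expiring store. This is purely illustrative application logic with a hypothetical `ExpiringStore` class and retention parameter; a scheme like the ones Kamara describes would enforce these guarantees cryptographically, not by trusting the application to purge its own records.

```python
import time


class ExpiringStore:
    """Illustrative only: entries vanish after a fixed retention window.

    A real encrypted database scheme would enforce expiry
    cryptographically; this sketch just models the policy.
    """

    def __init__(self, retention_seconds):
        self.retention = retention_seconds
        self._entries = {}  # key -> (value, inserted_at)

    def put(self, key, value):
        # Record the insertion time alongside the value.
        self._entries[key] = (value, time.monotonic())

    def get(self, key):
        # Purge expired entries before every read, so a lookup
        # can never return data older than the retention window.
        self._purge()
        entry = self._entries.get(key)
        return entry[0] if entry else None

    def _purge(self):
        now = time.monotonic()
        self._entries = {
            k: (v, t)
            for k, (v, t) in self._entries.items()
            if now - t < self.retention
        }
```

The point of the sketch is the policy, not the mechanism: deletion is the default behavior and happens on a timer, rather than waiting for an authority to order it.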

“I think there is an intersection between traditional cryptography and privacy and what I was calling ‘crypto for the people,'” Kamara says. “There is research and there are tools that can be beneficial to large subsets of people, as in the encrypted messaging app Signal. But there are also problems and adversarial models that are unique to marginalized groups, and those problems are not being investigated. For example, not everyone ends up in a gang database, and certainly very few cryptographers or academic computer science researchers end up in gang databases.”

Kamara also advocated using the flexibility and security of tenured professorships as an opportunity to push the envelope of what cryptographic research can be—including in the case of his own talk. “I went into it thinking, ‘I’m glad I have tenure, because this is going to cost me,'” he says. But Kamara says the response has been very positive so far. “I’m sure there are many others who disagree and didn’t like the talk, but so far they haven’t reached out to let me know,” Kamara says.

The long-standing question of morality in cryptography rarely makes it to the foreground, even within the academic community itself. The discourse flared up in the wake of Edward Snowden’s 2013 revelations about mass digital surveillance by the National Security Agency, particularly after a seminal 2015 paper by UC Davis cryptographer Phillip Rogaway, which made the case that cryptography is “an inherently political tool” with “an intrinsically moral dimension.”

“I plead for a reinvention of our disciplinary culture to attend not only to puzzles and math, but, also, to the societal implications of our work,” Rogaway wrote.

Five years later, he says he doesn’t see many changes in the research most cryptographers are doing or the topics they are discussing at conferences. But he adds that he was impressed with Kamara’s talk and the steps it took to move the discourse forward. The essay Rogaway wrote in 2015, he says, would now include not just a discussion of the ethical need to defend the masses against mass surveillance, but an entreaty that the academic community focus more of its work on serving marginalized groups.

“We don’t work in a vacuum and we’re not pure mathematicians,” Rogaway told WIRED. “As much as certain cryptographers would like to see themselves as doing pure mathematics on some kind of quest of discovery, that’s not an apt description of where we sit. The field does have these very strong political connections and connections to power. And if we just say, ‘Oh, that’s not my domain,’ that in itself is a really politically situated, ahistorical view and ultimately quite elitist.”

Today, partly because of rapidly expanding anti-abuse work on social networks and communication platforms, the idea of an ethical imperative in privacy technologies has become more mainstream. But much of the actual work in cryptography remains fundamentally abstract. The practical applications that do exist often originated with a narrow field of view.

“Building the same stuff you always did but claiming that it’s for people in marginalized communities is not the same thing as human-centric threat modeling,” wrote Lea Kissner, a cryptographer and security engineer focused on anti-abuse and privacy, in a series of tweets about Kamara’s talk last week.

The type of tailored, threat-specific research Kamara described requires intimate knowledge of the actual, nuanced needs of a marginalized group. Kamara emphasized in his talk that the cryptography community needs to be much more inclusive and representative if it wants to help the vulnerable. And researchers need to seek firsthand expertise to gain a deeper understanding case by case.