This is a growing list; please check back for new resources and keep up with our latest thoughts via our newsletter.
Publications
NEW PUBLICATION! Now inviting comment.
We are inviting feedback from tech builders—whether in commercial tech, non-profit, or open-source spaces—on our new guide, Design from the Margins: A Methodology to De-weaponize Tech.
It offers a short outline of the Design from the Margins (DFM) framework and is designed for those committed to creating just, rights-protective technologies that resist being weaponized for surveillance, policing, or harm.
Queer Resistance to Digital Oppression, ARTICLE 19 in collaboration with the De|Center (July 2024). This report was published in four parts.
Design from the Margins: Centering the Most Marginalized and Impacted in Design Processes—from Ideation to Production, Afsaneh Rigot (Harvard Kennedy School Belfer Center for Science and International Affairs, May 2022)
Digital Crime Scenes: The Role of Digital Evidence in the Persecution of LGBTQ People in Egypt, Lebanon, and Tunisia, Afsaneh Rigot (Berkman Klein Center for Internet & Society at Harvard University, March 2022)
In the News
“The Weaponization of Things: Israel’s Techno-Violence, A Litmus Test for Technologists,” Afsaneh Rigot (Tech Policy Press)
“If Tech Fails to Design for the Most Vulnerable, It Fails Us All,” Afsaneh Rigot (Wired)
“Big Tech Should Support the Iranian People, Not the Regime,” Mahsa Alimardani, Kendra Albert, and Afsaneh Rigot (New York Times)
“Designing tech for the most vulnerable users leads to better products for all, says researcher” (CBC’s Spark podcast)
Case Studies
Discreet App Icons and Other Safety Changes on Grindr
Queer Syrian refugees in Lebanon told us they faced risks at army and police checkpoints, where they often experienced profiling and had their devices searched. Many times, it was the Grindr app logo on their phone that outed them and increased their risk.
In 2017, we worked with the Guardian Project to implement the Discreet App Icon feature, which lets users disguise the app behind a different, inconspicuous icon. Originally made free for “high risk countries,” it was released for free globally in 2020 and is one of the app’s more popular safety features. (For tech builders, a general sketch of how this kind of icon cloaking typically works on Android appears at the end of this case study.) Subsequently, in consultation with other LGBTQ communities in MENA, DFM has been used to push for many other security and harm reduction features on the app, such as PINs, screenshot blocking, the ability to unsend messages and images, direct lines of communication to communities, and more.
This work was conducted as part of ongoing research with ARTICLE 19.
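For tech builders curious about the mechanics behind discreet app icons, below is a minimal sketch of the general pattern Android apps commonly use for icon cloaking: declaring launcher aliases with alternative icons in the manifest and toggling them at runtime. All names here (com.example.app, MainActivity, DefaultAlias, CalculatorAlias, ic_calculator) are hypothetical placeholders; this illustrates the general technique only and is not Grindr’s or Signal’s actual implementation.

// AndroidManifest.xml declares one <activity-alias> per selectable icon,
// each pointing at the real launcher activity:
//
//   <activity android:name=".MainActivity" />
//   <activity-alias
//       android:name=".DefaultAlias"
//       android:targetActivity=".MainActivity"
//       android:icon="@mipmap/ic_launcher"
//       android:enabled="true"
//       android:exported="true">
//       <intent-filter>
//           <action android:name="android.intent.action.MAIN" />
//           <category android:name="android.intent.category.LAUNCHER" />
//       </intent-filter>
//   </activity-alias>
//   <activity-alias
//       android:name=".CalculatorAlias"
//       android:targetActivity=".MainActivity"
//       android:icon="@mipmap/ic_calculator"
//       android:label="Calculator"
//       android:enabled="false"
//       android:exported="true">
//       <intent-filter>
//           <action android:name="android.intent.action.MAIN" />
//           <category android:name="android.intent.category.LAUNCHER" />
//       </intent-filter>
//   </activity-alias>

import android.content.ComponentName
import android.content.Context
import android.content.pm.PackageManager

// Swap the visible launcher icon by enabling the inconspicuous alias and
// disabling the default one. Only component state changes: the app and its
// data are untouched, and the launcher shows the alias's icon and label.
fun enableDiscreetIcon(context: Context) {
    val pm = context.packageManager
    pm.setComponentEnabledSetting(
        ComponentName(context, "com.example.app.CalculatorAlias"),
        PackageManager.COMPONENT_ENABLED_STATE_ENABLED,
        PackageManager.DONT_KILL_APP
    )
    pm.setComponentEnabledSetting(
        ComponentName(context, "com.example.app.DefaultAlias"),
        PackageManager.COMPONENT_ENABLED_STATE_DISABLED,
        PackageManager.DONT_KILL_APP
    )
}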
Timed Messages on WhatsApp
In 2021, through research with LGBTQ communities in MENA and the criminal defense lawyers who represent them, we demonstrated the increasing use of digital evidence to criminalize LGBTQ people and how to navigate and counter it. We used the findings to work with WhatsApp to implement additional disappearing-message options, reducing the amount of data held on people’s devices that may later be used against them in court.
This work was conducted as part of research by Afsaneh Rigot supported by ARTICLE 19 and Jessica Fjeld, Mason Kortz, and students in the Cyberlaw Clinic at Harvard Law School.
Locked Chats on WhatsApp
In 2023, we extended our work with Meta on protecting LGBTQ communities in MENA. Drawing on the team’s long-standing research and expertise into patterns of device searches, interrogations, and data extraction used to sentence criminalized communities, we used additional findings from our research to create a feature on WhatsApp where the most “high risk” conversations can be hidden in locked chats. This adds a layer of protection for people using the app who are at risk of having their devices searched without a warrant and their communications used against them.
We have continued to push WhatsApp to refine this feature. Since its introduction, it has been updated with unique passcodes, options to fully hide the chat, and quick-hide options for emergencies. Further changes and consultations are underway to fully disentangle the feature from biometrics and ensure it stays in line with DFM and the needs of decentered communities.
This work was conducted as part of research by Afsaneh Rigot supported by ARTICLE 19, Jessica Fjeld, Mason Kortz, and students in the Cyberlaw Clinic at Harvard Law School.
App Icon Cloaking on Signal
In 2023, based on additional research with LGBTQ communities, activists, and protesters in MENA and beyond, we worked with Signal to implement a feature similar to the one Grindr first deployed, allowing users to hide the Signal logo on their phones and use a different, inconspicuous icon instead. Our research revealed that when these communities were stopped by police, interrogated, or faced device searches, they were especially concerned about exposing the data they kept in Signal: because Signal is a high-security app, it often holds their most sensitive data, such as organizing communications or other “criminalizing” material. This feature is currently available only on Android.
This work was conducted as part of both ongoing research with ARTICLE 19 and Afsaneh Rigot’s independent work and research.
Mutual Aid
We support our communities with no-strings mutual aid, in the form of cash and in-kind assistance. Contact us at contact@de-center.net for consideration. We encourage companies that have benefited from DFM to contribute funds.