If technology fails to design for the most marginalized, it fails us all

What do protesters in Russia have in common with Twitter users who fear Elon Musk reading their DMs, and with people worried about the criminalization of abortion? It would serve them all to be protected by a stronger set of design practices from the companies developing these technologies.

Let’s back up. Last month, Russian police coerced protesters into unlocking their phones to search for evidence of dissent, leading to arrests and fines. Worse, Telegram, one of the main chat-based apps used in Russia, is vulnerable to these searches. Even just having the Telegram app installed on a personal device can be taken to mean its owner doesn’t support the Kremlin’s war. But Telegram’s builders have failed to design the app with personal safety in mind for high-risk environments, and not just in the Russian context. As a result, Telegram can be weaponized against its users.

Likewise, amid the back-and-forth over Elon Musk’s plan to buy Twitter, many who use the platform have raised concerns about changes to algorithmic content moderation and other design decisions made on the whim of his $44 billion deal. Making pronouncements that affect highly marginalized people, without any framework of risk and harm, leads to declarations like “authenticate all humans.” This appears to be a push to remove online anonymity, something I have written about very personally. It is ill-conceived, harmful to those most at risk, and not backed by any actual methodology or evidence. Aside from his vague outbursts for change, Musk’s previous actions, combined with the existing harms of Twitter’s current structures, make it clear that we are heading toward further impacts on marginalized groups, such as Black and POC Twitter users and trans people. Meanwhile, the leaked draft Supreme Court opinion in Dobbs v. Jackson indicates that the protections afforded under Roe v. Wade are under mortal threat. As those seeking or providing abortion services face criminalization, it is becoming clear that the tools and technologies most commonly used to access vital health care data are unsafe and dangerous.

The same steps could protect users in all of these situations. If the developers of these tools had designed their applications with a focus on safety in high-risk environments — for the people often dismissed as the more “extreme” or “edge” cases and therefore ignored — the weaponization that users fear would be impossible, or at least users would have tools to manage their risk.

The reality is that building technology that is better, safer, and less harmful requires design grounded in the lived realities of the most marginalized. These “edge cases” are often ignored because they fall outside what a typical user is likely to experience. Yet they are powerful indicators of our technologies’ shortcomings. That’s why I call these cases — the people, groups, and communities that are most impacted and least supported — “decentered.” Decentered people are the most marginalized and often the most criminalized. By understanding and identifying who is most impacted by different social, political, and legal frameworks, we can understand who is most likely to fall victim to the weaponization of certain technologies. And, as a bonus, technology re-envisioned for the extremes can always be rolled out to the broader user base.

From 2016 until earlier this year, I led a research project at the human rights organization Article 19, in collaboration with local groups in Iran, Lebanon, and Egypt and with support from international experts. We explored the lived experiences of queer people who faced police persecution because of their use of specific personal technologies. Take the case of a queer Syrian refugee in Lebanon who was stopped at a police or army checkpoint. Their phone was searched at random for evidence. Seeing the icon for the queer app Grindr, the officer determined the person was queer. Other areas of the refugee’s phone were then searched for anything deemed “queer content.” The refugee was taken in for further interrogation and subjected to verbal and physical abuse. They then faced sentencing under Article 534 of the Lebanese Penal Code, risking imprisonment, fines, and/or the revocation of their immigration status in Lebanon. This was one of many such cases.

But what if there were no icon to see, and no way to tell that an individual was using an app that indicated their sexual orientation — while still letting them keep the app and stay connected with other queer people? Based on this research and in collaboration with the Guardian Project, Grindr worked on implementing an incognito mode in its product.

The company has also implemented our other recommendations with similar success. Changes such as the Discreet App Icon allow users to have the app appear as a common utility, such as a calendar or calculator. In an initial police search, users can thus avoid being outed by the content or visuals of the apps they own. Though this feature was based solely on findings from extreme cases, such as that of the queer Syrian refugee, it proved popular with users around the world. In fact, it became so popular that it went from being available only in “high-risk” countries to being free internationally in 2020; the popular PIN feature was also introduced under this project. This was the first time a dating app had taken such robust security measures for its users; many of Grindr’s competitors have since followed suit.
