On the upgrades of centuries-old systems of oppression and present-day tools to fight back
Yemeni women during a rally commemorating the fifth anniversary of the 2011 Arab Spring uprising. Taez, February 2016. AFP / Ahmad Al-Basha.
Globally, law enforcement agencies are adopting increasingly sophisticated surveillance technologies for predictive policing and to monitor already overpoliced communities and demographics. Common grounds for discriminatory treatment include race, class, citizenship, religion, gender identity, and sexual orientation.
We hear in the news about phone interceptions, seized devices, hacked accounts. But most often, civil society is given little to no information about how far these monitoring activities go.
How is technology employed to control targeted groups? And how can technology support those under surveillance in reclaiming and protecting their rights?
Continue reading Targeted surveillance, overpolicing and technology for resistance
Newsletter #9: sent!
(and archived if you missed it)
Work-wise: publishing a curated list of podcasts that help widen representation and democracy in the media space; following a hashtag frenzy about all things journalism, web, movement building and resistance; giving a final touch to articles about to be published.
Links-wise: gun violence, slow violence, the FBI admitting flaws in decades of hair analysis, tools to avoid snoopers online, wireless routers spying on our breathing, Rihanna breathing it in and out, and Cher.
If you are not yet a subscriber: subscribe now!
Tweets relating to Ferguson after Michael Brown was shot. Map based on mentions of the city and other related key words. Via The Huffington Post.
Algorithms are ruling an ever-growing portion of our lives.
They are adopted by health insurers to assess our likelihood of getting sick, by airlines to make our flights safer, by social media companies to draw our attention to ads, by governments to predict criminal activity.
They can guess a lot of things about us with great accuracy, such as gender, sexual orientation, race, personality type – and they can also be used to influence our political preferences, control what we do, target what we say and, in extreme cases, limit our freedom.
This is not to say that the computational algorithm as such deserves an evil reputation. Both algorithms and human judgement can be beneficial, malicious, biased – and even wrong. The main difference between them is that over the years (centuries) we have developed a pretty good understanding of how human judgement works, while, when it comes to algorithms, we're just starting to get to know each other.
Continue reading The Ethics of Algorithms: notes, emerging questions and resources