Liu Xiao Bo – Ai Weiwei
On July 13, 2017, activist, writer and Nobel Peace laureate Liu Xiaobo passed away in government custody.
He advocated for non-violent action, participated in the Tiananmen Square pro-democracy protests, and helped to draft and gather support for Charter 08, a call for peaceful political reform and an end to one-party rule. He spent almost a quarter of his life behind bars in China for advocating human rights and democracy.
Once the news of his death became public, a related phenomenon began to be reported: references to Liu Xiaobo, his work and his passing were being censored on social media in China more harshly than ever before.
Continue reading Censoring dissent. How the mourning over activist Liu Xiaobo’s death is being erased from the internet
On the upgrades of centuries-old systems of oppression and present-day tools to fight back
Yemeni women during a rally commemorating the fifth anniversary of the 2011 Arab Spring uprising. Taez, February 2016. AFP / Ahmad Al-Basha.
Globally, law enforcement agencies are adopting increasingly sophisticated surveillance technologies to carry out predictive policing and monitor already overpoliced communities and demographics. Common grounds for discriminatory conduct include race, class, citizenship, religion, gender identity and sexual orientation.
We hear in the news about phone interceptions, seized devices and hacked accounts. But most often, civil society is given little to no information about how far these monitoring activities go.
How is technology employed to control targeted groups? And how can technology help those being monitored reclaim and protect their rights?
Continue reading Targeted surveillance, overpolicing and technology for resistance
I’m glad and proud to announce that I have joined Aspiration as Human Rights Technology Lead.
Aspiration connects nonprofit organisations, foundations and activists with software solutions and technology skills that help them better carry out their missions.
My work will focus on building technology capacity strategies in support of global nonprofit human rights organisations, capturing the scope and scale of the role technology plays in human rights efforts in different contexts, and exploring ways to create an inclusive, shared language when discussing technology in human rights efforts.
Stay tuned to read and hear more about it!
Protests, uprisings and unrest are key elements of freedom of expression, shaping society and public debate throughout history.
Over the centuries, individuals and groups have adopted countless tactics to reclaim rights and fight for justice – tactics that change over time, transforming strategically according to different historical and political contexts.
What’s the current state of the art? What tools do protesters adopt to raise awareness, stir unrest and mobilise?
Technology has entered the world of activism, and we can recognise forms of protest that combine offline and online elements, as well as expressions of dissent that operate exclusively in the digital space.
This article aims to provide an overview of what digital civil disobedience looks like today, observe which tactics are in use and consider a possible path towards the future tools that will help global citizens reclaim their rights.
Continue reading Digital civil disobedience: tactics, tools and future threads
Today is May 28 – happy Menstrual Hygiene Day!
For the occasion, I wrote an article about menstrual hygiene rights and you can read it here.
I am very excited to see a piece I wrote published by Bitch Media, one of my favourite media outlets (both in print and online!) and I’m particularly grateful to Sarah Mirk, Bitch’s online editor, for her invitation to write for it and for the opportunity to focus on this topic.
Menstrual hygiene is a critical human right and menstrual education is essential not only for those who menstruate, but for all human beings. So, everyone is invited to read and celebrate – today and all year long!
There’s a word – an entire multi-faceted concept in itself – that comes to my mind very often, whether I’m reading the news, working, talking with loved ones or following someone’s train of thought online.
The concept it expresses has always been at the core of my perspective on the world and of my work exploring how technology can most effectively serve justice and rights.
So I decided to write about it, as it might turn out to be useful for others as well – next time you’re scraping data to investigate the patterns behind an issue, supporting a group in building their advocacy strategy, or making up your own mind before going to the polls.
Continue reading An intersectional take on technology, rights and justice
by Maya Ganesh, Dirk Slater and Beatrice Martini.
“You are welcome anytime, you’re not like others who come with their own bag of potatoes”
It’s with these words that the chair of Women’s Network for Unity (WNU), a sex worker collective based in Phnom Penh, thanked Maya Ganesh and Dirk Slater from Tactical Technology Collective for approaching the work with them with no assumptions or preconceived agenda, but eager to listen and develop their collaboration together.
Mutual trust and respect, real commitment to collaboration, and flexibility are all essential to being responsibly equipped to work with a marginalised community. And even they are not enough. That’s why, together with Maya and Dirk, we decided to write about the experience as potato-less tech capacity builders, as we think it could greatly help other practitioners planning to collaborate with groups struggling to have their rights honoured and their voices heard.
Continue reading Working with marginalised communities on using data and technology in advocacy
Tweets relating to Ferguson after Michael Brown was shot. Map based on mentions of the city and other related key words. Via The Huffington Post.
Algorithms are ruling an ever-growing portion of our lives.
They are adopted by health insurers to assess our chances of getting sick, by airlines to make our flights safer, by social media companies to draw our attention to ads, and by governments to predict criminal activity.
They can infer many things about us with great accuracy, such as gender, sexual orientation, race and personality type – and can also be applied to influence our political preferences, control what we do, target what we say and, in extreme cases, limit our freedom.
This is not to say that the computational algorithm model should have an evil reputation. Both algorithms and human judgement can be beneficial, malicious, biased – and even wrong. The main difference between them is that over the years (centuries, even) we have developed a pretty good understanding of how human judgement works, while, when it comes to algorithms, we’re just starting to get to know each other.
Continue reading The Ethics of Algorithms: notes, emerging questions and resources