{"id":111,"date":"2015-03-16T17:24:20","date_gmt":"2015-03-16T16:24:20","guid":{"rendered":"http:\/\/beatricemartini.it\/blog\/?p=111"},"modified":"2015-03-17T09:56:45","modified_gmt":"2015-03-17T08:56:45","slug":"eoa2015","status":"publish","type":"post","link":"https:\/\/beatricemartini.it\/blog\/eoa2015\/","title":{"rendered":"The Ethics of Algorithms: notes, emerging questions and resources"},"content":{"rendered":"<p><iframe loading=\"lazy\" src=\"http:\/\/srogers.cartodb.com\/viz\/4a5eb582-23ed-11e4-bd6b-0e230854a1cb\/embed_map\" width=\"100%\" height=\"520\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\"><\/iframe><small>Tweets relating to Ferguson after Michael Brown was shot. Map based on mentions of the city and other related key words. Via <a href=\"http:\/\/www.huffingtonpost.com\/2014\/08\/15\/ferguson-twitter_n_5681720.html\">The Huffington Post<\/a>.<\/small><\/p>\n<p><a href=\"https:\/\/www.youtube.com\/watch?v=S-ws2W6UbPU\">Algorithms<\/a> are ruling an ever-growing portion of our lives.<br \/>\nThey are adopted by health insurances to assess our chances to get sick, by airlines to make our flights safer, by social media companies to attract our attention to ads, by governments to predict criminal activity.<br \/>\nThey can guess with great accuracy a lot of things about us, such as gender, sexual orientation, race, personality type \u2013 and can also be applied to influence our political preferences, control what we do, target what we say and, in extreme cases, limit our freedom.<\/p>\n<p>This is not to say that the computational algorithm model should have an evil reputation. Both algorithms and human judgement can be beneficial, malicious, biased \u2013 and even wrong. 
The main difference between them is that over the years (centuries) we have developed a pretty good understanding of how human judgement works, while, when it comes to algorithms, we\u2019re just starting to get to know each other.<\/p>\n<p><!--more--><\/p>\n<p>The 2-day event \u201c<a href=\"https:\/\/cihr.eu\/the-ethics-of-algorithms\/\">The Ethics of Algorithms<\/a>\u201d, hosted by the <a href=\"https:\/\/cihr.eu\">Centre for Internet and Human Rights<\/a> and joined by a cross-disciplinary group of professionals from civil society, industry, technology, policy making and academia, looked into the role of algorithms in relation to two sensitive domains: freedom of expression \u2013 and its troubles with social media platforms and radical content; and society \u2013 and the ethical challenges it faces.<br \/>\nThis post collects some of the questions and reflections that emerged during the conference, as well as some additional resources, aiming to support further steps in upcoming conversations.<\/p>\n<blockquote class=\"twitter-tweet\" width=\"550\">\n<p>&quot;The way algorithms are right or wrong is very different from the way humans are right or wrong. We&#39;re so not ready for this.&quot; <a href=\"https:\/\/twitter.com\/hashtag\/EOA2015?src=hash\">#EOA2015<\/a><\/p>\n<p>&mdash; Frederike Kaltheuner (@fre8de8rike) <a href=\"https:\/\/twitter.com\/fre8de8rike\/status\/575233351231143936\">March 10, 2015<\/a><\/p><\/blockquote>\n<p><script async src=\"\/\/platform.twitter.com\/widgets.js\" charset=\"utf-8\"><\/script><\/p>\n<h3>Notes and emerging questions<\/h3>\n<ul>\n<li><b>Clarifying terms<\/b><\/li>\n<\/ul>\n<p>We realised that discussions on the relations between algorithms, policy and freedom of expression often end up in confusion due to differing assumptions about the meaning of key concepts. For example:<\/p>\n<ul style=\"list-style-type: circle;\">\n<li>what do we mean by <b>freedom of expression<\/b>? 
In different countries, this has different meanings.<\/li>\n<li>what do we mean by <b>threat<\/b>? And how do we make a distinction between threat to citizens and to the state?<\/li>\n<li>what do we mean by <b>radical content<\/b>? Both terrorist and extremist content end up being labelled as radical, but while terrorist content is illegal, extremist content isn\u2019t (while still potentially having dangerous consequences).<\/li>\n<li>how do we define <b>terrorism<\/b>, and can <b>violence<\/b> be the criterion used to define it, if we don\u2019t yet have a clear definition of what violence itself is? Should we shift the focus from studying how closely a group matches our <b>definition<\/b> of terrorism, to concentrating primarily on the <b>actual<\/b> harmful <b>effects<\/b> the group could cause?<\/li>\n<\/ul>\n<blockquote class=\"twitter-tweet\" width=\"550\">\n<p>&quot;[Terrorist and extremist] content providers are using the same techniques honed by spam &amp; scam people over the last decade.&quot; <a href=\"https:\/\/twitter.com\/hashtag\/EoA2015?src=hash\">#EoA2015<\/a><\/p>\n<p>&mdash; Zeynep Tufekci (@zeynep) <a href=\"https:\/\/twitter.com\/zeynep\/status\/574893833726590976\">March 9, 2015<\/a><\/p><\/blockquote>\n<p><script async src=\"\/\/platform.twitter.com\/widgets.js\" charset=\"utf-8\"><\/script><\/p>\n<ul>\n<li><b>State, social media companies and freedom of expression: a look at policy, legal and technical challenges.<\/b><\/li>\n<\/ul>\n<p><b>Policy-wise, if social media companies adopt algorithms to flag radical content, this can<\/b> <b>interfere<\/b> <b>with freedom of expression, state policies and human rights standards <\/b>(this can also be avoided if companies collaborate with experts in the field to create less interfering policies \u2013 but how to identify the most suitable experts to collaborate with is a whole other delicate issue). 
A recent example of the confusion between the roles of state and companies is the <a href=\"http:\/\/www.theguardian.com\/uk-news\/live\/2014\/nov\/25\/lee-rigby-woolwich-inquiry-report-published-live-coverage\">Lee Rigby report<\/a>, which stated that UK intelligence agencies couldn\u2019t have prevented the crime, but that Facebook should have alerted authorities about the publication of extremist messages online.<\/p>\n<p>A number of questions then emerge:<\/p>\n<ul style=\"list-style-type: circle;\">\n<li>what does it mean that so much of <b>our lives are taking place publicly<\/b>, but with so much <b>private intermediation from companies<\/b>?<\/li>\n<li>how does <b>intermediary liability<\/b> pose challenges for company responses to violent extremism?<\/li>\n<li>what\u2019s the difference between <b>state policy and companies\u2019 terms of service agreements<\/b>?<\/li>\n<li>what\u2019s the level of <b>public acceptance of state intervention<\/b>?<\/li>\n<\/ul>\n<p>Furthermore, since <b>defining online content as terrorist has a political nature, what does it mean when a social media company does that?<\/b><br \/>\nWe can refer to the case of the anti-Islamic video <i>Innocence of Muslims<\/i>, posted on YouTube in 2012. News reports state that the White House asked Google to take it down, and \u201c<a href=\"http:\/\/www.thedailybeast.com\/articles\/2012\/09\/20\/should-youtube-have-taken-down-incendiary-anti-muslim-video.html\">Google refused, citing its own guidelines regarding hate speech (though it later took down the video in Egypt and Libya, due to what it called the \u201cvery difficult situation\u201d in those countries)<\/a>\u201d. 
Does this mean that a company based in a North American country took a political decision about what was right for the people living in two North African countries?<\/p>\n<p><b>From a legal point of view<\/b>, debates and interpretations of terminology make the waters even murkier.<br \/>\nFirst of all: <b>social media companies and material support<\/b>. In 2011, Glenn Greenwald <a href=\"http:\/\/www.salon.com\/2011\/12\/20\/the_u_s_government_targets_twitter_terrorism\/\">speculated<\/a> that the US Department of Justice \u201ccould consider Twitter\u2019s providing of a forum to a designated Terrorist organization to constitute the crime of \u2018material support of Terrorism.\u2019\u201d<br \/>\nMaterial support <a href=\"http:\/\/justsecurity.org\/16961\/social-media-companies-material-support\/\">is defined as<\/a> \u201cany property, tangible or intangible, or <i>service<\/i> [&#8230;]\u201d.<br \/>\nIs social media a <b>service<\/b>? Short answer: yes \u2013 and as such, providing it could count as material support. But can social media companies be liable for terrorist content under material support law? Material support law would only apply if there was some form of <b>coordination<\/b>. 
Is this the case?<br \/>\nAnd in addition to this, as noted by Emily Goldberg Knox in her article <a href=\"http:\/\/justsecurity.org\/16961\/social-media-companies-material-support\/\">Social Media Companies and Material Support<\/a>: \u201cIt is also unclear whether satisfying the coordination requirement is sufficient to satisfy the concerted activity requirement.\u201d And: \u201cAdditionally, despite the potential threat that results from terrorist groups using social media, other factors, such as <b>counter-terrorism value and the First Amendment<\/b>, warrant consideration.\u201d<br \/>\nAs she concludes: \u201cHow courts, legislators, and the executive branch will weigh these factors remains to be seen.\u201d<\/p>\n<p><b>From a technical point of view<\/b>, the <b>automation<\/b> of the algorithms adopted by social media companies presents a wide range of controversies.<br \/>\nOn one side, automation, when managed correctly, can help social media companies <b>provide customised engagement<\/b> and recommendations which can enhance their popularity among users. On the other side, when automated algorithms are used by the very same companies to <b>monitor fundamental rights<\/b> \u2013 such as freedom of expression, right to privacy, freedom of belief \u2013 this presents issues involving a number of domains, from human rights to science and ethics. Is it time to <a href=\"https:\/\/cihr.eu\/event-from-big-data-to-banality-of-evil-an-epistemological-and-ethical-analysis-of-algorithms\/\">develop an epistemic foundation for ethics of algorithms and people that develop them<\/a>?<\/p>\n<blockquote class=\"twitter-tweet\" width=\"550\">\n<p>We need a framework for policing data and algorithm. 
&quot;Framework&quot; came up a lot in the two days discussions <a href=\"https:\/\/twitter.com\/hashtag\/EOA2015?src=hash\">#EOA2015<\/a><\/p>\n<p>&mdash; Mohamad \u0645\u062d\u0645\u062f (@monajem) <a href=\"https:\/\/twitter.com\/monajem\/status\/575252677577347072\">March 10, 2015<\/a><\/p><\/blockquote>\n<p><script async src=\"\/\/platform.twitter.com\/widgets.js\" charset=\"utf-8\"><\/script><\/p>\n<ul>\n<li><b>Citizen action<\/b><\/li>\n<\/ul>\n<p>We\u2019re about to enter a world where machines will take a lot of decisions for us \u2013 also due to mass surveillance. What are we going to do? Deciding not to be comfortable with this might be our best bet. The safest nations are those with active populations, <b>rallying and fighting for democracy<\/b> \u2013 which, like freedom of expression, is not a given in many countries, making things harder.<\/p>\n<p>As citizens we have the responsibility to look at what our legal structure says and advocate for what we want to change. We must hold our governments accountable, so that they are transparent about the policies governing radical content and our freedoms.<br \/>\nIn order to do this, <b>being objective<\/b> (as in: removing the fear of terrorism from our reflection) would help us understand <b>what we want freedom of expression to mean, at a global level and not on a case by case level<\/b>.<\/p>\n<ul>\n<li><b>Media pluralism<\/b><\/li>\n<\/ul>\n<p>Media pluralism is a prerequisite for freedom of expression. Independent and pluralistic media are <a href=\"http:\/\/eeas.europa.eu\/delegations\/documents\/eu_human_rights_guidelines_on_freedom_of_expression_online_and_offline_en.pdf\">essential to any society to ensure freedom of opinion and expression and the exercise of other human rights<\/a>.<br \/>\nIt\u2019s our responsibility to defend plurality \u2013 but this doesn\u2019t come without a set of challenges. 
<b>Lack of universal access to media, content restrictions on the Internet and inconsistent approaches of states<\/b> towards Internet freedom, online pluralism and the relevance of international legal standards on freedom of expression to Internet-based media, endanger plurality \u2013 and therefore, democracy.<br \/>\nGovernments should recognise the <b>relevance of international human rights principles to media pluralism<\/b> and adopt a rights-based approach to policies regarding freedom of expression.<\/p>\n<ul>\n<li><b>From &#8216;at risk&#8217; to &#8216;a risk\u2019: the stigmatic potential of predictive policing<\/b><\/li>\n<\/ul>\n<p>Using data drawn from years\u2019 worth of crime reports, algorithms can identify areas with high probabilities for certain types of crime and groups likely to commit them.<br \/>\nThis practice <b>can help the work of law enforcement agencies<\/b>, but it also raises <b>concerns about privacy, surveillance<\/b> <b>and<\/b> how much <b>power<\/b> should be given over to algorithms. Predictive policing can create categorical and biased suspicion of people in predicted crime areas, and lead to unnecessary questioning or excessive searching.<\/p>\n<p>Considering this:<\/p>\n<ul style=\"list-style-type: circle;\">\n<li>what is the police expected to do with the data? The output is not entirely clear.<\/li>\n<li>the original data are collected by people, which means they could be skewed by discriminatory practices at their source. 
So if datasets can\u2019t be considered reliable, and decisions about their use are so subjective: <b>shouldn\u2019t not only the algorithms, but also the datasets and the decisions taken about them, be transparent?<\/b><\/li>\n<li>by following the results of an algorithm too closely, we risk limiting our analysis to details while losing the big picture.<\/li>\n<li>there\u2019s a fine line between using predictive policing to target someone who\u2019s an activist and deciding that that person represents a threat. This quickly translates into <b>stigmatisation, exclusion, discrimination and indiscriminate surveillance of a community<\/b>.<\/li>\n<li><b>predictive policing is a political decision and it\u2019s ultimately a matter of power<\/b>. For example, we have a lot of data about the poor, because power is exercised to force them to provide far more information than is asked of wealthier citizens (see: <a href=\"http:\/\/www.theguardian.com\/global-development-professionals-network\/2013\/feb\/11\/biometrics-development-aid-work\">concerns around the adoption<\/a> of biometric analysis in development).<\/li>\n<\/ul>\n<blockquote class=\"twitter-tweet\" width=\"550\">\n<p>Does surveillance become &#39;privacy protecting&#39; by virtue of greater accuracy, more precise targeting? 
<a href=\"https:\/\/twitter.com\/hashtag\/EOA2015?src=hash\">#EOA2015<\/a><\/p>\n<p>&mdash; becky kazansky\u02d9\u2006\u035c\u029f\u02d9 (@pondswimmer) <a href=\"https:\/\/twitter.com\/pondswimmer\/status\/575289660311781377\">March 10, 2015<\/a><\/p><\/blockquote>\n<p><script async src=\"\/\/platform.twitter.com\/widgets.js\" charset=\"utf-8\"><\/script><\/p>\n<ul>\n<li><b>A quantitative approach to the analysis of war crimes<\/b><\/li>\n<\/ul>\n<p>Algorithms and statistics can help us analyse human rights violations, and even prove responsibilities behind war crimes and highlight violence patterns able to constitute proof in genocide trials.<br \/>\nThe example we focused on was the <a href=\"https:\/\/hrdag.org\/wp-content\/uploads\/2013\/01\/state-violence-guate-1999.pdf\">quantitative reflection on state violence in Guatemala between 1960-1996<\/a> by Patrick Ball, Paul Kobrak and Herbert F. Spirer of the Human Rights Data Analysis Group. A genocide requires patterns: to kill a big group of people, knowledge about the group\u2019s behaviour is needed. 
So:<\/p>\n<ul style=\"list-style-type: circle;\">\n<li><b>can algorithms identify violence patterns?<\/b><\/li>\n<li>and if so, <b>how can we decipher them?<\/b><\/li>\n<\/ul>\n<p><iframe loading=\"lazy\" src=\"https:\/\/www.youtube.com\/embed\/KQoxBuorqeA?rel=0&amp;controls=0\" width=\"560\" height=\"315\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\"><\/iframe><br \/>\n<small><a href=\"https:\/\/www.youtube.com\/watch?v=KQoxBuorqeA\"><em>Information Control and Strategic Violence<\/em><\/a> \u2013 Anita Gohdes, 31st Chaos Communication Congress [31c3] (December 28, 2014)<\/small><\/p>\n<ul>\n<li><b>Reputation, search, and finance<\/b><\/li>\n<\/ul>\n<p>In <a href=\"http:\/\/www.hup.harvard.edu\/catalog.php?isbn=9780674368279\">The Black Box Society<\/a>, Frank Pasquale identifies three aspects of our lives which are heavily monitored and influenced by algorithms:<\/p>\n<ul style=\"list-style-type: circle;\">\n<ul style=\"list-style-type: circle;\">\n<li><b>reputation<\/b>: the portrait that everything we click, browse, watch, listen to, paints about us, and that can be used to evaluate us during a hiring process or to target us as potential customers for anything from a new car to a pregnancy test;<\/li>\n<li><b>search<\/b>: we look for information online and what we find is what the search engine we\u2019re using wants us to find. Search engines use ranking algorithms to provide results of our search, and the result we get is a combination of both the company\u2019s and the algorithm\u2019s biases applied to answer our search question;<\/li>\n<li><b>finance<\/b>: algorithms are known to hide financiers\u2019 moves very well. 
To mention an example from recent years, it was algorithms that made it possible for banks to combine sub-prime mortgages into respectable-looking investments, contributing to the financial crisis of 2007-2008.<\/li>\n<\/ul>\n<\/ul>\n<h3>What\u2019s next<\/h3>\n<p>As we speak, key international events are convening representatives from civil society, industry, policy making and academia, to keep working on the hardest challenges presented by the digital age we\u2019re just getting into. To name a few: <a href=\"https:\/\/openitp.org\/festival\/circumvention-tech-festival.html\">Circumvention Tech Festival<\/a> (Spain) and the upcoming <a href=\"https:\/\/responsibledata.io\/\">Responsible Data Forum<\/a> (multiple locations), <a href=\"https:\/\/www.rightscon.org\/\">RightsCon<\/a> (Philippines) and <a href=\"https:\/\/www.gccs2015.com\/\">Global Conference on CyberSpace<\/a> (The Netherlands). It\u2019s essential that we make the most of this momentum, and join forces to think about what we want the world\u2019s rights to look like, now and for the new generations to come.<\/p>\n<p>It\u2019s clear that we need a <strong>new and multi-disciplinary understanding of how the Internet and the algorithms<\/strong> keeping it in motion <strong>work<\/strong>, and <strong>what this means from a global, intersectional perspective<\/strong>. 
It\u2019s a matter of human rights and the exercise of power, and it\u2019s crucial that our societies keep underlining that freedom of expression, plurality and privacy are fundamental rights we all need to fight for and defend.<\/p>\n<h3>Additional resources:<\/h3>\n<ul style=\"list-style-type: circle;\">\n<ul>\n<li><a href=\"http:\/\/papers.ssrn.com\/sol3\/papers.cfm?abstract_id=2485309\">The Slippery Slope of Material Support Prosecutions: Social Media Support to Terrorists<\/a> \u2013 Emily Goldberg Knox, Hastings Law Journal (August 22, 2014)<\/li>\n<li><a href=\"http:\/\/www.washingtonpost.com\/blogs\/monkey-cage\/wp\/2015\/02\/25\/the-abuses-of-social-media-for-understanding-international-conflict\/\">The (ab)uses of social media for understanding international conflict<\/a> \u2013 Thomas Zeitzoff, John W. Kelly and Gilad Lotan, The Washington Post (February 25, 2015)<\/li>\n<li><a href=\"http:\/\/gizmodo.com\/more-surveillance-wont-protect-free-speech-1679173716\">More Surveillance Won&#8217;t Protect Free Speech<\/a>\u00a0\u2013 Jillian York, Gizmodo (January 13, 2015)<\/li>\n<li><a href=\"https:\/\/medium.com\/message\/ferguson-is-also-a-net-neutrality-issue-6d2f3db51eb0\">What Happens to #Ferguson Affects Ferguson: Net Neutrality, Algorithmic Filtering and Ferguson<\/a> \u2013 Zeynep Tufekci, The Message (August 14, 2014)<\/li>\n<li><a href=\"http:\/\/www.niemanlab.org\/2011\/10\/why-hasnt-occupywallstreet-trended-in-new-york\/\">Why hasn\u2019t #OccupyWallStreet trended in New York?<\/a> \u2013 Megan Garber, Nieman Lab (October 17, 2011)<\/li>\n<li><a href=\"http:\/\/www.nytimes.com\/2014\/12\/08\/opinion\/we-cant-trust-uber.html\">We Can\u2019t Trust Uber<\/a> \u2013 Zeynep Tufekci, Brayden King, The New York Times (December 7, 2014)<\/li>\n<li><a href=\"http:\/\/www.nytimes.com\/2012\/11\/17\/opinion\/beware-the-big-data-campaign.html\">Beware the Smart Campaign<\/a> \u2013 Zeynep Tufekci, The New York Times (November 16, 2012)<\/li>\n<li><a 
href=\"http:\/\/www.newrepublic.com\/article\/117878\/information-fiduciary-solution-facebook-digital-gerrymandering\">Facebook Could Decide an Election Without Anyone Ever Finding Out<\/a> \u2013 Jonathan Zittrain, New Republic (June 1, 2014)<\/li>\n<li><a href=\"http:\/\/www.motherjones.com\/politics\/2014\/10\/can-voting-facebook-button-improve-voter-turnout\">Facebook Wants You to Vote on Tuesday. Here&#8217;s How It Messed With Your Feed in 2012<\/a> \u2013 Micah L. Sifry, Mother Jones (October 31, 2014)<\/li>\n<li><a href=\"http:\/\/web.mit.edu\/gtmarx\/www\/techsoccon.html\">Technology and Social Control: The Search for the Illusive Silver Bullet Continues<\/a> \u2013 Gary T. Marx, Encyclopedia of the Social &amp; Behavioral Sciences (forthcoming)<\/li>\n<li><a href=\"https:\/\/hrdag.org\/wp-content\/uploads\/2013\/01\/state-violence-guate-1999.pdf\">State Violence in Guatemala, 1960-1996: A Quantitative Reflection<\/a> \u2013 Patrick Ball, Paul Kobrak, Herbert F. Spirer, American Association for the Advancement of Science (1999) <a href=\"https:\/\/hrdag.org\/wp-content\/uploads\/2013\/01\/state-violence-guate-1999.pdf\">[pdf &#8211; english]<\/a> <a href=\"https:\/\/hrdag.org\/wp-content\/uploads\/2013\/01\/state-violence-guate-1999-espanol.pdf\">[pdf &#8211; espa\u00f1ol]<\/a><\/li>\n<li><a href=\"http:\/\/www.iasc-culture.org\/THR\/THR_article_2015_Spring_Pasquale.php\">The Algorithmic Self<\/a> \u2013 Frank Pasquale, The Hedgehog Review (Spring 2015)<\/li>\n<li><a href=\"http:\/\/www.nytimes.com\/2015\/01\/04\/upshot\/the-measuring-sticks-of-racial-bias-.html\">Racial Bias, Even When We Have Good Intentions<\/a> \u2013 Sendhil Mullainathan, The New York Times (January 3, 2015)<\/li>\n<\/ul>\n<\/ul>\n","protected":false},"excerpt":{"rendered":"<p>Tweets relating to Ferguson after Michael Brown was shot. Map based on mentions of the city and other related key words. Via The Huffington Post. Algorithms are ruling an ever-growing portion of our lives. 
They are adopted by health insurances to assess our chances to get sick, by airlines to make our flights safer, by &hellip; <a href=\"https:\/\/beatricemartini.it\/blog\/eoa2015\/\" class=\"more-link\">Continue reading <span class=\"screen-reader-text\">The Ethics of Algorithms: notes, emerging questions and resources<\/span><\/a><\/p>\n","protected":false},"author":2,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":[],"categories":[7,8,13],"tags":[26,27,32,28,29,30,31],"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v19.8 - https:\/\/yoast.com\/wordpress\/plugins\/seo\/ -->\n<title>The Ethics of Algorithms: notes, emerging questions and resources | Beatrice Martini \u2013 blog<\/title>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/beatricemartini.it\/blog\/eoa2015\/\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"The Ethics of Algorithms: notes, emerging questions and resources | Beatrice Martini \u2013 blog\" \/>\n<meta property=\"og:description\" content=\"Tweets relating to Ferguson after Michael Brown was shot. Map based on mentions of the city and other related key words. Via The Huffington Post. Algorithms are ruling an ever-growing portion of our lives. 
They are adopted by health insurances to assess our chances to get sick, by airlines to make our flights safer, by &hellip; Continue reading The Ethics of Algorithms: notes, emerging questions and resources\" \/>\n<meta property=\"og:url\" content=\"https:\/\/beatricemartini.it\/blog\/eoa2015\/\" \/>\n<meta property=\"og:site_name\" content=\"Beatrice Martini \u2013 blog\" \/>\n<meta property=\"article:published_time\" content=\"2015-03-16T16:24:20+00:00\" \/>\n<meta property=\"article:modified_time\" content=\"2015-03-17T08:56:45+00:00\" \/>\n<meta name=\"author\" content=\"Beatrice\" \/>\n<meta name=\"twitter:label1\" content=\"Written by\" \/>\n\t<meta name=\"twitter:data1\" content=\"Beatrice\" \/>\n\t<meta name=\"twitter:label2\" content=\"Est. reading time\" \/>\n\t<meta name=\"twitter:data2\" content=\"11 minutes\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\/\/schema.org\",\"@graph\":[{\"@type\":\"WebPage\",\"@id\":\"https:\/\/beatricemartini.it\/blog\/eoa2015\/\",\"url\":\"https:\/\/beatricemartini.it\/blog\/eoa2015\/\",\"name\":\"The Ethics of Algorithms: notes, emerging questions and resources | Beatrice Martini \u2013 blog\",\"isPartOf\":{\"@id\":\"https:\/\/beatricemartini.it\/blog\/#website\"},\"datePublished\":\"2015-03-16T16:24:20+00:00\",\"dateModified\":\"2015-03-17T08:56:45+00:00\",\"author\":{\"@id\":\"https:\/\/beatricemartini.it\/blog\/#\/schema\/person\/680a80473149f3a373cfd94a2dc0eff0\"},\"breadcrumb\":{\"@id\":\"https:\/\/beatricemartini.it\/blog\/eoa2015\/#breadcrumb\"},\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\/\/beatricemartini.it\/blog\/eoa2015\/\"]}]},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\/\/beatricemartini.it\/blog\/eoa2015\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Home\",\"item\":\"https:\/\/beatricemartini.it\/blog\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"The Ethics 
of Algorithms: notes, emerging questions and resources\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\/\/beatricemartini.it\/blog\/#website\",\"url\":\"https:\/\/beatricemartini.it\/blog\/\",\"name\":\"Beatrice Martini \u2013 blog\",\"description\":\"On tech and tools for justice and rights\",\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\/\/beatricemartini.it\/blog\/?s={search_term_string}\"},\"query-input\":\"required name=search_term_string\"}],\"inLanguage\":\"en-US\"},{\"@type\":\"Person\",\"@id\":\"https:\/\/beatricemartini.it\/blog\/#\/schema\/person\/680a80473149f3a373cfd94a2dc0eff0\",\"name\":\"Beatrice\",\"image\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/beatricemartini.it\/blog\/#\/schema\/person\/image\/\",\"url\":\"https:\/\/secure.gravatar.com\/avatar\/39f3e7ced144ed6d393d6ad6a7ab489d?s=96&d=mm&r=g\",\"contentUrl\":\"https:\/\/secure.gravatar.com\/avatar\/39f3e7ced144ed6d393d6ad6a7ab489d?s=96&d=mm&r=g\",\"caption\":\"Beatrice\"},\"sameAs\":[\"http:\/\/beatricemartini.it\"],\"url\":\"https:\/\/beatricemartini.it\/blog\/author\/beatrice\/\"}]}<\/script>\n<!-- \/ Yoast SEO plugin. -->","yoast_head_json":{"title":"The Ethics of Algorithms: notes, emerging questions and resources | Beatrice Martini \u2013 blog","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/beatricemartini.it\/blog\/eoa2015\/","og_locale":"en_US","og_type":"article","og_title":"The Ethics of Algorithms: notes, emerging questions and resources | Beatrice Martini \u2013 blog","og_description":"Tweets relating to Ferguson after Michael Brown was shot. Map based on mentions of the city and other related key words. Via The Huffington Post. Algorithms are ruling an ever-growing portion of our lives. 
They are adopted by health insurances to assess our chances to get sick, by airlines to make our flights safer, by &hellip; Continue reading The Ethics of Algorithms: notes, emerging questions and resources","og_url":"https:\/\/beatricemartini.it\/blog\/eoa2015\/","og_site_name":"Beatrice Martini \u2013 blog","article_published_time":"2015-03-16T16:24:20+00:00","article_modified_time":"2015-03-17T08:56:45+00:00","author":"Beatrice","twitter_misc":{"Written by":"Beatrice","Est. reading time":"11 minutes"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"WebPage","@id":"https:\/\/beatricemartini.it\/blog\/eoa2015\/","url":"https:\/\/beatricemartini.it\/blog\/eoa2015\/","name":"The Ethics of Algorithms: notes, emerging questions and resources | Beatrice Martini \u2013 blog","isPartOf":{"@id":"https:\/\/beatricemartini.it\/blog\/#website"},"datePublished":"2015-03-16T16:24:20+00:00","dateModified":"2015-03-17T08:56:45+00:00","author":{"@id":"https:\/\/beatricemartini.it\/blog\/#\/schema\/person\/680a80473149f3a373cfd94a2dc0eff0"},"breadcrumb":{"@id":"https:\/\/beatricemartini.it\/blog\/eoa2015\/#breadcrumb"},"inLanguage":"en-US","potentialAction":[{"@type":"ReadAction","target":["https:\/\/beatricemartini.it\/blog\/eoa2015\/"]}]},{"@type":"BreadcrumbList","@id":"https:\/\/beatricemartini.it\/blog\/eoa2015\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Home","item":"https:\/\/beatricemartini.it\/blog\/"},{"@type":"ListItem","position":2,"name":"The Ethics of Algorithms: notes, emerging questions and resources"}]},{"@type":"WebSite","@id":"https:\/\/beatricemartini.it\/blog\/#website","url":"https:\/\/beatricemartini.it\/blog\/","name":"Beatrice Martini \u2013 blog","description":"On tech and tools for justice and rights","potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/beatricemartini.it\/blog\/?s={search_term_string}"},"query-input":"required 
name=search_term_string"}],"inLanguage":"en-US"},{"@type":"Person","@id":"https:\/\/beatricemartini.it\/blog\/#\/schema\/person\/680a80473149f3a373cfd94a2dc0eff0","name":"Beatrice","image":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/beatricemartini.it\/blog\/#\/schema\/person\/image\/","url":"https:\/\/secure.gravatar.com\/avatar\/39f3e7ced144ed6d393d6ad6a7ab489d?s=96&d=mm&r=g","contentUrl":"https:\/\/secure.gravatar.com\/avatar\/39f3e7ced144ed6d393d6ad6a7ab489d?s=96&d=mm&r=g","caption":"Beatrice"},"sameAs":["http:\/\/beatricemartini.it"],"url":"https:\/\/beatricemartini.it\/blog\/author\/beatrice\/"}]}},"jetpack_featured_media_url":"","jetpack_sharing_enabled":true,"_links":{"self":[{"href":"https:\/\/beatricemartini.it\/blog\/wp-json\/wp\/v2\/posts\/111"}],"collection":[{"href":"https:\/\/beatricemartini.it\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/beatricemartini.it\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/beatricemartini.it\/blog\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/beatricemartini.it\/blog\/wp-json\/wp\/v2\/comments?post=111"}],"version-history":[{"count":34,"href":"https:\/\/beatricemartini.it\/blog\/wp-json\/wp\/v2\/posts\/111\/revisions"}],"predecessor-version":[{"id":165,"href":"https:\/\/beatricemartini.it\/blog\/wp-json\/wp\/v2\/posts\/111\/revisions\/165"}],"wp:attachment":[{"href":"https:\/\/beatricemartini.it\/blog\/wp-json\/wp\/v2\/media?parent=111"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/beatricemartini.it\/blog\/wp-json\/wp\/v2\/categories?post=111"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/beatricemartini.it\/blog\/wp-json\/wp\/v2\/tags?post=111"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}