Wednesday 2 May 2018

Cuttings: April 2018

The tyranny of algorithms is part of our lives: soon they could rate everything we do - article by John Harris in The Guardian. "For the past couple of years a big story about the future of China has been the focus of both fascination and horror. It is all about what the authorities in Beijing call 'social credit', and the kind of surveillance that is now within governments’ grasp. The official rhetoric is poetic. According to the documents, what is being developed will 'allow the trustworthy to roam everywhere under heaven while making it hard for the discredited to take a single step'.... Using a secret algorithm, Sesame Credit constantly scores people from 350 to 950, and its ratings are based on factors including considerations of 'interpersonal relationships' and consumer habits. Bluntly put, being friends with low-rated people is bad news. Buying video games, for example, gets you marked down. Participation is voluntary but easily secured, thanks to an array of enticements. High scores unlock privileges such as being able to rent a car without a deposit, and fast-tracked European visa applications.... It would be easy to assume none of this could happen here in the west. But the 21st century is not going to work like that. These days credit reports and scores – put together by agencies whose reach into our lives is mind-boggling – are used to judge job applications, thereby threatening to lock people into financial problems. ... Three years ago Facebook patented a system of credit rating that would consider the financial histories of people’s friends. Opaque innovations known as e-scores are used by increasing numbers of companies to target their marketing, while such outfits as the already infamous Cambridge Analytica trawl people’s online activities so as to precisely target political messaging. The tyranny of algorithms is now an inbuilt part of our lives."

How to persuade people (hint: not by telling them they're stupid) - article by Anne Cassidy in The Guardian. "A professor at Arizona State University who pioneered the study of persuasion, Cialdini was part of a team of behavioural scientists that helped propel Obama to victory in 2012. ... Were the Democrats to seek his advice [for the next presidential election], Cialdini would tell them to resist blaming the president’s supporters. '[Trump voters] don’t want to believe that they were stupid,' he says. 'Cognitive dissonance research shows that the more consequential your error, the less willing you are to believe it was an error, because that undercuts your view of yourself as a good decision maker.'... Cialdini popularised the theory of social proof, which maintains that people will often look to their peers to decide what to think and how to behave. His book, Influence, published in 1984 and one of the best-selling books on behavioural psychology, was followed by a sequel two years ago, Pre-suasion, in which he explains the ideal conditions for exerting influence. Cialdini’s principles of persuasion have long been applied to marketing and business management. Winning people over, in every field from politics to the workplace, he says, can come down to the right word in the right place. If you are presenting an idea at work and you want to get your team on side, whatever you do, don’t ask for their opinion on it, Cialdini says. 'When we ask someone for an opinion that person takes a half step back from us and becomes a critic.' Instead of using the word opinion, you should ask for advice on your plan. 'That person takes a half step forward because the word "advice" asks for their collaboration.'"

How can Facebook change when it exists to exploit personal data? - article by John Naughton in The Observer. "The bigger story behind the current controversy is the fact that what Cambridge Analytica claimed to have accomplished would not have been possible without Facebook. Which means that, in the end, Facebook poses the problem that democracies will have to solve.... TechCrunch listed 11 separate controversies that resulted from Facebook being caught taking liberties with users’ data or trust. In most of these cases, the Zuckerberg response has been the same: sorrowful contrition followed by requests for forgiveness, topped off with resolutions to do better in future. The lad is beginning to sound like an alcoholic who, after his latest bout of drink-related excess, says sorry and expresses his determination to reform.... Facebook’s core business is exploiting the personal data of its users. That is its essence. So expecting it to wean itself off that exploitation is like trying to persuade ExxonMobil that it should get out of the oil and gas business."

What price ethics for software designers in the poisonous era of Cambridge Analytica? - article by John Naughton in The Observer. "As the furore about Cambridge Analytica raged last week, I thought about Szilard [who conceived the nuclear chain reaction in 1933, hence making possible the atomic bomb] and then about three young Cambridge scientists who brought another powerful idea into the world. Their names are Michal Kosinski, David Stillwell and Thore Graepel and in 2013 they published an astonishing paper, which showed that Facebook 'likes' could be used to accurately predict a range of highly sensitive personal attributes, including sexual orientation, ethnicity, religious and political views, personality traits, intelligence, happiness, use of addictive substances, parental separation, age and gender. The work reported in their paper was a paradigm of curiosity-driven research.... What they might not have appreciated ... was the power this conferred on Facebook. But one of their colleagues in the lab obviously did get that message. His name was Aleksandr Kogan and we are now beginning to understand the implications of what he did. ... A remarkable essay by Yonatan Zunger in the Boston Globe [argues] that the Cambridge Analytica scandal suggests that computer science now faces an ethical reckoning analogous to those that other academic fields have had to confront. 'Chemistry had its first reckoning with dynamite [and] poison gas attack... Physics had its reckoning when nuclear bombs destroyed Hiroshima and Nagasaki ... Human biology had eugenics. Medicine had Tuskegee and thalidomide, civil engineering a series of building, bridge and dam collapses.' Up to now, my guess is that most computer science graduates have had only a minimal exposure to ethical issues such as these."
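The prediction technique at the heart of the Kosinski, Stillwell and Graepel paper can be illustrated with a toy model. To be clear, the pages, weights and bias below are entirely invented for illustration; the actual study reduced a large matrix of users' likes by singular value decomposition and fitted regression models over tens of thousands of volunteers. The sketch just shows the underlying idea: a person's binary "like" vector can drive a simple linear classifier for a private attribute.

```python
# Toy sketch of attribute prediction from 'likes' (illustrative only).
# Hypothetical learned weights: a positive weight means liking that page
# correlates with the predicted trait; a negative weight means the opposite.
from math import exp

WEIGHTS = {"page_a": 1.2, "page_b": -0.8, "page_c": 0.5}
BIAS = -0.3

def predict_probability(likes: set) -> float:
    """Logistic score: estimated probability of the trait, given liked pages."""
    z = BIAS + sum(w for page, w in WEIGHTS.items() if page in likes)
    return 1.0 / (1.0 + exp(-z))  # sigmoid maps the score into (0, 1)

# A user who likes two trait-correlated pages gets a high score:
p = predict_probability({"page_a", "page_c"})  # z = -0.3 + 1.2 + 0.5 = 1.4
```

The unsettling point the paper made is that once such weights are learned, applying them requires nothing more than this handful of arithmetic operations per user.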

The death of the news feed - post by Benedict Evans in his blog, referenced in John Naughton’s Memex 1.1 blog. "When I got married, my future wife and I were both quite sure that we would have a nice small, quiet wedding.... Then, we actually made a list of ‘close family and friends’… and realized why people have 100 or 200 people at a wedding. You know a lot more people than you think. According to Facebook, its average user is eligible to see at least 1,500 items per day in their newsfeed. Rather like the wedding with 200 people, this seems absurd. But then, it turns out that over the course of a few years you do ‘friend’ 200 or 300 people. And if you’ve friended 300 people, and each of them posts a couple of pictures, taps like on a few news stories or comments a couple of times, then, by the inexorable law of multiplication, yes, you will have something over a thousand new items in your feed every single day.... This is the logic that led Facebook inexorably to the ‘algorithmic feed’, which is really just tech jargon for saying that instead of this random (i.e. 'time-based') sample of what’s been posted, the platform tries to work out which people you would most like to see things from, and what kinds of things you would most like to see.... One basic problem here is that if the feed is focused on ‘what do I want to see?’, then it cannot be focused on ‘what do my friends want (or need) me to see?’ Sometimes this is the same thing - my friend and I both want me to see that they’re throwing a party tonight. But if every feed is a sample, then a user has no way to know who will see their post. Indeed, conceptually one might suggest that they have no way to know if anyone will see this post."

Marx predicted our present crisis, and points the way out - article by Yanis Varoufakis in The Guardian, adapted from his introduction to a new edition of The Communist Manifesto. "What makes the manifesto truly inspiring today is its recommendation for us in the here and now... What we don’t need at this juncture are sermons on the injustice of it all, denunciations of rising inequality or vigils for our vanishing democratic sovereignty. Nor should we stomach desperate acts of regressive escapism: the cry to return to some pre-modern, pre-technological state where we can cling to the bosom of nationalism. What the manifesto promotes in moments of doubt and submission is a clear-headed, objective assessment of capitalism and its ills, seen through the cold, hard light of rationality.... The manifesto argues that the problem with capitalism is not that it produces too much technology, or that it is unfair. Capitalism’s problem is that it is irrational. Capital’s success at spreading its reach via accumulation for accumulation’s sake is causing human workers to work like machines for a pittance, while the robots are programmed to produce stuff that the workers can no longer afford and the robots do not need.... When asked by journalists who or what is the greatest threat to capitalism today, I defy their expectations by answering: capital! Of course, this is an idea I have been plagiarising for decades from the manifesto. Given that it is neither possible nor desirable to annul capitalism’s 'energy', the trick is to help speed up capital’s development (so that it burns up like a meteor rushing through the atmosphere) while, on the other hand, resisting (through rational, collective action) its tendency to steamroller our human spirit. In short, the manifesto’s recommendation is that we push capital to its limits while limiting its consequences and preparing for its socialisation."

The generation gap is back, but not as we know it - article by Brigid Delaney in The Guardian. "What does the new generational conflict look like? Inside the newsroom at the New York Times there is an ideological conflict brewing between the old guard and the 'new woke' employees. An article published last week in Vanity Fair titled 'Journalism is not about creating safe spaces: Inside the woke civil war at the New York Times' illustrates the tensions.... The battle lines are being drawn around older hands who believe in reporting a diverse range of views (including those that the left may find offensive), and who think that the reporting of the Trump presidency should be fairly straight down the line. The younger generation were appalled at the 2016 election results and have expressed grievance at the Times hiring for their opinion pages one writer who has expressed scepticism about climate science and a millennial who supports campus free speech. Part of this gap between young and old is the rise and mainstreaming of identity politics and intersectionality, a theory originating in black feminism that calls out identity-based oppression.... The woke generation (young millennials aged between 18 and 30) brought the theories of intersectionality and identity into debates about a range of human rights issues: campus free speech, trans rights, the Me Too movement, marriage equality, gun control, reproductive rights, Black Lives Matter and, in Australia, the Change the Date movement.... The landscape has shifted dramatically in the past few years, and older people (on the left and right) have found that they have been tripped up and called out by their more woke colleagues or friends or Twitter followers.... According to the piece in the New Yorker on the new campus politics, it has 'flummoxed many people who had always thought of themselves as devout liberals. Wasn’t free self-expression the whole point of social progressivism?'"

Are you ready? Here is all the data Facebook and Google have on you - article by Dylan Curran in The Guardian. "(1) Google knows where you’ve been. Google stores your location (if you have location tracking turned on) every time you turn on your phone. ... (2) Google knows everything you’ve ever searched – and deleted. Google stores search history across all your devices. That can mean that, even if you delete your search history and phone history on one device, it may still have data saved from other devices.... (3) Google has an advertisement profile of you. Google creates an advertisement profile based on your information, including your location, gender, age, hobbies, career, interests, relationship status, possible weight (need to lose 10lb in one day?) and income.... (4) Google knows all the apps you use. Google stores information on every app and extension you use. They know how often you use them, where you use them, and who you use them to interact with. That means they know who you talk to on Facebook, what countries you are speaking with, what time you go to sleep.... (5) Google has all of your YouTube history. Google stores all of your YouTube history, so they probably know whether you’re going to be a parent soon, if you’re a conservative, if you’re a progressive, if you’re Jewish, Christian, or Muslim, if you’re feeling depressed or suicidal, if you’re anorexic … (6) The data Google has on you can fill millions of Word documents. Google offers an option to download all of the data it stores about you. I’ve requested to download it and the file is 5.5GB, which is roughly 3m Word documents."