“Providing relevant answers has been the cornerstone of Google’s approach to search from the very beginning. It would undermine the people’s trust in our results and company if we were to change course.” – Google, in response to SEME research.
SEME, in this context, is not what those who Googled the acronym and found links to Japanese culture might think:
“The search engine manipulation effect (SEME) is the change in consumer preference from manipulations of search results by search engine providers.” – Wikipedia
According to some, “Such manipulations…could shift the voting preferences of undecided voters by 20 percent or more and up to 80 percent in some demographics.”
Who knows? These days one has to be very careful sharing any information as the proliferation of fake news is so widespread and so deep.
- Net fake/fake
- Fake based on truth
- Truth clouded by fake
- Fake creating more fake
- Truth hard to ascertain
I am content to merely comment on Google’s statement, as it strikes a chord of dissonance. After all, the whole monetization structure of the enterprise is based on its ability to affect key, hyper-targeted audiences. Yet seemingly that only holds true for the purple whale pants with the pink embroidery and not for anything else…hmm.
Frankly, as in the past, my issue is less with fake news—let’s be clear, I find the fabled New York Times guilty in that arena as well—than it is with the way algorithms are developing. To suggest that the Google algorithm is open, honest and neutral is disingenuous at best and, more likely, misleading.
“The amoral status of an algorithm does not negate its effects on society,” wrote Amit Datta and Anupam Datta of Carnegie Mellon and Michael Carl Tschantz of the International Computer Science Institute, authors of a 2015 Google advertising study.
Damien Tambini, an associate professor at the London School of Economics, who focuses on media regulation, was quoted by The Guardian as saying:
There’s an editorial function to Google and Facebook but it’s being done by sophisticated algorithms. They say it’s machines not editors. But that’s simply a mechanised editorial function.
It’s a function I’ve written about, with respect and admiration, as the head of the Creative Data Jury at the Cannes Lions International Festival of Creativity in France.
I refer you to a recent article in The Guardian, “Google, democracy and the truth about internet search,” written by Carole Cadwalladr, published Sunday, December 4, 2016.
It’s a piece most worth reading, no matter your view, as it sets up the arguments in a coherent and rational way. The author documents her journey of typing “Jews/Muslims/Women (etc.) are” into the Google search bar and watching, with horror, the auto-fill that followed.
To be fair, and clear, Google has cleaned up this mess since I first read Cadwalladr’s article last week—try it and you’ll see for yourself. Nevertheless, what was coming up just a short while ago were hateful terms and hate-filled links.
“Jews are evil.” “Muslims need to be eradicated.” “Women all have a little prostitute in them.” Cadwalladr writes about typing in the question “Was Hitler bad?” As I’ve just confirmed, here’s Google’s top result: “10 Reasons Why Hitler Was One of the Good Guys.” Among other things, the article states, “He implemented social and cultural reform.” Eight of the other 10 search results agree: Hitler really wasn’t that bad.
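Mechanically, autocomplete of this sort is easy to picture: a popularity-ranked prefix match over logged queries. The sketch below is a toy illustration of that idea, not Google’s actual system, whose workings are unpublished. The point it makes is that a popularity-only ranker has no notion of truth or decency; whatever is typed most often floats to the top.

```python
from collections import Counter

def build_suggester(query_log):
    """Count how often each full query appears in a log of past searches."""
    return Counter(query_log)

def suggest(counts, prefix, k=3):
    """Return the k most frequent logged queries starting with the prefix.

    Ranking is by raw popularity alone (ties broken alphabetically), so a
    loud minority of searchers can dominate what everyone else is shown.
    """
    matches = [(q, n) for q, n in counts.items() if q.startswith(prefix)]
    matches.sort(key=lambda pair: (-pair[1], pair[0]))
    return [q for q, _ in matches[:k]]

# Invented query log: the most popular completion is also the false one.
log = (
    ["the earth is flat"] * 5
    + ["the earth is round"] * 2
    + ["the earth is old"]
)
print(suggest(build_suggester(log), "the earth is"))
```

Run it and the false-but-popular completion ranks first, which is the whole problem in miniature: popularity is a proxy for interest, not for truth.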
Cadwalladr continues by quoting Danny Sullivan, founding editor of SearchEngineLand.com:
He’s been recommended to me by several academics as one of the most knowledgeable experts on search. Am I just being naive, I ask him? Should I have known this was out there? “No, you’re not being naive,” he says. “This is awful. It’s horrible. It’s the equivalent of going into a library and asking a librarian about Judaism and being handed 10 books of hate. Google is doing a horrible, horrible job of delivering answers here. It can and should do better.”
You seek answers thinking you’ll find the truth and the trusted librarian hands you books on hate…
Yet here’s the thing.
While sources from the left and right are attacking Google and others for bias, mainstream media has jumped in as well. It’s not a new issue; in fact, it’s a problem identified as early as 1999 in the United States and now recognized globally.
In the words of acclaimed media critic Robert McChesney, as quoted in the paper “Shaping the Web: Why the Politics of Search Engines Matters” (2000):
The American media system is spinning out of control in a hyper-commercialized frenzy. Fewer than ten transnational media conglomerates dominate much of our media; fewer than two dozen account for the overwhelming majority of our newspapers, magazines, films, television, radio, and books. With every aspect of our media culture now fair game for commercial exploitation, we can look forward to the full-scale commercialization of sports, arts, and education, the disappearance of notions of public service from public discourse, and the degeneration of journalism, political coverage, and children’s programming under commercial pressure.
Many, myself included, have written about the money at stake here. See “Google’s Dance”:
…from Google’s perspective – and I don’t mean Google’s PR department, I mean Google’s management – Google is an advertising company. Ninety-seven percent of Google’s revenues, after all, come from advertising.
And we have all followed the fake news farms, whose farmers made a killing during the recent US election. Why look to Russian hackers (if it was them)? Sadly, they were sharing the truth; it was the novelists who were doing far worse damage.
I have shared this thought from The New York Times before, but it’s critical enough to repeat:
There is a widespread belief that software and algorithms that rely on data are objective. But software is not free of human influence. Algorithms are written and maintained by people, and machine learning algorithms adjust what they do based on people’s behavior. As a result, say researchers in computer science, ethics and law, algorithms can reinforce human prejudices.
And according to The Wall Street Journal,
The legacy media companies addressed this issue by trying, admittedly with varying degrees of success, to establish walls between the departments responsible for editorials, news reporting and advertising. This will be far more difficult in an era where algorithms—not editors—often control the content and ads a person consumes.
To be fair, there are many who would argue, rightly so, that the varying degrees of success referenced above have diminished over time and, because of—you guessed it—monetization issues, are diminishing ever more quickly.
Robert Epstein, senior research psychologist at the American Institute for Behavioral Research and Technology in California, worries this is where it all leads:
Google has become the main gateway to virtually all knowledge, mainly because the search engine is so good at giving us exactly the information we are looking for, almost instantly and almost always in the first position of the list it shows us after we launch our search – the list of ‘search results’.
That ordered list is so good, in fact, that about 50 per cent of our clicks go to the top two items, and more than 90 per cent of our clicks go to the 10 items listed on the first page of results; few people look at other results pages, even though they often number in the thousands, which means they probably contain lots of good information. Google decides which of the billions of web pages it is going to include in our search results, and it also decides how to rank them. How it decides these things is a deep, dark secret – one of the best-kept secrets in the world, like the formula for Coca-Cola.
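Epstein’s percentages follow directly from how steeply click-through rates fall with rank. The numbers below are invented for illustration (real CTR curves vary by study and query type), but any similarly steep curve reproduces his figures:

```python
# Hypothetical click-through rates: a steep head for page one, a thin
# tail for everything after it. These are illustrative, not measured data.
page_one = [0.30, 0.17, 0.10, 0.07, 0.05, 0.04, 0.035, 0.03, 0.025, 0.02]
deeper = [0.001] * 90  # positions 11-100, which almost nobody ever sees

ctr = page_one + deeper
total = sum(ctr)

top2 = sum(ctr[:2]) / total    # share of all clicks going to positions 1-2
page1 = sum(ctr[:10]) / total  # share going to the first page of results

print(f"top two results: {top2:.0%} of clicks")
print(f"first page:      {page1:.0%} of clicks")
```

With these made-up numbers, the top two results take about 51 percent of clicks and the first page about 90 percent, echoing the quoted figures. Whoever controls the ordering of that list therefore controls most of what gets seen.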
And in my first and only appearance before a congressional hearing, in a scene out of a movie, sitting at the green-covered desk in front of a room full of people, I warned of the dangers of an algorithm in the hands of a monopoly…any monopoly.
So Google and the rest have to come to grips with who they are… Either they can influence or they can’t. Either they can cause perception shifts or they can’t. From Wired:
Jigsaw, the Google-owned tech incubator and think tank—until recently known as Google Ideas—has been working over the past year to develop a new program it hopes can use a combination of Google’s search advertising algorithms and YouTube’s video platform to target aspiring ISIS recruits and ultimately dissuade them from joining the group’s cult of apocalyptic violence. The program, which Jigsaw calls the Redirect Method and plans to launch in a new phase this month, places advertising alongside results for any keywords and phrases that Jigsaw has determined people attracted to ISIS commonly search for. Those ads link to Arabic- and English-language YouTube channels that pull together preexisting videos Jigsaw believes can effectively undo ISIS’s brainwashing—clips like testimonials from former extremists, imams denouncing ISIS’s corruption of Islam, and surreptitiously filmed clips inside the group’s dysfunctional caliphate in Northern Syria and Iraq.
“This came out of an observation that there’s a lot of online demand for ISIS material, but there are also a lot of credible organic voices online debunking their narratives,” says Yasmin Green, Jigsaw’s head of research and development. “The Redirect Method is at its heart a targeted advertising campaign: Let’s take these individuals who are vulnerable to ISIS’ recruitment messaging and instead show them information that refutes it.”
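Stripped of its machine learning and curation, the core of what Wired describes is keyword-triggered ad placement: a curated list of risk phrases, and counter-messaging links served when a search matches one. A minimal sketch of that matching step, with every phrase and link invented for illustration:

```python
# Hypothetical curated phrases mapped to counter-messaging playlists.
# All data here is invented; the real Redirect Method keyword list and
# destinations are not public.
CURATED_PHRASES = {
    "how to join group x": "youtube.example/former-member-testimonials",
    "group x recruitment": "youtube.example/clerics-respond",
}

def redirect_ad(search_query):
    """Return a counter-messaging link if the query matches a curated phrase,
    or None for an ordinary query that gets no special ad placement."""
    q = search_query.lower().strip()
    for phrase, playlist in CURATED_PHRASES.items():
        if phrase in q:
            return playlist
    return None

print(redirect_ad("How to join Group X"))  # matched: returns a playlist link
print(redirect_ad("weather tomorrow"))     # unmatched: returns None
```

The notable design choice is that this is ordinary search advertising machinery pointed at a persuasion goal, which is precisely why the program undercuts the claim that search results cannot shift what people think.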
And it would seem that when it comes to those whale pants and the terrorists we don’t want, the answer is that they think they can.
We the people need to demand the truth… Listen:
Let her and Falsehood grapple; who ever knew Truth put to the worse in a free and open encounter? – John Milton, Areopagitica
Place your bets. Me? I’m an optimist.
What do you think?