Google seems to have unwittingly stumbled into the journalistic ethics debate over how strenuously the media should guard the identity of an alleged rape victim. If you had searched Google yesterday for certain combinations of words in connection with the Strauss-Kahn case, you'd have seen the accuser's photo and name served as the first result for the phrases "Strauss-Kahn, maid, perjury" and "Strauss-Kahn, maid, IRS" (searches we ran while researching a post on the consequences the accuser may face if the prosecution against Strauss-Kahn falls apart). Here's what we saw on one of those searches (we've erased the accuser's face):
The photo from the Ugandan news site New Vision accompanies the headline from the Alaska Dispatch, which ran a syndicated Christian Science Monitor story. It's not unusual for Google News to pull an image from one source and put it next to headlines from other sources. (And, just to clarify, the Christian Science Monitor story did not include the accuser's photo or name.) What's strange is that Google, through the action of its search algorithm, now finds itself essentially taking a side in a journalistic ethics question that has different answers in different countries.
According to a Google spokeswoman, the company's own principles don't necessarily influence its search results:
As Google News is algorithmically aggregated, our opinions are not part of the story or image ranking process. Our search results are just a reflection of the content that is on the web. If an article or image is flagged by a user or publisher as offensive we will manually review it for removal.
This photo was not flagged, according to the spokeswoman. She wouldn't say whether Google takes a stance on journalistic gray areas such as naming the victim of an alleged sexual assault. But she did address how a photo from a Ugandan newspaper came to sit next to a headline from an Alaskan paper running content from the Christian Science Monitor.
Our algorithms group individual articles, photos and videos from different sources into story clusters, so that when you search for a particular news topic on Google.com you may see a headline from one outlet and a photo from another outlet for the same or a related story. In many instances, the most relevant photo for a story is from a different outlet than the most relevant article. The photos and articles are clearly attributed to their sources. We only index and surface images and articles published by Google News sources. If a user or publisher flags content in Google News as offensive we will review it for removal.
It's true that a photo of the accuser in the Strauss-Kahn case is the most relevant result for a query about the accuser in the Strauss-Kahn case. But the fact that it appears in a search that doesn't include her name as one of its terms certainly raises an eyebrow, if only because her identity has been well shielded by most American media so far. We haven't been able to replicate the results today, which could be due to the mysteries of Google search, but is more likely because the column on the Strauss-Kahn accuser has moved off the New Vision home page and into the paper's archives.
Meanwhile, in the U.S., as the accuser in the Dominique Strauss-Kahn sexual assault case moves closer to the center of the story, the question of keeping her name out of the U.S. press gets increasingly complicated. She's been called a liar (but not by name) by the New York prosecutors handling the case. She's filed a lawsuit against the New York Post for calling her a "hooker" (though she hasn't attached her name to the suit), and the paper didn't use her name when it wrote its inflammatory stories in the first place. Yesterday, readers commenting on a Washington Post story about whether to name her actually named her, contrary to the paper's policy. The Washington Post yesterday provided a statement to Poynter's Steve Myers, explaining how the accuser's name ended up in the comments:
It was an oversight. Given the volume of comments, we are somewhat dependent on readers flagging comments that violate our standards to catch those that don’t trip our automatic ‘bozo’ filters. We should have deleted these in keeping with our normal standards on such issues. Like most major news sites, we struggle with moderation as no amount of technology (filters) and human moderation will catch everything unless other readers flag it.
An email sent to The Atlantic Wire from a Swiss reader, responding to yesterday's post about the consequences the maid may face, also used her name, which suggests that naming her is far more accepted outside the United States (foreign papers are even using her name in headlines).