[ NNSquad ] A Family's Horror -- and the Role of Google Images




              A Family's Horror -- and the Role of Google Images

                 http://lauren.vortex.com/archive/000677.html


Greetings.  I'm about to pose some difficult questions.  I won't
assert that I know the answers to them all or even suggest that
succinct answers are possible.  But the questions themselves cut to
the heart of some of the most contentious and emotional ethical issues
of the Internet today.

A California appeals court has just unanimously ruled that a lawsuit
may move forward against the California Highway Patrol, related to
horrific imagery of an 18-year-old girl decapitated in a
single-vehicle traffic accident.  The photos were allegedly forwarded
by one or more on-scene CHP officers to another party, and then spread
widely across the Internet ( http://bit.ly/9u1wqo ).

The victim's family has been trying for years to hold the CHP
responsible for the dissemination of these images, and to somehow
reduce the impact and exploitation of these nightmarish photos and the
associated hateful abuse that has spread across the Net.  Many of the
sites exploiting these images attempt to portray themselves as
"educational" in nature, but in reality most are merely purveyors of
what the film industry calls "torture porn" -- except that in this
case they're dealing with the horrific death of a real person, not
with fictional characters and special effects.

Regular readers know that I'm firmly opposed to censorship and have
praised Google's recent commitment to cease censorship of Google
search results in China ( http://lauren.vortex.com/archive/000667.html ).

I have also suggested in the past that some sort of "dispute
resolution" mechanism -- to deal with unusual or exceptional
situations triggered by search engine results -- would be worthy of
both consideration and debate.  If you have a few minutes to spare,
here is a pointer to some discussion of this issue 
( http://lauren.vortex.com/archive/000254.html ).

So it's with some consternation that I consider the easy availability
of the accident photos in question being facilitated via Google
Images.

A simple search on the victim's name in Google Images yields seemingly
endless copies of the exceedingly gruesome photos, *even when Google
SafeSearch is set to its most strict setting*.

Let's be very clear.  I'm not suggesting that the photos be banned.
And indeed, Google is merely indexing and archiving imagery that is by
definition actually posted and hosted at external sites not under
Google's control.

But even given these facts, would it be fair to say that Google has no
role to play in the exploitation and monetization of these images, and
in the continuing grief that they cause the victim's parents and other
family members?

Again, Google isn't the creator or poster of the photos in question.
But Google is almost certainly the primary mechanism through which the
vast majority of persons discover and locate these images.

There are some relatively simple amelioratory steps that I'd suggest
in this specific case.

Google could take a more proactive stance to prevent such images from
being so openly displayed when SafeSearch is not in its completely
unfiltered mode.  My hunch is that flagging most of these specific
accident photos as they are posted -- even on an ongoing basis (based
on keywords and Google's advanced image analysis algorithms) -- would
be relatively straightforward given Google's resources.
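
To make the idea concrete (and only as an illustration -- this is a
rough sketch of one well-known technique, not a description of
Google's actual systems), a perceptual "difference hash" can flag
near-duplicate copies of already-identified sensitive images as new
copies are indexed.  The sketch below assumes the Pillow imaging
library; the file names and the match threshold are hypothetical.

  # Minimal sketch: flag near-duplicates of known sensitive images via
  # a 64-bit "difference hash" (dHash).  Illustrative only.
  from PIL import Image

  def dhash(image_path, hash_size=8):
      """Compute a 64-bit difference hash for an image file."""
      # Grayscale, then shrink to (hash_size + 1) x hash_size pixels.
      img = Image.open(image_path).convert("L").resize(
          (hash_size + 1, hash_size))
      pixels = list(img.getdata())
      bits = 0
      for row in range(hash_size):
          for col in range(hash_size):
              left = pixels[row * (hash_size + 1) + col]
              right = pixels[row * (hash_size + 1) + col + 1]
              # Each bit records whether brightness drops left-to-right.
              bits = (bits << 1) | (1 if left > right else 0)
      return bits

  def hamming(a, b):
      """Number of differing bits between two 64-bit hashes."""
      return bin(a ^ b).count("1")

  def is_flagged(candidate_path, flagged_hashes, max_distance=10):
      """True if the candidate is a near-duplicate of a flagged image."""
      h = dhash(candidate_path)
      return any(hamming(h, f) <= max_distance for f in flagged_hashes)

  # Hypothetical usage: hash the known photos once, then screen each
  # newly indexed image before it can appear in filtered results.
  # flagged = {dhash(p) for p in ["known_photo_1.jpg", "known_photo_2.jpg"]}
  # if is_flagged("newly_indexed.jpg", flagged):
  #     ...exclude from SafeSearch-filtered results...

A hash like this only catches copies and lightly modified versions of
images that have already been identified by human review; it decides
nothing on its own, which is in keeping with the narrow kind of step
I'm suggesting here.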

More broadly, this case brings into focus a class of issues
representing extremely difficult ethical dilemmas that often aren't
subject to improvement through engineering alone.

Censorship is not only dangerous but essentially impossible to
completely enforce on the Internet.  A single copy of a text or photo
(or a musical performance or feature film, for that matter) posted on
the Web is likely to survive publicly in some form into technological
perpetuity.  That's the reality, like it or not.

On the other hand, it can be argued that Google and other aggregators
of indexing information and links do bear some ethical responsibility
to try -- within the bounds of common sense, free speech, and
technical practicality -- to help avoid the widespread dissemination
of exceptionally hurtful and damaging materials in unfiltered search
result contexts.

In other words, it really should not be so easy to stumble across
photos of a decapitated 18-year-old girl when Google Image search
results are in a strict filtering mode.

At the macro level, to say that dealing with such issues is a dilemma
presenting major scaling challenges is a significant understatement.
But as I've earlier noted, there are a wide variety of situations
where the algorithmic precision of search engine rankings can do real
and completely unwarranted harm to actual people 
( http://lauren.vortex.com/archive/000253.html ).

Which brings us to perhaps the most important question associated with
this entire topic.  From both technical and ethical standpoints, can
we honestly say that it's unreasonable or impossible to research and
deploy steps that would help prevent thoughtless acts conducted over
the course of a few minutes -- like the alleged sending of those
accident photos by CHP officers -- from endlessly dragging other
persons through a living hell?

Not censorship.  Not a ban.  Not new laws.

Rather, just doing a better job at further extending ethical
considerations to search, in a fusion of software engineering and
humanism.

If we instead choose to insist that this cannot be accomplished, we're
eerily invoking the lyrics of Tom Lehrer's comedic critique of
German/U.S. rocketry pioneer Wernher von Braun: "'Once the rockets
are up, who cares where they come down?  That's not my department,'
says Wernher von Braun."

As Lehrer sang them, many years ago, the words were very funny indeed.

In the real world of the Internet, these ethical issues are both
difficult and serious -- but I believe subject to reasonable and
effective resolution, given the will to do so.

I can think of no organization better positioned and suited than
Google to be in the vanguard of this important area.  I trust that
they are up to the challenge.

--Lauren--
Lauren Weinstein
lauren@vortex.com
Tel: +1 (818) 225-2800
http://www.pfir.org/lauren
Co-Founder, PFIR
   - People For Internet Responsibility - http://www.pfir.org
Co-Founder, NNSquad
   - Network Neutrality Squad - http://www.nnsquad.org
Founder, GCTIP - Global Coalition 
   for Transparent Internet Performance - http://www.gctip.org
Founder, PRIVACY Forum - http://www.vortex.com
Member, ACM Committee on Computers and Public Policy
Lauren's Blog: http://lauren.vortex.com
Twitter: https://twitter.com/laurenweinstein