April 6, 2013, 07:00 AM. By Ricardo Murer. B.S. in Computer Science (USP) and Master's Degree in Communications and Arts (USP). Expert in digital strategy and new technologies. Follow: @rdmurer
"A robot may not injure a human being or, through inaction, allow a human being to come to harm." [First Law of Robotics] Isaac Asimov (American writer and biochemist, 1920-1992)
In 1942, in his science fiction short story "Runaround," Isaac Asimov described the three laws of robotics: laws that, once encoded in the memory of robots, would ensure that these machines would never take any action capable of injuring or killing a human being. Unfortunately, however, reality does not always follow the paths advocated in sci-fi stories.
It may seem strange at first to imagine a war without soldiers, but many countries have significantly increased their investments in technologies that promote remote combat. According to Peter W. Singer, "The U.S. Air Force trained more unmanned systems operators than fighter and bomber pilots combined" [SINGER, 2013]. Moreover, according to Chris Anderson, editor of Wired magazine, the drone industry may reach US$ 30 billion by 2015. A rapidly evolving field, no doubt.
But how long will drones fly only in conflict areas? And who exactly can control the reach of a digital virus manipulated by governments and by people without ethical and moral principles?
At a time when computer technologies expand the scope and autonomy of the armed forces, I consider it essential to recover the fundamental values that weave together and keep human society in balance. Every technological advancement entails environmental and social consequences and affects our way of living. And those consequences are often unpredictable.
The Good
This new chapter of our history has already begun, more precisely inside our homes, in our children's bedrooms. Digital entertainment companies have long been refining their war-themed video games, to such a degree that they may be considered, without a doubt, an excellent form of military training. The most popular, and those with the greatest psychological effect, are the games in which the teenager plays as if he were the soldier, in first person (First-Person Shooter, or FPS), in virtual settings ever closer to reality.
In this case, the screen displays and represents the user's view, while he holds the gun with his own virtual hands. The aim is to give the feeling of being "inside" the action, increasing the user's engagement and psychological commitment.
A few simple searches on YouTube turn up real footage shot by American soldiers fighting in Afghanistan or Iraq. When comparing these scenes with current digital creations, the resemblance is startling. And it is not restricted to setting and artillery; it extends even to the characters (I recommend a visit to Activision's website for an update).
The virtualization of war begins not only with the subliminal training of young people, but also with the quest to create digital, artificial universes: perfect simulations of reality. In a word, the Grail of virtual reality.
"Virtual reality promises to expand the experience of sound and images beyond the limits of the mass media, beyond even the confines of the computer or the television screens" [DOWNES, 2005].
So we should not be surprised that Prince Harry of England, while on military service in Afghanistan, revealed in an interview with a journalist that he credited his skills as a gunner aboard an Apache helicopter to having played video games. He said: "It is a pleasure for me, because I am one of those people who like to play PlayStation and Xbox; so, by using my fingers, I like to think I am probably very helpful."
The Bad
It was during World War II that computational science began its relationship with the armed forces. Motivated by a noble cause, the total annihilation of Nazi Germany and its totalitarian regime, scientists and mathematicians worked for the Allies to develop weapons and computers. Among them were Alan Turing, one of the fathers of computing science, and Gordon Welchman, jointly responsible for the invention of the Bombe, a machine that broke the Germans' encrypted messages, produced by the Enigma machine.
Since then, from rockets to nuclear submarines, from radars to stealth aircraft, all military armaments have carried high technology on board. But while in the past soldiers were considered essential to winning battles, this scenario has begun to change dramatically as a result of the development of robotics and digital communication technologies.
Today, soldiers and pilots are leaving the battlefield and migrating to control rooms, which bear some resemblance to their teenage bedrooms. There they rediscover their computers, screens, and joysticks. Having already flown planes in video games and virtual missions, why shouldn't they do the same over an area of real conflict in Afghanistan or Iraq?
Robotics has been in use in Iraq for some years now. The U.S. Army has a special unit operating there: a team called EOD (Explosive Ordnance Disposal), composed of a soldier who controls a small robot from a distance. This robot, a PackBot, weighs just over one pound, has multiple cameras and sensors and an agile arm with four joints, and moves on tracks. According to the U.S. Army, these units have already saved thousands of lives in Iraq.
If robotic military units operating by land and air are saving lives today, why should we oppose their use? What is the drawback?
The Ugly
According to Moore's Law, the number of transistors on integrated circuits doubles approximately every two years, so we may expect more sophistication and technological development ahead.
In the case of robotics, research points to "autonomy," that is, robots' ability to make their own decisions. According to Hans Moravec, "By 2050 the brains of computer-based robots will perform 100 trillion instructions per second and begin to rival human intelligence" [MORAVEC, 2009].
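As a back-of-the-envelope sketch of the doubling logic above (my illustration; the baseline figure below is an assumption, not from the article): with a two-year doubling period, growing from a starting rate to a target rate takes 2 × log2(target/start) years.

```python
import math

def years_to_reach(start, target, doubling_period=2.0):
    """Years needed for a quantity that doubles every `doubling_period`
    years to grow from `start` to `target`."""
    if start <= 0 or target <= start:
        return 0.0
    return doubling_period * math.log2(target / start)

# Hypothetical example: growing from 10^9 instructions per second to
# Moravec's 10^14 (100 trillion) at a two-year doubling period.
print(round(years_to_reach(1e9, 1e14), 1))  # prints 33.2
```

Under these assumed numbers, the result lands a few decades out, roughly the same mid-century horizon as Moravec's 2050 projection.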
To imagine robots that can think and act like us is not science fiction but a fast-developing field, better known as Artificial Intelligence, or AI. In the near future, then, autonomous and intelligent robots will be in operation, fighting alongside soldiers or other robots, killing and being killed (or "deactivated").
The logic of just war divides the principles of reasoning about the morality of war into two categories: jus ad bellum, the criteria for starting a war, and jus in bello, which concerns the requirements for conducting one.
In the conduct of war, there are two fundamental humanitarian principles: the principle of discrimination and the principle of proportionality.
The principle of discrimination demands that combatants do not directly attack non-combatants and that they make reasonable efforts to avoid casualties among them. How, then, can we ensure that robots respect this principle, especially in today's conflicts, many of which occur in urban environments where civilians and undercover soldiers coexist?
The principle of proportionality, in turn, entails determining the maximum force that may be employed, rationally weighed against the goals of a just war.
When we consider a conflict in which, on one side, there are humans, who may be injured and die, and, on the other, machines, which are mass-produced and can be repaired after any damage and promptly returned to the battlefield, wars will always be uneven.
Conclusion
We live in a period of history in which decisions about technological development follow market rules, i.e., economic concerns.
On one side are the clients, represented by the armed forces; on the other, the suppliers, represented by the war industry.
There is, obviously, no ethical or moral concern guiding the technological development of weapons of war.
Despite the efforts of Max Weber, Hans Jonas, and Norbert Wiener, computer ethics remains nothing more than a nice theoretical subject taught in universities.
Perhaps we will have to hear from a robot soldier of the future, with its brain capable of processing 100 trillion instructions per second, that the way to overcome our political and religious differences is dialogue and peaceful negotiation, not war.
References
Activision. http://www.activision.com/atvihub/home.do
Asimo. http://asimo.honda.com/
ASIMOV, I. I, Robot. New York: Spectra, 2004.
Computer and Information Ethics. Stanford Encyclopedia of Philosophy. First published Aug 14, 2001; substantive revision Oct 23, 2008. Available at: http://plato.stanford.edu/entries/ethics-computer/
CURRIER, C. Everything We Know So Far About Drone Strikes. ProPublica - Journalism in the Public Interest, Feb. 5, 2013. Available at: http://www.propublica.org/article/everything-we-know-so-far-about-drone-strikes
DOWNES, D. Interactive Realism: The Poetics of Cyberspace. Canada: McGill-Queen's University Press, 2005.
HALL, J. S. Beyond AI: Creating the Conscience of the Machine. New York: Prometheus Books, 2007.
JONAS, H. Technology and Responsibility: Reflections on the New Tasks of Ethics. Social Research, 40:1, Spring 1973, p. 31. Available at: http://www.conductiveoutfit.com/mta/technologyandmen/readings/Jonas%20-%20Technology%20&%20Responsibility.pdf
KELLY, K. What Technology Wants. New York: Penguin Group, 2010.
KREPS, S. E.; KAAG, J. The Use of Unmanned Aerial Vehicles in Contemporary Conflict: A Legal and Ethical Analysis. March 15, 2012. Available at SSRN: http://ssrn.com/abstract=2023202 or http://dx.doi.org/10.2139/ssrn.2023202
KURZWEIL, R. The Singularity Is Near. New York: Viking Books, 2005.
LANIER, J. You Are Not a Gadget: A Manifesto. New York: Penguin Group, 2010.
LEMOS, A. Cibercultura: Tecnologia e Vida Social na Cultura Contemporânea. Porto Alegre: Editora Sulina, 2010.
LÉVY, P. O que é o virtual? São Paulo: Editora 34, 1996.
MORAVEC, H. Rise of the Robots: The Future of Artificial Intelligence. Scientific American, March 23, 2009. Available at: http://www.scientificamerican.com/article.cfm?id=rise-of-the-robots&page=2
OBENHAUS, S. R. A Estrada para Basra e a Ética da Perseguição. Military Review, 2002.
PARENTE, A. (Org.). Imagem máquina: a era das tecnologias do virtual. Rio de Janeiro: Editora 34, 1993.
RAYNER, A. Are video games just propaganda and training tools for the military? The Guardian, Sunday 18 March 2012. Available at: http://www.guardian.co.uk/technology/2012/mar/18/video-games-propaganda-tools-military
SINGER, P. W. The Predator Comes Home: A Primer on Domestic Drones, their Huge Business Opportunities, and their Deep Political, Moral, and Legal Challenges. Brookings, March 8, 2013. Available at: http://www.brookings.edu/research/papers/2013/03/08-drones-singer
SMITH, G. Pentagon Cyber Force Turns To Hackers To Meet Growing Demand. Huff Post Tech, January 2013. Available at: http://www.huffingtonpost.com/2013/01/28/pentagon-cyber-force_n_2567564.html
Notes
Drones are generically called UAVs (Unmanned Aerial Vehicles) or RPVs (Remotely Piloted Vehicles).
Prince Harry returns from Afghanistan as he reveals he killed Taliban insurgents: http://www.telegraph.co.uk/news/uknews/theroyalfamily/9815438/Prince-Harry-confirms-he-killed-Taliban-as-he-returns-from-Afghanistan-saying-Take-a-life-to-save-a-life.html
Isaac Asimov's "Three Laws of Robotics":
1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey orders given it by human beings except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
Ricardo Murer holds a B.S. in Computer Science (USP) and a Master's degree in Communications (USP). Expert in digital strategy and new technologies. Follow: @rdmurer