Is there anything new or unique about Boyer’s
case from an ethical point of view?
Boyer was stalked in ways that were not
possible before cybertechnology.
But do new ethical issues arise?
Two points of view:
Traditionalists argue that nothing is new –
crime is crime, and murder is murder.
Uniqueness Proponents argue that
cybertechnology has introduced (at least
some) new and unique ethical issues that
could not have existed before computers.
Both sides seem correct on some claims,
and both seem to be wrong on others.
Traditionalists underestimate the issues of scale and scope that arise because of the impact of computer technology.
Cyberstalkers can stalk multiple victims
simultaneously (scale) and globally (because
of the scope or reach of the Internet).
They also can operate without ever having
to leave the comfort of their homes.
Uniqueness proponents tend to overstate the
effect that cybertechnology has on ethics per
se.
Maner (1996) argues that computers are
uniquely fast, uniquely malleable, etc.
There may indeed be some unique aspects of
computer technology.
But uniqueness proponents tend to confuse
unique features of technology with unique
ethical issues.
They use the following logical fallacy:
◦ Cybertechnology has some unique technological
features.
◦ Cybertechnology generates ethical issues.
◦ Therefore, the ethical issues generated by
cybertechnology must be unique.
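The invalid inference can be written schematically (an illustrative formalization, not from the text): the uniqueness predicated of the technology's features is silently transferred to the issues it generates.

```latex
\begin{align*}
&\text{P1: } \exists f\, [\mathrm{Feature}(f, C) \wedge \mathrm{Unique}(f)] \\
&\text{P2: } \exists e\, [\mathrm{Generates}(C, e) \wedge \mathrm{EthicalIssue}(e)] \\
&\therefore\ \forall e\, [\mathrm{Generates}(C, e) \rightarrow \mathrm{Unique}(e)]
\end{align*}
```

Nothing in P1 or P2 licenses the conclusion: a property of C's features does not carry over to the issues C generates.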
Traditionalists and uniqueness proponents
are each partly correct.
Traditionalists correctly point out that no new
ethical issues have been introduced by
computers.
Uniqueness proponents are correct in that
cybertechnology has complicated our analysis
of traditional ethical issues.
So we must distinguish between: (a) unique
technological features, and (b) any (alleged)
unique ethical issues.
Two scenarios from the text:
◦ (a) Computer professionals designing and coding a
controversial computer system
◦ (b) Software piracy
James Moor (1985) argues that computer
technology generates “new possibilities for
human action” because computers are
logically malleable.
Logical malleability, in turn, introduces policy
vacuums.
Policy vacuums often arise because of
conceptual muddles.
In the early 1980s, there were no clear laws
regarding the duplication of software
programs, which was made easy because of
personal computers.
A policy vacuum arose.
Before the policy vacuum could be filled, we
had to clear up a conceptual muddle: What
exactly is software?
Attempting to control technology through law
and regulation has often been futile.
Correcting technology with other technology
has been more effective.
Ex. Laws suppressing pornography have been difficult to enforce, but software that filters out pornography has been more successful.
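The point that technology can correct technology can be made concrete with a toy content filter. This is a minimal sketch with an invented blocklist, not a real filtering product:

```python
# Hypothetical blocklist of disallowed terms (invented for illustration).
BLOCKLIST = {"spamword", "adultword"}

def allow(message):
    """Return True if no blocklisted term appears in the message.

    The 'enforcement' here happens in software, before any human or
    legal process is involved -- which is why such filters are easier
    to apply than ex post legal sanctions.
    """
    words = set(message.lower().split())
    return not (words & BLOCKLIST)

print(allow("hello world"))       # True
print(allow("buy spamword now"))  # False
```

Real filters use far more sophisticated classification, but the structural point is the same: the constraint is built into the code path itself.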
Applied ethics, unlike theoretical ethics,
examines "practical" ethical issues.
It analyzes moral issues from the vantage point of one or more ethical theories.
Ethicists working in fields of applied ethics
are more interested in applying ethical
theories to the analysis of specific moral
problems than in debating the ethical
theories themselves.
Three distinct perspectives of applied ethics
(as applied to cyberethics):
Professional Ethics
Philosophical Ethics
Descriptive Ethics
According to the professional ethics view, cyberethics is the field that identifies and analyzes issues of ethical responsibility for computer professionals.
Consider a computer professional's role in
designing, developing, and maintaining
computer hardware and software systems.
◦ Suppose a programmer discovers that a software
product she has been working on is about to be
released for sale to the public, even though it is
unreliable because it contains "buggy" software.
◦ Should she "blow the whistle?"
Don Gotterbarn (1991) argued that all
genuine computer ethics issues are
professional ethics issues.
Computer ethics, for Gotterbarn, is like medical ethics and legal ethics, which are tied to issues involving specific professions.
He notes that computer ethics issues aren’t
about technology – e.g., we don’t have
automobile ethics, airplane ethics, etc.
Gotterbarn’s model for computer ethics
seems too narrow for cyberethics.
Cyberethics issues affect not only computer professionals; they affect everyone.
Before the widespread use of the Internet,
Gotterbarn’s professional-ethics model may
have been adequate.
From this perspective, cyberethics is a field
of philosophical analysis and inquiry that
goes beyond professional ethics
(Gotterbarn).
[Figure: non-moral vs. moral claims]
Four constraints that regulate our behavior
in real space: laws, norms, the market and
code / architecture
Laws – rules imposed by the government
which are enforced by ex post (after the
fact) sanctions
◦ The complicated IRS tax code is a set of laws that dictates how much we owe. If we break these laws, we are subject to fines and penalties.
• Social Norms – expressions of the community. Most communities have a well-defined sense of normalcy in their norms, standards, and behavior.
– Cigar smokers are not welcome at most functions.
• The Market – prices set for goods, services, or labor.
– $3.95 for coffee at a local coffee shop
• Architecture – physical constraints on our behavior.
– A room without windows imposes certain constraints
because no one can see outside.
Cyberspace is subject to the same four constraints
◦ Laws – provide copyright and patent protection
◦ Markets – advertisers gravitate towards more popular
web sites
◦ Architectural – software code such as programs and
protocols (constrain and control our activities). Ex.
Web sites demanding username/passwords and
software deployed to filter spam and certain email.
◦ Norms – Internet etiquette and social customs. Flaming, for example, violates these norms.
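The "architecture" constraint (software code regulating behavior) can be sketched with a toy authentication gate. The site, user names, and credential store are all hypothetical, invented for illustration:

```python
# Hypothetical credential store (illustration only -- real systems
# store salted password hashes, never plaintext).
REGISTERED_USERS = {"alice": "s3cret"}

def view_page(username, password):
    """Grant access only to authenticated users.

    The constraint operates by design: the code path simply does not
    reach the page content without valid credentials, so no ex post
    sanction (law) or social pressure (norm) is needed.
    """
    if REGISTERED_USERS.get(username) != password:
        return "403 Forbidden"      # behavior constrained by code
    return "Welcome, " + username   # page content

print(view_page("alice", "s3cret"))   # Welcome, alice
print(view_page("mallory", "guess"))  # 403 Forbidden
```

Spam filters and content blockers regulate in the same architectural way: the software decides what is possible before any rule is invoked.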
Moor’s list of core human goods (considered thin) includes:
◦ Life
◦ Happiness – pleasure and absence of pain
◦ Autonomy – goods that we need to complete our
projects (ability, security, knowledge, freedom,
opportunity, reason)
• Finnis’ version of human good (considered
thick) includes:
– Life
– Knowledge
– Play (and skillful work)
– Aesthetic experience
– Sociability
– Religion
– Practical reasonableness (includes autonomy)
• Participation in these goods allows us to achieve genuine human flourishing
The ultimate good, the human flourishing of ourselves and others, should be our guidepost of value, serving as a basis for crafting laws, developing social institutions, and regulating the Internet.
Golden Rule (Matthew 7:12)
◦ “So whatever you wish that others would do to you, do
also to them”
Immanuel Kant stated, “Act so that you treat humanity always as an end and never merely as a means”
• Those who write programs or create laws
should rely on ethics as their guide.
• Code writers need to write in such a way that
preserves basic moral values such as
autonomy and privacy.
• Many feel technology is just a tool, and it is up to us whether this powerful tool is used for good or ill.
Two extremes:
◦ What happens is entirely up to us
◦ Technology locks us into an inescapable cage
Technological Realism – acknowledges that technology has reconfigured our political and social reality and does influence human behavior in particular ways.
Teleological – rightness or wrongness of an
action depends on whether the goal or
desired end is achieved (look at the
consequences – maybe OK to lie). Sometimes
called consequentialism
Deontological – focuses on whether the action itself is right or wrong. One acts out of obligation or duty, e.g., the prohibition against harming the innocent.
The good of the many—at core a teleological
framework. An action is judged by how it
affects the many (see Utilitarianism). The
point of reference is in the masses, not the
individual.
The good of the individual—at core a deontological framework. An action is judged by an internalized code of behavior, a moral system.
Teleological
Most popular version of consequentialism
The right course of action is to promote the greatest general good
An action is good if it produces the greatest net benefit or the lowest net cost
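The "greatest net benefit" criterion can be illustrated with a toy utility calculus. The actions and numbers below are invented for illustration; real utilitarian judgments are not reducible to such simple arithmetic:

```python
# Hypothetical options facing the programmer from the earlier
# whistle-blowing scenario, with invented benefit/cost scores.
actions = {
    "release now":   {"benefit": 50, "cost": 40},  # ship buggy software
    "delay release": {"benefit": 30, "cost": 10},  # fix the bugs first
}

def net_benefit(effects):
    """Utilitarian score: total benefit minus total cost."""
    return effects["benefit"] - effects["cost"]

# Pick the action with the greatest net benefit.
best = max(actions, key=lambda a: net_benefit(actions[a]))
print(best)  # delay release  (net 20 vs. net 10)
```

The choice of who counts in "benefit" and "cost", and how the numbers are assigned, is exactly where the ethical work lies; the arithmetic itself is trivial.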
Deontological
Rights-based
Looks at moral issues from viewpoint of the
human rights that may be at stake
◦ Negative right – implies one is free from external
interference in one’s affairs (state can’t tap phones)
◦ Positive right – implies a requirement that the holder
of this right be provided with whatever one needs to
pursue legitimate interests (rights to medical care and
education)
• Deontological
• Duty-based
• Actions only have moral worth when they are
done for the sake of duty
– Ex. If everyone broke promises, there would be no such thing as a promise.
– Consider this when looking at intellectual property
– Ask the question “What if everybody did what you are
doing?”
– Respect for other human beings
1. Keep promises and tell truth (fidelity)
2. Right the wrongs you inflicted (reparation)
3. Distribute goods justly (justice)
4. Improve the lot of others with respect to virtue,
intelligence and happiness (beneficence)
5. Improve oneself with respect to virtue,
intelligence and happiness (self-improvement)
6. Exhibit gratitude when appropriate (gratitude)
7. Avoid injury to others (noninjury)
Good should be done and evil avoided
This principle is too general.
None of the frameworks is without flaws or contradictions
The four frameworks often converge on the same solutions, but they sometimes suggest different ones
One must decide which framework to follow and let it “trump” the others
Popularized by Beauchamp and Childress
Each principle holds “at first glance” (prima facie); no single principle is always given more weight than the others
The four principles are: autonomy, nonmaleficence, beneficence, and justice
Autonomy is a necessary condition of moral responsibility
Individuals shape their destiny according to their notion of the best sort of life worth living
Nonmaleficence: above all else, do no harm
Beneficence is a positive duty
We should act in such a way that we advance the welfare of other people when we are able to do so
Justice: similar cases should be treated in similar ways
Fair treatment
Technology seems neutral, at least initially.
Consider the cliché: “Guns don’t kill people,
people kill people.”
Corlann Gee Bush (1997) argues that gun technology, like all technologies, is biased in certain directions.
She points out that certain features inherent
in gun technology itself cause guns to be
biased in a direction towards violence.
Bush uses an analogy from physics to
illustrate the bias inherent in technology.
An atom that either loses or gains electrons
through the ionization process becomes
charged or valenced in a certain direction.
Bush notes that all technologies, including
guns, are similarly valenced in that they
tend to "favor" certain directions rather than
others.
Thus technology is biased and is not
neutral.
Brey (2001) believes that because of
embedded biases in cybertechnology, the
standard applied-ethics methodology is not
adequate for identifying cyberethics issues.
We might fail to notice certain features
embedded in the design of
cybertechnology.
Using the standard model, we might also
fail to recognize that certain practices
involving cybertechnology can have moral
implications.
Brey notes that one weakness of the “standard method of applied ethics” is that it tends to focus on known moral controversies.
So that model fails to identify practices involving cybertechnology that have moral implications but are not yet known.
Brey refers to these practices as having
morally opaque (or morally non-
transparent) features, which he contrasts
with "morally transparent” features.
[Figure: transparent features vs. morally opaque features]