TRANSCRIPT
ENGI 5100 Part 3 final notes 270308 28/03/08
Note: This material is the final posting of notes. It covers several important cases including
Gilbane Gold and the Challenger Disaster which will be presented in class by video and slides.
You should read these notes with reference to the S & M text.
LW March 27, 2008
GILBANE GOLD shown here most years. Video, study guide, and overheads. See binder with
Challenger documentation, etc.
Optional topic
GILBANE GOLD
Introduction
-Distribute synopsis and cast of characters
-overheads with main facts and cast of characters
-show video
DISCUSSION
Questions: Have any laws been broken? Is this question even relevant?
Why has the city not been more active? What is their responsibility?
THE VARIOUS PARTIES:
David Jackson:
-Was there anything David Jackson could have done, as his predicament developed, that
might have averted it?
-Do you think he should have "gone public" or "blown the whistle"?
-Do you think he should have told his side of the story to Channel 13 "off the record"? How
would this go over with Z-Corp?
-What advice would you give David?
-Could he actually lose his license?
Phil Port, David Jackson’s boss:
Where was Phil’s primary allegiance?
(a) To Z-Corp shareholders?
(b) To Z-Corp management?
(c) To addressing David Jackson’s concerns?
(d) To himself?
Professor Massin:
-What do you think of Professor Massin’s comments and advice?
-Is there always a compromise between the development of new products and the resultant
impact on the environment?
-Should David have refrained from going public?
-Should David have gone to his engineering society for advice?
-Was there no substantive evidence that Z-Corp was violating current discharge laws?
Actions of Z-Corp:
-What are the primary responsibilities of Z-Corp?
-Does the company have a greater responsibility to meet the philosophy or objective behind the
law, which is currently flawed because it does not limit the mass of pollutants discharged?
-Does the company have a responsibility to upgrade to the most advanced analytical
measuring technology for heavy metals?
-Is it up to Z-Corp management to protect the environment? If so, to what degree?
SAFETY AND THE EXPERIMENTAL NATURE OF ENGINEERING [OH3044]
You will have noted the dominance of safety issues in many aspects of engineering ethical
problems. Safety, particularly public safety, is paramount in much that engineers do.
Chapters 3 and 4 of S&M have this as an underlying theme. As soon as we begin to consider
safety in any quantitative way, we have to deal with the more formal notion of risk. Engineers
deal with risk on a daily basis, and even when every effort is made to reduce it to a minimum in
design and operation, we know that it is never zero. The problem is, we are never able to have
the luxury of complete knowledge. Judgements have to be made, and sometimes mistakes
occur. We also know that sometimes an innovation might lead in a totally unexpected direction,
or have unanticipated consequences.
It is for these reasons that the authors compare engineering to an experiment, calling it "social
experimentation." This ties in nicely with our earlier study on the interaction of technology and
society, except that Postman’s interest was mainly in the undesirable outcomes.
ENGINEERING AS SOCIAL EXPERIMENTATION (Chap 3)
Titanic. Sometimes engineering is guilty of overconfidence, perhaps forgetting that things that
are untried - which much of engineering must deal with - do constitute an experiment. S&M
illustrate this overconfidence and lack of preparation for the unexpected with a very brief
account, at the beginning of Chapter 3, of the Titanic disaster of 1912. Over 1500 people
perished.
The builders were so confident of the Titanic's safety that she only carried lifeboats for about
one-quarter of the passengers and crew. This was all that was required by the regulations,
which did not anticipate ships of her size. What was the responsibility of the engineers here?
There were many other mistakes made which might have increased the likelihood of the tragedy
which took place, but there is no escaping the fact that there were not enough lifeboats.
This situation is what we sometimes call “minimal compliance”, and it constitutes one of the
greatest moral problems in engineering, according to S&M (p88). Obviously the engineer must
obey the law, but this does not replace moral responsibility if engineering judgement indicates
that the law is insufficient.
The Titanic had many innovative features - the most significant in this case being its increased
safety due to multiple watertight compartments. The authors use this case of overconfidence
and minimal compliance to dramatize the proposal that had the engineers looked upon their
innovation as a "social experiment", they would not have been so likely to have made some of
the tragic errors.
Engineering as experimentation. [OH3045]
Overall, any engineering project may itself be viewed as a kind of experiment. The design
process always involves proposing alternatives, trying out ideas, and iteration.
- any project is carried out in partial ignorance
- final outcomes are generally uncertain
- ongoing success depends upon gaining new knowledge
Learning from the past. (not as good as it should be)
Innovation as a social experiment. [OH3046] The authors recognize that the engineer
working in a firm is not the sole experimenter. But their expertise places them in a unique
position to know what is going on and what is likely to happen.
Engineers as “Responsible Experimenters” Pursuing the model of social experimentation,
S&M suggest that this implies four elements:
1. A conscientious commitment to live by moral values.
This is very high sounding, and very broad in its application, but what exactly is meant? It must
go beyond a preoccupation with narrow self-interest. It implies a consciousness, or awareness,
in considering and acting upon the full range of moral values which bear on a given situation.
This does not fall naturally into place in the working environment of many engineers.
Ninety percent are employees, and while corporations may be run by totally moral executives
(or sometimes not), there is great pressure for an engineer to be a team player.
In fact, as we have seen, it is in keeping with our ethics codes to be a "team player", quite apart
from the normal pressures which come from trying to ensure personal job security,
advancement in the company, and professional development. Looking at it in terms of "duty
ethics", the minimum "negative duties": for example, the duty not to falsify data, not to breach
confidentiality, etc., can become the full extent of moral commitment. In other words, we then
have minimal compliance.
On the other hand, a conscientious commitment to moral values does not mean that you must
force your own views of social good on society. Moral values, by definition, have to be those
which are widely shared.
I should not leave this discussion without pointing out that the idea of an engineer as a social
guardian is not without controversy. I have distributed an article written by Samuel Florman in
1978 (Moral Blueprints) which argues that this social responsibility is over-stressed [29]. He
puts more faith in a distributed moral responsibility, in companies and other institutions, and
reasonably suggests that it is expecting too much of the employee at lower levels to look out for
the ultimate use (or misuse) of everything he might work on.
All we can do in a course of this kind is to give you an introduction to the issues and the
intellectual tools that can be used to deal with them, together with a few case studies. In the end
you have the right to develop (and no doubt, will develop) your own set of rules and your own
personal slant on moral philosophy.
The next element implied in being what S&M call a "responsible experimenter" is:
2. A comprehensive perspective, or continual awareness of the possibility of unforeseen
events, etc.
This is a principle which argues against the "compartmentalization" of knowledge and
responsibility. It is fundamentally integrity, or “wholeness”. One should look at the broad picture,
and take all the facts into account. Is there an undesirable side effect of the change you are
proposing in the design or system? Perhaps no one but you has thought of it. As S&M say, you
should not leave it to the sales department to let the customers know - if they even ask, which
they probably won't.
3. Moral autonomy.
This means taking personal responsibility. One cannot say “I’m only doing my job, my employer
is responsible”. That, in my view, is to abandon the idea of being a professional. It is certainly
true that the attitude of the management in a company will be a large factor in determining how
much moral autonomy an employee engineer can exercise. Keep this in mind when you
become managers. Other practical aspects of operating a business, such as competition and
deadlines, have to be considered; one has to be realistic while, at the same time, not
compromising one's principles.
4. There must be accountability. This means that in the end, if it is your mistake, the buck
stops at your desk. (Remember Isambard Kingdom Brunel and his atmospheric railway?). But it
is not only in this narrow sense of accepting blame when it is properly placed at your door.
There is also the acceptance of review, of just criticism from peers, and the willingness to
morally defend your actions when appropriate.
Working within a firm, under an employer’s authority, can be interpreted as a reason for a
narrowed accountability.
It is of some interest to note that here we find another mention of the work of Stanley Milgram.
You will recall that Milgram concluded from his experiments that people tend to do things that they
otherwise would not if instructed by an authority figure. (Postman thought this was an obvious
common-sense result being cloaked in scientism). S&M observe that this tendency to abandon
accountability if one is under someone else’s authority is common in the professions.
p96 Challenger [logical place is at end of Chap 3: i.e. here.]
ROGER BOISJOLY ON THE CHALLENGER DISASTER
I am using this case study originally prepared by MIT, but no longer maintained on the web. The
link at the end of the course notes goes to Mechanical Engineering at Texas A&M University,
and contains a good description of the case and the issues.
The case is being used to illustrate the difficulties of an employee engineer in a large
corporation with a government contract and under a lot of pressure. S&M use this in their
Chapter 3, (Engineering as Social Experimentation). It is also a good example of the difficulties
in “whistle blowing” which we will take up after this case study.
Description of what happened 28 January, 1986. (Booster rocket diagrams etc from p98 of
S&M)
-Challenger was one of five space shuttles, all named for famous ships (the others: Atlantis,
Columbia, Discovery, Endeavour). The first shuttle launch (Columbia, STS-1) was in 1981. This
was to be Challenger's first launch of 1986 (STS-51L). There had already been about 25 shuttle
missions.
-Diagram showing main tank, booster rockets, etc.
-The two solid (i.e. packed with solid propellant) rocket boosters (SRB) provide 80% of the
thrust. They parachute to earth and are re-used. 45 m tall, 4 m diameter.
- the SRB's are attached to the external fuel tank for the main engines. This tank contains
several million pounds of liquid hydrogen, and it too is discarded when it is empty.
-Because the whole booster rocket was too large to transport overland from the place it was
built, in Utah, to the launch site in Florida, the seven individual segments of the booster rockets
were packed with solid fuel (a million pounds of a mixture of aluminium, ammonium
perchlorate and iron oxide in each rocket) and pairs were joined together at the factory. The
two-unit sections
were then butt-jointed at the launch site. The problem occurred in those four joints, which
depend on rubber seals in the form of O-rings to prevent the escape of 5800 °F gas after ignition
of the solid fuel. These O-rings are about one-quarter inch thick, and 147 inches (3.7 m) in
diameter. There are two in each joint, one behind the other.
-Boisjoly begins his story about a year earlier, when evidence of gas blow-by appeared in two
field joints from an earlier flight. He and colleagues had been working for the last year to fix the
seal problem. One of the observations was that the rubber O-ring was sensitive to temperature.
On the eve of the flight, the forecast was for record cold at the Florida launch site, and this
worried him.
-see joint diagram, failure event description
We let the video tell the story from there, but first let me try to help with all the names and
acronyms which you will hear. Video: Boisjoly: Fulfilling an Engineer’s Responsibility for
Safety (March 1992, 30 min approx). Names and acronyms
NASA: National Aeronautics and Space Administration.
MSFC: Marshall Space Flight Center, Huntsville Alabama (Propulsion Systems)
KSC: Kennedy Space Center (Launch Operations)
Johnson Space Center, Houston, Texas (Flight Control)
MORTON-THIOKOL: Contractor responsible to NASA for the booster rockets
Roger Boisjoly: Engineer with Morton-Thiokol
Arnie Thompson: Colleague of Boisjoly at Morton-Thiokol
Allan J. McDonald: Representing Morton-Thiokol at Cape Kennedy. Director of SRB project.
Bob Lund: vice president of engineering (MT)
Joe Kilminster: vice president for booster rockets (MT)
Jerry Mason: Senior vice president (MT)
SRB: Solid Rocket Booster
STS: Space Transportation System
Video, questions and plan. We will take up each series of questions individually to illustrate
how the real world worked in this tragic case.
Put yourself in Boisjoly's position in April, 1985. He knows the company has a problem, and
they are not giving sufficient attention to the engineering to solve it. The client, NASA, wants it
played down, and Boisjoly's management at Morton Thiokol is supportive of the client.
(The answers in what follows were probably prepared by MIT faculty.)
Overheads:
Questions and answers
COMMITMENT TO SAFETY (Chap 4 of S&M) [OH3047]
Concepts of Safety and Risk We have seen that safety, particularly public safety, is a key
responsibility of engineers. In terms of "rights ethics", it derives from the right of the public not
to be endangered without a clear prior warning. This last part about the "clear warning"
relates to the subtle point that if risk is acceptable, the product is deemed safe. Safety does
not imply zero risk; it implies acceptable risk.
In defining a safe product as one where there is acceptable risk, we are not really getting
anywhere, for now we have to define what we mean by “risk”. Furthermore, how low must the
"risk" be for the item or process or whatever it is to be considered safe? Not only that, but we
can see at once that it depends on the context in which the risk occurs. Using a sharp knife to
slice bread is more risky if it is being done by a child than an adult. So we cannot attach some
unconditional “risk number” to the knife, although the design can influence the degree of risk
that any user is exposed to.
The notion of risk is fundamental to an engineer, and not simple. [OH3048] Let me quote
from the introduction to a marvellous book by Peter L. Bernstein:
“What is it that distinguishes the thousands of years of history from what we think of as modern
times? The answer goes way beyond the progress of science, technology, capitalism, and
democracy... The revolutionary idea that defines the boundary between modern times and the
past is the mastery of risk: the notion that the future is more than a whim of the gods and that
men and women are not passive before nature” [30].
Later on in the same chapter, Bernstein tells us something of the etymology of the word “risk”. It
derives from early Italian risicare, which means “to dare”. In this sense, risk is a choice, rather than a fate [31]. This is interesting, since well back in this course, I pointed out that
engineers make choices. Of course all people make choices, but it is an ever-present part of the
work of an engineer.
The recognition that risk is inherent in engineering is, no doubt, behind the ideas of
Schinzinger and Martin in the previous chapter, where they suggest that engineers should
consider themselves to be “social experimenters”, and take the necessary precautions for the
unpredictable.
Engineering decisions may affect many people. For example, a technological direction might be
set for society - perhaps unconsciously - or choices of design methodology and materials may
establish how safe a structure will be, or how environmentally friendly a manufacturing plant will
be. Generally speaking, the public does not know what choices, or implied risks, are involved.
The engineer takes professional responsibility, so it is very important for him or her to
understand what risk means. And not only must we know, but we have a responsibility to
explain it to the public. Otherwise, we cannot properly discharge our duty of safety to society.
Our text is a little less sophisticated and perhaps a bit more pragmatic than Bernstein’s book.
S&M defines risk as “a potential that something unwanted and harmful may occur.” (p110).
[OH3049]
This may be pragmatic, but their definition of safety is somewhat more wordy: “A thing is safe if, were its risks fully known, those risks would be judged acceptable by a reasonable person in the light of settled value principles.”
The words “settled values” refer to the need to have an agreed level of acceptable risk
according to the moral values of the person or group. There needs to be an “external” point of
reference, i.e. external to the individual. Safety is therefore a matter, in the authors’ view, of how
people would find risks acceptable or unacceptable if they knew the risks and were basing their
judgements on their “most settled value” perspectives. These ideas are only partly objective; the
subjective element is clearly there. Safety is therefore also a hypothetical matter, since it
depends on a conditional knowledge of risks, which generally will not be complete.
Not only that, but when we have to judge risk, it depends on how the question is put. Human
perception is not as rational as we might expect. (Recall Postman (Technopoly) on this point.)
A fascinating example is given in the text (p112). [OH3050, 3051, 3052]
The imaginary scenario is that a country is expecting a serious outbreak of disease, which,
untreated, is expected to kill 600 people. A group of 150 people is given the choice of two
strategies, and it is to be assumed that the results will be as follows:
Choices as put to Group 1:
Program “A” will result in 200 people surviving.
Under Program “B”, there is a 1/3 probability that 600 people will be saved, and a 2/3 probability
that there will be no survivors.
Results for Group 1:
72 % chose “A”. The reason was apparently that the certainty of saving 200 under “A” was
perceived to be more acceptable to most people than the gamble of saving everyone under “B”.
A second group was given the information in the following way:
Under Program “C”, 400 people will die.
Under Program “D” there is a 1/3 probability that no one will die, and 2/3 probability that all 600
will die.
Results for Group 2:
Now 22 % chose "C", and 78 % chose "D". Note that "A" and "C" are identical, as are "B" and "D".
Conclusions
One way of looking at the result is that options perceived as yielding firm gains tend to be
preferred over options in which even better gains are only possible. (This is the first case.)
The second conclusion (from the second case) is that options which emphasize firm losses
tend to be avoided in favour of possible lower losses.
In other words, people are more willing to take risks to avoid perceived firm losses than they
are to win merely possible gains. They avoided the 1/3 probability of no one dying in the first
instance, but chose it in the second, even though the other option was exactly the same: 400
certain deaths.
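That the two framings really are equivalent is simple expected-value arithmetic, which can be checked directly. A minimal sketch (the probabilities and totals come from the scenario above; the code itself is only illustrative):

```python
from fractions import Fraction

# Framing experiment from S&M (p112): 600 people at risk of the disease.
# Each program is a list of (probability, deaths) outcomes.
# Fractions keep the arithmetic exact.
programs = {
    "A": [(Fraction(1), 400)],                          # 200 of 600 survive for certain
    "B": [(Fraction(1, 3), 0), (Fraction(2, 3), 600)],  # 1/3 all saved, 2/3 none
    "C": [(Fraction(1), 400)],                          # "400 people will die"
    "D": [(Fraction(1, 3), 0), (Fraction(2, 3), 600)],  # 1/3 no deaths, 2/3 all die
}

def expected_deaths(outcomes):
    """Probability-weighted number of deaths for a program."""
    return sum(p * d for p, d in outcomes)

for name, outcomes in programs.items():
    print(name, expected_deaths(outcomes))  # each prints 400
# All four programs have the same expected outcome (400 deaths), so the
# 72% vs 22% preference split reflects the framing, not the arithmetic.
```

The point of the sketch is that no program dominates on expected outcome; the preference reversal is entirely a matter of presentation.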
I should make clear that this is an experimental result, as reported in a prestigious scientific
journal, not the authors' speculation. You can find the reference in S&M. So we can take it that
this is truly what we can expect of the population generally.
People treat the perception of benefit differently from the perception of risk. We will expend
more effort to avoid perceived risk (per unit loss) than to gain perceived benefit (per unit
benefit). This idea is captured in the expression we sometimes hear (but probably never
question or justify) that most people are “risk averse”. This obviously is not the same for
everyone, and there are such things as risk-takers, hard-core gamblers, and people who play
video gambling machines with their grocery money.
Thus to establish what the people regard as “safe” is quite a complex undertaking. It depends
on these rather irrational human characteristics. For example, people underrate the risk of
familiar things - e.g. driving - and overrate that of others less familiar, e.g. flying. Perception of
the risk associated with an activity can be greatly altered if something happens to someone we
know, e.g. if a friend goes through the ice on a snowmobile you are likely to be much more
conscious of the risk of crossing ice on your own machine.
ASSESSING AND REDUCING RISK
Safety is no accident: good engineering design [OH3053 and Fig 4-1 and Fig 4-5]
We have already noted that absolute safety, in the sense of zero risk to everyone under all
possible conditions, is not attainable, let alone affordable. It will likely (but not always) cost more
to manufacture safer products.
I say not always, because safety is often determined by design, and sometimes
innovative design can improve safety and not increase manufacturing cost. In fact, probably the
most under-used tool is good engineering design sense. Consider the two designs in the
example on p127 (Fig 4-5) of S&M. Simple rewiring makes a safer design.
Although increased attention to safety is often accompanied by increased manufacturing cost, if
products are not safe there can be economic consequences as well as ethical considerations.
Unsafe products lead to high warranty costs, loss of customer goodwill, litigation, and so on.
There is frequently a trade-off to be made between primary manufacturing cost and secondary
costs which might be a function of product safety. This is illustrated by Fig. 4-1, p117.
The curve of primary design cost might decrease as risk is allowed to rise and product safety
worsens, as shown in "P". On the other hand, secondary costs such as litigation and warranty
costs, as discussed above, might well rise, as shown in "S". The curve of total design cost, "T",
has a minimum at "M". If one were trying to reduce primary costs alone, one might design to
the right of "M". But a better plan would be for the manufacturer to be to the left of "M", at a
point such as "H": for the same total cost as the corresponding point to the right of "M", the
product is safer.
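The shape of this trade-off can be sketched numerically. The particular curves below are my own illustrative assumptions (a primary cost that falls as accepted risk rises, a secondary cost that rises linearly), not data from the text:

```python
# Illustrative sketch of Fig. 4-1 (S&M p117). Curve shapes are assumed:
# primary cost P falls as the accepted risk rises, secondary cost S
# (warranty, litigation) rises with risk, and total T = P + S has an
# interior minimum "M".
def primary_cost(risk):
    return 100.0 / (risk + 0.5)   # assumed: cheaper to build as risk rises

def secondary_cost(risk):
    return 40.0 * risk            # assumed: warranty/litigation grow with risk

def total_cost(risk):
    return primary_cost(risk) + secondary_cost(risk)

# Scan a range of risk levels and locate the minimum of the total curve.
risks = [i / 100 for i in range(1, 501)]
m = min(risks, key=total_cost)
print(f"minimum total cost T(M) = {total_cost(m):.1f} at risk M = {m:.2f}")

# A point "H" to the left of M has the same total cost as some point to the
# right of M, but the product is safer - hence H is the better choice.
```

With these assumed curves the minimum sits at an interior risk level, which is the text's point: neither minimum-risk nor minimum-primary-cost design minimizes the total.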
Uncertainties in Design [OH3054 plus Tacoma Bridge] It is easy to accept in principle that
safety must always have a high priority in design. But many uncertainties about eventual use
of a product, operating conditions, materials, or even design data reliability can undermine the
objective. One of the traditional ways to cope with these unknowns is to make reasonable
assumptions, and then build in a "factor of safety" - or "ignorance factor" (for the cynical). But
a factor of safety is effectively overdesign, and that can be costly. Competition and
overconfidence can force the factor of safety downward, until eventually there is a failure.
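The factor of safety itself is just a ratio of capacity to expected load. A minimal sketch, with invented numbers:

```python
# The factor of safety is simply capacity divided by the expected load.
# All numbers here are invented for illustration.
def factor_of_safety(capacity, expected_load):
    return capacity / expected_load

beam_capacity = 300.0   # kN, what the design can withstand
design_load = 120.0     # kN, the assumed service load
print("factor of safety:", factor_of_safety(beam_capacity, design_load))  # 2.5

# The hazard described above: if the real environment produces a load the
# assumptions missed, the margin quietly shrinks toward failure.
surprise_load = 280.0   # kN, e.g. an unanticipated dynamic load case
print("margin under surprise load:",
      round(factor_of_safety(beam_capacity, surprise_load), 2))
```

The second print shows how an invalid loading assumption consumes nearly the whole margin without any change to the design itself.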
It can also be quite ineffective if the assumptions regarding the nature of use of the product
turn out to be invalid, for example if something is designed for an expected load environment,
but is in fact exposed to something which affects the structure (say) in an unexpected way. A
good example is the effect of wind inducing unexpected torsional loading on the deck of the
Tacoma Narrows bridge (1940), which I expect you have all seen demonstrated [32].
Risk Benefit Analysis [OH3054 again] This tool is often used for large public projects. For
example, although the mathematical analysis may not be undertaken, in a project like the
development of an offshore oilfield the impact assessment required by government is of this
sort, at least intuitively. The public is consulted about what they see as risk, and what benefit.
The process can be formalized, using probabilities estimated for future anticipated events and
economic returns. Some of these are easier to calculate than others. How much value is
placed on the 0.01 % chance of a large oil spill, in order to compute the financial risk? And what
do we use for an interest rate to calculate present values of risks or benefits far into the future?
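The two calculations hinted at here - pricing a low-probability event, and discounting it to the present - can be sketched as expected-loss arithmetic. The spill probability is the one quoted above; the cleanup cost and discount rate are invented for illustration:

```python
# Expected loss of a rare event, and its present value. The cleanup cost
# and discount rate are invented for illustration.
def present_value(amount, rate, years):
    """Discount a future amount back to today at a yearly interest rate."""
    return amount / (1.0 + rate) ** years

p_spill = 0.0001          # the 0.01 % chance of a large spill in a given year
spill_cost = 2e9          # assumed $2 billion consequence if it happens

expected_loss = p_spill * spill_cost
print(f"expected annual loss: ${expected_loss:,.0f}")   # $200,000

# The same loss 20 years out looks far smaller at a 5% discount rate -
# which is why the choice of rate is itself part of the ethical question.
pv = present_value(expected_loss, 0.05, 20)
print(f"present value of the year-20 loss: ${pv:,.0f}")
```

Changing the assumed discount rate changes the weight given to future harms, which is exactly the difficulty the paragraph raises.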
The authors say there is a need in today's society for some open process for judging the
acceptability of potentially risky projects. (Perhaps we already have that in the case where the
environment is at risk.)
The ethical question is: "Under what conditions is someone in society entitled to impose a risk
on someone else on behalf of a supposed benefit to yet others?" (p122)
Public Risk and Public Acceptance: [OH3055] How much value on a life? The problem of
answering this question is obviously full of difficulty. Across a population, one can use
statistics such as insurance payouts, the premium paid for hazardous work, or whatever
quantifiable data there are.
National Highway Traffic Safety Administration (NHTSA): maybe $200,725?
Research on what would be acceptable: ~$8 million
Court awards from a plane crash:
• 36 year old executive: $2,287,257
• telephone company employee: $750,000
• stewardess on duty: $275,000
Auto crash: ('79 Malibu gas tank explosion, sued GM)
$107 million compensation, $4.8 billion punitive damages.
Obviously, court awards vary widely.
Case Studies [OH3056]
Three Mile Island, USA (1979). A nuclear power plant suffered a technical failure compounded
by incompetence; disaster was narrowly avoided. There were no deaths or injuries, but over a
billion dollars in decommissioning costs, and the effect on the perceived risk of nuclear plants
led to a sharp downturn in the nuclear industry.
Chernobyl, Ukraine (1986). Poorly planned and executed technical testing led to a violent
explosion. 31 died at once, and by 1992 thousands of further deaths had been attributed to the
accident.
The Citicorp Tower (1978). This was an innovative 59-storey building in New York, supported
on pillars over a church. A question from an engineering student started the structural engineer
thinking, and upon checking he found that not only had he underestimated the wind loading,
but a factor of safety which he might have counted on had been eliminated by a design change
during construction. He took the ethical route, alerted those involved, and made provision to
protect people in the event of a big windstorm while the repairs were made. The repair cost
was over $12 million, and the engineer's insurance picked up $2 million. There was no
litigation.
Designing for failure (OH3057)
It is unrealistic to expect that engineering projects will never fail. The best that can be done is
to ensure that when they do, they fail safely, without injury to people. To this we can add the
requirement for safe abandonment, i.e. the failed product or process can be discarded or
stopped without hazard to others, property, or the environment; or safe escape from the
danger, e.g. a sinking ship. Together, the authors denote this as the necessity to build in safe exit,
(unlike that in the Ocean Ranger) and they see it as an integral part of sound engineering.
This amounts to proper management of risk, which is a key part of the responsible engineer’s
job.
The messages of Chapters 3 and 4. [OH3058]
Engineering is inherently a risky activity.
The public has a right to know about the risks, and the right not to be endangered without
warning.
Engineers should act as responsible agents:
• safety is good design
• forecasting possible bad side effects
• defensive engineering
• safe exit
Codes of ethics may help, but there is a need to continually improve.
WORKPLACE RESPONSIBILITIES AND RIGHTS (Chap 5)
Both employer and employee have rights and responsibilities [OH3059].
Employers have a right to expect:
• commitment
• loyalty
• competence
• teamwork
• etc.
The employee engineer also has certain rights:
• Recognition
• conscientious refusal
• privacy
• non-discrimination
• etc.
This chapter deals with some of these issues, starting with issues of responsibility,
confidentiality, and conflicts of interest.
Confidentiality [OH3060] This is one of the central duties of a professional, and widely
acknowledged. Some would define it to mean anything that comes from the employer during
employment, but the authors think this is a little broad. But the employer has the institutional
right to identify confidential material on a reasonable basis, and S&M suggest that most would
agree to the following:
...any information that the employer would like to have kept secret in order to compete effectively (S&M)
There are a couple of related terms. Privileged information refers to material that is available
only to those with a particular authority, or privilege. For example, it could mean only the
members of a project group. Proprietary information is what is owned by the company (it is
the proprietor). It is a "trade secret", and the law makes it illegal for an employee to divulge
it, even if they have gone on to a new job.
Another way a company might choose to protect technical matters is through patents. A
patent legally forbids others to copy the invention, but it is public, and sometimes the idea can
be used in a way that does not contravene the patent.
Confidentiality can emerge as a problem when changing employment. For example, how can
an employee who moves to a new job use his/her experience, if nothing can be based on the
old job? In general this restriction is not so constraining that one cannot use the results of
professional experience, but if material known to be a trade secret is revealed, there could be a
lawsuit. The moral right of the employee to career advancement has to be balanced against the
right of the company to protect its competitive position. Sometimes the court has to decide;
see the good examples in the text. In one case mentioned, a GM executive went to
Volkswagen, which eventually settled out of court for $100 million cash, plus other
considerations for the company.
In an older but well documented case, a man by the name of Donald Wohlgemuth left B F
Goodrich to manufacture space suits with a competitor. Goodrich sued, and sought an
injunction to prevent him from working with any other company which manufactured space suits.
But the courts essentially ruled in his favour, holding that Wohlgemuth was not conveying specific
trade secrets, but rather using the general benefits of his professional experience, even though
it was acquired at B.F. Goodrich.
Justification and management policies. The need for confidentiality is not unreasonable for
companies, and they have a right to protect their interests. The rights of both companies and
employees can be protected by sensible and open employment agreements. For example, an
employee can agree to not go to work for a competitor in the same business for a specified
period of time after leaving the first company. Some benefit may be offered for this undertaking,
e.g. portability of a pension plan.
Conflict of Interest [OH3061] When an employee has an interest which might prevent them
from meeting their obligations to the employer, there is a conflict of interest. This could be
personal, e.g. owning substantial stock in a competitor’s company, or professional, e.g. taking
on an assignment for a competitor, or serving on a regulatory committee which governs
one’s own business activity.
Our own code of ethics mentions this directly, and points out that such conflicts must be made
known to the employer, and only pursued with permission. The problem arises because having
another interest different from your employer’s can cloud professional judgement, which
involves more than just following rules. If there is a grey area, and you have a conflict of
interest, you might be inclined to make a different judgement from what you would in the
absence of that bias. It is not a defence to say that in the actual case there is no chance that
judgement could be affected. If a “typical professional” could be influenced, the conflict exists.
Gifts, bribes and kickbacks. What constitutes a bribe, for example? We have had some
discussion on this topic, but anything that constitutes a reward from someone other than the
employer can influence how the employer’s interest is discharged.
Interests in other companies Owning stock in a competitor’s company could obviously be a
problem.
Insider information Knowledge that is not available to those outside the firm in general could
be used to favour friends or relatives - e.g. in investments. When it comes to outside activities,
no special advantage should accrue to the employee because of access to confidential matters.
Sometimes experts such as professors let the fear of losing industry research support
interfere with public service roles such as giving expert testimony.
Example: 4, p155 [OH3062]
RIGHTS OF ENGINEERS [OH3063]
By this we mean the ethical questions which relate to the rights of engineers as employees. The
rights we identify are in the category of “ethical rights” rather than rights under the law, although
sometimes there is legal basis. They identify the “moral high ground”, and could be the basis
for taking an action requiring some justification to someone in authority.
Professional employee rights. If we look simply at an individual professional engineer who is
an employee, it is not difficult to identify rights that he/she should have by considering the
various roles involved.
As a human being, one has the fundamental right to life and pursuit of legitimate interests,
such as work and career. There is also the right to non-discrimination on the basis of sex, race
and age, and in many countries, such as Canada, these rights are also set in our laws.
The fact that an engineer is an employee also confers certain institutional rights. Examples are
those which arise from employment contracts, e.g. the right to a salary, vacation, etc, in return
for the work that you do. Then there are the rights of the employee not to have to give up
human and political rights, e.g. religious freedom, just because they are employed.
Finally, there are special rights that are a result of the professional role one has as an
engineer, which we identify in more detail below.
Every issue is not a “rights” issue. Before we get too far into this emphasis on rights, it is
prudent to consider the fact that, as we have often observed, a professional life - or any life - is
hardly ever black and white. We have seen how obligations and therefore rights can be in
conflict, thereby creating moral dilemmas. Circumstances can affect how much weight can be
given to a particular right. Just because engineers have a duty to public welfare, and therefore a
right to blow the whistle in some cases, it does not follow that a business decision by
management which does not materially affect public safety has to be taken to the local TV
station simply because you judge that management is not working for the overall good of society.
We will spend some time on the matter of whistle blowing shortly.
Professional Rights [OH3064]
We start with the last of the above categories, which is central to one’s role as a professional
engineer.
Right of professional conscience. This right is basic and generic. That is to say, it is
fundamental, and forms the basis for several individual rights of the professional engineer. It is
the moral right to exercise professional judgement based on knowledge, experience, moral
values, and moral autonomy. It is what we have sometimes called a “negative right”, or a “liberty
right” in that it is a right not to be overruled or interfered with by others. It induces the duty in
others not to interfere with its proper exercise.
Of course you should not get the idea that the right must be exercised in isolation. Most
engineering decisions are complex, and especially those which have ethical components.
Consultation and discussion with colleagues is a part of the responsible exercise of engineering
judgement. Further, your right to this consultation, and perhaps access to employer resources
essential to making such a conscientious decision, places an obligation on the employer to do
more than simply not interfere.
Right of conscientious refusal. [OH3065] But there does come a point where you consider
drawing the line. Usually when this is considered, it will then have to be viewed in the light of
one of two possibilities regarding prevailing standards:
(1) where there is a widely shared agreement in the profession that a proposed act is unethical. This is the easier of the two. Examples are forging documents, altering test results,
lying, taking a bribe (but the definition of bribe can be a bit troublesome), padding a payroll, etc.
(2) where there is some doubt about the agreement among members of the profession on the ethical issue. This is obviously more difficult, and unfortunately, it is also more likely to
occur than black and white cases. Take a case where an engineer is designing a mine
ventilation system in a developing country where regulations are not equivalent to those in
Canada, say. The mine management might be prepared to accept a lower standard, and to
insist on Canadian standards might put the project out of reach and much needed employment
might be lost.
An engineer’s right to refuse assignments does not put a duty on the employer to keep him/her employed, if there is nothing else for him/her to do in the company. For example if
a company is building military aircraft and the engineer decides that he cannot ethically work
on machines of war, because it is contrary to his sense of duty to the public, there may be no
alternative but to look for a new job. So the right to refuse assignment is there, but it is
conditional. There is some legal recourse against wrongful dismissal, but it will not likely cover
all cases based on ethics.
Right to recognition. S&M state without reservation that “engineers have a right to
professional recognition for their work and accomplishments.” This includes reasonable payment
of salary or fees, and other more subtle forms of recognition. For example, someone who works
hard only to have someone else take the credit is being abused and demeaned. There is a right
to fair treatment.
Employee rights. [OH3066] A professional employee has certain rights that result simply from
the employment contract, formal and implied. Large corporations ought to recognize a basic
set.
(a) contractual, e.g.
• to a salary
• to a vacation
(b) non-contractual, e.g.
• to choose legitimate outside activities.
• to privacy and employer confidentiality
• to due process (in the event of disputes)
• to nondiscrimination and absence of sexual harassment at the workplace.
Collective bargaining. (see discussion topic 5 p166.) [OH3067]
This issue is not specifically mentioned in our code of ethics, although as we have mentioned, it
is in others. Two arguments have been used for inclusion of terms against "coercive collective
action" by engineers. One is the "faithful agent" argument, and the other is based, as is so
much, on serving the public good. But there are "public good" arguments on the side of
collective bargaining, and there is no open and shut case that shows that unionism and
professionalism are incompatible. S&M presents both sides, and admits to complexity. This is
an issue that depends on the particular situation. It might be more effective to argue
on the basis of whether the bargaining strategy is effective, rather than to make it a moral issue.
Modern practice has led to the conclusion that there is now no bar that can be enforced against
collective bargaining for engineers.
Example: The Case of the Backward Math.[OH3068,3069,3070].
Jay’s boss is an acknowledged expert in the field of catalysis. Jay is the leader of a group that
has been developing a new catalyst system, and the search has narrowed to two possibilities,
Catalyst A, and Catalyst B.
The boss is certain that the best choice is A, but he directs that tests be run “just for the
record”. Owing to inexperienced help, the tests take longer than expected, and the results show
that B is the preferred material. The engineers question the validity of the tests, but because of
the project timetable, there is no time to repeat the series. So the boss directs Jay to work
backwards and come up with phoney data to substantiate the choice of Catalyst A, a choice that
all the engineers in the group, including Jay, fully agree with.
What should Jay do, and does he have a moral right to not do as he is directed?
Inquiry:
Facts:
(1) Jay’s boss is an expert in the field.
(2) On whatever basis, presumably experience, he believes the right choice is “A”.
(3) Jay and the other engineers agree with the boss, and they also have expertise in the area.
(4) Data from testing indicate that “B” is a better choice.
(5) The validity of the tests is questionable, but there is no time to repeat them.
(6) Jay is directed to write a report justifying “A”.
Analysis. Jay has been directed to falsify data. This is clearly contrary to the moral obligation of integrity in
all he undertakes. He also has the moral responsibility to obey direction from his employer. This
falls partially into the area of clause 14 of the PEGNL code, which deals with professional
judgement being overruled. It is not a clear-cut case, because he supports his boss's intuition
(or whatever) that "A" is the best choice. Jay's reservations are on how to write the report.
Conclusion.
Although Jay agrees with his boss that "A" is the better choice, on no account can he simply
falsify data to support that choice of catalyst, especially in the face of testing evidence that
indicated “B”, even if the validity of the testing is questionable, and even when they all agree
that the result was unexpected.
He should then make the best possible case for re-testing. If that is unsuccessful, then in order to
satisfy the boss’s direction he can do the required calculations of “backward math”, but his
report must clearly document that although these would be the required data if they were to
support the choice of catalyst “A”, that is not what the testing revealed.
He should also report the actual data, and state clearly that he has no test data to support the
choice of “A”, and any recommendation to the client in favour of “A” would be contrary to such
data as they have. That is not to say that such a recommendation cannot be made to the client,
depending on the risk exposure if a mistake is made, but the client would have to be made fully
aware of the facts, and the basis for the recommendation.
(2) How far does this paramount obligation to protect the public go? [OH3071]
A design group develops a new electronic circuit to be used in clock radios that would extend
their average life from five to seven years, at an increased manufacturing cost of only one
percent. When this proposal is presented to top management, it is rejected on the grounds that
it is not cost-effective, and they direct that further work on the design be terminated.
Does the design group’s obligation to the public outweigh the employer’s directive?
In this case it is reasonable to conclude that the employer's action is within his legitimate
authority, and the moral obligation of loyalty (faithful agent) to the employer is more important
than the possible benefit to the public. After all it is a business decision. It might even be that it
is a bad business decision, because a competitor might well make such an improvement and
take away market share. But it is the employer's right to make it. Also the question of public
safety has not arisen, and the effect of innovations such as these is very hard to predict. The
engineer is taking on a lot to base a moral argument on presumptions of the good to the public
in this situation.
WHISTLEBLOWING AND LOYALTY [OH3072]
When it comes to a course in engineering ethics, whistleblowing is one of the first topics to come
to mind. This prominence is understandable, in that it usually is the most dramatic
situation in which an engineer takes a stand against an employer, internally or externally. It
fundamentally conflicts with the duty of loyalty to the employer, and Article 10 of our Code of
Ethics applies. In terms of the application of ethics, the attention it gets is largely undeserved,
because it really is a last resort. Some of the questions are:
• When is whistleblowing morally permissible?
• Is it ever morally obligatory, or is it always beyond the call of duty?
• When is whistleblowing an act of disloyalty?
• What procedures ought to be followed?
In its broadest sense, the notion of a “whistle-blower” is not limited to an employee and his
employer. Anyone who is aware of something they think is illegal or unethical can draw it to the
attention of authorities or to other people whom they think can respond. This might only be the
public. Journalists, politicians, and consumer groups do it all the time. This is of course why the
idea is so well-known.
Definition [OH3073] In the sense that we wish to use it in engineering ethics, it occurs when an
employee or former employee conveys information about a significant moral problem outside
approved organizational channels or counter to authority, to someone in a position to take
action on the problem. (S&M p167)
There are four main features in this definition:
(1) Act of disclosure: Information is intentionally conveyed outside normal channels or against
the wishes of supervisors.
(2) Topic: The person believes it to be a significant moral problem for him, with
consequences for the organization and the public. Examples are illegal acts, unethical practices,
serious threats to public safety.
(3) Agent: The person disclosing the information is an employee, former employee, or
otherwise closely associated.
(4) Recipient: This would have to be a person in a position to act on the problem. For example,
telling it to a relative over dinner or a friend in the pub does not count.
We can divide the process as to whether it is kept inside the organization (internal whistleblowing) or taken outside (external whistleblowing). Even when it is internal, it might
amount to going against authority. Whistleblowing can be open or anonymous.
Note that the definition does not make assumptions about whether or not the whistleblowing is
justified, other than the fact the person believes the moral problem to be significant. (Some
other definitions differ slightly on this point.) Each case can then be argued on its merits.
Two examples (p. 168): Fitzgerald (the student can read this one) and Applegate. In one case the
whistle was blown; in the other it was not.
DC 10 crash and Applegate's dilemma [OH3074]
See also ref 7 (Sawyier)
Sketch (OH3075) (Showing why the cabin floor collapsed when the cargo door blew off.)
346 people were aboard a Turkish Airlines DC-10 jumbo jet which left Paris on the evening of 3
March 1974. Only nine minutes into the flight, a cargo bay door flew open, and the main
passenger cabin floor collapsed, taking out control cables and causing total loss of control to the
tail surfaces. There were no survivors. Many engineers knew the design was unsafe, and the
danger was well documented. So why did this happen?
Details of the DC10 affair (OH3076)
S&M give a brief summary of this well-documented case, but more details are available in
reference 7. (Sawyier, in "Engineering Professionalism and Ethics")
Design for the DC10 aircraft began in 1968, in response to a call from American Airlines for a
versatile wide-bodied jet. McDonnell Douglas was in fierce competition with Boeing, who had
produced the 747, and Lockheed (L1011) for a piece of the “Jumbo jet” market.
As early as 1968, Dutch engineers had already warned of the danger of the collapse of
passenger cabin floors in these big planes if the cargo hold should suddenly become
depressurized. In moving up from smaller planes to the big wide-bodied aircraft, these
engineers argued that there was insufficient additional structural strength to withstand the
enormous forces on the floor if there were a one atmosphere pressure differential across it,
which is what would happen if the cargo bay suddenly depressurized for any reason.
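To appreciate the scale of the forces the Dutch engineers were worried about, a back-of-the-envelope estimate can be sketched. The floor area used below is an illustrative assumption, not a documented DC-10 figure; the point is only that a one-atmosphere differential across even a modest floor section produces a crushing load.

```python
# Rough estimate of the load on a cabin floor section if the cargo hold
# below suddenly depressurizes while the cabin stays at one atmosphere.
# The floor area is an illustrative assumption, not a DC-10 specification.

ATM_PA = 101_325        # one standard atmosphere, in pascals (N/m^2)
floor_area_m2 = 30.0    # assumed: a modest section of wide-body cabin floor

force_newtons = ATM_PA * floor_area_m2        # force = pressure x area
force_tonnes = force_newtons / 9.81 / 1000.0  # equivalent weight in tonnes

print(f"Load on floor section: {force_newtons:.3g} N (~{force_tonnes:.0f} t)")
```

Even with these modest assumed numbers the load is on the order of three million newtons, roughly the weight of 300 tonnes, far beyond what a floor designed only for passengers and seats could carry.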
In this case the client, American Airlines, also insisted on a change in the cargo door closing
mechanism: they wanted electric door actuators rather than conventional hydraulic ones, which
would be lighter and cheaper. This requirement was passed to Convair, the design contractor,
and was implemented.
Dan Applegate was the Director of Product Engineering for the Convair Division of General
Dynamics Corporation, which was under contract to McDonnell Douglas. The potential
weakness in the design was recognized by Applegate and Convair in 1968, and by 1969 they
had identified nine possible critical failure sequences involving cargo doors.
But McDonnell Douglas took full responsibility for the design, and did not inform the
regulators about these results, which indicated that the system could be exposed to a
catastrophic failure. Applegate’s superiors in Convair apparently would not allow the information
to go beyond the company, knowing that there would be a dispute as to who would bear the
cost of redesign. In fact, the contract between McDonnell Douglas and Convair forbade Convair
from going to the regulator (the Federal Aviation Administration) directly.
Between 1970 and 1972, more evidence of problems with the cargo door was documented
(including at least two cases of floor collapse), but the plane was still allowed to fly.
Applegate became so alarmed that by June 1972 he wrote an internal memo documenting his
concerns. Although these renewed concerns still did not go to the regulator, the FAA was in
fact aware that there was a problem, because tests had revealed it. One of the regulator’s
own engineers was so frustrated that he too, wrote and filed a memo. It all came apart the night
of 3 March, 1974, when the ground crew at Paris failed to properly secure a cargo bay door, and
the crash occurred a few minutes later.
What do we conclude? (OH3077)
No doubt it is an extreme case, but this illustrates the classic dilemma of the employee
engineer. If Applegate had gone over his bosses’ heads directly to the regulatory authority, or
the press, he would probably have been fired, but it might have saved 346 lives. On the other
hand, who knows what pressures he might have been under, and for that matter, who knows
how the system might have responded. We should not jump too quickly to the conclusion that it was obvious at the time that he
should have done something other than what he did, nor can we even conclude that the
accident would certainly have been prevented. But we are reminded that acting ethically can
sometimes require a lot of courage. Applegate struggled internally to do what he thought he
should, and was badly let down by his management.
As S&M say, quoting another source, “ethics is not for wimps”.
The DC10 disaster is an example of a complicated problem, with several people and
organizations sharing blame. But complexity is more often the rule than the exception, and no
code of ethics can be applied like a formula in every case. The purpose of this section on ethical
theory is to help us deepen our understanding.
Perhaps we then can deal with the ethical problem before it becomes a legal problem.
Required conditions before whistleblowing: [OH3078] Under what conditions may (or
should) an engineer “blow the whistle”? The potential harm must be substantial, and may affect
the public, employees, or even shareholders. It may take various forms: safety hazards, faulty
products, or fraud, for example. For the whistleblower, it is a very serious undertaking, especially if it is
done externally. S&M draw on guidelines proposed by Richard DeGeorge (see the reference in
the text) where public safety is concerned. They advise that the following conditions must be
met before getting involved in whistleblowing:
(1) the actual or potential harm is serious and has been documented;
(2) the concerns have been made known to superiors; and
(3) after getting no satisfaction from immediate superiors, other internal channels have been
exhausted, up to the highest levels of management.
Sometimes these conditions can be hard to meet, and there may be exceptions. For example,
confidentiality may interfere with the first one. The next two can be very problematic if the issue
involves managers above you, or there is extreme urgency. Finally, the price of whistleblowing
can be high, yet engineers and others have done it. Read the examples in the text - eg
Fitzgerald. What about Applegate, and Boisjoly? Both did all they might have been expected to
do, but they did not go external. Should they have?
Loyalty, collegiality and authority [OH3079]
Every company wants its employees to be loyal team players, and teamwork is a virtue with
which engineering students are very well-acquainted. The ability to work in teams, even in the
face of competing ideas and uneven contributions of members is a great strength, and the mark
of a good engineer. Although we would identify loyalty as a virtue in an employee engineer, the
word does not appear literally in the APEGN Code of Ethics, nor in any of the others in S&M.
The notion of loyalty is captured in the “faithful agent and trustee” phrase, which appears in
them all, in one form or another.
Another similar characteristic is collegiality; the ability to get along with people, to make
everyone feel they belong, and to resolve conflict when it occurs.
Two types of loyalty There are two levels of loyalty to an employer. The first can be called
agency-loyalty, and refers to the contractual duties that the employee has, as well as accepting
the legitimate authority of the company.
The other type of loyalty is called attitude-loyalty, and has more to do with emotions and a
sense of personal identity with the group to which one is loyal. It is of course possible to meet the
formal contractual obligations implied by agency loyalty without reaching the level of
commitment implied by attitude loyalty.
The extent to which “faithful agent and trustee" goes beyond agency-loyalty is open to
interpretation. Some would undoubtedly say that the intention in the code does indeed go
further than this bare minimum. It can be reasonably argued that if the employee is achieving
goals of his/her own through professional employment, more is owed to the company than the
lowest level of loyalty. For example, at the very least, one is gaining professional experience.
Similarly, if there is a sharing of burdens and benefits among the group, a sense of identification
loyalty reasonably develops.
In the final analysis, the weight given to employer loyalty has to be judged in the specific
situation. There are times when it must yield to higher values. Thus loyalty is a dependent virtue, depending on the value of the contribution being made by the organization. Covering up
wrongdoing to the public could not rate as a valuable contribution, and therefore does not
command loyalty.
Optional example: The Ford Motor Company and the EPA (OH3080)
Respect for authority [OH3081] Employers must have legitimate authority over employees.
Without lines of authority in an organization, everyone is free to do their own thing, and chaos
would soon ensue. Such authority is called executive authority, and is given to individuals to
carry out their tasks within the institution.
This sort of authority is different from expert authority which derives from a special
competence, knowledge or skill. Of course expert authority can exist in individuals who have no
institutional authority.
Another important distinction is between authority and power. Real power amounts to the
ability to inspire, persuade or direct others to accomplish objectives. An individual might have a
good deal of institutional authority and resources, but if he/she lacks leadership skills, he/she
can be quite ineffective in getting employees to produce for the company or institution. On the other
hand, someone having expert authority and respect of colleagues, or even simply charisma,
might be able to motivate or otherwise persuade people to excel.
Protecting whistleblowers [OH3082]
Whistleblowing is “lonely, unrewarded, and fraught with peril...” (S&M, p175, quoting a
lawyer). Nevertheless, engineers feel they must do it in some circumstances. It is not
surprising that employers who have been exposed in some illegal or unethical conduct can be
quite upset, employees can suffer greatly, and of course court cases can result.
Commonsense procedures. S&M suggest several pieces of advice in considering whether or
not to take this serious action:
1. Except for extremely rare emergencies, always try first working through normal organizational
channels. Get to know both formal and informal (unwritten) rules for making appeals within the
organization.
2. Be prompt in making objections. Waiting too long may create the appearance of plotting for
your advantage and seeking to embarrass a supervisor.
3. Proceed in a tactful, low-key manner. Be considerate of the feelings of others involved.
Always keep focused on the issues themselves, avoiding any personal criticisms that might
create antagonism and deflect attention from solving those issues.
4. As much as possible, keep supervisors informed of your actions, both through informal
discussions and formal memorandums.
5. Be accurate in your observations and claims, and keep formal records documenting relevant
events.
6. Consult colleagues for advice - avoid isolation.
7. Before going outside the organization, consult the ethics committee of your professional
society.
8. Consult a lawyer concerning potential legal liabilities.
To these I would add the very real caution that one should be clear about one's motives. It is
very easy to fall victim to the temptation to turn some personal grievance into a great case of
moral principle. The moral weight of self-interest should not be zero, but it is not one of the
heaviest.
There is an interesting website mentioned by S&M, created by an American group called GAP:
“Government Accountability Project”, with a website www.whistleblower.org.
The one compelling observation which must be made is that there surely was little protection
and reward for the whistleblowers. This led us to consider conditions under which engineers are
justified in whistleblowing, especially externally. This action requires a serious moral inquiry,
since it will often be done at the sacrifice of the rights of colleagues and family.
We note that in the Challenger case, Boisjoly did not go totally public with his concerns; there
did not seem to be any mechanism which would stop the launch, and time was extremely short.
But in meetings with the client he did his best to make his best case against launching that cold
day.
Recap of issues concerning employee engineers’ rights: [OH3083]
Before concluding this section (and the course) let me return to the set of “issues” with which we
started the material concerning the rights of engineers.
-Do engineers have a moral right to refuse to carry out what they consider to be unethical activity?
Yes, we certainly do. But the exercise of this right does not necessarily impose a duty on the
employer to keep us employed.
-What duty falls to employers to respect this right? That is, if an employee takes a course of
action contrary to the employer's wish, based on moral grounds, must the employer refrain from
action such as discipline or dismissal?
It depends on the situation. In some cases, there is a legal constraint on the employer, and
court action can be brought by the engineer. In many cases, the employer also has the right to
moral autonomy, and might not respect your right. In other cases the employer may have no
control over the situation, for example where a law is broken.
-Do the rights or responsibilities of either the employee or the employer depend on whether it is a business or an institution supported from public funds?
We have not talked much about this. Fundamentally, the engineer’s rights should not be
different, nor should the responsibilities of the employer. But the duty to the public good and fair
treatment of employees is especially accented when the employment is in the public service. It
has the effect of giving even more weight to that moral obligation.
6. Global Issues, p. 185
(if there is time)