Science and Technology: Which Way Does the Causation Run?

Nathan Rosenberg
Stanford University
October 2004

[This paper was prepared for presentation on the occasion of the opening of a
new “Center for Interdisciplinary Studies of Science and Technology” at Stanford
University, November 1, 2004].

Not to prolong your suspense, the correct answer to the question in the subtitle of my

paper is the obvious one: causation runs both ways. But I want to persuade you that the

causation running from technology to science is vastly more powerful than is generally

realized.

The reasoning is straightforward. A market economy generates powerful incentives to

undertake certain kinds of scientific research. This is because the eventual findings of

such research can be made to improve the performance, or to reduce the cost, of

technologies that are vital to the competitive success of profit-making firms. Further, I

want to suggest that there were powerful forces at work in the course of the 20th century

that had the effect of expanding the ways in which changes in the realm of technology

have led to changes in the various realms of science. I want to call your attention to

some of the most significant organizational changes, and associated changes in

incentives, that were responsible for strengthening the causal forces that flowed from

technology to science.

In order to do this, I will need to introduce just one bit of jargon: I will use the

term "endogenous" from the perspective of the economist and not from the perspective

of the scientist. Thus, when I refer to the endogeneity of science, I am referring to the

extent to which scientific progress has been directly influenced by the working out of the

normal forces of the market place. My justification is that I will be trying to identify forces

that emerged in the course of the twentieth century that made scientific research more

highly responsive to economic incentives.

I also need to state one caveat that I cannot emphasize too strongly. I am not

implicitly suggesting that the financial support of the country’s scientific research should

be left to the market place. Rather, I will be calling attention to the operation of market

forces that have become increasingly supportive of scientific research. I believe that

these developments were crucial to the rapid expansion of American industry, but that is

very different from suggesting that market forces, by themselves, were sufficient.

Corporate Research Labs

The proposition that scientific research became increasingly endogenous in the

course of the 20th century must necessarily begin by focusing on a key organizational

innovation: the industrial research lab. It was these corporate labs that determined the

extent to which the activities of the scientific community could be made responsive

to the needs of the larger economy. But such a statement cannot stand alone.

This is because these research labs depended for their effective performance, in turn,

upon a network of other institutions. These included, above all, research at universities.

Before the Second World War, university research depended heavily, for its financing,

on private philanthropic foundations, such as the Rockefeller, Guggenheim and

Carnegie foundations. In the pre-war period, as well, universities often relied on

financial support from local industry for carrying out certain classes of research, mostly

of an applied nature. This was especially true of state universities, where it was

essential to provide evidence of assistance to local industry [agriculture, mining,

railroads] in order to justify the imposition of taxes upon the citizens of each state. In

fact, with few exceptions, funds raised by state governments went, overwhelmingly, to

support teaching and not research.

This situation was totally transformed in the post World War II period when the federal

government became, overwhelmingly, the dominant patron of scientific research, and

universities became the primary locus of such research. It is important to note that the

concentration of basic scientific research in the university community where, I think it is

fair to say, it has flourished, has been an organizational arrangement

almost unique to the US. Unlike the situation in western Europe, where basic research

has been concentrated in government labs (Max Planck, CNRS), federal laboratories in the

US have accounted for less than 10 percent of basic research (9.1% in the mid-1990s).

A further distinctive feature of great importance in the US is the very large

commitment of private industry to scientific research that the NSF defines as basic.

Private industry accounted for slightly over 30% of all basic research in the year 2000

(probably declining slightly in the last few years). Although at last count there were around

16,000 private firms with their own corporate labs, the vast majority of these firms

conduct research of a predominantly applied nature. Only a very small number do basic

research. Nevertheless, over the years, a few of these corporate labs have conducted

research of the most fundamental nature - General Electric, IBM and, most important of

all, Bell Labs before the divestiture of AT&T in 1984. Researchers in a number of

corporate labs have won Nobel Prizes; most recently, Jack Kilby of Texas Instruments

won the Prize in Physics, in the year 2000, for research leading to the development of

the integrated circuit [Kilby’s research received financial support from the federal

government].

Having said this, it is essential to realize that the research activities of industrial labs

should not be evaluated, as they often are by academics, by the usual academic criteria

- such as publications in prestigious professional journals or the winning of Nobel

Prizes. Such labs have a very different purpose. The industrial lab is essentially an

institutional innovation (of German origin) in which the research agenda is largely

shaped by the short-term needs but also, in a few notable cases, by the longer-term

strategies of industrial firms. Within the industrial context, the intended role of corporate

scientists is to improve the performance of their respective firms in the competitive

context of (mostly) high tech sectors of the economy. Thus, the critical achievement of

the growth of the American industrial lab in the course of the 20th century has been to

subject science, more and more, to commercial criteria. In so doing, it has rendered

science an activity whose directions were increasingly shaped by economic forces and

concentrated on the achievement of economic goals - which is to say such scientific

research should be regarded as largely endogenous.

One further strategic role of the corporate lab arises from the fact that a firm cannot

effectively monitor and evaluate the findings, and the possible implications, of the huge

volume of university research unless it has its own internal capability for doing such

things. The importance of this point cannot be overestimated. In advanced industrial

societies that are now simply flooded with the flow of information, not only from

universities, but from professional journals on library shelves or electronically via

Internet search engines such as Yahoo and Google, the exploitation of this vast flow of

information requires an internal competence that, typically, only in-house scientists can

provide. Indeed, America’s remarkable commercial successes in high tech markets over

the past 50 years have owed a great deal to these internal competences in private

industry. Industrial scientists have played a critical role in the transfer of potentially

useful knowledge generated by university research, not only because of their scientific

sophistication, but also because they have had a deep awareness of their firms’

commercial priorities and technological capabilities [See Nathan Rosenberg, “Why do

firms do Basic Research?” (Research Policy); Nathan Rosenberg, “America’s

University/Industry Interfaces, 1945-2000”, unpublished manuscript, May 2002;

and David Mowery and Nathan Rosenberg, Paths of Innovation: Technological

Change in 20th-Century America, Cambridge University Press, 1998].

How Engineering Disciplines Have Shaped Science

I would like now to call your attention to another major force for advancing the

endogeneity of science in the course of the 20th century. I would like to pose the

question: what specific role is played by engineering disciplines in determining the

scientific agenda of private firms? Let me respond, first, by offering a clarification. It is a

common practice to characterize engineering disciplines as being essentially applied

science. This is, in my view, a seriously misleading characterization. A more careful

unwinding of the intertwining of science and technology suggests that the willingness of

profit-seeking firms to devote money to scientific research is very much influenced by

the prospect of converting such research findings into finished and marketable products.

The actual conduct of scientific research may not be undertaken with highly specific

objectives in mind, but rather with an increased confidence that, whatever the specific

research findings, an enlarged engineering capability will substantially increase the

likelihood of being able to use these findings to bring improved or new products to the

market place.

From this perspective, there is a serious sense in which the economist may argue that

the science of chemistry should be thought of as an application of chemical engineering!

Alternatively put, the growing sophistication of engineering disciplines has had the result

of strengthening the endogeneity of science. I do not want this point to

sound too paradoxical. I mean to suggest that the willingness of private industry to

commit financial resources to long-term scientific research has been considerably

strengthened by the progress of the appropriate engineering disciplines. Such progress

raises the confidence of corporate decisionmakers that the findings of basic research

may eventually be converted to profitable uses.

This argument seems particularly pertinent to the specialty of polymer chemistry, a

field that was opened by the researches of Staudinger, Meyer and Mark in Germany in

the 1920s. In the US at least, polymer chemistry is a field that has long been dominated

by the industrial research community. The fundamental research contributions to

polymer chemistry of Wallace Carothers at du Pont, beginning in 1928, owed a great

deal to the increasing maturity of chemical engineering in the preceding decade or so,

an engineering discipline to which du Pont had made important contributions [See

Hounshell and Smith]. Carothers’ research findings led directly to the discovery of

nylon, the first of a proliferation of synthetic fibers that came to constitute an entirely

new subsector of the petrochemical industry after the Second World War. But it is

doubtful that du Pont would have committed itself to Carothers’ costly, fundamental

researches in polymer chemistry, in the first place, in the absence of the progress in

chemical engineering in the decade preceding 1928. Thus, progress at the

technological level (chemical engineering) increasingly strengthened the willingness to

spend money on science, which I regard as a growth in the endogeneity of science. [See

Nathan Rosenberg, “Technological Change in Chemicals: The Role of University-

Industry Relations,” chapter 7 in Ashish Arora, Ralph Landau and Nathan

Rosenberg (eds.), Chemicals and Long-term Economic Growth, Wiley, 1998].

Let me sketch out the intermediate steps that underlie my argument. The discipline of

chemical engineering really had its beginnings in the second and third decades of the

20th century, mainly at MIT, in response to the spectacular expansion of the automobile

industry and, along with that industry’s growth, a voracious demand for refined chemical

products (primarily, of course, for high-octane gasoline). The scale of that growth can be

captured in the following numbers: In 1900 the automobile industry was so insignificant

that the Census Bureau classified cars under the category “Miscellaneous.” [In that

year there were only 8,000 registered cars in the US]. By 1925 the automobile industry

had leaped to the status of the largest manufacturing industry in the whole country

(measured by value added).

It was the growth of the automobile that gave birth to the discipline of chemical

engineering. Chemical engineers, during the 1920s and later, transformed the

petroleum refining industry from small-scale batch production into one of vastly larger

scale and continuous processing. The emerging chemical engineering discipline

accomplished this by developing a new conceptual framework within which it became

possible to introduce scientific concepts and methodologies from such fields as fluid

flow (fluid dynamics), heat transfer and, in the 1930s, the pervasive power of

thermodynamics. In other words, the design of chemical process plants could now draw

heavily upon a number of different scientific realms. Thus, it was the establishment of a

new engineering discipline, in responding to the rapid expansion of a new transportation

technology, that, in turn, laid the basis for the profitability of scientific research, not only

in du Pont and petroleum refining firms, but in a very wide range of industries that also

made use of chemical process plants. It is worth emphasizing how pervasive chemical

process plants became in the course of the 20th century. Large chemical plants could be

found in petroleum refining, rubber, leather, coal (by-product distillation plants), food-

processing, sugar refining, explosives, ceramics and glass, paper and pulp, cement,

and metallurgical industries (e.g., aluminum, iron and steel).
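
A single textbook relation, offered here purely as an illustration, conveys how directly such plant design drew upon these scientific realms. The duty of a heat exchanger, one of the basic building blocks of any process plant, is conventionally sized from

Q = U A \Delta T_{lm}, where \Delta T_{lm} = (\Delta T_1 - \Delta T_2) / \ln(\Delta T_1 / \Delta T_2),

with Q the rate of heat transfer, U the overall heat-transfer coefficient, A the exchange area, and \Delta T_{lm} the log-mean temperature difference between the two ends of the exchanger. The coefficient U is precisely where the physics of conduction and of fluid flow enters the routine engineering calculation.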

How New Products Have Shaped Science

The next related observation with respect to the growing endogeneity of scientific

research goes beyond the role played by engineering disciplines in strengthening the

private incentives to perform scientific research. The argument here is that the

development of some specific new product, that is perceived to have great commercial

potential, may provide, and often has provided, a powerful stimulus to scientific

research. This proposition is surprising only if one is already committed to a rigid, overly

simplistic linear view of the innovation process, one in which causality is always

expected to run from prior scientific research to “downstream” product design and

engineering development. There is, in fact, a straightforward endogenous

explanation at work here. A major technological breakthrough typically provides a strong

signal that a new set of profitable opportunities has been opened up in some precisely-

identified location. Consequently, it is understood that scientific research that can lead

to further improvements in that new technology may turn out to be highly profitable.

The problems encountered by sophisticated industrial technologies, and the

anomalous observations and unexpected difficulties that they have generated, have

served as powerful stimuli to much fruitful scientific research in the academic

community as well as the industrial research laboratory. In these ways the

responsiveness of scientific research to economic needs and technological

opportunities has been powerfully reinforced.

This was dramatically demonstrated in the case of the advent of the transistor, the

discovery of which was announced at Bell Labs in the summer of 1948. Within a decade

of that event, solid-state physics, which had previously attracted the attention of only a

small number of researchers and was taught at only a handful of American

universities (mainly MIT, Princeton, and Cal Tech), had been transformed into the

largest subdiscipline of physics. It was the development of the transistor that changed

that situation by dramatically upgrading the potential financial payoff to research in the

solid state. J.A. Morton, who headed the fundamental development group that was

formed at Bell Labs after the invention of the transistor, reported that it was extremely

difficult to hire people with a knowledge of solid-state physics in the late 1940s.

Moreover, it is important to emphasize that the rapid mobilization of intellectual

resources to perform research in the solid state occurred in the university community as

well as in private industry, immediately after the announcement of the momentous

findings of Shockley and his research colleagues at Bell Labs. As one strong piece of

evidence for this view, the number of publications in semiconductor physics rose from

less than 25 per annum before 1948 to over 600 per annum by the mid-1950s (Herring,

unpublished manuscript, n.d.).
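
The arithmetic behind those figures is worth pausing over: a rise from fewer than 25 to more than 600 papers per annum over roughly seven years is at least a twenty-four-fold expansion and, since 24^{1/7} \approx 1.57, it implies a compound growth rate in the neighborhood of 55 percent per year, sustained for the better part of a decade.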

The chronology of the events that I have just referred to is essential to my argument.

Transistor technology was not the eventual consequence of a huge prior buildup of

resources devoted to solid-state physics, although it was of course also true that some

of the twentieth century’s most creative physicists had been devoting their considerable

energies to the subject. Rather, it was the initial breakthrough of the transistor, as a

functioning piece of hardware, that set into motion a vast subsequent commitment of

financial support for scientific research. Thus, the difficulties that Shockley encountered

with the operation of the early point-contact transistors led him into a systematic search

for a deeper explanation of their behavior, expressed in terms of the underlying

quantum physics of semiconductors. This search not only led eventually to a vastly

superior amplifying device, the junction transistor; it also contributed to a much more

profound understanding of the science of semiconductors. Indeed, Shockley’s famous

and highly influential book, Electrons and Holes in Semiconductors, drew heavily upon

this research, and the book was the direct outgrowth of an in-house course that

Shockley had taught for Bell Labs’ personnel. Moreover, Shockley also found it

necessary to run a six-day course at Bell Labs in June 1952 for professors from some

thirty universities as part of his attempt to encourage the establishment of university

courses in transistor physics.
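
The scientific core that Shockley codified can be glimpsed in a single relation: the conductivity of a semiconductor is

\sigma = q (n \mu_n + p \mu_p),

where q is the elementary charge, n and p the concentrations of electrons and holes, and \mu_n and \mu_p their respective mobilities. I cite the relation only as an emblem of the subject, in its standard modern form rather than as a quotation from Shockley's text; device behavior turns on the deliberate control of n and p, through doping and applied fields, and the junction transistor is, in effect, an exercise in manipulating these two carrier populations.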

Clearly, the main flow of scientific knowledge during this critical period was from

industry to university, and not the other way around. Indeed, for a considerable period

of time, Stanford and the University of California at Berkeley had to employ scientists

from local industry to teach courses in solid-state physics/electronics.

A similar sequence can be seen in the commitment of funds to research in surface

chemistry, after problems with the reliability of early transistors pointed in that direction.

More recently, and to compress a much more complex chain of events, the

development of laser technology suggested the feasibility of using optical fibers for

telephone transmission purposes. This possibility naturally pointed to the field of optics,

where advances in scientific knowledge could now be expected to have potentially high

economic payoffs. As a result, optics as a field of scientific research experienced a

great resurgence in the 1960s and after. It was converted by changed expectations,

based upon recent and prospective technological innovations, from a relatively quiet

intellectual backwater of science into a burgeoning field of research. This growth of

activity in the discipline was generated, not by forces internal to the field of optics, but

by a radically altered assessment of the potential opportunities for laser-based

technologies. Moreover, different kinds of lasers gave rise to different categories of

fundamental research. As Harvey Brooks has noted: "While the solid-state laser gave a

new lease of life to the study of insulators and of the optical properties of solids, the gas

laser resuscitated the moribund subject of atomic spectroscopy and gas-discharge

physics."[Harvey Brooks, "Physics and the Polity," Science, 1968, vol. 160].

I draw the conclusion from this examination that, under modern industrial conditions,

technology has come to shape science in the most powerful of ways: by playing a major

role in determining the research agenda of science as well as the volume of resources

devoted to specific research fields. One could examine these relationships in much finer

detail by showing how, throughout the high tech sectors of the economy, shifts in the

technological needs of industry have brought with them associated shifts in emphasis in

scientific research. When, for example, the semiconductor industry moved from a

reliance upon discrete devices (transistors) to integrated circuits, there was also a shift

from mechanical to chemical methods of fabrication. When Fairchild Semiconductor

began to fabricate integrated circuits, they did so by employing new methods of

chemical etching that printed the transistors on the silicon wafers and also laid down the

tracks between them. This chemical technique did away with expensive wiring, and also

produced integrated circuits that operated at much higher speeds. At the same time, the

increased reliance upon chemical methods brought with it an increased attention to the

relevant subfields of chemistry, such as surface chemistry.

I cite the experience of changing methods of wafer design and fabrication to indicate

the ways in which the changing needs and priorities of industry have provided the basis

for new priorities in the world of scientific research. But it is essential to emphasize that

these new priorities exercised their influence, not only upon the world of industrial

research, but upon the conduct of research within the university community as well. I

need only point out that Stanford University has had, for some time, its own Center for

Integrated Systems. This Center is devoted to laboratory research on microelectronic

materials, devices, and systems, and is jointly financed by the federal government and

private industry.

Serendipity

There is a further source of causation running from technology to science to which I

would like to call your attention. I refer to the role of serendipity. It is, of course, to be

expected that well-trained scientific minds are likely to turn up unexpected findings in

many places. As Pasteur expressed it in the mid-19th century: "Where observation is

concerned, chance favors only the prepared mind." By way of contrast, consider

Thomas Edison, by universal consent a brilliant inventor, but someone who had little

interest in observations that had no immediate practical relevance. In 1883 he observed

the flow of electricity across a gap, inside a vacuum, from a hot filament to a metal wire.

Since he saw no practical application and had no scientific training, he merely described

the phenomenon in his notebook and went on to other matters of greater potential utility

in his effort to enhance the performance of the electric light bulb. Edison was, of

course, observing a flow of electrons, and the observation has since even come to be

referred to as the "Edison Effect" - named after the man who, strangely enough, had

failed to discover it. Had he been a curious (and patient) scientist, less preoccupied

with matters of short-run utility, Edison might later have shared a Nobel Prize with Owen

Richardson, who analyzed the behavior of electrons when heated in a vacuum, or

conceivably even with J.J. Thomson for the initial discovery of the electron itself.

Edison's "prepared mind," however, was prepared only for observations that were likely

to have some practical relevance in the short-run.
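
What Edison had observed, without pursuing it, was thermionic emission, the phenomenon that Richardson later reduced to a quantitative law. In its modern (Richardson-Dushman) form, the emitted current density is

J = A T^2 e^{-W/(k_B T)},

where T is the absolute temperature of the filament, W the work function of the metal, k_B Boltzmann's constant, and A a material-dependent constant; the formula is offered here simply to indicate how much systematic physics lay hidden in the curiosity that Edison set aside.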

A distinctive feature of the 20th century in dynamic capitalist economies was the

vastly increased numbers of scientifically "prepared minds" in both the universities and

private industry. On many occasions, fundamental breakthroughs occurred

serendipitously because "prepared minds" were available to pursue the possible

implications of the unexpected. Surely the most spectacular instance of serendipity in

the 20th century - not achieved in an industrial laboratory - was Alexander Fleming's

brilliant conjecture, in 1928, that the unexpected bactericidal effect that he had observed

in the bacterial cultures in his Petri dish was caused by a common bread mould that

had accumulated on his slides. Fleming published this finding in 1929, but no

substantial progress was made in producing a marketable product until more than a

decade later, when the exigencies of wartime led to a joint Anglo-American "crash"

program to accelerate the production of the antibiotic [Elder, Albert Lawrence (ed.),

The History of Penicillin Production, American Institute of Chemical Engineers,

New York, 1970].

It is at least a plausible speculation that, had Fleming made his marvelous discovery

while working in a pharmaceutical lab, penicillin would have become available, in large

quantities, far more swiftly than was in fact the case [For a contrary view, see Bernal,

volume 3, pp. 926-7]. In the context of this paper it is also worth pointing out a little-

known historical fact: the technology to produce the antibiotic in bulk was achieved

not, as would ordinarily have been expected, by the pharmaceutical chemist, but by

chemical engineers. It was the chemical engineers who demonstrated how a technique

called "aerobic submerged fermentation," which became the dominant production

technology, could be applied to this complex product [Elder, op. cit.].

The growth of organized industrial labs in 20th century America vastly enlarged the

number of trained scientists in the industrial world who encountered strange

phenomena that were most unlikely to occur, or to be observed, except in some highly

specialized industrial context. In this sense, the huge increase in new high tech

products, along with dense concentrations of well-trained scientific specialists in

industry, sharply increased the likelihood of serendipitous discoveries in the course of

the twentieth century.

Consider the realm of telephone transmission. Back at the end of the 1920s, when

transatlantic radiotelephone service was first established, the service was discovered to

be poor due to a great deal of interfering static. Bell Labs asked a young man, Karl

Jansky, to determine the source of the noise so that it might be reduced or eliminated.

He was given a rotatable antenna to work with. Jansky published a paper in 1932 in

which he reported that he had found three sources of noise: local thunderstorms, more

distant thunderstorms, and a third source which he described as "a steady hiss static,

the origin of which is not known." It was this "star noise," as Jansky labelled it, that

marked the birth of the entirely new science of radio astronomy.

Jansky's experience underlines why the frequent attempt to distinguish between

basic research and applied research is extremely difficult to carry out in practice.

Fundamental scientific breakthroughs often occur while dealing with very applied or

practical problems, especially problems relating to the performance of new technologies

in an industrial context.

But the distinction breaks down in another way as well. It is essential to distinguish

between the personal motives of the individual researchers and the motives of the

decisionmakers in the firm that employs them. Many scientists in private industry could

honestly say that they are attempting to advance the frontiers of basic scientific

knowledge, without any concern over possible applications. At the same time, the

research managers, who decide whether or not to finance research in

some basic field of science, may be strongly motivated by expectations of eventually

useful findings.

This certainly appears to have been the case in the early 1960s when Bell Labs

decided to support research in astrophysics because of its potential relationship to the

whole range of problems and possibilities in the realm of microwave transmission, and

especially in the use of communication satellites for such purposes. It had become

apparent that, at very high frequencies, annoying sources of interference in

transmission were widely encountered.

This source of signal loss was a matter of continuing concern in Bell Labs'

development of the new technology of satellite communications. It was out of such

practical concerns that Bell Labs decided to employ two astrophysicists, Arno Penzias

and Robert Wilson. Penzias and Wilson would undoubtedly have been indignant if

anyone had suggested that they were doing anything other than basic research. They

first observed the cosmic background radiation, which is now taken as confirmation of

the "Big Bang" theory of the formation of the universe, while they were attempting to

identify and measure the various sources of noise in their antenna and in the

atmosphere. It seems fair to say that this most fundamental breakthrough in cosmology

in the past century was entirely serendipitous. Although Penzias and Wilson did not

know it at the time, the character of the background radiation that they discovered was

just what had been postulated earlier by cosmologists at Princeton who had devised the

Big Bang theory. Penzias and Wilson shared a Nobel Prize in Physics for this finding.

Their findings were as basic as basic science can get, and they are in no way diminished by

observing that the firm that had employed them did so because the decisionmakers at

Bell Labs hoped to improve the quality of satellite transmission.

The parallelism between the fundamental discoveries of Jansky and of Penzias and

Wilson is, of course, very striking. In both episodes, the Bell Labs researchers stumbled

upon discoveries of the greatest possible scientific significance while involved in

projects that were motivated by the desire of Bell Labs to improve the quality of

telephone transmission. In the case of Penzias and Wilson, they were conducting their

research with a remarkably sensitive horn antenna that had been built for the Echo and

Telstar satellite projects. Wilson later stated that he was originally attracted to work at

Bell Labs because it would provide access to a horn antenna that

was one of the most sensitive such antennas in existence [Steve Aaronson, "The

Light of Creation - an Interview with Arno A. Penzias and Robert C. Wilson," Bell

Laboratories Record, January 1979, p. 13].

I have called attention to two episodes at Bell Labs in which industrial researchers

discovered natural phenomena of immense scientific significance while the firm that

employed them did so in the hope that they would solve serious problems connected

with the performance of a new communications technology. In one sense it is fair to say

that important scientific findings by profit-making firms are sometimes achieved

unintentionally - they have discovered things that they were not looking for, which I take

to be the generic meaning of Horace Walpole's mid-eighteenth century neologism -

serendipity. Such breakthroughs in the private sector, moreover, are difficult to

understand if one insists on drawing sharp distinctions between basic and applied

research on the basis of the motivations of those performing the research. I find it

irresistible here to invoke, once again, the shade of the great Pasteur: "There are no

such things as applied sciences; only applications of science."

In fact, I would go much further: when basic research in industry is isolated from the

other activities of the firm, whether organizationally or geographically, it is likely to

become sterile and unproductive. Much of the history of basic research in American

industry suggests that it is likely to be most effective when it is highly interactive with the

work, and the concerns, of applied scientists and engineers within the firm. This is

because the high technology industries have continually thrown up problems, difficulties

and anomalous observations that were most unlikely to occur outside of specific high

technology contexts.

The sheer growth in the number of trained scientists in industrial labs, along with the

growth of new, highly complex, specialized products that appeared in the course of the

20th century, powerfully increased the likelihood of serendipitous findings. High tech

industries provide a unique vantage point for the conduct of basic research but, in order

for scientists to exploit the potential of the industrial environment, it is necessary to

create opportunities and incentives for interaction with other components of the firm.

Bell Labs before divestiture (1984) is probably the best example of a place where the

institutional environment was most hospitable for basic research. I do not suggest that

Bell Labs was, in any respect, a representative industrial lab. Far from it. It was a

regulated monopoly that could readily recoup its huge expenditures on research. But,

perhaps even more important, it came to occupy a location on the industrial spectrum

where, as it turned out, technological improvements required a deeper, scientific

exploration of certain portions of the natural world that had not been previously studied.

Instrumentation

Of course my examination of the endogeneity of science has been no more than a

very modest and partial sketch. Entire categories of the influence of technology upon

science have been completely ignored here, such as the pervasive impact of new

instrumentation, i.e., technologies of observation, experimentation and measurement.

Indeed, scientific instruments may be usefully regarded as the capital goods of the

research industry. Much of this instrumentation, in turn, has had its origins in the

university world and, to underline the extent of the intertwining of technology and

science in recent years, some of the most powerful of those instruments, such as

Nuclear Magnetic Resonance, had their origins in fundamental research that was

originally undertaken in order to acquire some highly specific pieces of knowledge, such

as a deeper understanding of the magnetic properties of atomic nuclei. Indeed, Felix

Bloch was awarded Stanford’s first Nobel Prize in physics for precisely such research.

[See N. Rosenberg, “The Economic Impact of Scientific Instrumentation

Developed in Academic Laboratories,” in John Irvine et al., Equipping Science for

the 21st Century, Edward Elgar, 1997. See also, in the same volume, Carlos

Kruytbosch, "The Role of Instrumentation in Advancing the Frontiers of Science,"

chapter 2]. Nuclear Magnetic Resonance spectroscopy, in turn, became an invaluable

tool in chemistry for determining the structure of molecules containing certain atoms

(e.g., hydrogen, deuterium, boron and nitrogen) (Kruytbosch, pp. 32-4).
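
The principle underlying the instrument can be stated in one line: a nucleus with gyromagnetic ratio \gamma, placed in a magnetic field of strength B_0, resonates at the Larmor frequency

\nu = \gamma B_0 / 2\pi,

and because \gamma is characteristic of each nuclear species, while the field actually felt by a nucleus is subtly shifted by its surrounding electrons, the observed resonance frequencies map directly onto chemical structure.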

Clearly, instrumentation and techniques have moved from one scientific discipline to

another in ways that have been highly consequential for the progress of science. In fact,

it can be argued that a serious understanding of the progress of individual disciplines is

generally unattainable in the absence of an examination of how different areas of

science have influenced one another. This understanding is frequently tied directly to

the development, the timing and the mode of transfer of scientific instruments among

disciplines. The flow of “exports” appears to have been particularly heavy from physics

to chemistry, as well as from both physics and chemistry to biology, to clinical medicine

and, ultimately, to the delivery of health care. There has also been a less substantial

flow from chemistry to physics and, in recent years, from applied physics and electrical

engineering to health care. NMR eventually became the basis for one of the most

powerful diagnostic tools of twentieth (and twenty-first) century medicine (MRI).

The transistor revolution was a direct outgrowth of the expansion of solid-state

physics, but the successful completion of that revolution was in turn heavily dependent

upon further developments in chemistry and metallurgy, which provided materials of a

sufficiently high degree of purity and crystallinity. Finally, physics has spawned

subspecialties that are inherently interdisciplinary: for example, biophysics, astrophysics

and materials science.

One further point, however, is implicit in what has already been said. The availability

of new or improved instrumentation or experimental technique in one academic

discipline has often been the source of interdisciplinary collaboration. In some critical

cases, it has involved the migration of highly trained scientists from one field to another,

such as those physicists from the Cavendish Laboratory at Cambridge University who

played a decisive role in the emergence of molecular biology. This emergence had

depended heavily upon scientists, trained in the physicist’s skills at the Cavendish, who

transferred the indispensable tool of x-ray crystallography into the very different realm of

biology. Molecular biology was the product of interdisciplinary research in the special

sense that scientists trained in one discipline crossed traditional scientific boundary

lines and brought their intellectual tools, concepts and experimental methods into the

service of an entirely new field [See the magisterial, yet highly accessible volume by

Horace Judson on the early history of molecular biology, The Eighth Day of Creation].

The German physicist, Max von Laue, discovered the phenomenon of x-ray

diffraction in 1912. The technique was, in the early years, applied by William Bragg

and his son, Lawrence Bragg, primarily in the new field of solid-state physics but also,

later on, in developing the field of molecular biology. The main center of the

methodology of x-ray diffraction was, for many years, the Cavendish Laboratory,

presided over by Lawrence Bragg. Numerous scientists went there in order to learn how

to exploit the technique, including Max Perutz, at the time a chemist, James Watson,

Francis Crick, and John Kendrew, all later to receive Nobel Prizes (Watson and Crick in

Physiology or Medicine, Perutz and Kendrew in Chemistry). The transfer of skills in x-ray diffraction was facilitated by the unusual step

of the establishment of a Medical Research Council unit at the Cavendish, headed by

Perutz but under the general direction of the physicist Lawrence Bragg [Francis Crick,

What Mad Pursuit, Penguin 1988, p. 23. James Watson later reported Bragg’s

obvious delight over “...the fact that the X-ray method he had developed forty

years before was at the heart of a profound insight into the nature of life itself.”

The Double Helix (Simon &amp; Schuster, 1968), p. 220]. To infer the three-dimensional

structure of very large-molecule proteins by the new technique of x-ray crystallography,

which offered only two-dimensional photographs of highly complex molecules, appears

to have been a hellishly difficult enterprise, but it provided much of the basis for the new

discipline of molecular biology. Rosalind Franklin, who, sadly, died very young, is widely

agreed to have been the most skilful practitioner of x-ray crystallography.
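
The physical basis of the technique is captured by Bragg's law,

n \lambda = 2 d \sin \theta,

which relates the wavelength \lambda of the incident x-rays and the angle \theta at which constructive interference produces a diffraction spot to the spacing d between planes of atoms in the crystal. Inferring the many values of d implicit in a pattern of spots is what allowed three-dimensional structures to be reconstructed from two-dimensional photographs, the hellishly difficult enterprise just referred to.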

Moreover, it is important to observe that the two separate communities - university

scientists (including medical school clinicians) and commercial instrument makers -

interacted with and influenced one another in ways that were truly symbiotic. Precisely

because these two communities marched to the tunes of very different drummers, each

was ultimately responsible for innovative improvements that could not have been

achieved by the other acting alone [See Annetine Gelijns and

Nathan Rosenberg, “Diagnostic Devices: An Analysis of Comparative

Advantages,” chapter 8 in David Mowery and Richard Nelson (eds.), Sources of

Industrial Leadership]. It should be added that the applications of physics research

have usually moved more readily across disciplinary boundary lines in industry than

they have in the academic world. Profit-making firms are not particularly concerned with

where those boundary lines have been drawn in the academic world; they tend to

search for solutions to problems regardless of where those solutions might be found

[NRC 1986].

Thus, the technological realm has not only played a major role in setting the research

agenda for science, as I have argued. Technology has also provided new and

immensely more powerful research tools than existed in earlier centuries, as is obvious

by mere reference to electron microscopy in the study of the micro-universe, to the

Hubble telescope in the study of the macro-universe, and to the laser, which has

become the most powerful research instrument throughout the realm of the science of

chemistry. In addition, the laser has found a wide range of uses in medical care.

Finally, since this article has been written within easy walking distance of the Stanford

Linear Accelerator, it seems appropriate to close with the following observation: in the

realm of modern physics it appears that the rate of scientific progress has been largely

determined by the availability of improved experimental technologies. In the succinct

formulation of Wolfgang Panofsky, the first director of SLAC: "Physics is generally

paced by technology and not by the physical laws. We always seem to ask more

questions than we have tools to answer." Exactly.

