The Crisis of Modern Science

With both philosophy and religion in such problematic condition, it was science alone that seemed to rescue the modern mind from pervasive uncertainty. Science achieved a golden age in the nineteenth and early twentieth centuries, with extraordinary advances in all its major branches, with widespread institutional and academic organization of research, and with practical applications rapidly proliferating on the basis of a systematic linkage of science with technology. The optimism of the age was directly tied to confidence in science and in its powers to improve indefinitely the state of human knowledge, health, and general welfare.
Religion and metaphysics continued their long, slow decline, but science’s ongoing—indeed, accelerating—progress could not be doubted. Its claims to valid knowledge of the world, even subject to the critique of post-Kantian philosophy, continued to seem not only plausible but scarcely questionable. In the face of science’s supreme cognitive effectiveness and the rigorously impersonal precision of its explanatory structures, religion and philosophy were compelled to define their positions in relation to science, just as, in the medieval era, science and philosophy had been compelled to do so in relation to the culturally more powerful conceptions of religion. For the modern mind, it was science that presented the most realistic and reliable world picture—even if that picture was limited to “technical” knowledge of natural phenomena, and despite its existentially disjunctive implications. But two developments in the course of the twentieth century radically changed science’s cognitive and cultural status, one theoretical and internal to science, the other pragmatic and external.
In the first instance, the classical Cartesian-Newtonian cosmology gradually and then dramatically broke down under the cumulative impact of several astonishing developments in physics. Beginning in the later nineteenth century with Maxwell’s work with electromagnetic fields, the Michelson-Morley experiment, and Becquerel’s discovery of radioactivity, then in the early twentieth century with Planck’s isolation of quantum phenomena and Einstein’s special and general theories of relativity, and culminating in the 1920s with the formulation of quantum mechanics by Bohr, Heisenberg, and their colleagues, the long-established certainties of classical modern science were radically undermined. By the end of the third decade of the twentieth century, virtually every major postulate of the earlier scientific conception had been controverted: the atoms as solid, indestructible, and separate building blocks of nature, space and time as independent absolutes, the strict mechanistic causality of all phenomena, the possibility of objective observation of nature. Such a fundamental transformation in the scientific world picture was staggering, and for no one was this more true than the physicists themselves. Confronted with the contradictions observed in subatomic phenomena, Einstein wrote: “All my attempts to adapt the theoretical foundation of physics to this knowledge failed completely. It was as if the ground had been pulled out from under one, with no firm foundation to be seen anywhere upon which one could have built.” Heisenberg similarly realized that “the foundations of physics have started moving . . . [and] this motion has caused the feeling that the ground would be cut from science.”
The challenge to previous scientific assumptions was deep and multiple: The solid Newtonian atoms were now discovered to be largely empty. Hard matter no longer constituted the fundamental substance of nature. Matter and energy were interchangeable. Three-dimensional space and unidimensional time had become relative aspects of a four-dimensional space-time continuum. Time flowed at different rates for observers moving at different speeds. Time slowed down near heavy objects, and under certain circumstances could stop altogether. The laws of Euclidean geometry no longer provided the universally necessary structure of nature. The planets moved in their orbits not because they were pulled toward the Sun by an attractive force acting at a distance, but because the very space in which they moved was curved. Subatomic phenomena displayed a fundamentally ambiguous nature, observable both as particles and as waves. The position and momentum of a particle could not be precisely measured simultaneously. The uncertainty principle radically undermined and replaced strict Newtonian determinism. Scientific observation and explanation could not proceed without affecting the nature of the object observed. The notion of substance dissolved into probabilities and “tendencies to exist.” Nonlocal connections between particles contradicted mechanistic causality. Formal relations and dynamic processes replaced hard discrete objects. The physical world of twentieth-century physics resembled, in Sir James Jeans’s words, not so much a great machine as a great thought.
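In standard notation, the relations invoked here—the equivalence of mass and energy, the slowing of a moving clock, and Heisenberg’s uncertainty principle—are commonly written as

$$E = mc^{2}, \qquad \Delta t = \frac{\Delta t_{0}}{\sqrt{1 - v^{2}/c^{2}}}, \qquad \Delta x \, \Delta p \;\geq\; \frac{\hbar}{2},$$

where $c$ is the speed of light, $\Delta t_{0}$ is the interval measured by the moving clock itself, and $\hbar$ is the reduced Planck constant: mass convertible into energy, time dilated by motion, and position and momentum jointly indeterminate.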
The consequences of this extraordinary revolution were again ambiguous. The continuing modern sense of intellectual progress, leaving behind the ignorance and misconceptions of past eras while reaping the fruits of new concrete technological results, was again bolstered. Even Newton had been corrected and improved upon by the ever-evolving, increasingly sophisticated modern mind. Moreover, to the many who had regarded the scientific universe of mechanistic and materialistic determinism as antithetical to human values, the quantum-relativistic revolution represented an unexpected and welcome broaching of new intellectual possibilities. Matter’s former hard substantiality had given way to a reality perhaps more conducive to a spiritual interpretation. Freedom of the human will seemed to be given a new foothold if subatomic particles were indeterminate. The principle of complementarity governing waves and particles suggested its broader application in a complementarity between mutually exclusive ways of knowledge, like religion and science. Human consciousness, or at least human observation and interpretation, seemed to be given a more central role in the larger scheme of things with the new understanding of the subject’s influence on the observed object. The deep interconnectedness of phenomena encouraged a new holistic thinking about the world, with many social, moral, and religious implications. Increasing numbers of scientists began to question modern science’s pervasive, if often unconscious, assumption that the intellectual effort to reduce all reality to the smallest measurable components of the physical world would eventually reveal that which was most fundamental in the universe. The reductionist program, dominant since Descartes, now appeared to many to be myopically selective, and likely to miss that which was most significant in the nature of things.
Yet such inferences were neither universal nor even widespread among practicing physicists. Modern physics was perhaps open to a spiritual interpretation, but did not necessarily compel it. Nor was the larger population intimately conversant with the arcane conceptual changes wrought by the new physics. Moreover, for several decades the revolution in physics did not result in comparable theoretical transformations in the other natural and social sciences, although their theoretical programs had been based largely on the mechanistic principles of classical physics. Nevertheless, many felt that the old materialistic world view had been irrevocably challenged, and that the new scientific models of reality offered possible opportunities for a fundamental rapprochement with man’s humanistic aspirations.
Yet these ambiguous possibilities were countered by other, more disturbing factors. To begin with, there was now no coherent conception of the world, comparable to Newton’s Principia, that could theoretically integrate the complex variety of new data. Physicists failed to come to any consensus as to how the existing evidence should be interpreted with respect to defining the ultimate nature of reality. Conceptual contradictions, disjunctions, and paradoxes were ubiquitous, and stubbornly evaded resolution. 2 A certain irreducible irrationality, already recognized in the human psyche, now emerged in the structure of the physical world itself. To incoherence was added unintelligibility, for the conceptions derived from the new physics not only were difficult for the layperson to comprehend, they presented seemingly insuperable obstacles to the human intuition generally: a curved space, finite yet unbounded; a four-dimensional space-time continuum; mutually exclusive properties possessed by the same subatomic entity; objects that were not really things at all but processes or patterns of relationship; phenomena that took no decisive shape until observed; particles that seemed to affect each other at a distance with no known causal link; the existence of fundamental fluctuations of energy in a total vacuum.
Moreover, for all the apparent opening of the scientific understanding to a less materialistic and less mechanistic conception, there was no real change in the essential modern dilemma: The universe was still an impersonal vastness in which man with his peculiar capacity for consciousness was still an ephemeral, inexplicable, randomly produced minutia. Nor was there any compelling answer to the looming question as to what ontological context preceded or underlay the “big-bang” birth of the universe. Nor did leading physicists believe that the equations of quantum theory described the actual world. Scientific knowledge was confined to abstractions, mathematical symbols, “shadows.” Such knowledge was not of the world itself, which now more than ever seemed beyond the compass of human cognition.
Thus in certain respects the intellectual contradictions and obscurities of the new physics only heightened the sense of human relativity and alienation growing since the Copernican revolution. Modern man was being forced to question his inherited classical Greek faith that the world was ordered in a manner clearly accessible to the human intelligence. In the physicist P. W. Bridgman’s words, “the structure of nature may eventually be such that our processes of thought do not correspond to it sufficiently to permit us to think about it at all. . . . The world fades out and eludes us. . . . We are confronted with something truly ineffable. We have reached the limit of the vision of the great pioneers of science, the vision, namely, that we live in a sympathetic world in that it is comprehensible by our minds.” 3 Philosophy’s conclusion was becoming science’s as well: Reality may not be structured in any way the human mind can objectively discern. Thus incoherence, unintelligibility, and an insecure relativism compounded the earlier modern predicament of human alienation in an impersonal cosmos.
**********
When relativity theory and quantum mechanics undid the absolute certainty of the Newtonian paradigm, science demonstrated, in a way that Kant as a convinced Newtonian could never have anticipated, the validity of Kant’s skepticism concerning the human mind’s capacity for certain knowledge of the world in itself. Because he was certain of the truth of Newtonian science, Kant had argued that the categories of human cognition congruent with that science were themselves absolute, and these alone provided a basis for the Newtonian achievement, as well as for man’s epistemological competence in general. But with twentieth-century physics, the bottom fell out of Kant’s last certainty. The fundamental Kantian a prioris—space, time, substance, causality—were no longer applicable to all phenomena. The scientific knowledge that had seemed after Newton to be universal and absolute had to be recognized after Einstein, Bohr, and Heisenberg as limited and provisional. So too did quantum mechanics reveal in unexpected fashion the radical validity of Kant’s thesis that the nature described by physics was not nature in itself but man’s relation to nature—i.e., nature as exposed to man’s form of questioning.
What had been implicit in Kant’s critique, but obscured by the apparent certainty of Newtonian physics, now became explicit: Because induction can never render certain general laws, and because scientific knowledge is a product of human interpretive structures that are themselves relative, variable, and creatively employed, and finally because the act of observation in some sense produces the objective reality science attempts to explicate, the truths of science are neither absolute nor unequivocally objective. In the combined wake of eighteenth-century philosophy and twentieth-century science, the modern mind was left free of absolutes, but also disconcertingly free of any solid ground.
This problematic conclusion was reinforced by a newly critical approach to the philosophy and history of science, influenced above all by the work of Karl Popper and Thomas Kuhn. Drawing on the insights of Hume and Kant, Popper noted that science can never produce knowledge that is certain, or even probable. Man observes the universe as a stranger, making imaginative guesses about its structure and workings. He cannot approach the world without such bold conjectures in the background, for every observed fact presupposes an interpretive focus. In science, these conjectures must be continually and systematically tested; yet however many tests are successfully passed, no theory can ever be viewed as more than an imperfectly corroborated conjecture. At any time, a new test could falsify it. No scientific truth is immune to such a possibility. Even the basic facts are relative, always potentially subject to a radical reinterpretation in a new framework. Man can never claim to know the real essences of things. Before the virtual infinitude of the world’s phenomena, human ignorance itself is infinite. The wisest strategy is to learn from one’s inevitable mistakes.
But while Popper maintained the rationality of science by upholding its fundamental commitment to rigorous testing of theories, its fearless neutrality in the quest for truth, Kuhn’s analysis of the history of science tended to undercut even that security. Kuhn agreed that all scientific knowledge required interpretive structures based on fundamental paradigms or conceptual models that allowed researchers to isolate data, elaborate theories, and solve problems. But citing many examples in the history of science, he pointed out that the actual practice of scientists seldom conformed to Popper’s ideal of systematic self-criticism by means of attempted falsification of existing theories. Instead, science typically proceeded by seeking confirmations of the prevailing paradigm—gathering facts in the light of that theory, performing experiments on its basis, extending its range of applicability, further articulating its structure, attempting to clarify residual problems. Far from subjecting the paradigm itself to constant testing, normal science avoided contradicting it by routinely reinterpreting conflicting data in ways that would support the paradigm, or by neglecting such awkward data altogether. To an extent never consciously recognized by scientists, the nature of scientific practice makes its governing paradigm self-validating. The paradigm acts as a lens through which every observation is filtered, and is maintained as an authoritative bulwark by common convention. Through teachers and texts, scientific pedagogy sustains the inherited paradigm and ratifies its credibility, tending to produce a firmness of conviction and theoretical rigidity not unlike an education in systematic theology.
Kuhn further argued that when the gradual accumulation of conflicting data finally produces a paradigm crisis and a new imaginative synthesis eventually wins scientific favor, the process by which that revolution takes place is far from rational. It depends as much on the established customs of the scientific community, on aesthetic, psychological, and sociological factors, on the presence of contemporary root metaphors and popular analogies, on unpredictable imaginative leaps and “gestalt switches,” even on the aging and dying of conservative scientists, as on disinterested tests and arguments. For in fact the rival paradigms are seldom genuinely comparable; they are selectively based on differing modes of interpretation and hence different sets of data. Each paradigm creates its own gestalt, so comprehensive that scientists working within different paradigms seem to be living in different worlds. Nor is there any common measure, such as problem-solving ability or theoretical coherence or resistance to falsification, that all scientists agree upon as a standard for comparison. What is an important problem for one group of scientists is not for another. Thus the history of science is not one of linear rational progress moving toward ever more accurate and complete knowledge of an objective truth, but is one of radical shifts of vision in which a multitude of non-rational and non-empirical factors play crucial roles. Whereas Popper had attempted to temper Hume’s skepticism by demonstrating the rationality of choosing the most rigorously tested conjecture, Kuhn’s analysis served to restore that skepticism. 4
With these philosophical and historical critiques and with the revolution in physics, a more tentative view of science became widespread in intellectual circles. Science was still patently effective and powerful in its knowledge, but scientific knowledge was now regarded as, in several senses, a relative matter. The knowledge science rendered was relative to the observer, to his physical context, to his science’s prevailing paradigm and his own theoretical assumptions. It was relative to his culture’s prevailing belief system, to his social context and psychological predispositions, to his very act of observation. And science’s first principles might be overturned at any point in the face of new evidence. Moreover, by the later twentieth century, the conventional paradigm structures of other sciences, including the Darwinian theory of evolution, were coming under increasing pressure from conflicting data and alternative theories. Above all, the bedrock certainty of the Cartesian-Newtonian world view, for centuries the acknowledged epitome and model of human knowledge and still pervasively influential in the cultural psyche, had been shattered. And the post-Newtonian world order was neither intuitively accessible nor internally coherent—indeed, scarcely an order at all.
**********
Yet for all this, science’s cognitive status would still have retained its unquestioned preeminence for the modern mind. Scientific truth might be increasingly esoteric and only provisional, but it was a testable truth, continually being improved and more accurately formulated, and its practical effects in the form of technological progress—in industry, agriculture, medicine, energy production, communication and transportation—provided tangible public evidence for science’s claims to render viable knowledge of the world. But it was, paradoxically, this same tangible evidence that was to prove crucial in an antithetical development; for it was when the practical consequences of scientific knowledge could no longer be judged exclusively positive that the modern mind was forced to reevaluate its previously wholehearted trust in science.
As early as the nineteenth century, Emerson had warned that man’s technical achievements might not be unequivocally in his own best interests: “Things are in the saddle and ride mankind.” By the turn of the century, just as technology was producing new wonders like the automobile and the widespread application of electricity, a few observers began to sense that such developments might signal an ominous reversal of human values. By the mid-twentieth century, modern science’s brave new world had started to become subject to wide and vigorous criticism: Technology was taking over and dehumanizing man, placing him in a context of artificial substances and gadgets rather than live nature, in an unaesthetically standardized environment where means had subsumed ends, where industrial labor requirements entailed the mechanization of human beings, where all problems were perceived as soluble by technical research at the expense of genuine existential responses. The self-propelling and self-augmenting imperatives of technical functioning were dislodging man and uprooting him from his fundamental relation to the Earth. Human individuality seemed increasingly tenuous, disappearing under the impact of mass production, the mass media, and the spread of a bleak and problem-ridden urbanization. Traditional structures and values were crumbling. With an unending stream of technological innovations, modern life was subject to an unprecedentedly disorienting rapidity of change. Gigantism and turmoil, excessive noise, speed, and complexity dominated the human environment. The world in which man lived was becoming as impersonal as the cosmos of his science. With the pervasive anonymity, hollowness, and materialism of modern life, man’s capacity to retain his humanity in an environment determined by technology seemed increasingly in doubt. For many, the question of human freedom, of mankind’s ability to maintain mastery over its own creation, had become acute.
But compounding these humanistic critiques were more disturbingly concrete signs of science’s untoward consequences. The critical contamination of the planet’s water, air, and soil, the manifold harmful effects on animal and plant life, the extinction of innumerable species, the deforestation of the globe, the erosion of topsoil, the depletion of groundwater, the vast accumulation of toxic wastes, the apparent exacerbation of the greenhouse effect, the breakdown of the ozone layer in the atmosphere, the radical disruption of the entire planetary ecosystem—all these emerged as direly serious problems with increasing force and complexity. From even a short-term human perspective, the accelerating depletion of irreplaceable natural resources had become an alarming phenomenon. Dependence on foreign supplies of vital resources brought a new precariousness into global political and economic life. New banes and stresses to the social fabric continued to appear, directly or indirectly tied to the advance of a scientific civilization—urban overdevelopment and overcrowding, cultural and social rootlessness, numbingly mechanical labor, increasingly disastrous industrial accidents, automobile and air travel fatalities, cancer and heart disease, alcoholism and drug addiction, mind-dulling and culture-impoverishing television, growing levels of crime, violence, and psychopathology. Even science’s most cherished successes paradoxically entailed new and pressing problems, as when the medical relief of human illness and lowering of mortality rates, combined with technological strides in food production and transportation, in turn exacerbated the threat of global overpopulation. In other cases, the advance of science presented new Faustian dilemmas, as in those surrounding the unforeseeable future uses of genetic engineering. More generally, the scientifically unfathomed complexity of all relevant variables—whether in global or local environments, in social systems, or in the human body—made the consequences of technological manipulation of those variables unpredictable and often pernicious.
All these developments had reached an early and ominous proleptic climax when natural science and political history conspired to produce the atomic bomb. It seemed supremely, if tragically, ironic that the Einsteinian discovery of the equivalence of mass and energy, by which a particle of matter could be converted into an immense quantity of energy—a discovery by a dedicated pacifist reflecting a certain apex of human intellectual brilliance and creativity—precipitated for the first time in history the prospect of humanity’s self-extinction. With the dropping of atomic bombs on the civilians of Hiroshima and Nagasaki, faith in science’s intrinsic moral neutrality, not to say its unlimited powers of benign progress, could no longer be upheld. During the protracted and tense global schism of the Cold War that followed, the numbers of unprecedentedly destructive nuclear missiles relentlessly multiplied until the entire planet could be devastated many times over. Civilization itself was now brought into peril by virtue of its own genius. The same science that had dramatically lessened the hazards and burdens of human survival now presented to human survival its gravest menace.
The great succession of science’s triumphs and cumulative progress was now shadowed by a new sense of science’s limits, its dangers, and its culpability. The modern scientific mind found itself beleaguered on several fronts at once: by the epistemological critiques, by its own theoretical problems arising in a growing number of fields, by the increasingly urgent psychological necessity of integrating the modern outlook’s human-world divide, and above all by its adverse consequences and intimate involvement in the planetary crisis. The close association of scientific research with the political, military, and corporate establishments continued to belie science’s traditional self-image of detached purity. The very concept of “pure science” was now criticized by many as entirely illusory. The belief that the scientific mind had unique access to the truth of the world, that it could register nature like a perfect mirror reflecting an extra-historical, universal objective reality, was seen not only as epistemologically naive, but also as serving, either consciously or unconsciously, specific political and economic agendas, often allowing vast resources and intelligence to be commandeered for programs of social and ecological domination. The aggressive exploitation of the natural environment, the proliferation of nuclear weaponry, the threat of global catastrophe—all pointed to an indictment of science, of human reason itself, now seemingly in thrall to man’s own self-destructive irrationality.
If all scientific hypotheses were to be rigorously and disinterestedly tested, then it seemed that the “scientific world view” itself, the governing meta-hypothesis of the modern era, was being decisively falsified by its deleterious and counterproductive consequences in the empirical world. The scientific enterprise, which in its earlier stages had presented a cultural predicament—philosophical, religious, social, psychological—had now provoked a biological emergency. The optimistic belief that the world’s dilemmas could be solved simply by scientific advance and social engineering had been confounded. The West was again losing its faith, this time not in religion but in science and in the autonomous human reason.
Science was still valued, in many respects still revered. But it had lost its untainted image as humanity’s liberator. It had also lost its long-secure claims to virtually absolute cognitive reliability. With its productions no longer exclusively benign, with its reductionist understanding of the natural environment apparently deficient, with its evident susceptibility to political and economic bias, the previously unqualified trustworthiness of scientific knowledge could no longer be affirmed. On the basis of these several interacting factors, something like Hume’s radical epistemological skepticism—mixed with a relativized Kantian sense of a priori cognitive structures—seemed publicly vindicated. After modern philosophy’s acute epistemological critique, the principal remaining foundation for reason’s validity had been its empirical support by science. The philosophical critique alone had been in effect an abstract exercise, without definite influence on the larger culture or on science, and would have remained so had the scientific enterprise continued to be so unequivocally positive in its practical and cognitive progress. But with science’s concrete consequences so problematic, reason’s last foundation was now infirm.
Many thoughtful observers, not just professional philosophers, were forced to reevaluate the status of human knowledge. Man might think he knows things, scientifically or otherwise, but there was clearly no guarantee for this: he had no a priori rational access to universal truths; empirical data were always theory-soaked and relative to the observer; and the previously reliable scientific world view was open to fundamental question, for that conceptual framework was evidently both creating and exacerbating problems for humanity on a global scale. Scientific knowledge was stupendously effective, but those effects suggested that much knowledge from a limited perspective could be a very dangerous thing.