Salvaging Science

SUBHEAD: The time of specialization for professional scientists is about over. Once again it falls to the amateur observer to carry on the scientific tradition.

[IB Editor's note: This is the latter portion of Greer's long current article. The first six paragraphs can be found at his website, which is linked below.]

By John Michael Greer on 5 August 2011 for the ArchDruid Report - (http://thearchdruidreport.blogspot.com/2011/08/salvaging-science.html)
   
Image above: A collection of 19th century amateur science apparatuses. From (http://www.sas.org/index.html).

It’s rarely remembered these days that until quite recently, scientific research was mostly carried on by amateurs. The word “scientist” wasn’t even coined until 1833; before then, and for some time after, the research programs that set modern science on its way were carried out by university professors in other disciplines, middle-class individuals with spare time on their hands, and wealthy dilettantes for whom science was a more interesting hobby than horse racing or politics.

Isaac Newton, for example, taught mathematics at Cambridge; Gilbert White founded the science of ecology with his Natural History of Selborne in his spare time as a clergyman; Charles Darwin came from a family with a share of the Wedgwood pottery fortune, had a clergyman’s education, and paid his own way around the world on the H.M.S. Beagle.

It took a long time for science as a profession to catch on, because—pace a myth very widespread these days—science contributed next to nothing to the technological revolutions that swept the western world in the eighteenth and nineteenth centuries. Until late in the nineteenth century, in fact, things generally worked the other way around: engineers and basement tinkerers discovered some exotic new effect, and then scientists scrambled to figure out what made it happen.

James Clerk Maxwell, whose 1873 Treatise on Electricity and Magnetism finally got out ahead of the engineers to postulate the effects that would become the basis for radio, began the process by which science took the lead in technological innovation, but it wasn’t until the Second World War that science matured into the engine of discovery it has been ever since. It was then that government and business investment in basic research took off, creating the institutionalized science of the present day.

Throughout the twentieth century, investment in scientific research proved to be a winning bet on the grand scale; it won wars, made fortunes, and laid the groundwork for today’s high-tech world. It’s a common belief these days that more of the same will yield more of the same—that more scientific research will make it possible to fix the world’s energy problems and, just maybe, its other problems as well. Popular as that view is, there’s good reason to doubt it. The core problem is that scientific research was necessary, but not sufficient, to create today’s industrial societies.

Cheap abundant energy was also necessary, and was arguably the key factor. In a very real sense, the role of science from the middle years of the nineteenth century on was basically figuring out new ways to use the torrents of energy that came surging out of wells and mines to power history’s most extravagant boom.

Lacking all that energy, the technological revolutions of the last few centuries very likely wouldn’t have happened at all; the steam turbine, remember, was known to the Romans in the form of Hero of Alexandria’s aeolipile, and they did nothing with it because all the fuel they knew about was committed to other uses. Since the sources of fuel we’ll have after fossil fuels finish depleting are pretty much the same as the ones the Romans had, and we can also expect plenty of pressing needs for the energy sources that remain, it takes an essentially religious faith in the inevitability of progress to believe that another wave of technological innovation is right around the corner.

The end of the age of cheap abundant energy is thus also likely to be the end of the age in which science functions as a force for economic expansion. There are at least two other factors pointing in the same direction, though, and they need to be grasped to make sense of the predicament we’re in. First, science itself is well into the territory of diminishing returns, and most of the way through the normal life cycle of a human method of investigation.

What last week’s post described as abstraction, the form of intellectual activity that seeks to reduce the complexity of experience into a set of precisely formulated generalizations, always depends on such a method. Classical logic is another example, and it’s particularly useful here because it completed its life cycle long ago and so can be studied along its whole trajectory through time. Logic, like the scientific method, was originally the creation of a movement of urban intellectuals in a society emerging from a long and troubled medieval period.

Around the eighth century BCE, ancient Greece had finally worked out a stable human ecology that enabled it to finish recovering from the collapse of Mycenaean society some four centuries before; olive and grapevine cultivation stabilized what was left of the fragile Greek soil and produced cash crops eagerly sought by markets around the eastern Mediterranean, bringing in a flood of wealth; the parallel with the rapidly expanding European economies of the years when modern science first took shape is probably not coincidental.

Initial ventures in the direction of what would become Greek logic explored various options, some more successful than others; by the fifth century BCE, what we may as well call the logical revolution was under way, and the supreme triumphs of logical method occupied the century that followed. Arithmetic, geometry, music theory, and astronomy underwent revolutionary developments.

That’s roughly where the logical revolution ground to a halt, too, and the next dozen centuries or so saw little further progress. There were social factors at work, to be sure, but the most important factor was inherent in the method: using the principles of logic as the Greeks understood them, there’s only so far you can go.

Logical methods that had proved overwhelmingly successful against longstanding problems in mathematics worked far less well on questions about the natural world, and efforts to solve the problems of human life as though they were logical syllogisms tended to flop messily. Once the belief in the omnipotence of logic was punctured, on the other hand, it became possible to sort out what it could and couldn’t do, and—not coincidentally—to assign it a core place in the educational curriculum, a place it kept right up until the dawn of the modern world.

I know it’s utter heresy even to hint at this, but I’d like to suggest that science, like logic before it, has gotten pretty close to its natural limits as a method of knowledge.

In Darwin’s time, a century and a half ago, it was still possible to make world-shaking scientific discoveries with equipment that would be considered hopelessly inadequate for a middle school classroom nowadays; there was still a lot of low-hanging fruit to be picked off the tree of knowledge. At this point, by contrast, the next round of experimental advances in particle physics depends on the Large Hadron Collider, a European project with an estimated total price tag around $5.5 billion. Many other branches of science have reached the point at which very small advances in knowledge are being made with very large investments of money, labor, and computing power.

Doubtless there will still be surprises in store, but revolutionary discoveries are very few and far between these days. Yet there’s another factor pressing against the potential advancement of science, and it’s one that very few scientists like to talk about. When science was drawn up into the heady realms of politics and business, it became vulnerable to the standard vices of those realms, and one of the consequences has been a great deal of overt scientific fraud. A study published last year in the Journal of Medical Ethics surveyed papers formally retracted between 2000 and 2010 in the health sciences.

About a quarter of them were retracted for scientific fraud, and half of these had a first author who had had another paper previously retracted for scientific fraud. Coauthors of these repeat offenders had, on average, three other papers each that had been retracted. Americans, it may be worth noting, had papers retracted for fraud, and turned up as repeat offenders, far more often than their overseas colleagues. I don’t know how many of my readers were taught, as I was, that science is inherently self-policing and that any researcher who stooped to faking data would inevitably doom his career.

Claims like these are difficult to defend in the face of numbers of the sort just cited. Logic went through the same sort of moral collapse in its time; the English word “sophistry” commemorates the expert debaters of fourth-century BCE Greece who could and did argue with sparkling logic for anyone who would pay them. To be fair, scientists as a class would have needed superhuman virtue to overcome the temptations of wealth, status, and influence proffered them in the post-Second World War environment, and it’s also arguably true that the average morality of scientists well exceeds that of businesspeople or politicians.

That still leaves room for a good deal of duplicity, and it’s worth noting that this has not escaped the attention of the general public. It’s an item of common knowledge these days that the court testimony or the political endorsement of a qualified scientist, supporting any view you care to name, can be had for the cost of a research grant or two.

 I’m convinced that this is the hidden subtext in the spreading popular distrust of science that is such a significant feature in our public life: a great many Americans, in particular, have come to see scientific claims as simply one more rhetorical weapon brandished by competing factions in the social and political struggles of our day.

This is unfortunate, because—like logic—the scientific method is a powerful resource; like logic, again, there are things it can do better than any other creation of the human mind, and some of those things will be needed badly in the years ahead of us. Between the dumping of excess specializations in a contracting economy, the diminishing returns of scientific research itself, and the spreading popular distrust of science as currently practiced, the likelihood that any significant fraction of today’s institutional science will squeeze through the hard times ahead is minimal at best.

What that leaves, it seems to me, is a return to the original roots of science as an amateur pursuit. There are still some corners of the sciences—typically those where there isn’t much money in play—that are open to participation by amateurs.

There are also quite a few branches of scientific work that are scarcely being done at all these days—again, because there isn’t much money in play—and their number is likely to increase as funding cuts continue.

To my mind, one of the places where these trends intersect with the needs of the future is in local natural history and ecology, the kind of close study of nature’s patterns that launched the environmental sciences, back in the day. To cite an example very nearly at random, it would take little more than a microscope, a notebook, and a camera to do some very precise studies of the effect of organic gardening methods on soil microorganisms, beneficial and harmful insects, and crop yields, or to settle once and for all the much-debated question of whether adding biochar to garden soil has any benefits in temperate climates. These are things the green wizards of the future are going to need to be able to figure out.
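For gardeners inclined to put numbers behind such trials, the analysis can be kept simple. The sketch below, in Python, shows one hypothetical way to compare yields from biochar-amended beds against untreated controls using Welch’s t-test; the yield figures are invented placeholders, not data from any actual study, and a real trial would substitute its own season’s measurements.

```python
# A minimal sketch of analyzing a backyard biochar trial.
# The yield figures below are invented placeholders; substitute
# measurements from your own paired garden beds.

from statistics import mean, stdev
from scipy.stats import ttest_ind  # Welch's t-test for unequal variances

# Hypothetical yields in kilograms per bed over one season.
biochar_beds = [4.1, 3.8, 4.5, 4.0, 3.9, 4.3]  # beds amended with biochar
control_beds = [3.7, 4.0, 3.6, 3.9, 3.5, 3.8]  # untreated control beds

# Welch's test does not assume the two groups share a variance.
t_stat, p_value = ttest_ind(biochar_beds, control_beds, equal_var=False)

print(f"biochar mean: {mean(biochar_beds):.2f} kg (sd {stdev(biochar_beds):.2f})")
print(f"control mean: {mean(control_beds):.2f} kg (sd {stdev(control_beds):.2f})")
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")

# A small p-value suggests the yield difference is unlikely to be
# chance alone, though a handful of beds in one garden is a starting
# point for further trials, not a settled answer.
```

Even a test this modest, repeated over several seasons and compared among local groups, is exactly the kind of patient record-keeping on which amateur natural history once thrived.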

 With much scientific research in America moving in what looks uncomfortably like a death spiral, the only way those skills are likely to make it across the crisis ahead of us is if individuals and local groups pick them up and pass them on to others. Now is probably not too soon to get started, either.
