"Know then thyself, presume not gods to scan. The proper study of mankind is man."
—Alexander Pope
We science fiction people often preen ourselves over SF's successful predictions. The famous visit by the FBI to John Campbell's office during World War II; rockets and space travel; TV; etc. And in fact we haven't done too badly in the technological forecasting business; no worse than anyone else, anyway.
But we don't often mention our "predictions" in the social sciences.
Remember the Golden Age of science fiction? Those were the days when "psycho-history" was an exact science using real math; computers manipulated the calculus of values, and matrix algebra, and all that good stuff. Psychiatrists "cured" criminals; judges were physicians, not lawyers. The social ills of the nation, the world, aye, the universe were plugged into computers (big, massive ones, not the dinky little things IBM and DEC make nowadays) and lo! the answers came forth.
Those stories had their effect, at least on me: I decided I was going to be the Hari Seldon of the XXth Century. I wanted to use the very best tools available, so I studied math and physics and hard sciences, then formal logic and Boole's Laws of Thought, and Carnap's sentential calculus, and once I was tooled up came a perfect orgy of psychology and sociology and anthropology that ended with what used to be called a "terminal degree" (meaning that thought ceases with the Ph.D.?). I studied psychology at the University of Iowa, where they had not one but two schools of psychology, Hull's pseudo-mathematical "learning theory," and Kurt Lewin's "vector psychology." I kept wondering when they were going to use mathematics. Surely, thought I, there would come a time when they would give rigorous definitions; but no, what happened was that they took mathematical symbols and let them stand for some perfectly good English words—but without improving the precision of their definitions one whit. And even when they played math games with the resulting symbols (none of which could really be quantified), the most complicated function I ever saw was a simple algebraic equation.
But then there was statistics. That, we were told, is a tough subject. Well, given that it was taught daily at 0700 by a professor of education, it seemed tough; but in fact all that was taught was cookbook stat, how to compute mean, median, mode, standard deviation, and the like; and how to do cookbook tests using Student's t-test, and a peek at the Chi-square—not at the Chi-square distribution, of course. Heaven forbid that psychology students learn real mathematics; so actual calculations of probability not covered in the cookbook were beyond my classmates, and I suspect that if you ask the average Ph.D. in psych or sosh or anthro or ed what a probability density is, you'll either get a blank stare or hear something mumbled about specific gravities.
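For what it's worth, the entire cookbook curriculum now fits in a few lines. A sketch using only the Python standard library; the exam scores are invented for illustration:

```python
import statistics as st

# Invented exam scores, purely for illustration
scores = [70, 72, 75, 75, 78, 80, 83, 85, 90, 92]

mean = st.mean(scores)      # arithmetic average
median = st.median(scores)  # middle value of the sorted list
mode = st.mode(scores)      # most frequent value
sd = st.stdev(scores)       # sample standard deviation

# Student's t statistic for two independent samples, pooled variance:
# the kind of cookbook test the course drilled, minus the table lookup.
def t_statistic(a, b):
    na, nb = len(a), len(b)
    pooled = ((na - 1) * st.variance(a) + (nb - 1) * st.variance(b)) / (na + nb - 2)
    return (st.mean(a) - st.mean(b)) / (pooled * (1 / na + 1 / nb)) ** 0.5
```

Whether a difference is "significant" still means comparing that statistic against the t distribution with the right degrees of freedom; that part genuinely is probability theory, and it is exactly the part the cookbook skipped.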
In due time I wandered to the University of Washington. (Accident certainly plays a large part in one's life: I was in school on the Korean-type GI Bill, which paid a fixed sum per month; I hadn't lived with my parents since a year before I graduated from high school, but to save money I had to go to a state university as a resident; my parents had gravitated to Alaska, but the state of Washington had generously declared residents of Alaska to be Washingtonians; and therefore. . .)
The psychology department at the University of Washington had its schools, too: one was headed by a maniac who'd spent twenty years in the attic studying conditioned reflexes in chickens. The best-known man at Washington was Edwin Ray Guthrie, one of the "big three" in learning theory; at Iowa they'd taught us he was not merely wrong, but stupid. (My own opinion is that he was the only practical psychologist in the theory business; his theory can be stated in two sentences, and his practical deductions from the theory are almost absurdly simple; but they can be applied—and they work. That's another story for another time.) And finally there were a couple of professors who actually understood something of mathematics, and who seemed determined to apply real scientific method to the study of man.
One was Paul Horst, who had a contract from the US Navy; he was trying to predict the four-year grade point average of entering freshmen. Lest Proxmire read this and retroactively award Dr. Horst a Golden Fleece, let me point out that the Navy had—and has—a damned legitimate interest in predicting academic success. It costs a lot of money to send a recruit through specialized training such as electronics school; if you can choose from among the boots those likely to do well in the school, you'll save the cost of Dr. Horst's grant in no time.
Those were heady days. Horst's approach to the problem was to get every possible measure on every entering freshman, wait until they graduated, then flog hell out of the data. The goal was to find a series of weights to apply to each predictor such that when you did all the addition and multiplication you had an adequate prediction; and to do that for each of thirty possible majors! Here finally was a legitimate use for matrix algebra, which Horst required all of his students to take.
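In modern terms Horst was doing ordinary least squares with many predictors: collect the measurements into a matrix X, then solve the normal equations XᵀX w = Xᵀy for the weights. A minimal sketch with two predictors and wholly invented numbers; nothing here reproduces the actual Navy study:

```python
def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

# Each row: [1 (intercept), aptitude-test score, high-school GPA] -- invented
X = [[1, 520, 2.8], [1, 600, 3.4], [1, 680, 3.9], [1, 560, 3.0], [1, 640, 3.6]]
y = [2.5, 3.1, 3.8, 2.7, 3.4]  # four-year college GPA, also invented

n, p = len(X), len(X[0])
XtX = [[sum(X[k][i] * X[k][j] for k in range(n)) for j in range(p)] for i in range(p)]
Xty = [sum(X[k][i] * y[k] for k in range(n)) for i in range(p)]
w = solve(XtX, Xty)  # the predictor weights

def predict(test_score, hs_gpa):
    return w[0] + w[1] * test_score + w[2] * hs_gpa
```

Horst had an equation for each of thirty majors and sixty-odd predictors instead of two, which is why the matrix work was a job for a computer rather than a desk calculator.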
We also went to computer school, because inverting a 60 x 60 matrix is hairy. Of course the best computers in the world weren't very good; IBM thoughtfully gave the school a 650, but there weren't any programs to do what we wanted, and we grad students had to learn programming: not in easy languages like Basic or Fortran, which didn't exist; not even in modern assembly language; no, we had to do it in machine language.
Eventually it was done. (My part, as I recall, was a program that would invert triangular matrices; it took a whole summer to develop it, too.) Came the day when the great grade prediction program was to be run. Since the 650 had rather limited memory (on a drum at that) the programs made it punch intermediate answers on cards, which were then carried from the punch to the reader to be fed in again; we were up all night getting just part of the answer. But at last we had the equations: take an incoming freshman, subject same to a battery of tests, plug in high school grades and class standing, plug in a correction for the particular high school (and save the data, because that correction factor needed more cases from each school to give it more accuracy); put that into the computer; and out came a prediction of the grade that student would get after four years in each of about 30 major subjects.
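For the curious: inverting a triangular matrix is a forward-substitution loop. Here is the standard textbook method in a few lines of Python, a sketch rather than a reconstruction of the original 650 program:

```python
def invert_lower(L):
    """Invert a lower-triangular matrix by forward substitution:
    solve L * X = I one column of X at a time."""
    n = len(L)
    X = [[0.0] * n for _ in range(n)]
    for col in range(n):
        for row in range(n):
            # right-hand side is the identity: 1 on the diagonal, else 0
            s = 1.0 if row == col else 0.0
            s -= sum(L[row][k] * X[k][col] for k in range(row))
            X[row][col] = s / L[row][row]
    return X
```

On the 650 every one of those multiply-accumulate steps had to be hand-coded against the drum, which is presumably where the summer went.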
Only predictions, of course; now there was nothing for it but to wait four years and let those students graduate. Obviously each case would count only toward the predictions made for the major actually chosen; but enough of those would validate the predictors. Eventually there'd be enough data to validate the method used.
I'd left before the first students graduated, but I'm told it looked very good indeed; good enough that the University decided to give incoming students their predictions to help them choose majors.
And it hit the fan.
I don't know the current status of the grade prediction program at the UW; I gather it's moribund. It seems the predictions were racist. They were detrimental to some of the high schools (remember that correction factor?). They were also detrimental to certain departments, because they showed that students almost certain to flunk out in one of the difficult majors would do well in many of the soft sciences . . . (In certain majors there was not one single predicted flunk.)
So what's the point of all this?
Two points: one, it may just be possible to do really useful stuff in the social sciences; and two, it takes a lot of time, and it takes a lot of money; and if time and money shortages don't discourage that sort of thing, the next factor is almost certain to: it's hard work. It takes real knowledge of real hard stuff; much harder than sophomore stat and freshman calculus.
And do they require that sort of thing in social science departments? They do not. What they do require for a Ph.D. in psych is "History of Psychology"—a course in which you're required to learn, in great detail (the textbook was written by a man named Boring; portent enough, but the reality was worse), what all the "great thinkers" of the field believed. At the end of each section you find out why they were all wrong. It's as if to get a degree in chemistry you had to spend months learning about the phlogiston theory; as if physics required a three-week course in Democritus' beliefs about atomic structure. In other words, this required course is a confession: the discipline has so little content that they've invented this artificially difficult barrier so the doctorate won't be so easy to get.
Nor is that all: you can spend an entire quarter debating the difference between a "hypothetical construct" and an "intervening variable," a subject worth perhaps five minutes; you can learn a jargon designed to make your conversation incomprehensible, and which serves no purpose other than to see that someone from another discipline will be discouraged from trying his hand; and when it's all finished you are qualified to do what?
What indeed? What is a person with an undergraduate degree in psychology capable of doing? And psych is the tough one; if a B.S. in psychology is aptly named, what are we to make of sociology?
But maybe it's all just as well. Do you really want social science? Let me illustrate.
Probably the most controversial subject in the field involves IQ tests. What, if anything, do they mean? And since most IQ tests show a statistical difference between the races, shouldn't their use be forbidden? (Some courts have forbidden their use in university entry decisions for precisely that reason.)
And my Lord, the arguments that can develop! Nature versus nurture. Heredity versus environment. I listened to a paper on the subject presented by a Harvard professor at an AAAS meeting a couple of years ago, and by Roscoe the debate hasn't moved an inch since my undergraduate days.
Yet it wouldn't be hard to settle, would it? Not if the answer really were wanted.
When I took social sciences seriously, one experiment reported in the Tests and Measurement courses seemed really elegant: the twin studies. It's a simple experimental design. First locate a number of pairs of twins. What you want is identical twins reared together; identical twins reared apart; fraternal twins reared together; and fraternal twins reared apart. Those reared together shared roughly the same environment; while identical twins have identical heredity, unlike fraternal twins who are no more closely related than any other siblings. Go find a number in each category; not easy, but not so very difficult in this era of forms and dossiers.
Give them a number of tests. Ideally test everyone in their class at school, or job category at work, so that your subjects don't know they've been singled out. Then compare the results. What you're looking for is not absolute IQ, whatever that means, but point spread between pairs.
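The analysis is just as simple as the design. Here is the point-spread comparison in a few lines, with invented scores standing in for real data:

```python
# Toy twin-study analysis: for each category, the mean absolute IQ
# difference within pairs. Every score below is invented.
pairs = {
    "identical, reared together": [(102, 104), (96, 95), (110, 108)],
    "identical, reared apart":    [(101, 107), (93, 98), (112, 106)],
    "fraternal, reared together": [(99, 108), (90, 101), (105, 115)],
    "fraternal, reared apart":    [(97, 110), (88, 103), (104, 118)],
}

def mean_spread(ps):
    """Average point spread between the members of each pair."""
    return sum(abs(a - b) for a, b in ps) / len(ps)

spreads = {name: mean_spread(ps) for name, ps in pairs.items()}
```

If heredity dominates, identical pairs should show smaller spreads than fraternal pairs even when reared apart; if environment dominates, rearing together should matter more than zygosity. The ordering of the four numbers is the whole result.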
My Differential Psychology text reported such an experiment, and lo! the results were unambiguous. The least difference between pairs was identical reared together, as you'd expect; but then came identical reared apart, not fraternal together—suggesting strongly that heredity was more important than environment in determining what was being measured by the IQ tests.
I'm told that the classic experiment reported in my book was in error; that some of the data may have been fudged. Okay. That's possible. But instead of long debates with anecdotes, and speculations on whether those data were fudged, why not go do it again?
While we're at it, why not develop a really good grade prediction program? The computers exist. Lord knows there's enough money spent on tests. And there must be IQ data on millions of graduates of tax-supported institutions; that can be followed up to see if all that testing is worth anything. If it is, fine, use it to save time and effort and money; if not, fine again, abolish the silly tests; but what we actually do is ridiculous.
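The follow-up asked for here is mostly a correlation check: do the stored scores predict the later grades at all? A sketch with invented numbers; the coefficient, not the data, is the point:

```python
def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

iq = [95, 100, 105, 110, 120, 130]      # invented test scores
gpa = [2.4, 2.6, 2.9, 3.0, 3.3, 3.6]    # invented later grades
r = pearson(iq, gpa)
```

A coefficient near 1 says the tests carry predictive information worth keeping; one near 0 says abolish them. Either answer is useful; only not asking is ridiculous.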
Maybe we don't want successful prediction? Might good predictions of academic success have a baleful effect on the republic?
Aha. We're now in the realm of political "science", which once a long time ago meant the study of political philosophy and involved a great deal of history; nowadays the rage is "behavior", meaning that what was faddish in psychology twenty years ago has now caught on in poly sci; with about the same utility. Not that all political science courses are a waste of time; there's considerable value in discovering that most of the ideas and movements and problems we think are unique to our age have cropped up again and again in other times and places. One can also learn something about statesmanship and diplomacy, and even a bit about how to win an election. But there's damned little science in it.
There's sometimes not even common sense. Take the business about a political "left" and "right". It's easy to prove it's nonsense. There's absolutely no variable underlying that "spectrum"; indeed, I pretty well proved in my dissertation that it takes at least two variables at right angles to each other to map even the broadest political groupings each to a unique point. The whole idea of a "left" and "right" is nonsense—but it's still with us, and it has important consequences in the very real world. What in the world do British labor unions have in common with Soviet communism other than the vague feeling that both are "the left"? Are the Czechs better off under Soviet occupation than they were under the Nazis? Of all the stupid notions in academia, the "left-right" model of politics is demonstrably among the silliest; to flog an already-used example, it's as if the chemistry department allowed the rest of the faculty to go on believing in phlogiston. If political science can't manage even to stamp out that nonsensical notion, what can it do?
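The two-variables claim can be made concrete. Put four broad groupings at invented coordinates on two independent axes (the labels below roughly follow the dissertation's two axes, attitude toward the state and faith in planned social progress, but the coordinates are made up), and watch what any single "left-right" line does to them:

```python
import math

# Invented coordinates on two independent axes; labels are illustrative.
# x: attitude toward state power, y: faith in planned social progress.
groups = {
    "communist":         ( 5.0,  5.0),
    "fascist":           ( 5.0, -5.0),
    "classical liberal": (-5.0,  5.0),
    "anarchist":         (-5.0, -5.0),
}

def project(x, y, angle_deg):
    """Collapse a 2-D position onto a single axis drawn at the given
    angle -- which is what any one-dimensional 'spectrum' does."""
    t = math.radians(angle_deg)
    return x * math.cos(t) + y * math.sin(t)
```

Project onto the 45-degree line and the fascist and the classical liberal land on the very same point; pick other angles and the ordering of the four reshuffles entirely. No single line preserves the two-dimensional structure, which is the dissertation's argument in miniature.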
And that at last brings me to the point of this polemic.
There isn't any "social science." None. There are no experts, not in the same sense in which one can be an expert in physics, or chemistry. To the extent that science fiction has encouraged the notion that a science of human behavior exists, we have harmed the world.
We can survive the sociologists. We may even be able to survive the psychologists. The political scientists are a bit more dangerous, but they don't have all that much power: mostly I think of the good they could do (like using freshman poly sci to dispel some of the nearly-universal nonsense) and sigh over the waste.
Economists are another matter.
"It ain't what we don't know that hurts us, it's what we know for certain that ain't so."
The economists think they know. And between them and the lawyers, they run the country.
Hope springs eternal. Even after discovering that the useful content of academic psychology can be learned in under a year, and that political science, while enlightening and valuable for intellectual stimulation, was less scientific than psychology, I still yearned for Hari Seldon's laws. Perhaps economics? Economists at least say they're scientific. Grad students in economics talk about input-output models, aggregate economic analyses, "fine tuning" the economy; even in my day, they had complex systems of equations which, once they had computers, they could solve . . .
Alas, it's worse there than elsewhere. Look at some of those splendid computer models—and look at the results. They don't predict a damned thing. Hell's bells, as I write this they're wondering whether we're in a recession or not! Now sure, economists can explain everything after it's happened—but so can any of the social sciences. And the trouble with acting as if economists have some special knowledge is that they get in the way of common sense.
Look: it doesn't take much genius to see that minimum wages cause unemployment of the unskilled. You wouldn't hire at three dollars an hour someone capable of doing only two dollars' worth of work; why think anyone else will? Now sure, politicians might act cynically: raise the minimum wage, and count on inflation to negate the effect; but that's not science.
It's not a lot harder to see that high taxes encourage people to spend rather than save; if you want to curb inflation, reduce the tax rate.
They give Nobel Prizes in economics. The award is political, of course; people with diametrically opposite views have won it. If one's right, the other must be wrong. Or they both are.
The theory of state-supported education is that it's an investment in the future. The future citizens should have intelligent opinions and useful skills.
Some think this is the most important investment we can make.
So who allocates this most important investment? Why, the people objectively least qualified to do so, of course: incoming freshmen. Department budgets are closely correlated with number of majors. Thus we have a kind of oriental bazaar, with each department trying to woo as many of the frosh as possible. Each also wants to have one or more courses required for graduation; that too boosts enrollment and thus budget.
There's another way.
Wouldn't it make more sense to subsidize departments in proportion to the republic's need for their graduates? And while we're at it, to use our new powerful computers to generate really good predictions of success in the field? Now true, that would mean some students wouldn't get into the department of their first choice; at least not at public expense. But is that any worse than the present situation, which looks like a bad parody of manpower allocation?
Over five years ago I was asked to testify to a legislative committee investigating diminishing resources; at the time I said the most critical diminishing resource was trained talent. I've had no reason to change my opinion.
In the 50's we thought it shameful that almost 20% of our population was in some degree illiterate. We debated what to do about it. The social scientists promised that all it would take was some Federal Aid to Education; a couple of billion dollars would solve the problem nicely.
Three years ago we lamented our 30% functional illiteracy. Now we have a Department of Education. When do we reach 50%? Anyone want to bet we won't?
Mrs. Pournelle is a reading specialist; her students are illiterate teen-agers, many of whom have thick files proving "scientifically" that they can't possibly learn to read. They've "got dyslexia" (which translates to "reading difficulties"; reminds me of my friend who was much relieved when the physician told him his lower backache was lumbago). She tips the files into the waste can and teaches the kids to read. She is also required by law to take various university classes on how to do her job; thus I'm exposed to the journals and textbooks, and they are simply unbelievable. What passes for research would be laughable if it didn't cost so much—and so thoroughly affect people's lives.
So what's to be done about it? I don't know. My agreement with Baen entitles me to an occasional tirade, and this has been it. Years ago E. C. Banfield said, "The existence of a body of nonsense which is treated as if it were a grand principle ought not be regarded by reasonable critics as equivalent to a grand principle," and I'd like to think I could persuade some of the more honest academicians to take that seriously.
Because it is serious.
We have big computers now. We have analytical tools which might, just might, allow some real science in the social sciences. Hari Seldon's psycho-history probably isn't possible; but something short of it may yet be developed by people trained in scientific method and equipped with modern tools; who know something of computer science and the capabilities of both large and small machines, and also know enough mathematics to have something to program.
But that won't happen if we continue to insist that students learn the nonsense that fills today's social science texts. If they spend their time on nonsense they won't have time to learn anything else.