A Calculating People

The Spread of Numeracy in Early America

Patricia Cline Cohen, University of Chicago Press, 1982

 

These notes provide a collage, primarily in the author's own words, of issues and evidence in the history of quantitative literacy, adapted from "A Calculating People." The notes do not represent a complete or coherent summary of the book, but merely a selection of ideas relevant to quantitative literacy.


The essence of literacy is the manipulation and interpretation of written symbols. Even the most exalted speech in a Shakespearean play does not require literacy until it is written and read. But what of numeracy? Is the ability to reckon above twenty tantamount to being able to read and write? (In 1701 an English mathematician described Americans as "barbarous" because they could "hardly reckon above twenty.") But what of the abacus? What of logical thinkers (Plato, Aquinas) or calculating prodigies? Defining numeracy is not as easy as defining literacy. Clearly it is a mistake to think of numeracy as either wholly present or wholly absent.

In the Middle Ages, numbers were used to record transactions, not for calculating. (Counting boards were used for that.) Arabic numerals prevailed as a means of calculating, even though many feared that they could be modified too easily to serve as a reliable means of keeping records. Commercial calculation was complicated because different units were used for different commodities. Up through the eighteenth century most Englishmen thought it perfectly reasonable to let the material being measured dictate the unit of measurement. No one suggested a uniform system of weights and measures... not least because it would have forced an unnatural dissociation between the product and its measure.

Before the seventeenth century the order of the cosmos was dictated by Aristotelian classification. Quantification emerged as an alternative way of making sense of the world, a way that could account for activities newly perceived in the interstices of classical categories. In the sixteenth and seventeenth centuries, numbers appeared to capture the vastness of the world and the universe, the importance of money, and the unruliness of people. They also introduced into Western thought the idea that values could be removed from anything that can be quantified. Nevertheless, by the end of the seventeenth century numeracy had not made rapid progress in England. Probably fewer than four hundred Englishmen could be said to be mathematically minded, including teachers of navigation as well as members of the Royal Society. These men shared a delight in numbers and a keenness to measure.

In the seventeenth century, applications of numbers were still limited. Measuring things was a kind of sport, and few people engaged in it. Yet commerce thrived since all that was required to enter into a market exchange was trust: merchants made change and kept records for their less numerate clients. (Today we do the same with mortgage rates and FICA deductions.) The common feature of the diverse instances of quantification (e.g., counting parish membership, measuring the height of mountains) was their origin amongst relatively few men. Those who took up the craft were inspired by the ability of numbers to create certainty. In 1690 Sir William Petty published Political Arithmetick, in which he argued that the specificity of numbers conferred objectivity. True political knowledge, Petty believed, would arise out of a full quantitative accounting of social and economic facts.

Numbers are preeminently descriptive. On the most basic level they enumerate. They create uniformity, make comparison possible. Moreover, they can register the combined effect of several variables. Quantification then creates a sense of uniformity and finitude; it counts, and in the process, accounts. It can be a powerful explanatory tool, an alluring way to impose order on the flux of the seventeenth century world.

The shift in the last three centuries in how we view censuses illustrates well the shift from qualitative to quantitative thinking. In 1694 Parliament passed a census law. However, it was not well received by the populace, both because it was seen as foreshadowing more aggressive tax collection and because it violated a biblical warning which took precedence over any advantage that numerical precision may have conveyed. The Old Testament records a "sin of David" who brought a plague on Israel for "numbering" the people. (Note: "census" and "censor" derive from the same Roman practice of censuring immoral behavior.) In the seventeenth century, most people thought of population in terms of qualities and characteristics, and number was not among them. Political arithmeticians talked about the importance of knowing the whole number of people, but dictionaries still did not show the word "population." Not until the mid-eighteenth century was there sufficient interest in population as a quantity to make a census sensible.

 

Numeracy in Early America

Settlers in the American colonies lacked interest in arithmetic only partly because of their relatively low level of education. Puritans in New England had the highest levels of university education and literacy, but arithmetic was not among the subjects considered basic for Puritan children to learn. Even when church membership declined, Puritans never thought of counting members as a means of documenting the slide from faith.

In 1735 Ben Franklin wrote about Mathematicks in the Pennsylvania Gazette and argued for benefits beyond those of trade, including a logic which stretched the mind and improved the faculty for exact reasoning. Indeed, what progress there was in numeracy in pre-revolutionary America came not as a tangent to commercial growth but as the result of a slow but fundamental shift in the way men thought about human affairs and divine intervention. Puritans did not keep totals of souls saved because of their fatalistic belief that the number of the elect was predetermined. The decline of fatalism and the discovery of peculiar regularities in events once thought to be under inscrutable divine control encouraged the evolution of a mathematical sense in the eighteenth century.

Death and mortality--God's ultimate province--provide a good example. Eager empiricists showed death rates to be different in different places, thus suggesting that they are somewhat under man's control. The gradual realization that man could, in theory, control mortality accelerated the decline of the notion of an omnipotent God and hastened the arrival of autonomous man, directing his own destiny. Fatalism and uncertainty began to give way to control and predictability, exercised through the medium of numbers.

Bills of mortality led religious leaders (e.g., Cotton Mather) to argue for preordained mortality percentages, applying data from European cities to Boston without a second thought (because they were believed to be divine, thus universal). A 1721 smallpox epidemic in Boston occasioned the first suspicion that man could alter God's plan. Mather proposed inoculations to fight spread of the disease, and marshalled medical statistics to prove the efficacy of this approach. The inoculation controversy gripped Boston almost as tightly as the disease itself. However, the public debate was not so much over whether (much less how) inoculation worked, but how inoculation affected God's power to ordain deaths and send punishments in the form of epidemics. Aiding the sick did not interfere with God's power, but preventing illness removed smallpox from God's arsenal of warnings. "It was a problem that engaged the best minds of Boston in 1721."
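
The statistics Mather marshalled amounted to a comparison of case-fatality rates. A minimal sketch of that arithmetic in Python, using figures commonly cited for the 1721 Boston epidemic (an illustration supplied here, not numbers taken from Cohen's book):

    # Case-fatality rates among the inoculated versus the naturally infected.
    # The counts below are those commonly cited for Boston in 1721; they are
    # an assumption for illustration, not a quotation from the book.
    inoculated_deaths, inoculated_total = 6, 287
    natural_deaths, natural_total = 844, 5889

    print(f"inoculated: {inoculated_deaths / inoculated_total:.1%}")  # ~2.1%
    print(f"natural:    {natural_deaths / natural_total:.1%}")        # ~14.3%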

Samuel Grainger, a mathematical practitioner in Boston and one of the most numerate of the colonists, considered argument by number to be irrelevant in the face of religious obstacles. "Do not be seduced by its [smallpox inoculation] supposed success. For to urge the lawfulness of this practice from its success is a very weak argument to prove it so. For should success become a sufficient plea for the lawfulness of any action, every wicked action successfully acted would become lawful at that rate."

The smallpox record illustrates not only colonists' skepticism about the proper role of numbers in thought, but also the relative innumeracy of even the educated New Englanders. Numerical arguments in newspapers about smallpox were filled with errors and sloppy reasoning. Numbers were used to impress, even calculations that could impress only the inattentive. But in 1730 no one gave numbers close scrutiny.

 

Numeracy and Democracy

By the end of the eighteenth century a new attitude emerged towards numbers and the respective powers of God and man. This consisted of three notions that were, at the time, novel: that man could alter the course of nature; that quantification was essential both as a tool for altering nature and as a means of assessing the extent of the alteration; and that one was entitled to live out his "full" life.

In the eighteenth century "mathematicians" were primarily surveyors and measurers who taught short courses in evenings and on weekends. Since (applied) mathematics was learned this way, it was slow to penetrate other parts of the educational system. It was viewed more as a trade and craft than as part of basic education. However, a few individuals, Jefferson among them, were inveterate calculators--people who quantified everything, who exercised an "esoteric proclivity for numbers" in a climate not supportive of that style of thought. Quantification, for these men, held a psychological attraction every bit as strong as whatever practical value they may have perceived.

The drive for numeracy in the colonies coincided with the great debates about how to implement a democratic society. Jefferson's plan to decimalize money, weights, and measures discarded centuries of tradition in order to simplify arithmetic so that "men of ordinary capacities could become as facile as he in the calculations of daily life." His plan to impose a grid survey on the Western wilderness was meant to eliminate ambiguous property boundaries. Things that were counted and measured perfectly were fixed for certain; precision defied ambiguity. That was the attraction of mathematics for Jefferson, and for the new nation.

Decimal money, begun in 1792, democratized commerce by putting computation within reach of all. At the same time, the self-consciously utilitarian spirit of the new nation invaded education and elevated arithmetic to the status of a basic skill along with reading and writing. Decimal money and arithmetic education were justified as fruits of republican ideology: numeracy was hailed as the cornerstone of free markets and free society. "Republican money ought to be simple and adapted to the meanest capacity."
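
The computational advantage of decimalization is easy to demonstrate. A minimal sketch (the function name is my own) contrasting the mixed-radix arithmetic of pounds, shillings, and pence--12 pence to the shilling, 20 shillings to the pound--with ordinary decimal addition:

    # Adding sums in pre-decimal English money requires carrying across two
    # different bases; decimal money needs nothing but ordinary addition.
    def add_lsd(a, b):
        """Add two (pounds, shillings, pence) amounts, carrying at 12 and 20."""
        pence = a[2] + b[2]
        shillings = a[1] + b[1] + pence // 12
        pounds = a[0] + b[0] + shillings // 20
        return (pounds, shillings % 20, pence % 12)

    print(add_lsd((1, 15, 9), (2, 8, 7)))  # -> (4, 4, 4)
    print(round(1.79 + 2.43, 2))           # decimal money: -> 4.22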

 

Learning Arithmetic

Numeracy made slow progress in the schools. The English assumption that arithmetic was too difficult to explain persisted in America. Eighteenth century methods of problem solving confounded all but the best students because they deliberately relied on memory, not on understanding. Texts contained a plethora of rules to match every conceivable situation, sometimes in verse to aid in memorization. Most students' education ended with the rule of three (given three parts, find the fourth)--what we now call ratio and proportion. In New England many students did not get even that far, since the ultimate aim of a Puritan education was theological, not mercantile or scientific. Indeed, in 1648 the rules for a grammar school in New Haven directed that children should be taught to "cipher for numeracion and addicion and noe further."
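
For readers who have not met it under that name, a minimal sketch of the rule of three (the function name is my own):

    # The rule of three: given the proportion a : b = c : x, find x.
    def rule_of_three(a, b, c):
        """If a corresponds to b, then c corresponds to x = b * c / a."""
        return b * c / a

    # If 5 yards of cloth cost 15 shillings, what do 8 yards cost?
    print(rule_of_three(5, 15, 8))  # -> 24.0 shillings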

Before arithmetic became a necessity, it was taught and learned as an abstract system of knowledge. Between 1660 and 1750, various practical arithmetics gradually replaced Robert Recorde's standard theoretical text which taught arithmetic as a system for dealing with abstract quantities. The practical texts omitted explanations as being unnecessary and beyond the grasp of their audience of tradesmen. For the most part, authors stuck to rules and examples, making no attempt to weave the whole into an integrated system of thought. They omitted reasons because they believed them to be "tedious and inconvenient."

In 1800 arithmetic was still laborious, depending on rules, catechism, and memory. In the process of making arithmetic a required subject for all, no one had stopped to wonder whether there might be a better pedagogic approach. Educators finally had to conclude that the traditional method of instruction simply did not work very well. Elementary arithmetic had for so long been associated with commerce that it had been overlooked as a purely intellectual exercise for the mind. "We do boys a great injustice by supposing that they cannot reason," said one educator. But with the desire for a populace well attuned to reason as a solid foundation for republican government, there emerged the idea that even basic arithmetic could help train citizens to think well.

New texts emerged offering various reforms--to eliminate fractions and use only decimals; to eliminate units; to use counters and bead frames. Warren Colburn in 1821 introduced "mental arithmetic" for children as young as five years old based on "inductive reasoning." The goal was to abandon slavish reliance on rules and memory, and let children develop their own rules by manipulating tangible objects. The old system worked from rules to examples, the new from examples to rules. Colburn's mental arithmetic became quite popular. Children competed in arithmetic contests as they did in spelling bees; they learned to think quickly in a changing marketplace; and their growing mental skills reinforced the early nineteenth century prestige of inductive reasoning.

When arithmetic became required in school, girls for the first time began to learn arithmetic. Soon pamphlets arguing against this practice began to appear. According to critics, women failed the utilitarian test of advanced mathematics (they had no use for it) and the capacity test for learning it (women at that time were believed to be intuitive, not inductive, beings). Thus a stereotype formerly hidden became dogma because the issue had to be confronted. Geometry, the one branch of mathematics that had never fallen victim to the deadening memory-based approach of the commercial arithmetic texts, was dominated by logic. So it is not surprising that particular objections were raised whenever girls took up the subject. The irony of these arguments became evident in the 1840s, when the feminization of the teaching profession presented new and compelling reasons for women to be more fully numerate.

Even in postsecondary education, numeracy was slow to take root. Harvard College in the seventeenth century did not embrace arithmetic, which it considered a vulgar subject. Although Harvard established a mathematics professorship in 1726, it relegated the teaching of arithmetic, geometry, and astronomy to a mere two hours a week in the senior year. In 1740 Yale made arithmetic a first year subject, and Harvard followed two decades later. Arithmetic became a requirement for admission at Yale in 1745, at Princeton in 1760, and at Harvard in 1802.

 

Statistics and the State

In 1800 "statistics" meant a statement about the civil condition of a people. (Its root comes from "state" not "statics.") Leaders of the new republic limited statistics to definite facts about populations--wealth, trade, industry, occupations, civil and religious institutions--arguing that these were the data most appropriate for assessing the American experiment in republican government. Soon compiling civil facts and figures became common. The many new and practical uses of numbers contributed to the belief that whatever was quantifiable was objective. (Observer bias in measurement never seems to have troubled anyone.) Puritanism enhanced the scientific world-view by demanding direct, personal experience of nature (as of God) and by creating civil chaos that liberated creative individuals. By the late eighteenth century, facts had come to be seen as indisputable and objective in contrast to opinions which are idiosyncratic and debatable. Statistical thought offered a way to mediate between contending political ideas based on a homogeneous social order and economic realities that were fast undermining homogeneity. An extension of this respect for facts was the idea that if only enough facts were known, disagreement on public issues would end. Inventories of facts were touted as providing objective basis for determining the common good. Complete possession of facts, it was hoped, would eliminate factionalism. Wrote one influential editorialist, "If all men know alike, though imperfectly, their opinions must be the same."

Debates about the new census focused on whether it should measure people or products. In 1790 Madison argued for a more detailed census that categorized people and occupations. Congress threw out the census of occupations as of no benefit. By 1800 leading scientists asked Congress for an unprecedented social survey, but it rejected that too. "Few shared Madison's enthusiasm for marking the progress of society."

The movement to comprehend society through quantitative facts accelerated in precisely the same decade in which arithmetic instruction in the schools was being thoroughly revamped. Arithmetic not only improved the logical faculties and prepared young boys for commerce, but it also opened the doors to useful political knowledge in the form of data on the state. Example: a children's board game dating from 1806 included many facts and figures from the 1800 census.

However, innumeracy persisted just beneath the surface. The temperance movement used statistics widely in its efforts, but never thought of distinguishing between correlation and causation. Random samples were unknown. Moreover, this empiricism was freighted with unacknowledged values. Americans counted acres of land and export tonnage, miles of roads built and postage stamps sold, inebriate poor and hymns memorized. But what they did not count in 1820--for instance, the number of slave owners, black mortality, female illiteracy--tells as much about their society as the things they chose to notice. "In the nineteenth century what was counted was what counted."
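
The confusion of correlation with causation is easy to reproduce. A minimal sketch, with invented series echoing the counts mentioned above, showing that two quantities which both grow steadily over time correlate perfectly even when neither causes the other:

    import statistics  # statistics.correlation requires Python 3.10+

    years = range(1800, 1821)
    postage_sold = [1000 + 50 * (y - 1800) for y in years]    # steady growth
    hymns_memorized = [200 + 12 * (y - 1800) for y in years]  # steady growth

    # Both series merely track time, yet they correlate perfectly.
    r = statistics.correlation(postage_sold, hymns_memorized)
    print(f"r = {r:.2f}")  # r = 1.00 -- correlation, but no causation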

 

Early Nineteenth Century

The census of 1840 generated intense political debate because it seemed to support slave owners' arguments that slaves could not survive freedom: the census revealed a gradient of black insanity rising from south to north. Experts and politicians fought over the data. Fortuitously, the American Statistical Association was founded in 1839 and made itself available to resolve errors in the 1840 census. Despite this "expertise," nobody was able to find an explanation.

Many years later, analysts showed how a routine error of recording a few senile whites in an adjacent "idiot black" column, combined with the two demographic gradients (larger numbers of elderly in the north and larger numbers of blacks in the south), produced a gradient of idiot blacks rising from south to north. That on the eve of the Civil War no one offered this explanation for a matter of major public dispute "says something about the degree of quantitative sophistication that they all shared."
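
The mechanics of the error are worth making explicit. A minimal sketch with invented figures (not the actual 1840 counts): the same true rate and the same small clerical spillover distort the apparent rate far more where the black population is small (the North) than where it is large (the South):

    # Both regions have a true rate of 0.5% and the same clerical error of
    # 10 white entries misrecorded in the black column; the small northern
    # denominator doubles the apparent northern rate.
    def apparent_rate(black_pop, truly_insane, misrecorded_whites):
        return (truly_insane + misrecorded_whites) / black_pop

    north = apparent_rate(black_pop=2000, truly_insane=10, misrecorded_whites=10)
    south = apparent_rate(black_pop=200000, truly_insane=1000, misrecorded_whites=10)
    print(f"north: {north:.2%}")  # 1.00%, twice the true rate
    print(f"south: {south:.2%}")  # ~0.51%, barely changed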

Data were used in other ways in the pre-war political debates. Some argued against slavery by displaying statistics showing that the South trailed the North in every measurable aspect. But on what basis did they expect these statistics to persuade? Anti-slavery advocates took for granted the form an argument must take to be convincing. But there are other ways to contrast two cultures as dissimilar as North and South. Suppose one compared per capita wealth instead of total wealth. Would slaves then be counted as population in the denominator, or as wealth in the numerator?
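
A minimal sketch of that numerator/denominator question, with invented round figures (not data from the book):

    # Hypothetical slave-state accounts: the per capita figure changes
    # sharply depending on where enslaved people are placed in the fraction.
    free_pop, slave_pop = 500_000, 400_000
    other_wealth = 300_000_000      # dollars, invented
    slave_valuation = 200_000_000   # dollars, invented

    as_population = other_wealth / (free_pop + slave_pop)    # slaves in denominator
    as_wealth = (other_wealth + slave_valuation) / free_pop  # slaves in numerator
    print(round(as_population), round(as_wealth))  # -> 333 1000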

There is a striking element of hubris in nineteenth-century statistical thought, a hubris still at work in the twentieth century: to measure is to initiate a cure. In early nineteenth-century America, numbers were celebrated because they were genuinely useful, because they were thought to discipline the mind, because they marked the progress of the era, and because they were reputedly objective and precise, and hence tantamount to truth. Yet despite the argument that numbers would provide a convincing and objective basis for political decisions, social values always left their imprint on supposedly objective empirical facts.

