HORSEPOWER

Step 2 of my strategic plan: Post my favourite class assignments from the past year, which will start this week with manure and will eventually end with sewage. Enjoy.


We tend to think of nineteenth century cities like Pittsburgh as industrializing under the power of steam. But Joel Tarr argues that an older technology also drove the development of the great cities of the steam age.

In 1775 James Watt secured an extended patent on his improved steam engine, the machine that would become a symbol of the industrial revolution. Forty years later, Benjamin Latrobe opened a steamboat engine workshop on the banks of the Monongahela River in Pittsburgh. The power source that Latrobe used to build his engines? Two blind horses.

Horses like Latrobe’s were a central cog in the nineteenth century urban economy. They were hooked up to engines through circular sweeps, rotating platforms and treadmills, and harnessed to vehicles on wheels and tracks. City horses hauled steel, powered ferries, pressed bricks. They were the source of valuable manure and even more valuable carcasses. They were the catalysts for the paving of streets and the suburbanization of cities.

But, like the combustion engine, their great success as a technology also contributed to their eventual decline. Thanks to the work of Joel Tarr, a professor of history and policy at Carnegie Mellon University, and his colleague Clay McShane, a professor of history at Northeastern University, we are rediscovering how horses hauled cities like Pittsburgh into the modern age.


Horses at Cincinnati subway construction site, probably the 1920s. Image credit: University of Cincinnati Library.

Rediscovering the horse as an urban technology

Joel Tarr thought he was done with horse manure back in 1971. The Jersey City native had joined the faculty at Carnegie Mellon in 1967 with a background in urban political history and an interest in how the modern city had been shaped by transport technologies. During his research, he kept coming across historical complaints that he thought might be interesting to a general audience.

The article he submitted to the magazine American Heritage was a vivid account of the problems faced by a horse-driven city, including the staggering scale of manure logistics:

…as health officials in Rochester, New York, calculated in 1900, the fifteen thousand horses in that city produced enough manure in a year to make a pile 175 feet high covering an acre of ground and breeding sixteen billion flies, each one a potential spreader of germs.

“Urban Pollution—Many Long Years Ago,” American Heritage, 1971

Tarr noted that the eventual solution to these problems was the adoption of a new technology, one that harnessed machines rather than animals. “Apparently the editorial board got into a big fight about it because some of them thought that it was an apology for the automobile,” he recalls.

Controversial from the start, the article prompted several newspaper editorials and is still cited today in debates about pollution. But after creating a stir, Tarr moved on, reasoning that “someone else could worry about the horse manure.” And so he pursued his interests in the environmental and technological history of cities and left the subject of horses alone for more than twenty years.

But while the details of the manure problem lived on in the public imagination, Tarr knew that there was much more to say about the importance of horses in the history of our cities. In 1995, he refused to allow the manure article to be reprinted in an anthology and instead asked if he could write a new article on the topic with his friend McShane, who had recently published a history of cars in cities. From that first article, the project ballooned into a decade of scholarship and co-authorship of their 2007 book on horses as an urban technology, The Horse in the City: Living Machines in the Nineteenth Century. The book is a detailed examination of the centrality of horses to cities, focusing on New York City, Boston, and Tarr’s adoptive home, Pittsburgh.

From steam power to horse power

Tarr and McShane emphasize that the horse was viewed as a “living machine,” valued primarily for its ability to provide power. As machines, horses were an integral part of the economy, even after the advent of the steam engine.

After refining the steam engine, James Watt devised a standard measure of mechanical power – 33,000 foot-pounds of work per minute, or one horsepower. This unit allowed customers to estimate how many horses an engine could replace and to gauge whether replacing their horses would be economical. In many cases, it wasn’t. For much of the nineteenth century, horses were the engine of choice for applications that required flexibility or mobility and for businesses that could not afford a large capital outlay.
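To put Watt’s unit in modern terms (the conversion factors here are standard textbook values, not figures from Tarr and McShane):

\[
1~\mathrm{hp} \;=\; 33{,}000~\frac{\mathrm{ft{\cdot}lbf}}{\mathrm{min}} \times 1.3558~\frac{\mathrm{J}}{\mathrm{ft{\cdot}lbf}} \times \frac{1~\mathrm{min}}{60~\mathrm{s}} \;\approx\; 746~\mathrm{W},
\]

so a nominal ten-horsepower stationary engine delivered roughly 7.5 kilowatts – on paper, the sustained output of ten working horses.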

But the one application in which horses were irreplaceable was ground transport within the city. Goods from the expanded railway and steamboat lines could only be distributed to their final destinations under the power of horses, which meant that horse-drawn transport grew more efficient in parallel with steam technology. Innovations in breeding produced larger and larger horses in the pursuit of (as one agricultural reformer put it) “the best machine for turning food into money.” These industrial-strength horses could pull even larger loads after the development of lighter vehicles made with modern materials.

One resident of Pittsburgh remembered the “pandemonium of noises” produced by horse transport in the 1860s:

Numerous wagons, hauling heavy pigs of iron and iron products, timber wheels with anywhere from six to fourteen horses from which huge and unwieldy vehicles hung castings of many tons’ weight, the clattering omnibus, the rattle of the mail wagons, drays […] and other conveyances common to traffic.

George Thornton Fleming, 1904

This was the cacophony of Pittsburgh’s developing steel industry, the sound of a modern city propelled by coal and hooves.


Horse-drawn wagons and carriages, an electric trolley car, and pedestrians congest a cobblestone Philadelphia street in 1897. Image credit: National Archives and Records Administration, 30-N-36713.

Shaping the city

The structure of Pittsburgh’s neighborhoods today still reflects the age of horse-drawn vehicles. Public transport began with road vehicles called omnibuses, but gathered momentum with one of the most influential urban innovations of the nineteenth century, railed “horsecar” lines. These tracks, the precursors of the cable car and electric streetcar systems, provided a smooth ride that omnibuses could never achieve on cobbled pavements that were optimized for horseshoe traction. The benefits of the tracks were not just to the spines of riders, but to the speed the horses could travel, the number of riders they could haul, and the amount of profit their owners could make.

The first lines were laid in 1863, and by 1890 the average Pittsburgh resident took 192 horsecar trips per year. The tracks had grown along the lines of least resistance, following valleys and avoiding the worst of Pittsburgh’s steep hills. As the lines expanded, ridership increased at a rate much faster than population growth, reflecting Pittsburgh’s shift to the suburbs: residents could now live further from downtown and make a daily commute to work. Wards within an hour’s smooth ride of downtown became far more desirable than they had been when reaching them meant a longer, more expensive and more bone-jarring omnibus ride. The relatively flat Eastside saw the biggest growth – between 1870 and 1890 it went from 5,350 dwellings to 17,604. Construction boomed in areas within a ten-minute walk of a horsecar line. Tarr and McShane write that “the greater speeds allowed Americans to fulfill the new dream of the middle class, a detached home with a yard on the outskirts of a city.” Meanwhile, downtown was losing residents to the new suburbs and slowly transforming into a true central business district.

Tarr and McShane point out that the horsecar alone did not cause these changes in Pittsburgh and other growing cities. Factors like economic expansion, population growth and a new appreciation for suburban life played an important role, but the horsecar was the technology that allowed these trends to play out, and it set the patterns that were extended in the twentieth century by the streetcar and the gas-fueled automobile.

Problems with the living machine

In 1872, American horses came down with a terrible case of the flu. Several Northeastern cities ground to a dramatic halt. The horse flu epidemic cut off city supplies, grounded fire departments, and isolated suburbs from their vital horsecar lines. When one commentator later warned that another epidemic would reduce New York City to “straits of distress,” he concluded that although “cities have been made by building around the horse there is no necessity for keeping him permanently as their centre.” As the century progressed, more and more objections were made to the city’s dependence on horses.

Like all technologies, horses had their downsides. They were living creatures, susceptible to disease, unreliability, and even personality. They required an enormous infrastructure of foul-smelling stables, with stockpiles of hay that posed a significant fire hazard. But above all, horses were prolific polluters. The average city horse unleashed 25-35 pounds of manure and two to three gallons of urine per day.

Horse manure started out as just one of the many hazards of urban life, but as the century progressed, the exploding city horse population became a source of public angst and newspaper editorials. To make matters worse, by the 1880s the bottom had fallen out of the manure market.

Fresh manure had long been a valued commodity, sold by stable owners and street sweepers to farmers on the urban periphery. But thanks partly to competition from new guano and rock phosphate fertilizers, the price of manure had fallen to less than a quarter of its former value. A New York Times editorial from 1881 conveys the confusion caused by a city decision to declare summer dumping grounds off-limits amidst the glut of manure: “Public health nuisance: No place for stable manure—What is to become of it?” By 1908, one journalist claimed that 20,000 New Yorkers died each year from “maladies that fly in the dust, created mainly by horse manure.” The biggest problem was that the accumulating piles were a favorite breeding ground for flies, a vector for life-threatening diseases like typhoid.

Part of the solution to the manure problem was technological. By 1902, most horsecar lines had transitioned to electric trolleys, only a decade after the technology was first introduced. But the manure problem itself was not necessarily responsible for the speed of this change. Tarr and McShane argue that in many cases the new technology was rapidly embraced by horsecar companies because these companies did a tidy side business in land speculation. Horsecar lines had the reliable effect of pushing up property prices wherever they were laid, but by the late 1880s the lines had mostly expanded as far as they could within a one-hour commute of downtown. With the increased speed of electrified trolleys, however, the companies could expect to double that radius and reap the rewards in real estate. As a result, they became intimately involved in urban politics and in many cases bought themselves influence on city councils to ensure they received the necessary franchises.

For a few more decades, horses were still favored for tasks like fighting fire, hauling waste, and making neighborhood deliveries. But by the end of World War II, even these jobs fell to the automobile. The horse manure problem was solved and the age of the car had begun.

The technological solution

Despite the initial optimism that cars were a clean and efficient alternative to the horse, the new technology has also become a victim of its own success. The burning of fossil fuels generates air pollution that can be as hazardous to human health as the diseases spread by flies, and it releases carbon dioxide that contributes to climate change. A century after the decline of the horse, we are again facing a chronic pollution problem.

Embedded among the engineers and policy faculty of Carnegie Mellon, Tarr has consistently pursued historical questions that provide perspective for contemporary policy debates, particularly the problems of urban waste. But ever since Tarr published that first article on the horse manure problem, commentators have repeatedly used the story as a parable about the wonders of technological fixes to environmental problems. For instance, Steven Levitt and Stephen Dubner used the story of the horse in their 2009 book SuperFreakonomics to justify the use of radical geoengineering solutions to climate change.

Tarr himself doesn’t believe technological change is always a panacea. “Why do we automatically assume that every new device will be better?” he asks. He has made urban technological change one of his specialties because he believes it is important that we understand the drivers of change. “History circles,” he explains.

This particular circle has come around quickly. In Tarr’s office there is a reproduction of a magazine photo hanging prominently amongst the accumulated books and papers; in it stands his father in a worker’s cap, cigarette between his lips, at work under the harsh light of the night shift at a shipyard. He had been one of those workers who built the urban landscape with the help of a living machine. “He had a horse,” Tarr says, “back when he was in the scrap business in New York. He had a horse called Shivers, and that’s just about all I know about it.”

Where to Find Out More

The Horse in the City: Living Machines in the Nineteenth Century, by Clay McShane and Joel Tarr. Johns Hopkins University Press, 2007.

“The Centrality of the Horse in the Nineteenth-Century American City,” Clay McShane and Joel Tarr, in Raymond A. Mohl (ed.), The Making of Urban America. Scholarly Resources, Inc., 1997.

“The Horse Era in Pittsburgh,” Joel Tarr, Western Pennsylvania History, Summer 2009, 28–41.

Curiosity killed the parrot? My guest post at Scientific American blogs

There’s too much to say about kea, those playful, destructive and slightly obsessive-compulsive snow parrots from New Zealand. I wrote a guest post at Scientific American Blogs this week on the problem of lead poisoning in wild kea populations, but there were a million things I had to leave out for fear of boring people with kea overload. If I ever finish my homework, maybe I’ll write more about them; in the meantime, please enjoy:

Wheelie bin raids

The Kea Conservation Trust

The 1993 documentary Kea: Mountain Parrot


Update (24th Jan):

Just plain ol’ footage of kea flying around:

Counting fish

Update: December 10 – I won a travel award! I’m going to ScienceOnline! A hearty thanks to NESCent, and you should all go read the other awesome winning posts – on bed bug ground zero, bee housekeeping, and evolutionary escapes from environmental toxins.

——————————-

I’m entering this post in the 2012 NESCent evolution blog contest. The winners get a travel award to attend ScienceOnline 2013!

Pink salmon, Bear Creek by K.Yasui/2011 USFWS Alaska Fish Photo Contest. Shared under this Creative Commons license.

Pink salmon spend two salty years in the ocean before they return to their birthplace to spawn and to die. If that birthplace was Auke Lake, near Juneau, Alaska, then a returning salmon can only reach its final destination by passing through a narrow opening in the weir at Auke Creek, which drains the lake into Auke Bay. Every year, thousands of pink salmon pass through the weir’s trap, both adults fighting upstream and juveniles coasting the other way. Each one of those fish is counted by researchers who stand thigh deep in the cold water, monitoring the trap every day between March and October. This marathon fish count stretches back to the 1970s, and has provided one of the most detailed records of a salmon population anywhere in the world. Combined with a fortuitous little genetic experiment performed at the weir in 1979, the Auke Creek data have also given us some long-sought evidence that the annual rhythms of the natural world are evolving in response to climate change.

Auke Lake is the body of water on the left and Auke Bay is on the right. Migratory fish that need to move between the two must travel the short creek in between. Photo by Gillfoto, used under this Creative Commons license.

Many things have changed in the decades since the fish counting started. Average stream temperatures are higher by more than one degree Celsius, the salmon are returning to the lake nearly two weeks earlier, and the entire migration season falls within a narrower window of time. Although we can’t say for sure that the migration shifts are caused by the temperature change, they fit a pattern that has been observed for many other organisms all over the world. Birds, butterflies, frogs, flowers, plankton – to name just a few – are slightly shifting the timing of their big, seasonal life events, all consistent with a response to a warming climate.

But do these timing shifts count as evolution? Without evidence of genetic change in a population, such shifts might be just the result of individuals adjusting within their normal range of behaviors. The genetic evidence needed to rule that out is extremely hard to come by – so even though biologists have long believed that the many examples of shifting seasonal traits must include some cases of rapid evolution, they haven’t had the hard genetic data to show it.

Luckily, some three decades ago, fisheries biologist Anthony J. Gharrett started an obscure little experiment at Auke Creek. Recently, that experiment was extended and repurposed by Ryan P. Kovach, a graduate student at the University of Alaska Fairbanks, and David A. Tallmon of the University of Alaska Southeast, to confirm that the Auke Creek salmon have indeed evolved.

In the 1979 experiment, Gharrett tinkered with the genetics of late-migrating salmon just enough to let him trace their fortunes. Historically, the fish counters could distinguish between two relatively distinct populations that migrated about twenty days apart – the “early run” and the “late run.” Interested in these sub-populations, Gharrett looked for a gene variant to use as a genetic “marker” for the late-run fish. The genetic marker he chose was naturally present at low levels in the population, but seemed likely to be selectively neutral – neither harming nor helping the fish that bore it. He captured all of the very last migrating pink salmon of the season and spawned only those that carried the genetic marker. The offspring of those fish rejoined the naturally spawned population, and by the time the next generation returned, late-run salmon carried the marker at a five-fold higher frequency than the early run.

Because all those diligent Auke Creek fish counters in waders were also taking DNA samples throughout the spawning seasons, we know that the frequency of the late-run marker stayed constant for about a decade, confirming that the marker was indeed selectively neutral. The “marked” fish and their descendants kept turning up reliably late until 1989, when stream temperatures during the spawning season reached the second highest on record.

By 1991, the late-run marker had faded back to the low natural levels found in the early-run fish, and in parallel, the fish counters saw a dramatic decrease in the number of salmon turning up late. In a single generation, the distinct late-migrating subpopulation had practically disappeared, making the average migration time of the entire population significantly earlier. In 2011, twenty years later, the data looked much the same as in 1991 – which means the Auke Creek salmon population is probably still dominated by descendants of the 1989 early run.
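To see why this pattern reads as natural selection rather than mere flexibility, here is a toy sketch of the logic in Python. Every number in it (population size, run proportions, marker frequencies, survival rate) is invented for illustration; this is not the actual Auke Creek data or the published analysis. The point is simply that a neutral marker enriched in the late run sits still until a single bad year for late migrants drags it back toward the early-run background.

```python
import random

# Toy sketch of the Auke Creek marker logic. All numbers below are invented
# for illustration; they are not the real frequencies or survival rates.

random.seed(1)

POP_SIZE = 10_000                              # spawners per generation (hypothetical)
LATE_FRACTION = 0.2                            # fraction of fish migrating late
MARKER_FREQ = {"early": 0.05, "late": 0.25}    # neutral marker, enriched in the late run

def make_generation():
    """Build a generation as a list of (run_type, carries_marker) spawners."""
    fish = []
    for _ in range(POP_SIZE):
        run = "late" if random.random() < LATE_FRACTION else "early"
        fish.append((run, random.random() < MARKER_FREQ[run]))
    return fish

def hot_year(fish, late_survival=0.1):
    """A warm spawning season: most late-run fish fail to reproduce."""
    return [f for f in fish if f[0] == "early" or random.random() < late_survival]

def summarize(label, fish):
    late = sum(1 for run, _ in fish if run == "late")
    marked = sum(1 for _, m in fish if m)
    print(f"{label}: late-run {late / len(fish):.1%}, marker {marked / len(fish):.1%}")

generation = make_generation()
summarize("before the hot year", generation)   # marker frequency stable while neutral
survivors = hot_year(generation)               # a 1989-style selection event
summarize("successful spawners", survivors)    # marker collapses toward the early-run level
```

Because the marker itself does nothing, the only way its frequency can fall like this is if the fish carrying it – the late migrants – were disproportionately removed, which is the signature of selection on migration timing rather than individuals simply adjusting their behavior.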

So this is interesting news for biologists looking for evidence of climate change-driven evolution. But what does it mean for salmon? Today, Auke Creek pink salmon are as abundant as ever, and thanks to that hot 1989 summer, the population is now adapted to a slightly warmer climate. But because of that adaptation process, they are also less genetically diverse and less behaviourally diverse, which means they might not be so lucky when up against other natural selection events in the future. There is also a limit to how early a salmon can spawn. If temperatures continue to rise, at some point Auke Lake could cease to be a viable salmon spawning ground, with effects that would ripple through the region, both ecologically and economically. It would also bring an end to the salmon counting.

Auke Creek Salmon Research. Photo by Alaska Fisheries Science Center, NOAA Fisheries Service (Public domain).

Genetic change for earlier migration timing in a pink salmon population

Ryan P. Kovach, Anthony J. Gharrett and David A. Tallmon

Proc Biol Sci. 2012 Sep 22; 279 (1743):3870-8

We were here

In the early hours of a Monday morning in July 1945, the world’s first atomic bomb test lit up a remote corner of New Mexico. Several weeks later, two more atomic bombs were dropped on large urban centres in Japan. These events marked the beginning of the ‘atomic age,’ but they also marked another beginning, a brief pulse that an experimental biologist would call Time Zero.

Since my post on the ‘natural’ genetics experiment on rescue workers at the World Trade Center site, I’ve been thinking more about the unintentionally brilliant experiments that can emerge from disasters and accidents. One of the most remarkable examples is the so-called bomb-pulse, which is the global isotopic signature left by the atomic bomb tests of the 1950s and 1960s. That signature is found in every living thing on the planet and can now be read back like a ticking clock. It can tell us the birth year of an unidentified murder victim, a vintage wine, your brain cells, fat cells or even the molecules of fat themselves. But it also left an enduring message for future scientists in the geological record. The message says: We were here.

Dog 2 – 19 kt, Nevada Test Site, May 1951. Image Public Domain, with many thanks to Trinity Atomic Web Site. Click on the photo to visit this fascinating archive of historical documents and media on nuclear weapons.

Before 1945, the global levels of naturally occurring radioactive isotopes were steady. Between 1955 and 1963, an intense period of Cold War-fueled nuclear weapons development caused a sudden increase in the levels of certain isotopes in the atmosphere. This increase came to a sharp end with the signing of the 1963 Partial Test Ban Treaty (PTBT), an agreement that dramatically reduced the number and power of such tests. Bomb-pulse dating makes use of the spike in atmospheric levels of the effectively harmless isotope carbon-14 (14C), which roughly doubled between 1945 and 1963. 14C is normally produced at a low rate by the action of cosmic rays in the upper atmosphere, while about 99% of the carbon on earth is in the non-radioactive 12C form. The ratio between 14C and 12C in living things reflects that of the atmosphere: plants take up both isotopes in the form of carbon dioxide and convert them into sugars, the plants are eaten by animals, and those animals may in turn be eaten by other animals.

We can estimate the ‘birth date’ of molecules within a living thing because the levels of 14C have been decreasing at a steady rate since 1963. This regular decrease is due to the gradual dissipation of the isotope into the ocean and into living things, as well as dilution caused by the burning of fossil fuels (which are rich in 12C). By comparing the historical record of atmospheric 14C ratios to the ratio in, say, a vintage bottle of Australian red, we can determine the year in which the grapes were grown. Similarly, the 14C ratio of the tooth enamel of an unidentified body can tell us the person’s year of birth to an accuracy of better than two years. We can do this because enamel is formed only at very specific times in childhood.
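In practice the dating step is essentially a lookup against the falling side of the bomb curve, and a minimal sketch in Python looks like the example below. The calibration values are rough, invented placeholders rather than the published atmospheric record, which is what real analyses use, along with corrections for hemisphere, diet and measurement uncertainty.

```python
# Minimal sketch of bomb-pulse dating as a lookup against the atmospheric
# 14C curve. The table below is a rough, invented placeholder (Delta-14C,
# per mil excess over the pre-1955 baseline), not the published record.

BOMB_CURVE = {
    1964: 800, 1970: 550, 1980: 280, 1990: 150, 2000: 80, 2010: 40,
}

def estimate_year(measured_delta_14c):
    """Interpolate a formation year from a measured Delta-14C value,
    assuming the sample formed on the post-1963 decline of the pulse."""
    years = sorted(BOMB_CURVE)
    for y0, y1 in zip(years, years[1:]):
        d0, d1 = BOMB_CURVE[y0], BOMB_CURVE[y1]
        if d1 <= measured_delta_14c <= d0:       # value falls in this interval
            fraction = (d0 - measured_delta_14c) / (d0 - d1)
            return y0 + fraction * (y1 - y0)
    raise ValueError("measurement outside the tabulated bomb-pulse range")

# e.g. tooth enamel measured at Delta-14C = 215 per mil
print(estimate_year(215))   # 1985.0 with these placeholder numbers
```

One wrinkle: the rising side of the pulse (1955–1963) gives a second year with the same 14C value, which is why the sketch assumes the sample formed after the 1963 peak; real forensic studies resolve the ambiguity with additional samples or context.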

But most of our bodies are not made of permanent structures like tooth enamel. We are each a colony of different kinds of cells that are constantly growing, dying, and renewing. The bomb pulse allows us to measure the birth dates and lifespans of these different kinds of cells, giving us an average ‘age’ for the different cells of our body. To do this, we measure the 14C ratio of the DNA in each cell, since DNA is made only at the time the cell is first formed (during cell division). Many of the discoveries made using this technique have settled acrimonious debates or overturned long-held models. For instance, it showed that the neurons of your neocortex (the ‘brainiest’ bit of our brain) have the same birthday as you do. In other words, you’re stuck with the neurons you were born with (you can read a summary at Not Exactly Rocket Science). Another of the same group’s many high-profile findings was that you are not stuck with the fat cells you were born with – most fat cells die and are replaced by a new cell about once a decade. Last month, a new study was published that looked at the molecules of fat within those fat cells and found that their average age was about 1.6 years. They also found that the average age of fat molecules in obese people is about 50% higher than in non-obese people, probably because the rate of fat removal is slower.

As interesting and useful as all these methods are, we are probably only going to be able to use them for another generation or two, since atmospheric 14C levels should be back to their pre-cold war levels by about 2020.  However, there will also be a more long-lived legacy of the bomb pulse: the sudden spike of isotopes in the geological record. The sediments being laid down today will contain organic matter with higher levels of 14C. Will this become a distinct ‘event boundary’ like the iridium-rich K-T boundary that records the arrival of an asteroid and the extinction of (many) dinosaurs? Geologists are currently arguing about that possibility as part of the wider debate about whether to formally recognise a new geological epoch – the Anthropocene. Informally, the Anthropocene designates the modern age, under the hypothesis that human activity has changed the planet as profoundly as many other major geological events. Some geologists argue that the bomb pulse would be the best candidate for the official stratigraphic boundary of the Anthropocene. It’s unambiguous, global, and sharp.

To have a debate about how geologists in the future should classify the evidence of our existence is a charmingly human activity. I look forward to the squabble continuing for some decades. But whatever the outcome of the debate, no matter how long our civilization lasts, whether it flares magnesium bright or fades into the darkness, it will be much, much longer before all traces of our existence are gone. We were definitely here.

Assorted additional information about atomic bombs, atomic bomb tests and atomic bomb pulse testing:

For mere mortals:

An archive of historical material related to nuclear weapons, including an eerie gallery of mushroom cloud photos. And another one at The Atlantic, with more variety and bigger pictures.

An interesting article from Gareth Cook about whether the bombing of Hiroshima and Nagasaki was responsible for the end of WWII: Why did Japan Surrender?

Elizabeth Kolbert explores the Anthropocene in National Geographic.

For mere mortals with access to fancy academic journals beginning with ‘S’:

At Science, read about how Kirsty Spalding, an Australian postdoc, developed the ‘bomb pulse’ dating method by collecting horses’ heads from the abattoir and analysing the brains with very sensitive isotope detectors.

Also at Science, the Anthropocene debate continues amongst geologists.