
Continuity and change in American racism

On a late spring afternoon, my mom was standing in line at the Big Star getting groceries before heading home to make dinner for her husband and tend to her baby. In the checkout line, a voice came over the loudspeakers and announced that MLK Jr had just been shot and killed downtown. The checkout boy smiled and said “They finally got the son of a bitch.”

That was April 4, 1968, in Memphis. I was the baby. To just about any of you reading this–including my mom–this will seem like a long time ago. Two likely reactions to this vignette, in light of when it happened, are: 1) That was a long time ago and things have changed a lot since then, and all this yelling about racism is seriously overblown; and 2) That was a long time ago and apparently nothing has changed, despite the civil rights movement and all the laws and trying to re-educate people and all that. So we’re screwed.

Both reactions would be understandable, and fortunately, they are both wrong.

As a historian, it’s my job to pursue abiding questions in human history whose examination takes place over the long term, regardless of current events. But it is also my job as a historian to offer whatever insights I think history might have to offer as we grapple with pressing problems that may seem of the moment, but are actually deeply-rooted in our history.

The fact is that things HAVE changed a lot since I was born. As humans, it is natural enough that we would prefer to focus on that, because it feels good. But there is another fact, on which we would rather not focus, because it makes us frustrated, angry, and depressed. That fact is, of course, that things haven’t changed as much as they needed to, or as much as we’d like to believe they have.

It is true that changing deeply-embedded social attitudes takes time, and that fifty-odd years is not that much time in societal terms. But we mustn’t use an acknowledgement of that reality as a cover or excuse for the fact that this society has been largely stalled on that front for much of that period. Until we acknowledge that, and understand it, we have what seems like a valid reason to do nothing and just wait for “time” to take care of it. It won’t. It will just get worse.

That’s because racism is still well-ensconced in the bedrock of American social consciousness. That’s obvious right now, in that the most vocal, overt racists have been emboldened by the politicians they helped put in office to act out in ways no one my age or younger has ever seen. (My parents, on the other hand, saw plenty of it–and much worse.) While of course it causes most of us great dismay, it’s encouraging in a way that this is happening, because it forces the rest of us to pay attention to the fact that something we would like to think has gone away is still very much with us. On the other hand, it would be easy to comfort ourselves by assuming that these are the ONLY racists, and that if we can just isolate and neutralize them, we can isolate and neutralize racism–and, after all, they are a desperate fringe on the political ropes, right?

If only. But racism in this country–and elsewhere in the world–is much bigger and deeper than that. We can root it out, but only if we acknowledge its existence, call it for what it is, and then start doing something about it. We haven’t done that yet. That’s why things haven’t gotten any better than they have. I hope some of you find that encouraging–especially those of you much younger than I. You can do this–if you want to. The problem is, this country has never really wanted to. And that’s largely because racism, as I said, is much bigger and deeper than the shouters and the shooters. The history will help explain that.

A big question students of U.S. history ask is, Why doesn’t the U.S. seem to have a class system, the way our mother country and other Old World societies do? There are two answers to that. The first is, we do, but we don’t like to acknowledge it, because it interferes with our central national cultural myth of equality and opportunity. The second, though, validates the question, because it is true that our class system is different. Part of the difference is structural and part of it is psychological, and one reinforces the other. The difference begins with the absence of a hereditary aristocracy in this society. It was never transplanted, because hereditary aristocrats have no need to go hack and sweat their way through a new world; they’re already masters of the one they’re in. With no firmly-ensconced upper class, there was room for the creation of a new one based on something other than birth. While early Americans certainly inherited the cultural assumptions of “good breeding,” and meant something far more substantive by the term “gentleman” than most contemporary Americans could ever understand, the class structure was more fluid here.

The second major reason for the distinctiveness of our class system from that of the Old World is racism. The entire British American enterprise, once it began to take off in the 1640s, was based on the wealth produced by captured, transported, enslaved Africans–part of an Atlantic slave trade that ultimately carried more than 12 million people. The more we study the early modern British Atlantic world, the clearer that becomes. That was, of course, unmistakable in Barbados or South Carolina, but it was just as true of the commercial ports of Liverpool, Boston, and New York–just one or two steps removed. Racism as we know it–the assumption that one ethnic group is inherently superior to another, giving it the right to rule and exploit–or at least to denigrate and marginalize–the other group–coalesced, hardened, and wrote itself into law as the slave economy grew.

The Civil War was fought against slavery, not against racism. The national effort to ensure a new society in which black Americans had an equal opportunity to participate lasted 12 whole years after one of the worst wars in modern history, before the country lost interest and let the Southern white supremacists take over again. Why? Racism.

The Civil Rights Movement of the 1950s and 1960s WAS a fight against racism. It enjoyed remarkable and unusual success in changing the legal structures that had institutionalized racism, particularly in the openly-apartheid South.

But racism was never, and is not now, a Southern phenomenon. When MLK’s marchers demonstrated against housing discrimination in and around Chicago, they were attacked and beaten by mobs of local whites. The ghettos of Northern cities weren’t there just because people like to live around their own people. They were there because those people weren’t allowed to live anywhere else.

In 1970, the Federal judge for whom my dad worked wrote the first court-ordered busing plan to desegregate public schools in compliance with the law. It was called Plan Z. The death threats were bad enough that he was under the protection of the U.S. Marshals for a time. But by then, Nixon and the Republican Party had already successfully deployed their new “Southern strategy,” exploiting the resentment and fear of Southern whites who were watching the riots in the North and California on TV, and who were happy to support calls for “law and order” and appeals to “the silent majority.” In 1968, George Wallace had run as an open segregationist independent and taken enough votes from Nixon to make his victory a narrow one. It wouldn’t be narrow in ‘72. I used to think that a neo-Wallace candidacy would be impossible now. That was before 2016. A lot of us were happily complacent about a lot of things before 2016.

And this past week, a subcontractor for my wife’s parents was arrested after brandishing an automatic rifle at a group of Black Lives Matter protesters. That brings us up to 2020.

Racism is tightly bound up with poverty. That’s one of the main reasons it’s so hard for us to get our heads around it–those of us who are trying to do that, anyway. The two feed each other. When you set up a society in which one ethnic group is systematically, automatically disadvantaged–which our ancestors most certainly did, whether in Philadelphia, Pennsylvania or Philadelphia, Mississippi–that group is going to be disproportionately poor. That poverty will persist, generation after generation, artificially amplified and structured by an entire complex system of discrimination. Both the discrimination and the poverty will wreak havoc on the souls of those in the affected group, as they always do with human beings. Crime, self-destruction, hopelessness, an inability to take control over one’s own life and impose direction and purpose on it–these are the classic individual and social pathologies of poverty and disadvantage.

The white poor share much of this, of course. There are more white people on welfare in this country than black people, simply because there are more poor white people in this country than poor black people. But black people are affected out of proportion to their numbers, because the other element–the racist discrimination element–affects them uniquely.

That brings us to something MLK Jr and others figured out early on: there is no fighting racism without tackling poverty–black and white. And brown–historically, there was no place in the U.S. where you would find a more toxic concentration of the pathologies inflicted on a group of people by poverty and marginalization than on an American Indian reservation.

Lower-class American whites have always been able to direct their resentment toward black people rather than at the classes above them. “Race” trumps class in this country. That is the final piece of the answer to the question about the American class system. Those with more wealth and power–white of course–have always been able to exploit this lower-class white resentment whenever it suited their interests. But that paints poorer whites as passive dupes, and a close examination pokes holes in that assumption. “Why don’t poor white Americans vote their class interest?” ask perpetually-befuddled observers from outside our society. The answer is, because they are voting their RACE interest, which carries very real privilege. They believe this privilege gives them an advantage in life, and that surrendering it–allowing the non-privileged to have equal access–is acquiescing in their own displacement.

This is the fallacy of the zero-sum game–the fallacy that Adam Smith famously exposed in 1776 when he touted the advantages of free-market capitalism (as we call it) over imperial mercantilism (closed-trade systems). The fallacy of the zero-sum game is that wealth–or power–is of finite quantity, such that if I take a bigger slice of the pie, it leaves everyone else a smaller one. In the zero-sum game worldview, it’s us against them. Take yours before someone else gets it. But the best thing by far that free-market capitalism taught us is that wealth can be created. We can create opportunity.

It is true that, today, we are being forced to face the very worst thing about free-market capitalism: that it assumed unlimited growth based on unlimited resources, and resources, it turns out, ARE limited, so we must learn not to be wasteful and to turn our one-way throw-away economies into more circular ones. That has people scared, and it fuels that defensive instinct to hold on to what’s yours and to see the disadvantaged as a direct threat–especially if you are already on the bottom rung of the ladder.

But there is more to it, and the rest of it is not as easy to read, or as easy to excuse. The rest of this will not shy away from some ugly truths. Historians can’t afford to do that. What follows may offend you. I certainly hope so.

We humans have an innate desire to feel superior to each other. It takes an exceptionally healthy and happy soul to be free of that. Primates are socially hierarchical. When you feel close to the bottom of your society, you are likely to cling fiercely to your sense of superiority over anyone below you. As Gene Hackman’s character relates to his FBI partner in the movie Mississippi Burning, his poor-white father once said to him, “Son, if you’re not better than a nigger, who ARE you better than?” It is certainly true that saying such a thing out loud is not as accepted as it used to be. That is because things HAVE changed. But it is equally true that a lot of Americans still feel that way, whether they admit it to themselves or anyone else or not. Things haven’t changed as much as they need to.

Rich people, of course, can afford to distance themselves from such impoliteness, and always have. They don’t have to speak the words and they don’t even have to feel them. They just reap the profits from the entire system. Their sense of superiority is on much firmer ground.

But even MLK didn’t ask us to stop judging each other. He just asked us to stop judging each other based on ethnicity. That was in 1963. We name roads after him and have a holiday for him, but that is self-congratulatory. He was far more radical than we’d like to acknowledge. He understood the connections between racism, systemic poverty, and the distractions of imperial wars, in which the shock troops are poor kids in uniforms.

Here’s what racism really truly looks like in everyday, “respectable” life in our society. I am going to put things into words here that are not my words. Please do not lift them out of context or quote them so that they might be mistaken for my own words and attitudes.

It’s not just the overweight working-class white thirty-something brandishing the automatic rifle at the protesters. It’s your nice, unassuming retired aunt who reluctantly admits, after a few drinks and more than a little pressing, to being racist. She would never brandish a rifle in public. She deplores that. But she votes.

It’s the retired couple from up North who, in the course of calmly justifying their choice in 2016, sincerely offer the observation that, in their experience, black people are the only group of people who refuse to work–who just want everything handed to them. 

It’s the guy on the corner yelling at the BLM protesters because the police have good reason to be jumpy with black men, since most of the criminals are black men, and once in a while, a mistake will be made, and there’s no such thing as white privilege when white people have to work for everything and black people just get special treatment. And it’s too bad about the kid in the Chicago projects but the sad truth is, he would have just grown up to be a drug dealer anyway. That guy votes.

And it’s the rich businessman in the two million dollar house on the river with his gigantic homemade sign who would never admit to being a racist. He votes. And donates.

Here’s the crux of today’s American racism: The problem with black people is black people, simple as that, and while the rednecks with their Confederate flags are obnoxious and distasteful, they aren’t wrong. Americans who subscribe to this will be careful about when, where, and to whom they say this out loud–if they say it out loud at all–but it’s there. And it is either the tacit or explicit opinion of the majority of people who have decided many of our most recent elections.

And that will only change when those of us who do not share this opinion out-vote them. The numbers are there. But not enough show up. And then they get gerrymandered…but we have made real progress pushing back on that. Now is the time to take advantage of it. Racists vote.

But in everyday life–when it’s not election day, and when there’s no protest–we just have to start calling out racism for what it is. In my experience, that has not led to huge unpleasant scenes. Then again, I’m not putting myself in social situations with the hard-boiled and then challenging them. That’s not for me. If you can do it, you have my respect.

Calling it out for what it is will be easier when we know it’s out there and we’re prepared to hear it from “nice” people. Then we’re not so taken aback, kicking ourselves later for not saying anything. It IS out there. It’s everywhere. And as long as it can hide, it’s safe.

One thoughtful push-back to a racist comment, and one vote, may not seem like much. But that’s everything. That’s the solution. If you’re doing it, you’re not the only one. And network researchers have shown that one such action has quantifiable ripple effects, spreading outward through social ties in ways that you, as the instigator, will never know about. But don’t take it on faith; there’s evidence.

That checkout boy in 1968 might or might not feel emboldened to say the same thing today. The fact that we don’t know means we have lots of work left to do. But it’s work we can all do, in our regular lives, while we go about our regular business. I don’t remember if my mom said anything to the checkout boy. What I remember is that she was terrified, as they announced the city-wide curfew effective immediately, in anticipation of rioting, and she rushed to get home to her baby. We all have our own lives and our own people to take care of. What I do know is, in 1968 in a white grocery store in Memphis, if she did say something reproachful to him, she would have been in a distinct minority. That is not true now. For all those who suffered so much to get us to that point, and for ourselves and our kids, let’s build on that.


Getting A First Book Published: A Walk-through of the Process


I’ve distilled the publication process, from finishing a manuscript to release, for new scholars who haven’t gone through it yet, or for anyone who’s just idly curious. My experience with Brill was nearly perfect, so I wouldn’t expect the process to get much better than this.

I submitted my dissertation in February 2017. I knew I wanted to write a book based on it, but that the book would be more than a revision and expansion of it. I gave myself two years to do the research, write the book, and get a contract. Far and away the best circumstances under which to do that are under the relatively lucrative umbrella of a postdoctoral fellowship, which is analogous to a medical residency. For either one year or two (two if you can get it), you are paid enough to live on while you work on the book project. Most, but not quite all, are tied to a specific institution; you are required to be in residence there. Some have a light teaching load attached. Since we were not going to be relocating, those were out. I applied for the rest, and got none of them. They are of course ridiculously competitive. Humanities and social science funding in this country is not exactly a national priority. Fortunately, I have a spouse who is happy doing something that makes her a good living. My only direct financial assistance was a $400 grant from the Society for Nautical Research in the UK, which allowed me to hire a capable researcher to do some most valuable in-person work at the National Archives and the Bristol Archives.

I had the manuscript close enough to finished to begin querying publishers in the fall of 2018. That requires writing a book proposal. I got some help on how to do that from someone I knew who had been through this, and also looked it up on more than one potential publisher’s website. I first submitted to ****** UP because they had just launched a series that I thought could be a good fit. They quickly informed me otherwise. Next, I went to the UP of X that publishes one of our primary journals and the monograph series that goes with it—both of which were co-founded by one of my mentors, who sat on their editorial board. Alas, he had died in the meantime. They rejected it too, which surprised me, but there is absolutely no point in dwelling on that for even five minutes (as is true for applications to schools and for grants).

At the same time (December 2018), I had submitted to Brill, when I realized that they had a series of monographs in the history of technology that seemed an obvious fit. Also, the series editors were, unbeknownst to me at the time, two scholars with whom I had worked to put together a conference panel that fall. One of those had actually presented a paper in the panel I had organized, and we had chatted quite a lot at the conference. So, they knew me and knew my work. Brill’s acquisitions editor responded while the UP of X still had the proposal and said he was interested. I put out a panic query to people who had been through this, who were kind enough to advise me to put Brill off nicely while I waited for UP of X. Most fortunately, Brill was still interested after UP of X passed, and so the proposal went from their acquisitions editor to the series editors—the ones I knew. Lesson: while simultaneous submissions are not forbidden in book publishing, as they are with journal articles, I don’t think I would do it again. Too potentially awkward and stressful. More Important Lesson: presenting at conferences can be worth far more than it might cost, and far more than you might realize at the time.

An aside is in order here. The acquisitions editor works for the publisher. He’s basically a buyer, though he can be as involved in the editorial process as he wants. In this case, given that there were two series editors and an editorial assistant, he was not involved at all past the acquisition, so far as I can tell. Series editors do not work for the publisher. They are scholars in the specialty who probably conceived the series themselves, and are primarily responsible for approving new titles and overseeing the editorial process once a title is acquired for the series. The acquisitions editor cannot accept a book for such a series without their consent. By the way, the only people here who are getting paid are the publisher’s employees—the acquisitions editor and the editorial assistant.

By 18 January 2019, the series editors had approved the acquisition, and within a week or so, I had submitted the complete manuscript to Brill. The next step is to send it out for anonymous peer review. This entails the publisher’s asking (usually) two scholars in the field to read the manuscript and opine on whether or not the publisher should publish it, and if so, with what revisions. It can take time to find the reviewers, and it takes some time for them to review the manuscript. I was fortunate; Brill was prompt in doing this, and the reviewers were prompt in getting it back to them. Also, in this time period, the series editors will make their editorial recommendations for revisions.

It was late July before I knew for certain that the book was going forward, based on the submission of the external reviews and the series editors’ reactions to those. With so many cooks in the kitchen at this stage, it may be necessary to do a little polite inquiring as to status, as communication can break down between one or more of the parties. This proved necessary for me. I had a back-and-forth with one of the series editors, who rode point through the process, and we got exactly on the same page about what revisions were going to happen. The series editors have full latitude to do what they want with the external reviews. Ours were mostly helpful, but the series editors did not want to follow every single suggestion. (External reviewers can be grumpy, even if they basically approve of the work.) While I was working on the revisions, I was also working on acquiring suitable images and permissions to use those images. This is a Royal Pain in the Ass and I Am Not Lying, but there’s no getting around it. Fortunately, the publisher had given me a handbook that treated in detail all aspects of getting the revision ready, including this one, and the editorial assistant was there to help if I needed her. She and the series editors had definite opinions about the illustrations and captions, and I welcomed that. None of their suggestions were objectionable to me. Keep in mind that the publisher will dictate what types of, and how many, illustrations you may use. Much of that has to do with cost. Brill was perfectly happy with lots of color plates, but then again, my book costs $153.00. Publishers who want to sell books more cheaply, let alone put them out in paperback, will generally not allow such extravagance.

Repositories will charge you big money for the rights to reproduce images from their collections–something I object to for more than one reason. The more widely they think the publication will be distributed, the more money they want. This can amount to hundreds or thousands of dollars. My main series editor knew from experience that German archives and museums don’t do this, so he advised using them. In the end, I found that the same was true of the Swedish Museum of History in Stockholm, which owns the originals of most of the technical drawings I wanted to use. This was a godsend. In the U.S. and UK, however, forget it. The editorial assistant decided she really wanted a certain image in there that’s owned by the National Maritime Museum at Greenwich. It cost £50, which Brill paid. The publisher might be willing to pay modest costs here; they will stipulate that in your agreement.

You have to have a permission form signed by the copyright holder of every image you want to use, and those forms have to be transmitted to the publisher, as they are potentially liable for any copyright infringements (although they stipulate in your contract that you are ultimately responsible for making sure you are not violating copyright).

Also, at this point, I filled out an Author Questionnaire for them, which compiles the information necessary for marketing. You select the key words for search engines, you write a short abstract, tell them who you think the readership will be, and you tell them about specific journals you think they should submit the book to for review, and specific awards for which the book might qualify.

I submitted the revision and all the permission forms late November 2019. So, ten months since initial acceptance. The series editors then have to read and comment on the revision. If they approve it, then at that point you will get a contract. The publisher will not formally commit to publishing the book until they have the requested revision in their hands and have approved it. I signed a contract on 19 November and they had received it by 6 December. (They’re in the Netherlands.) Finalizing images and permissions must be completed before the book can go into production. This required some back-and-forth up to Christmas 2019.

The contract spells out the respective obligations of author and publisher. Basically, the author commits to completing revisions and other work stipulated by the publisher in a certain time frame, and the publisher commits to publishing and marketing the work in a certain time frame, provided the author has satisfied their stipulations. It spells out who has copyright and what use may be made of copyrighted material by the author. These terms, in my case at least, are quite generous toward the author. The contract will also specify number of author copies, percentage of author discount on additional copies and on other books, and how royalty distribution works. (In general, one does not realize royalties on academic books.) I found the contract easy to read and, while my impression was that there might be some wiggle room for negotiation, the terms were fine with me as initially offered.

The book went into production 6 February 2020, and at this point I was working with a production editor. Her job was to see the book through the production process and, ultimately, send it to the printer. Meanwhile, she would be working with me, on the one hand, and a typesetter on the other. The typesetter isn’t really a typesetter anymore, but someone who takes the files and converts them into final form for publication. The files they are given are already damn close; you have used the font they asked for (their house font, which you download), you have formatted everything exactly as stipulated, and you have copy-edited the crap out of the manuscript (if you’re good, you’ll still miss a few things; I’m good, and I did).

Somewhere in here, the book will go up on the publisher’s website as a forthcoming title. This same page will be where buyers can order it when it is out. In our case, there was also a PDF flyer that could be distributed.

The series editors and the production editor commented on the submitted revision, and we made a few minor tweaks. Then she sent it to the typesetter for conversion into proofs—the final-layout form that I would then proofread, correct, and send back. Meanwhile, I had to compile the index, minus page numbers, as those would only be available once the proofs were done—and even then, only valid if the pagination didn’t change during the correction process. (The index took me two weeks of full-time work, in two separate stages. I’d never done one, so of course I looked up formatting rules in the Chicago manual. If you were using MLA or whatever, you’d look it up in theirs.) I submitted corrections to the first proofs on 13 March. All of the errors I found were mine, not theirs. Fortunately, none of them were major enough to mess up the pagination of the proofs, so I was able to paginate the index too. It’s important to note here that the publisher expects a clean copy of that revision to send to the typesetter the first time. They have to pay the typesetter and they do not want you coming back wanting major edits that mean the typesetter has lots of work to do to re-work the proofs. In fact, they reserve the right to charge you for it if you do. You are only allowed to correct typos. You cannot decide that you want to re-phrase your assessment of Smith’s book on spinning wheels.

By 31 March, they had sent me the final proofs to check and I had let them know they were fine. By 10 April, the book had gone to the printer. The e-book was published on 14 April and the hardback on the 16th–two weeks earlier than the final publication date on the website. So, the entire process from acceptance to publication took about fifteen months. At this point, they will be selling it to academic libraries, primarily.

The process is, I’m sure, somewhat different for established scholars, let alone distinguished ones. I hope this is a helpful walk-through for other first-timers or aspiring academic authors.

A quick update…

So quick in fact that I’m going to bullet-list it:

  • It so happens that two books are coming out on the same day–30 April–the edited collection to which I have contributed, and my first monograph. The Publications page has all the details; I just updated that. (Link will open in a new window.)
  • I am getting toward the end of the stack of secondary-source reading for Book 2. I probably have another six weeks. Speaking of Book 2…
  • As I wrote on the Publications page, and sent out over social media, I was unable to secure funding for the second necessary archival work trip to Maryland this summer, despite applying for all the grants and fellowships I knew about. So, to keep the project on schedule, as I’ve already applied for a big 2021 grant for Book 3, I started a GoFundMe campaign to raise the necessary $2,100, or as much of it as I could. If you would like to know more about that, it’s on the Publications page, and there’s a link at the bottom of each page of the website. We’re about 2/3 of the way there. Contributions in any amount are appreciated and will be properly acknowledged. (Links will open in new windows.)
  • I hope everyone reading this is well and getting along all right. We’re fine here.

No ships, not much history, but hear me out…

(It’s a how-we-think-about-technology sort of piece.)

One of my chief goals as a historian of technology is to help myself and my readers understand the tacit assumptions we make about technology without realizing it–what we take for granted as truth rather than what it actually is: our perspective, which, like any limited perspective, can do much to obscure, rather than reveal, the whole truth. That’s just a specific example of perhaps the chief goal of history as an intellectual discipline: the enterprise of getting beyond our own limited perspective far enough to realize that that is exactly what it is–a limited perspective, based on limited experience–and that the perspectives and experiences of people in the past were different. (Anthropology does the same thing, but for people removed from us spatially rather than necessarily temporally–though anthropologists study people of the past, too.)

The most common assumption we make about technology is the assumption of “progress”–that overall, despite some bumps and hills and valleys, technology is improving. Of course there are many ways in which that’s at least partially true, but the idea inevitably oversimplifies, at best, and greatly distorts, at worst, what’s really going on.

Perhaps the most helpful basic caveat we can apply to the general notion of “progress” is to acknowledge the reality of pros and cons–of advantages and costs. Every technological choice has a cost, whether or not it has benefits and regardless of what those benefits may be. When we remember to consider the costs of a technological choice, we ensure a fuller understanding of that technology, and our relationship to it. And by “cost” I do not simply mean a figure of currency. I mean the trade-offs that must be accepted when choosing one technological option over another.

I find today’s automotive technology, and the marketplace in which it’s bought and sold, an especially helpful example of how to think about technology accurately. At this point, the technology has been around long enough to have existed in more than one cultural milieu. It has been heralded and condemned, loved and loathed, credited for making “modern life” possible for most people and convicted as a glaring example of why “modern life” must change. I don’t have the time or space to write a short history of the automobile in cultural context, so I’ll stick to an assessment of it in my own society right now, and refer to its broader history in passing.

I write this at the moment when the internal-combustion-engine-powered automobile has peaked, and is now on its way out. If ever there were a perfected technology, we have experienced it in the cars of the early 21st century. A 2006 car, built toward the top of the quality scale, is as sound a rejoinder to “they don’t build ‘em like they used to” as we’ll find. Those of us with some years and miles on us remember the cars of our youth, and it’s only those who don’t know much about cars who think that the cars of any time before 2000 were “better” than those of our own day.

I might as well go ahead and use the word “better” so we can expose it for the intractable problem it poses to really understanding technological choice in human life. As noted, I think the example I just used is as strong a defense of the use of “better” as we’ll find. A 2006 car is likely to be safer, more reliable, more durable, more comfortable, faster, better-handling, more fuel-efficient, less polluting, and equipped with more convenience features than its closest equivalent from any point in the past. Meanwhile, the cost of that car, new, remained within reach of the middle-class buyer, and its durability meant that, used, it presented a more attractive option to the buyer on a stricter budget.

What I just wrote is a fact, because I did not use the word “better” in an overall, general sense. I can defend, with specific data if necessary, any of the specific comparisons I just made. But observe what happens when I write this: “Any 2006 car is better than a ‘67 Pontiac GTO.”

1967 GTO

Photo by Greg Gjerdingen, CC-BY-2.0, https://commons.wikimedia.org/wiki/File:1967_Pontiac_GTO_(14040324408).jpg

Say that in certain circles and you will be more or less attacked, and not just because you have unwittingly stumbled into a group of crazy people. Without delving into the details of why this is so, we can safely lump all those reasons under “aesthetics.” From the style of the body to the sound of the engine to the nostalgia and associative power of the older car, it has a combination of attributes that, to its devotees, make it “better” than what they could buy for the same money as an outstanding example of the old muscle car is now worth—say, a brand-new Mercedes-Benz E400 mid-size sedan, full of technology no one was even thinking about in 1967.

MB E class

Photo by Vauxford, CC-BY-SA-4.0, https://upload.wikimedia.org/wikipedia/commons/9/9b/2019_Mercedes-Benz_E220d_SE_Automatic_2.0_Front.jpg

Comparing more recent cars to each other requires distinguishing between differences that are more subtle and easier to overlook, but still important. Like any widely-used technology, the car is shaped not just by designers and engineers seeking aesthetic and physical performance attributes, but by cost constraints, materials availability, and a legal-regulatory environment.

Right now, the legal-regulatory environment is driving automotive technological choice perhaps more than any other pressure. In 1967, few were concerned about fuel consumption or emissions. There were no laws requiring drivers and passengers to wear seat belts, though the technology did exist. But the car runs in a drastically different cultural milieu now, in which only the reactionary and oblivious are not acutely concerned with reducing the consumption of fossil fuels and the emission of CO2. Seat belt laws are only one item in a long list of mandated safety features–features that have turned countless would-be fatal accidents into survivable ones. We are rapidly developing viable all-electric cars, and electric motors lend themselves quite well to that application, with their instant, generous torque, quiet operation, simplicity, and longevity. Batteries are another matter, but enough resources are being thrown at battery technology that we have already seen substantial lengthening of range and shortening of charge times just in the past few years. Meanwhile, CAFE (Corporate Average Fuel Economy) regulations and their equivalents in other countries have driven automakers to make substantial changes to power trains and to pursue weight reductions to eke out 1 to 2 mpg more fuel economy per vehicle per model cycle. Weight reduction is an across-the-board win, except in cost; it is usually more expensive to make a vehicle with the same strength, rigidity, and longevity but lighter weight, as it requires more expensive materials, such as aluminum alloys developed for aircraft and high-strength steels. Power-train changes made for fuel economy are what I want to focus on, because that is where the valuable illustration of technological relativism, so to speak, lies.
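
How that CAFE average is computed matters, because the math itself pushes design choices. The real regulation layers footprint-based targets, credits, and other adjustments on top, but the core calculation is a production-weighted harmonic mean, which is why a single gas-guzzler hurts the fleet number more than a single economy car helps it. A minimal sketch, with invented figures:

```python
# Toy sketch of CAFE-style fleet averaging (all numbers invented).
# The core statutory calculation is a production-weighted HARMONIC mean
# of model fuel economies, not a simple average.

def cafe_mpg(fleet):
    """fleet: list of (units_produced, mpg) pairs -> fleet-average mpg."""
    total_units = sum(units for units, _ in fleet)
    gallons_per_mile = sum(units / mpg for units, mpg in fleet)
    return total_units / gallons_per_mile

fleet = [
    (300_000, 38),  # hypothetical compact sedan
    (500_000, 27),  # hypothetical mid-size crossover
    (200_000, 18),  # hypothetical full-size truck
]

print(f"{cafe_mpg(fleet):.1f} mpg")
# ~26.6 mpg, versus ~28.5 for the production-weighted simple mean:
# the low-mpg truck drags the fleet number down disproportionately.
```

Under a harmonic mean, a one-mpg gain on the 18-mpg truck moves the fleet average several times as much, per vehicle, as the same gain on the 38-mpg sedan–which tells you where the engineering pressure lands.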

While it is too early to know, I strongly suspect that the power trains of the internal-combustion-engine-powered car peaked in the first decade of the 2000s in terms of longevity, reliability, and aesthetics (sound and feel). While such advances as electronic fuel injection (widely-available since the 1980s), variable valve timing and lift (1990s), and the six-speed automatic transmission (early 2000s) eliminated waste and contributed to long-term reliability, power trains in this period could be over-built and under-stressed. Most engines were normally-aspirated, rather than turbo- or supercharged. Computer control kept everything running in tight spec, contributing to the smoothness, low maintenance, and efficiency of the power train, and allowing transmissions to respond immediately and smoothly to input, keeping the engine in its optimum power band. By 2010 or so, these power trains were proven to be good for virtually limitless service with only basic, and inexpensive, maintenance. Exhaust systems with catalytic converters and oxygen sensors put out quiet, minimal exhaust an order of magnitude cleaner than what was possible in 1967, without compromising the performance of the power train.

But the tightening of the regulations was relentless, as governments pursued more ambitious targets for the reduction in fuel consumption and CO2. Pushing past the old maxim that “there’s no replacement for displacement,” automakers began to substitute smaller engines for larger ones, adding turbocharging and direct injection to make up the power and torque losses. Six-speed transmissions quickly became yesterday’s news, replaced by seven, eight, nine, and even ten-speed boxes. As for the manual transmission, the joystick of real driving, it has become almost a unicorn. Computer-generated, synthesized “engine sounds” are now piped through the sound systems of “sporty” vehicles, as these new powerplants cannot deliver the sound of a normally-aspirated V8 or V6 (or, for that matter, a high-revving performance-built I4 or I6).

So far, these efforts have paid off; the manufacturers are meeting regulatory requirements and meeting demand. But what about the costs? It’s certainly true that new cars are seriously expensive relative to past markets. They have to be. And with cars just a few years old so excellent, buying new is nowhere near as compelling a choice as it was when I was young, when cars weren’t as durable and new ones were cheaper than they are now. Aside from purchasing costs, though, there are—or may well be—others. Aside from aesthetics, the new power trains may be decreasing reliability and durability across the board (it’s early yet). Adding turbocharging to a small engine adds stress and heat to that engine, and complexity to the power train. Direct injection has proven to introduce premature carbon build-up in some engines. Newer transmissions have had trouble finding and holding the right gear, and are programmed so aggressively for fuel economy that they tend to up-shift too early for  optimum engine performance, unless put in “sport” mode, and thus defeating the entire purpose for their complex and relatively-untried existence. For those of us who celebrated the development of the attainable car to near-perfection in almost every way, lending itself so well to long-term satisfying ownership, the latest developments raise concerns that we may be going back to an automotive marketplace more in line with the short-attention-span, disposable-goods culture that we so desperately need to get away from. I hope not. I don’t think the challenges posed by the new technologies are insurmountable for automotive engineers. But I do wonder whether they will be given the time necessary to work out the kinks in them before they are phased out to be replaced by something else. Some of that probably depends on how quickly electric cars become viable on the mass market.

Regardless, one cannot say that today’s cars are “better” than the cars of ten years ago. Once we are forced to define what we mean by “better,” it’s instantly clear that “better” in some ways means—or likely means—not as “good” in other ways. What are your priorities? What are you willing to sacrifice to have something else? That is how technological choice always works; it’s just that we so often don’t see that, because we live under the illusion of “general progress”; everything is basically getting better all the time.

No it isn’t. It is just getting different.
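
For readers who like to see a thing formalized: once “better” is a bundle of attributes, products form only a partial order. One option can be called flatly better than another only if it matches or beats it on every axis at once; otherwise the two are just different, and the choice comes down to your weights. A toy sketch–the cars, attributes, and scores here are all invented:

```python
# Toy illustration: "better" is only a partial order once a technology is
# judged on multiple attributes. Cars, attributes, and scores are invented.

def dominates(a, b):
    """True if a matches or beats b on every attribute and beats it on one."""
    keys = a.keys() & b.keys()
    return all(a[k] >= b[k] for k in keys) and any(a[k] > b[k] for k in keys)

gto_1967 = {"sound": 9, "style": 9, "safety": 2, "economy": 2, "reliability": 4}
sedan_2019 = {"sound": 4, "style": 6, "safety": 9, "economy": 8, "reliability": 9}

print(dominates(sedan_2019, gto_1967))  # False
print(dominates(gto_1967, sedan_2019))  # False
# Neither dominates the other, so neither is "better" full stop;
# which one wins depends entirely on the weights you assign.
```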

I know I’m getting to that age where it’s natural to become suspicious of the new and to cling, to some extent, to the familiar. But I also know what it’s like to own a well-built, satisfying car for many years and like it just as much as I did when I got it, or more. So, next time, I’ll be buying a used, over-built, normally-aspirated V8-powered modernized throwback with a sterling reputation and not-so-sterling fuel economy, and hoping gas stays “cheap” for a few more years.


The Logs of HM Schooner Sultana

We got home two days ago from a ten-day research trip to Chestertown, Maryland, funded by a Carter Fellowship from the Early American Industries Association. My second book project is a “biography” of sorts of HM Schooner Sultana, built at Boston in 1767 and used by the Royal Navy as a customs-enforcement interceptor on the Eastern Seaboard between 1768 and 1772. Because the Navy prepared accurate draughts of her hull and rig, made a detailed inventory of her equipment and specifications, and preserved her logs and muster books, we have a record of this vessel unheard-of for a similar one in normal merchant service. Based on that information, a dedicated group of people designed, built, and launched a replica of Sultana between 1999 and 2001, and she has operated as an educational vessel on the Chesapeake ever since, supported by the Sultana Education Foundation. During the design process, the Sultana group ordered copies of all the official documents pertaining to the original schooner from the Public Record Office in London. It is those documents that I have been allowed to read, thanks to the generous hospitality of Drew McMullen, Executive Director of the Foundation, and his friendly, supportive staff.


Sultana had both a sailing master, David Bruce, and a commander–Lt. John Inglis, an American who would remain loyal and eventually retire from the Navy as an admiral. On this visit, I got through Bruce’s log, which covers every day of the schooner’s constant service from July of 1768 until early December 1772. Through the sometimes-almost-inscrutable handwriting and “creative” orthography, I got a terse summary of day-to-day events far too demanding for our modern sensibilities of risk tolerance and comfort: the harrowing gale-lashed passage from Deptford to Halifax; the boarding and searching of merchant ships up and down the East Coast of North America in all conditions; the keeping of a crew of 25 on a vessel too small for that many men, with rampant desertions, occasional impressment, a few floggings, and at least two deaths; and, finally, a second transatlantic passage in the other direction which, like the first, almost proved disastrous.

Sultana was part of an effort to enforce the Townshend Acts, themselves intended as instruments of a reformed British Empire able to meet the substantial challenges of its sudden post-1763 expansion and its depleted Treasury. She and her sister vessels were effective–perhaps too effective–in this role, subjecting British American maritime commerce to a constant scrutiny to which it was not accustomed. The resentment created by the use of naval vessels for commercial policing ratcheted up tensions between London and America–and between Americans of differing stations and opinions–to a dangerous degree.

I am applying for funding to return to Chestertown for two more weeks of reading, and I will post updates on this project here from time to time. Thanks for reading, and if you are interested in knowing more about Sultana and the Foundation that operates her replica, visit http://www.sultanaeducation.org.


Wooden Ships and Rocket Engines: How Skills Live and Die

This week, we celebrate–as we damn well should–the 50th anniversary of the successful Apollo 11 mission to the moon in 1969. That was, in this historian’s not-at-all-humble opinion, the single coolest thing human beings have ever done. The launch of such a massive vehicle out of Earth’s gravity, to the moon, and back required the most powerful engines ever built–the Saturn V’s F-1 rocket engines. The Saturn V was developed under the supervision of Wernher von Braun, director of the first successful effort to develop a rocket capable of delivering a payload to a predetermined target: the Aggregat 4, or A4, better known as the V-2, the ballistic missile used by Nazi Germany against Allied cities in the latter months of the Second World War in Europe.

The F-1 was the culmination of everything von Braun and his team had learned through trial and error in thirty years of research, production, and testing–combined with all the money the U.S. Government was willing to spend on the project–which, for a time, was anything the rocket men asked for. One of the Apollo astronauts recently observed that we can accomplish just about anything if cost is no object.

S-IC engines and von Braun

The Saturn V was just magnificent. Its five F-1 engines, making maximum thrust, developed 32,000,000 horsepower. Thirty-two million horsepower. That makes me smile like a happy simpleton. And we think a 500-horsepower car is a beast.
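
A caveat on that number, for the curious: a rocket has no single horsepower rating the way a car engine does. The power it delivers to the vehicle is thrust times velocity, so it climbs as the rocket accelerates. A back-of-envelope sketch, assuming the commonly cited sea-level total of roughly 7.5 million pounds of thrust for the five F-1s:

```python
# Where does a figure like 32,000,000 hp come from? Propulsive power is
# thrust x velocity; one horsepower is 550 ft-lbf per second. The thrust
# value below is the commonly cited sea-level total for the five F-1s.

TOTAL_THRUST_LBF = 7_500_000
FT_LBF_S_PER_HP = 550

def propulsive_hp(thrust_lbf, velocity_ft_s):
    """Power (hp) that constant thrust delivers to a vehicle at a given speed."""
    return thrust_lbf * velocity_ft_s / FT_LBF_S_PER_HP

print(f"{propulsive_hp(TOTAL_THRUST_LBF, 1_000):,.0f} hp at 1,000 ft/s")
print(f"{propulsive_hp(TOTAL_THRUST_LBF, 2_347):,.0f} hp at 2,347 ft/s (~1,600 mph)")
# ~13.6 million hp at 1,000 ft/s; ~32 million hp by roughly 1,600 mph.
```

So the famous figure describes the stack already moving at around 1,600 mph; measured this way, the horsepower at the instant of liftoff is near zero, even though the engines are at full fury.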

It’s been a long time, but I grew up visiting my aunt and uncle and cousins in Huntsville during the summers, so I’ve seen a real Saturn V. As Brian Cox remarked about the temple at Karnak, this thing was not built on the scale of humans. It was built on the scale of gods.

It was built by humans, though, not gods, and no one could fault you for assuming that every single detail of its design and construction was documented and filed for posterity. I certainly assumed that.

Then I watched this Curious Droid video last week …


… and learned that, actually, it wasn’t. Such was the rush to get the thing operational, to get the moon mission done by 1970, to honor the murdered President and beat the Soviets, that each F-1 was somewhat different–somewhat hand-built, at a time when computer-aided design was science fiction and the most powerful thing humans had ever built that didn’t involve a nuclear reaction was basically the refinement of 1940s technology.

So, when current rocket scientists went back to the F-1 plans to consider the feasibility of building them again, they determined that it was not possible. Too much of what made the engine work was in the brains of the engineers who built each one, individually, and in their hurried scribbling on napkins and scraps of office paper, long thrown away.

We can build a close approximation of the F-1–a version designed with computer software and the techniques we are accustomed to, rather than those they were accustomed to in 1969. We do certain things a certain way and they did those things differently. Their engine worked. Our engine would work too. But we cannot feasibly replicate the original, because our skills are different from theirs. The fire control team on the USS North Carolina in 1944 would not be able to work the fire control system on the USS Zumwalt in 2019. But the team on the Zumwalt would not be able to work the North Carolina‘s, either. They weren’t trained to.

The F1 engine is a perfect modern example of something central to the history of technology, which is what I do. Technological know-how, like every other aspect of human history, is largely non-linear, despite our unshakable tendency to think of our own history in linear terms. Technological history is not predetermined. Nor is any other aspect of history. The same goal can be accomplished in different ways, and those ways are shaped and guided by the culture doing the shaping and guiding of the technology.

Also, the skill of the artisan–the knowledge carried in the head of an individual human, which is slightly different from that carried in the head of any other human, and which that human can apply to the manipulation of objects–has survived longer than one might assume in our age of blueprints, engineering degrees, precision machining, and mass production.

In fact, mass production and artisanal skill are not mutually exclusive. I would argue that artisans are working, right now, on the floor at the Honda plant in Ohio and in the machine shop at GE Aviation in Wilmington. “Continuity more than change is the human condition,” wrote Henry Glassie.

When these people retire or die, their skills leave the shop with them. They cannot entirely be put into a set of plans, a book, or a tape recorder. The only way to preserve them is to transmit them directly into the brain of another human through a process we traditionally call apprenticeship.

When we undertake the reconstruction, or merely the interpretation, of a human contrivance from the past, whether it is a 17th-century ship, a First World War fighter plane, or a Saturn V rocket engine, our top challenge is to understand how they built what they built, without possessing the skill set they had. Sure, plans and documentation are a great help, but they are not sufficient. That is why the techniques for understanding artisanal craft are just as important for historians of modern technology as they are for those of us who spend most of our time in the age of planks and sails and hemp rope.

A true understanding of any past technology, then, requires much more than the ability to locate documents and read them. It requires more than the ability to make sense of what immediately meets the eye at an archaeological site. It requires educated guesswork, experimentation, and a thorough understanding of the culture that built the technology–its skill set, its tastes, its prejudices.

Without those abilities, the best answer historians would be able to give to the question “How did we go to the moon 50 years ago?” would be “We don’t know.”

Or, perhaps more tellingly, “We forgot.”


The Tragedy of the American Revolution

I’ve been immersed in readings on the crisis years of the 1760s and ’70s in England and British America, as background for my second book project. My major field as a doctoral student was the British Atlantic in the 17th and 18th centuries, yet I’ve learned a lot in this reading phase that I didn’t know or knew only vaguely, so it’s reasonable to assume that significant gaps in understanding exist in general. It seemed appropriate to write this post on the Fourth of July, which is a much more somber occasion when one considers the reality of what produced it.

The focal point of this project is the schooner HMS Sultana, built in Boston in 1767 by the fiercely loyalist Benjamin Hallowell, Boston’s most prominent shipbuilder. Hallowell used his British connections to send the ship to England where she was bought into the Royal Navy for use as an interceptor back in British America–part of a squadron tasked with seriously ramping up on-the-water enforcement of British customs regulations. Those efforts met with stubborn resistance and even violence; several of these ships, including Sultana, were attacked by the crews of merchant ships they intercepted in coastal waters, or by groups of colonists in the port towns they visited. The violence culminated in the boarding and burning of the grounded HMS Gaspee in Rhode Island in 1772. At that point, the government in London appointed a royal commission to investigate the incident, the most serious official response possible, and decided to employ larger, more powerful vessels in this duty. Sultana was recalled to England to be sold out of the service and into historical obscurity.

So much of the period’s open hostility between the imperial government and colonial subjects took place on the water, over maritime trade and labor. Unlike customs collectors and other shore-based officials, naval officers were not a part of local communities. Their loyalties were to the highly-professionalized Navy, their own careers within it, and the promise of a share in any contraband seized or ships condemned to auction for carrying it.

British American merchants and masters, however, were long-accustomed to ignoring or evading those stipulations of the imperial trade laws they found inconvenient–laws that were intended to protect British imperial trade from foreign (French, Dutch, and Spanish) interference. Evading customs duties and ignoring regulations was so long established, and so profitable, that many merchants considered it their right. A prevalent attitude of the 18th-century British Atlantic was the notion that political loyalty and obedience were good and fine as long as they did not interfere with personal interest.

The British government used the means at its disposal to enforce the laws passed by Parliament, for the benefit of the empire as a whole, as they saw it. Those who resisted that enforcement believed they were protecting their own interests against an overweening imperial state determined to rule them absolutely rather than treat them as Englishmen ought to be treated by other Englishmen.

I used the word “absolutely” rather deliberately, as the deepest political fear of any good Englishman was absolutism–the exercise of arbitrary power by a state in the absence of a constraining constitution.

Almost everyone, on both sides of the Atlantic, believed that the imperial relationship between Britain and her American colonies was in need of serious reform, given the dramatic growth in population, land area, wealth, and political maturity of the colonies by the mid-18th century. It was also commonly assumed on both sides of the British Atlantic that the American colonies would one day be independent, as the dominant explanatory metaphor of the relationship–that of parent and child–implied. But the history of the relationship between the end of the Seven Years War in 1763–the war in which Britons from both sides of the Atlantic fought together to force the French and Spanish out of North America–and the outbreak of fighting in 1775 is a history of repeated misunderstandings, miscommunications, shortsighted decisions, and mutual mistrust, as the British Atlantic empire lurched in fits and starts down a path of increasing spite and despair. Once large groups of men were going around carrying guns–whether British regulars or colonial militias–cooler heads hoped against hope that a spark wouldn’t hit the powder keg, but that was overly optimistic. There was no plan for independence when people started killing each other in Massachusetts in April of 1775.

[Image: Revolutionary War scene, National Archives (U.S.)]

This country has had two civil wars. This was the first, and like the second, it was brutal and vicious and long and terrible. This was a double civil war, in that Britons from outside British America were killing Britons from within it, and vice versa, and British Americans who would not forswear their loyalty to the Crown, despite their profound disagreements with current British policy, were fighting a vicious partisan war against their own countrymen–sometimes their own families–who had abandoned any allegiance to the empire. That internal American war was the more brutal of the two: the British regulars were professional soldiers, but the American partisans were motivated by ideology and by burning hatred and resentment toward each other, which fueled and was fueled by atrocities–torture, murder, the burning of homes and farms. The native nations either joined one side or the other, trying to pursue their own interests, or tried to stay neutral. Thousands of enslaved Americans went over to the British in a bid for freedom. Even after British forces effectively quit the fighting at Yorktown in 1781, the partisan guerrilla war went on through 1782 as perhaps 60,000 loyalists waited to find out what the peace settlement would be and what would happen to them. Ultimately, most of them were forced to leave their homeland, branded as traitors by those they considered traitors, but who now held power in the new United States, and thus could write the laws and the histories. Benjamin Hallowell, who had built Sultana back in ’67, left Massachusetts for good, taking his family to Canada.

One of the most important services we can provide as historians is to push back against the prevalent fallacy of inevitability–the usually-unspoken assumption that things turned out the way they did because there was no other way they could turn out. Human societies tend to view their own histories in such a way as to take for granted that what happened, happened. We know how it turned out, and that makes it difficult to understand how things looked to people at any given moment before it turned out that way.

The American Revolution was not inevitable. It wasn’t even necessary for British American “freedom” (a word we in the U.S. have abused as long and in as many ways as a word can possibly be abused); Canada is free, Australia and New Zealand are free; the British Commonwealth provides a model of how British societies separated by oceans and proud of their individual stories can co-exist peacefully. Smart people on both sides of the British Atlantic, and on both sides of the patriot/rebel-loyalist/Tory conflict in British America, thought and wrote about ways a new constitutional relationship could work. But too many people weren’t ready for that, for too many reasons, and everything fell apart instead, with awful consequences. War is always the most colossal failure of intelligent human beings to rise above and solve their problems. The fireworks on the Fourth of July are the echoes of the gunshots and cannon blasts that shattered lives and hopes and dreams that might have lived and thrived.

 

Liminal scholarship: working between boundaries


History, my home discipline, has a reputation for relying less on robust theoretical constructs than its close relatives, such as anthropology. As the study of the past, it is free to study the past of any people, in any period, so long as there is language-based evidence to use as source material. That requirement has been, traditionally, a defining limitation of the discipline. However, one of the major reading fields I undertook for my PhD in history was material culture studies. In earlier graduate training, I studied historical archaeology—the archaeology of people about whom we also have traditional historical (that is, language-based) evidence. Some historians study art, some study medicine, some politics, language, and so on.

Nevertheless, as with any academic discipline, history in practice does not tend to be as free as history in theory. A few strong fashions and foci rule most disciplinary territory at any given time. Funding follows fashion, and further encourages work in already-established areas of inquiry. Right now, for example, my discipline is preoccupied with the study of peoples historically marginalized from power relative to European males and their descendants: indigenous non-Europeans, Africans (especially enslaved Africans and their descendants), women, and those who were not strictly heterosexual and/or did not conform to widely-enforced gender norms. The table of contents of any major journal in the discipline will reflect this.

The discipline is also limited by the expertise of its practitioners. If a student pursues the humanities as an undergraduate, then undertakes and completes graduate training in history, that student has been rigorously-equipped to be a historian. But a historian of what? Certain areas of inquiry will prove challenging or even inaccessible to a historian without additional training and experience in areas of expertise outside the history department. For example, it would be difficult to be a historian of the development of cosmological precepts from Newton to quantum mechanics without a deep competence in physics. One would be hard-pressed to pursue the history of North African Bedouins without a strong command of their languages. History does, in fact, traditionally demand language competency from its practitioners, but it does not demand concomitant technical competence in other areas. I contend that technical extra-disciplinary competence is necessary to pursue promising avenues of intellectual inquiry in the humanities that are otherwise out of reach.

In the liminal zone between two fields in my discipline—the history of science and maritime history—is the history of navigation. In a survey of the state of scholarship in that field published in the International Journal of Maritime History, Willem Mörzer Bruyns observed that it would be well-served by more historians with technical competence in navigation itself.[1] He could just as well have made the same point about economic historians writing about shipping: with a sophisticated technical understanding of ships, they could pursue avenues of inquiry pertinent to their research questions that would otherwise remain off-limits.

In my specialty of maritime history, as pursued within the parent academic discipline over the past three decades, the economic aspects of maritime activity have taken center stage; in fact, it was that very focus that brought the specialty into the academy in the first place. Older maritime history was generally non-academic and/or naval in focus. Maritime economic historians have pursued data-intensive research on the contributions of shipping to productivity and economic growth, from the early Middle Ages to the twentieth century. Such studies were part of a great flowering of econometric history, led by scholars just as competent in the economics department as on the history side, most famously Douglass North and Robert Fogel, who shared the 1993 Nobel Prize in Economic Sciences for their quantitative-history work on the early modern and modern western economies. Cliometrics, the use of formal economic theory, mathematical methods, and quantitative data to do historical research, contributed much to a more sophisticated understanding of the rise of modern economies. An important example was the work on “invisible earnings”—with shipping as a major component—in the British Atlantic economy of the seventeenth and eighteenth centuries. The work of North, of his students James Shepherd and Gary Walton, and of John McCusker and Russell Menard taught us that the economy of British America was largely based on its maritime commercial success, and thus could not be understood, as it long had been, in terms of a negative balance of payments vis-à-vis the British home islands. This in turn forced a re-evaluation of the economic contributors to the American Revolution.
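To make the accounting logic concrete, here is a toy illustration (the figures are invented, purely for the sake of the arithmetic, and are not drawn from the cliometric literature). A colonial economy could import more goods than it exported and still come out ahead, because earnings from freight, insurance, and merchant commissions (the “invisibles”) count in the overall balance even though they never appear in the customs ledgers of goods:

visible balance = commodity exports − commodity imports = −£1,000,000
invisible earnings (freight, insurance, commissions) = +£1,200,000
overall balance = −£1,000,000 + £1,200,000 = +£200,000

On paper, the colony looks like a chronic debtor; add the invisibles, and the books balance. That, in miniature, is what the “invisible earnings” scholarship showed.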

That is where I work—but I am not an economic historian. I am a technological historian, and when we consider the technological aspects of shipping in maritime economic history, perhaps especially in the early modern period, we enter a liminal zone where there is still much to be done. It is here where the technical competencies of economic historians, technological historians, and ship archaeologists meet—and here where those competencies run out. It is also here where the considerable technical competencies of non-academics make a compelling case for relevance. These non-academics include experienced ship modelers, who base their work on careful, in-depth research; antiquarians, whose knowledge of a specific subject routinely exceeds that of academic historians in terms of technical detail; naval architects and marine engineers, who understand the principles of ship design as no one else does; professional and volunteer mariners, who actually know how to operate replicas of early-modern vessels; and the shipwrights who know how to build and maintain those vessels.

It should be clear from that list that no one, no matter how dedicated or intelligent, can master all of those specialties in one lifetime—or even come close. Everyone who wants to work in that liminal zone will have a different set of competencies, but that set will by necessity cross disciplinary boundaries and will probably cross the street between the academy and the outside world as well. Everyone working in the zone who wants to make a contribution to scholarship must take advantage of what others in the zone have to offer. Collaborative effort, so central to scholarship in the natural sciences, must become much more common in history if the discipline is to exploit more areas of inquiry requiring technical expertise.

My work has two primary aims: first, to help answer questions about ship technology raised by the work of economic historians working on shipping productivity; and second, to contribute a more sophisticated understanding of period ship technology and of “pre-industrial” technology more broadly to the history of technology. Since I am not an economic historian, I am limited to the liminal zone of economic history in which period ship technology bears directly on economic historians’ hypotheses and problems—which my academic training equips me to understand. I did not read history of technology as a supervised field in graduate school; I began serious reading in it while researching my dissertation. Most of my subsequent study of it has been self-guided—but self-guided by someone with a doctorate in the discipline, so I am equipped for that. With almost thirty years’ experience learning about sailing craft, the arts of navigation, and the nature of the sea—both inside and outside the academy, self-taught and formal—I have the technical understanding to interpret and present findings on ship technology to a historian or general audience. I do not, however, have the technical understanding of a naval architect, or of an archaeologist who has worked intensively on ship design. I work on acquiring as much as I can, but there is so much room in the liminal zone for naval architects, marine engineers, and technically-equipped archaeologists to contribute, ultimately, to economic and technological maritime history. I dream of an ongoing, productive, collaborative relationship with at least one such person. Meanwhile, a few of them have already contributed much to my work that I would not have had access to otherwise.

Maritime history is only one field in one discipline in which the liminal zone between traditional scholarship and extra-academic technical expertise promises so much to those who can bring both to bear. An accomplished musician has the potential to be the best historian of music. A trained architect has the potential to be the best historian of architecture. As more scholars come to doctoral study later in life, we may benefit from more people like Jeff Bolster, who spent years running charter boats in the Caribbean before returning to the academy and putting out important work on life at sea.[2]

It is also true, though, that some promising areas of the liminal zones will remain out of reach without a commitment to pursue collaborative relationships with those whose expertise overlaps but extends past one’s own in an important area. If we start seeing more scholarship published from work along the boundaries of fields and disciplines and vocations, we should expect to see more co-authors and multiple authors in the bylines. I hope we do.

 

[1] Willem F.J. Mörzer Bruyns, “Research in the History of Navigation: Its Role in Maritime History,” International Journal of Maritime History 21:2 (December 2009): 261–87.

 

[2] See W. Jeffrey Bolster, Black Jacks: African American Seamen in the Age of Sail (Harvard, 1997); and The Mortal Sea: Fishing the Atlantic in the Age of Sail (Belknap, 2012).

Maritime History in the U.S.: Hiding in Plain Sight


The United States, for the most part, ignores maritime history—including our own. We have thought of ourselves as “continental” for so long that we no longer feel connected to the seas around us in a historical sense. We have only a vague awareness of our histories as maritime colonies of European empires—Spanish, Dutch, French, English. When we think of the sea historically, we tend to think of it as a barrier, crossed by intrepid (or rapacious, or both) adventurers, and then settlers, who survived long ocean crossings, settled along the coasts, then moved inland. We tend to forget that, for the first three hundred years of European settlement and African enslavement in North America, our ancestors were connected to each other and to their extended networks of kin, culture, political allegiance, trade, and in many cases sustenance by the sea. These people moved in ships, whether by choice or by force, and not just one way. They went west, but they went east, too—and south, and north, and every other point of the compass that swung in the binnacles of thousands of wooden ships under sail.

We no longer move in ships—though our stuff does: about 90% of it. We are as connected by the sea economically as we ever were, but not politically or socially. We are no longer primarily a maritime society—but our ancestors were. Ships crossed the oceans and landed in ports. Boats carried people and small cargoes along the coasts, up and down creeks and rivers, to trade, to move, or just to visit neighbors. Forests were thick and roads were few and bad. The railroads did not “open up” the West to white settlement. Steamboats did.

The peoples who were here before all that did not use the oceans the way Europeans did. They had come here in the distant past over a land bridge that was long gone. But they used the streams and rivers and bays and sounds just as much as anyone, for travel and trade and fishing. Africans had long done the same. Those who were captured and chained and brought here—over twelve million were loaded on ships, though not nearly that many survived the passage—either found themselves on labor gangs, producing crops for export overseas, or, if they were more fortunate, living lives that allowed them some degree of direct participation in the American water-world—paddling a canoe, steering a periauger, sailing a ship.

The economy of what became the United States was built on shipping crops grown on slave plantations to market, and on building the ships, barrels, and supplies necessary to do that, and to bring manufactures to people in the Americas who wanted and needed to buy them. It was built on sailing ships to Africa, loading prisoners, and bringing them back to America for sale as forced labor. The revolution that fractured the British Atlantic Empire started on the water. The party in power in London saw a chance to use the Navy to enforce customs and taxes on British American ships—in their own waters. It was effective—too effective. Angry British Americans attacked ships flying the Royal Navy ensign but built in British America by American shipwrights.

The Revolution ruined the British American economy because the British American economy was maritime, and it only recovered when the Royal Navy once again allowed it to use the sea. By the 1860s, American ships were giving British ships a run for their money around the world. The races of the British and American tea clippers from China to western markets weren’t just business; they competed under their countries’ flags, like Olympic teams, and the results were the hottest news.

The catastrophe of the Civil War and the drive to conquer the Far West put an end to that. Eventually, we became a naval power—later the greatest—but we were no longer a maritime people. Not in general, anyway. New England, naturally enough, has always tried to keep the flame of our maritime history burning. The Peabody Essex Museum in Salem, Massachusetts, published our premier scholarly maritime history journal, The American Neptune, for 61 years, but it finally folded in 2002. There is now no scholarly print journal in maritime history published in the United States. Of the two premier journals in the field, one was founded in Canada and published there for 25 years; it is now edited and published in England, along with the other. The International Journal of Nautical Archaeology is published in England. The journal of the North American Society for Oceanic History is edited and published in Canada.

Our task as U.S. maritime historians is, in one sense, the same as for every other maritime historian: to write the best maritime history we can, based on the best research we can do. But for the public, our task is to remind people of the story I just sketched out—how central our maritime history is to who we are and where we came from. To commit professionally to maritime history in the U.S. is to walk uphill. There is very little funding because there is so little interest. We are a thoroughly international discipline, appropriately enough, and those of us who live in this wealthy nation are acutely aware of how much harder it is to get our work paid for here than it is for our colleagues elsewhere. That will change when and if we manage to tell this story to enough people. There’s a thriving group of talented people working in the field here. Help us get our story out, because it isn’t our story. It’s the country’s story.

 

Doing “pre-industrial” history of technology

Earlier this month, I got to present a paper in a session I organized for the annual meeting of the Society for the History of Technology in St. Louis. I have two things to say about that. One has to do with the paper and the session, and the other doesn’t.

SHOT and the field it promotes are overwhelmingly concerned with “industrial” and “post-industrial” technology, and the relationships between those cultures and the technologies they produce and use–as well as those they don’t. That does not mean, however, that those of us working in earlier periods are unwelcome; I had twelve scholars sign up, which meant I had to propose three sessions. SHOT accepted only one, but re-distributed most of the other scholars, and their papers, to other suitable sessions. The session was well-attended, the presenters enthusiastic and compelling, and SHOT was kind enough to find us a most suitable chair/commentator who took an obvious interest in the papers and the proceedings. We learned about the competition to build an astounding number of Gothic cathedrals in late-medieval Europe, the relationship between the building and operation of mills and the social and power structures of medieval England, and what the artisanal craft of shipbuilding in the seventeenth- and eighteenth-century British Atlantic might teach us about doing history of technology in general. (That was mine, of course.) I thank Anne McCants of MIT, Adam Lucas of the University of Wollongong, and our chair and commentator, independent historian Pam Long, for making this work. If you’d like to learn a little about what they work on, you can use these links:

Anne McCants

Adam Lucas

Pam Long

I also thank those who signed up for our sessions and either couldn’t make the conference or presented in other sessions: David Zvi Kalman, Moritz Nagel, John Pannabecker, Steven Walton, Yovanna Pineda, Rob Johnstone, Dustin Studelska, Gideon Burton, and William McMillan.

The other thing I want to say is about winning–and not winning–awards and grants. I was up for two at this conference: the prize for best paper by a first-time, early-career presenter; and a no-strings-attached postdoc worth $10,000–plenty to complete the research I want done for this book. I knew I wanted to win them, but I didn’t realize how badly until I found out I didn’t, after which I confess I threw a bit of a tantrum (in private), which made clear to me not only that I don’t like to lose, but that I’d got quite used to winning things lately. The inescapable truth, though, is that when you play this game, you will win some and you will lose some. They’re all seriously competitive, whether or not they have big checks attached. I can’t objectively complain about the number of such cap feathers I have on my CV at this point. (But, as the indefatigable Joe Walsh so memorably sang, “I can’t complain, but sometimes I still do…”) If you’re a young scholar or an aspiring scholar and you’re reading this, first of all, thank you, and also, remember, when it counts (which is when you’ve been rejected for something you want and you’re in the middle of that gross feeling), that the only proper response is to try again as soon as possible. The biggest thing I ever won, I won the second time I applied for it. The whole equation changes each time; as long as you’re qualified, your name should be in that hat.

The acknowledgements sections of academic books, usually included in the introductions, can be terribly intimidating (at least they used to be for me), because they typically list all the grants and prizes and fellowships the author won that facilitated the completion of the book. What they do not contain is all the grants, prizes, and fellowships the author did NOT win. I have resolved to note this in the acknowledgements section of my book (if the editor will let me, of course), so as not to daunt any new or would-be scholars who might be reading.

That’s enough blogging for now. Back to that other thing… Thanks for reading.

PS the picture at the top is the ceiling of the bar in the Union Station Hotel in St. Louis where the conference was. They put really cool light shows up there during happy hour. The picture is off-kilter and blurry. That is how pictures taken in bars by current patrons should be.