This week, The Junto features a roundtable on digital pedagogy, in which we discuss our different approaches to using digital sources in the classroom. Today, Joseph Adelman talks about working with students on technical knowledge. You can also read Part 1 by Rachel Herrmann on source accessibility, and Part 2 by Jessica Parr on teaching digital history to non-majors.
I’m always both impressed and intimidated when I see a digital history project pop up in my social media channels. Faculty are doing some amazing work getting students to create projects using sophisticated software, apps, and other programs. They create websites, run statistical analyses, mark up text using TEI … and I have no idea how to replicate any of it in my classroom, either for myself or my students. To be fair, I have not yet taught a course on digital history specifically (nor do I plan to in the near future). So I’d like to focus instead on some practical thoughts about integrating digital history methods into topical upper-level courses.
In the several years I’ve now been teaching, I’ve tried to integrate some sort of digital element into most of my courses. The logic for me is relatively simple. First, students will do well to learn a variety of skills for conducting historical research, thinking, and writing. Second, these sorts of projects can be (should be?) more interesting for students to complete and for me to grade than a standard term paper would be. Finally (and this connects the themes of the first two), these digital assignments often require more thinking about how to design good research questions, because they ask students to undertake work in a format with which they are mildly uncomfortable.
My way of doing that digitally has been to ask students to think about quantitative evidence. Numbers are often scary to history majors (my advisees, ahem, often delay their math and science courses), but it’s important to understand how to analyze them. In other words, it’s exactly the sort of discomfort I aim to create in the classroom. In several courses, I’ve been able to take advantage of online databases produced by historical organizations and scholars to offer students a trove of information from which they could create historical projects. These databases make it easier for the novice digital researcher to ask questions, convert the data into spreadsheets, analyze and interpret the data, and re-imagine it in the visual form of a table or graph.
The two assignments of mine that best embody this approach offer students the opportunity to conduct research in political or economic history. First, in a course on the early American republic, students completed a small assignment using the A New Nation Votes database, sponsored by the American Antiquarian Society and Tufts University. Because the assignment was relatively brief, covering the first unit of the course, I tailored the research questions in a way that limited the number of variables students were addressing. First, they had to select a Massachusetts town (given the demographics at my state university, that for many meant their home towns, as I had expected) and analyze the voting results for the gubernatorial elections from 1787 to 1825. They then had to compare trends in the town to the state overall, using other course readings. In the second part, I invited them to be more speculative: to imagine a larger research paper and write a prospectus for it that would require the use of the database.
In an economic history course this past fall, by contrast, the quantitative/digital assignment dominated the work of the semester. Given that the topic was the economic history of the Atlantic world, the course focused a great deal of attention on the trans-Atlantic slave trade. So the major assignment for the course was to write a paper based on the Trans-Atlantic Slave Trade Database, one of the earliest digital history projects (it first appeared in the late 1990s on CD-ROM). Students used the database to form queries, exported the data to Excel, and worked on converting it into usable tables and graphs. They then had to write a somewhat shorter research paper (with the assumption that much of the work would be on creating the visualizations) that integrated much of the research data.
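For readers curious what that export-and-reshape step looks like in practice, here is a minimal sketch of the same pivot work in Python with pandas rather than Excel. The column names and numbers below are invented purely for illustration; the actual Trans-Atlantic Slave Trade Database uses its own field names and, of course, real figures.

```python
import pandas as pd

# Toy export: one row per voyage. These values are invented for
# illustration only, not real figures from the database.
voyages = pd.DataFrame({
    "year": [1761, 1761, 1792, 1792, 1807],
    "flag": ["Britain", "Portugal", "Britain", "Portugal", "Britain"],
    "embarked": [300, 250, 410, 380, 290],
})

# Bucket voyages by decade, then build the kind of decade-by-flag
# summary table students assembled in Excel.
voyages["decade"] = (voyages["year"] // 10) * 10
summary = voyages.pivot_table(index="decade", columns="flag",
                              values="embarked", aggfunc="sum",
                              fill_value=0)
print(summary)
# A graph would follow the same way, e.g. summary.plot.bar()
# if matplotlib is installed.
```

The pivot table is the crux: once students see that one reshaping step turns thousands of voyage rows into a readable table, the graph is the easy part.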
In creating these assignments, I try to focus on where students are doing the work. To be sure, they will spend a significant amount of time figuring out what to do with numbers as opposed to the texts with which they’re more familiar. But they’ll also need to spend time mastering the technology. One of the things that surprised me the first time I gave such an assignment (honestly) was how much class time we ended up devoting to learning how to design a spreadsheet in Excel, how to organize information, and how to create a graph. As many of our readers know, the idea of the “digital native” is less axiomatic than some in the media would have us believe. But for the novice teacher, it’s a reminder that any digital assignment within the context of a traditional topics course will require time carved out of the syllabus to go over the technology itself.
These assignments are not perfect, of course, and there are many ways they might be improved. Excel is a bit clunky (though I’m comfortable that understanding it will help them when they go out into the working world), and they’re not doing sophisticated digital analysis. But using these tools—which, as Rachel noted on Monday, help make more research accessible to students—allows for a basic introduction to questions of digital history and quantitative analysis.
 My students are all very familiar at this point with my lecture about the “good” kind of discomfort and the “bad” kind, though I don’t know whether they all quite trust me about the good kind.
 I would be remiss if I didn’t self-promote that you can find links to all of my assignments, syllabi, and other teaching materials (including course blogs, which I’ve discussed previously at The Junto), at my personal website.
 May I brag about my students for a moment? I had a lot of fun grading this second part. They came up with some pretty creative ways of putting the quantitative voting data alongside other sources, in particular newspapers and private correspondence, to examine all sorts of questions about voting behavior.