AAUP 2014 recap: Inclusiveness and collaboration keys to scholarly publishing (Part 1)

This week, our two Pat Hoefling Grant winners will share their experiences from AAUP 2014. Today's post is by Composition Coordinator Tony Brewer. 

In June I attended the Association of American University Presses annual meeting in steamy New Orleans. I’ve been to AAUP production managers meetings over the years (I’ve been at IU Press in two eight-year runs: 1996–2004 and 2006–present), but had not attended the annual meeting since 1999: pre-tweet, pre-Facebook, pre-everything, it would seem. My overdue attendance was made possible by the Pat Hoefling Professional Development Grant, which I was honored to receive.

The theme of the meeting was “Open to Debate,” aiming to “relay the possibilities brought on by a renewed sense of inclusiveness and collaboration,” and it felt as though inclusiveness and collaboration were at the core of nearly every session. From determining who to talk to in a Press-Library-Provost-etc. structure, to seeking out non-traditional, non-book-first means of disseminating scholarship, everyone has a seat at the table and thinking “outside the box” doesn't really cover it. The box has been repurposed as an umbrella beneath which scholars and publishers are sheltering in place and trying to make real-time sense of an open access world.

At one point Doug Armato, Director of University of Minnesota Press, quoted Sigfried Giedion in Mechanization Takes Command: “Every generation has to find a different solution to the same problem.” The Monday morning plenary session, “Not Just Open Access,” approached the “problem” of OA from different angles. Moderator John Sherer (Director at University of North Carolina Press) outlined the new “crisis in publishing,” after noting that publishing crises have been identified often in the history of the industry. As creation costs approach $0, why pay for the dissemination of scholarship? Information is clearly in abundance everywhere, so why the paywalls?

Kathleen Fitzpatrick, Director of Scholarly Communications at the Modern Language Association, related how the MLA is addressing OA with the MLA Commons, which “allows MLA members to create a professional profile, connect with one another, seek feedback on their work, establish and join groups to discuss common interests, and share their ideas with a broader audience through new kinds of open-access publications.” It’s sort of a social media site strictly for MLA members, but the comments sections will allow publications to grow over time, becoming a living anthology.

Joe Esposito, President of Processed Media, emphasized that OA is not a matter of old vs. new but an intensification of the scholarly publisher’s mission: to disseminate knowledge. He also described how university press publishing compares to both library publishing and for-profit publishing, and pointed out that since faculty certification is tied to publishing in relatively unprofitable fields, universities need to recommit themselves to the mission driving their publishers. His excellent presentation can be found here.

“Whose mission?” asked Mark Edington, Director at Amherst College Press, in his turn at the podium. He too called for universities to (re)engage with publishing—with the dissemination of scholarship—describing OA as less a business model than a reassignment of the scholarly mission. His presentation is here, and it’s a good one.

The next session I attended, “Enhancing Production through Workflow Systems,” covered aspects of XML workflow, which the IU Press EDP department has been discussing and working toward for some time. Eric Newman, Managing Editor at Fordham University Press, chaired. Panelists were Susan Baker from packager Westchester Publishing Services; Sylvia Hunter, Editorial Manager at University of Toronto Press; and Pamela Schnitter, Senior Designer at Princeton University Press.

XML (Extensible Markup Language) is a markup system that encodes documents according to a set of rules in a format that is both human- and machine-readable. It is similar to HTML, used for coding websites, but more flexible: because its tags describe a document’s structure rather than its appearance, the same XML source can be transformed into web pages, ebooks, print-ready PDFs, or Word documents (.docx is itself XML-based). As Susan Baker said, XML defines “what it is,” not “what it looks like.” Appearance is determined by the “reader” of the XML document, whether an ebook or an Adobe or Microsoft product.
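To make that idea concrete, here is a minimal, hypothetical sketch (the tag names and renderings are invented for illustration, not drawn from any press's actual workflow) showing how two different "readers" can derive different appearances from the same semantic XML, using only Python's standard library:

```python
import xml.etree.ElementTree as ET

# A hypothetical fragment of book XML: the tags say what each piece
# of text *is* (chapter, title, para), not what it should look like.
source = """
<chapter id="ch1">
  <title>Open to Debate</title>
  <para>Every generation has to find a different solution
  to the same problem.</para>
</chapter>
"""

root = ET.fromstring(source)
title = root.find("title").text

# One "reader" might style the title for a print-oriented format...
print_view = f"**{title}**"

# ...while another emits HTML for the web. Same content, different look.
web_view = f"<h1>{title}</h1>"

print(print_view)
print(web_view)
```

The point of the sketch is that neither rendering lives in the source file; each downstream tool decides how the tagged content should appear.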

Many presses utilize XML at different points in the production chain: some begin the editing process in XML; others wait until final, print- and web-ready PDFs have been produced, then extract XML documents from the InDesign files. As with everything else, there are many ways to achieve the desired results, and it all depends on what a press needs and at what point in the process they need it. Pamela Schnitter provided a great summary of best practices for XML and also described Well-Formed Document Workflow, a proprietary XML-first workflow system developed by SCRIBE, with whom IU Press may be working in the future.

Next up, appropriately, was “Experimental Formats and Models,” which covered alternatives to books and journals in both print and e forms. A subtitle for this session could have been “But How Short Is Too Short?” Alan Harvey, Director at Stanford University Press, began by stating that commercial houses have been publishing re-purposed, shorter-than-book-length texts for decades. “Condensed” may carry a Reader’s Digest connotation, but the essay (excerpt, short work, etc.) remains a valid form: abridged versions of larger works are a key form of scholarship and may be more attractive to more readers.

Doug Armato, Director at University of Minnesota Press, made many useful, funny analogies: traditional solitary authorship = Highlander: “There can be only one,” while collaborative authorship = The Muppets: “We’re in this together.” It is true that the Web is where scholars congregate and commerce happens; university presses risk becoming a Maginot Line: costly to maintain, leaving other parts of the scholarly mission underfunded, and, frankly, easy enough to end-run if you want the goods. He went on to describe Debates in the Digital Humanities, a collaboratively curated project between CUNY and the University of Minnesota Press, published simultaneously as a continuously updated website and a printed book. DDH is an ambitious attempt at OA text combined with a “custom-built social reading platform. Going beyond the basic task of making the contents of the print edition accessible, the OA platform makes the text interactive, with key features that allow readers to interact with the text by marking passages as interesting and adding terms to a crowdsourced index.”

Katie Hope at MIT Press described their excerpted-text initiative in detail: MIT Press Bits bundles 1–3 chapters from a stand-alone book, offering readers a taste of the complete work, while MIT Press Batches similarly bundles 6–10 articles from various MIT journals around a unifying theme. These are low-priced quick reads designed to attract casual readers and, ideally, entice them to buy the full text. Both offerings are ebook-only and available in the usual myriad formats (Kindle, Nook, etc.). MIT Press has done a great job marketing these short works, with a cohesive design incorporating the MIT Press logo and gathering the Bits under attractive, eclectic categories. They have even placed ads in city bus/tram lines. Whatever works.

At this point I was reminded of recent shorter projects from IU Press, such as the book Misremembering Dr. King, or some of our fiction offerings under Break Away Books, or the IU basketball title This Is INDIANA, which was assembled from Herald-Times newspaper articles. Other ebook and “enhanced PDF” projects have appeared over the past year in the production department, so we are wading in and more will be forthcoming.

“Working with Book Packagers” was next. IU Press has made extensive use of packagers in the past, but we typically rely on them when books stack up or when we want to ensure titles make it in for fiscal-year sales. Laura Westlund at University of Minnesota Press echoed the latter, adding that she felt Minnesota needed a single packager working on all titles in order to save time on short schedules. Also of note, Minnesota handles all design in-house and prefers aggregator XML files post-production.

With packagers, turnaround is key at all points in the process. The consensus among panelists was that packaging both does and doesn’t save time: it definitely does if you are the one who would otherwise be doing the work in-house, but improvement to the overall schedule is less certain. Packaging also still requires some in-house project management and problem mitigation.

Ellen Foos at Princeton said that PUP packages mostly monographs, because they are more cost-effective to package than complicated books. Their packaging process can be a bit more time-consuming, though, because their university requires multiple bids. She also mentioned that PUP tends to retain freelance editors who have worked as packagers.

Later in the afternoon was the session “What Are Libraries Doing as Publishers?” which covered some collaborative projects between libraries and presses. Melanie Schlosser, Digital Publishing Librarian at The Ohio State University Libraries and Editor of The Lib-Pub, said that the success of library/publisher projects is mostly anecdotal, yet they are another way of pushing forward our scholarly mission. She went on to describe the service model of Libraries-as-Publishers, with the scholar/author providing content, peer review, and editorial focus, while the librarian provides essentially everything it takes to make that content available, from domain hosting, production services, and distribution to metadata management, training, and marketing.

These projects are not a good fit for all authors, and they require a great deal of project management, but if the university is behind it, collaborations can bring many entities under one tent. Downloads are a key metric when gauging success, and the sort of data management this model requires is well suited to library and information scientists.

Sarah Lippincott described the Library Publishing Coalition, a two-year project (January 2013 – December 2014) that “defines library publishing as the set of activities led by college and university libraries to support the creation, dissemination, and curation of scholarly, creative, and/or educational works.” She was quick to point out that library publishing doesn’t replace but rather augments traditional publishing, and that reaching consumers (ad buys, trade shows, postcards) is in many ways beyond their scope. Still they are better able to take advantage of metadata management and pathways of scholarship dissemination because those are their areas of expertise.

One of the most interesting (to me) comments of the meeting came out of this session, when someone asked: Is everything a scholar produces a “work of scholarship”?

I believe it was Kevin Hawkins, Director of Library Publishing at the University of North Texas, who answered, “We don’t make that call for them.” Everyone on the panel agreed.

The last session I attended Monday was “Worst Book I Ever Acquired,” an inversion of the usual “best book ever” session. I knew it was going to be a bit fluffy, but it was instructive to hear horror stories from other presses and clear explanations for why a book did or did not succeed. There were tales of books that didn’t make it to the marketplace or to a conference on time and so missed their “one chance” to really move units. Most interesting was a show-and-tell with book covers: “One of these books sold 5,000 copies, the other only 54, both on the same or a similar topic. Which is which?” It was nearly impossible to tell from cover and title which book was the “winner.”

I noted that there were no e-book disasters in the lot, which I wasn’t really expecting: such ventures are still relatively new; success, as noted earlier, is largely anecdotal at this point; and “failure” in the context of this session seemed tied to the traditional model of inventory and unit sales. But it did make me wonder what a “worst e-book” might look like, after another few years of experimentation, and who all would claim responsibility.

Part 2 of Tony's recap will be posted on the blog tomorrow.