Recently, I attended a workshop on a topic mostly unrelated to my work in digital collections. When it came time for introductions, I gave a nutshell view of what I do by saying my group digitizes Baylor’s special collections and makes them available online. Despite the whole thing taking about 15 seconds and being intentionally generic, I’ve done this intro enough times by now to know what was going to happen next.
An older gentleman sitting in the front row got what I can only describe as the “ah-ha!” look on his face, and at the first break, he approached me and asked a question I hear more often than not when I talk to people about what we do at the Digitization Projects Group.
“I work at a small museum, and we’re being told to digitize our collections. Once we do, we can just throw those old papers out, right? And is a DVD a good storage solution?”
My answer to him was simple, but it probably wasn’t what he expected to hear.
“Do you remember microfilm?” I asked him. “And when was the last time you used it and thought ‘Gosh, I wish I could get my hands on the original just to compare it to what I’m looking at’ only to find it’s been decades since anyone saw a paper copy? That’s why you can’t just throw things out once they’re scanned.”
“Also,” I added, “DVDs are terrible.”
Okay, so I wasn’t quite that blunt on the DVD answer, but the effect was the same: a stunned look of disbelief. In some ways, I don’t blame him. There’s a lot of misinformation (and outright falsehoods) out there about digitization, data preservation, and care of digitized materials, and the more channels it has to filter through to reach people at smaller institutions, the more distorted it can get.
If you haven’t done so, I encourage you to check out a book by Nicholson Baker called Double Fold: Libraries and the Assault on Paper. Baker’s central premise is that during the microfilming heyday of the 1980s and 1990s, libraries and other institutions put too much faith in the technology of microfilming and weren’t always diligent about properly preserving and storing the newspapers that had been filmed. It is a polemical, biased, uncomfortable book to read, and it is less than popular among librarians. But that was exactly Baker’s point.
Baker wanted to draw attention to the notion that just because a technology had come along that promised better access and a smaller storage footprint didn’t mean professionals could become lax about enforcing good practices of physical archival storage. While much of Baker’s criticism has been ably (and thoroughly) countered by library professionals in the decade since Double Fold’s publication in 2001, it remains a stirring think piece on the dangers of over-reliance on a “silver bullet” solution at the expense of long-term viability.
At the heart of Baker’s issues with microfilm was the prevailing attitude that, once a run of newspapers had been filmed, it was perfectly acceptable for the originals to be tossed, as the filmed versions were thought to be a reasonable substitute that preserved both the look and content of the papers at a fraction of the space required to store them. But what happens if the film is bad and no one notices until the originals are long gone? Or what if a page was skipped, or an entire volume? Or what if the film falls prey to “vinegar syndrome” – an inherent agent of deterioration in which the film’s layers begin to break down and disintegrate, producing a distinctive vinegary, “salad dressing” smell – and can no longer be viewed?
If the originals are gone, the answer is clear: there’s nothing you can do.
Which brings me back to my fellow workshop attendee’s question: once things are scanned, they’re safe to pitch, right? The problems outlined in Baker’s book could just as easily apply to the process of digitizing archival materials. We believe the technology behind digitization is reliable, replicable, and sustainable, and we’ve learned a great deal about how to approach digitizing materials thanks to the lessons revealed by the great microfilm boom of the last century. As such, we’ve got processes and technologies in place to monitor our digital files, keeping them secure and accessible for decades to come.
But what about the things we can’t predict? What if the next generation of computers is so different from what we’re used to today that the very idea of digital files changes completely? What if a specialized virus destroys every TIFF file in creation? What if the Mayans were right, and civilization as we know it craters at the end of the year, rendering all our painstaking efforts profoundly moot?
The best answer is to do what people have done since 200 BC: go back to the paper versions.
That’s why we counsel our partners to use the process of digitizing materials to serve as a catalyst for rehousing materials in archival storage if they’re not stored that way already. That’s why we urge conservation of fragile materials before they arrive at our center. That’s why we never tell them it’s safe to throw something away just because it’s been scanned, cataloged and placed in a digital collection.
That’s why I told the man from the workshop that the answer to his question is a very simple, “No.”
And the DVD question? Think about this: when was the last time you popped a CD into your car’s stereo that you hadn’t listened to in a while, only to find that your favorite song was skipping like a hyperactive preschooler thanks to a series of almost-imperceptible scratches? It’s happened to all of us, and the same thing can happen to a supposed “100 year, archival” gold DVD.
But for years, digitizers at institutions large and small were told that burning files to a DVD and putting it on a shelf was a reliable backup, to the point where many early digitization outfits kept no other copies of their files once they were burned to disc. We learned fairly quickly that those discs weren’t dependable enough to serve as a sole backup, so now we keep multiple copies on spinning disks, on analog tape, and in the cloud, both on-site and off-site, to ensure the long-term stability of our digital assets.
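Keeping multiple copies is only half the job; you also have to watch them for silent decay, the same “almost-imperceptible scratches” problem in digital form. In practice that usually means fixity checking: recording a checksum for each file when it enters the archive and periodically re-computing it to catch corruption. As a minimal illustration (not our actual tooling – the function names and manifest format here are hypothetical), a fixity check might look like this in Python:

```python
import hashlib
from pathlib import Path


def sha256_of(path: Path) -> str:
    """Compute the SHA-256 checksum of a file, reading in chunks
    so even very large TIFF masters don't have to fit in memory."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()


def verify_manifest(manifest: dict, root: Path) -> list:
    """Compare files under `root` against a manifest mapping relative
    paths to recorded checksums. Return the paths that are missing
    or whose current checksum no longer matches (possible bit rot)."""
    problems = []
    for rel_path, recorded in manifest.items():
        target = root / rel_path
        if not target.exists() or sha256_of(target) != recorded:
            problems.append(rel_path)
    return problems
```

Run on a schedule against each copy of the collection, a report like this is what tells you a file on one disk has degraded while a sibling copy is still good – which is exactly why the multiple-copies rule matters.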
All of this makes good sense, but if professionals at big institutions like the Library of Congress, the National Archives and even Baylor’s own DPG have to keep constant watch on evolving technology trends just to stay up to speed, how can we expect staffers at small to mid-size institutions to keep up?
Ultimately, it comes down to education and a common-sense approach to digitization projects. Education means large institutions like the Library of Congress, the Texas Historical Commission, and, at a local/regional level, our own staff teaching people at small institutions the basics of digitization and file management. Workshops, webinars, websites and more are available with basic information on how to scan documents, how to manage the resulting data, and how to keep it safe, and wider access to this kind of information can do a great deal to counteract some of the old misconceptions that are still out there.
And common sense? That’s something Baker’s Double Fold should give us reason to trust in spades. If something is important enough to scan and put online, isn’t it common sense to think that it’s important enough to preserve physically? If an archival collection was kept safely stored for decades in the right environment, does it make sense to throw it out now that it’s been scanned? And if we know that paper-based items can last for centuries when properly stored, doesn’t it make sense to hold onto them as long as we can, just in case?
Is digitization an important undertaking for libraries, museums, and archives of all sizes? Undeniably.
Should we take steps to ensure our cultural heritage – digital and physical – is properly stored, displayed, and accessed? Without a doubt.
Does either of those facts mean it’s safe to discard a decade’s worth of 19th century American newspapers once they’ve been scanned, as happened with microfilmed newspapers in the 1990s?
If anyone’s reading this post in 3012, do me a favor: look me up and let me know.