This topic comprises 2 pages: 1 2

Topic: Unhappy Medium: The Challenges With Archiving Digital Video
Monte L Fullmer
Film God
Posts: 8367
From: Nampa, Idaho, USA
Registered: Nov 2004
posted 09-21-2014 12:06 AM
Interesting read
quote: If you’ve been to the movies in the past decade, chances are you haven’t seen a film. Celluloid, the industry’s medium of choice for more than a century, has all but disappeared from American cinemas, with 95 percent of big screens already converted to digital projection and an ever-growing number of movies “born digital”—that is, shot on digital cameras. Since Paramount became, in January, the first major studio to embrace digital-only distribution for its domestic releases, other studios are expected to follow, bringing an end to films on film.
In Washington, the impact isn’t confined to the multiplex. To register a copyright, filmmakers must submit their work to the Library of Congress, which stores 123 years’ worth of American film—from Newark Athlete (1891) to The Expendables 3, due in August—at its 45-acre Packard Campus for Audio Visual Conservation, in Culpeper. Around the same time that Paramount’s The Wolf of Wall Street became the first movie to receive an all-digital release last December, the Library received a 35-millimeter copy.
The rise of digital may soon require those copyright regulations to be rewritten. Already, says Gregory Lukow, chief of the Packard Campus, Hollywood is pressuring the Library to accept, instead of film prints, the set of computer files known as Digital Cinema Packages (DCPs), which studios send to most movie theaters instead of film reels. Encrypted to prevent piracy, each DCP can be played in one theater’s projector, at certain times, for a certain period. “It’s as good as a doorstop for us,” Lukow says.
His reservations about digital formats are shared wherever vintage films are stored, repaired, or shown around Washington—at the National Archives’ film center in College Park, which houses thousands of movies produced by the executive branch; at the Smithsonian, which specializes in anthropological films; and at the National Gallery of Art, where footage of Jackson Pollock and other artists at work is collected. For these institutions dedicated to preserving millions of miles of footage for posterity, the end of film signals a wrenching technical, cultural, and budgetary shift.
• • •
Anyone who has abandoned brittle, yellowing photo albums in favor of iPhones or the cloud might consider the film-to-digital shift a no-brainer. But for archivists, digital storage is a logistical headache. Once reduced to bits and bytes, information requires constant migration to keep pace with advances in technology; digital formats tend to last no more than a few years. Lose the trail of upgrades and a film librarian, managing thousands of newsreels, shorts, and one-of-a-kind snippets, could lose access to an artifact entirely.
Add to this the fact that no one has come up with a satisfactory equivalent to old-fashioned film, which is a more robust, tangible medium than anything in the digital world. Archivists make repairs to decaying reels before copying the original onto polyester “safety” film stock. (Most film today, even what Hollywood types refer to as celluloid, is made of polyester.) Once they put the resulting print, called a preservation master, into cold storage, it can last several centuries.
“Film is simple,” says Criss Kovac, supervisor of the National Archives’ Motion Picture Preservation Lab. “It’s elegant. If we create a new copy and do the quality control, we know we can come back to it in a couple hundred years and it’s still going to be there, exactly the way it was when we printed it in 2014.”
The Archives’ collection of newsreels; documentaries; military recruiting films starring the likes of Jimmy Stewart and Ronald Reagan; and educational shorts about LSD, venereal disease, and forest fires together form a kind of collective national memory, which Kovac and her staff are charged with preserving “until the end of the republic,” she says. “The only way to do that is by using film stock.”
As digital takes over cinema, that commodity is harder to come by. Kodak, which is among the Library’s top suppliers (and the Archives’ only one), just recently emerged from bankruptcy. Though the company has no plans to abandon motion-picture film, the future availability—and cost—of film stock remains uncertain. Last year, Fujifilm ended production of movie film entirely. In recent years, the Library of Congress has had to rely on ORWO, a German distributor, but it deals only in black-and-white film.
“We’re going to continue to buy film stock for as long as we can,” says Ken Weissman, supervisor of Packard’s film-preservation lab. But he has already experienced “difficulties in catching Kodak in the right moment, at the right price, within our federal contracting regulations.” Kovac predicts that in the future “we’re going to have to be really selective about what we preserve.”
• • •
The film community has begun to mobilize around the challenges posed by new media. In 2007, the Academy of Motion Picture Arts and Sciences released a landmark study, “The Digital Dilemma,” and called on filmmakers, curators, and studio executives to work together to create new standards for producing and preserving digital film.
The technicalities can be daunting. A format called a Digital Cinema Distribution Master (DCDM) is being offered as an alternative to the DCP. Akin to the original camera negative—the strip of film that goes through the camera on a movie set—the DCDM, unlike the DCP, is not encrypted. But Weissman points out that the DCDM has its own problems. For instance, it doesn’t allow the Library of Congress to add closed captioning and subtitling, which the Library is legally obligated to offer. Accepting DCDMs would force the Library to create foreign-language versions of films in-house. “A laboratory like ours is really not geared towards that,” says Weissman.
There’s also the relative cost of digital to consider. Movie studios like digital distribution because it’s cheaper: A DCP will eventually take less than $100 to produce and deliver to theaters, compared with $1,000 for a film print. But digital formats are much more expensive to preserve. The Academy of Motion Picture Arts and Sciences estimates the annual cost of maintaining a digital master at $12,500, 11 times that of film.
Archivists have already begun to work together to accommodate digital formats. In 2010, the Archives completed a decade-long effort to preserve the entire Universal newsreel collection after the Library of Congress supplied some missing reels. Members of the Association of Moving Image Archivists, the National Film Preservation Board, and other professional organizations regularly meet to hash out the future of film and digital preservation.
But more cooperation is needed. “We don’t really have any interagency agreements in place to help each other out,” says Kovac, though she’s hopeful that, as more problems arise, government-funded archivists will pull together.
• • •
Until the technical questions are answered, the resistance to digital media is more than a matter of nostalgia. But some movie experts think the effort to adapt to digital ignores the relationship between cinematic images and their medium.
Peggy Parsons, head of the National Gallery of Art film program, regards the studios’ move to digital as “a distortion of history.” For her, film’s character—its grain, scratches, wear and tear—grants a physical connection not only to the filmmaking process but also to the times and places a movie was screened. “You’re having a much more profound experience when you’ve got the real artifact,” Parsons says of film projection. “You can feel it in your mind and body. It isn’t just what the actors are doing or what the story is. It’s an experience of the whole medium.”
The National Gallery hosts free, weekly repertory-film screenings, for which Parsons always tries to acquire a movie in its original format, to “replicate the historical experience as closely as possible,” she says.
Lukow and Weissman sympathize but view Parsons’s purism as a distraction. “People talk about the unique look and dynamism of film, but for the most part that’s been lost for decades anyhow,” Weissman says. With brighter projector lamps and modern screens, he points out, a film will never look to us the way it did to its first audiences.
He and Lukow agree that the movie, not the format, comes first. “The essence of the film is the content,” says Weissman. “It’s the images that are recorded on it, not the fact that it’s film per se.”
Those sentiments may sound out of place at Packard, which opened in 2007 to be, in Lukow’s words, America’s “last place standing” for film. The Library of Congress remains “committed to preserving film as film as long as we can,” Lukow says. “At the same time, we recognize that we have to position ourselves for the post-film world.”
Marcel Birgelen
Film God
Posts: 3357
From: Maastricht, Limburg, Netherlands
Registered: Feb 2012
posted 09-21-2014 05:07 PM
While I totally agree that digital archival still has its issues, I'm also quite sure we can solve those issues if we want to. We're at least quite broadly aware that there is a need for improvement, which is quite a step forward. So it's not all grim and dark.
Both Sony and Panasonic are actively working on optical storage discs that can hold somewhere between 200 and 300 GB and should last for up to 50 years. For cold storage, this sounds like a good solution.
Keep in mind that, if done right, digital data doesn't have to decay. We can maintain the fidelity of the data "infinitely", if we have the will to do so. That's quite the opposite of any kind of analog storage, where everything eventually, slowly decays and the only path to preservation is making a copy that has less fidelity than the original.
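Marcel's point that digital data "doesn't have to decay" rests on bit-level fixity checking: every copy can be proven identical to the original. A minimal Python sketch of the idea (the function names and workflow are illustrative, not any real archive's scheme):

```python
import hashlib

def sha256_of(path):
    """SHA-256 hex digest of a file, read in 1 MB chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def copy_is_faithful(original, copy):
    """True iff the copy is bit-for-bit identical to the original."""
    return sha256_of(original) == sha256_of(copy)
```

An archive that stores such digests in a manifest can verify, generation after generation, that no copy has lost a single bit — exactly what analog duplication can't promise.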
Keeping important archival data "hot" (as in, instantly available) shouldn't be such a big deal, actually. Storage costs come down every year, and the data generated in the past isn't going to magically multiply itself.
The thing I'm more worried about than preserving the actual bits is preserving knowledge of the formats. Film is fairly easy: you take it out of the can and you look at it; it's almost instantly clear how it works. With digital, that's different. Digital is just a long string of ones and zeros, and you need to understand how to get from there to something that represents moving images and sound. It's already hard to open files created with software from about 20 years ago. Something like JPEG2000 sounds straightforward right now, but even something seemingly simple and abundant like JPEG compression is far from trivial. Who knows if we'll still have the necessary knowledge and tools to decode it 50 years from now?
So what we need are well-documented, commonly used, straightforward formats: uncompressed, unencrypted image and sound data, stored in a logical way, so that reverse engineering would still be feasible in case the documentation for those formats ever got lost.
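A toy version of the kind of uncompressed, logically laid out format Marcel is asking for might look like this in Python. The magic string and header layout are invented for this sketch, not any real archival standard, but note how little documentation a future reverse engineer would need:

```python
import struct

MAGIC = b"RAWV"        # hypothetical four-byte signature
HEADER = ">4sHHHI"     # magic, width, height, fps, frame count (big-endian)

def write_raw_video(path, width, height, fps, frames):
    """frames: iterable of bytes objects, each width*height*3 raw RGB bytes."""
    frames = list(frames)
    with open(path, "wb") as f:
        f.write(struct.pack(HEADER, MAGIC, width, height, fps, len(frames)))
        for frame in frames:
            assert len(frame) == width * height * 3
            f.write(frame)  # no compression, no encryption

def read_raw_video(path):
    """Decode needs nothing but the one-line header spec above."""
    with open(path, "rb") as f:
        magic, w, h, fps, n = struct.unpack(HEADER, f.read(struct.calcsize(HEADER)))
        assert magic == MAGIC
        return w, h, fps, [f.read(w * h * 3) for _ in range(n)]
```

The trade-off, of course, is size: uncompressed frames are enormous, which is why archives weigh this simplicity against compressed formats like JPEG2000.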
Carsten Kurz
Film God
Posts: 4340
From: Cologne, NRW, Germany
Registered: Aug 2009
posted 10-13-2014 08:36 AM
Marcel - every now and then, the institutions dealing with archive technologies make an attempt to freeze certain file specifications to keep them available for long term storage. PDF/A and JPEG2000 are examples of these approaches.
Other than that, I have yet to see a real world 'codec' vanish. I can still open proprietary graphics files from very early home computers. Another compelling aspect of digital is the versatility of software libraries. Once a format decoder is part of an open code library, it is usually more effort to get rid of it than to retain it, even if it is used only rarely.
I think the biggest real issue with archiving is still the reliability of physical media. There have been attempts in the past to take care of this as well - some optical disc formats like MOD/WORM received special recognition there due to their long-term reliability, but then of course these 'built-to-survive' formats always lagged behind in terms of the capacity needed for media storage applications.
I think distributed storage is the way to go, since it is agnostic to any specific storage technology with its specific capacity constraints, and any digital media can be transferred transparently. Over the years, content can migrate to completely new storage technologies without any special transfer work. So, at least in theory, that problem is 'solved'. It's just that local institutions are not yet sure whether they want to set up their own costly long-term storage systems, or whether they can trust commercial systems/services like Amazon's.
- Carsten
Bobby Henderson
"Ask me about Trajan."
Posts: 10973
From: Lawton, OK, USA
Registered: Apr 2001
posted 10-13-2014 07:16 PM
quote: Carsten Kurz Marcel - every now and then, the institutions dealing with archive technologies make an attempt to freeze certain file specifications to keep them available for long term storage. PDF/A and JPEG2000 are examples of these approaches.
Other than that, I have yet to see a real world 'codec' vanish. I can still open proprietary graphics files from very early home computers. Another compelling aspect of digital is the versatility of software libraries. Once a format decoder is part of an open code library, it is usually more effort to get rid of it than to retain it, even if it is used only rarely.
Dead and obsolete applications are a very serious problem for data archival.
While PDF/A is more tailored for archival use, many PDF files are created in various PDF/X formats. Any PDF file is merely a container for content that can vary extremely in quality. One often needs access to the original application file used to create the document in order to make revisions (such as an Adobe InDesign .INDD file, an Adobe Illustrator .AI file, or even an MS Word .DOCX file). PDF files aren't edit-friendly unless such capability is deliberately included (and it often isn't). It can be a real pain attempting to open files made in older versions of a graphics application. Lots of things change in the applications themselves. That gets compounded by plug-ins that are no longer updated to be compatible with new versions of a host application and new operating systems. Kai's Power Tools was one of the most popular Photoshop plug-ins, but it's dead now. And then you have issues with missing fonts and fonts that aren't compatible with a different operating system.
JPEG2000 has a lossless compression mode, but that clearly isn't what's used on DCPs. Among "flat" image file formats, TIFF is still the most popular lossless choice, with regular JPEG the most popular for lossy images (even when it's not the best choice). If one needs to make any revisions, one needs the original image file, whether it's from a digital camera or something composed in Photoshop or another creative application.
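"Lossless" has a precise, testable meaning: decoding the encoded data returns the original bytes exactly, while lossy codecs throw information away for smaller files. Since JPEG2000 tooling isn't in a standard Python install, here's that guarantee sketched with the stdlib's zlib as a stand-in lossless codec (the function names are made up for the example):

```python
import zlib

def archive_encode(data: bytes) -> bytes:
    """Compress losslessly: nothing is discarded, only re-coded."""
    return zlib.compress(data, 9)

def archive_decode(blob: bytes) -> bytes:
    """Recover the original bytes exactly."""
    return zlib.decompress(blob)
```

JPEG2000's reversible (lossless) mode makes the same roundtrip promise for pixel data; the lossy mode used in DCPs deliberately does not.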
I can think of all sorts of applications and file formats from the 1990s that are virtually dead now.
Does anyone remember proxy-based image editors, like X-Res and Live Picture? They were great back in the early 1990s when RAM had a very high per megabyte price. At one time a license of Live Picture cost over $6000. Once RAM grew more affordable those applications fell out of favor. I don't think there's any way to import the image formats from either one of those applications now.
Aldus FreeHand (later Macromedia FreeHand) was one of the most popular vector-based drawing applications. It was PostScript-based, had many of Adobe Illustrator's capabilities, and had a few features that made it arguably superior to Illustrator. It was gaining popularity on the Windows platform, since neither Aldus nor Macromedia played stupid games with making certain versions exclusive to the Mac platform. Adobe bought Macromedia in 2005 and immediately halted further development of FreeHand. Some previous versions of Illustrator could open FreeHand files, but Adobe removed that capability with the CS6 and CC releases, which pissed off a LOT of longtime FreeHand users with many years' worth of FH files.
With consumer electronics technology starting to shift a little more toward 4K, it wouldn't be surprising to see a push from producers, movie fans and others to have some 1990s movies "re-mastered" in 4K. It's funny in a morbid way to think how difficult or downright impossible that task might be, even if all the live-action stuff was shot on film.
Take Jurassic Park, for example. Many of its visual effects were created using early versions of Alias PowerAnimator and Softimage, running on Silicon Graphics IRIX-based computer systems. Does anyone even have the digital assets and work files used to create the CGI sequences? How would anyone successfully load or import those files? It's not like anyone has a 22-year-old SGI-based render farm lying around, much less one that could take 2K content and re-render it in 4K. Alias replaced PowerAnimator with Maya. Then Autodesk bought Alias. Softimage changed hands a couple of times (Microsoft, Avid) before Autodesk bought it and later killed it. There's probably little hope of importing the original CGI data into modern hardware, at least not without some serious software engineering put into the effort. Chances are, if someone wanted to take a 20-year-old movie or TV show and re-render its visuals to a better quality standard, they'd be stuck re-creating the visuals from scratch.
Bobby Henderson
"Ask me about Trajan."
Posts: 10973
From: Lawton, OK, USA
Registered: Apr 2001
posted 10-14-2014 01:13 PM
quote: Justin Hamaker Why couldn't the film industry create some sort of high-end optical media to use instead of film?
The movie studios' parent corporations wouldn't fund the R&D costs (anywhere from hundreds of millions to billions of dollars) to attempt creating an optical disc format that had both the data capacity needed and longevity. No such optical disc technology exists at this time, at least not anything that can be replicated on any sort of industrial or mass/consumer-level scale.
No one is going to create a new optical disc format or much of anything else digital-based without a path to far higher volume consumer level sales potential. The Blu-ray Disc Association member companies have waited years after the debut of 4K UHDTV sets to even start working on a new 4K BD standard. 4K capable Blu-ray players won't be available until late next year.
Meanwhile, Hollywood studios are actively undermining sales of so-called physical media (movies on DVD & Blu-ray optical discs). They're trying to push everyone to buying virtual copies of movies that you either stream or download, cutting out retail channels in the process and, in theory, creating more profit for themselves. Little do these guys realize they're going to Netflix themselves into the poorhouse with this approach. After all, if I'm going to stream/download a movie, why do I need to spend a bunch of money buying it when I can just stream it on Netflix? If I have cable, I could wait a little and watch it there.
Anyway, movie studios firmly have their minds set on storing movies on hard disks and data tapes.
Back to the subject of obsolete, dead & dying file/application formats...
Does anyone actually use RealPlayer anymore? The RealPlayer app used to be very common, but it became widely despised for including adware and spyware, along with spawning pop-ups and other bullshit. At one time the app would literally "phone home" to tell RealNetworks the viewing/listening habits of its users. RealPlayer had its own audio, video, image & text formats. I don't think any of the creative applications I use support them. None of those formats are included in the HTML5 specification. RealPlayer is still available, but I sure as hell am not using it. I feel sorry for users who built up collections of audio and video files encoded in RealNetworks' codecs.
Speaking of HTML5, that brings up Flash, one of the few formats I can think of where big companies have deliberately taken steps to make their products incompatible with it. Flash is still very common on a lot of web sites, even though it doesn't work on iOS devices or any Android devices made in the last couple of years. Meanwhile, the HTML5 spec is still arguably a mess. Those steering the HTML5 standard have never actually ratified a finalized version, and they continue to change things. There's a bigger push for WebM audio/video and WebP image compatibility, even though they're not widely supported in creative applications. HTML5 is nowhere near universally compatible across desktop web browsers, portable-device web browsers, and the browsers built into devices like game consoles and smart TVs.
There's a lot of legal and political crap going on with file/application format standards.
This stuff affects a lot more than graphics and video content. Think of the can of worms a company opens by having its books maintained on really old accounting software that's no longer developed and has no data-migration pathway into current software. Those printed hard copies become pretty valuable if they weren't shredded to make more space in the office.
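The usual escape hatch for that scenario is exporting the records to a self-describing plain-text format while the old application still runs. A minimal Python sketch of such a migration pathway, with ledger fields invented for the example:

```python
import csv

def export_ledger(records, stream):
    """records: list of dicts sharing the same keys; writes CSV to stream."""
    writer = csv.DictWriter(stream, fieldnames=list(records[0]))
    writer.writeheader()   # column names travel with the data
    writer.writerows(records)

def import_ledger(stream):
    """Any future tool that reads CSV can recover the records."""
    return list(csv.DictReader(stream))
```

CSV loses application logic (formulas, audit trails), but unlike a proprietary binary database it will still open in whatever software exists decades from now — the digital equivalent of those printed hard copies.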
Carsten Kurz
Film God
Posts: 4340
From: Cologne, NRW, Germany
Registered: Aug 2009
posted 10-14-2014 02:48 PM
Bobby, if you want every intermediate or hyped-up new broadcast/streaming/gadget format to be 'archivable', you are on a desperate track. You will also need to find a way to archive the creativity of the people who created the media ;-)
No, for the time being, it is sufficient to have usable, well-defined archive formats from ISO and the like for final artwork, or anything else that the people dealing with archives consider worth archiving.
Digital storage will at the same time facilitate archiving of the various media assets that come with a film production, as most of these assets are now digital and file-based as well. Before, you would have put analog tape reels, DAT, CD-Rs, MODs, ZIPs, MDs, etc. in boxes, ready to be unreadable 10 years later. File-based storage can hold all these assets just as well as the final movie, the making-of footage, etc.
It's the better approach, and the technology is already there. As long as distributed storage does not yet have the necessary trust associated with it, studios will keep creating analog or digital separation masters on film for conventional vault storage. These will probably be the last manifestations of media on film, once film for capture and distribution has disappeared.
- Carsten