
Wikipedia:Reference desk/Archives/Miscellaneous/2015 November 13

Welcome to the Wikipedia Miscellaneous Reference Desk Archives
The page you are currently viewing is an archive page. While you can leave answers for any questions shown below, please ask new questions on one of the current reference desk pages.


November 13


Old TV / Camera footage


Was camera/video footage really bad in decades past? There's a distinct kind of 'quality' you see on films or recordings from the '80s, '70s, and earlier. Or is this difference in picture quality explained by the media it was shot on degrading over time? I'm noticing old news reports and footage from the '90s and '00s starting to look like they were shot in the '80s.

Sadly I haven't lived long enough to say for sure, but if anyone was watching broadcasts in, say, the '60s or '70s (assuming it was color), was it just as good in terms of picture quality as standard resolution today? — Preceding unsigned comment added by 80.195.27.47 (talk) 17:39, 13 November 2015 (UTC)[reply]

It was certainly not hi-def. The technologies have pushed each other, over time. ←Baseball Bugs What's up, Doc? carrots 17:49, 13 November 2015 (UTC)[reply]
Prior to digital storage, most video was stored on analog magnetic media, basically video tape. Here's a pretty good discussion of the degradation of video tape, along with decay rates. The figures in the article imply about a 1% loss of quality per year. --Jayron32 17:54, 13 November 2015 (UTC)[reply]
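As a rough back-of-the-envelope sketch of what that figure implies (my own arithmetic, assuming the ~1% annual loss compounds; the linked article may model the decay differently):

    # Assumed model: ~1% quality loss per year, compounding.
    # Fraction of original signal quality remaining after n years is 0.99 ** n.
    for years in (10, 20, 30, 40):
        remaining = 0.99 ** years
        print(f"after {years} years: ~{remaining:.0%} of original quality remains")

By that estimate, a tape recorded in 1985 would retain only about three-quarters of its original quality by 2015, which fits the OP's observation that even '90s footage is starting to look dated.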
As for resolution, modern HDTV has from 720 to 1080 lines (4K resolution is 2160 lines, but that doesn't seem to have caught on yet), while analog TV had from 405 to 625 lines. HDTV is also wider, so you get more detail that way, too. As for why old broadcasts seem crappy on modern TVs, part of that is due to the proliferation of large-screen TVs. Analog TV didn't look good on large-screen TVs (say, 60 inches diagonally) unless viewed from quite a distance, so few people had them (also because they were much more expensive in real terms than today). You generally only saw them in sports bars. StuRat (talk) 18:17, 13 November 2015 (UTC)[reply]
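To put those line counts in rough pixel terms, here is a sketch using assumed figures of my own (a 625-line analog picture is commonly digitized at 720×576, and 1080-line HD is 1920×1080; these numbers are illustrative, not from the thread):

    # Assumed frame sizes, for illustration only.
    analog_w, analog_h = 720, 576     # typical digitization of a 625-line picture
    hd_w, hd_h = 1920, 1080           # "full HD" frame
    analog_px = analog_w * analog_h   # 414,720 pixels
    hd_px = hd_w * hd_h               # 2,073,600 pixels
    print(f"analog ~{analog_px:,} px, HD ~{hd_px:,} px, "
          f"about {hd_px / analog_px:.0f}x more detail")

So even before tape decay enters the picture, an HD frame carries roughly five times the information of an analog one, which is a big part of why analog-era footage looks soft when blown up on a large modern screen.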
Well, there are two separate issues at play. One is how the thing was made in the first place, and the other has to do with how/if it has degraded since. I will focus only on the first issue here. Movies made in the '70s and earlier were usually shot and stored on film stock; some were then converted to video tape, and later remastered into digital formats.
Analog film imparts varying qualities on the image, depending on its physical response to different types of light at different intensities. See Color_motion_picture_film, List_of_motion_picture_film_stocks.
Basically, each brand and make of film would have its own color balance and color temperature. (Incidentally, this is also why black people and other persons of color often don't look good in some old movies and photos - the film technologies were balanced to make the skin of white people look good [1] [2] [3].)
Film buffs can often tell from a few still images if something was shot on Kodachrome or Fujifilm. If you restrict attention to just movies, much of the "old time look" you experience is due to the type of film they were shot on. SemanticMantis (talk) 18:28, 13 November 2015 (UTC)[reply]
I believe this Q is about TV broadcasts, not film. StuRat (talk) 18:49, 13 November 2015 (UTC)[reply]
Yes, well, your belief notwithstanding, OP said in part "films from the ... 70's", so I gave a reply relevant to that. In my (WP:OR) experience, the look and feel of old films has a lot to do with the type of films that were used. SemanticMantis (talk) 23:44, 13 November 2015 (UTC)[reply]
Some TV broadcasts were actually originally shot on film, rather than video tape. The "look and feel" of film is different from video, and some TV directors want that specific look and feel. Famously in the U.S., well into the video tape era, NFL Films insisted on shooting all of its highlight films on high-speed, high-quality 35 mm film. The TV show Scrubs was also shot on film (specifically 16 mm film rather than tape or digital), despite being made during the digital era. Film stock also degrades; the Wikipedia article Film preservation covers some of this. --Jayron32 19:17, 13 November 2015 (UTC)[reply]
I've been watching some '70s and '80s clips out of nostalgia, and the thing I notice most about the quality is how filthy it is. Like, literal dirt on the film. Hairs, dust specks, chunks of stuff falling in and out of camera view. Watch an old cartoon some time - the wide areas of colour make that stuff really stand out. Older material, especially lower-budget stuff like commercials and local shows, also has problems with focal depth. The end result is that, even before you get into technical differences, you already have to forgive a lot of shortcomings. 99.235.223.170 (talk) 02:38, 14 November 2015 (UTC)[reply]
It's very easy for film to accumulate stuff, and it takes extra effort to clean it up. When they redid The Wizard of Oz a few years ago, they made digital copies of every frame and then individually "cleaned" the digital copies. This is by no means a new problem. It was made fun of in a 1940s WB cartoon, in which a guy was singing a nice melody and there appeared to be a hair in the projector - actually cartooned in; at one point the singer stopped abruptly and yelled "Get that hair out of there!" ←Baseball Bugs What's up, Doc? carrots 02:50, 14 November 2015 (UTC)[reply]
The OP might also be interested in the soap opera effect. Dismas|(talk) 16:45, 15 November 2015 (UTC)[reply]