Comments
Dracius t1_iuivg5u wrote
I'm curious if OP is referring to older shows on older TVs, because if you watch any '80s/'90s-era cartoons on Netflix/Hulu/whatever, they're very saturated and bright.
[deleted] t1_iujtc3f wrote
[deleted]
chris-ronin t1_iujvk6l wrote
just because the relative range is the same, it doesn't mean the consistency is. the analog signal and the color subcarrier chew the hell out of whatever you're broadcasting. many families still had their 70s-era color tvs, which would be fuzzy and off-tint well into the cable era. it was more about minimizing damage than achieving a pristine picture.
now that the hardware baseline is so good, you can count on that extra quality overhead to show up, even with the same rgb range.
there's a reason component cables didn't push past 1080i. you lost quality at literally EVERY transmission step before things went digital.
[deleted] t1_iuk6d52 wrote
[deleted]
lazydogjumper t1_iujv5xo wrote
That is only partially correct. They have a similar range, but the quality of the image, not to mention of the broadcast itself, factored into how the images were colored. Thus, older shows DO have more/darker color.
[deleted] t1_iuk7sm2 wrote
[deleted]
lazydogjumper t1_iuk9yhi wrote
More saturated colors mean that more of the image is discernible even when it is not as clear. It's the same reason many early cartoons had thicker outlines.
[deleted] t1_iukbxlz wrote
[deleted]
chris-ronin t1_iuitzr3 wrote
most of it is simply the result of the analog production process, and the losses inherent to each step. so you basically overshoot with bold colors so that it still looks decent at the other end.
older cartoons were produced with cels and photographic film. in that process you are literally blocking or transmitting light through both the cels and the film, at the time they're photographed, when they're reproduced, and eventually when they're transferred to an electronic signal, which back then was also analog and had its own transmission loss.
every part of that process results in a loss of both detail and 'dynamic range', so to speak, because passing light through something that is partly opaque is an inherently lossy process.
add to that all the other hand production methods, cel painting, the relative quality of transfer technologies at the time, and yes, the fact that the end product showed up either on a low-res tv screen or a movie screen, which, although theoretically capable of high detail, was also degraded by repeated showings, flicker, and reproduction.
in addition, each color beyond a flat reference color requires more attention to detail to keep consistency between artists.
we really take for granted how much the digital process has opened up for art reproducibility in the last 30 years.
FeliusSeptimus t1_iuj8bl1 wrote
> we really take for granted how much the digital process has opened up for art reproducibility in the last 30 years.
Yep. I worked on the digital team in a video production shop for a short time in mid-1990 and it was interesting watching the analog guys setting up to send video up to us for capture. They'd load up a tape and then fiddle around with half a dozen knobs while watching a little analog 'scope screen that plotted several indicator dots showing information about the image quality. The screen was marked with little boxes indicating the ideal value for each parameter, and each knob would affect some or all of the parameters. The guy running it would spend a minute or two going back and forth adjusting knobs to try to get as many of the indicator dots as close to the target boxes as he could. He said they could get it pretty close, but they'd never get them all into the boxes, and if you loaded up the same tape again next week to do it again the dots would be in different locations. He said that's why they called NTSC video 'Never Twice the Same Color'.
Today it would be interesting to take that old equipment and connect it to a machine learning system and see how well it could adjust the inputs to get it set up precisely.
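As a thought experiment, that knob-tuning maps neatly onto a simple optimization loop. Everything below is a toy sketch, not a model of real NTSC gear: the coupling matrix, targets, and knob count are all invented for illustration.

```python
import random

random.seed(0)

# Toy model: 3 knobs each influence 4 scope parameters at once, like the
# coupled analog controls described above. All numbers are invented.
COUPLING = [
    [0.9, 0.3, 0.0, 0.1],  # knob 0 mostly drives parameter 0
    [0.2, 0.8, 0.4, 0.0],
    [0.0, 0.1, 0.7, 0.5],
]
TARGETS = [1.0, 0.5, 0.8, 0.3]  # the "little boxes" on the scope

def readout(knobs):
    """Indicator-dot positions produced by the current knob settings."""
    return [sum(k * row[i] for k, row in zip(knobs, COUPLING))
            for i in range(len(TARGETS))]

def error(knobs):
    """Total squared distance of the dots from their target boxes."""
    return sum((r - t) ** 2 for r, t in zip(readout(knobs), TARGETS))

def tune(knobs, steps=2000, delta=0.01):
    """Greedy knob-twiddling: nudge a random knob, keep changes that help."""
    knobs = list(knobs)
    for _ in range(steps):
        i = random.randrange(len(knobs))
        for d in (+delta, -delta):
            trial = knobs[:]
            trial[i] += d
            if error(trial) < error(knobs):
                knobs = trial
                break
    return knobs

start = [0.0, 0.0, 0.0]
tuned = tune(start)
print(f"error before: {error(start):.3f}, after: {error(tuned):.3f}")
```

With more parameters than knobs, the loop can only approach the targets, never land every dot exactly in its box, which matches the operator's experience above.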
chris-ronin t1_iujbq3l wrote
honestly, it's why i bristle at the nitpicks of most modern tech reviews. the cheapest walmart android tablet has better color calibration and picture quality than the most expensive consumer sony crt of the 90s. across the board, the quality and consistency of everything from the signal to the image is better than what i grew up watching and using.
[deleted] t1_iujviji wrote
[deleted]
chris-ronin t1_iujvzqv wrote
see my other comment. the point is more that you are accounting for the loss inherent in analog-to-analog transfer. cel to film. film to film. film to analog. analog to crt. even just correcting for exposure in the film process, you're playing chicken between contrast and detail. that's why those settings on tvs exist. it was very messy.
have you ever had to juggle a v-hold dial?
that’s why i put ‘dynamic range’ in quotes because it’s really about corrections to your detail and contrast and what gets lost, rather than the absolute capability of the signal, but it was the broadest answer without talking about things like photoshop exposure levels.
[deleted] t1_iuk5d3u wrote
[deleted]
chris-ronin t1_iuk6d7c wrote
home viewers aren't engineers. there will be a loss. at all steps. everywhere. expansion-of-the-universe style. even the top engineer will keep maybe 90-99% per step, and that's the best case. now repeat that multiple times. it's physics. when you re-record or retransmit something analog you lose. every time. in detail, in clarity, in absolute range. that's a very long pipe from the cel to the analog tv set, all steps included, and the art direction accounted for that.
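The compounding arithmetic behind that point is easy to sketch. The retention figures and step names below are illustrative, not measured values:

```python
# Illustrative only: if each analog generation keeps some fraction of the
# signal quality, losses compound multiplicatively across the pipeline.
steps = ["cel photography", "film-to-film copy", "film-to-analog transfer",
         "broadcast", "CRT display"]

for retention in (0.99, 0.95, 0.90):
    remaining = retention ** len(steps)
    print(f"{retention:.0%} kept per step x {len(steps)} steps "
          f"-> {remaining:.0%} of the original quality")
```

Even a 95%-per-step best case leaves roughly three quarters of the original quality after five generations, which is why the art direction overshot with bold colors.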
and also, it's way easier to instruct a girl in the paint department (how it was done then) to paint a solid color within the lines, from a specific color number, than to worry about how well she painted a subtle gradient. so yes, it was also partially due to the art production method.
so it's an artistic decision driven by the technical limitations, when even the BEST technicians were working within an upper quality bound.
see the sister reply to yours in this thread. he was doing analog-to-digital with experts, they had to account for the fiddliness, and that was a single-step transfer.
[deleted] t1_iuk6ud4 wrote
[deleted]
lazydogjumper t1_iukagtd wrote
It's the contrast, and live-action film requires a lot more color correction using lighting and lenses. The reason it wasn't all shows is that not all shows COULD. It wasn't as easy as slapping a filter on it.
Iyellkhan t1_iuj180r wrote
Honestly, it probably depends on how the post-production house prepared them. Everything has been scanned or copied to digital at this point. Sometimes that means the film was rescanned (if it still exists); sometimes it's just the NTSC master upscaled. Your black and white points may be at the whim of the junior color artists, which will partially determine the saturation and sharpness.
makesyoudownvote t1_iuj8z1a wrote
There are several reasons.
Back then, animated movies and shows were made by drawing directly onto paper and see-through cels (basically clear plastic sheets), and this usually included the colors. Today, even in the rare "hand drawn" cartoons that still exist, the color is usually done by computer.
The cels were usually the parts that would move or change the most, which is why, if you ever noticed in those cartoons, you can always tell which part is going to move: the color looks a little different, and smoother.
So with that understanding here are some reasons.
-
Cels don't hold color as well as paper. Everything drawn on them looks a little dulled.
-
The clear part of a cel stacked on paper slightly dims the color of the paper below it, because there's something in between.
-
Pigments (paint, markers, inks, dyes, crayons) are not perfect and are duller/dimmer than a computer generated image on a computer screen.
-
Pictures of pictures are never as vibrant and bright as the original pictures themselves. This is a little less true today as filters can boost colors of pictures afterwards, but this wasn't done back then.
-
Video tape used at the time, especially the cheaper tape used for broadcast-TV animation, has far less contrast and color range than newer cameras, and even those have far less than something made directly on the computer. TVs themselves could not show nearly as much color as today's TVs, so there wasn't much point in going for anything fancier unless it was a Disney-quality feature film.
-
Batman TAS and several shows that followed it broke the mold of animation of the era by drawing on black paper instead of white. This gave it a far darker and grittier look that was copied by a lot of animation that came after it.
sorkxnaud t1_iujb116 wrote
TIL that Batman TAS was drawn on black paper. That certainly explains the grittier look! My respect for that show’s artistry just grew!
stolid_agnostic t1_iuiilw2 wrote
Our color preferences change over time, and new ones will always seem bright and fresh to us. There are professionals with advanced degrees and lots of artistic experience who spend their careers developing these palettes and selling them to other companies: clothing manufacturers, artists, whatever. When you look at older palettes, like those from your childhood, they always seem dim and dark even though they seemed bright and fresh at the time, because there has been a continuous progression of colors since then and you have personally experienced that passage of time.
StuffinYrMuffinR t1_iuijjq3 wrote
While everything you said may or may not be true, TV shows/video games/movies were designed for the screens of the time. They were made to look a certain way on the technology of the day, and they look so different now because our technology is vastly different.
If you look at 16bit games on an old CRT it looks better than on a modern 4k TV.
stolid_agnostic t1_iuijyjs wrote
Oh I agree, a lot of programs look better on CRTs too, though many are definitely improved on a modern LCD.
[deleted] t1_iuiw1h5 wrote
[removed]
zero_z77 t1_iujfl1y wrote
So 3 reasons:
-
Gamma correction - old-school tube TVs work differently than modern LCD/LED screens, specifically in how pixel values map to the color intensity you see. With modern screens there is a mostly linear relationship between the pixel value and the intended brightness, so red 200 is meant to look twice as red as red 100. With tube TVs there was a power curve, so red 200 might come out 3 times as red as red 100, and you'd need to adjust the value down so that it looks right on the screen.
-
A lot of animation was done by hand - a lot of older animation was drawn on physical media, photographed, then assembled into a video reel. This means that the physical color palette used was restricted to what the artists had available at the time. The process used today is almost entirely digital and software can create basically any color you could ever want.
-
Style - in a lot of cases, the darker coloring was a stylistic choice. For example, Scooby-Doo has very dark coloring to add to the mystery/horror atmosphere of the show. This is still done in some modern cartoons & animated shows.
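The gamma point in the first item can be sketched numerically. The 2.2 exponent is the commonly cited approximation for CRT response; the specific values here are just illustrative:

```python
# A CRT's light output is roughly pixel_value ** gamma (gamma ~ 2.2), so
# doubling the pixel value far more than doubles the brightness unless
# the signal is pre-corrected with the inverse curve.
GAMMA = 2.2

def crt_output(value):
    """Approximate light intensity a CRT emits for a pixel value in 0.0-1.0."""
    return value ** GAMMA

def gamma_encode(intensity):
    """Pre-correct a linear intensity so the CRT displays it as intended."""
    return intensity ** (1 / GAMMA)

half = crt_output(0.5)       # ~0.22: much dimmer than "half bright"
encoded = gamma_encode(0.5)  # ~0.73: the value you'd actually store
print(f"raw 0.5 displays as {half:.2f}; store {encoded:.2f} to get 0.5")
```

Round-tripping `crt_output(gamma_encode(x))` returns `x`, which is exactly what the correction step is for.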
[deleted] t1_iujye39 wrote
[deleted]
shotsallover t1_iujfssr wrote
A lot of it depends on how they were stored. If they were stored on a high-quality master or digitally, odds are the original content had more colors than the old TVs could even show you.
Old NTSC TV had a relatively narrow range of colors (color space) it could support. When we switched to HD we got a much wider range of colors, and 4K has an even wider one. Many old TV shows were stored with a wider color space than old NTSC TVs could display, so when you saw them on your old NTSC TV, it "clipped" the colors to what the TV could show, which could have led to them looking washed out.
Many old TV shows have been remastered for the streaming age, and now that we have TVs that can show us all the colors the show was originally created with, we get to see them. Some shows have even had the colors "corrected" to make them look more realistic/vibrant on modern TVs.
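A toy sketch of that "clipping" idea, treating color as a single channel with a limited displayable range. The 16-235 bounds are the nominal broadcast video-levels range; the sample values are made up:

```python
# Out-of-range values all collapse to the same edge value, so colors that
# were distinct on the wide-gamut master wash together on the old display.
def clip_to_display(channel, lo=16, hi=235):
    """Clamp a channel value to the display's representable range
    (16-235 is the nominal broadcast 'video levels' range)."""
    return max(lo, min(hi, channel))

master_reds = [120, 200, 240, 250, 255]  # as stored on a wide master
shown = [clip_to_display(r) for r in master_reds]
print(shown)  # the three brightest reds become indistinguishable
```

A remaster for modern TVs can map those out-of-range values into the wider gamut instead of clamping them, which is why the colors reappear.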
shotsallover t1_iujj52h wrote
Also, many shows, particularly popular ones, were re-scanned from the original sources to prepare them for streaming. Many shows were shot on actual film (especially animated ones, though "shot" is a term used loosely), which has much better color than what NTSC could display. So when they went back, they took some effort to scan them "better" for modern TVs.
[deleted] t1_iuk4dcu wrote
[deleted]
lazydogjumper t1_iuial0x wrote
Depending on the show, they were colored differently to fit tv formats. Because of the way TVs worked, the brightness and graininess had to be factored into how the pictures looked. Now that we can see them in higher definition the images look saturated with color.