nogoodnamesleft
Oct 25, 2017
7,664
I know most TVs at the time maxed out at 480i (and EDTVs came later), but was there a technical reason why this horrible resolution was a thing? I mean, it's literally an unstable, flickering image... it should have been unacceptable on every side (from the broadcasts/console output AND from the TV manufacturers, or whoever was responsible for 480/576i being a "thing").

It just seems wild to me when you think about it. The picture is literally shaking lmao. And we just accepted it all our life huh.

Was it a bandwidth issue? Could broadcasts not send progressive signals? From what I've googled, the bandwidth needed for 480p and 480i wasn't even that different anyway!
 

Mullet2000

Member
Oct 25, 2017
5,907
Toronto
The short version of it is that it didn't look bad on TVs at the time. CRTs display it "properly" while modern HDTVs don't, so it looks horrible on them.

It was done to limit the bandwidth of video signals and reduce flickering of video.
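
As a rough back-of-envelope (my own numbers, assuming standard NTSC timing, nothing from the post), here's why interlacing halves the bandwidth:

```python
# Why 480i halves the bandwidth: assumed NTSC figures, i.e. 525 total
# lines per frame, ~59.94 fields/s, and ~4.2 MHz of luma detail squeezed
# into a 6 MHz broadcast channel.

FIELD_RATE = 59.94      # fields per second (approx.)
TOTAL_LINES = 525       # lines per full frame
LUMA_MHZ_480I = 4.2     # horizontal detail NTSC actually carried

# 480i draws 262.5 lines per field; a hypothetical 480p60 broadcast would
# draw all 525 lines in the same 1/60 s, halving the time per line.
line_period_480i = 1e6 / (FIELD_RATE * TOTAL_LINES / 2)   # ~63.6 us
line_period_480p = 1e6 / (FIELD_RATE * TOTAL_LINES)       # ~31.8 us

# The same horizontal detail in half the line time needs twice the
# signal bandwidth -- which no longer fits a 6 MHz channel.
luma_mhz_480p = LUMA_MHZ_480I * (line_period_480i / line_period_480p)

print(f"480i:   {line_period_480i:.1f} us/line, ~{LUMA_MHZ_480I} MHz of luma")
print(f"480p60: {line_period_480p:.1f} us/line, ~{luma_mhz_480p:.1f} MHz needed")
```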
 

Derachi

Member
Oct 27, 2017
7,699
The short version of it is that it didn't look bad on TVs at the time. CRTs display it "properly" while modern HDTVs don't, so it looks horrible on them.
Thread over in 1. 480i on CRTs was totally acceptable.

I'd pay extra for a modern TV with proper support for older resolutions!
 

DopeyFish

Member
Oct 25, 2017
10,796
Because bandwidth.

Cable used to blast every station at you. So by using 480i, they could then have twice as many TV stations.
 

xir

Member
Oct 27, 2017
12,577
Los Angeles, CA
everything was 480i on NTSC, 480p is like a phantom operating output (240p or whatever still has to be rendered as 480i on NTSC).
CRTs did interlacing because the image would start to fade, so by drawing every other line it could keep things consistent, or something along those lines.

A better question is why interlacing wasn't dropped when new formats came out... 1080i will always seem weird to me. Also, all those hologram recordings in Star Wars are interlaced... gross.

edit: this does have me wondering, when a game was 60fps back in the old days, did that just mean it drew each frame on each scanline? I remember disc 2 of Gran Turismo arcade/night races looking silky smooth
 
OP
OP
nogoodnamesleft
Oct 25, 2017
7,664
The short version of it is that it didn't look bad on TVs at the time. CRTs display it "properly" while modern HDTVs don't, so it looks horrible on them.

I remember even as a clueless young one I still wondered why the hell my TV was never stable and always shaking.

It was so weird, but I never knew until much later what it was lol
 

Dreamboum

Member
Oct 28, 2017
22,865
I don't think consumer CRTs can output 480p so 480i was the way to go to give games a higher resolution without resorting to 240p
 
OP
OP
nogoodnamesleft
Oct 25, 2017
7,664
Thread over in 1. 480i on CRTs was totally acceptable.

I'd pay extra for a modern TV with proper support for older resolutions!

I have a Sony Trinitron with RGB SCART and a PVM 20L5, and I still think 480i is disgusting on it haha. Acceptable, yes, barely, and of course MUCH better than on a flat panel, but once I got a progressive scan TV back then I was mindfucked by how much better it was.

And yeah, I guess we had no choice, so we just accepted it. Technically it's acceptable haha
 

BumbleChump

Member
Aug 19, 2018
536
Most consumer CRT TVs only supported 15kHz, which means they can only do 240p or 480i. It wasn't until later, when CRTs could do 31kHz, that they could go to 480p and above. CRT tech is a deep rabbit hole I dug into recently while fixing my Astro City's CRT monitor.
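
The 15kHz/31kHz split falls straight out of the line arithmetic. A quick sketch, assuming NTSC-style field rates (my own numbers, not from the post):

```python
# Horizontal scan frequency = lines the deflection circuit sweeps per second.
# A 15 kHz consumer CRT tops out around ~15,700 lines/s, which is exactly
# what 240p and 480i ask for; 480p asks for double.

FIELD_RATE = 59.94   # fields (or progressive frames) per second, approx.

modes = {
    "240p": 262,     # whole lines per pass, same lines every time
    "480i": 262.5,   # the half line makes fields alternate odd/even lines
    "480p": 525,     # every line, every 1/60 s
}

for name, lines_per_pass in modes.items():
    scan_khz = lines_per_pass * FIELD_RATE / 1000
    print(f"{name}: {scan_khz:.1f} kHz horizontal scan")

# 240p -> ~15.7 kHz and 480i -> ~15.7 kHz (both fine on a 15 kHz set);
# 480p -> ~31.5 kHz (needs a 31 kHz-capable tube).
```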
 
Nov 8, 2017
13,111
The standard 3 cable connector didn't have enough bandwidth for 480p. You needed component cables for that. Most early TVs did not have component connections, this wouldn't be normal until quite late in the CRT game, relatively. Fuck, in the 90's people were using RF connectors and shit.

I have a CRT in my living room and I also assure you that composite 480i / 576i doesn't look flickery on them. It actually looks quite nice as long as the image isn't blown up too big.
 

EvilBoris

Prophet of Truth - HDTVtest
Verified
Oct 29, 2017
16,684
They used it because it used less bandwidth to achieve a similar image quality.
And it was developed in the 1940s, so was limited by the technology of the time
 
OP
OP
nogoodnamesleft
Oct 25, 2017
7,664
The standard 3 cable connector didn't have enough bandwidth for 480p. You needed component cables for that. Most early TVs did not have component connections, this wouldn't be normal until quite late in the CRT game, relatively. Fuck, in the 90's people were using RF connectors and shit.

I have a CRT in my living room and I also assure you that composite 480i / 576i doesn't look flickery on them. It actually looks quite nice as long as the image isn't blown up too big.

I'm no TV expert for sure (hence the thread), but I definitely see flicker, and pretty bad flicker tbh, on both of my CRTs. I mean, interlaced flickers inherently, doesn't it?

I think I'm just maybe super sensitive to this shit haha. Like, when I play 240p consoles on them they are clean and stable, but as soon as I fire up a 480i source like the PS2 for example, the PlayStation 2 logo flickers pretty badly. Can't even make out the registered trademark R because it's shaking lol
 
OP
OP
nogoodnamesleft
Oct 25, 2017
7,664
They used it because it used less bandwidth to achieve a similar image quality.
And it was developed in the 1940s, so was limited by the technology of the time

hey, thanks Boris! Love your stuff. So was it like "we make the TVs to adjust for the broadcast" or was it "we make broadcasts to cater to the TV" (in this case 480i)? Which came first, do you know?

this stuff fascinates me honestly
 

Manmademan

Election Thread Watcher
Member
Aug 6, 2018
16,019
I know most TVs at the time maxed out at 480i (and EDTVs came later), but was there a technical reason why this horrible resolution was a thing? I mean, it's literally an unstable, flickering image... it should have been unacceptable on every side (from the broadcasts/console output AND from the TV manufacturers, or whoever was responsible for 480/576i being a "thing").

It just seems wild to me when you think about it. The picture is literally shaking lmao. And we just accepted it all our life huh.

Was it a bandwidth issue? Could broadcasts not send progressive signals? From what I've googled, the bandwidth needed for 480p and 480i wasn't even that different anyway!

This is a super weird question. It's like asking "why were there ever black and white TVs, everything should have been in color."

The simple answer to this is that CRT technology wasn't capable of a progressive image. A CRT fires an electron beam from a gun at the back of the tube; the beam travels across the screen lighting up RGB phosphors to make the image: odd lines on one top-to-bottom pass, then even lines on the next (the beam is blanked on its way back up). Doing it that way increased the (perceived) frame rate of the image, since each half of the image was captured at a different time.
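
That "captured at a different time" detail is also why 480i holds up on a CRT but falls apart on modern panels. A toy sketch (my own hypothetical example, nothing from the thread): a bright bar moves between the two field captures, and naively weaving the fields into one frame, like a simple flat-panel deinterlacer, combs its edges:

```python
# Toy illustration of interlace "combing". A 2-pixel-wide bar moves right
# between the two field captures (taken 1/60 s apart).

HEIGHT, WIDTH = 8, 16

def render_frame(bar_x):
    """Render one full progressive frame with a 2-pixel bar at bar_x."""
    return [[1 if bar_x <= x < bar_x + 2 else 0 for x in range(WIDTH)]
            for y in range(HEIGHT)]

# Even lines captured with the bar at x=4; odd lines 1/60 s later, at x=6.
even_field = render_frame(4)[0::2]
odd_field = render_frame(6)[1::2]

# "Weave" the two fields into a single frame, as a naive flat panel would.
woven = [None] * HEIGHT
woven[0::2] = even_field
woven[1::2] = odd_field

for row in woven:
    print("".join("#" if v else "." for v in row))

# The printout shows a jagged, comb-toothed bar: the motion artifact.
# A CRT never stacks the fields spatially like this -- it draws each field
# in its own pass, and phosphor decay plus persistence of vision blend them.
```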

Television sets as created for most of the 20th century couldn't do a progressive signal. I don't think I saw a 480p television until the mid to late 90s. Computer monitors maybe, but those worked on a completely different standard.

When HDTV was just getting off the ground, there was a debate between 1080i and 720p as to which HD standard was "better" and IIRC 1080i was better for live sports since the image simply updated faster than 720p did.


im no tv expert for sure (hence the thread) but I definitely see flicker and pretty bad flicker tbh on both of my crts. I mean interlaced flickers inherently doesn't it?

I think I'm just maybe super sensitive to this shit haha. Like when I play 240p consoles on them they are clean and stable but soon as I fire up a 480i source like ps2 for example the playstation 2 logo for example flickers pretty badly-can't even make out the registered trademark R cos it's shaking lol

The PS2 logo shouldn't be "flickering" at all on a good CRT set with good connections. Are you sure you aren't thinking of "dot crawl"? If you have the PS2 hooked up with S-video or component cables, that will likely get rid of the issue even if the source IS 480i. The PS2 isn't limited to 480i either and *can* do 480p, btw.
 
Last edited:

VariantX

Member
Oct 25, 2017
16,890
Columbia, SC
I didn't even know what 480p looked like until I got a 480p monitor in the year 2000. That was back when people thought games running at 640x480 were the bee's knees, since everything else was 320x240 for the most part.
 
OP
OP
nogoodnamesleft
Oct 25, 2017
7,664
This is a super weird question. It's like asking "why were there ever black and white TVs, everything should have been in color."

The simple answer to this is that CRT technology wasn't capable of a progressive image. A CRT fires an electron beam from a gun at the back of the tube; the beam travels across the screen lighting up RGB phosphors to make the image: odd lines on one top-to-bottom pass, then even lines on the next (the beam is blanked on its way back up). Doing it that way increased the (perceived) frame rate of the image, since each half of the image was captured at a different time.

Television sets as created for most of the 20th century couldn't do a progressive signal. I don't think I saw a 480p television until the mid to late 90s. Computer monitors maybe, but those worked on a completely different standard.

When HDTV was just getting off the ground, there was a debate between 1080i and 720p as to which HD standard was "better" and IIRC 1080i was better for live sports since the image simply updated faster than 720p did.

I don't think it's super weird, it's more curiosity. And on top of that, TVs absolutely were able to do progressive signals: 240p was a thing. Now, I don't know if they used some trickery to display it (hence the question), but 240p was a thing. If TVs couldn't display progressive signals, that begs the question: how did consoles manage to do 240p? What was the method behind that?
 

Manmademan

Election Thread Watcher
Member
Aug 6, 2018
16,019
I own a consumer CRT that can output 480p. Hell it can output up to 1080i

Consumer CRTs that can do 480p/720p/1080i exist, but they showed up very late into the lifespan of those things. By the time the tech was feasible, LCDs and Plasmas were coming down in price, and CRTs of any decent size were REALLY freaking heavy.


I don't think it's super weird, it's more curiosity. And on top of that, TVs absolutely were able to do progressive signals: 240p was a thing. Now, I don't know if they used some trickery to display it (hence the question), but 240p was a thing. If TVs couldn't display progressive signals, that begs the question: how did consoles manage to do 240p? What was the method behind that?

No consumer televisions did 240p, at least not in the United States. There was no 240p media, and no 240p broadcast standard. As someone else mentioned, you can't get a progressive signal to a television using an RF cable, composite, or S-video. And these were the only available connections all the way up to the 1990s (at least in the States; RGB connections were a thing in the EU).

Non-hi-def TVs (and non-EDTVs, which came very late to the party) could not handle a progressive image. They didn't have the bandwidth, and the technology in them was always displaying an interlaced image. If your console did 240p internally but was connected to your standard-def TV, the output would have been interlaced, not progressive.
 
Last edited:

Listai

50¢
Member
Oct 27, 2017
5,665
I have a Sony Trinitron with RGB SCART and a PVM 20L5, and I still think 480i is disgusting on it haha. Acceptable, yes, barely, and of course MUCH better than on a flat panel, but once I got a progressive scan TV back then I was mindfucked by how much better it was.

And yeah, I guess we had no choice, so we just accepted it. Technically it's acceptable haha

Honestly the 20L5 is a great set, but 800 TVL is not kind to 480i; you're able to discern the flickering between the fields much more than on a consumer set.

I have a 20L2, and while 480i looks serviceable on it, it definitely looked better on my RGB-modded KVAR25M31 Trinitron.
 

turbobrick

Member
Oct 25, 2017
13,085
Phoenix, AZ
The standard 3 cable connector didn't have enough bandwidth for 480p. You needed component cables for that. Most early TVs did not have component connections, this wouldn't be normal until quite late in the CRT game, relatively. Fuck, in the 90's people were using RF connectors and shit.

I have a CRT in my living room and I also assure you that composite 480i / 576i doesn't look flickery on them. It actually looks quite nice as long as the image isn't blown up too big.

Yeah, I have a CRT that I got sometime in 2002, I think, and was surprised it has component input. Because of that I ended up getting the PS2 component cables, and games looked pretty good on it from what I remember.
 
OP
OP
nogoodnamesleft
Oct 25, 2017
7,664
Honestly the 20L5 is a great set, but 800 TVL is not kind to 480i; you're able to discern the flickering between the fields much more than on a consumer set.

I have a 20L2, and while 480i looks serviceable on it, it definitely looked better on my RGB-modded KVAR25M31 Trinitron.

hey, we have basically the same setup!

and yeah, def agree it looks better on the Trini, but it's also not what I would call a stable image lol

hey, I see some people getting downscalers to downscale 480i to 240p. Have you tried this? What does it look like, does it look better?
 
Jun 30, 2018
117
In certain cases 480i can actually look better than 480p on a CRT. Games like MGS2, which have simple textures and run at 480i 60fps, have a very distinct scanline effect at the center of the image, which makes them appear sharper than 480p and more pleasing to the eye while in motion. Also, some games like Baldur's Gate: Dark Alliance 2 actually supersample from a higher resolution down to 480i, so flickering is kept to a minimum.

But if you just want the reason why countries used interlaced signals for TV... well, to put it simply, it was a cost-based decision.
 

GearDraxon

Member
Oct 25, 2017
2,786
When did the mentality that "my high-end TV can't show interlaced images properly, so they're unstable trash" start?
 

Listai

50¢
Member
Oct 27, 2017
5,665
hey, we have basically the same setup!

and yeah, def agree it looks better on the Trini, but it's also not what I would call a stable image lol

Oh definitely I'd much prefer a progressive image I just think it gets a bit of a bad rap.

That said with the PVM prices being what they are I've given up on a 31khz set.
 
OP
OP
nogoodnamesleft
Oct 25, 2017
7,664
Consumer CRTs that can do 480p/720p/1080i exist, but they showed up very late into the lifespan of those things. By the time the tech was feasible, LCDs and Plasmas were coming down in price, and CRTs of any decent size were REALLY freaking heavy.




No consumer televisions did 240p, at least not in the United States. There was no 240p media, and no 240p broadcast standard. As someone else mentioned, you can't get a progressive signal to a television using an RF cable, composite, or S-video. And these were the only available connections all the way up to the 1990s.

Non-hi-def TVs (and non-EDTVs, which came very late to the party) could not handle a progressive image. They didn't have the bandwidth, and the technology in them was always displaying an interlaced image. If your console did 240p internally but was connected to your standard-def TV, the output would have been interlaced, not progressive.

I've either discovered something absolutely mind-blowing and have been wrong all along, or maybe I am explaining it wrong.

Aren't all consoles up until the PS2 era outputting 240p??? There is a tangible difference in the stability of the image of the Super Nintendo, for example, compared to the PlayStation 2.
 

ascii42

Member
Oct 25, 2017
5,798
When HDTV was just getting off the ground, there was a debate between 1080i and 720p as to which HD standard was "better" and IIRC 1080i was better for live sports since the image simply updated faster than 720p did.
720p was better for sports, so ABC and Fox chose to broadcast in that format. The reason why it's better for fast moving content is that 1080i60 effectively resolves to 1080p30. The flip side of that is 1080i looks better for content that doesn't need to be 60 fps.
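
The arithmetic behind that trade-off, as a rough sketch (my own numbers, not from the post):

```python
# 1080i60 vs 720p60: what actually arrives every 1/60 s.

FIELD_RATE = 60                       # simplified; broadcast is ~59.94

# 1080i60: each 1/60 s slice carries only half the lines, so a complete
# 1080-line picture only comes together 30 times per second.
lines_per_slice_1080i = 1080 // 2     # 540 lines per field
full_frames_1080i = FIELD_RATE / 2    # 30 complete frames/s

# 720p60: every 1/60 s slice is a complete 720-line picture.
lines_per_slice_720p = 720
full_frames_720p = FIELD_RATE         # 60 complete frames/s

print(f"1080i60: {lines_per_slice_1080i} lines per 1/60 s, "
      f"{full_frames_1080i:.0f} complete frames/s")
print(f"720p60:  {lines_per_slice_720p} lines per 1/60 s, "
      f"{full_frames_720p} complete frames/s")

# Fast motion favors 720p (every snapshot is complete); static or slow
# content favors 1080i (more total lines per assembled frame).
```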
 

Manmademan

Election Thread Watcher
Member
Aug 6, 2018
16,019
I've either discovered something absolutely mind-blowing and have been wrong all along, or maybe I am explaining it wrong.

Aren't all consoles up until the PS2 era outputting 240p???

No. And as explained the PS2 can do 480p when appropriate. Valkyrie Profile 2 for instance is a 480p PS2 game. There used to be a super weird device called the Xploder that would work with the PS2 and *force* it to internally render games in an HD image above 480p, so the PS2 was definitely capable of higher resolutions, devs just never used it. I may still have my Xploder disc somewhere.

There is a tangible difference in the stability of the image of the Super Nintendo, for example, compared to the PlayStation 2.

I think you might be confusing "stability" with "resolution" and "dot crawl" here.

720p was better for sports, so ABC and Fox chose to broadcast in that format. The reason why it's better for fast moving content is that 1080i60 effectively resolves to 1080p30. The flip side of that is 1080i looks better for content that doesn't need to be 60 fps.

You're probably correct here. I was going off of memory, and it's been a REALLY long time since I even had to think about the 1080i/720p debate.
 

Deleted member 8752

User requested account closure
Banned
Oct 26, 2017
10,122
i have a Sony trinitron with rgb scart and a pvm 20l5 and I still think 480i is disgusting on it haha acceptable yes barely and of course MUCH better than on a flat panel but once I got a progressive scan tv back then I was Mindfucked how much better it was

And yeah I guess we had no choice so we just accepted to technically it's acceptable haha
Actually, the better your CRT, the more you'll notice the 480i interlacing. A PVM 20L5 will show the interlacing much more than a PVM 20M2U, for example, since it has an additional 200 TV lines of resolution.

A consumer set that can accept component or RGB is going to be even better at hiding the temporal "flaws" of the interlaced signal.
 

Deleted member 8752

User requested account closure
Banned
Oct 26, 2017
10,122
No consumer televisions did 240p, at least not in the united states. There was no 240p media, and no 240p broadcast standard. As someone else mentioned, you can't get a progressive signal to a television using an RF cable, composite, or S-video. And these were the only available connections all the way up to the 1990s (at least in the states, RGB connections were a thing in the EU).

Non hi def TVs (and non-EDTVs, which came very late to the party) could not handle a progressive image. They didn't have the bandwith and the technology in them was always displaying an interlaced image. If your console did 240p internally but was connected to your Standard Def TV, the output would have been interlaced, not progressive.

240p is a hack of 480i. Any CRT that takes 480i natively, takes 240p natively too. It just blanks out half the lines of resolution and shows the non-blanked lines every frame instead of interlacing them. Thus, every SD CRT in North America is 240p capable, and wouldn't show an "interlaced" signal when fed 240p.

That's why the retro folks are so gaga about scanlines - that's what 240p looks like.
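
The "hack" is in the vertical timing. A tiny sketch of the idea (assumed NTSC line counts, my own illustration):

```python
# Interlace comes from the half line: 262.5 lines per field means each new
# field starts half a line lower, so its scanlines land BETWEEN the previous
# field's. Consoles instead send whole-line fields, so every pass retraces
# the exact same lines -- a rock-steady image with dark gaps ("scanlines")
# where the other field's lines would have been.

LINES_480I_FIELD = 262.5   # broadcast 480i: fields alternate odd/even lines
LINES_240P_FIELD = 262     # console 240p: every field scans the same lines

def field_start_offset(lines_per_field, field_index):
    """Fractional line offset at which a given field starts drawing."""
    # Only the fractional part of the accumulated line count matters for
    # where the scanlines land vertically.
    return (lines_per_field * field_index) % 1.0

for name, lpf in [("480i", LINES_480I_FIELD), ("240p", LINES_240P_FIELD)]:
    offsets = [field_start_offset(lpf, i) for i in range(4)]
    print(name, "field start offsets:", offsets)

# 480i -> [0.0, 0.5, 0.0, 0.5]: alternating half-line offsets (interlace)
# 240p -> [0.0, 0.0, 0.0, 0.0]: every field on the same lines (no flicker)
```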
 
OP
OP
nogoodnamesleft
Oct 25, 2017
7,664
No. And as explained the PS2 can do 480p when appropriate. Valkyrie Profile 2 for instance is a 480p PS2 game. There used to be a super weird device called the Xploder that would work with the PS2 and *force* it to internally render games in an HD image above 480p, so the PS2 was definitely capable of higher resolutions, devs just never used it. I may still have my Xploder disc somewhere.



I think you might be confusing "stability" with "resolution" and "dot crawl" here.



You're probably correct here. I was going off of memory, and it's been a REALLY long time since I even had to think about the 1080i/720p debate.

no no. Not including PS2, <up until> PS2. So everything before. I know the PS2, GC and Xbox support 480p on select titles, and some even 1080i. I meant previous.

And yeah man, sorry to tell you, but CRTs do display 240p from consoles before that. Every single one has resolutions of usually 320x240 progressive, and the Genesis has some titles at 256x240 progressive.

So TVs definitely could display progressive signals.

From what I've read, most CRTs could display 240p/480i/576i.

Then much later, basically just before flat panels started becoming the norm, HD CRTs could go higher.
 
Nov 8, 2017
13,111
I'm no TV expert for sure (hence the thread), but I definitely see flicker, and pretty bad flicker tbh, on both of my CRTs. I mean, interlaced flickers inherently, doesn't it?

I think I'm just maybe super sensitive to this shit haha. Like, when I play 240p consoles on them they are clean and stable, but as soon as I fire up a 480i source like the PS2 for example, the PlayStation 2 logo flickers pretty badly. Can't even make out the registered trademark R because it's shaking lol

It flickers, but because of the way CRTs work, it's dramatically less noticeable than it is on an LCD. Although it's possible that if you have, like, a super speccy advanced CRT, the extra clarity afforded by that might increase noticeability. I have to look super close to notice anything on my cheap junker. At normal viewing distances I can't tell at all that it's flickering.

Thank god too, because I'm in a PAL region and most systems here just flat out didn't let you use progressive scan, even if they did in America.
 

Lump

One Winged Slayer
Member
Oct 25, 2017
16,034
I remember playing Melee on a Sony Trinitron back in the day with the standard connection, and then upgrading to component later since the TV supported it, and being amazed how much better it looked in progressive scan - not just the sharpness but the colors. It was like a whole new experience.
 

Polyh3dron

Prophet of Regret
Banned
Oct 25, 2017
9,860
everything was 480i on NTSC, 480p is like a phantom operating output (240p or whatever still has to be rendered as 480i on NTSC).
CRTs did interlacing because the image would start to fade, so by drawing every other line it could keep things consistent, or something along those lines.

A better question is why interlacing wasn't dropped when new formats came out... 1080i will always seem weird to me. Also, all those hologram recordings in Star Wars are interlaced... gross.

edit: this does have me wondering, when a game was 60fps back in the old days, did that just mean it drew each frame on each scanline? I remember disc 2 of Gran Turismo arcade/night races looking silky smooth
Most video games from consoles before the Dreamcast and PS2 were actually running at 240p. There were a small number of 480i games on PS1 with actual gameplay in that resolution, but Gran Turismo was not one of them. It did switch to 480i for its menus, though. When trying to play games that do this through an upscaler, it can result in a black screen while the scaler changes resolution, which can be a pain.
 
OP
OP
nogoodnamesleft
Oct 25, 2017
7,664
240p is a hack of 480i. Any CRT that takes 480i natively, takes 240p natively too. It just blanks out half the lines of resolution and shows the non-blanked lines every frame instead of interlacing them. Thus, every SD CRT in North America is 240p capable, and wouldn't show an "interlaced" signal when fed 240p.

That's why the retro folks are so gaga about scanlines - that's what 240p looks like.

Thank you! This is what I meant. So that's why it's a "progressive" signal: because it's actually missing half of the information between the scanlines? Am I reading that right?
 

Manmademan

Election Thread Watcher
Member
Aug 6, 2018
16,019
no no. Not including PS2, <up until> PS2. So everything before. I know the PS2, GC and Xbox support 480p on select titles, and some even 1080i. I meant previous.

And yeah man, sorry to tell you, but CRTs do display 240p from consoles before that. Every single one has resolutions of usually 320x240 progressive, and the Genesis has some titles at 256x240 progressive.

So TVs definitely could display progressive signals.

From what I've read, most CRTs could display 240p/480i/576i.

Then much later, basically just before flat panels started becoming the norm, HD CRTs could go higher.

This is absolutely, positively incorrect. Don't know what else to tell you. SDTV Consumer TVs can't accept a progressive image without a component connection, and did not display one. Used to sell these things for a living, so I'm definitely sure on it.
 

Listai

50¢
Member
Oct 27, 2017
5,665
No consumer televisions did 240p, at least not in the United States. There was no 240p media, and no 240p broadcast standard. As someone else mentioned, you can't get a progressive signal to a television using an RF cable, composite, or S-video. And these were the only available connections all the way up to the 1990s (at least in the States; RGB connections were a thing in the EU).

Nah. While I guess you're right about there being no 240p broadcast standard, all consoles up until the PS2, DC etc. were outputting 240p images through RF, composite, S-video and RGB.
 

Stefarno

I ... survived Sedona
Member
Oct 27, 2017
893
If you think 480i is bad, at least you weren't in Europe, where we'd often get games badly converted to PAL, meaning they had black borders around the image and ran about 17% slower (50Hz vs. 60Hz).
 

Listai

50¢
Member
Oct 27, 2017
5,665
This is absolutely, positively incorrect. Don't know what else to tell you. SDTV Consumer TVs can't accept a progressive image without a component connection, and did not display one. Used to sell these things for a living, so I'm definitely sure on it.

You are fiercely, powerfully wrong.

You're conflating a 31kHz progressive 480p image with a 15kHz 240p image. Even then it's still wrong, as you can send 480p over RGB.
 

Deleted member 8752

User requested account closure
Banned
Oct 26, 2017
10,122
This is absolutely, positively incorrect. Don't know what else to tell you. SDTV Consumer TVs can't accept a progressive image without a component connection, and did not display one. Used to sell these things for a living, so I'm definitely sure on it.
You're not correct at all. There are entire websites and forums devoted to this topic.

Heck, I'm playing a 240p game over composite right now on my NES/CRT.
 
OP
OP
nogoodnamesleft
Oct 25, 2017
7,664
This is absolutely, positively incorrect. Don't know what else to tell you. SDTV Consumer TVs can't accept a progressive image without a component connection, and did not display one. Used to sell these things for a living, so I'm definitely sure on it.

sorry to tell you man, but have a look at the other posts quoting you

You had me concerned there, because I was sure all CRTs did 240p (which they clearly do)
 

xir

Member
Oct 27, 2017
12,577
Los Angeles, CA
Most video games from consoles before the Dreamcast and PS2 were actually running at 240p. There were a small number of 480i games on PS1 with actual gameplay in that resolution, but Gran Turismo was not one of them. It did switch to 480i for its menus, though. When trying to play games that do this through an upscaler, it can result in a black screen while the scaler changes resolution, which can be a pain.
For GT it was more about frame rate. NTSC is 29.97, no variance. And again, a 240p signal is still being spit out as 480i in the end to run on an NTSC CRT.
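
For the curious, the 29.97 figure falls out of NTSC color timing. A quick sketch of the standard arithmetic (added for context, not from the post):

```python
# Where 29.97 comes from: the NTSC color line rate was tied to the 4.5 MHz
# audio carrier so the chroma subcarrier would interleave cleanly with it.

line_rate = 4_500_000 / 286    # ~15,734.27 lines/s (the 15.7 kHz scan)
frame_rate = line_rate / 525   # 525 lines per full frame
field_rate = frame_rate * 2    # two interlaced fields per frame

print(f"line rate:  {line_rate:,.2f} Hz")
print(f"frame rate: {frame_rate:.5f} Hz (the famous 29.97)")
print(f"field rate: {field_rate:.5f} Hz (~59.94 fields/s)")
```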