nogoodnamesleft
Oct 25, 2017
7,660
And if not, how did CRTs get away without needing vsync or similar to prevent screen tearing? Mainly thinking of 2D games; I never really remember tearing in those.

Just another thing CRTs were great at, huh :( They really were nearly perfect.
 

HTupolev

Member
Oct 27, 2017
2,436
Tearing is an actual discontinuity in the image being sent to a display, so it affects CRTs just as much as any other technology. Games did have to use vsync to prevent tearing.
 

JoJoBae

Member
Oct 25, 2017
1,493
Layton, UT
Really interesting. Actually, now that you mention it, yes, the early 3D games did have it, but what about 2D games? How come they never suffered from screen tearing? Vsync?
My understanding was that it had more to do with how pre-PS1/N64 systems' graphics worked. That's all I've got, though.

I know of a few original Xbox games that had screen tearing. And then there's Riddick that had a dynamic resolution scaler.
 

RGB

Member
Nov 13, 2017
657
Yes, most games used vsync; nothing different from today. The one difference is that a synced CRT refresh is an instantaneous change on the next vertical blank, whereas modern panels all add varying amounts of processing/state-switching latency. (Obviously this has become close to negligible on a quality display, but it took us a long time to get there.)
 

Pargon

Member
Oct 27, 2017
12,013
Really interesting. Actually, now that you mention it, yes, the early 3D games did have it, but what about 2D games? How come they never suffered from screen tearing? Vsync?
As I understand it, older 2D systems scanned out directly to the display rather than using a frame buffer, so it was always in sync (no tearing) without requiring v-sync; thus they were also very low latency.

Frame buffers were introduced later, since they have higher requirements; but while they have many advantages, using them also requires v-sync to prevent tearing since they are not synced to the scan-out by default any more.
 
OP
nogoodnamesleft
Oct 25, 2017
7,660
As I understand it, older 2D systems scanned out directly to the display rather than using a frame buffer, so it was always in sync (no tearing) without requiring v-sync; thus they were also very low latency.

Frame buffers were introduced later, since they have higher requirements; but while they have many advantages, using them also requires v-sync to prevent tearing since they are not synced to the scan-out by default any more.

edit: hmm, see above!

I seeeeeee, I understand now. I've been watching retro game channels and noticed that none of these old 2D games have screen tearing, and wondered how that worked. Very interesting, thank you :)
 

HTupolev

Member
Oct 27, 2017
2,436
but what about 2D games? How come they never suffered from screen tearing? Vsync?
Essentially yes.

Modern games generally render a frame as an image, and then tell the system to output that image to the display. When the game gets done rendering the next frame, it can tell the system to output this new image instead. Screen tearing is what happens if the system makes this switch while the first image is still being drawn to the display, so part of the display shows the old frame and part of the display shows the new frame.
"Vsync" means that the system only makes the switch if it's not in the process of sending a frame out.

In the 2D era, most consoles worked a bit differently. A game would basically tell the graphics hardware in the console what the scene was: what background image(s) to use, what sprites to draw where, etc. The console would then create the image pixel-by-pixel as the data is being sent to the display. There's no "framebuffer" containing a finished version of the frame.
As long as the scene isn't changed while it's being drawn out, the "vsync" happens implicitly. If you could change the scene while it's being output - say, by shifting the background by some amount when the frame is only halfway done - then there would be tearing halfway down the image.
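To make the modern half of that concrete, here's a toy C sketch. None of these names are a real graphics API; it just models a display reading one front buffer top to bottom while the game decides when to swap in the next frame.

```c
/* Toy model of scanout and buffer swapping. Nothing here is a real
 * graphics API; "frames" are just numbers and "scanout" is a loop. */
#include <stdio.h>

#define LINES 10                        /* pretend the display is 10 lines tall */

int main(void) {
    int old_frame = 1, new_frame = 2;
    const int *front = &old_frame;      /* buffer the display is reading */

    int new_frame_ready_at = 4;         /* game finishes frame 2 mid-scanout */
    int vsync = 0;                      /* set to 1 to defer the swap to vblank */

    for (int line = 0; line < LINES; line++) {
        if (line == new_frame_ready_at && !vsync)
            front = &new_frame;         /* immediate swap: the tear starts here */
        printf("line %d shows frame %d\n", line, *front);
    }
    if (vsync)
        front = &new_frame;             /* vsync: swap only during vblank */
    printf("next refresh starts with frame %d\n", *front);
    return 0;
}
```

With vsync = 0 the printout switches from frame 1 to frame 2 partway down, which is the tear; with vsync = 1 the whole refresh shows frame 1 and frame 2 only appears on the next one.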
 
Aug 30, 2020
2,171
Tearing was significantly less perceptible on a CRT. It was still technically there, and you'll hear a few people on here claiming they saw it all the time, but I played games with vsync off all the time back on CRT monitors and it rarely showed up. I expect that was due to the limited pixel persistence and the natural scanning of the display. It also didn't hurt that CRT monitors usually ran above 60 Hz, although something like 360 games on a CRT had (hard-to-detect) tearing.

I'm super sensitive to tearing on an LCD. I can't abide it. I can't play without vsync unless I have FreeSync/G-Sync (and then it's basically taken care of for me).
 

gebler

Member
Oct 27, 2017
1,269
As I understand it, older 2D systems scanned out directly to the display rather than using a frame buffer, so it was always in sync (no tearing) without requiring v-sync; thus they were also very low latency.

Frame buffers were introduced later, since they have higher requirements, but while they have many advantages, using them also requires v-sync to prevent tearing since they are not synced to the scan-out by default any more.
You seem to think of vsync in a very narrow sense, like how it works on PC graphics cards, where you have multiple full-resolution framebuffers and just flip them with vsync. But vsync as a concept is older than that: it was very much used on systems like the Atari 2600 (where you sync to a lot more of the raster beam than just the start of the frame, but you certainly do that as well), and on systems like the C64, where games usually used tile-based graphics and sprites but vsync was still pretty much required to achieve smooth scrolling without tearing.
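As a rough illustration of that older sense of vsync, here's a C sketch of the "wait for the beam, then touch the scroll register" pattern. The register pointers and the line number are placeholders standing in for memory-mapped hardware, not any real machine's layout.

```c
/* Sketch of vsync on a machine with no framebuffer: the "sync" is just
 * waiting for the raster beam to leave the picture before changing
 * anything it is drawing from. Placeholder registers, not a real map. */
#include <stdint.h>

#define LAST_VISIBLE_LINE 250               /* assumed bottom of the picture */

void scroll_one_pixel(volatile uint8_t *raster_line,   /* beam's current line */
                      volatile uint8_t *fine_scroll)   /* hardware fine-x scroll */
{
    /* Busy-wait until the beam is past the visible area: the "vsync". */
    while (*raster_line <= LAST_VISIBLE_LINE) { }

    /* The change now lands cleanly at the top of the next frame. */
    *fine_scroll = (*fine_scroll + 1) & 7;  /* fine scroll wraps every 8 px */
}
```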
 

RGB

Member
Nov 13, 2017
657
There's no "framebuffer" containing a finished version of the frame.
If image elements in the game weren't flickering on screen, there was a buffer involved, presented after the next vblank. Even if it's not called a frame buffer, it would be a buffer in main memory, a dedicated sprite, ROM, or whatever.

An aside: NES sprite flicker is a different thing, programmers choosing to present multiple banks of sprites on successive vblanks to exceed the hardware's fixed sprite limits.
 

Aeana

Member
Oct 25, 2017
6,938
The reason consoles like the NES didn't have screen tearing is because the PPU did not allow updating the video memory outside of the vblank period (except for a few special cases, like updating scroll position, which is how we got those neat line scrolling faux parallax effects in Sunsoft games).
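A loosely NES-shaped sketch of that rule in C. The names and the write_vram callback are illustrative, not the real PPU interface: game code queues tile changes whenever it likes, and only the vblank handler actually touches video memory.

```c
#include <stdint.h>

#define MAX_UPDATES 32

typedef struct { uint16_t vram_addr; uint8_t tile; } TileUpdate;

static TileUpdate queue[MAX_UPDATES];
static int queued = 0;

/* Game logic can call this at any point during the frame. */
void set_tile(uint16_t vram_addr, uint8_t tile) {
    if (queued < MAX_UPDATES)
        queue[queued++] = (TileUpdate){ vram_addr, tile };
}

/* Runs once per frame while the beam is in vertical blank. */
void on_vblank(void (*write_vram)(uint16_t addr, uint8_t value)) {
    for (int i = 0; i < queued; i++)
        write_vram(queue[i].vram_addr, queue[i].tile);  /* safe: nothing is drawing */
    queued = 0;
    /* Scroll position is the exception mentioned above: it can also be
     * changed mid-frame, which is what the line-scroll tricks rely on. */
}
```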
 

Aeana

Member
Oct 25, 2017
6,938
The NES and later consoles used video memory and that's definitely a form of a frame buffer. It's a stark contrast to something like the Atari 2600 which did not have any form of frame buffer/video memory, and thus you had to "race the beam" to write your graphics line-by-line to the TV.
 

RGB

Member
Nov 13, 2017
657
Let's not clutter up the OP's thread, but for those to be in memory (with the exception of hardware sprites) and displayed by the video hardware, something had to put them in a contiguous block so they could be written to the display efficiently. I.e., something assembled all the pieces of the background image in a block of memory beforehand; that would be a buffer.
 

Fularu

Member
Oct 25, 2017
10,609
Screen tearing was very common in the early days. Most systems weren't designed around 2D graphical abilities like scrolling, parallax scrolling, sprite manipulation and so on.

Look at scrolling games on the MSX, the ZX Spectrum, the Atari ST or the PC-88 to get an idea of how bad screen tearing could be.
 

HTupolev

Member
Oct 27, 2017
2,436
The NES and later consoles used video memory and that's definitely a form of a frame buffer.
There's a "frame buffer" insofar as the data required to draw a frame is buffered somewhere. But I was using "framebuffer" in this context to refer to a frame-resolution image that a game's graphics get drawn into, and which gets output to the display. The NES rendering system has no such thing.

Let's not clutter up the OP's thread, but for those to be in memory (with the exception of hardware sprites) and displayed by the video hardware, something had to put them in a contiguous block so they could be written to the display efficiently. I.e., something assembled all the pieces of the background image in a block of memory beforehand; that would be a buffer.
The NES background imagery uses 8x8 blocks referenced through a tilemap. It fetches new (8x1) tile data on the fly every 8 pixels as it goes through a scanline. There is no complete background image buffer anywhere.
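For a sense of what "fetched on the fly" means, here's a simplified C sketch of building a single background scanline from a tilemap and tile patterns. The sizes and layout are simplified, not the actual NES format; the point is that only one line ever exists, never a whole frame.

```c
#include <stdint.h>

#define TILES_PER_LINE 32               /* 32 tiles * 8 px = a 256-px line */
#define TILE 8

/* tilemap: one byte per cell naming a tile; patterns: 8x8 tiles,
 * stored one byte per pixel here purely for readability. */
void render_scanline(int y,
                     const uint8_t tilemap[][TILES_PER_LINE],
                     const uint8_t patterns[][TILE][TILE],
                     uint8_t out_line[TILES_PER_LINE * TILE])
{
    int map_row = y / TILE;             /* which tilemap row this line is in */
    int row_in_tile = y % TILE;         /* which of the tile's 8 lines */

    for (int x = 0; x < TILES_PER_LINE * TILE; x++) {
        /* A new 8x1 sliver of tile data is looked up every 8 pixels;
         * out_line holds one scanline, never a complete background. */
        uint8_t tile = tilemap[map_row][x / TILE];
        out_line[x] = patterns[tile][row_in_tile][x % TILE];
    }
}
```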
 

RGB

Member
Nov 13, 2017
657
The NES background imagery uses 8x8 blocks referenced through a tilemap. It fetches new (8x1) tile data on the fly every 8 pixels as it goes through a scanline. There is no complete background image buffer anywhere.
I mean, it's not contiguous but it's still static data, just abstracted one level. :P
For the OP's question, without being system-specific, we're still presenting an image without tearing.
 

krat0zs

Member
Jan 18, 2020
359
UK
I feel like screen tearing started with the PS2 era, infamously with God of War, but earlier games had it too (apparently as early as Dark Cloud). Anecdotally, I've been playing The Simpsons Hit and Run again, and even the GameCube version has screen tearing, which is odd in retrospect as Nintendo now seems to almost mandate full v-sync in games on its consoles. Not sure if the PS1, N64, Saturn or Dreamcast had any games with screen tearing, but I'd be interested to find out.
 

arcadepc

Banned
Dec 28, 2019
1,925
There was screen tearing in computer games too, both on DOS and Windows during the VGA and SVGA era, though there you could dabble with refresh rates and resolutions to eliminate or reduce it.
 

KDR_11k

Banned
Nov 10, 2017
5,235
A lot of 2D games were not just v-synced but had logic specifically timed to the H-blanks (when the electron beam hits the right edge of the line and moves back to the left to draw the next one). For example, the 2600 only had enough memory to keep a single line of pixels; you had to set up that line, wait for the H-blank, and then replace it with a new one. These systems also had very limited sprite capacity per scanline (e.g. 8 sprites on the NES), which had to be multiplexed by line; if more than 8 sprites happened to fall on one horizontal line, you'd get the infamous sprite flicker. Effects like the screen waving left and right, or faux parallax scrolling on single-layer systems (the NES, for example), were achieved by scrolling the screen a different amount on every H-blank, so technically you got screen tearing in every single line.
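A small C sketch of that per-line scroll trick, applied to a plain pixel buffer for illustration. On the real hardware the per-line value would be written to a scroll register during each H-blank instead of shifting pixels in software.

```c
#include <math.h>
#include <stdint.h>

#define W 256
#define H 240

/* Give every scanline its own horizontal scroll amount: a deliberate
 * "tear" on every line, which the eye reads as a wave (or, with a few
 * fixed bands, as faux parallax). */
void wavy_scroll(const uint8_t src[H][W], uint8_t dst[H][W], int frame) {
    for (int y = 0; y < H; y++) {
        int scroll = (int)(8.0 * sin((y + frame) * 0.05));
        for (int x = 0; x < W; x++)
            dst[y][x] = src[y][(x + scroll + W) % W];
    }
}
```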
 

EvilBoris

Prophet of Truth - HDTVtest
Verified
Oct 29, 2017
16,683
It was just as prevalent.
Overscan hid some of it, and in PAL land the black bars we often had hid some more.
 

Deleted member 9479

User requested account closure
Banned
Oct 26, 2017
2,953
lol. The first time I ever witnessed screen tearing was on my huge-ass, 40-pound (it wasn't that heavy, but the awkward size made it feel that way) 21" CRT, right after installing a new GPU at a LAN party in the early 2000s. I was convinced I'd broken the monitor in transit to the party. Oh, how young and naive I was.
 

HBK

Member
Oct 30, 2017
7,978
Essentially yes.

Modern games generally render a frame as an image, and then tell the system to output that image to the display. When the game gets done rendering the next frame, it can tell the system to output this new image instead. Screen tearing is what happens if the system makes this switch while the first image is still being drawn to the display, so part of the display shows the old frame and part of the display shows the new frame.
"Vsync" means that the system only makes the switch if it's not in the process of sending a frame out.

In the 2D era, most consoles worked a bit differently. A game would basically tell the graphics hardware in the console what the scene was: what background image(s) to use, what sprites to draw where, etc. The console would then create the image pixel-by-pixel as the data is being sent to the display. There's no "framebuffer" containing a finished version of the frame.
As long as the scene isn't changed while it's being drawn out, the "vsync" happens implicitly. If you could change the scene while it's being output - say, by shifting the background by some amount when the frame is only halfway done - then there would be tearing halfway down the image.
Quoting for truth.

CRTs work the same as LCDs with regard to tearing and vsync.

But tearing is the result of a framebuffer being swapped outside of the vsync window, so for it to happen you need to have a framebuffer in the first place. Old cartridge games didn't have tearing (unless done purposely) because they didn't have any framebuffer (the consoles themselves barely had any RAM). The image was basically constructed in "real time" as the electron beam scanned the screen, and all of this was essentially 100% hardware-driven, so while you could have slowdowns if the CPU had to compute too many objects, the "GPU" would always render the scene at full framerate, skipping objects (i.e. not drawing them) as necessary, on a line-by-line basis.

Basically.
 

RestEerie

Banned
Aug 20, 2018
13,618
Oh, absolutely. Seen my fair share of screen tearing in those old Quake 3 engine games on a CRT.
 

HBK

Member
Oct 30, 2017
7,978
It should be noted that screen tearing is bad, like Geneva Conventions level bad. That's why most games try to avoid tearing at all costs, and why old games usually didn't even consider allowing screen tearing to squeeze out a few more "frames" per second, due to how BAD it is.
 

Het_Nkik

One Winged Slayer
Member
Oct 27, 2017
3,403
Tearing was significantly less perceptible on a CRT. It was still technically there, and you'll hear a few people on here claiming they saw it all the time, but I played games with vsync off all the time back on CRT monitors and it rarely showed up. I expect that was due to the limited pixel persistence and the natural scanning of the display. It also didn't hurt that CRT monitors usually ran above 60 Hz, although something like 360 games on a CRT had (hard-to-detect) tearing.

I'm super sensitive to tearing on an LCD. I can't abide it. I can't play without vsync unless I have FreeSync/G-Sync (and then it's basically taken care of for me).
This is crazy. I remember Metal Gear Solid 3 for PS2 had a ton of screen tearing on my CRT. Stands out because it's the first game I ever saw it in.
 
Aug 30, 2020
2,171
This is crazy. I remember Metal Gear Solid 3 for PS2 had a ton of screen tearing on my CRT. Stands out because it's the first game I ever saw it in.

This is crazy. Being able to spot tearing on interlaced output requires supernatural eyes. Perhaps you were seeing something else that you later identified as tearing once you saw tearing for real for the first time. I think that's what most people have experienced.

Virtually nobody talked about screen tearing in the CRT days. Now some people on this forum act like "oh yeah, it was always there." But more likely it's false memories from decades of LCD use.
 

Het_Nkik

One Winged Slayer
Member
Oct 27, 2017
3,403
This is crazy. Being able to spot tearing on interlaced output requires supernatural eyes. Perhaps you were seeing something else that you later identified as tearing once you saw tearing for real for the first time. I think that's what most people have experienced.

Virtually nobody talked about screen tearing in the CRT days. Now some people on this forum act like "oh yeah, it was always there." But more likely it's false memories from decades of LCD use.
I was using the same Sony Wega Trinitron HD CRT to do 100% of my video game playing from 2006 until earlier this year. I finally buckled and threw the thing out in January and bought a 4K TV. I legit do not even have a YEAR of playing on LCDs, much less decades.

If you want to argue it's an interlaced/progressive-scan thing, I spent the majority of the 360 and PS4 years playing at 1080i, which was the TV's highest output, and still saw plenty of screen tearing.

I don't know what to tell you. Either HDTV CRT tech is different from standard CRT tech and makes screen tearing visible in both interlaced and progressive-scan images, or you're wrong. Because I know what I've been seeing for the past 14 years.