But what about 2D games? How come they never suffered from screen tearing? Vsync?
Essentially yes.
Modern games generally render a frame as an image, then tell the system to output that image to the display. When the game finishes rendering the next frame, it tells the system to output the new image instead. Screen tearing is what happens if the system makes this switch while the first image is still being drawn to the display, so part of the display shows the old frame and part shows the new one.
"Vsync" means that the system only makes the switch if it's not in the process of sending a frame out.
In the 2D era, most consoles worked a bit differently. A game would basically tell the graphics hardware in the console what the scene was: what background image(s) to use, what sprites to draw where, etc. The console would then create the image pixel-by-pixel as the data was being sent to the display. There was no "framebuffer" containing a finished version of the frame.
As long as the scene isn't changed while it's being drawn out, the "vsync" happens implicitly. If you could change the scene while it's being output - say, by shifting the background by some amount when the frame is only halfway done - then there would be tearing halfway down the image.
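The same toy style of sketch for the 2D case (the scroll register and background strip are invented for illustration): each scanline is composed from whatever the scene state is at the moment that line goes out, so changing the state mid-frame shears the image.

```python
# Toy model of framebuffer-less 2D hardware: each scanline is built from
# the current scene state (here, a horizontal scroll offset into a
# background strip) right as it is sent to the display.

WIDTH, HEIGHT = 8, 6
background = list(range(16))            # hypothetical tiled background row

scene = {"scroll_x": 0}

def draw_scanline(y):
    """Compose one line from the background using the scroll as it is NOW."""
    x0 = scene["scroll_x"]
    return [background[(x0 + x) % len(background)] for x in range(WIDTH)]

frame = []
for y in range(HEIGHT):
    if y == HEIGHT // 2:
        scene["scroll_x"] = 4           # scroll changed while drawing!
    frame.append(draw_scanline(y))

# Top half uses the old scroll, bottom half the new one: the background
# visibly shifts halfway down the image, just like a tear.
print(frame[0])   # drawn with scroll_x = 0
print(frame[-1])  # drawn with scroll_x = 4
```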