night814

One Winged Slayer
Member
Oct 29, 2017
15,036
Pennsylvania
It and OoT were so good at everything else that the frame rate was irrelevant; a lot of people wouldn't have noticed or cared
 
Nov 8, 2017
13,099
The games have a 16.67fps cap in PAL regions bro.

I have MM on Virtual Console and on 3DS, but I can't imagine I'll actually finish the game until the decompilation project is completed and I can play a native PC port.
 
May 15, 2019
2,448
I think CRTs handled odd framerates better than modern TVs do. Definitely still prefer the original, rough framerate and all, to the butchered 3DS remake. Just fix the framerate and up the resolution of the N64 version and give me that.
 

Many

Member
Sep 17, 2018
566
I just watched a video of Ocarina running at 30 and 60 fps, and it just looks weird, Link looks like he's floating. 20 fps for life.
 

Larrikin

Member
Oct 25, 2017
2,712
Wait... you mean people have been playing this unplayable game for decades? And I thought gamers had standards.
 

Fat4all

Woke up, got a money tag, swears a lot
Member
Oct 25, 2017
92,682
here
no duh

if i hear a game is running at 20 fps i know its either on n64 or switch
 

logan_cadfgs

Member
Oct 28, 2017
945
Yeah, both N64 Zelda titles had 20fps caps.
Peace Walker on PSP was capped at 20fps, too. They were all critical darlings regardless!

There's gotta be more examples, right? I thought I remembered PS1 Tony Hawk being capped at 20fps, but I checked and it's actually capped at 30fps ... it just can't get there most of the time 😂
 

Bjomesphat

Member
Nov 5, 2017
1,820
I had high end PCs growing up in the 90s so I experienced high frame rates with 3D games.

Of course I only realized this in retrospect. At the time, frame rates never even crossed my mind and didn't bother me in the slightest. It just felt like some games on PC were "faster", but I never made the connection it was a performance thing.

The only time I really took note was playing Star Fox on SNES and then going to N64.
 

Unicorn

One Winged Slayer
Member
Oct 29, 2017
9,528
It was something that felt sluggish, or that movement was slow. I think that was the language my friends and I used until fps was more commonplace in discourse.
 

Servbot24

The Fallen
Oct 25, 2017
43,070
Yep and it's one of the best games of all time. FPS rarely matters to how good a video game is.
 

Bonfires Down

Member
Nov 2, 2017
2,814
The frame rate was really bad. The only reason it wasn't a complete disaster is because the gameplay rarely requires split second reactions.
 

Mau

Member
Oct 25, 2017
2,865
I don't recall that at all but I only started appreciating 60fps during the PS5 gen lol
 

deep_dish

Member
Oct 25, 2017
941
You know how sometimes you see footage from the standard definition era of TV and ask "how did anyone watch this?"

ya, that's the N64
 

demondance

Member
Oct 27, 2017
3,808
No one noticed, no one cared, game was amazing.

Lots of people who loved the game noticed back then. The early 3D consoles were a transition to compromised gameplay on a technical level, and it led to a major "2D is inherently better than 3D" thing for years.

Tbh, I don't remember anyone caring about frame rate in games back then.

If you were online you would've seen it. The language wasn't in Digital Foundry terms but it was there. Especially for arcade and PC gamers.
 
Last edited:

Dolce

Member
Oct 25, 2017
14,236
You know how sometimes you see footage from the standard definition era of TV and ask "how did anyone watch this?"

Standard definition television tends to look bad on modern TVs because modern TVs suck and aren't able to handle interlaced video among many, many other things. They didn't actually look like that on TVs of their era.
 

BriGuy

Banned
Oct 27, 2017
4,275
The biggest thing I remember people complaining about in those days was pop-in. Otherwise though, if it was in 3D, we were generally happy.
 

JershJopstin

Member
Oct 25, 2017
5,332
It's weird to see Majora's Mask singled out when this was true for Ocarina of Time as well. Majora's Mask did dip more, but they shared the 20fps limit.

Obviously CRTs helped smooth things out.
Standards change. Majora's Mask also had a resolution of 480i (afaik).
240p on Nintendo 64, 480i in all emulated rereleases (including the GameCube version OP played). No idea if it's different for PAL. I believe it can run at 480p on GameCube and Virtual Console as well with the right cables.

I did find one claim that it switches to 480i when you open the Bomber's notebook so they could fit more readable text on screen at once. I don't know for sure if that's true.

no duh

if i hear a game is running at 20 fps i know its either on n64 or switch
Lol

I don't think there are Switch games capped to 20fps, though.
 

mikehaggar

Developer at Pixel Arc Studios
Verified
Oct 26, 2017
1,379
Harrisburg, Pa
The biggest thing I remember people complaining about in those days was pop-in. Otherwise though, if it was in 3D, we were generally happy.

I loathe pop-in of any kind. I can't believe it still exists (LOD, shadows, filtering, etc...). If you had told me in 1997 it would still be a thing in 2021 I wouldn't have believed you.

More on topic, yeah, low framerates were basically just accepted during that gen. I certainly noticed it and understood why it was happening, but it was so common that I didn't give it much thought.
 

Diogo Arez

One Winged Slayer
Member
Oct 20, 2020
17,625
It's weird to see Majora's Mask singled out when this was true for Ocarina of Time as well. Majora's Mask did dip more, but they shared the 20fps limit.

Obviously CRTs helped smooth things out.

240p on Nintendo 64, 480i in all emulated rereleases (including the GameCube version OP played). No idea if it's different for PAL. I believe it can run at 480p on GameCube and Virtual Console as well with the right cables.

I did find one claim that it switches to 480i when you open the Bomber's notebook so they could fit more readable text on screen at once. I don't know for sure if that's true.


Lol

I don't think there are Switch games capped to 20fps, though.
Deadly Premonition 2 isn't capped at that but if it ever reaches 30fps it's a miracle
 

chiller

Member
Apr 23, 2021
2,777
There's a whole patch out there that undoes quite a lot of the changes made in Majora's Mask 3D that ruin it for you all. It's not overly complicated to set up / use either.

I actually prefer the remake, even with the changes. Please feel free to flame me because of this.
 

ZeroMaverick

Member
Mar 5, 2018
4,433
If no one would have ever pointed out the difference and "significance" of frame rate, I probably never would have cared to this day. I don't remember this game running poorly.
 

Conkerkid11

Avenger
Oct 25, 2017
13,949
There's a whole patch out there that undoes quite a lot of the changes made in Majora's Mask 3D that ruin it for you all. It's not overly complicated to set up / use either.

I actually prefer the remake, even with the changes. Please feel free to flame me because of this.
Thanks for reminding me that I installed this and didn't play much of it. I've gotta play that all the way through one of these days.

For those unaware: https://github.com/leoetlino/project-restoration

Edit: Oh wow, it improves the Elegy of Emptiness during the Stone Tower portion too. That's awesome.
 

Pargon

Member
Oct 27, 2017
11,996
No one noticed, no one cared, game was amazing.
You never noticed or cared.
I remember desperately wanting an N64 as a kid that would never have been able to get one - especially when reading previews of Ocarina of Time in magazines.
I then saw Ocarina of Time at a friend's place, running at 16.67 FPS (50/3) since I lived in a PAL region, and thought it looked like SHIT.
I remember watching friends playing Star Fox 64 multiplayer and getting extremely motion sick from it. I couldn't play it at all. Golden Eye/Perfect Dark were unwatchable.

I'm 100% going to get flamed for my comment buuut...
Back when people were rarely complaining about those things
CRTs took the edge off, and game speed used to be locked to the frame rate.
So if the frame rate dipped the game slowed down, rather than the game logic running at full speed with the graphics dropping frames, as they do today.
But that's also the reason why a game can run at 30 FPS on console and have the frame rate unlocked on PC, rather than running at 4× speed when you try to play it at 120 FPS.
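The difference between the two update models is easy to show in code. This is just an illustrative sketch, not from any real engine; the function names and numbers are made up for the example:

```python
def update_frame_locked(position, speed_per_frame):
    # Old-school: logic advances a fixed amount per rendered frame.
    # If rendering drops from 60fps to 30fps, the whole game runs at half speed.
    return position + speed_per_frame

def update_delta_time(position, speed_per_second, dt):
    # Modern: logic scales by elapsed time, so game speed is independent
    # of frame rate. At 120fps, dt is simply smaller per step.
    return position + speed_per_second * dt

# Simulate one second of movement under each model.
pos_locked_60 = 0.0
for _ in range(60):                    # 60 frames rendered in one second
    pos_locked_60 = update_frame_locked(pos_locked_60, 5.0)

pos_locked_30 = 0.0
for _ in range(30):                    # same second at 30fps: half the distance
    pos_locked_30 = update_frame_locked(pos_locked_30, 5.0)

pos_dt = 0.0
for _ in range(120):                   # 120fps with delta time: same distance
    pos_dt = update_delta_time(pos_dt, 300.0, 1 / 120)
```

The frame-locked loop covers half the distance at 30fps, which is exactly the "game slows down with the frame rate" behavior described above, while the delta-time loop covers the same distance regardless of frame rate.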
  • With a CRT, the amount of motion blur you see is determined by the screen itself, since the image is flashed once per refresh. You might have 0.5–2.0ms of motion blur depending on the phosphors used, regardless of the frame rate.
  • LCD/OLED draws the image differently. They hold the image until the next one is drawn. That means the amount of motion blur you see is directly linked to the frame rate. 60 FPS will have 16.67ms of motion blur (1000/60). 30 FPS will have 33.33ms of motion blur (1000/30). The PAL version of Ocarina of Time would have 60ms of motion blur. To achieve 0.5ms with a sample-and-hold OLED, you would need a 2000Hz display, and a game running at 2000 FPS.
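The sample-and-hold figures above all come from one formula: each frame is held on screen for 1000/fps milliseconds. A quick sketch of the arithmetic (the CRT blur target of 0.5ms is the phosphor figure quoted above):

```python
def hold_time_ms(fps):
    # On a sample-and-hold display, perceived motion blur during eye tracking
    # is roughly the time each frame stays on screen.
    return 1000.0 / fps

blur_60  = hold_time_ms(60)        # ~16.67 ms
blur_30  = hold_time_ms(30)        # ~33.33 ms
blur_pal = hold_time_ms(50 / 3)    # PAL OoT at 16.67fps -> ~60 ms

# Frame rate needed to match ~0.5ms of CRT phosphor blur:
fps_for_crt_like = 1000.0 / 0.5    # 2000 fps on a 2000Hz display
```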
That's why black frame insertion (BFI) is a sought-after feature of newer displays.
Rather than holding the frame until the next is drawn, inserting black frames in-between each image reduces the amount of motion blur you will see, without requiring higher frame rates; but it does cause the screen to flicker.
Nothing is quite like a CRT, though. With OLED, BFI works well to reduce motion blur, but not by as much as a CRT, and the flicker is much worse than a CRT.
120Hz on a CRT is relatively flicker-free, while there is still obvious flicker on an OLED in brighter images (but not much in darker ones).

I believe this plays a big role in why so many people here say that the old Sonic games are bad, for example.
  • On a CRT you had near-zero latency, and no motion blur at all.
  • On a typical LCD, or even OLED, you have higher latency, and a ton of motion blur in comparison. Even extended to 16:9 it's almost like you can see less ahead of you and have less time to react, due to the motion blur and input lag (though latency is very good on OLED now).
I get a chuckle when people nitpick over a frame or two here and there in DF threads, etc. It's a definite "back in my day" situation... but man, Majora's and Ocarina ran really badly, and publications just handed out 9's and 10's like Halloween candy, and we absolutely ate these games up
Since game speed is tied to frame rate, when things slowed down you wouldn't get the same stuttering that you do today.
And they were built differently. Everything was v-sync locked - so 60 FPS games had to hold 60 FPS or else they'd drop to 30.

Today, that's not the case.
A game can run at 59 FPS at 60Hz, rather than instantly dropping from 60 FPS to 30 FPS - so it's not something critical for a developer to fix, like it used to be.
59 FPS at 60Hz is a minor but constant stutter in a modern game. Perhaps not a bad one, depending on how you feel about it, but it's always there.
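Why v-sync used to force that 60-to-30 cliff can be sketched with a little arithmetic. This assumes classic double-buffered v-sync, where a finished frame can only be shown on a refresh boundary (every 16.67ms at 60Hz):

```python
import math

REFRESH_MS = 1000.0 / 60   # one 60Hz refresh interval

def refreshes_held(frame_time_ms):
    # A frame that misses its refresh waits for the next one, so a 17ms
    # frame occupies two refresh intervals instead of one.
    return math.ceil(frame_time_ms / REFRESH_MS)

eff_fast = 60 / refreshes_held(16.0)   # frame fits in one refresh -> 60fps
eff_slow = 60 / refreshes_held(17.0)   # barely misses -> held 2 refreshes -> 30fps
```

Missing the refresh by even a fraction of a millisecond halves the effective frame rate, which is exactly why developers had to hold 60 or fall back to 30 on that hardware.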

And 60 FPS on a CRT was so smooth and clear.
Smoother than 60 FPS looks on your modern flat-panel TV. Even OLED.

However modern displays and systems do have one advantage: Variable Refresh Rate.
If you have a VRR display, dropping to 59 FPS is no big deal any more.
The display will update at 59Hz rather than 60Hz and you'll never see it - it will look no different than a game running at a constant 60 FPS.
VRR adds quite some leeway, and enables frame rates above 60 FPS too.

Agreed. FPS, proper RGB coloring, HDR settings… this knowledge is a burden.
One great thing about most CRT televisions back then was that RGB inputs would essentially bypass all the internal picture controls - so the picture was perfect* without doing anything.
And before you say "yeah, but who knew about RGB back then?" nearly everyone that I knew with a PSX had it hooked up via RGB SCART.
And this was not some videophile thing. It's because many televisions wouldn't display composite images from that generation of consoles correctly and you'd get a black and white image - especially with imported games.
I was using RGB back on the SNES as a kid, too. I had no idea what it was, I just knew that it looked better.

* The exception is that white balance controls were non-existent for televisions back then, at least outside of a service menu or physical controls inside the chassis.

And today people have frothing meltdowns over something like a framedrop in Mario Kart 8. Like… how did some of you survive gen 5?
I never owned a Gen 5 console.
I went from SNES to PC to Dreamcast (while also having PC around).

People nitpick the tiniest insignificant details in the 3DS version but ignore the framerate boost that makes it unarguably better.
I think the 3DS versions of both games look pretty ugly, to be honest. Especially the Link character model in OOT.
And I hear there are a lot of gameplay changes in Majora's Mask that people dislike - though there is a fan patch to fix some of them.
It's why I'm hopeful that we'll eventually see a decompiled Ocarina of Time port, similar to Mario 64. I want to play the original game, but at a higher frame rate.
 
Last edited:
Oct 27, 2017
3,363
I honestly didn't notice or care back then, or care about framerate in general. Maybe the 50Hz European thing had something to do with it?

Are the Wii shop versions of OOT and MM also 20FPS or did they boost the framerate for those? I don't remember being too bothered by those versions either.
 

Hero_of_the_Day

Avenger
Oct 27, 2017
17,329
I could never even tell and still can't really. Then again I'm one of those who can't tell the difference between 30 and 60.

I was like that at the time. And the first time I played Gears of War on the PC, it felt too fast at 60. Now, I vastly prefer 60, and the 64 stuff feels like a crime against humanity.

I'm afraid to get into 120fps. 60 might feel bad then.
 

OnionPowder

Banned
Oct 25, 2017
9,323
Orlando, FL
You never noticed or cared.
I remember desperately wanting an N64 as a kid that would never have been able to get one - especially when reading previews of Ocarina of Time in magazines.
I then saw Ocarina of Time at a friend's place, running at 16.67 FPS (50/3) since I lived in a PAL region, and thought it looked like SHIT.
I remember watching friends playing Star Fox 64 multiplayer and getting extremely motion sick from it. I couldn't play it at all. Golden Eye/Perfect Dark were unwatchable.



Very well said. CRTs were just a different breed when it came to playing games. I still try to snag one from a pawn shop every once in a while to hook something up and play with the near-zero input latency and beautiful scanlines. No post-processing scanline filter on an LCD comes close to the natural scanlines of a CRT, which so many designers were keenly aware of when creating the art and designing the games. Plus, so many games were crazy hard because they relied on you having zero latency and being able to make really tight inputs. Something that feels off on an LCD compared to the original experience.

So many people revisit games from the past and wonder why it feels so flat and boring now. It's the display!