
Veliladon

Member
Oct 27, 2017
5,557
So John Linneman has a new Doom video about the latest patch. Both the video and the patch are awesome. The only thing that's not awesome about the patch is one item in particular, and John expertly describes it: upscaling the frame to the display.

Watch the video first.



With apologies to John for ripping off his hard work of creating this example in his video, the effect I'm looking at in particular is this:

86ivLYF.jpg


John mentions a linear interpolation, and I was thinking about a way to resize the image from the native rendered 16:10 to the intended 4:3 with fewer artifacts for pixel art.

Take the Switch port's 960x600 (16:10) frame buffer as an example. We need to get that onto either a 1280x720 or a 1920x1080 display. If we do that with some sort of bilinear or bicubic upscaling, we're going to get blur. My preferred method would be an integer supersample followed by a bicubic downsample.

Now I don't have a 960x600p frame buffer so I'm going to be using a 320x200 image as an example.

This is the original image:

eefe8Ea.png


This is a bicubic upscale from 320x200 to 1440x1080:

vLY8g4h.jpg


It's the right aspect ratio but blurry as all fuck. What they should do is first a straight pixel multiply on each axis. For a 320x200 to 1440x1080p conversion this would first involve a 45x upscale to 14,400 pixels horizontal and a 54x upscale on the vertical. We're left with a 14,400 x 10,800 pixel image. At that point we can bicubic back down to 1440x1080p and get this:

g8b6NeJ.jpg


*mic drop*
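The whole recipe fits in a few lines of Python if you want to play with it. This is only a sketch: a box average stands in for the bicubic downsample, and a toy 8x5 (16:10) source keeps the intermediate buffer tiny, but the ratios are the real ones: 45x/54x up, then /10 down, landing on a 4:3 result (36x27 here, 1440x1080 at full size).

```python
def integer_then_downsample(px, mx, my, down):
    # nearest-neighbour integer multiply: replicate each pixel mx wide, my tall
    big = [[v for v in row for _ in range(mx)] for row in px for _ in range(my)]
    w, h = len(big[0]) // down, len(big) // down
    # box-average downsample (a stand-in for the bicubic step)
    return [[sum(big[y * down + j][x * down + i]
                 for j in range(down) for i in range(down)) / down ** 2
             for x in range(w)] for y in range(h)]

src = [[(x + y) % 2 for x in range(8)] for y in range(5)]  # 8x5 checkerboard
out = integer_then_downsample(src, 45, 54, 10)
print(len(out[0]), len(out))  # 36 27
```

With a real image library or a GPU shader you'd swap the box average for an actual bicubic; the structure is the same.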

So what have we done here? Let's take a look at the bottom right corner of the left wall.

Evc3rgO.jpg


Instead of smearing the entire image with a bicubic interpolation, or putting up with jagged nearest neighbour, we've interpolated only at the smaller resolution's old pixel boundaries. Where a nearest-neighbour upscale has to decide between 4 or 5 output pixels for each old pixel, we get 4 clean pixels for each, plus a 5th pixel on the boundary that is interpolated between the two colors. Maximum sharpness, because we're not smearing the difference between two colors over 9 pixels: it's an interpolated stairstep instead.
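That boundary behaviour is easy to see in one dimension. Two source pixels, black then white, at the 4.5x case: a 45x nearest-neighbour multiply, then a /10 box average (standing in for the bicubic). Only the one output pixel straddling the old pixel boundary blends:

```python
row = [0.0, 1.0]                       # two source pixels: black, white
big = [v for v in row for _ in range(45)]                  # 90 samples
out = [sum(big[i * 10:(i + 1) * 10]) / 10 for i in range(len(big) // 10)]
print(out)  # [0.0, 0.0, 0.0, 0.0, 0.5, 1.0, 1.0, 1.0, 1.0]
```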

Now going from 960x600 would involve the same thing, but with a 15x upscale across the horizontal and 18x on the vertical. Then you'd do the bicubic back down to 1080p; the 14,400 x 10,800 intermediate works out to about 0.58GB per frame, or roughly 34.8GB/sec of bandwidth at 60fps. Going to 720p is less scary than it sounds: the lowest number with both 720 and 600 as a factor is 3,600, so a 6x vertical upscale to 960x3,600 would do before the bicubic back down to 960x720 (the horizontal is already at 960). Alternatively, the 14,400 x 10,800 intermediate divides down to 960x720 by an exact 15x on both axes, so the same buffer could serve both outputs.
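For scale, the buffer sizes are easy to sanity-check. Assumptions: 4 bytes per pixel, 60fps, and "GB" meaning GiB (2**30 bytes). The two intermediates compared are the 1080p path's 14,400 x 10,800 buffer and a 960x3,600 one, which covers a 960x720 target since 3,600 is the least common multiple of 600 and 720:

```python
# Bandwidth sanity check: bytes per frame for an intermediate buffer,
# assuming 4-byte RGBA pixels; "GB" here means GiB (2**30 bytes).
def gb_per_frame(w, h, bpp=4):
    return w * h * bpp / 2 ** 30

for w, h in [(14400, 10800), (960, 3600)]:
    f = gb_per_frame(w, h)
    print(f"{w}x{h}: {f:.2f}GB/frame, {f * 60:.1f}GB/sec at 60fps")
```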

Anyway, those are my thoughts about how to implement pixel art upscaling from integer multiplied native presentations.

Love the videos, John. Keep making them.
 

Jegriva

Banned
Sep 23, 2019
5,519
Expert DosBOX users who have to deal with 320x200 every day can tell you that the magic resolutions are 1280x1000 in the case of a 1080p screen (4x horizontal, 5x vertical, with an almost-1.33:1 aspect ratio, but totally plausible since in the CRT era many screens weren't properly calibrated to begin with), and 1600x1200, which is exactly 1.33:1.
 

Sinatar

Member
Oct 25, 2017
5,684
Expert DosBOX users who have to deal with 320x200 every day can tell you that the magic resolutions are 1280x1000 in the case of a 1080p screen (4x horizontal, 5x vertical, with an almost-1.33:1 aspect ratio, but totally plausible since in the CRT era many screens weren't properly calibrated to begin with), and 1600x1200, which is exactly 1.33:1.

1280x960.
 

Pargon

Member
Oct 27, 2017
11,996
RetroArch's "pixellate" shader is a great solution for this.
Here's an older post of mine on the subject, in the context of displaying retro games that are rendered with non-square pixels:

You should pretty much never be using unfiltered pixels.
Unfiltered pixels mean that you are either displaying the game in the wrong aspect ratio - likely with black borders instead of filling the height of the screen, or there will be pixel crawl/flickering artifacts when anything moves across the screen.

At the very minimum, you should be using something like the Pixellate shader.
This shader retains virtually all of the sharpness of unfiltered pixels, but prevents flickering artifacts caused by non-integer scaling - whether that is to fill the height of the display, or to correct the aspect ratio.

Here's an example of Castlevania: Symphony of the Night, scaled to 1080p.
Unfiltered pixels - displays in the wrong aspect ratio, and the image does not fill the display:
sotn-unfiltered-niesc.png


Bilinear filtering - displays in the correct aspect ratio and fills the display, but the image is significantly blurred:
sotn-linear-bilinear-wldoy.jpg

Note: this is bilinear filtering in linear light rather than gamma light, which most filters use. Bilinear filtering in gamma light will dull the picture (look at the health counter).

Pixellate shader - displays in the correct aspect ratio and fills the display with minimal blurring:
sotn-pixelate-wkifq.png

If you zoom in close on this image, you can see that the blurring is generally constrained to a single pixel width around every source pixel, and at typical TV/monitor distances it should not be noticeable.
The higher the output resolution is, the better the result will be.
You have to wonder if the results might have been better on Switch by rendering natively at 320x200 and using proper scaling/filtering to 960x720 (1440x1080 docked), rather than rendering at a higher resolution and having to use improper scaling. It's not like DOOM needs the high-resolution rendering.

960 is not cleanly divisible by 200.
You use 1280x1000 because the slight difference in aspect ratio (1.28:1) is far less noticeable than non-integer scaling, if proper filtering is not available.
 

Leo-Tyrant

Member
Jan 14, 2019
5,083
San Jose, Costa Rica
Expert DosBOX users who have to deal with 320x200 every day can tell you that the magic resolutions are 1280x1000 in the case of a 1080p screen (4x horizontal, 5x vertical, with an almost-1.33:1 aspect ratio, but totally plausible since in the CRT era many screens weren't properly calibrated to begin with), and 1600x1200, which is exactly 1.33:1.

I may have been looking for this recommendation for years. Thank you friend.

If I have a 1080 screen, then I would have DosBox (or 320x200 content) running at 1280x1000 and that would be the best possible ratio?

What would be the resolution setting for 4K screens?
 

infinityBCRT

Member
Nov 1, 2017
1,132
So John Linneman has a new Doom video about the latest patch. Both the video and the patch are awesome. The only thing that's not awesome about the patch is one item in particular, and John expertly describes it: upscaling the frame to the display.

Watch the video first.



With apologies to John for ripping off his hard work of creating this example in his video, the effect I'm looking at in particular is this:

86ivLYF.jpg


John mentions a linear interpolation, and I was thinking about a way to resize the image from the native rendered 16:10 to the intended 4:3 with fewer artifacts for pixel art.

Take the Switch port's 960x600 (16:10) frame buffer as an example. We need to get that onto either a 1280x720 or a 1920x1080 display. If we do that with some sort of bilinear or bicubic upscaling, we're going to get blur. My preferred method would be an integer supersample followed by a bicubic downsample.

Now I don't have a 960x600p frame buffer so I'm going to be using a 320x200 image as an example.

This is the original image:

eefe8Ea.png


This is a bicubic upscale from 320x200 to 1440x1080:

vLY8g4h.jpg


It's the right aspect ratio but blurry as all fuck. What they should do is first a straight pixel multiply on each axis. For a 320x200 to 1440x1080p conversion this would first involve a 45x upscale to 14,400 pixels horizontal and a 54x upscale on the vertical. We're left with a 14,400 x 10,800 pixel image. At that point we can bicubic back down to 1440x1080p and get this:

g8b6NeJ.jpg


*mic drop*

So what have we done here? Let's take a look at the bottom right corner of the left wall.

Evc3rgO.jpg


Instead of smearing the entire image with a bicubic interpolation, or putting up with jagged nearest neighbour, we've interpolated only at the smaller resolution's old pixel boundaries. Where a nearest-neighbour upscale has to decide between 4 or 5 output pixels for each old pixel, we get 4 clean pixels for each, plus a 5th pixel on the boundary that is interpolated between the two colors. Maximum sharpness, because we're not smearing the difference between two colors over 9 pixels: it's an interpolated stairstep instead.

Now going from 960x600 would involve the same thing, but with a 15x upscale across the horizontal and 18x on the vertical. Then you'd do the bicubic back down to 1080p; the 14,400 x 10,800 intermediate works out to about 0.58GB per frame, or roughly 34.8GB/sec of bandwidth at 60fps. Going to 720p is less scary than it sounds: the lowest number with both 720 and 600 as a factor is 3,600, so a 6x vertical upscale to 960x3,600 would do before the bicubic back down to 960x720 (the horizontal is already at 960). Alternatively, the 14,400 x 10,800 intermediate divides down to 960x720 by an exact 15x on both axes, so the same buffer could serve both outputs.

Anyway, those are my thoughts about how to implement pixel art upscaling from integer multiplied native presentations.

Love the videos, John. Keep making them.

No one will probably believe me-- but I was the one who introduced this method to Razoola-- the guy who originally hacked CPS2 and was releasing a customized emulator which supported it IIRC. As far as I know it was the first emulator that did this. At first he said it would never work-- and then quietly in the next release he actually put this method in, and over time more and more emulators adopted it.
 

Sinatar

Member
Oct 25, 2017
5,684
But how do you scale perfectly 200 points into 960...?

In case of a 720p screen like the Switch, the lesser evil would be... 960x800 cropped?

When playing on a CRT, while Doom is rendered at 320x200, it gets automatically resized to 4:3, so if you want it to look like it should on a CRT with a proper 4:3 aspect ratio, you want 1280x960 and you want Aspect Correction turned on (which simulates CRT behavior with non-standard resolutions).
 

Pargon

Member
Oct 27, 2017
11,996
I switched from pixellate to bandlimit-pixel for interpolation since it looked about the same but is faster. Recently I did a zoomed in comparison and it's actually sharper than pixellate too: http://www.framecompare.com/image-compare/screenshotcomparison/JME11NNU

Maister wrote an in-depth blog post about the shader here.
Looking at that comparison, it seems like the difference is that Pixellate was updated to support blending in linear-light, while bandlimit-pixel is using gamma-light for blending.

Doing that is faster, but blending in gamma-light means that you're averaging RGB values rather than blending colors as they would blend in real life, and it often ends up with dark/ugly borders between colors. That might appear "sharper" though, much like image sharpening darkens the edges of objects to create more contrast.
Blending in linear-light is more accurate to the source image. Bright details often end up appearing smaller than dark ones when you're blending in gamma-light.

When viewed at normal sizes, rather than zoomed in, pixels should appear to be more uniform in size when using pixellate.
It would be good if bandlimit-pixel could be updated to blend in linear-light though, as that may still be faster and produce better results.
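The dulling effect of gamma-light blending is easy to demonstrate with the standard sRGB transfer function. Averaging a black and a white pixel on raw 8-bit values gives a much darker result than converting to linear light, averaging, and converting back (the physically correct blend):

```python
# sRGB <-> linear conversions per the standard piecewise transfer function.
def srgb_to_linear(c8):
    c = c8 / 255.0
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def linear_to_srgb(v):
    s = v * 12.92 if v <= 0.0031308 else 1.055 * v ** (1 / 2.4) - 0.055
    return round(s * 255)

gamma_blend = (0 + 255) // 2  # naive average of the raw 8-bit values
linear_blend = linear_to_srgb((srgb_to_linear(0) + srgb_to_linear(255)) / 2)
print(gamma_blend, linear_blend)  # 127 188
```

That 127-vs-188 gap is exactly the "dark borders between colors" effect described above.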

When playing on a CRT, while Doom is rendered at 320x200, it gets automatically resized to 4:3, so if you want it to look like it should on a CRT with a proper 4:3 aspect ratio, you want 1280x960 and you want Aspect Correction turned on (which simulates CRT behavior with non-standard resolutions).
You're missing the point: when your option is either nearest neighbor or bilinear scaling, it's preferable to scale to 1280x1000 with nearest neighbor than to blur the entire image by scaling it to 1280x960. An aspect ratio of 1.28:1 is not that far removed from the ideal 1.33:1.
When you are using better techniques, like the pixellate or bandlimit-pixel shaders, by all means, scale the image directly to 1280x960.

EDIT: Here's a comparison between these options - though DOOM is not a game I'd be using DOSBox for, and may not be the best example.
 

Awakened

Member
Oct 27, 2017
506
Looking at that comparison, it seems like the difference is that Pixellate was updated to support blending in linear-light, while bandlimit-pixel is using gamma-light for blending.
I brought this up in a github issue; hunterk (AKA hizzlekizzle) says it should already be using linear-gamma. I'm kinda confused by the combined terminology there, but I guess since the first pass of the bandlimit-pixel shader preset is a linearize pass (as opposed to the linear-gamma-correct pass), that should be the linear-light blending? Based on your explanation I can see how my comparison looks like it's not using linear-light compared to pixellate though.
 

Brhoom

Banned
Oct 25, 2017
1,654
Kuwait
Expert DosBOX users who have to deal with 320x200 every day can tell you that the magic resolutions are 1280x1000 in the case of a 1080p screen (4x horizontal, 5x vertical, with an almost-1.33:1 aspect ratio, but totally plausible since in the CRT era many screens weren't properly calibrated to begin with), and 1600x1200, which is exactly 1.33:1.

And that's why I gave up on DOSBox scaling.
 
Nov 8, 2017
3,532
I'm assuming that the colored lighting from PSX DOOM is not part of this release or the new patch?
Nope, and I'd say there's about zero chance of this happening. The developers will just continue to pretend that DOOM was never improved with colours, nicer sound effects and a much better soundtrack, as they've done for over two decades now.
 

Jegriva

Banned
Sep 23, 2019
5,519
I may have been looking for this recommendation for years. Thank you friend.

If I have a 1080 screen, then I would have DosBox (or 320x200 content) running at 1280x1000 and that would be the best possible ratio?

What would be the resolution setting for 4K screens?
Yes, you should ideally play in a window, or center the game image in the middle of the screen (I use DOSBox ECE, which lets me do it). For 4K the suggestion is still 2560x2000, or 3200x2400 cropped, since 2880x2000 is 1.44:1.

When playing on a CRT, while Doom is rendered at 320x200, it gets automatically resized to 4:3, so if you want it to look like it should on a CRT with a proper 4:3 aspect ratio, you want 1280x960 and you want Aspect Correction turned on (which simulates CRT behavior with non-standard resolutions).
On a CRT monitor you should directly use 640x400; it's a legacy resolution many monitors still support today.

This thread made me interested in the resolution of Doom in the Xbox 360 release. When I get home, I'll check it out.
 

Pargon

Member
Oct 27, 2017
11,996
Yes, you should ideally play in a window, or center the game image in the middle of the screen (I use DOSBox ECE, which lets me do it). For 4K the suggestion is still 2560x2000, or 3200x2400 cropped, since 2880x2000 is 1.44:1.
Another option if you have an NVIDIA GPU is to create a custom DSR resolution and render at 3200x2400 with nearest neighbor - or some other integer scale that is both a multiple of 320x200 and a 4:3 aspect ratio.
EDIT: It would have to be higher resolution than that on a 4K display (4800x3600?) as I think DSR requires that both the horizontal and vertical resolution exceed the native resolution of the display.

That method should retain most of the sharpness of using nearest neighbor, but DSR will scale it to fit the display with a gaussian filter - which might be closer to the real-world result that you'd get on an actual CRT. Just make sure you keep the filter at its default of 33% (or increase to 50% if aliasing is a problem) rather than disabling it.

I brought this up in a github issue; hunterk (AKA hizzlekizzle) says it should already be using linear-gamma. I'm kinda confused by the combined terminology there, but I guess since the first pass of the bandlimit-pixel shader preset is a linearize pass (as opposed to the linear-gamma-correct pass), that should be the linear-light blending? Based on your explanation I can see how my comparison looks like it's not using linear-light compared to pixellate though.
I'll have to look at the shader myself when I get the chance.
 

liquidtmd

Avenger
Oct 28, 2017
6,129
Great video

Tbh all I want now on the port is the PSX music. I grew up with it, it was my first experience and finally playing it with the PC music, it's really jarring
 

mute

▲ Legend ▲
Member
Oct 25, 2017
25,062
I feel like for home console versions, at least in this scenario, the default option is always going to be to fill up the display as much as possible, for the same reasons The Simpsons is widescreen on Disney+.

Lots of 2D games offer 4x or 5x scaling as an option though, and something similar should be offered here as well, at least.
 

Pargon

Member
Oct 27, 2017
11,996
It's not the way I'd want to play DOOM, but I tested out a custom DSR resolution of 4800x3600 in DOSBox, and it works exactly as I thought.
The image fills the screen as much as possible, and without ugly scaling artifacts since DSR handles the scaling to fit the display rather than DOSBox.

I haven't updated DOSBox in years and was using the D-Fend Reloaded front-end, but for some reason it's being scaled to 4799x3600 and the center column of pixels is 1px narrower than the rest. You aren't really going to notice it at this scale though, and it may have been fixed by now.
Except for the very center column, every other random pixel I checked was exactly 15x18 as it should be.

dosboxqnjkx.png


Even with the DSR filter set to 50%, the output is not being blurred much on my 3440x1440 monitor.
With a 1080p or regular 1440p screen you could use 3200x2400 instead (DSR requires a minimum of 3440 horizontal resolution on a 3440x1440 display).
I do wish NVIDIA made it easier to create custom DSR resolutions, rather than having to modify the registry.

Custom resolutions like these are only required in programs that lack support for advanced scaling/filtering via shaders though.
And I still haven't had the chance to look at bandlimit-pixel yet.

Mostly I just had a succession of CRTs with that resolution.
I'm not aware of any CRTs which had a 5:4 aspect ratio. It was common with older flat panels though.
I think you were using the wrong resolution on your monitors.
 

kami_sama

Member
Oct 26, 2017
6,998
There's a huge issue with your method: the sheer amount of memory bandwidth it needs. Considering the Switch is already bandwidth starved, having a very large intermediate frame buffer in memory makes things worse.
I don't think Doom is especially memory intensive, but the issue is still the same.
 

Pargon

Member
Oct 27, 2017
11,996
I brought this up in a github issue; hunterk (AKA hizzlekizzle) says it should already be using linear-gamma. I'm kinda confused by the combined terminology there, but I guess since the first pass of the bandlimit-pixel shader preset is a linearize pass (as opposed to the linear-gamma-correct pass), that should be the linear-light blending? Based on your explanation I can see how my comparison looks like it's not using linear-light compared to pixellate though.
I had a chance to look at this now. Yes, it is blending in linear-light, and the results are better than pixellate.

EDIT: My screenshots loaded out of order and my original conclusion was wrong, as the bandlimit-pixel 0.50 and pixellate-linear screenshots were mixed up.
Bandlimit-pixel is universally superior to pixellate. Originally I thought the smoothness setting had to be increased from 0.50 to 0.60 for bandlimit-pixel to produce results matching pixellate, due to that mix-up.


The DOSBox ECE build has pixel-perfect scaling support (it can add black bars depending on your output resolution, but that's still better than nothing).
Ah, I checked and it is DOSBox ECE that I was running, but an older build (4191, current is 4301).
It was also configured to use openglnb (OpenGL, No Bilinear filter) which is why it was outputting 4799x3600.
Changing that to openglpp (OpenGL, Perfect Pixel) output an exact 4800x3600.
I noticed in the logs that it was having to scale 640x400 (possibly DOOM's menus?) which requires a 6400x4800 resolution on this display. You could get away with 3200x2400 on anything 2560x1440 or lower.

The "pixel perfect" setting in ECE seems to pick a best-fit aspect ratio, so it may not be perfect. But that's still a good option if you can't brute-force higher resolutions and scale with DSR.
The difference at 1440p is minor, but how far off the aspect ratio is will depend on the display resolution.
It would be 80 pixels lost at 1080p, rather than 40 at 1440p, or 120 pixels at 720p.
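Those lost-pixel figures follow straight from integer division. A quick check, assuming a 200-line source and letterboxing to the largest integer multiple that fits each display height:

```python
# Largest whole multiple of a 200-line source that fits each display height,
# and the rows left unused (the letterboxing cost of integer-only scaling).
def integer_fit(src_h, disp_h):
    n = disp_h // src_h
    return n, disp_h - n * src_h

for disp_h in (720, 1080, 1440):
    scale, unused = integer_fit(200, disp_h)
    print(f"{disp_h}p: {scale}x vertical, {unused} rows unused")
```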

There's a huge issue with your method: the sheer amount of memory bandwidth it needs. Considering the Switch is already bandwidth starved, having a very large intermediate frame buffer in memory makes things worse.
I don't think Doom is especially memory intensive, but the issue is still the same.
This is why it's much better to use shaders for the job rather than pre-scaling to stupidly high resolutions like the above.
But it works if you have the hardware to brute-force it, and have no alternative.

As I said previously, I'd much rather they dropped the resolution of the Switch port further; 320x200 or 640x400 rather than 960x600, if that meant they could use proper filtering/scaling via a shader like bandlimit-pixel.
 

Awakened

Member
Oct 27, 2017
506
I had a chance to look at this now. Yes, it is blending in linear-light, and the results are better than pixellate.

EDIT: My screenshots loaded out of order and my original conclusion was wrong, as the bandlimit-pixel 0.50 and pixellate-linear screenshots were mixed up.
Bandlimit-pixel is universally superior to pixellate. Originally I thought the smoothness setting had to be increased from 0.50 to 0.60 for bandlimit-pixel to produce results matching pixellate, due to that mix-up.
Neat. I thought bandlimit pretty much obsoleted pixellate when I first tried it. sharp-bilinear still has its place since it's even faster than bandlimit. That's the only interpolation shader fast enough for my phone (probably the simple version, too).
 

bmfrosty

Member
Oct 27, 2017
1,894
SF Bay Area
It's not the way I'd want to play DOOM, but I tested out a custom DSR resolution of 4800x3600 in DOSBox, and it works exactly as I thought.
The image fills the screen as much as possible, and without ugly scaling artifacts since DSR handles the scaling to fit the display rather than DOSBox.

I haven't updated DOSBox in years and was using the D-Fend Reloaded front-end, but for some reason it's being scaled to 4799x3600 and the center column of pixels is 1px narrower than the rest. You aren't really going to notice it at this scale though, and it may have been fixed by now.
Except for the very center column, every other random pixel I checked was exactly 15x18 as it should be.

dosboxqnjkx.png


Even with the DSR filter set to 50%, the output is not being blurred much on my 3440x1440 monitor.
With a 1080p or regular 1440p screen you could use 3200x2400 instead (DSR requires a minimum of 3440 horizontal resolution on a 3440x1440 display).
I do wish NVIDIA made it easier to create custom DSR resolutions, rather than having to modify the registry.

Custom resolutions like these are only required in programs that lack support for advanced scaling/filtering via shaders though.
And I still haven't had the chance to look at bandlimit-pixel yet.


I'm not aware of any CRTs which had a 5:4 aspect ratio. It was common with older flat panels though.
I think you were using the wrong resolution on your monitors.
They were 4:3 with that resolution. It was the step above 1024x768 for a lot of monitors in the late '90s and early 2000s. I always preferred 1600x1200.
 

Pargon

Member
Oct 27, 2017
11,996
They were 4:3 with that resolution. It was the step above 1024x768 for a lot of monitors in the late '90s and early 2000s. I always preferred 1600x1200.
So long as it's within the scan rate limits, you can send a CRT any resolution you want.
But 1280x1024 is not a 4:3 resolution - it's 5:4. Rendering 5:4 and scaling that to 4:3 squishes everything vertically.
 

Edward850

Software & Netcode Engineer at Nightdive Studios
Verified
Apr 5, 2019
991
New Zealand
Nope, and I'd say there's about zero chance of this happening. The developers will just continue to pretend that DOOM was never improved with colours, nicer sound effects and a much better soundtrack, as they've done for over two decades now.
It's not even compatible. It's an 8-bit software renderer; the PSX lighting is done on a very different 16-bit hardware renderer. PSX Doom wasn't some minor iterative change, it's a dramatic functional difference in engine tech, and transplanting that back would be madness. I wouldn't even want to know how you'd maintain vanilla add-on support if you completely changed the renderer in the process, given the extremely rendering-specific behaviour and tricks various map authors have used over the years.
 
Nov 8, 2017
3,532
It's not even compatible. It's an 8-bit software renderer; the PSX lighting is done on a very different 16-bit hardware renderer. PSX Doom wasn't some minor iterative change, it's a dramatic functional difference in engine tech, and transplanting that back would be madness. I wouldn't even want to know how you'd maintain vanilla add-on support if you completely changed the renderer in the process, given the extremely rendering-specific behaviour and tricks various map authors have used over the years.
I didn't know DOOM on PS1 used a hardware renderer. I always thought the original developers insisted on keeping it on a software-based engine similar to the PC version, which is why there were none of the polygon-folding graphical glitches that were in pretty much every other PS1 game, but also why they weren't able to improve the performance as much as they would've liked.

But even so, that doesn't explain the sounds and music. Surely the DOOM engine isn't locked into the crappy stock sound effects and MIDI music from the PC version? Seems like they could easily be replaced with the improved PS1 versions without any major engine changes.
 

LuigiV

One Winged Slayer
Member
Oct 27, 2017
2,684
Perth, Australia
The whole integer-scale up to a ridiculous resolution and downscale back to screen res seems needlessly bandwidth intensive. Since both the input and output resolutions are fixed, why not just use a lookup table with precalculated input-pixel blending ratios for every output pixel?
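For illustration only (names hypothetical, and using simple box-filter/area-coverage weights rather than whatever kernel a real scaler would pick), such a table can be precomputed with exact fractions:

```python
from fractions import Fraction

# Precompute, for every output pixel, the source pixels it overlaps and the
# blending weight of each (area coverage / box filter), as exact fractions.
def coverage_table(src, dst):
    table = []
    for o in range(dst):
        lo = Fraction(o * src, dst)          # output pixel o spans [lo, hi)
        hi = Fraction((o + 1) * src, dst)    # in source-pixel coordinates
        row, s = [], int(lo)
        while s < hi:
            overlap = min(hi, Fraction(s + 1)) - max(lo, Fraction(s))
            row.append((s, overlap * dst / src))
            s += 1
        table.append(row)
    return table

# 2 source pixels -> 9 output pixels (the 4.5x case): only pixel 4 blends.
print(coverage_table(2, 9)[4])  # [(0, Fraction(1, 2)), (1, Fraction(1, 2))]
```

Applied per axis, a table like this gives the same result as the giant intermediate buffer without ever allocating it.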
 

Edward850

Software & Netcode Engineer at Nightdive Studios
Verified
Apr 5, 2019
991
New Zealand
But even so, that doesn't explain the sounds and music. Surely the DOOM engine isn't locked into the crappy stock sound effects and MIDI music from the PC version? Seems like they could easily be replaced with the improved PS1 versions without any major engine changes.
It sure is your opinion that the PSX sound and music is better, but it's a drastic change and for other people that would be a bridge too far.

(Also fun tech note, the PSX version was built around audio reverb, so not that easy.)
 

Deleted member 16908

Oct 27, 2017
9,377
I just wanted to come in here and say that uneven pixel scaling makes me physically ill and I will not play a game if it has it.
 

hachikoma

Banned
Oct 29, 2017
1,628
The whole integer-scale up to a ridiculous resolution and downscale back to screen res seems needlessly bandwidth intensive. Since both the input and output resolutions are fixed, why not just use a lookup table with precalculated input-pixel blending ratios for every output pixel?
According to the bandlimit writeup, apparently because they are costly and annoying to use?
Applying two filters after each other is the same as convolving them together. Convolution is an integral, so now we have some constraints on our filter kernel, because it needs to be cheap to analytically integrate. LUTs will be too costly and annoying to use.
 
Nov 8, 2017
3,532
It sure is your opinion that the PSX sound and music is better, but it's a drastic change and for other people that would be a bridge too far.
I don't see why they would've gone to all the effort to change it in the PS1 version if they didn't think it was better than the original.

And I'm not saying they should replace the original; they could just make it optional for those who prefer the bad stock effects and MIDI music.
 

Dant21

Member
Apr 24, 2018
842
It sure is your opinion that the PSX sound and music is better, but it's a drastic change and for other people that would be a bridge too far.

(Also fun tech note, the PSX version was built around audio reverb, so not that easy.)
The ideal solution would be to either record the PC MIDI played through a Roland SC-55 or license a software Sound Canvas synth from Roland to play back the MIDIs.
 

Cian

One Winged Slayer
Member
Feb 17, 2018
576
It's not even compatible. It's an 8-bit software renderer; the PSX lighting is done on a very different 16-bit hardware renderer. PSX Doom wasn't some minor iterative change, it's a dramatic functional difference in engine tech, and transplanting that back would be madness. I wouldn't even want to know how you'd maintain vanilla add-on support if you completely changed the renderer in the process, given the extremely rendering-specific behaviour and tricks various map authors have used over the years.

Was PS1 Doom built on the engine they'd later use for Doom 64? With the upcoming Doom 64 port, how possible would it be to bring back the PS1 lighting and ambience?
 

Edward850

Software & Netcode Engineer at Nightdive Studios
Verified
Apr 5, 2019
991
New Zealand
Was PS1 Doom built on the engine they'd later use for Doom 64?
PSX Doom is derived from Jaguar Doom and Doom v1.666; Doom64, in turn, was, as far as we can tell, based on PSX Doom.
With the upcoming Doom 64 port, how possible would it be to bring back the PS1 lighting and ambience?
Well, it's a port of Doom64, so it'll have Doom64's lighting and sound. Doom64's lighting is far more complex: each sector has 5 different light colour values instead of just a brightness (plus colour on PSX), and is thus not exactly forward or backward compatible. It's also not the same levels or assets, nor compatible with any other version, due to having its own formats.
 

Pargon

Member
Oct 27, 2017
11,996
Why even bother with upscaling the image when you can just hack the renderer to output the desired resolution? It's not like Doom isn't the most well documented game in existence.
  1. Rendering in the original native resolution, or close to it, can be desirable for retro games.
  2. Sprites won't be displayed correctly at 720p/1080p (though I can think of several ways you could work around it).
  3. The Switch doesn't seem to be capable of running this port at 720p/1080p anyway.
 

Edward850

Software & Netcode Engineer at Nightdive Studios
Verified
Apr 5, 2019
991
New Zealand
Either way, they didn't do it just to be different.
It is entirely plausible that they did do it just to be different. After all, using the original soundtrack wasn't viable on the hardware, hence the shift to CD audio and the PlayStation's SPU. From there they could have simply decided to change the audio themes completely because they had Aubrey Hodges' particular style on hand, especially as an additional hook to sell a game that had already been on PC for 2 years.

The point is you can't argue they decided it was better. In fact nobody knows why they changed the sound specifically; it might even have been for licensing reasons. Nobody ever asked why they changed it.
 

GreenMonkey

Member
Oct 28, 2017
1,861
Michigan
They were 4:3 with that resolution. It was the step above 1024x768 for a lot of monitors in the late '90s and early 2000s. I always preferred 1600x1200.
No, people frequently picked 1280x1024 on CRTs, but it was the wrong aspect ratio. Proper 4:3 is 1280x960, which CRTs normally support just fine.

I think 1280x1024 (5:4, which is not equal to 4:3) was only around because of early TFT LCD screens that used it.
 
Nov 8, 2017
3,532
It is entirely plausible that they did do it just to be different. After all, using the original soundtrack wasn't viable on the hardware, hence the shift to CD audio and the PlayStation's SPU. From there they could have simply decided to change the audio themes completely because they had Aubrey Hodges' particular style on hand, especially as an additional hook to sell a game that had already been on PC for 2 years.

The point is you can't argue they decided it was better. In fact nobody knows why they changed the sound specifically; it might even have been for licensing reasons. Nobody ever asked why they changed it.
Other lesser console platforms got the original soundtrack, including the SNES version, so I think that "licensing reasons" or "PS1 hardware not viable" are both a pretty big stretch. It's true that we don't know for certain why they changed it, which is why I'm going with what is by far the most likely explanation.
 

dgrdsv

Member
Oct 25, 2017
11,846
...Or you can just use Doomsday and render the games in native display resolution without any rescaling.