
gofreak

Member
Oct 26, 2017
7,734
"precog trickery"

Basically, the servers will try to predict the player's inputs and send multiple different results at once for the most likely possibilities. Then, when the input actually happens, the client chooses the output frame that matches best, and possibly does some light image manipulation if something is slightly different, like the direction or magnitude of an analog stick.

Microsoft experimented with something like that a while ago and apparently got great results. Technical paper on it:

Graphs from the paper:
[Image: graphs from the paper]

On the left, subjective opinion scores from playtesters. In the middle, player performance, measured by health remaining at the end of a segment. On the right, time taken for players to complete the segment.

Yeah, it's a thing, but it gets a lot more complicated with higher input complexity. The state space explodes.

You can apply ML, but there's always the possibility of it going wrong.

And, of course, there's no reason a local box can't apply the same tech. Or even just, when powerful enough, return a same-frame response to input without any prediction.
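
To make that concrete, here's a hand-wavy sketch of the loop (the game_state object and the input names are made up for illustration; this isn't code from the paper):

```python
from dataclasses import dataclass

@dataclass
class SpeculativeFrame:
    predicted_input: str  # the input this branch assumed
    image: bytes          # the frame rendered under that assumption

def server_tick(game_state, likely_inputs):
    """Fork the simulation once per plausible input and render each branch."""
    frames = []
    for inp in likely_inputs:              # e.g. ["jump", "shoot", "idle"]
        branch = game_state.copy()         # hypothetical copy/apply/render API
        branch.apply(inp)
        frames.append(SpeculativeFrame(inp, branch.render()))
    return frames

def client_pick(frames, actual_input):
    """Show the branch that matches the real input; fall back on a miss."""
    for f in frames:
        if f.predicted_input == actual_input:
            return f.image                 # hit: no extra round trip
    return None                            # miss: wait for a corrected frame
```

For analog input you'd swap the exact-match test for a nearest-branch pick plus a small image-space warp, which is the "light image manipulation" part.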
 

Deleted member 49438

User requested account closure
Banned
Nov 7, 2018
1,473
Why would I want a game or service predicting my next move? That's the illusion of no latency, not actually addressing the issue. And what happens when it guesses wrong? Seems weird (and arrogant) to think they'll be able to pull something like that off effectively.
 

GhostTrick

Member
Oct 25, 2017
11,298
Sure, if you send the data a few milliseconds back in time, you can achieve lower latency than local hardware. :""")
 

RoninStrife

Banned
Oct 27, 2017
4,002
Like, 4 years from now when Google sells this business unit or shuts it down, I'm going to point and laugh at this notion that anything online can be faster, in terms of latency, than a local machine.

In fact, perhaps I shall laugh that Stadia wanted to predict button presses to make that possible. By all means, why not play the games for us, Stadia? While we eat our snacks and watch you play, maybe we'll also have a robot on hand to help us chew our food and partially digest it for us as well.
 

Lokimaster

Alt Account
Banned
May 12, 2019
962
Lol wow, they are really trying to sell this thing, aren't they?

Faster than local hardware? Just how do they think they will manage to do that?
 

exodus

Member
Oct 25, 2017
9,942
Yeah, it's a thing, but it gets a lot more complicated with higher input complexity. The state space explodes.

You can apply ML, but there's always the possibility of it going wrong.

And, of course, there's no reason a local box can't apply the same tech. Or even just, when powerful enough, return a same-frame response to input without any prediction.

I expect we'll see more complex video streams that transmit frame deltas back to the local hardware so that the end prediction can be decided on the local hardware itself to even further reduce latency.
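
Something like this, maybe - a toy numpy sketch of a delta scheme I'm imagining (not anything Google has described):

```python
import numpy as np

def encode_deltas(base_frame, branch_frames):
    """Server: one base frame plus a signed delta per speculated input."""
    b = base_frame.astype(np.int16)
    return {inp: f.astype(np.int16) - b for inp, f in branch_frames.items()}

def apply_delta(base_frame, deltas, actual_input):
    """Client: reconstruct the branch matching the input it just sampled."""
    out = base_frame.astype(np.int16) + deltas[actual_input]
    return np.clip(out, 0, 255).astype(np.uint8)
```

Since the speculative branches usually differ from the base frame in only a small region, the deltas should compress far better than full frames.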

Anyways, a lot of tech-incompetent people in this thread are spewing nonsense about physical impossibility when they have no idea what they're talking about.
 

Slayven

Never read a comic in his life
Moderator
Oct 25, 2017
93,009
Like, 4 years from now when Google sells this business unit or shuts it down, I'm going to point and laugh at this notion that anything online can be faster, in terms of latency, than a local machine.

In fact, perhaps I shall laugh that Stadia wanted to predict button presses to make that possible. By all means, why not play the games for us, Stadia? While we eat our snacks and watch you play, maybe we'll also have a robot on hand to help us chew our food and partially digest it for us as well.
4 years? I think if Google doesn't see instant domination they will go into maintenance mode in 18 months, quietly shutting it down 8 months after that.
 

crimilde

Member
Oct 26, 2017
6,004
Specifically Bakar notes Google's "negative latency" will act as a workaround for any potential lag between player and server. This term describes a buffer of predicted latency, inherent to a Stadia player's setup or connection, in which the Stadia system will run lag mitigation. This can include increasing fps rapidly to reduce latency between player input and display, or even predictive button presses.

Yes, you heard that correctly. Stadia might start predicting what action, button, or movement you're likely to do next and do it for you – which sounds rather frightening.

Made me think of this:

 

gofreak

Member
Oct 26, 2017
7,734
I expect we'll see more complex video streams that transmit frame deltas back to the local hardware so that the end prediction can be decided on the local hardware itself to even further reduce latency.

I think that's what the MS 'DeLorean' technique does - the base frame + deltas based on the RTT, and then the client picks and synthesizes the 'correct' frame. So that when it is correct, you can get 1-frame latency. Of course, this gets more complicated on all fronts when the input is more than a couple of binary toggles.
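
And the speculation depth just has to cover the round trip. Back-of-envelope (my numbers, not the paper's):

```python
import math

def speculation_depth(rtt_ms: float, frame_time_ms: float) -> int:
    """Frames the server must render ahead to hide the round trip."""
    return math.ceil(rtt_ms / frame_time_ms)

print(speculation_depth(rtt_ms=80, frame_time_ms=1000 / 60))  # -> 5 frames
```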
 

Log!

Member
Oct 27, 2017
1,410
So basically the aim assist of console shooters, but for the entire game.
I'll stick with local hardware, thanks.
 

jotun?

Member
Oct 28, 2017
4,484
Yeah, it's a thing, but it gets a lot more complicated with higher input complexity. The state space explodes.

You can apply ML, but there's always the possibility of it going wrong.
Yeah, it may introduce glitches when there's a gross misprediction. But the same thing happens in fighting games that use GGPO/rollback netcode - sometimes characters warp around and things you saw happen end up not actually happening, but it's all worth it for the increased responsiveness.
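
The core rollback loop is surprisingly small. A generic sketch of the technique (sim, net and predictor are hypothetical stand-ins, not GGPO's actual API):

```python
def rollback_loop(sim, net, predictor):
    states, inputs = {}, {}  # per-frame saved state and (local, remote) inputs
    frame = 0
    while True:
        confirmed = net.poll_remote_input()        # arrives a few frames late
        if confirmed is not None:
            local, predicted = inputs[confirmed.frame]
            if predicted != confirmed.input:       # misprediction detected
                inputs[confirmed.frame] = (local, confirmed.input)
                sim.load(states[confirmed.frame])  # rewind...
                for f in range(confirmed.frame, frame):
                    sim.step(*inputs[f])           # ...and resimulate silently
        states[frame] = sim.save()
        inputs[frame] = (net.read_local_input(), predictor.guess())
        sim.step(*inputs[frame])                   # advance optimistically
        sim.render()                               # warping shows up here
        frame += 1
```

The warping you see is exactly that resimulation: the corrected timeline replaces the mispredicted one between two rendered frames.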
 

Alucardx23

Member
Nov 8, 2017
4,711

You can compensate for network latency by increasing the framerate. See the DF video below for more information on how this works.

"Specifically Bakar notes Google's "negative latency" will act as a workaround for any potential lag between player and server. This term describes a buffer of predicted latency, inherent to a Stadia players setup or connection, in which the Stadia system will run lag mitigation. This can include increasing fps rapidly to reduce latency between player input and display, or even predictive button presses."

[Embedded video: DF analysis]
What he says about predicting button presses sounds a lot like what Microsoft was experimenting with in Outatime:

Outatime: Using Speculation to Enable Low-Latency Continuous Interaction for Cloud Gaming


There is a whole range of framerates with different input lags. A lot of people think that all 30fps or 60fps games have the same input lag, but they don't. Going forward we will see a lot of optimization for cloud games to reduce input lag as much as possible on the server side. See below for the range in input latency, from Call of Duty to Killzone Shadow Fall.

[Image: input latency comparison across games, from Call of Duty to Killzone Shadow Fall]
 

TitlePending

The Fallen
Dec 26, 2018
5,338
I had better see "NEGATIVE LATENCY" in big, bold letters across that Stadia box or I'm going to be mad upset.
 

Mantrox

Member
Oct 27, 2017
2,907
That's pretty much how eliminating lag on RetroArch works.
Delaying the drawing of a frame to accommodate a more recent batch of input information works if your input method is connected directly to the machine.
In the case of Stadia, your controller is several kilometers away from the hardware, and even with the fastest round-trip speeds in the world you will get more delay than on local hardware.
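
For comparison, the RetroArch-style trick boils down to sampling input as late as possible within each frame. A sketch (poll_input and render are placeholder callables, and the render-cost estimate is made up):

```python
import time

FRAME_MS = 1000.0 / 60.0  # 60fps frame budget

def run_frame(poll_input, render, render_cost_ms=4.0):
    # Sleep first, so input is polled as late in the frame as possible,
    # leaving just enough time to render before the next vsync.
    time.sleep(max(0.0, FRAME_MS - render_cost_ms) / 1000.0)
    render(poll_input())  # freshest possible input state
```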

The comparisons they are doing must be cherry-picked. And if the titles have more input delay on console, it's most probably due to a bad implementation rather than a superior method of handling inputs.

The only method I see where this could theoretically work is if the engine could, in between every frame, render the future frames resulting from every combination of inputs you could press in that moment, store the frames, read your input, and then deliver the correct frame.
Using AI could optimize this process and reduce the number of frames you would have to render, by predicting what the most likely next input is.
But still, I don't see how this could be possible today while providing state-of-the-art graphics.
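
That AI pruning step could be as crude as a first-order frequency model over recent inputs. An entirely hypothetical toy:

```python
from collections import Counter, defaultdict

class InputPredictor:
    """Guess the k likeliest next inputs given the previous one."""
    def __init__(self, k: int = 3):
        self.k = k
        self.transitions = defaultdict(Counter)  # prev input -> next counts

    def observe(self, prev_input: str, next_input: str) -> None:
        self.transitions[prev_input][next_input] += 1

    def top_k(self, prev_input: str, all_inputs: list) -> list:
        counts = self.transitions[prev_input]
        if not counts:
            return all_inputs[:self.k]           # cold start: arbitrary picks
        return [inp for inp, _ in counts.most_common(self.k)]
```

Only the top-k branches would get rendered, which keeps the combinatorial explosion in check at the price of more mispredictions.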
 

uzipukki

Attempted to circumvent ban with alt account
Banned
Oct 25, 2017
5,722
Ooooooh mama. Stadia is a time machine boiiiiii.
 

dom

▲ Legend ▲
Avenger
Oct 25, 2017
10,433
"precog trickery"

Basically, the servers will try to predict the player's inputs and send multiple different results at once for the most likely possibilities. Then, when the input actually happens, the client chooses the output frame that matches best, and possibly does some light image manipulation if something is slightly different, like the direction or magnitude of an analog stick.

Microsoft experimented with something like that a while ago and apparently got great results. Technical paper on it:

Graphs from the paper:
[Image: graphs from the paper]

On the left, subjective opinion scores from playtesters. In the middle, player performance, measured by health remaining at the end of a segment. On the right, time taken for players to complete the segment.
This would be horribly expensive to do, though. And then there are multiplayer games, where it would all fall apart.
 

RoninStrife

Banned
Oct 27, 2017
4,002
4 years? I think if Google doesn't see instant domination they will go into maintenance mode in 18 months, quietly shutting it down 8 months after that.
This. Any time I read a Stadia article, I get the feeling they're trying to swindle us. They talk in "the now" for the least amount of time; other times they offer "theories" or a "hypothesis" to solve their problems, like "the ISPs will handle this, because we're the future". And now they will predict button presses to reduce latency.

Honestly, I think your timeline is accurate, and a Chinese investor will buy it up and turn it into something very different from what people will be buying at launch.
 

AtomicShroom

Tools & Automation
Verified
Oct 28, 2017
3,075
I think that's what the MS 'DeLorean' technique does - the base frame + deltas based on the RTT, and then the client picks and synthesizes the 'correct' frame. So that when it is correct, you can get 1-frame latency. Of course, this gets more complicated on all fronts when the input is more than a couple of binary toggles.

And that's really the key here. There are so many possible inputs at any given time that, no matter how good it is at predicting, it will inevitably get things wrong every so often. And whenever that happens, the standard lag will apply while it corrects itself. It will make for a very uneven, jarring experience where sometimes your inputs register instantly and sometimes there's a delay. No thank you.
 

Azurik

Attempted to circumvent ban with alt account
Banned
Nov 5, 2017
2,441
I love how a lot of people here, who are mostly hobby gamers with a basic understanding of tech, doubt a developer who deals with this tech on a daily basis.
Now, don't get me wrong, I'm not taking away from the deep knowledge a lot of people here have, and I'm not silly enough to believe everything devs like to promise. But to straight away shout "lies, lol, yeah whatever, etc." without even knowing what tech will be available in two years' time is just simple bashing on Google/Stadia.
 

Doomguy Fieri

Member
Nov 3, 2017
5,261
What? Kids watching games on YouTube is evolving! Congratulations! Kids watching games on YouTube has evolved into YouTube playing the games for them!
 

Deleted member 58846

User requested account closure
Banned
Jul 28, 2019
5,086
This actually sounds super cool. It's one of the better uses of cloud and AI. I'm interested in seeing this.