343 Industries responds to everything about Halo Infinite gameplay

Fijiandoce

Administrator
Staff member
Oct 8, 2007
6,677
287
83
#4
That 120fps is probably a play at PC gamers, or current console gamers who've grown up and are considering buying/building their first PC. MS has been pushing it pretty hard, but as best as I can tell there are basically no 120fps TVs out there that make any rational sense to purchase. For the money, you can get sets with vastly superior image/colour accuracy, for example. Ironically, a quick Google search (since 120Hz screens are next to non-existent in our parts) shows Sony as one of the very few who do put out 120Hz screens.

The point here is that if your TV only does 60Hz, you benefit next to nothing from 120fps; you are still playing at 60fps. You need a matching 120Hz monitor, and the only people who own those (in quantity, at least) are PC gamers. The only benefit you get is a single frame delivered to your screen ahead of a 60fps output, which is to say, you get a single frame update closer to when it happens... if you have a mouse and keyboard, this stuff may be noticeable, but with a controller, I'd wager you notice next to nothing.
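To put rough numbers on that (just a back-of-the-envelope sketch, ignoring display processing and the game's own input latency):

```python
# Back-of-the-envelope frame timing: how much sooner a frame can reach a
# 120Hz screen compared to a 60Hz one. Ignores display processing and input lag.
for hz in (30, 60, 120):
    print(f"{hz:>3} Hz -> a new frame every {1000 / hz:.1f} ms")

# Best case, 120fps on a 120Hz panel delivers an update ~8.3 ms earlier than 60fps.
print(f"Max advantage of 120 over 60: {1000 / 60 - 1000 / 120:.1f} ms")
```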

If you go back to the start of most generations, you generally see features announced that quietly get swept under the carpet. I wonder how long this particular horse gets ridden until it too gets swept under the carpet. If you're serious about 120fps gaming... just get a PC.
 
Likes: Two4DaMoney
Sep 10, 2005
7,032
158
63
49
#5
That 120fps is probably a play at PC gamers, or current console gamers who've grown up and are considering buying/building their first PC. MS has been pushing it pretty hard, but as best as I can tell there are basically no 120fps TVs out there that make any rational sense to purchase. For the money, you can get sets with vastly superior image/colour accuracy, for example. Ironically, a quick Google search (since 120Hz screens are next to non-existent in our parts) shows Sony as one of the very few who do put out 120Hz screens.

The point here is that if your TV only does 60Hz, you benefit next to nothing from 120fps; you are still playing at 60fps. You need a matching 120Hz monitor, and the only people who own those (in quantity, at least) are PC gamers. The only benefit you get is a single frame delivered to your screen ahead of a 60fps output, which is to say, you get a single frame update closer to when it happens... if you have a mouse and keyboard, this stuff may be noticeable, but with a controller, I'd wager you notice next to nothing.

If you go back to the start of most generations, you generally see features announced that quietly get swept under the carpet. I wonder how long this particular horse gets ridden until it too gets swept under the carpet. If you're serious about 120fps gaming... just get a PC.
With the graphics that Halo has at the moment, 120 is not a difficult thing to do though... My TV is good for 120+, but not at 4K... I do have a PC.
 

Aquanox

Forum Sage
May 26, 2005
8,495
97
48
#6
That 120fps is probably a play at PC gamers, or current console gamers who've grown up and are considering buying/building their first PC. MS has been pushing it pretty hard, but as best as I can tell there are basically no 120fps TVs out there that make any rational sense to purchase. For the money, you can get sets with vastly superior image/colour accuracy, for example. Ironically, a quick Google search (since 120Hz screens are next to non-existent in our parts) shows Sony as one of the very few who do put out 120Hz screens.
Not sure if we're talking about the same thing, but I have an LG C9 OLED with HDMI 2.1 inputs. That panel supports 4K@120Hz 10-bit, and you can get them (or the B9) for "cheap" these days.
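For anyone wondering why HDMI 2.1 matters for that, the raw pixel data alone roughly works out like this (a simplified sketch that ignores blanking intervals and encoding overhead):

```python
# Rough uncompressed data rate for 4K @ 120Hz, 10-bit RGB.
# Real links also carry blanking and encoding overhead, so actual needs are higher.
width, height, fps = 3840, 2160, 120
bits_per_pixel = 3 * 10                      # RGB, 10 bits per channel
gbps = width * height * fps * bits_per_pixel / 1e9
print(f"~{gbps:.1f} Gbit/s of raw pixel data")   # ~29.9 Gbit/s
# HDMI 2.0 tops out at 18 Gbit/s; HDMI 2.1 raises that to 48 Gbit/s,
# which is why 4K120 needs a 2.1 port on both the console and the TV.
```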

I remember when the PS3 launched, and HDMI wasn't really a thing before it. I guess MS is looking at the future.

As for 120fps not being noticeable with a controller, I can't say... I'm under the impression that's more for competitive gameplay, but that's not my case... 60fps is just fine for me. There are other games that will support 120fps, like Gears 5, Ori 2 and Dirt 5... I will definitely give them a try, but I don't think it's gonna be big for me.

I do believe that, as a selling point, it might have an impact on PC gamers though. Can't blame MS for trying to push boundaries.
 

Fijiandoce

Administrator
Staff member
Oct 8, 2007
6,677
287
83
#7
As for 120fps not being noticeable with a controller, I can't say... I'm under the impression that's more for competitive gameplay, but that's not my case... 60fps is just fine for me. There are other games that will support 120fps, like Gears 5, Ori 2 and Dirt 5... I will definitely give them a try, but I don't think it's gonna be big for me.
If you have the hardware, you'll most definitely notice it! The few games that do end up running at 120 will probably spoil you rotten. But for most people, this is probably not gonna be the case, for the aforementioned reason that most TV sets (for an average family, for example) will tap out at 60Hz. When I picked up my set, I was more concerned with the image first and foremost; as such, mine taps out at 4K60, which I think would end up being more representative of most families as well.

Not sure if we're talking about the same thing, but I have an LG C9 OLED with HDMI 2.1 inputs. That panel supports 4K@120Hz 10-bit, and you can get them (or the B9) for "cheap" these days.
Comparatively, a 120Hz+ PC monitor would only set you back a few hundred dollars.
 
May 20, 2008
10,977
110
63
#8
My LG TV is 4K/120 capable and has Dolby Vision and Dolby Atmos support. 60fps is good enough, but I'll take 120 whenever I can get it.
 

Two4DaMoney

Master Sage
Jun 4, 2007
13,568
291
83
#9
Good luck showing off that 120fps, MS. Not enough people have capable TVs or monitors for this to have any huge impact outside of esports.
 

Aquanox

Forum Sage
May 26, 2005
8,495
97
48
#11
Good luck showing off that 120fps, MS. Not enough people have capable TVs or monitors for this to have any huge impact outside of esports.
You can't blame Microsoft for pushing boundaries though. I'm not excited about the 120fps story, but I'll sure give it a try.

Also, it might not be a thing now, for console gamers at least... but PC gamers are a different story. MS is clearly looking after that market too... if they buy an XSX (and I believe they will), they will definitely have a capable screen for taking advantage of it... or buy one. New video cards will sport HDMI 2.1 as well, so I think it's safe to say they will have a market sooner or later.

Also, in the next few years, 120fps panels will become more popular and cheaper due to the introduction of HDMI 2.1. Keep in mind consoles are meant to be future-proof.
 

Fijiandoce

Administrator
Staff member
Oct 8, 2007
6,677
287
83
#12
You can't blame Microsoft for pushing boundaries though. I'm not excited about the 120fps story, but I'll sure give it a try.
In fairness, they themselves aren't really pushing boundaries, though, since the relevant specs weren't drafted by them: the theoretical limits of the AMD design, the HDMI spec, and the relevant screen specifications. The base Xbox can currently do 4K per the AMD design (the 7000 series) with no consideration for framerate, but its HDMI spec only allows for a 30Hz signal; it's 60Hz for the 'S' variant.

It seems like it's purely marketing, and there's next to no substance behind it. It's going to be a bit of a novelty just for the start of the generation. When they get serious, we'll be back to the 30fps consoles are accustomed to.

If you've ever done algorithm analysis, you'll know that improving an algorithm to the tune of 2x is absurd (for already-known algorithms). This is what you see with 30fps vs 60fps, and why 60fps titles are simpler in design and complexity. 120fps, on the other hand, is a massive 4x difference. Mind, this is all moot, because if you make it fast enough for 120 or 60, it's also faster when you're hitting 30 as well.
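Put differently, the frame budget shrinks fast; simple arithmetic, ignoring any engine specifics:

```python
# Per-frame time budget at common targets. Everything the CPU and GPU do
# for a frame has to fit inside this window.
for fps in (30, 60, 120):
    print(f"{fps:>3} fps -> {1000 / fps:.2f} ms per frame")
# 30fps: 33.33 ms, 60fps: 16.67 ms, 120fps: 8.33 ms - a 4x squeeze versus 30.
```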

Also, in the next few years, 120fps panels will become more popular and cheaper due to the introduction of HDMI 2.1. Keep in mind consoles are meant to be future-proof.
That's a hard sell for anyone who isn't a gamer though, since traditional media is still 24fps. I think Netflix only has a small selection of docos that are 60fps or something?
 

Aquanox

Forum Sage
May 26, 2005
8,495
97
48
#13
In fairness, they themselves aren't really pushing boundaries, though, since the relevant specs weren't drafted by them: the theoretical limits of the AMD design, the HDMI spec, and the relevant screen specifications. The base Xbox can currently do 4K per the AMD design (the 7000 series) with no consideration for framerate, but its HDMI spec only allows for a 30Hz signal; it's 60Hz for the 'S' variant.

It seems like it's purely marketing, and there's next to no substance behind it. It's going to be a bit of a novelty just for the start of the generation. When they get serious, we'll be back to the 30fps consoles are accustomed to.

If you've ever done algorithm analysis, you'll know that improving an algorithm to the tune of 2x is absurd (for already-known algorithms). This is what you see with 30fps vs 60fps, and why 60fps titles are simpler in design and complexity. 120fps, on the other hand, is a massive 4x difference. Mind, this is all moot, because if you make it fast enough for 120 or 60, it's also faster when you're hitting 30 as well.
Marketing's good as long as they're not misleading the customer. They're doing 120fps not only on indie games but on Gears 5 and Ori 2. Halo is also a major project. Marketing or not, they're working their asses off to achieve this milestone. As said before, I'll probably be sticking to 60fps even though I have an HDMI 2.1/120Hz TV, but actually, one of the reasons I chose that TV was the next generation. I'm probably on the nerd side here though.

I don't know if 120fps will ever be a thing for me, but I praise Microsoft for pulling this off. What I'm sure I will be doing is racing games at 120fps. That's a genre that I believe can take advantage of it. In a racing game... say Forza (or Dirt), if they offer a 120fps mode with some kind of DLSS (whatever the equivalent is on these AMD consoles) rendered at 1080p and upscaled to 4K, I will be all-in for it.
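The appeal of 1080p-to-4K reconstruction is easy to see in raw pixel counts (a rough sketch; the reconstruction pass itself obviously isn't free):

```python
# Rendering at 1080p shades a quarter of the pixels of native 4K, freeing up
# GPU time that a 120fps mode could spend on doubling the frame rate.
pixels_1080p = 1920 * 1080
pixels_4k = 3840 * 2160
print(pixels_4k / pixels_1080p)   # 4.0
```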

That's a hard sell for anyone who isn't a gamer though, since traditional media is still 24fps. I think Netflix only has a small selection of docos that are 60fps or something?
I believe that isn't meant to be sold to casuals. They're most likely thinking about potential PC/PS4 gamers (who aren't brand-tied) acquiring an XSX.
 

Fijiandoce

Administrator
Staff member
Oct 8, 2007
6,677
287
83
#14
In a racing game... say Forza (or Dirt), if they offer a 120fps mode with some kind of DLSS (whatever the equivalent is on these AMD consoles) rendered at 1080p and upscaled to 4K, I will be all-in for it.
That stuff is a bit questionable on the AMD side, and subsequently for both consoles. AMD run two product lines: RDNA and CDNA. The CDNA line falls more in line with what Nvidia offer for DLSS and ML (though it still isn't a 1:1 map). On RDNA, the GPU uses a fallback within the graphics API (so technically any AMD GPU can perform it) where you do the BVH work within the TMUs. Nvidia can also map this function (since it's part of the "normal" raster pipeline), and it's probably how they enabled raytracing on the 10 series cards just to hammer home how fast their new stuff was. All this, however, and there is no mention of the Tensor cores; those things are beasts for matrix multiplication. I don't think AMD have anything on any of their product lines that matches it, and I'm not sure (given some of AMD's patents) that they ever will. This is what sets Nvidia apart, and what speeds up the use of AI and ML for practical applications. We can assume that neither GPU from Sony/MS has any such fixed-function units, since they already have to squeeze in a giant CPU cluster. So probably, AMD do something with the shader units to speed up denoising.

That last bit was a little bit of a bugbear for me when DF looked at the GT demo from the Sony launch. What raytracing creates is not a 1:1 map of the thing it's reflecting. It's noisy as balls, and the resulting reconstruction/denoise pass does not preserve detail.
 

Aquanox

Forum Sage
May 26, 2005
8,495
97
48
#15
That stuff is a bit questionable on the AMD side, and subsequently for both consoles. AMD run two product lines: RDNA and CDNA. The CDNA line falls more in line with what Nvidia offer for DLSS and ML (though it still isn't a 1:1 map). On RDNA, the GPU uses a fallback within the graphics API (so technically any AMD GPU can perform it) where you do the BVH work within the TMUs. Nvidia can also map this function (since it's part of the "normal" raster pipeline), and it's probably how they enabled raytracing on the 10 series cards just to hammer home how fast their new stuff was. All this, however, and there is no mention of the Tensor cores; those things are beasts for matrix multiplication. I don't think AMD have anything on any of their product lines that matches it, and I'm not sure (given some of AMD's patents) that they ever will. This is what sets Nvidia apart, and what speeds up the use of AI and ML for practical applications. We can assume that neither GPU from Sony/MS has any such fixed-function units, since they already have to squeeze in a giant CPU cluster. So probably, AMD do something with the shader units to speed up denoising.

That last bit was a little bit of a bugbear for me when DF looked at the GT demo from the Sony launch. What raytracing creates is not a 1:1 map of the thing it's reflecting. It's noisy as balls, and the resulting reconstruction/denoise pass does not preserve detail.
Looks like nobody knows exactly how RDNA 2 on the Series X (or PS5) will handle anything regarding ML, but according to this article, they've come up with a solution on the shaders. Its performance is yet to be seen.

"The RDNA 2 architecture used in Series X does not have tensor core equivalents, but Microsoft and AMD have come up with a novel, efficient solution based on the standard shader cores. With over 12 teraflops of FP32 compute, RDNA 2 also allows for double that with FP16 (yes, rapid-packed math is back). However, machine learning workloads often use much lower precision than that, so the RDNA 2 shaders were adapted still further.

"We knew that many inference algorithms need only 8-bit and 4-bit integer positions for weights and the math operations involving those weights comprise the bulk of the performance overhead for those algorithms," says Andrew Goossen. "So we added special hardware support for this specific scenario. The result is that Series X offers 49 TOPS for 8-bit integer operations and 97 TOPS for 4-bit integer operations. Note that the weights are integers, so those are TOPS and not TFLOPs. The net result is that Series X offers unparalleled intelligence for machine learning."
 

Fijiandoce

Administrator
Staff member
Oct 8, 2007
6,677
287
83
#16
Looks like nobody knows exactly how RDNA 2 on the Series X (or PS5) will handle anything regarding ML, but according to this article, they've come up with a solution on the shaders. Its performance is yet to be seen.

"The RDNA 2 architecture used in Series X does not have tensor core equivalents, but Microsoft and AMD have come up with a novel, efficient solution based on the standard shader cores.
That's pretty much what I posted.

"We knew that many inference algorithms need only 8-bit and 4-bit integer positions for weights and the math operations involving those weights comprise the bulk of the performance overhead for those algorithms," says Andrew Goossen. "So we added special hardware support for this specific scenario. The result is that Series X offers 49 TOPS for 8-bit integer operations and 97 TOPS for 4-bit integer operations. Note that the weights are integers, so those are TOPS and not TFLOPs. The net result is that Series X offers unparalleled intelligence for machine learning."
To be quite blunt, those numbers are kinda terrible. On a tech sheet, the SX (in particular) reads well against a 2080. But the values quoted for ML are a mere ~33% of that card's dedicated hardware (this of course ignores how the thing is implemented). They even fare unfavourably against the introductory 2060. Maybe it just sounds impressive? When someone says 8-bit or something... all it is is a register size... or word size if you're so inclined.

Machine learning is not new. Programs like MATLAB have been used for AI for decades. You would just run the simulation on the CPU. You can do it on pretty much any hardware. The question is how fast. The Turing approach is quite novel in that it takes these inference engines and builds circuitry dedicated to them. AMD's approach does not.
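To illustrate the point, inference really is just ordinary matrix maths that any processor can chew through; a toy example with made-up numbers (real networks are just many more layers of this):

```python
import numpy as np

# A single dense layer: inference is nothing more exotic than a matrix multiply
# plus a non-linearity. Dedicated hardware only changes how fast this runs.
rng = np.random.default_rng(0)
x = rng.standard_normal((1, 256))     # input activations
w = rng.standard_normal((256, 128))   # "trained" weights
b = rng.standard_normal(128)          # bias
y = np.maximum(x @ w + b, 0.0)        # ReLU(x.W + b)
print(y.shape)                        # (1, 128)
```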

Nonetheless, I went looking for what MS might want to use it for, and as it turns out, they have patents for upscaling textures using a CNN. For the given power, this seems like it would work, as the size of the data would be considerably smaller. This further makes sense since it would happen within the TMUs (since there's no dedicated circuitry, and RDNA implements RT through the TMUs).
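Purely as an illustration of the kind of thing such a patent could cover (a hypothetical sketch, not MS's actual method), a tiny sub-pixel convolution network that doubles a texture's resolution looks something like this:

```python
import torch
import torch.nn as nn

# Hypothetical ESPCN-style upscaler: a couple of convolutions followed by a
# pixel shuffle that rearranges channels into a 2x larger image.
# Untrained here - real use would need a trained weight set.
class TextureUpscaler(nn.Module):
    def __init__(self, scale: int = 2):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3 * scale * scale, kernel_size=3, padding=1),
            nn.PixelShuffle(scale),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.body(x)

tex = torch.rand(1, 3, 128, 128)        # a 128x128 RGB texture
print(TextureUpscaler()(tex).shape)     # torch.Size([1, 3, 256, 256])
```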

That bit that I highlighted at the end, though, is kinda silly. It's like that COD v BF video where the COD exec talks up the franchise next to what BF was putting out at the time.
It's only an impressive statement if you are talking about the PS5, which in this case fares even worse, since the lower CU count means it invariably will suck harder.
 

Aquanox

Forum Sage
May 26, 2005
8,495
97
48
#17
That bit that I highlighted at the end, though, is kinda silly. It's like that COD v BF video where the COD exec talks up the franchise next to what BF was putting out at the time.
It's only an impressive statement if you are talking about the PS5, which in this case fares even worse, since the lower CU count means it invariably will suck harder.
Indeed.

Not sure it will be as catastrophic as you point out, but from what has been shared, 2080 cards do have a big advantage in terms of ML.

Btw... have you seen the posts from this guy https://twitter.com/blueisviolet ? What's your take on this?
 

Fijiandoce

Administrator
Staff member
Oct 8, 2007
6,677
287
83
#18
Indeed.

Not sure it will be as catastrophic as you point out, but from what has been shared, 2080 cards do have a big advantage in terms of ML.

Btw... have you seen the posts from this guy https://twitter.com/blueisviolet ? What's your take on this?
Reads like a misterxmedia alt account, if I'm being honest. You can see from a few posts down that this guy has no idea what he's talking about.

@_rogame is a more reputable source. Among other things, he's been releasing data about AMD's upcoming APU line for quite a while, and broke down the PS5/SX chips a while ago; can't be bothered scrolling for them, but they're on his feed a few months back. [EDIT]: Scratch that, I'm not too sure which leaker covered the console APU breakdown, maybe @KOMACHI_ENSAKA... regardless, neither of these guys puts up content that is biased one way or another; their feeds are full of product lines from just about every tech company (mobo, CPU, SSD, etc.): tech for tech's sake.
 
Last edited:

Jabjabs

Elite Guru
May 10, 2006
5,410
13
38
36
www.superderp.com
#19
When it comes to these higher frame rates, I am always reminded of the stupid processing latency of displays. That we can send data packets across continents faster than we can get an image out the back of a box and onto a screen a meter away is something that has always bothered me.
 

NoUseMercenary

Newbie
Staff member
Sep 17, 2005
5,779
207
63
#22
Yeah, I didn't think Halo Infinite's reveal was all that awe-inspiring. The visuals actually looked pretty dated, and I didn't really see any new gameplay features that had me thinking "man, I really need to play that."
 
Likes: Two4DaMoney