Consoles ranked by FLOPS

Lethal

Administrator
Staff member
Nov 14, 2007
#1
I found this pretty interesting. Original Xbox surprised me the most.

[TABLE="width: 100%"]
[TR]
[TD]Console[/TD]
[TD]FLOPS[/TD]
[TD]Release Year[/TD]
[/TR]
[TR]
[TD]Dreamcast[/TD]
[TD]1.4 GFLOPS[/TD]
[TD]1998[/TD]
[/TR]
[TR]
[TD]PlayStation 2[/TD]
[TD]6.2 GFLOPS[/TD]
[TD]2000[/TD]
[/TR]
[TR]
[TD]GameCube[/TD]
[TD]9.4 GFLOPS[/TD]
[TD]2001[/TD]
[/TR]
[TR]
[TD]Xbox[/TD]
[TD]20 GFLOPS[/TD]
[TD]2001[/TD]
[/TR]
[TR]
[TD]Xbox 360[/TD]
[TD]240 GFLOPS[/TD]
[TD]2005[/TD]
[/TR]
[TR]
[TD]PlayStation 3[/TD]
[TD]230.4 GFLOPS[/TD]
[TD]2006[/TD]
[/TR]
[TR]
[TD]Wii[/TD]
[TD]12 GFLOPS[/TD]
[TD]2006[/TD]
[/TR]
[TR]
[TD]Wii U[/TD]
[TD]352.0 GFLOPS[/TD]
[TD]2012[/TD]
[/TR]
[TR]
[TD]PlayStation 4[/TD]
[TD]1.843 TFLOPS[/TD]
[TD]2013[/TD]
[/TR]
[TR]
[TD]Xbox One[/TD]
[TD]1.310 TFLOPS[/TD]
[TD]2013[/TD]
[/TR]
[TR]
[TD]Xbox One S[/TD]
[TD]1.4 TFLOPS[/TD]
[TD]2016[/TD]
[/TR]
[TR]
[TD]PlayStation 4 Pro[/TD]
[TD]4.2 TFLOPS[/TD]
[TD]2016[/TD]
[/TR]
[TR]
[TD]Nintendo Switch[/TD]
[TD]1 TFLOPS[/TD]
[TD]2017[/TD]
[/TR]
[TR]
[TD]Xbox One X[/TD]
[TD]6 TFLOPS[/TD]
[TD]2017[/TD]
[/TR]
[/TABLE]


https://www.gamespot.com/gallery/console-gpu-power-compared-ranking-systems-by-flop/2900-1334/16/
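
If anyone wants to sanity-check the table, the GPU figures are just theoretical peaks: shader units × 2 (an FMA counts as two operations) × clock speed. A quick Python sketch using the commonly quoted shader counts and clocks, so treat the inputs as approximate:

[CODE]
# Back-of-the-envelope check: peak GPU FLOPS = shader units * 2 (FMA) * clock.
# Shader counts and clocks below are the commonly quoted specs (approximate).
def peak_tflops(shader_units, clock_ghz, ops_per_clock=2):
    return shader_units * ops_per_clock * clock_ghz / 1000.0

consoles = {
    "PlayStation 4":     (1152, 0.800),   # 18 CUs @ 800 MHz
    "Xbox One":          (768,  0.853),   # 12 CUs @ 853 MHz
    "PlayStation 4 Pro": (2304, 0.911),   # 36 CUs @ 911 MHz
    "Xbox One X":        (2560, 1.172),   # 40 CUs @ 1172 MHz
}

for name, (units, clock) in consoles.items():
    print(f"{name}: {peak_tflops(units, clock):.2f} TFLOPS")
[/CODE]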
 

Vyse

Extreme Poster
Mar 27, 2006
#2
Sounds like 40 TFLOPS is the magic number for true dynamic photorealism.

I wonder how much computing power they use for the CGI in movies though. The rendering of the apes in War for the Planet of the Apes, for example, looks mighty impressive.
 

Fijiandoce

Administrator
Staff member
Oct 8, 2007
#3
It would have been nice if they'd included the CPU FLOPS too.

While not a useful measure of CPU performance on its own, it's a little insightful regarding the way companies designed their machines, and the way the market might be changing.

A single Ryzen core (just one of its two threads) manages ~50 GFLOPS, which alone puts it far above a lot of the GPUs of days gone by.

We had that Ubisoft demo from a while back, and although I don't recall it quoting figures, it put the Jaguar cores on level pegging with the Cell BE for whatever it was trying to put onscreen.
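
That ~50 GFLOPS figure is just the usual peak-throughput arithmetic: SIMD lanes × 2 (FMA) × FMA pipes × clock. A quick sketch, assuming a Zen 1 core with two 128-bit FMA pipes at ~3.2 GHz:

[CODE]
# Rough peak FP32 for a single Zen 1 core (assumed: two 128-bit FMA pipes, ~3.2 GHz).
simd_lanes = 4    # 128-bit vectors / 32-bit floats
fma_ops    = 2    # a fused multiply-add counts as two FLOPs
fma_pipes  = 2    # Zen 1 carries two FP pipes capable of FMA
clock_ghz  = 3.2  # assumed boost-ish clock

print(simd_lanes * fma_ops * fma_pipes * clock_ghz, "GFLOPS per core")  # ~51 GFLOPS
[/CODE]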

[QUOTE="Vyse, post: 6531427]Sounds like 40 TFLOPS is the magic number for true dynamic photorealism.

I wonder how much computing power they use for the CGI in movies though. The rendering of the apes in War for the Planet of the Apes, for example, looks mighty impressive.[/QUOTE]
CGI films take a long time to render. IIRC, rendering a single frame of 'Avatar' required thousands of CPU cores, and the farm would churn out that one frame after a few hours. At 24 frames per second of film and an over-two-hour run time, it took a long time to create that master copy! ;)
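
Just to put rough numbers on that (the runtime is Avatar's, more or less; the hours-per-frame figure is purely an illustrative guess, not a real Weta number):

[CODE]
# Ballpark only: what "a few hours per frame" implies for a ~2.7 hour film at 24 fps.
fps             = 24
runtime_min     = 162   # Avatar's theatrical runtime, roughly
hours_per_frame = 4     # assumed average per frame on one render-farm node

frames = fps * runtime_min * 60
node_years = frames * hours_per_frame / 24 / 365
print(f"{frames:,} frames, ~{node_years:.0f} node-years of rendering")
[/CODE]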

Games generally get by using a lot of tricks. Take screen-space ambient occlusion (SSAO): as the name implies, the technique only works in screen space, meaning anything offscreen is not included. A lot of god rays in games are also screen-space techniques (Uncharted is one of the few that isn't). Games also tend to cull everything that isn't in view from the render to save GPU time. Hollywood doesn't have that luxury: if you're going to have reflections in a character's eyes, you should probably have them reflect what is actually behind the camera, which can only be done if there is something behind the camera to reflect.
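
To make the "screen space only" limitation concrete, here's a toy ambient-occlusion loop in Python (real SSAO runs in a shader and works with view-space positions, so this is just the idea): occlusion is estimated from the depth buffer alone, and any sample that lands off-screen is simply skipped, so an occluder sitting just outside the frame contributes nothing.

[CODE]
import numpy as np

def ssao_occlusion(depth, x, y, radius=4, samples=16, rng=np.random.default_rng(0)):
    """Toy AO: estimate occlusion at pixel (x, y) from the depth buffer alone."""
    h, w = depth.shape
    occluded = 0
    for _ in range(samples):
        dx, dy = rng.integers(-radius, radius + 1, size=2)
        sx, sy = x + dx, y + dy
        if not (0 <= sx < w and 0 <= sy < h):
            continue                      # off-screen sample: contributes nothing
        if depth[sy, sx] < depth[y, x]:   # sample is closer to the camera
            occluded += 1
    return occluded / samples             # 0 = fully lit, 1 = fully occluded

# e.g. a flat depth buffer with one nearby blocker to the left of the pixel
depth = np.full((8, 8), 1.0)
depth[:, :2] = 0.5
print(ssao_occlusion(depth, x=3, y=4))
[/CODE]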

Games also don't typically require the accuracy of Hollywood CGI to achieve their effect. If an explosion is calculated differently each time, who cares? Marketing can even pitch the randomness of the explosions as a selling point of the game :snicker
 

Vyse

Extreme Poster
Mar 27, 2006
#4
Aren't SSAO and god rays post-processing techniques in modern video games? The outer glow-like shadows around objects and the rays of light that seem to hang off every object? Most likely the work of a lead designer who doesn't understand subtlety: you don't need to constantly show off a fancy new effect, since it doesn't apply everywhere in the real world if you're aiming for photorealism.

Not to derail Lethal's thread, I just... *folds arms and frowns profusely* hate some of these "tricks".
 

Vyse

Extreme Poster
Mar 27, 2006
#6
How much does GPU clock speed affect overall performance? I noticed they aren't listed for the upgraded consoles in the overall spec comparison by Gamespot.
 

FinalxxSin

Dedicated Member
Jul 26, 2015
#7
[QUOTE="Vyse, post: 6531482]How much does GPU clock speed affect overall performance? I noticed they aren't listed for the upgraded consoles in the overall spec comparison by Gamespot.[/QUOTE]
It's difficult to say. Since GPUs and CPUs rely on each other, one can easily bottleneck the other, which reduces the maximum potential performance.
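
A crude way to picture that: the frame takes as long as whichever processor finishes last, so past a certain point a faster GPU stops helping. A toy model with made-up numbers:

[CODE]
# Toy model: a frame takes as long as the slower of the two processors.
def fps(cpu_ms, gpu_ms):
    return 1000.0 / max(cpu_ms, gpu_ms)

print(fps(cpu_ms=16.0, gpu_ms=20.0))  # 50.0  -> GPU-bound
print(fps(cpu_ms=16.0, gpu_ms=12.0))  # 62.5  -> now CPU-bound
print(fps(cpu_ms=16.0, gpu_ms=8.0))   # 62.5  -> a faster GPU no longer helps
[/CODE]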
 

Fijiandoce

Administrator
Staff member
Oct 8, 2007
#8
[QUOTE="Vyse, post: 6531464]Aren't SSAO and god rays post-processing techniques in modern video games? The outer glow-like shadows around objects and the rays of light that seem to hang off every object? Most likely the work of a lead designer who doesn't understand subtlety: you don't need to constantly show off a fancy new effect, since it doesn't apply everywhere in the real world if you're aiming for photorealism.

Not to derail Lethal's thread, I just... *folds arms and frowns profusely* hate some of these "tricks".[/QUOTE]
Yup, SSAO and HBAO are both post-fx techniques, since they essentially happen after you've rendered the scene.

God rays can be either, depending on how they're implemented. If a dev old-schools it and just puts a texture gradient on a mesh, then it happens during the render process. Otherwise, it too is a post-fx technique that only works within screen space.

But yeah, I get what you mean. I remember playing Oblivion and thinking that in 10 years' time you'd be able to see the cobblestone pathway leading up to castles off in the distance... We kinda went sideways instead, in that most improvements since then have only been to what was already there 10 years ago.

Even The Witcher, with its impressive draw distances, uses quite a lot of tricks to give you that long vista while still keeping performance up.

[QUOTE="Vyse, post: 6531482]How much does GPU clock speed affect overall performance? I noticed they aren't listed for the upgraded consoles in the overall spec comparison by Gamespot.[/QUOTE]
Since GPUs are parallel processors, it's not so much the clock speed as the number of cores (or shader units these days). Obviously, if you compare two otherwise identical GPUs, clock speed determines which of the two is better. But typically it's the architecture, and how it handles working on lots of data in flight (see AMD's async compute vs Maxwell), that determines how good the GPU is.
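
To put the "cores over clocks" point in numbers, peak throughput is just the product of the two, so a wide-but-slow chip can match a narrow-but-fast one on paper (figures below are illustrative only):

[CODE]
# Illustrative only: peak throughput = shader units * 2 (FMA) * clock,
# so core count and clock trade off directly on paper.
wide_and_slow   = 2304 * 2 * 0.911   # many shaders, modest clock (PS4 Pro-ish)
narrow_and_fast = 1152 * 2 * 1.822   # half the shaders, double the clock (hypothetical)
print(wide_and_slow, narrow_and_fast)  # same peak GFLOPS; real-world performance differs
[/CODE]

What separates them in practice is how well the architecture keeps all those units fed.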
 

Vyse

Extreme Poster
Mar 27, 2006
#9
When you say "cobblestone pathway leading up to castles off in the distance", do you mean anisotropic filtering? Are you saying there isn't much improvement there with PS4 and Xbox One games?
 

Fijiandoce

Administrator
Staff member
Oct 8, 2007
#10
Not as such. I meant it in a more literal sense. In Oblivion you could see a distant castle, but not much else. If there was a path leading straight up to the front door (for example), you wouldn't be able to see it from that far away. Only when you were close enough would the game engine resolve assets at that level of detail.

The Witcher has taken the best shot at resolving this, though:

[embedded screenshot: a long Witcher 3 vista with roads visible in the distance]

But even still, you can see that the texture used to display the roads off in the distance is quite low resolution... and not quite indicative of what the path would actually look like.

As for AF on consoles... I don't know why they barely manage 4x AF (if at all). Maybe it's a driver optimisation issue? On PC, the drivers from both Nvidia and AMD are capable of looking at textures as they come in and determining whether or not AF should be applied: too small and too far away? No AF needed. I'd guess the console drivers are very specific and lack this automation, so it's kinda down to the dev to be pedantic... maybe? I don't know. The big thing about these consoles is the GPU size relative to the CPU size; the GPU surely has enough headroom for at least 8x AF... you'd hope, anyway.
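
On the "too small and too far away, no AF needed" point: conceptually, the hardware works out an anisotropy ratio per pixel from how stretched the texture footprint is, then clamps it to whatever cap is set (4x, 8x, 16x). A little Python sketch of that selection logic (not any real driver's code, just the general idea):

[CODE]
import math

def aniso_samples(du_dx, dv_dx, du_dy, dv_dy, max_aniso=16):
    """Pick a number of AF taps from the screen-space texture derivatives."""
    len_x = math.hypot(du_dx, dv_dx)   # footprint extent along screen x
    len_y = math.hypot(du_dy, dv_dy)   # footprint extent along screen y
    longer, shorter = max(len_x, len_y), max(min(len_x, len_y), 1e-6)
    ratio = longer / shorter           # how stretched the footprint is
    return min(max(1, math.ceil(ratio)), max_aniso)

# head-on surface: square footprint, so 1 tap (no AF needed)
print(aniso_samples(1.0, 0.0, 0.0, 1.0))
# road receding into the distance: footprint stretched ~10:1, so ~10 taps (clamped to the cap)
print(aniso_samples(1.0, 0.0, 0.0, 10.0))
[/CODE]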
 

Vyse

Extreme Poster
Mar 27, 2006
#11
Games on PC are capable of 16x AF, right? Maybe even double that amount. At least that's what I recall from when I last checked the settings.

I noticed in that screenshot you posted that you can't quite make out the horizon. I understand that when movie scenes render far-away objects (specifically large ones), a filter is added during the render passes to help sell the illusion of distance; it seems like game developers are adding a similar "fog" effect, but it doesn't look as good. I don't know how else to put it, but I feel like that screenshot has two different "light settings" for the foreground and background.
 

Fijiandoce

Administrator
Staff member
Oct 8, 2007
#13
[QUOTE]Maybe the next Xbox will use a ryzen CPU and be 14 flops[/QUOTE]
It looks unlikely at the moment that either would use Ryzen as-is (though admittedly AMD doesn't really have any other design to give them). Ryzen is designed around its core complexes (CCXs); two of these make up their flagship model, with a very scalable interconnect!

The APUs AMD have coming simply remove one of the CCXs and put a small iGPU in its place (granted, one comparable to the base XB1's GPU):

[die shot comparison: Ryzen on the left / Vega on the right]

As it is, the next iterations of Ryzen are looking to move to a 7nm process (which is damn tiny). But as can be seen, the Ryzen cores (and their supplementary circuitry) take up about half the silicon space.
This is the PS4 by comparison:

[PS4 APU die shot]

Jaguar (and its supplementary circuitry) takes up only about a quarter of the die space.

The next consoles aren't likely to backtrack on 4K, so they're probably going to want (again) the biggest GPU they can squeeze in there.

This is why, personally, Intel's "EMIB" design looks a far better prospect: it gives the engineers some leeway in the design while keeping a small form factor and low TDP, yet potentially allows them to couple the highest-end components (both CPU and GPU) into the package.
 

Fijiandoce

Administrator
Staff member
Oct 8, 2007
#15
[QUOTE]Hey, could be worse, consoles could have been based on Intel's flawed CPUs, and take a 10% hit in performance to fix it.[/QUOTE]
The industries hit hardest are mostly servers though, where it's anywhere from a 10% to 30% loss in performance.

For games it's around 1%-5%, depending on the game (at least as reported by GamersNexus, Digital Foundry, Hardware Unboxed, etc.).

However, given Intel's position, what's a 10% loss in performance when you consider that at one point they held a 50% (or more, in the server space) performance lead over their nearest rival? At least until the Zen chips launched.

The exploit was proven to work on Intel first and foremost; the report focused on Intel's CPUs, but it also tested AMD's FX line (amongst others). Ryzen being new to the party, with a brand-new architecture more in line with Intel's, could also potentially mean the new chips are compromised (AMD did make a big song and dance about their branch prediction during the Zen buildup).
 

mynd

Ultimate Veteran
May 3, 2006
Down Under
#16
[QUOTE]However, our Witcher 3 test run - which hits storage hard and thrives on memory bandwidth - is hit comparatively hard, losing 8.2 per cent of its performance, rising to 9.4 per cent with the Spectre-orientated BIOS microcode update.[/QUOTE]
That's nasty shit either way.