PS4 Specs for Dummies.

PS4freak

Counting Mod
Staff member
May 15, 2006
Louisiana
#62
[QUOTE="Nitey, post: 6024123]Mark Rein was on Gametrailer's after show. Seemed really happy with the hardware[/QUOTE]

Yeah he sounded like a kid with a new toy. All the devs sounded off the same. I don't think I've ever seen so much widespread joy from devs about a console. The gt post show showed how happy they are.
 

shepard

PSU Live Streamer
Dec 21, 2006
#64
[QUOTE="ps3freak18, post: 6024160]Yeah he sounded like a kid with a new toy. All the devs sounded off the same. I don't think I've ever seen so much widespread joy from devs about a console. The gt post show showed how happy they are.[/QUOTE]

Is there a clip of this online somewhere I can watch?
 

Nitey

Elite Guru
Jan 17, 2011
#65
[QUOTE="Two4DaMoney, post: 6024217]Tech heads, is GDDR5 the next gen's cell? Some member seems to think it is. I'm sure it's way out of bounds.[/QUOTE]

I think it's because a lot of developers have really come out and backed Sony on its architecture and its choice of 8GB of GDDR5; the other side of the fence believes it's just a fad being overhyped as something that promises big but won't deliver in a gaming sense. Something that people believe Cell was guilty of.

Anyway, I'm glad it's 8GB of GDDR5 and not 4 lol
 

Mikael

Apprentice
Nov 20, 2007
Gothenburg
#67
[QUOTE="Two4DaMoney, post: 6024217]Tech heads, is GDDR5 the next gen's cell? Some member seems to think it is. I'm sure it's way out of bounds.[/QUOTE]
GDDR5 is just reasonably cheap and fast memory. It's nothing new, since graphics cards have used it for years. It's a first for consoles, though. The point of having high-bandwidth memory like this is mainly to keep the GPU fed with data. GDDR5 in itself probably has little to no benefit for the CPU (DDR3 would likely be sufficient), but having it shared with the GPU could prove beneficial.

So, the memory in itself is just an enabler. Without fast hardware at the other end (i.e. CPU/GPU) it's not going to do any good.
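For reference, a rough sketch of where the oft-quoted 176 GB/s figure comes from, assuming the widely reported 256-bit bus and 5.5 Gbit/s-per-pin GDDR5 (nominal peaks from the announcement), next to a typical DDR3 setup:

[CODE]
# Back-of-the-envelope peak bandwidth, assuming the widely reported
# PS4 figures: a 256-bit GDDR5 bus running at 5.5 Gbit/s per pin.
bus_width_bits = 256           # memory interface width
data_rate_gbit_s = 5.5         # effective transfer rate per pin

bandwidth_gb_s = bus_width_bits * data_rate_gbit_s / 8   # bits -> bytes
print(f"GDDR5 peak: {bandwidth_gb_s:.0f} GB/s")          # ~176 GB/s

# For comparison, a typical dual-channel DDR3-1600 setup (128-bit bus).
ddr3_gb_s = 128 * 1.6 / 8
print(f"DDR3-1600 dual channel: {ddr3_gb_s:.1f} GB/s")   # ~25.6 GB/s
[/CODE]

That gap is the whole point: the headroom is there to keep the GPU fed, exactly as described above.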
 

jlippone

Forum Guru
Dec 2, 2004
#68
[QUOTE="Two4DaMoney, post: 6024217]Tech heads, is GDDR5 the next gen's cell? Some member seems to think it is. I'm sure it's way out of bounds.[/QUOTE]
Not really.
Cell was powerful, but needed a lot of attention and work to get decent results.
A lot of GDDR5 lets coders get results more easily, as they do not need to shuffle data around as much.
 

Jabjabs

Elite Guru
May 10, 2006
www.superderp.com
#69
[QUOTE="Mikael, post: 6024978]GDDR5 is just reasonably cheap and fast memory. It's nothing new, since graphics cards have used it for years. It's a first for consoles, though. The point of having high-bandwidth memory like this is mainly to keep the GPU fed with data. GDDR5 in itself probably has little to no benefit for the CPU (DDR3 would likely be sufficient), but having it shared with the GPU could prove beneficial.
[/QUOTE]

To be fair, DDR3 would probably be a little better for CPU efficiency, but I'm not worried about Jaguar on GDDR5; it looks to be more than capable of working its way around any memory latency issues.
 

Brandon

Administrator
Staff member
Nov 8, 2004
#72
[QUOTE="Ixion, post: 6025448]So from what I understand, the PS3 actually had a higher peak performance than the PS4 in terms of FLOPS. Is that true?[/QUOTE]
No, it didn't. The original FLOPS claim by Sony for the PS3 was largely inflated.
 

Cyn

Veteran
Feb 19, 2008
#74
[QUOTE="Ixion, post: 6025448]So from what I understand, the PS3 actually had a higher peak performance than the PS4 in terms of FLOPS. Is that true?[/QUOTE]
The CELL has a higher peak performance than this CPU from AMD, but not the system as a whole, no. CELL really is a great bit of engineering; it was just too complicated, and I suppose they couldn't make it more user-friendly, or else I'm sure it would be back for the PS4. The PS3 could have been so much more if the CELL wasn't relegated to babysitting the RSX 90% of the time.

The more you look at the PS4 specs and think about what Mark Cerny said, it's apparent that the CPU will probably be nothing more than the doorman and the Radeon chip will be everything, including handling usual CPU tasks.
 

Jabjabs

Elite Guru
May 10, 2006
www.superderp.com
#75
Yeah, Cell has twice the single-precision floating-point performance, but that's a single metric to compare a CPU on. Cell can still outdo the best x86 chips in certain situations, but those are rare and shrinking every day.

Jaguar is much more suitable for its situation: now that the CPU isn't hand-holding the GPU, it doesn't need so much floating-point performance, and as such it's a much more balanced and reasonable CPU as a whole.
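The "twice the single-precision performance" figure falls out of the commonly cited peak numbers; a rough sketch, assuming a 3.2 GHz Cell with 7 enabled SPEs plus the PPE, and eight 1.6 GHz Jaguar cores each issuing a 128-bit FP add and multiply per cycle:

[CODE]
# Rough single-precision peak FLOPS comparison, using commonly cited figures.

# Cell: each SPE does a 4-wide fused multiply-add per cycle -> 8 flops/cycle.
spe_gflops = 7 * 8 * 3.2      # 7 enabled SPEs at 3.2 GHz
ppe_gflops = 1 * 8 * 3.2      # the PPE's VMX unit, same peak rate
cell_total = spe_gflops + ppe_gflops

# Jaguar: a 128-bit add plus a 128-bit multiply per cycle -> 8 flops/cycle per core.
jaguar_total = 8 * 8 * 1.6    # 8 cores at 1.6 GHz

print(f"Cell (PS3):   ~{cell_total:.0f} GFLOPS")            # ~205
print(f"Jaguar (PS4): ~{jaguar_total:.0f} GFLOPS")          # ~102
print(f"Ratio:        ~{cell_total / jaguar_total:.1f}x")   # ~2.0x
[/CODE]

Peak numbers only, of course; sustained throughput on real workloads is a different story, which is the point being made above.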
 

PS4freak

Counting Mod
Staff member
May 15, 2006
Louisiana
#76
The Cell was much more innovative for sure, since it wasn't a customized existing CPU; it was a standalone design made specifically for the PS3. That turned out to be a double-edged sword.
 
Completely Average

Dec 23, 2010
#77
[QUOTE="Jabjabs, post: 6026341]Jaguar is much more suitable for its situation: now that the CPU isn't hand-holding the GPU, it doesn't need so much floating-point performance, and as such it's a much more balanced and reasonable CPU as a whole.[/QUOTE]

Cell was never hand-holding the GPU for floating-point operations. GPU work was offloaded to Cell to make up for a lack of memory bandwidth. The GPU bandwidth on the PS3 was puny. A 32-bit frame buffer/back buffer at 720p could easily consume almost all of the bandwidth by itself, so some GPU work was offloaded to Cell, where the additional bandwidth from the XDR could be used to compensate.

RSX was crippled by the lack of bandwidth. It could easily have done far more with more bandwidth. Just look at what the GPU in the 360 could pull off by simply offloading the frame buffer to the extremely high-bandwidth EDRAM. Without that EDRAM it had less than half the total bandwidth of the PS3 (the 360's shared GDDR3 had about the same bandwidth as RSX alone). Simply offloading that frame buffer to the EDRAM made an absolutely massive difference in the system, though.
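To put numbers on that, here are the commonly published peak bandwidth figures from last gen, side by side (marketing peaks, so treat them as approximate):

[CODE]
# Commonly published peak bandwidth figures from last gen (approximate).
rsx_gddr3  = 128 * 1.3 / 8   # RSX's 256 MB GDDR3 @ 650 MHz  -> 20.8 GB/s
cell_xdr   = 64 * 3.2 / 8    # Cell's 256 MB XDR @ 3.2 GHz   -> 25.6 GB/s
x360_gddr3 = 128 * 1.4 / 8   # 360's 512 MB shared GDDR3     -> 22.4 GB/s
x360_edram = 256.0           # eDRAM-to-ROP bandwidth on the daughter die

print(f"PS3 total (two pools):   {rsx_gddr3 + cell_xdr:.1f} GB/s")  # ~46.4
print(f"360 main memory:         {x360_gddr3:.1f} GB/s")            # ~22.4
print(f"360 eDRAM (ROP traffic): {x360_edram:.0f} GB/s")
[/CODE]

So the 360's main memory really did sit at less than half of the PS3's combined figure, with the eDRAM soaking up the framebuffer traffic.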
 

jlippone

Forum Guru
Dec 2, 2004
#78
RSX was also crippled by comparably weak vertex processing capabilities, which was the reason why most games pruned vertex data with the SPUs before sending it to RSX: they removed non-visible polygons from display lists, converted the lists to better fit RSX caches, and so on.

If Cell did work on pixels, it was carefully planned and never the normal texturing or shading of polygons.

This time the GPU should be quite capable, and with decently low-level access we should see some interesting things.
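To make the pruning idea concrete, here is a toy sketch (purely illustrative Python; the real SPU jobs ran on packed, vectorized vertex streams and also handled frustum culling and cache-friendly reordering) of dropping back-facing triangles from a draw list before it ever reaches the GPU:

[CODE]
from typing import List, Tuple

Vec3 = Tuple[float, float, float]

def sub(a: Vec3, b: Vec3) -> Vec3:
    return (a[0] - b[0], a[1] - b[1], a[2] - b[2])

def cross(a: Vec3, b: Vec3) -> Vec3:
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def dot(a: Vec3, b: Vec3) -> float:
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

def cull_backfaces(tris: List[Tuple[Vec3, Vec3, Vec3]],
                   view_dir: Vec3) -> List[Tuple[Vec3, Vec3, Vec3]]:
    """Keep only triangles whose face normal points toward the viewer."""
    visible = []
    for v0, v1, v2 in tris:
        normal = cross(sub(v1, v0), sub(v2, v0))
        if dot(normal, view_dir) < 0:   # front-facing for this winding order
            visible.append((v0, v1, v2))
    return visible

# Two triangles with opposite winding; only one survives the cull.
draw_list = [((0, 0, 0), (1, 0, 0), (0, 1, 0)),
             ((0, 0, 0), (0, 1, 0), (1, 0, 0))]
pruned = cull_backfaces(draw_list, view_dir=(0, 0, -1))
print(f"{len(draw_list)} triangles submitted, {len(pruned)} after culling")  # 2 -> 1
[/CODE]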
 
Completely Average

Dec 23, 2010
#79
[QUOTE="jlippone, post: 6027606]RSX was also crippled by comparably weak vertex processing capabilities[/QUOTE]

It was no weaker than the 360 GPU.

And I didn't say that Cell was doing a lot of pixel work, just that it was taking over some of the GPU load to reduce bandwidth requirements. One thing that Cell did was antialiasing which normally chews up tons of GPU bandwidth. (That's also why antialiasing was done by the custom logic in the EDRAM in the 360 rather than the GPU)

As far as the PS4 goes, it will surely be far better than the PS3. As I've said before, I expect it to be able to do anything an HD 7850 GPU can do and then a bit more. It appears to have similar raw processing power as an HD 7850, but will surely have some refinements beyond that year+ old GPU.
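The HD 7850 comparison lines up with the raw shader arithmetic, assuming the commonly reported 18 CUs at 800 MHz for the PS4 and the 7850's stock 1024 shaders at 860 MHz:

[CODE]
# Rough peak shader throughput from commonly reported figures.
def gflops(shaders: int, clock_ghz: float) -> float:
    # Each GCN stream processor can issue a fused multiply-add (2 flops) per cycle.
    return shaders * clock_ghz * 2

ps4_gpu = gflops(18 * 64, 0.800)   # 18 CUs x 64 shaders at 800 MHz -> ~1843 GFLOPS
hd_7850 = gflops(1024, 0.860)      # stock HD 7850                  -> ~1761 GFLOPS

print(f"PS4 GPU: ~{ps4_gpu:.0f} GFLOPS")
print(f"HD 7850: ~{hd_7850:.0f} GFLOPS")
[/CODE]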
 

jlippone

Forum Guru
Dec 2, 2004
#80
[QUOTE="Completely Average, post: 6027608]It was no weaker than the 360 GPU.
[/QUOTE]
In terms of vertex input, constants, etc., it is weaker by a mile. (In terms of pixel shading, it wins some and loses some.)
There is a reason why all games do animations, prune geometry, and so on with the SPUs, even though RSX could do it as well.

http://forum.beyond3d.com/showthread.php?t=40458
SPUs are needed to help RSX on vertex jobs to get parity with Xenos.
[QUOTE="Completely Average, post: 6027608]
And I didn't say that Cell was doing a lot of pixel work, just that it was taking over some of the GPU load to reduce bandwidth requirements. One thing that Cell did was antialiasing which normally chews up tons of GPU bandwidth. (That's also why antialiasing was done by the custom logic in the EDRAM in the 360 rather than the GPU)[/QUOTE]
Changing from MSAA to MLAA is not done simply for bandwidth; there are other advantages as well (less work on the GPU, shading, post-processing artifacts, etc.).
Also, the process is done in a completely different phase: MSAA is done during rendering, MLAA is done in post.
[QUOTE="Completely Average, post: 6027608]
As far as the PS4 goes, it will surely be far better than the PS3. As I've said before, I expect it to be able to do anything an HD 7850 GPU can do and then a bit more. It appears to have similar raw processing power as an HD 7850, but will surely have some refinements beyond that year+ old GPU.[/QUOTE]
Oh yes, it's a huge leap in the right direction.
It certainly will be fascinating to see how pipelines will change, especially if the new LibGCM (LibGNM on PS4?) allows varied use of the GPU (a CU job for pruning and sending the result to another CU for rendering, etc.).

Exciting times ahead. :D
 

mynd

Ultimate Veteran
May 3, 2006
Down Under
#81
[QUOTE="Completely Average, post: 6027608]It was no weaker than the 360 GPU.

And I didn't say that Cell was doing a lot of pixel work, just that it was taking over some of the GPU load to reduce bandwidth requirements. One thing that Cell did was antialiasing which normally chews up tons of GPU bandwidth. (That's also why antialiasing was done by the custom logic in the EDRAM in the 360 rather than the GPU)

As far as the PS4 goes, it will surely be far better than the PS3. As I've said before, I expect it to be able to do anything an HD 7850 GPU can do and then a bit more. It appears to have similar raw processing power as an HD 7850, but will surely have some refinements beyond that year+ old GPU.[/QUOTE]

Cell = vertex manipulation
RSX = drawing
Cell = post-processing

All of this was handled by the GPU on the 360.

The RSX was far worse than the 360 GPU.

As for the PS4, I have nothing bad to say about it. Save that perhaps fuck only knows what they will do with all that bandwidth...
 

Jabjabs

Elite Guru
May 10, 2006
www.superderp.com
#82
[QUOTE="Completely Average, post: 6027597]Cell was never hand-holding the GPU for floating-point operations. GPU work was offloaded to Cell to make up for a lack of memory bandwidth. The GPU bandwidth on the PS3 was puny. A 32-bit frame buffer/back buffer at 720p could easily consume almost all of the bandwidth by itself, so some GPU work was offloaded to Cell, where the additional bandwidth from the XDR could be used to compensate.
[/QUOTE]

Yeah, I may have worded that badly. I wasn't implying that Cell directly aided in GPU calculation, but it did play a significant role in assisting RSX, as the above posts have pointed out.
 
Completely Average

Dec 23, 2010
#83
[QUOTE="mynd, post: 6027652]
As for the PS4, I have nothing bad to say about it. Save that perhaps fuck only knows what they will do with all that bandwidth...[/QUOTE]

Funny, I already see the bandwidth as a limiting factor. 1080p @ 60FPS with 4x AA and 16x AF isn't going to be possible in most games with that bandwidth.

For example, to run Crysis 3 on its High Quality settings at 1080p @ 60FPS with 4x AA and 16x AF, you need a minimum of a GTX 590 GPU, which has 331.7 GB/s of bandwidth dedicated to just the GPU. Push the graphics up to Very High Quality settings, and the GTX 690 with 384.5 GB/s of bandwidth is the only GPU on the market that can do it without having to go to an SLI/Crossfire setup.

Now, the workaround for this is easy: render games at 720p native and scale it, or drop frame rate, or drop AA, or drop AF, or some combination. But the limit is there, and it does require a sacrifice. I expect that most games will still be 720p native and scaled to 1080p to preserve frame rate and graphics fidelity at the cost of resolution. Fewer people would notice the scaling than would notice a loss of frame rate, clarity, or graphical features.
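For what it's worth, a back-of-the-envelope of just the framebuffer traffic at 1080p/60 with 4xMSAA; every factor here (overdraw, bytes per sample, what gets re-read) is an assumption picked purely for illustration, and textures, shadow maps, G-buffers, post-processing, and CPU traffic all come out of the same pool:

[CODE]
# Very rough framebuffer-traffic budget at 1080p/60 with 4xMSAA.
# All factors below are illustrative assumptions; real engines vary enormously.
width, height, fps = 1920, 1080, 60
msaa = 4
overdraw = 2.5                      # assumed average shaded layers per pixel

color_write = 4 * msaa              # 32-bit colour, one write per sample
z_traffic   = 4 * msaa * 2          # depth read + write per sample
per_pixel   = (color_write + z_traffic) * overdraw
resolve     = 4 * msaa + 4          # read all samples once, write resolved pixel

bytes_per_frame = (per_pixel + resolve) * width * height
gb_per_second = bytes_per_frame * fps / 1e9
print(f"Framebuffer traffic alone: ~{gb_per_second:.0f} GB/s "  # roughly 17 GB/s
      f"of a 176 GB/s budget")
[/CODE]

How much of the remainder the rest of the frame eats is the real question.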
 

Bigdoggy

Master Guru
Jan 24, 2008
#84
[QUOTE="Completely Average, post: 6027871]Funny, I already see the bandwidth as a limiting factor. 1080p @ 60FPS with 4x AA and 16x AF isn't going to be possible in most games with that bandwidth.

For example, to run Crysis 3 on its High Quality settings at 1080p @ 60FPS with 4x AA and 16x AF, you need a minimum of a GTX 590 GPU, which has 331.7 GB/s of bandwidth dedicated to just the GPU. Push the graphics up to Very High Quality settings, and the GTX 690 with 384.5 GB/s of bandwidth is the only GPU on the market that can do it without having to go to an SLI/Crossfire setup.

Now, the workaround for this is easy: render games at 720p native and scale it, or drop frame rate, or drop AA, or drop AF, or some combination. But the limit is there, and it does require a sacrifice. I expect that most games will still be 720p native and scaled to 1080p to preserve frame rate and graphics fidelity at the cost of resolution. Fewer people would notice the scaling than would notice a loss of frame rate, clarity, or graphical features.[/QUOTE]



Not even Battlefield 3 needs that. It doesn't matter either way. You are using a GTX 690 as an example, which isn't right. I can definitely argue that by the time a high-end GPU was fully utilized by two or maybe three games, the next game needed a better GPU to be used to its full potential anyway. Such a purchase is nothing more than bending over and getting pumpkined by Nvidia. $1000 for a card just for gaming, lolz, oh geez, and a waste at that.

Consoles really don't need to compete with high-end GPUs. In fact, they probably just compete with mid-range GPUs. The main reason games look much better is that people are maxing out their resolutions. That doesn't make the graphics better; it's just a higher resolution. There is a difference.

Also, the majority of PC gamers that I know usually game at around 720p resolution, or rather 1024x768; a few I know go as high as 1280x1024. Either way, you are going to get much better performance with higher settings at a lower resolution, that much is clear anyway. Going any higher for a single monitor is pointless; you'd better be running double or triple monitors.
 