mad respect dude. you're way over my head lol

This, plus they'll be coding much closer to the metal. And although the raw GPU numbers will never surpass the power of a card like the 7970 or GTX 680, the actual game results could easily match and eventually surpass the quality those cards can produce (and isn't that all that really matters?). The reason is the HSA nature of the APU in the PS4. It's basically impossible to directly compare it to a regular CPU/GPU setup the way you could with past APUs, both because of HSA and for the obvious reason that it's being implemented in a console (a closed platform).
Foraeli doesn't understand the point of the asynchronous fine-grain compute. Using the GPU for these normally CPU-intensive calculations only while the GPU isn't maxed out is an extremely efficient way of doing things. It doesn't take away from anything the GPU would have been doing anyway; it just gets more use out of the GPU, leaves more for the CPU to do, and wastes little to no available power. This will also make direct comparisons fall further off the mark than usual when comparing consoles to PCs. As APIs are adjusted to make better use of GPGPU functions, the resulting quality of games will incrementally increase in everything from visual fidelity to AI.
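To make the idea concrete, here's a toy sketch of that scheduling pattern (this is NOT a real PS4 or GNM API, just an illustration): compute jobs get queued with priorities and only drain into whatever idle slots the GPU has left after graphics work.

```python
import heapq

class ComputeQueue:
    """Toy model: prioritized compute jobs that only run in the GPU's
    idle slots left over after graphics work (hypothetical names)."""
    def __init__(self):
        self._jobs = []
        self._order = 0          # tie-breaker so equal priorities stay FIFO
    def submit(self, priority, job):
        heapq.heappush(self._jobs, (priority, self._order, job))
        self._order += 1
    def drain(self, idle_slots):
        done = []
        while self._jobs and idle_slots > 0:
            _, _, job = heapq.heappop(self._jobs)
            done.append(job())
            idle_slots -= 1
        return done

q = ComputeQueue()
q.submit(2, lambda: "physics")
q.submit(1, lambda: "ai-pathing")   # lower number = higher priority
print(q.drain(idle_slots=1))        # only one idle slot: 'ai-pathing' runs first
print(q.drain(idle_slots=4))        # leftover work runs when slots free up
```

The point of the queue is exactly what the post describes: the GPU never sits idle, and the CPU never blocks waiting to hand work over.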
There's also much less overhead in the PS4, with all the dedicated chips (so the specs are much more impressive, knowing almost 100% is going toward pure gaming). The OS is a modified version of FreeBSD 9.0 that has a very small memory footprint and is exceptionally good at multitasking. The secondary ARM processor runs the OS, background processes, HDD access and downloads. The audio chip handles cross-game chat and 200 concurrent MP3 streams in-game. The video encoder/decoder handles the constant 15-minute block of recorded gameplay and Vita Remote Play. And the zlib decompressor extracts compressed media on the fly.

There should really be much more hype surrounding these new AMD APUs with full HSA; it really is the right direction for the future of computing, and PS4 and Xbone are going to be the first pieces of hardware to put it to work. So, in response to Foraeli: taking into account the nature of the hardware architecture, the resulting visual quality and overall scope of games on the PS4 is probably going to be closer to that of a current top-end PC and GPU than the PS3 and 360 were for their time.
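For the zlib part specifically, here's a software sketch of what "on the fly" (streaming) decompression means, using Python's standard zlib module; the PS4 has a dedicated unit doing this in hardware, but the principle is the same: inflate small chunks as they arrive instead of needing the whole compressed file first.

```python
import zlib

data = b"game asset bytes " * 1024          # stand-in for media data
compressed = zlib.compress(data)

# A decompressor object keeps state between calls, so we can feed it
# small chunks "on the fly" rather than the whole buffer at once.
inflater = zlib.decompressobj()
out = bytearray()
for i in range(0, len(compressed), 256):    # feed 256-byte chunks
    out.extend(inflater.decompress(compressed[i:i + 256]))
out.extend(inflater.flush())                # drain anything buffered

assert bytes(out) == data                   # round-trips losslessly
```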
Also, if one wants to read about GCN, this is a nice resource (although without the PS4 modifications).
(This is a bit over my head, so I've been holding out on posting until now.)
So this means that some tasks may have been too ambitious in the past? Tasks that would have had too high a latency to run satisfactorily, maybe presently only possible on PC, may be achievable on this next generation of hardware?
The GPU in the default APU they started with allowed 2 sources of compute commands for handling CPU-style calculations. Mark Cerny took those default 2 sources and upped it to 64, which is what makes it basically fine-grain. This allows many complex CPU computations to be offloaded to the GPU and placed in a queue, which the GPU will automatically sort and run in the most effective and efficient order.
Another thing, a little off topic, that isn't mentioned much about PS4 vs Xbone is the fact that Unix/Linux-OpenGL (PS4) is much more flexible and faster than Windows-DirectX (Xbone). There's an article about Gabe Newell of Valve (Click Here), who's been working hard recently to port Steam and many games over to Linux. He states that in just a few short months of transitioning, he got Left 4 Dead 2 running on Linux using OpenGL at 315fps, compared to Windows 7 using DirectX at 270.6fps - both on the same hardware. That's roughly a 16% increase for Linux/OpenGL; it turns out that OpenGL renders faster than DirectX because it has a smoother and more efficient pipeline. Everyone usually uses DirectX because Microsoft pushes it so hard, and does whatever they can to hamper OpenGL, considering it's direct competition to their DirectX. With everyone's distaste for Windows 8, Gabe's Steam and the PS4 pushing Unix/Linux and a very mature OpenGL, Windows itself could end up having heavy competition, with Linux builds becoming increasingly user-friendly.
Last edited by XtraTrstrL; 07-10-2013 at 15:15.
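Quick sanity check on the percentage from Valve's published numbers (it comes out nearer 16% than 20%):

```python
# Valve's L4D2 figures from the article quoted above
linux_fps, windows_fps = 315.0, 270.6
gain = (linux_fps - windows_fps) / windows_fps
print(f"OpenGL/Linux is {gain:.1%} faster")   # -> 16.4% faster
```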
so how the $#@! do you know all this? with all due respect of course.
if i didn't know any better, i'd think you were into programming/software engineering like mynd and vulgotha. keep it up though, we need more techies here.
The only real 'confirmation' is the KZ:SF tech demo breakdown, which (admittedly on old devkits) showed it using only 6 cores. People presume this means that 2 cores are for the OS. Nothing beyond that.
But man if the OS is really offloaded onto those ARM processors that'd be great!
I'm not sure that the DX/OpenGL stuff is wholly applicable. Isn't PS4 using Libgcm (?) or something? And they're developing PSSL for GPGPU. But neither is using "PC-type" DX or OpenGL abstraction layers, I'm pretty sure. Mynd can correct me if I'm wrong.
Both are fairly down to the metal, but I have heard that PS4 is closer than Xbox One..
And I thought GCN came standard with 2-4 ACEs that can handle 8 queues each? (2x8 or 4x8; in PS4's case it has 8 ACEs, so 8x8=64.)
Last edited by Vulgotha; 07-10-2013 at 16:11.
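The queue arithmetic above, assuming the 8-queues-per-ACE figure quoted in the thread is right:

```python
QUEUES_PER_ACE = 8                     # figure quoted above, assumed correct
stock_gcn_2ace = 2 * QUEUES_PER_ACE    # 16 compute queues
stock_gcn_4ace = 4 * QUEUES_PER_ACE    # 32
ps4_8ace = 8 * QUEUES_PER_ACE          # 64, matching the fine-grain claim
print(stock_gcn_2ace, stock_gcn_4ace, ps4_8ace)   # 16 32 64
```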
Oh no doubt, I'm positive what they're using will be loads more efficient than anything PC's are utilizing in practice (DX or OpenGL).
I've heard similar things about the maturity of the dev tools for Xbone, does not bode well for them. Do we know for sure that they're actually using a Win8 kernel?
I guess on the one hand it makes sense.. They're MS. But on the other, God isn't that a performance pit?
Last edited by Vulgotha; 07-10-2013 at 16:52.
I'm not sure if it's a Win8 kernel; I've read they're using a 64-bit Windows NT 6.x kernel-based OS. It is 3 OSes together though, and they have stated that they are marrying their Windows Phone and Windows 8 user interfaces to the Xbox more closely than ever before. So there very well could be a Win8-based kernel in there. I'd be willing to bet on it actually; they're doing whatever they can to save Windows 8, such as the new DirectX 11.2 (with the new tiled resources to enhance in-game textures) requiring Windows 8.1 to operate. It'd make sense that they'd have some tie-in to Win8 with Xbone, to try and brush aside some of the negativity surrounding it.
Also, on the OS comparison, adding to the bulk of the Xbone's OSes: it reserves not only 3GB of RAM and 2 CPU cores, but also locks off 10% of the GPU. So Xbone games have 5GB of RAM, 6 CPU cores and 90% of the GPU to work with. I guess this stems from them marketing it more as an all-in-one entertainment box, rather than purely a gaming device that can do other tasks, like the PS4. I believe they were banking on the PS4 having a max of 4GB of RAM, because they knew the other dominant hardware pieces in the PS4 wouldn't matter as much as that huge gap in memory. So when Sony surprised everyone, including 1st-party devs, it really threw a monkey wrench into Microsoft's plans. That's when all the scrambling began, and the cancelling of E3 interviews and the E3 roundtable. It's gonna be pretty hard at this point to convince any neutral gamers to pay $100 more for a console that on paper is so obviously outpaced by its main competitor.
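Those reservations as a quick budget, using the figures from the post (assuming 8GB total RAM and 8 CPU cores):

```python
total    = {"ram_gb": 8, "cpu_cores": 8, "gpu_pct": 100}
reserved = {"ram_gb": 3, "cpu_cores": 2, "gpu_pct": 10}   # OS/system share
for_games = {k: total[k] - reserved[k] for k in total}
print(for_games)   # {'ram_gb': 5, 'cpu_cores': 6, 'gpu_pct': 90}
```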
From what I understand though, PS3 and Xbox 360 had a similar GPU reservation in place for their OSes as well. So in that regard, it isn't really 'news' that they reserve GPU time.
And I agree, many of my friends (and myself) have walked away from Xbox One. It's just not worth it. Why pay more, to get less, for a stable of franchises that have grown old to us (Halo)? Obviously others have differing opinions of course, but Sony really knocked it out of the park this time (compared to last gen) and MS seems to want to try and mimic the failures the PS3 had.
Most people who have, say, a 2GB 6950 have felt no need to upgrade.
The PS4 is right in line with that board, and the CPU is being fed 5500MHz memory as well, which can't be done on a PC.
AMD CPUs tend to climb in performance with bandwidth, UNLIKE Intel, which performs most admirably on almost any memory. I'm guessing it was a very good idea to feed Jaguar 5500MHz GDDR5 even if it's not an i7.
When you add in the kind of performance boosts that hUMA would give such a platform, it's VERY TURBO CHARGED and in many ways has efficiencies we won't see on a PC.
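For reference, here's the peak bandwidth that "5500MHz" (5500 MT/s effective) GDDR5 works out to; the 256-bit bus width is my assumption about the PS4's memory interface, not something stated in the post:

```python
effective_rate = 5500e6   # 5500 MT/s effective transfer rate
bus_width_bits = 256      # assumed 256-bit GDDR5 interface
peak_gb_s = effective_rate * bus_width_bits / 8 / 1e9
print(f"{peak_gb_s} GB/s peak")   # -> 176.0 GB/s peak
```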
That's exactly what I want: a console that does games really, really well but also offers something additional like apps.
Xbox One will get a lot more features than the PS4 as a media center but you're also losing out on gaming and if that's ok with you then it's perfect for you. Not for me though, if I wanted an all-in-one device, I'd get a PC.
Yes, there's a difference between an all-in-one PC and a console, and I'd rather have a console than a PC, but until we get to a point where we have enough power to spare... I don't want this route yet. Eventually all consoles will become all-in-one... that is, if we still have physical consoles.
@Sufi, I agree, these consoles aren't ready to be sharing their power for such robust entertainment uses like Xbone is so boldly attempting, unless they had the raw hardware power of a top-end PC/GPU to begin with, which they're far from. Like John Willaford states, the hUMA will definitely give a great performance boost, but this is to help keep the consoles going for a few years as PCs fly even further past their raw computing power. The PS4 has done everything in its power to reduce the overhead of the background OS and extra social functionality, while the Xbone just throws a large chunk of its vital resources away on excessive features that most people won't even use. Many people don't even use cable boxes anymore. I dunno though, they have research teams that put big money into these things, so I guess they know what they're doing. It just doesn't seem to be aimed at hardcore gamers though, that's for sure.
Last edited by XtraTrstrL; 07-10-2013 at 17:55.
Right, so there's a CPU reservation.
Something I've been quite curious about- In all honesty, how useful is an 8 core Jaguar really for a CPU in a game console? We're talking barely i3 performance here..