Oh come on, this is ridiculous. Neither MS nor Sony re-engineered every component of these systems. They used off-the-shelf components and made tweaks to those components to better fit their strategies. I mean, they're using Jaguar CPUs. If they had completely re-engineered the CPUs, they would no longer be called "Jaguar." They also would have lost most if not all of the cost benefits of CHOOSING to go with standard off-the-shelf parts! Logic... simple logic. The biggest tweaks MS had to make were a result of their decision to use DDR3. As a result, they had to budget for a less powerful GPU and use the die space for the ESRAM. They customized some of the buses to assist with this. It's not black magic, people. Just some modifications, nowhere near the scale of a ground-up design. That's like saying Sony DESIGNED a GPU with more shaders and compute units... just not true.
Bottom line, both these machines will have great-looking games, and people who buy them will be happy. On the other hand, for us "tech geeks" it appears the PS4 does have a decent edge in horsepower, but there are questions as to how much that will count for in the real world due to the design modifications MS made, including the ESRAM. Regardless of that, more shaders and compute units should reasonably count for something, at least at some point during the generation. We'll just have to see what the future brings. The software, as always, will truly tell the tale. We may see basic differences in most titles, or only in first-party titles. Possibly none. I don't think it's very likely that the additional horsepower will count for nothing, myself. But there's simply no way to prove it either way at this point.
It just seems that some people are making almost mystical arguments for one side or the other at this point. I think that's foolhardy when we don't even have really solid demos yet and certainly haven't seen how the launch titles will finally shape up. Rather than theory and arguments based on how we think things may be used, I think the safest bet for now is to stick to the known solid numbers, IF that kind of point can or needs to be argued at ALL. Otherwise we run the risk of looking mighty silly over the coming months and years.
I'm clearly a Sony fan, but I'm not gonna try and predict the future, although I do feel the raw numbers will speak for themselves in the end. I also have complete faith in Mark Cerny; he's a godsend. You can bet Microsoft's design process was a lot more cold and calculating, just like all their anti-consumer policies that got shot down recently. Sony allowing a prodigy like Cerny to be the architect of a system designed for developers and gamers alike is nothing short of amazing. I mean, he even had id Software's John Carmack praising his design choices:
"I can’t speak freely about PS4, but now that some specs have been made public, I can say that Sony made wise engineering choices." ~ John Carmack
That is saying a heck of a lot. I mean, Cerny is only the 13th inductee into the Academy of Interactive Arts & Sciences (AIAS) Hall of Fame.
Cerny will join an elite group of 12 other interactive entertainment industry luminaries in the AIAS Hall of Fame: Trip Hawkins (Electronic Arts), Peter Molyneux (Lionhead Studios), Yu Suzuki (Sega), Will Wright (Maxis), John Carmack (id Software), Hironobu Sakaguchi (Square Enix), Sid Meier (Firaxis Games), Shigeru Miyamoto (Nintendo), Richard Garriott (Origin Systems), Dan/Danielle Bunten Berry (Ozark Softscape), Michael Morhaime (Blizzard Entertainment) and Bruce Shelley (Ensemble Studios).
Anything can happen throughout this next-gen console war, but I'm confident that I'll be making the right choice by sticking with Sony's PS4.
LOL... are we really comparing a Bonaire-architected GPU to a Pitcairn mated with 5 GHz RAM?
That's not truly going to end well. Especially if that GPU can access cache off the CPU using SPURS.
Wow, this thread has gone places and a lot of interesting discussion.
@MYND - I agree with a lot of what you say, especially that building an SoC and the architecture around it makes a night-and-day difference in how a system can perform given the same CPU and GPU cores.
That said, I'm not sure I'd be so adamant that MS has tweaked every component here and Sony hasn't. The only real point of reference we have is that MS has stated they've touched every component inside, and Sony hasn't said $#@! about what they've done. They've just said, sure, it's based on Jaguar. But we know for a fact that it's more than just Jaguar, as the SoC has direct GDDR5 support with a huge bus. They've also mentioned the GPU's ability to perform async compute (the "Async CPs" this thread started about).
The fact is we don't have the specs for either of these APUs, and everything is speculation except for the specific details that have been provided, which are basically the external memories and the CPU cores. Nothing about the SoC architecture itself has really been spelled out beyond some guesswork.
@XTL - You mentioned earlier that "the specs say..." Please provide a reference to actual specs, or don't reference the imaginary.
I personally live and breathe SoC architecture on a daily basis on large multicore platforms. It's been interesting reading this thread for sure, but without specs it's still all hearsay.
I would love to pick the system architects' brains for both of these systems for a few days. That would be great. And with all due respect to Cerny, while he drove the overall vision and some top-level details regarding the SoC, he was most likely about as much of a system architect on the project as our advanced R&D folks. While they have great ideas and are always looking at the future, they miss a lot of the nuts and bolts that really pull an SoC together and make it a Tier 1 performer.
My main point, not aimed at you specifically, is that MS needed to make these customizations because of the memory situation. They needed to find a way to improve the available bandwidth, or accept a larger performance penalty, since they were using DDR3 memory. That's all.
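The bandwidth gap being talked about here can be sketched with some quick arithmetic. These are the publicly quoted launch-era figures; treat them as ballpark peak numbers, not measured effective bandwidth:

```python
# Rough peak-bandwidth arithmetic behind the DDR3-vs-GDDR5 trade-off.
# Figures are the commonly quoted ones: 256-bit buses on both consoles,
# DDR3 at 2.133 Gbit/s per pin vs GDDR5 at 5.5 Gbit/s per pin.

def peak_bw_gb_s(bus_width_bits, data_rate_gbit_s_per_pin):
    """Peak memory bandwidth in GB/s for a given bus width and per-pin rate."""
    return bus_width_bits / 8 * data_rate_gbit_s_per_pin

ddr3 = peak_bw_gb_s(256, 2.133)   # Xbox One main memory: ~68.3 GB/s
gddr5 = peak_bw_gb_s(256, 5.5)    # PS4 main memory: 176.0 GB/s

print(f"DDR3:  {ddr3:.1f} GB/s")
print(f"GDDR5: {gddr5:.1f} GB/s")
print(f"Gap the ESRAM has to help bridge: {gddr5 - ddr3:.1f} GB/s")
```

That roughly 108 GB/s gap in peak main-memory bandwidth is exactly why the ESRAM and the move engines exist; whether they close it in practice is the open question the thread keeps circling.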
Personally, I believe the actual difference in shaders and compute units SHOULD make for visible differences between the platforms, similar to the way multi-platform titles differed this generation. I say similar because I think it should be as widespread, but NOT as visible to the layman, if that makes sense. Perhaps internal rendering resolution differences, primarily. But overall, whether or not I believe a difference is likely, it's way too early for me or anyone else, IMHO, to say that all will be equal OR that there will be a noticeable difference. It will depend oh so highly on the developers and their code. Should be VERY interesting to see, and I'm very curious. IF, on the one hand, the extra horsepower does make a difference, will it make a difference to sales, especially if it isn't super obvious? Even if it IS an obvious difference, will it come down to a war of games versus entertainment/TV functionality? Or, if there is no real difference, what does that say about the current state of game development, if all those extra shaders, etc. actually result in little to nothing?
That's what I'm interested in seeing this generation. Unfortunately, I think it's just WAY too early to tell anything definitive. I THINK you seem to be "arguing" (for lack of a better term) that the ESRAM and other MS tweaks should negate, or largely negate, the bandwidth advantage the PS4 has from using GDDR memory? That may be correct or largely correct; we will see. I'm of the mind that it will make up for a good bit, but not all, of the difference. I'm more curious about what difference, if any, all those extra shaders and compute units will bring even if bandwidth IS equal. Hopefully that makes my statements a little clearer.
I think it would be incredibly silly to think that Sony would put a product with an expected shelf life of 10 years out in the wild and NOT spend any time optimizing its parts or architecting it to go the distance, especially when this is the same company that designed the Emotion Engine and the Cell Broadband Engine.
I can't help but giggle, with much malicious intent, at how people talk about Move Engines and ESRAM as if they're some mystical wizard $#@!, but completely blow off Sony when they talk about SPURS and "Supercharged PC Architecture." For some people, Sony is pissing in the wind, and that is not lost on me whatsoever. But these same people sing the song that GDDR makes for bad system memory and warn us of "latency." Carry on, don't let me stop you. It was obviously catastrophic to developers when GDDR3 was used as a shared pool of system RAM on the Xbox 360.
MS took their route and Sony took theirs, and Sony's gamble paid off in sheer horsepower, while MS can show you your own already-paid-for cable TV system with MORE advertising on their Xbox One, and if you're into fantasy sports they'll show you your standings live during the game. OH WHOOPIE!!!
Microsoft isn't the world's best hardware designer. JUST LOOKING AT THE INCREDIBLY STUPID EXTERIOR DESIGN of the XBOne makes me laugh. It's not for ANYONE. It's designed like a receiver, which means MS wants you to hide it and protect it. Don't let it be around kids; I'll say that now and let you figure out what I mean if you lack that bit of common sense.
Also, please give me a complete architectural breakdown of the ring bus you just said is proposed, and the method by which it is beneficial for the VSHELL (is that the hypervisor, or what?) to skip past it. I've been looking at leaked block diagrams that look reasonable and reading all of the officially released data, summarizing it as best I can given a reasonable amount of IT and programming experience. I look forward to your own personal analysis. Thanks!
Last edited by John Willaford; 07-11-2013 at 18:38.
Exactly. MS did a good job with the first two Xboxes, and now they're hitting their turbulence.
This machine was designed too much around trying to adapt hardware to a software model. NEVER do that when all parts of your ecosystem have bandwidth problems!!!!
Their data move engines apparently have a lot to do with their GRAINS programming model. I don't want to get into it, but I also don't want to be explaining in a year, when the hiccups become too hard to manage, why stupid things like sending GRAINS out to the cloud to be processed result in delays and continuity issues in games. This isn't a high-speed LAN environment; it's an unpredictable WWAN.
MS will then just tell people to buy faster monthly internet to support their model!!! Except, while I can do that, it's not available to a load of once-loyal Xbox owners. That's what I think will happen.
Don't forget, THESE ASSHOLES HAVE A GUY IN CHARGE OF A GAMES STUDIO WHO JUST SAID THAT THINGS LIKE LIGHTING AND WATER EFFECTS CAN BE CALCULATED IN THE CLOUD... So you're going to send those GRAINS (a kind of program object, apparently, agnostic as to whether it's processed in the cloud or on the local box) out to the cloud, and my game is supposed to operate correctly and respond well? I DON'T THINK SO.
When the XBOne tries to use the cloud, the PS4 will still be blazing through things locally.
I sincerely hope that guy was given a load of crap piece of paper to recite rather than actually believing that.
The ESRAM can only increase the speed of GPU-related computational effects. Sony completely bled MS's advantage dry when they went with GDDR5 system-wide! Ever since the x86-64 architecture debuted, there's been a VERY noticeable direct relationship between AMD processors and BANDWIDTH. Jaguar cores hooked to 5500 MHz-effective memory are going to test that theory to the extreme!
Things will be better once games are out. The first couple of months, some will be saying "I have Ryse, you don't" and "I have Killzone, you don't," etc., and comparing Forza's textures vs. Drive Club's, and then we'll see how that $100-vs-camera argument pans out. Hopefully nobody spills anything on their XBOne, because it's going to short out immediately with that open top.
Too bad we don't have specifications, we only have speculations - too bad some people confuse the two.
I should have put that in my initial response, but just to be clear, I'm aware you weren't saying the processors were identical. I was just quoting your material and using it as a response to the many, many times and places I've seen people say that the two systems' CPU and GPU are identical or "virtually identical," when they are clearly not. No worries!
Remember, it appears Sony's engineers were on the job and did a lot to validate Jaguar and hUMA against the GDDR5 memory and the additional bus work that was done. We have hUMA and the Jaguar cores on GDDR5 bandwidth now; just one more tick in their favor.
Pulled from one of my older threads.
"The Biggest Thing" About the PlayStation 4
What Does 'Supercharged' Mean, Anyway?
Familiar Architecture, Future-Proofed
Enabling the Vision: How Sony Modified the Hardware
Launch and Beyond
Freeing Up Resources: The PS4's Dedicated Units [game installs; load times]
Sounds Good, But... Bottlenecks?
"Worried about GDDR5 latencies? Don't be. Large cache, very capable bus, shortcuts and clever data transferring between CPU and GPU make that a non-issue."
"The PS4 uses a state-of-the-art heterogeneous processor architecture from AMD (the so-called "HSA") which combines CPU and GPU in one single chip. To ensure that such a heterogeneous processor can deliver maximum bandwidth for rendering and minimum latency for computing, AMD integrates a special DRAM controller. This DRAM controller allows the CPU memory controller to have low-latency access while at the same time the GPU memory controller can burst-access the RAM. That's why Sony can go for maximum bandwidth with one big GDDR5 RAM pool without having any headaches because of latency."
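One way to see why latency worries the GPU side of that quote so little is Little's Law: a throughput-oriented client just needs enough requests in flight to cover the round-trip time. A quick sketch below; the 200 ns round-trip figure is an illustrative assumption, not a published spec:

```python
# Little's Law applied to memory: concurrency = bandwidth x latency.
# A GPU hides DRAM latency by keeping many requests outstanding, which
# is why raw latency matters far less to it than to a CPU.

def bytes_in_flight(bandwidth_gb_s, latency_ns):
    """Outstanding bytes needed to sustain a bandwidth at a given latency.
    (GB/s x ns conveniently cancels to plain bytes: 1e9 * 1e-9 = 1.)"""
    return bandwidth_gb_s * latency_ns

bw = 176.0        # GB/s, the PS4's quoted peak
latency = 200.0   # ns, an assumed round-trip DRAM latency (illustrative)

outstanding = bytes_in_flight(bw, latency)   # 35,200 bytes
cache_lines = outstanding / 64               # ~550 64-byte requests

print(f"{outstanding:.0f} bytes (~{cache_lines:.0f} 64-byte requests) in flight")
```

With thousands of threads resident, a GCN-class GPU can keep that many requests outstanding without breaking a sweat; a latency-sensitive CPU core cannot, which is the asymmetry the quoted dual-path DRAM controller is said to address.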
So the only benefit the Xbone had from the slower DDR3 is pretty much factored out by the PS4's architecture.
On top of that, GDDR5, architecturally, does not suffer from overly latent memory throughput anyway. The PS4 just happens to minimize that effect to a greater degree, due to what Xtra noted.
GDDR5 operates with two different clock types: a differential command clock (CK) as a reference for address and command inputs, and a forwarded differential write clock (WCK) as a reference for data reads and writes. To be more precise, GDDR5 SGRAM uses two write clocks, each assigned to two bytes. The WCK runs at twice the CK frequency. Taking a GDDR5 part with a 5 Gbit/s data rate per pin as an example, the CK clock runs at 1.25 GHz and WCK at 2.5 GHz. The CK and WCK clocks are aligned during the initialization and training sequence, and this alignment allows read and write access with minimum latency. Most modern graphics cards now use this technology, as does the new PlayStation 4.
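The clock relationships above reduce to two doublings (data is transferred on both edges of WCK, and WCK runs at twice CK), which makes them easy to sanity-check:

```python
# GDDR5 clock relationships: per-pin data rate = 2 x WCK (double data rate),
# and WCK = 2 x CK (write clock runs at twice the command clock).

def gddr5_clocks(data_rate_gbit_s):
    """Return (CK, WCK) in GHz for a given per-pin data rate in Gbit/s."""
    wck = data_rate_gbit_s / 2   # two data transfers per WCK cycle
    ck = wck / 2                 # WCK runs at twice the CK frequency
    return ck, wck

# The 5 Gbit/s example from the text:
ck, wck = gddr5_clocks(5.0)
print(f"CK = {ck} GHz, WCK = {wck} GHz")   # CK = 1.25 GHz, WCK = 2.5 GHz

# The PS4's 5.5 Gbit/s parts work out to:
print(gddr5_clocks(5.5))
```

Running the same arithmetic on the PS4's 5.5 Gbit/s parts gives CK = 1.375 GHz and WCK = 2.75 GHz.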
The PS4 is a monster. It is architected with pretty good performing parts. It has been enhanced by Sony to be highly relevant in game performance, and what independent processors there are on board exist to offload a whole lot of the not-so-directly-game-related stuff to make room for it (video capture, background downloading, Vita mobile integration, etc.).
I'm glad to see this thread has been flooded with great information and members offering to explain the more confusing bits of all this techno mumbo jumbo. Some great reading.
Seems to me that, to summarize, Sony and AMD have definitely modified the PS4 to make it very graphics-capable, while at the same time future-proofing it for any new features it may gain down the road.
Microsoft and AMD have modified the Xbone not so much for graphics, but for all the various features the Xbone will support: the three OSes, multitasking, Kinect 2.0, DVR functionality, etc.
So both have been tweaked for their specific purposes.
There's no point in arguing which is more graphics savvy since after reading all this it's quite clear the PS4 is the winner. But the Xbone serves a clearly different purpose.
Both are going to churn out great games, but graphically the PS4 has the edge... as well as the SDKs... to make devs very very happy... and they've made a point of telling us so.
Unlike this current gen, where the CELL architecture was a pain in the ass to develop for, even with SPURS, the PS4 will just be raw power that's simple to program for, which should also cut development time considerably.
Last edited by Brandon; 07-11-2013 at 21:27.