  1. #176
    Superior Member
    jphuff's Avatar
    Join Date
    Jun 2007
    Location
    Phoenix, Arizona
    PSN ID
    jhuff
    Posts
    854
    Rep Power
    59
    Points
    11,941 (0 Banked)
    Oh come on, this is ridiculous. Neither MS nor Sony re-engineered every component of these systems. They used off-the-shelf components and made tweaks to those components to better fit their strategies. I mean, they're using Jaguar CPUs. If they had completely re-engineered the CPUs, they would no longer be called "Jaguar." They also would have lost most if not all of the cost benefits of CHOOSING to go with standard off-the-shelf parts! Logic... simple logic. The biggest tweaks MS had to make were a result of their decision to use DDR3. As a result, they had to budget for a less powerful GPU and use the die space for the ESRAM. They customized some of the buses to assist with this. It's not black magic, people. Just some modifications, nowhere near the scale of a ground-up design. That's like saying Sony DESIGNED a GPU with more shaders and compute units... just not true.
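    To put that DDR3 tradeoff in rough numbers, here's a back-of-envelope sketch in Python using the bus widths and transfer rates from the pre-launch leaks (the ESRAM figure especially is a reported number, not a confirmed spec):

        # Peak theoretical bandwidth: (bus width in bits / 8) * transfer rate in GT/s.
        def peak_bandwidth_gbs(bus_bits, gigatransfers_per_sec):
            return bus_bits / 8 * gigatransfers_per_sec

        ps4_gddr5 = peak_bandwidth_gbs(256, 5.5)    # 176.0 GB/s
        xb1_ddr3  = peak_bandwidth_gbs(256, 2.133)  # ~68.3 GB/s
        print(f"PS4 GDDR5: {ps4_gddr5:.1f} GB/s, XB1 DDR3: {xb1_ddr3:.1f} GB/s")
        # The gap is what the 32 MB of ESRAM (reported at roughly 102 GB/s at the
        # time) is meant to cover, for whatever working set fits inside it.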

    Bottom line, both of these machines will have great-looking games, and people who buy them will be happy. On the other hand, for us "tech geeks," it appears the PS4 does have a decent edge in horsepower, but there are questions as to how much that will count for in the real world due to the design modifications MS made, including the ESRAM. Regardless, more shaders and compute units should reasonably account for something, at least at some point during the generation. We'll just have to see what the future brings. The software, as always, will truly tell the tale. It may be that we see some basic differences in most titles, or only in first-party titles. Possibly none. I don't think it's very likely that the additional horsepower will count for nothing, myself. But there's simply no way to prove it either way at this point.

    It just seems that some people are making almost mystical arguments for one side or the other at this point. I think that's foolhardy when we don't even have really solid demos yet and certainly haven't seen how the launch titles will finally shape up. Rather than theories and arguments based on how we think things may be used, I think the safest bet at this point is to use only the known solid numbers, IF that kind of point can or needs to be argued at ALL. Otherwise we run the risk of looking mighty silly over the coming months/years.

    All you have to decide is what to do with the time given to you....
    http://www.woodchuckproductions.net

  2. Likes Two4DaMoney, Omar likes this post
  3. #177
    Forum Guru
    KnotGamer901's Avatar
    Join Date
    Jan 2010
    Location
    NC
    PSN ID
    markiemark901
    Age
    23
    Posts
    3,669
    Rep Power
    59
    Points
    30,525 (0 Banked)
    Quote Originally Posted by XtraTrstrL View Post
    Both seem to be semi-customized Jaguar APUs, although there were earlier comparisons that showed the CPU of the Xbone as being an 8-core Microsoft custom CPU (mainly before the Xbone's official specs were released). Going by both specs though, it seems the PS4 has been just as heavily modified as the Xbone. The PS4 has upped the GPGPU compute sources from 2 to 64, added an extra bus from the GPU to the RAM, etc. Adding eSRAM doesn't make the Xbone more heavily customized than the PS4. The eSRAM is a necessity to offset the limited bandwidth of the DDR3 RAM, which isn't ideal for modern GPU rendering. Sony even mentioned that they could have halved the 256-bit bus to 128-bit, bringing the bandwidth down from 176 GB/s to 88 GB/s, and used eDRAM to bring it back up to over 1 TB/s, but they chose to stick with what they had because the eDRAM would add an extra hurdle for devs, who would have to decide when and what goes down that small 32 MB pipeline at a time.

    "However John Taylor, head of marketing for AMD's Global Business Units, said that a version of the same chip without Sony's technology will be available for consumers later this year.

    Taylor told The INQUIRER that the AMD branded APU chip will not have the same number of cores or the same computing capability as Sony's part.

    He said, "Everything that Sony has shared in that single chip is AMD [intellectual property], but we have not built an APU quite like that for anyone else in the market. It is by far the most powerful APU we have built to date, it leverages [intellectual property] that you will find in our A-series APUs later this year, our new generation of APUs but none that will quite be to that level of sheer number of cores, sheer number of teraflops.""

    This says a lot about power, but there's a plethora of info covering the sheer amount of customization that's been done on the PS4. Microsoft has done a lot on their end too, like the added move engines in their box and such, but both APUs are based on a custom 8-core Jaguar APU at their core, and neither is completely built from the ground up.
    Fantastic information, because before your post I had no idea in hell what was going on, but now I'm very enlightened. +rep, señor



    Currently Playing: COD Ghosts, Battlefield 4, Assassin's Creed 4

  4. #178
    Superior Member
    jphuff's Avatar
    Join Date
    Jun 2007
    Location
    Phoenix, Arizona
    PSN ID
    jhuff
    Posts
    854
    Rep Power
    59
    Points
    11,941 (0 Banked)
    Quote Originally Posted by XtraTrstrL View Post
    Both seem to be semi-customized Jaguar APUs, although there were earlier comparisons that showed the CPU of the Xbone as being an 8-core Microsoft custom CPU (mainly before the Xbone's official specs were released). Going by both specs though, it seems the PS4 has been just as heavily modified as the Xbone. The PS4 has upped the GPGPU compute sources from 2 to 64, added an extra bus from the GPU to the RAM, etc. Adding eSRAM doesn't make the Xbone more heavily customized than the PS4. The eSRAM is a necessity to offset the limited bandwidth of the DDR3 RAM, which isn't ideal for modern GPU rendering. Sony even mentioned that they could have halved the 256-bit bus to 128-bit, bringing the bandwidth down from 176 GB/s to 88 GB/s, and used eDRAM to bring it back up to over 1 TB/s, but they chose to stick with what they had because the eDRAM would add an extra hurdle for devs, who would have to decide when and what goes down that small 32 MB pipeline at a time.

    "However John Taylor, head of marketing for AMD's Global Business Units, said that a version of the same chip without Sony's technology will be available for consumers later this year.

    Taylor told The INQUIRER that the AMD branded APU chip will not have the same number of cores or the same computing capability as Sony's part.

    He said, "Everything that Sony has shared in that single chip is AMD [intellectual property], but we have not built an APU quite like that for anyone else in the market. It is by far the most powerful APU we have built to date, it leverages [intellectual property] that you will find in our A-series APUs later this year, our new generation of APUs but none that will quite be to that level of sheer number of cores, sheer number of teraflops.""

    This says a lot about power, but there's a plethora of info covering the sheer amount of customization that's been done on the PS4. Microsoft has done a lot on their end too, like the added move engines in their box and such, but both APUs are based on a custom 8-core Jaguar APU at their core, and neither is completely built from the ground up.
    See, this is the other thing bugging me a little bit. A lot of people keep saying that the CPUs and GPUs for the Xbox One and the PS4 are "identical" or "virtually identical" when they are clearly not. Like it or not, right now, at least by "specs" and "on paper," the PS4 is clearly the more powerful piece of hardware. The question again comes back to how this will manifest in software/games, or IF it will manifest at all. Again I say that logically, we should see a difference. But that doesn't mean that we will CERTAINLY see a difference.

    All you have to decide is what to do with the time given to you....
    http://www.woodchuckproductions.net

  5. #179
    Apprentice
    XtraTrstrL's Avatar
    Join Date
    Jun 2013
    PSN ID
    XtraTrstrL
    Posts
    339
    Rep Power
    15
    Points
    4,813 (0 Banked)
    Quote Originally Posted by jphuff View Post
    See, this is the other thing bugging me a little bit. A lot of people keep saying that the CPUs and GPUs for the Xbox One and the PS4 are "identical" or "virtually identical" when they are clearly not. Like it or not, right now, at least by "specs" and "on paper," the PS4 is clearly the more powerful piece of hardware. The question again comes back to how this will manifest in software/games, or IF it will manifest at all. Again I say that logically, we should see a difference. But that doesn't mean that we will CERTAINLY see a difference.
    Well, saying that they are both semi-custom Jaguar APUs isn't saying that they are "identical". Keeping things simple, the PS4's raw GPU compute power is 50% greater than the Xbone's. As of the moment, with the PS4's programming ease of use and the Xbone's lack of a comparably mature API environment, it's looking like PS4 exclusives will at least start out looking a little more sparkly than Xbone exclusives. Any multiplatform games made by devs that are known for not taking port shortcuts, like DICE with Battlefield, should really let us see what the difference is right off the bat. Things are still early though, and there are a lot more advancements to be made with the APIs, especially creating more HSA/hUMA-compatible APIs that are designed to use the CPU and GPU in unison from the ground up. It can take a year or two before the more capable dev teams really start digging their heels in and pushing the hardware to the fullest, but when they do, oh boy, it's gonna get crazy.
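    For anyone wondering where that 50% figure comes from, here's the arithmetic as a small Python sketch. It uses the leaked CU counts (18 vs 12) and the 800 MHz clock that was being reported for both GPUs at the time; none of these numbers were official:

        # GCN peak single-precision throughput:
        # CUs * 64 lanes * 2 ops/cycle (fused multiply-add) * clock in GHz = GFLOPS.
        def gcn_peak_tflops(compute_units, clock_ghz):
            return compute_units * 64 * 2 * clock_ghz / 1000.0

        ps4 = gcn_peak_tflops(18, 0.8)  # ~1.84 TFLOPS
        xb1 = gcn_peak_tflops(12, 0.8)  # ~1.23 TFLOPS
        print(f"PS4 {ps4:.2f} TF vs XB1 {xb1:.2f} TF -> {ps4 / xb1:.2f}x")  # 1.50x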

    I'm clearly a Sony fan, but I'm not gonna try and predict the future, although I do feel the raw numbers will speak for themselves in the end. I also have complete faith in Mark Cerny; he's a godsend. You can bet Microsoft's design process was a lot more cold and calculating, just like all their anti-consumer policies that got shot down recently. Sony allowing a prodigy like Cerny to be the architect of a system designed for developers and gamers alike is nothing short of amazing. I mean, he even had id Software's John Carmack praising his design choices:

    "I can’t speak freely about PS4, but now that some specs have been made public, I can say that Sony made wise engineering choices." ~ John Carmack

    That is saying a heck of a lot. I mean, Cerny is only the 13th inductee into The Academy of Interactive Arts & Sciences (AIAS) Hall of Fame:

    Cerny will join an elite group of 12 other interactive entertainment industry luminaries in the AIAS Hall of Fame: Trip Hawkins (Electronic Arts), Peter Molyneux (Lionhead Studios), Yu Suzuki (Sega), Will Wright (Maxis), John Carmack (id Software), Hironobu Sakaguchi (Square Enix), Sid Meier (Firaxis Games), Shigeru Miyamoto (Nintendo), Richard Garriott (Origin Systems), Dan/Danielle Bunten Berry (Ozark Softscape), Michael Morhaime (Blizzard Entertainment) and Bruce Shelley (Ensemble Studios).

    Anything can happen throughout this next-gen console war, but I'm confident that I'll be making the right choice by sticking with Sony's PS4.

  6. #180
    Power Member
    mynd's Avatar
    Join Date
    May 2006
    Age
    41
    Posts
    17,367
    Rep Power
    160
    Points
    155,864 (0 Banked)
    Items User name style
    Achievements IT'S OVER 9000!
    Quote Originally Posted by jphuff View Post
    Oh come on, this is ridiculous. Neither MS nor Sony re-engineered every component of these systems. They used off-the-shelf components and made tweaks to those components to better fit their strategies. I mean, they're using Jaguar CPUs. If they had completely re-engineered the CPUs, they would no longer be called "Jaguar." They also would have lost most if not all of the cost benefits of CHOOSING to go with standard off-the-shelf parts! Logic... simple logic. The biggest tweaks MS had to make were a result of their decision to use DDR3. As a result, they had to budget for a less powerful GPU and use the die space for the ESRAM. They customized some of the buses to assist with this. It's not black magic, people. Just some modifications, nowhere near the scale of a ground-up design. That's like saying Sony DESIGNED a GPU with more shaders and compute units... just not true.

    Bottom line, both of these machines will have great-looking games, and people who buy them will be happy. On the other hand, for us "tech geeks," it appears the PS4 does have a decent edge in horsepower, but there are questions as to how much that will count for in the real world due to the design modifications MS made, including the ESRAM. Regardless, more shaders and compute units should reasonably account for something, at least at some point during the generation. We'll just have to see what the future brings. The software, as always, will truly tell the tale. It may be that we see some basic differences in most titles, or only in first-party titles. Possibly none. I don't think it's very likely that the additional horsepower will count for nothing, myself. But there's simply no way to prove it either way at this point.

    It just seems that some people are making almost mystical arguments for one side or the other at this point. I think that's foolhardy when we don't even have really solid demos yet and certainly haven't seen how the launch titles will finally shape up. Rather than theories and arguments based on how we think things may be used, I think the safest bet at this point is to use only the known solid numbers, IF that kind of point can or needs to be argued at ALL. Otherwise we run the risk of looking mighty silly over the coming months/years.
    Yeah, lol, I should never have used the phrase "ground up." People have mistaken that to mean that the core components are somehow wildly different.

  7. #181
    Ultimate Veteran
    Lefein's Avatar
    Join Date
    Jun 2005
    Age
    33
    Posts
    22,962
    Rep Power
    193
    Points
    106,264 (0 Banked)
    Achievements IT'S OVER 9000!
    LOL... Are we really comparing a Bonaire-architected GPU to a Pitcairn mated with 5 GHz RAM?

    That's not truly going to end well. Especially if that GPU can access cache off the CPU using SPURS.

  8. #182
    Elite Guru
    TAZ427's Avatar
    Join Date
    Nov 2007
    Location
    Sugar Land, TX
    PSN ID
    TAZ427
    Age
    43
    Posts
    5,095
    Rep Power
    74
    Points
    21,115 (0 Banked)
    Wow, this thread has gone places, and there's been a lot of interesting discussion.

    @MYND - I agree with a lot of what you say - especially that building an SoC and the architecture around it makes night-and-day differences as to how a system can perform given the same CPU and GPU cores.

    That said, I'm not sure I'd be so adamant that MS has tweaked every component here and Sony hasn't. The only real point of reference we have to go on is that MS has stated that they have touched every component inside, and Sony hasn't said $#@! about what they've done. They've just said sure, it's based on Jaguar. But we know for a fact that it's more than just Jaguar, as the SoC has direct GDDR5 support with a huge-ass bus. They've also mentioned the GPU's ability to perform asynchronous compute (which is what this thread started about).

    The fact is we don't have the specs for either of these APUs, and everything is speculation except for the specific details that have been provided, which are basically the external memories and the CPU cores. Nothing about the SoC architecture itself has really been spelled out beyond some guesswork.

    @XTL - You mentioned "the specs say ..." earlier. Please provide a reference to actual specs, or don't reference the imaginary.

    I personally live and breathe SoC architecture on a daily basis on large multicore platforms. It's been interesting reading this thread for sure, but without specs it's still all hearsay.

    I would love to pick the system architects' brains for both of these systems for a few days. That would be great. And with all due respect to Cerny, while he drove the overall vision and some top-level details regarding the SoC, he most likely was about as much of a system architect on the project as our advanced R&D folks are. While they have great ideas and are always looking at the future, they miss a lot of the nuts and bolts that really pull an SoC together and make it a Tier 1 performer.



  9. Likes Omar likes this post
  10. #183
    Superior Member
    jphuff's Avatar
    Join Date
    Jun 2007
    Location
    Phoenix, Arizona
    PSN ID
    jhuff
    Posts
    854
    Rep Power
    59
    Points
    11,941 (0 Banked)
    Quote Originally Posted by mynd View Post
    Yeah, lol, I should never have used the phrase "ground up." People have mistaken that to mean that the core components are somehow wildly different.
    I hadn't actually realized that when I wrote this, but on second glance, yeah... LOL. I think I see what you're trying to say though. You're saying that with the move units and the ESRAM, etc., Microsoft did more customization to the parts they purchased from AMD, correct? Is that it? If so, I think you're probably correct.

    My main point, not aimed at you specifically, is that MS needed to do these customizations because of the memory situation. They needed to find a way to improve the available bandwidth, or accept a larger performance penalty, since they were using DDR3 memory. That's all.

    Personally, I believe that the actual difference in shaders and compute units SHOULD make for visible differences between the platforms, similar to the way that multi-plat titles differed this generation. I say similar because I think it should be as widespread but NOT as visible to the layman, if that makes sense. Perhaps internal rendering resolution differences primarily. But overall, whether or not I believe a difference is likely, it's way too early for me or anyone else, IMHO, to say that all will be equal OR that there will be a noticeable difference. It will depend oh so highly on the developers and their code. It should be VERY interesting to see, and I'm very curious. IF, on the one hand, the extra horsepower does make a difference, will it make a difference to sales, especially if it isn't super obvious? Even if it IS an obvious difference, will it come down to a war between "games versus entertainment/TV functionality"? Or, if there is no real difference, what does that say about the current state of game development, if all those extra shaders, etc. actually result in little to no difference?

    That's what I'm interested in seeing this generation. Unfortunately, I think it's just WAY too early to tell anything definitive. I THINK you seem to be "arguing" (for lack of a better term) that the ESRAM and other MS tweaks should negate, or largely negate, the bandwidth advantage the PS4 has from using GDDR memory? That may be correct or largely correct; we will see. I'm of the mind that it will make up a good bit, but not all, of the difference. I'm more curious about what difference, if any, all those extra shaders and compute units will bring even if bandwidth IS equal. Hopefully that makes my statements a little clearer.
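    A purely hypothetical illustration of that "internal rendering resolution" idea (these resolutions are invented for the example, not predictions): if shader throughput is the limiting factor, the pixel count a game can push scales roughly with available compute.

        full_hd = 1920 * 1080    # 2,073,600 pixels
        sub_hd  = 1600 * 900     # 1,440,000 pixels
        print(full_hd / sub_hd)  # 1.44 -- in the ballpark of a ~1.5x compute gap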

    All you have to decide is what to do with the time given to you....
    http://www.woodchuckproductions.net

  11. #184
    Ultimate Veteran
    Lefein's Avatar
    Join Date
    Jun 2005
    Age
    33
    Posts
    22,962
    Rep Power
    193
    Points
    106,264 (0 Banked)
    Achievements IT'S OVER 9000!
    I think it would be incredibly silly to think that Sony would put a product with an expected shelf life of 10 years out in the wild and NOT spend any time optimizing its parts or architecting it to go that distance, especially when this is the same company that designed the Emotion Engine and the Cell Broadband Engine.

    I can't help but giggle, with much malicious intent, at how people talk about Move Engines and ESRAM as if they're some mystical Wizard $#@! but completely blow off Sony when they talk about SPURS and "Supercharged PC Architecture." For some people, Sony is pissing in the wind, and that is not lost on me whatsoever. But then these same people sing the song that GDDR makes for bad system memory and warn us of "latency." Carry on, and don't let me stop you. It was obviously catastrophic to developers when GDDR3 was used as a shared pool of system RAM on the Xbox 360.



  12. #185
    Dedicated Member
    John Willaford's Avatar
    Join Date
    Feb 2013
    Location
    Owings Mills, MD
    Age
    39
    Posts
    1,062
    Rep Power
    18
    Points
    407,944 (0 Banked)
    Quote Originally Posted by mynd View Post
    This is complete crap:

    Considering the custom silicon was built and designed by MS, using their own visualization software to validate it before it was ever handed to anyone to fab.

    Now as for the GPU:

    High Priority Graphics (HP3D) ring and pipeline

    New for Liverpool
    Same as the GFX pipeline except no compute capabilities
    For exclusive use by VShell

    There is a ring without compute used exclusively by the VShell (according to VGLeaks, anyway).
    This takes priority over any other GPU tasks; it bumps them directly past the rest of the system.
    I sense people who can't stand their own medicine, lol.
    MS took their route and SONY took theirs, and SONY's gamble paid off in sheer horsepower, while MS can show you your own already-paid-for cable TV system with MORE advertising on their XBOX One, and if you are into fantasy sports, they will show you your standings live during the game. OH WHOOPIE!!!

    Microsoft isn't the world's best hardware designer. JUST LOOKING AT THE INCREDIBLY STUPID EXTERIOR DESIGN of the XBOne makes me laugh. It's not for ANYONE. It's designed like a receiver, which means MS wants you to hide it and protect it. Don't let it be around kids; I'll say that now and let you figure out what I mean if you lack that bit of common sense.

    Also, please give me a complete architectural breakdown of the ring bus that you just said is proposed, and the method by which it is beneficial for the VSHELL (is that the hypervisor, or what?) to skip past the rest of the system. I've been looking at leaked block diagrams which look reasonable, reading all of the officially released data, and summarizing it as best as I can given a reasonable amount of IT and programming experience. I look forward to your own personal analysis. Thanks!
    Last edited by John Willaford; 07-11-2013 at 17:38.

  13. #186
    Dedicated Member
    John Willaford's Avatar
    Join Date
    Feb 2013
    Location
    Owings Mills, MD
    Age
    39
    Posts
    1,062
    Rep Power
    18
    Points
    407,944 (0 Banked)
    Quote Originally Posted by Lefein View Post
    I think it would be incredibly silly to think that Sony would put a product with an expected shelf life of 10 years out in the wild and NOT spend any time optimizing its parts or architecting it to go that distance, especially when this is the same company that designed the Emotion Engine and the Cell Broadband Engine.

    I can't help but giggle, with much malicious intent, at how people talk about Move Engines and ESRAM as if they're some mystical Wizard $#@! but completely blow off Sony when they talk about SPURS and "Supercharged PC Architecture." For some people, Sony is pissing in the wind, and that is not lost on me whatsoever. But then these same people sing the song that GDDR makes for bad system memory and warn us of "latency." Carry on, and don't let me stop you. It was obviously catastrophic to developers when GDDR3 was used as a shared pool of system RAM on the Xbox 360.



    Exactly. MS did a good job with the first two Xboxes, and now they are hitting their turbulence.
    This machine was designed too much around trying to adapt hardware to a software model. NEVER do that when all parts of your ecosystem have bandwidth problems!!!!
    Their data move engines apparently have a lot to do with their GRAINS programming model. I don't want to get into it, but I also don't want to be explaining, a year from now, why the hiccups have become too hard to manage, and why stupid things like sending GRAINS out to the cloud to be processed result in delays in games and continuity issues. This isn't a high-speed LAN environment; it's an unpredictable WWAN.
    MS will then just tell people to buy faster monthly internet to support their model!!! Except, while I can do that, it's not available to a load of once-loyal XBOX owners. That's what I think will happen.
    Don't forget, THESE ASSHOLES HAVE A GUY IN CHARGE OF A GAMES STUDIO WHO JUST SAID THAT THINGS LIKE LIGHTING AND WATER EFFECTS CAN BE CALCULATED IN THE CLOUD... So, you're going to send those GRAINS (a kind of program object, apparently, which is agnostic as to whether it's processed in the cloud or on the local box) to the cloud... and my game is supposed to operate correctly and respond well? I DON'T THINK SO.

    When the XBOne tries to use the cloud, the PS4 will still be blazing through things locally.

    I sincerely hope that guy was given a crap piece of paper to recite rather than actually believing that.

    The ESRAM can only increase the speed of GPU-related computational effects. SONY completely bled MS's advantage dry when they went with GDDR5 system-wide! Ever since the x86-64 architecture debuted, there's been a VERY noticeable direct relationship between AMD processors and BANDWIDTH. The Jaguar cores hooked to 5500 MT/s memory are going to test that theory to the extreme!

    Things will be better once games are out. The first couple of months, some will be saying "I have Ryse and you don't" and "I have Killzone and you don't," etc., and comparing Forza vs. Drive Club's textures, and then we'll see how that $100-vs-camera argument pans out. Hopefully nobody spills anything on their XBOne, because it's gonna short out immediately with that open top.
    Last edited by John Willaford; 07-11-2013 at 17:46.

  14. #187
    Elite Guru
    TAZ427's Avatar
    Join Date
    Nov 2007
    Location
    Sugar Land, TX
    PSN ID
    TAZ427
    Age
    43
    Posts
    5,095
    Rep Power
    74
    Points
    21,115 (0 Banked)
    Quote Originally Posted by Lefein View Post
    I can't help but giggle, with much malicious intent, at how people talk about Move Engines and ESRAM as if they're some mystical Wizard $#@! but completely blow off Sony when they talk about SPURS and "Supercharged PC Architecture." For some people, Sony is pissing in the wind, and that is not lost on me whatsoever. But then these same people sing the song that GDDR makes for bad system memory and warn us of "latency." Carry on, and don't let me stop you. It was obviously catastrophic to developers when GDDR3 was used as a shared pool of system RAM on the Xbox 360.
    Agreed - the GDDR5 latency doesn't mean crap on the graphics side, and it wouldn't mean crap on the CPU side either if they have a good prefetcher and a large L2 cache - especially for a game console.
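    A toy average-memory-access-time calculation shows why. Every number here is an illustrative assumption, not a real PS4 spec:

        # Average access time: hits are served by the cache, misses go to DRAM.
        def avg_access_ns(hit_rate, cache_ns, dram_ns):
            return hit_rate * cache_ns + (1 - hit_rate) * dram_ns

        # Even if GDDR5 cost 20 ns more per DRAM access than DDR3 (assumed figures),
        # a 95% cache hit rate shrinks the average penalty to about 1 ns:
        print(avg_access_ns(0.95, 5.0, 50.0))  # 7.25 ns (DDR3-like latency)
        print(avg_access_ns(0.95, 5.0, 70.0))  # 8.25 ns (GDDR5-like latency)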

    Too bad we don't have specifications, we only have speculations - and too bad some people confuse the two.



  15. #188
    Extreme Poster
    mistercrow's Avatar
    Join Date
    Nov 2007
    Location
    Texas
    PSN ID
    mistercrow
    Posts
    25,535
    Rep Power
    165
    Points
    169,417 (0 Banked)
    Achievements IT'S OVER 9000!
    Quote Originally Posted by Lefein View Post
    I think it would be incredibly silly to think that Sony would put a product with an expected shelf life of 10 years out in the wild and NOT spend any time optimizing its parts or architecting it to go that distance, especially when this is the same company that designed the Emotion Engine and the Cell Broadband Engine. I can't help but giggle, with much malicious intent, at how people talk about Move Engines and ESRAM as if they're some mystical Wizard $#@! but completely blow off Sony when they talk about SPURS and "Supercharged PC Architecture." For some people, Sony is pissing in the wind, and that is not lost on me whatsoever. But then these same people sing the song that GDDR makes for bad system memory and warn us of "latency." Carry on, and don't let me stop you. It was obviously catastrophic to developers when GDDR3 was used as a shared pool of system RAM on the Xbox 360.
    I agree. The double standard of the Xbox crowd is amusing.

  16. #189
    Superior Member
    jphuff's Avatar
    Join Date
    Jun 2007
    Location
    Phoenix, Arizona
    PSN ID
    jhuff
    Posts
    854
    Rep Power
    59
    Points
    11,941 (0 Banked)
    Quote Originally Posted by XtraTrstrL View Post
    Well, saying that they are both semi-custom Jaguar APUs isn't saying that they are "identical". Keeping things simple, the PS4's raw GPU compute power is 50% greater than the Xbone's. As of the moment, with the PS4's programming ease of use and the Xbone's lack of a comparably mature API environment, it's looking like PS4 exclusives will at least start out looking a little more sparkly than Xbone exclusives. Any multiplatform games made by devs that are known for not taking port shortcuts, like DICE with Battlefield, should really let us see what the difference is right off the bat. Things are still early though, and there are a lot more advancements to be made with the APIs, especially creating more HSA/hUMA-compatible APIs that are designed to use the CPU and GPU in unison from the ground up. It can take a year or two before the more capable dev teams really start digging their heels in and pushing the hardware to the fullest, but when they do, oh boy, it's gonna get crazy.

    I'm clearly a Sony fan, but I'm not gonna try and predict the future, although I do feel the raw numbers will speak for themselves in the end. I also have complete faith in Mark Cerny; he's a godsend. You can bet Microsoft's design process was a lot more cold and calculating, just like all their anti-consumer policies that got shot down recently. Sony allowing a prodigy like Cerny to be the architect of a system designed for developers and gamers alike is nothing short of amazing. I mean, he even had id Software's John Carmack praising his design choices:

    "I can’t speak freely about PS4, but now that some specs have been made public, I can say that Sony made wise engineering choices." ~ John Carmack

    That is saying a heck of a lot. I mean, Cerny is only the 13th inductee into The Academy of Interactive Arts & Sciences (AIAS) Hall of Fame:

    Cerny will join an elite group of 12 other interactive entertainment industry luminaries in the AIAS Hall of Fame: Trip Hawkins (Electronic Arts), Peter Molyneux (Lionhead Studios), Yu Suzuki (Sega), Will Wright (Maxis), John Carmack (id Software), Hironobu Sakaguchi (Square Enix), Sid Meier (Firaxis Games), Shigeru Miyamoto (Nintendo), Richard Garriott (Origin Systems), Dan/Danielle Bunten Berry (Ozark Softscape), Michael Morhaime (Blizzard Entertainment) and Bruce Shelley (Ensemble Studios).

    Anything can happen throughout this next-gen console war, but I'm confident that I'll be making the right choice by sticking with Sony's PS4.

    I should have put this in my initial response, but just to be clear: I'm aware you weren't saying the processors were identical, etc. I was just quoting your material and using it as a response to the many, many times and places I've seen people say that the two systems' CPUs and GPUs are identical or "virtually identical" when they are clearly not. No worries!

    All you have to decide is what to do with the time given to you....
    http://www.woodchuckproductions.net

  17. #190
    Dedicated Member
    John Willaford's Avatar
    Join Date
    Feb 2013
    Location
    Owings Mills, MD
    Age
    39
    Posts
    1,062
    Rep Power
    18
    Points
    407,944 (0 Banked)
    Remember, it appears SONY's engineers were on the job and did a lot to validate the Jaguar cores and hUMA working with the GDDR5 memory, along with the additional bus work that was done. We have hUMA and the Jaguar cores on GDDR5 bandwidth now; just one more tick in their favor.

  18. #191
    Ultimate Veteran
    Lefein's Avatar
    Join Date
    Jun 2005
    Age
    33
    Posts
    22,962
    Rep Power
    193
    Points
    106,264 (0 Banked)
    Achievements IT'S OVER 9000!
    Quote Originally Posted by TAZ427 View Post
    Agreed - the GDDR5 latency doesn't mean crap on the graphics side, and it wouldn't mean crap on the CPU side either if they have a good prefetcher and a large L2 cache - especially for a game console.

    Too bad we don't have specifications, we only have speculations - and too bad some people confuse the two.
    And that is the real X factor with the PS4, and no one is even asking Sony about it, which seems downright bizarre to me when we get inundated with articles about the XBone despite the fact that MS themselves haven't mentioned so much as a clock speed to the press. How big are the L1, L2 and L3 caches on the CPU? Was SPURS implemented to allow the GPU to read from the chip cache? Can the CPU read/write from the GPU? If that genie gets out of the lamp, there won't just be an architectural advantage for the PS4 anymore. It will be nothing short of a blood bath.

  19. #192
    Extreme Poster
    mistercrow's Avatar
    Join Date
    Nov 2007
    Location
    Texas
    PSN ID
    mistercrow
    Posts
    25,535
    Rep Power
    165
    Points
    169,417 (0 Banked)
    Achievements IT'S OVER 9000!
    http://societyandreligion.com/ps4-pc-steam-box-ram/ - an interesting article in layman's terms.

  20. #193
    PSU Technical Advisor
    Vulgotha's Avatar
    Join Date
    Jan 2007
    Age
    23
    Posts
    15,950
    Rep Power
    143
    Points
    106,557 (0 Banked)
    Achievements IT'S OVER 9000!
    Quote Originally Posted by Lefein View Post
    And that is the real X factor with the PS4, and no one is even asking Sony about it, which seems downright bizarre to me when we get inundated with articles about the XBone despite the fact that MS themselves haven't mentioned so much as a clock speed to the press. How big are the L1, L2 and L3 caches on the CPU? Was SPURS implemented to allow the GPU to read from the chip cache? Can the CPU read/write from the GPU? If that genie gets out of the lamp, there won't just be an architectural advantage for the PS4 anymore. It will be nothing short of a blood bath.
    ?


  21. #194
    Elite Sage
    Two4DaMoney's Avatar
    Join Date
    Jun 2007
    Age
    27
    Posts
    12,397
    Rep Power
    110
    Points
    14,585 (75,576 Banked)
    Items Naughty DogPS3 SlimNaughty DogUser name style
    Achievements IT'S OVER 9000!
    Pulled from one of my older threads.

    "The Biggest Thing" About the PlayStation 4


    What Does 'Supercharged' Mean, Anyway?


    Familiar Architecture, Future-Proofed

    Enabling the Vision: How Sony Modified the Hardware


    Launch and Beyond

    Freeing Up Resources: The PS4's Dedicated Units [game installs; load times]


    Sounds Good, But... Bottlenecks?

    http://www.gamasutra.com/view/featur...with_mark_.php

    Destiny and Middle-earth: Shadow of Mordor are all I need for the rest of the year.

  22. Likes Brandon likes this post
  23. #195
    Ultimate Veteran
    Lefein's Avatar
    Join Date
    Jun 2005
    Age
    33
    Posts
    22,962
    Rep Power
    193
    Points
    106,264 (0 Banked)
    Achievements IT'S OVER 9000!
    Quote Originally Posted by Vulgotha View Post
    ?
    SPURS. It is on the PS4, and it IS no small part of the Secret Sauce in the PS4's melty cheeseburger of gaming-performance love.

  24. #196
    Apprentice
    XtraTrstrL's Avatar
    Join Date
    Jun 2013
    PSN ID
    XtraTrstrL
    Posts
    339
    Rep Power
    15
    Points
    4,813 (0 Banked)
    Quote Originally Posted by John Willaford View Post
    Remember, it appears SONY's engineers were on the job and did alot to validate the Jaguar and hUMA to work with the GDDR5 memory and the additional bus work that was done. We have hUMA and the Jaguar Cores on GDDR5 bandwidth now, just one more tick in their favor.
    Yes, and they've gone into detail about why, thanks to HSA, the GDDR5 latency won't be a problem.

    "Worried about GDDR5 latencies? Don't be. Large cache, very capable bus, shortcuts and clever data transferring between CPU and GPU make that a non-issue."

    "The PS4 uses a state-of-the-art heterogenous processor architecture from AMD (the so called "HSA") which combines CPU and GPU in one single chip. To ensure that such a heterogeneous processor can deliver maximum bandwidth for rendering and minimum latency for computing, AMD integrates a special DRAM controller. This DRAM controller allows the CPU memory controller to have low latency access while at the same time the GPU memory controller can burst access the RAM. That's why Sony can go for maximum bandwidth with one big GDDR5 RAM pool without having any headaches because of latency."

    So, the only benefit the Xbone had from the slower DDR3 is pretty much factored out by the PS4's architecture.
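    Here's a toy Python sketch of the mechanism that quote describes: one controller serving two access patterns, with latency-sensitive CPU reads jumping the queue and bandwidth-hungry GPU requests drained in long bursts. This is purely illustrative; the actual (undisclosed) controller design is certainly far more involved:

        from collections import deque

        cpu_q, gpu_q = deque(), deque()

        def schedule_next():
            # CPU requests go first, so they see minimal queueing latency...
            if cpu_q:
                return f"CPU read {cpu_q.popleft()}"
            # ...while GPU requests are drained in bursts to keep the bus saturated.
            burst = [gpu_q.popleft() for _ in range(min(8, len(gpu_q)))]
            return f"GPU burst {burst}"

        cpu_q.extend(["a", "b"]); gpu_q.extend(range(16))
        for _ in range(4):
            print(schedule_next())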

  25. Likes Peregrin8X, Brandon, MonkeyClaw, Lefein likes this post
  26. #197
    Ultimate Veteran
    Lefein's Avatar
    Join Date
    Jun 2005
    Age
    33
    Posts
    22,962
    Rep Power
    193
    Points
    106,264 (0 Banked)
    Achievements IT'S OVER 9000!
    Ding!

  27. Likes Peregrin8X likes this post
  28. #198
    Veteran
    MonkeyClaw's Avatar
    Join Date
    Oct 2006
    Location
    Fort Worth, TX
    PSN ID
    Tha_MonkeyClaw
    Age
    39
    Posts
    4,807
    Rep Power
    94
    Points
    146,751 (0 Banked)
    Items Protect yourself
    Quote Originally Posted by XtraTrstrL View Post
    Yes, and they've gone into detail about why, thanks to HSA, the GDDR5 latency won't be a problem.

    "Worried about GDDR5 latencies? Don't be. Large cache, very capable bus, shortcuts and clever data transferring between CPU and GPU make that a non-issue."

    "The PS4 uses a state-of-the-art heterogenous processor architecture from AMD (the so called "HSA") which combines CPU and GPU in one single chip. To ensure that such a heterogeneous processor can deliver maximum bandwidth for rendering and minimum latency for computing, AMD integrates a special DRAM controller. This DRAM controller allows the CPU memory controller to have low latency access while at the same time the GPU memory controller can burst access the RAM. That's why Sony can go for maximum bandwidth with one big GDDR5 RAM pool without having any headaches because of latency."

    So, the only benefit the Xbone had from the slower DDR3 is pretty much factored out by the PS4's architecture.
    You are a wizard at explaining the inner workings of the systems! Love your posts!

    -=[ PSN ID: Tha_MonkeyClaw ]=-

  29. #199
    Ultimate Veteran
    Lefein's Avatar
    Join Date
    Jun 2005
    Age
    33
    Posts
    22,962
    Rep Power
    193
    Points
    106,264 (0 Banked)
    Achievements IT'S OVER 9000!
    On top of that, GDDR5, architecturally, does not suffer from overly latent memory throughput anyway. The PS4 just happens to minimize that effect to a greater degree, due to what Xtra noted.

    GDDR5 operates with two different clock types. A differential command clock (CK) as a reference for address and command inputs, and a forwarded differential write clock (WCK) as a reference for data reads and writes. Being more precise, the GDDR5 SGRAM uses two write clocks, each of them assigned to two bytes. The WCK runs at twice the CK frequency. Taking a GDDR5 with 5 Gbit/s data rate per pin as an example, the CK clock runs with 1.25 GHz and WCK with 2.5 GHz. The CK and WCK clocks will be aligned during the initialization and training sequence. This alignment allows read and write access with minimum latency. Most modern graphics cards now use this technology, including the new PlayStation 4.
    http://en.wikipedia.org/wiki/GDDR5
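    The clock relationship in that quote, worked through as a quick sketch (the 5.5 Gb/s per-pin rate is the figure reported for the PS4's memory; the formula follows directly from the quote: data toggles on both edges of WCK, and WCK runs at twice CK):

        def gddr5_clocks(data_rate_gbps_per_pin):
            wck_ghz = data_rate_gbps_per_pin / 2  # double data rate on WCK
            ck_ghz = wck_ghz / 2                  # command clock at half of WCK
            return ck_ghz, wck_ghz

        print(gddr5_clocks(5.0))  # (1.25, 2.5)   -- the example in the quote
        print(gddr5_clocks(5.5))  # (1.375, 2.75) -- the PS4's reported parts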

    The PS4 is a monster. It is architected with pretty well-performing parts. It has been enhanced by Sony to be highly relevant for game performance, and what independent processors there are on board exist to off-load a whole lot of the not-so-directly game-performance-related stuff to make room for it (video capture, background downloading, Vita mobile integration, etc.).

  30. #200
    Administrator
    Brandon's Avatar
    Join Date
    Nov 2004
    Age
    30
    Posts
    12,230
    Rep Power
    126
    Points
    119,629 (18,076 Banked)
    Items Ghost in the ShellTidusLightningBruce LeeAppleUser name style
    Achievements IT'S OVER 9000!
    I'm glad to see this thread has been flooded with great information, and members offering to explain the more confusing bits of all this techno mumbo jumbo. Some great reading.

    Seems to me that, to summarize, Sony and AMD have definitely modified the PS4 to make it very graphics-capable, while at the same time future-proofing it for any new features it may take on down the road.

    Microsoft and AMD have modified the Xbone not so much for graphics, but for all the various features the Xbone will support... such as the three OSes, multitasking, Kinect 2.0, DVR functionality, etc.

    So both have been tweaked for their specific purposes.

    There's no point in arguing which is more graphics savvy since after reading all this it's quite clear the PS4 is the winner. But the Xbone serves a clearly different purpose.

    Both are going to churn out great games, but graphically the PS4 has the edge... as well as in the SDKs... to make devs very, very happy... and they've made a point of telling us so.

    Unlike this current gen, where the CELL architecture was a pain in the ass to develop for... even with SPURS... the PS4 will just be raw power that's simple to program for... which will also cut back on development time considerably.
    Last edited by Brandon; 07-11-2013 at 20:27.
    "The biggest adversary in our life is ourselves. We are what we are, in a sense, because of the dominating thoughts we allow to gather in our head. All concepts of self-improvement, all actions and paths we take, relate solely to our abstract image of ourselves. Life is limited only by how we really see ourselves and feel about our being. A great deal of pure self-knowledge and inner understanding allows us to lay an all-important foundation for the structure of our life from which we can perceive and take the right avenues.”

  31. Likes Lefein likes this post
