Results 101 to 115 of 115
  1. #101
    Soldier 95B
    Guest
    I like what Sony did with PlayTV in the UK. I hope they include that in the PS4 for the states as well.

  2. #102
    Dedicated Member

    Join Date
    Dec 2010
    Posts
    1,485
    Rep Power
    40
    Points
    19,654 (0 Banked)
    Quote Originally Posted by TGO View Post
That's what I was referring to, i.e. the 360 has GDDR3 RAM but the 720 has DDR3.
I was just correcting Average because he kept mentioning GDDR3

    Sent via Codec
    There is a reason I keep saying GDDR3 and not DDR3. The rumors are that AMD/ATI is making both CPU and GPU, right?


    Care to guess who invented GDDR memory?


Why would AMD pay to license DDR memory from someone else when GDDR is their own product?
    Last edited by Completely Average; 01-22-2013 at 22:22.

3. Likes: MonkeyClaw
  4. #103
    Veteran
    MonkeyClaw's Avatar
    Join Date
    Oct 2006
    Location
    Fort Worth, TX
    PSN ID
    Tha_MonkeyClaw
    Age
    39
    Posts
    4,843
    Rep Power
    94
    Points
    148,915 (0 Banked)
    Items Protect yourself
    Quote Originally Posted by Completely Average View Post
    There is a reason I keep saying GDDR3 and not DDR3.

    The rumors are that AMD is making both CPU and GPU, right?



    Care to guess who invented GDDR memory?


    Why would AMD pay to license DDR memory from someone else when GDDR is their own product?
    Good point!

    -=[ PSN ID: Tha_MonkeyClaw ]=-

  5. #104
    Staff Writer & Graphic Designer
    mondofish's Avatar
    Join Date
    Dec 2009
    Location
    Buxton, Maine
    Age
    26
    Posts
    493
    Rep Power
    35
    Points
    7,110 (0 Banked)
    From what I have seen in a lot of the "next-gen" tech demos, the vast graphical improvements lie not in the textures and environments but in the quantity and quality of effects and particle rendering (i.e. bloom, lighting, all that jazz). This rumored architecture sounds like the right system for the job. I think the next PlayStation is going to be just fine, and you'd be kidding yourself if you think you know better than Sony in that regard.

6. Likes: mynd, Itachi
  7. #105
    Forum Sage
    MATRIX 2's Avatar
    Join Date
    Jul 2005
    Location
    D.C.
    Posts
    7,984
    Rep Power
    112
    Points
    43,240 (0 Banked)
    Quote Originally Posted by Vulgotha View Post
    There are other considerations at play. For one DDR3 is dirt cheap at the moment, for another I think it is becoming clear (totally conjecture!) that they are focusing on more than just gaming with this box. Maybe it will have DVR features or some other special stuff.. If this is the case the CPU is going to need reasonable access to memory in order to adequately perform these functions.

    To make up for the lackluster bandwidth they've included special custom silicon to aid the GPU.

    I'm not entirely sure I like this direction, but I haven't seen any games yet (or anything for Orbis) so I cannot pass judgment. Until I get something more concrete, though, my opinion here is that MS didn't want to go balls out for performance like they did with the 360 and Xbox with this new device. The Wii made an impression and they'd rather aim for the casual market out of the box, make a profit immediately (or very soon after launch), and target the living room as an entertainment center.

    So far I'm not impressed with what I'm seeing. From either side..
    Ehh I'm doubtful.

    And I fail to see why they can't achieve those things with a unified GDDR5 memory setup.

    Also GDDR5 has been in use since 2009, so it should be fairly cheap as well. MS doesn't have any major costs to deal with outside of the CPU/GPU setup. Blu-ray drives should be as cheap as DVD drives were for the 360. Everything else is pretty straightforward, so I don't see why they can't put in decent internals and keep the next Xbox at ~$400 (even with an integrated Kinect 2 sensor bundled with every unit).

    This will be the longest gap between successive gaming consoles so MS has to come up with something decent.

  8. #106
    PSU Technical Advisor
    Vulgotha's Avatar
    Join Date
    Jan 2007
    Age
    24
    Posts
    15,953
    Rep Power
    144
    Points
    107,782 (0 Banked)
    Achievements IT'S OVER 9000!
    Because GDDR5 is a lot more expensive than DDR3. They would probably spend as much on 4GB of GDDR5 as they would on 8GB of DDR3. For their purposes they don't want GDDR5.

    This could be why Sony 'only' has 4GB of GDDR5 vs MS's rumored 8GB of DDR3. Expense.

    However, long term it may be Sony who saves the money. If they're opting for 4GB of GDDR5 on a 128-bit bus, then the second GDDR5 drops in price they start saving money. It's unlikely that the cost of wide memory buses is going to go down anywhere near as much over the years.

    So if MS has 8GB of DDR3 using a 256-bit bus while DDR3 is at rock bottom like right now, the savings won't really translate as much long term. They're still going to need the bandwidth provided by that bus width.

    @Everyone else

    It doesn't matter who 'invented' GDDR, what matters is the specifications and demands MS has of their engineers and the guys at AMD.
    Last edited by Vulgotha; 01-23-2013 at 01:27.


  9. #107
    Elite Guru
    J3ff3's Avatar
    Join Date
    Dec 2006
    Age
    30
    Posts
    5,230
    Rep Power
    80
    Points
    7,726 (0 Banked)
    Items User name style
    Quote Originally Posted by Vulgotha View Post
    There are other considerations at play. For one DDR3 is dirt cheap at the moment, for another I think it is becoming clear (totally conjecture!) that they are focusing on more than just gaming with this box. Maybe it will have DVR features or some other special stuff.. If this is the case the CPU is going to need reasonable access to memory in order to adequately perform these functions.

    To make up for the lackluster bandwidth they've included special custom silicon to aid the GPU.

    I'm not entirely sure I like this direction, but I haven't seen any games yet (or anything for Orbis) so I cannot pass judgment. Until I get something more concrete, though, my opinion here is that MS didn't want to go balls out for performance like they did with the 360 and Xbox with this new device. The Wii made an impression and they'd rather aim for the casual market out of the box, make a profit immediately (or very soon after launch), and target the living room as an entertainment center.

    So far I'm not impressed with what I'm seeing. From either side..
    i'm not either. but lackluster bandwidth? rumours so far suggest that, with esram, the durango has 170GB/s and the ps4 has 192. both higher than the 7970's 155 (or so i believe)
    Got YLOD? In the UK? I'll buy it off you.

  10. #108
    Forum Sage
    Itachi's Avatar
    Join Date
    Nov 2010
    Location
    Winterfell
    PSN ID
    iwinulose042
    Age
    20
    Posts
    8,322
    Rep Power
    83
    Points
    30,849 (151,503 Banked)
    Items Final Fantasy XIII-2, Final Fantasy XIII, Full Metal Alchemist, Dragon Ball Z, Naruto, Death Note, Naughty Dog, Lightning, Noctis, Assassins Creed Ezio, PS3 Slim
    Quote Originally Posted by J3ff3 View Post
    i'm not either. but lackluster bandwidth? rumours so far suggest that, with esram, the durango has 170GB/s and the ps4 has 192. both higher than the 7970's 155 (or so i believe)
    But the ESRAM is only 32MB. As far as I know you feed the ESRAM from the main RAM and the ESRAM then sends it to the GPU, right? So that means that at one time only 32MB of data is going to the GPU? That is a considerable limitation, is it not? I read that EDRAM is one of the reasons why games like Killzone aren't possible on the 360.

    I have no idea how embedded RAM works, or 3d rendering for that matter

  11. #109
    Supreme Veteran
    mynd's Avatar
    Join Date
    May 2006
    Location
    Down Under
    Age
    41
    Posts
    17,557
    Rep Power
    162
    Points
    161,197 (0 Banked)
    Items User name style
    Achievements IT'S OVER 9000!
    Quote Originally Posted by itachi73378 View Post
    But the ESRAM is only 32MB. As far as I know you feed the ESRAM from the main RAM and the ESRAM then sends it to the GPU, right? So that means that at one time only 32MB of data is going to the GPU? That is a considerable limitation, is it not? I read that EDRAM is one of the reasons why games like Killzone aren't possible on the 360.

    I have no idea how embedded RAM works, or 3d rendering for that matter
    No, it's not an input cache at all.

    Whenever you create graphics, you have to "write" the results back to the frame buffer (normally multiple buffers); these include the z-buffer, the actual frame buffer, and in the case of deferred rendering, multiple frame buffers holding various things.

    So normally you go...

    READ from memory: textures, model data
    OPERATE: in the GPU with your shader code.
    WRITE to memory: your various frame buffers.

    So you're constantly doing read/write, read/write on your memory when creating a single frame.

    Don't forget CAS/RAS delays, and you can be waiting 3 clocks before being able to write.


    They throw ESRAM into the mix as a frame buffer accumulator.

    READ from memory: textures, model data
    OPERATE: in the GPU with your shader code.
    WRITE to ESRAM: your various frame buffers.

    End result: your VRAM is just read, read, read, read until the image is complete, then the frame buffers are written back to VRAM and displayed.

    Think of the ESRAM as a giant GPU output cache.

    Normally you "build" your image in the VRAM, the same place you get all your raw data from.

    In the case of ESRAM, you build your image in a completely separate place.
    Last edited by mynd; 01-23-2013 at 02:25.
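    [Editor's note: the read/operate/write flow described above can be sketched as a toy traffic model. All the numbers here are made up for illustration; they are not actual Durango figures.]

    ```python
    # Toy model of main-RAM traffic per frame, with and without an on-chip
    # frame-buffer scratchpad (ESRAM). All figures are illustrative only.
    def main_ram_traffic_gb(asset_reads_gb, framebuffer_gb, passes, use_esram):
        """Bytes moved over the main memory bus to build one frame.

        Without ESRAM, every render pass writes its frame buffers back to
        main RAM and later passes read them again (read/write, read/write).
        With ESRAM, intermediate buffers stay on-chip and only the final
        image is resolved back to main RAM.
        """
        if use_esram:
            return asset_reads_gb + framebuffer_gb           # reads + one final resolve
        return asset_reads_gb + passes * 2 * framebuffer_gb  # write + re-read each pass

    without_es = main_ram_traffic_gb(0.50, 0.03, passes=4, use_esram=False)  # 0.74 GB
    with_es    = main_ram_traffic_gb(0.50, 0.03, passes=4, use_esram=True)   # 0.53 GB
    ```

    The more passes a renderer makes over its frame buffers (deferred shading, post-processing), the more the scratchpad pays off.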

12. Likes: Itachi
  13. #110
    Elite Guru
    J3ff3's Avatar
    Join Date
    Dec 2006
    Age
    30
    Posts
    5,230
    Rep Power
    80
    Points
    7,726 (0 Banked)
    Items User name style
    Quote Originally Posted by itachi73378 View Post
    But the ESRAM is only 32MB. As far as I know you feed the ESRAM from the main RAM and the ESRAM then sends it to the GPU, right? So that means that at one time only 32MB of data is going to the GPU? That is a considerable limitation, is it not? I read that EDRAM is one of the reasons why games like Killzone aren't possible on the 360.

    I have no idea how embedded RAM works, or 3d rendering for that matter
    i'm no expert, because on the face of it it would appear that 32MB would achieve nothing. but i would assume it's not a case of having to empty the memory before refilling it, so the bandwidth (at 102GB/s) is most important, ie it's a constant flow that the gpu can access. this would tie in with the fact that 10MB of edram should've been useless on the 360 (what's 10MB going to achieve?), and the fact that more knowledgeable people than i say that the bandwidth is cumulative, ie esram bandwidth + ddr3 bandwidth.

    more importantly, and if i remember rightly, the 360's edram was implemented at a late stage and therefore had read/write access problems. it was tacked on, so to speak. hopefully this esram isn't, but maybe someone with more knowledge can chime in....
    Got YLOD? In the UK? I'll buy it off you.
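    [Editor's note: the "cumulative" figure does work out under the rumored numbers (DDR3-2133 on a 256-bit bus, plus 102.4GB/s of ESRAM; both are rumors from this thread, not confirmed specs):]

    ```python
    def ddr_bandwidth_gbps(mega_transfers_per_sec, bus_bits):
        # effective transfer rate (MT/s) x bus width in bytes -> GB/s
        return mega_transfers_per_sec * (bus_bits // 8) / 1000

    ddr3  = ddr_bandwidth_gbps(2133, 256)  # ~68.3 GB/s
    esram = 102.4                          # rumored ESRAM bandwidth
    total = ddr3 + esram                   # ~170.7 GB/s -- the "170" quoted above
    ```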

  14. #111
    PSU Technical Advisor
    Vulgotha's Avatar
    Join Date
    Jan 2007
    Age
    24
    Posts
    15,953
    Rep Power
    144
    Points
    107,782 (0 Banked)
    Achievements IT'S OVER 9000!
    Quote Originally Posted by J3ff3 View Post
    i'm not either. but lackluster bandwidth? rumours so far suggest that, with esram, the durango has 170gb/s and the ps3 has 192. both higher than the 7970's 155 (or so i believe)
    But it's still just 32MB of ESRAM. That's not a legitimate full-scale replacement, and it's still a lower number than the GDDR5 figures we have.


  15. #112
    Supreme Veteran
    mynd's Avatar
    Join Date
    May 2006
    Location
    Down Under
    Age
    41
    Posts
    17,557
    Rep Power
    162
    Points
    161,197 (0 Banked)
    Items User name style
    Achievements IT'S OVER 9000!
    Quote Originally Posted by J3ff3 View Post
    i'm no expert, because on the face of it it would appear that 32MB would achieve nothing. but i would assume it's not a case of having to empty the memory before refilling it, so the bandwidth (at 102GB/s) is most important, ie it's a constant flow that the gpu can access. this would tie in with the fact that 10MB of edram should've been useless on the 360 (what's 10MB going to achieve?), and the fact that more knowledgeable people than i say that the bandwidth is cumulative, ie esram bandwidth + ddr3 bandwidth.

    more importantly, and if i remember rightly, the 360's edram was implemented at a late stage and therefore had read/write access problems. it was tacked on, so to speak. hopefully this esram isn't, but maybe someone with more knowledge can chime in....
    The problem was they didn't have enough of it: 10MB wasn't enough to hold a 720p image with its associated frame buffers.

    The end result is you actually have to "tile" the 360 frame, so the GPU does a lot more work than normal to get it out.

    It's a testament to how fast it was that it still did this easily.

    The 360 could have doubled its framerate if they had put in about 4 more MB of EDRAM.

    Quote Originally Posted by Vulgotha View Post
    But it's still just 32MB of ESRAM. That's not a legitimate full-scale replacement, and it's still a lower number than the GDDR5 figures we have.
    It can be, if it's being used in the one place you actually need that sort of bandwidth.
    Last edited by mynd; 01-23-2013 at 02:46.
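    [Editor's note: the 10MB arithmetic is easy to check, assuming a 32-bit color buffer plus a 32-bit depth/stencil buffer per sample, which is a typical layout, not a confirmed 360 spec:]

    ```python
    def framebuffer_mib(width, height, bytes_color=4, bytes_depth=4, msaa=1):
        # color + depth/stencil per sample; MSAA multiplies the sample count
        return width * height * (bytes_color + bytes_depth) * msaa / 2**20

    no_aa   = framebuffer_mib(1280, 720)          # ~7.0 MiB: fits in 10MB of EDRAM
    msaa_2x = framebuffer_mib(1280, 720, msaa=2)  # ~14.1 MiB: doesn't fit, so tile
    ```

    Which matches the point above: a few extra MB of EDRAM would have let a 720p frame with AA fit without tiling.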

  16. #113
    Forum Sage
    MATRIX 2's Avatar
    Join Date
    Jul 2005
    Location
    D.C.
    Posts
    7,984
    Rep Power
    112
    Points
    43,240 (0 Banked)
    Quote Originally Posted by Vulgotha View Post
    Because GDDR5 is a lot more expensive than DDR3. They would probably spend as much on 4GB of GDDR5 as they would on 8GB of DDR3. For their purposes they don't want GDDR5.

    This could be why Sony 'only' has 4GB of GDDR5 vs MS's rumored 8GB of DDR3. Expense.

    However, long term it may be Sony who saves the money. If they're opting for 4GB of GDDR5 on a 128-bit bus, then the second GDDR5 drops in price they start saving money. It's unlikely that the cost of wide memory buses is going to go down anywhere near as much over the years.

    So if MS has 8GB of DDR3 using a 256-bit bus while DDR3 is at rock bottom like right now, the savings won't really translate as much long term. They're still going to need the bandwidth provided by that bus width.

    @Everyone else

    It doesn't matter who 'invented' GDDR, what matters is the specifications and demands MS has of their engineers and the guys at AMD.
    Last I checked, MS isn't against losing money on the hardware initially if they can reduce manufacturing costs fairly quickly.

    (Remember Epic pushing MS to put 512MB of RAM in the 360 instead of 256MB and a standard HDD; that decision cost MS a billion dollars.)

    Also, based on their shortsightedness regarding the deal with Nvidia for the original Xbox GPU, I think they would be more aware of the pitfalls of going with DDR3 instead of GDDR5 (regarding cost).

  17. #114
    Dedicated Member
    Centurion's Avatar
    Join Date
    May 2008
    Posts
    1,021
    Rep Power
    50
    Points
    2,837 (0 Banked)
    Quote Originally Posted by MATRIX 2 View Post
    I find that hard to believe since the fastest DDR3 memory (DDR3-3000) tops out at 24GB/s.

    And seeing as DDR4 will be coming to consumer platforms in 2014, it would be rather shortsighted of MS not to use GDDR5.
    DDR3-2133 with a 256-bit bus would give 68.2GB/sec.
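    [Editor's note: that figure checks out:]

    ```python
    # DDR3-2133 on a 256-bit bus: 2133 MT/s x 32 bytes per transfer
    mb_per_s = 2133 * (256 // 8)   # 68,256 MB/s, i.e. ~68.2 GB/s
    ```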

  18. #115
    Dedicated Member

    Join Date
    Dec 2010
    Posts
    1,485
    Rep Power
    40
    Points
    19,654 (0 Banked)
    Quote Originally Posted by Vulgotha View Post
    Because GDDR5 is alot more expensive than DDR3.
    Who told you that? They are actually really close to the same price. You can get PC video cards with a GB of GDDR5 for less than $100 retail now. Strip away the PCB, GPU, and cooling system and you're probably only looking at $15 or so for the RAM.

    And like I said before, EVERY GPU on the market today is designed for GDDR5. NOTHING runs on DDR4. It would be a massive expense to redesign a GPU and memory controller to work with DDR3. Seems kind of silly to spend that sort of money for RAM that offers a fraction of the bandwidth of the standard GDDR5.



    Like I said before, I'll bet money the next Xbox uses GDDR5 RAM. I'll also bet money that the final console only ships with 4GB, and not the rumored 8GB. I think you'll find that the hardware difference between the next Xbox and the PS4 will be even less than between the current 360 and PS3. From a developer point of view it will be fairly irrelevant, and the same applies to end users as well.

    And for the record, it's virtually impossible to do native 1080p with DDR3. The bandwidth required just isn't there, and I can assure you the next Xbox will be doing native 1080p with anti-aliasing, and so will the PS4.




    Quote Originally Posted by Centurion View Post
    DDR3-2133 with 256-bit bus would give 68.2GB/sec..
    Which is less than half the bandwidth of GDDR5 with no cost savings. You're looking at about $75 for the RAM alone in that configuration if you're buying it from a good supplier. You can get GDDR5 for about $15 per GB.

    I just don't see it happening. I especially don't see AMD reworking their GPU Memory Controller to use RAM that has half the bandwidth of their cheapest retail GPU currently in production.
    Last edited by Completely Average; 01-23-2013 at 05:46.
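    [Editor's note: for comparison, the two rumored setups side by side. The 6000MT/s effective rate for the GDDR5 is an assumption chosen to match the 192GB/s figure quoted earlier in the thread; neither configuration is confirmed.]

    ```python
    def bandwidth_gbps(mega_transfers_per_sec, bus_bits):
        # effective transfer rate (MT/s) x bus width in bytes -> GB/s
        return mega_transfers_per_sec * bus_bits / 8 / 1000

    ddr3  = bandwidth_gbps(2133, 256)  # ~68.3 GB/s (rumored next-Xbox main RAM)
    gddr5 = bandwidth_gbps(6000, 256)  # 192 GB/s (rumored PS4)
    ratio = gddr5 / ddr3               # the GDDR5 setup has ~2.8x the raw bandwidth
    ```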
