Guru3D.com Forums

Unreal Engine 4 'Infiltrator' Demo
Old
  (#1)
Hilbert Hagedoorn
Don Vito Corleone
 
Hilbert Hagedoorn's Avatar
 
Videocard: AMD | NVIDIA
Processor: Core i7 4770K
Mainboard: Z77
Memory: 8GB
Soundcard: X-Fi - GigaWorks 7.1
PSU: 1200 Watt
Default Unreal Engine 4 'Infiltrator' Demo - 03-30-2013, 09:43 | posts: 21,527 | Location: Guru3D testlab

Epic showed off its Unreal Engine 4 today at the Game Developers Conference. ...

Unreal Engine 4 'Infiltrator' Demo
   
Reply With Quote
 
Old
  (#2)
Chillin
Ancient Guru
 
Chillin's Avatar
 
Videocard: Gigabyte GTX 560 930/2300
Processor: i5-2500K@4.4GHz 1.2v H60
Mainboard: Asrock Z77 Pro4
Memory: G.Skill 2X4GB DDR3-1600
Soundcard: X-Fi XtremeGamer+Z506 5.1
PSU: Corsair TX 750w v2
Default 03-30-2013, 09:48 | posts: 6,440 | Location: Chilling

Nice tech demo.

And here is how it will look after being downsized for cross-platform launches:

   
Reply With Quote
Old
  (#3)
spex_2
Newbie
 
Videocard: GW Phantom 660 TI
Processor: AMD FX-8350 - 4,6 Ghz
Mainboard: Asus Sabertooth 990 FX r0
Memory: GeIL EVO Kit 32GB 1866
Soundcard: Saffire Pro 40
PSU: Antec HCG 750
Default 03-30-2013, 14:08 | posts: 23 | Location: Europe

Chillin, that's the same thing I thought.

Do we know what hardware this demo was running on?

Anyway, the visual gap between now (BF3/Crysis 3) and the shown "next gen" (PS4 tech demos / Unreal 4 tech demo) is absolutely massive. It's hard to believe this will reach us at home in about 12 months.

What I believe is that they'll get a huge visual boost on the new console hardware, with tweaks and tricks to come close to what was shown. On the PC we'll see lousy ports because of the lack of will, time and money to do it properly. So they sell more console hardware, the PC is left behind again, and all the next-gen graphics bloom will just be a smoke bomb from the marketing machinery.
   
Reply With Quote
Old
  (#4)
Denial
Ancient Guru
 
Denial's Avatar
 
Videocard: EVGA GTX 980
Processor: i7-3770K
Mainboard: ASUS Maximus 5 Formula
Memory: 16GB Corsair DDR3 2133
Soundcard: ZxR & HD800 Lyr/PC350SE
PSU: Seasonic 1000w
Default 03-30-2013, 14:57 | posts: 7,240 | Location: Above Earth in a Big Rocket Ship

Quote:
Originally Posted by spex_2 View Post
Chillin, that's the same thing I thought.

Do we know what hardware this demo was running on?

Anyway, the visual gap between now (BF3/Crysis 3) and the shown "next gen" (PS4 tech demos / Unreal 4 tech demo) is absolutely massive. It's hard to believe this will reach us at home in about 12 months.

What I believe is that they'll get a huge visual boost on the new console hardware, with tweaks and tricks to come close to what was shown. On the PC we'll see lousy ports because of the lack of will, time and money to do it properly. So they sell more console hardware, the PC is left behind again, and all the next-gen graphics bloom will just be a smoke bomb from the marketing machinery.
It was shown on a single GTX 680. And why would we see lousy ports? The PS4 has PC hardware, and the 720 is rumored to as well. It's not even a port anymore.
   
Reply With Quote
 
Old
  (#5)
PhazeDelta1
Moderator
 
PhazeDelta1's Avatar
 
Videocard: EVGA 780Ti Classified SLI
Processor: Intel i7 4770k
Mainboard: Asus Sabertooth Z87
Memory: 16GB Corsair 2133MHz
Soundcard: Asus Xonar Phoebus
PSU: EVGA SuperNOVA 1200 P2
Default 03-30-2013, 15:05 | posts: 13,763 | Location: USA

wrong thread.
   
Reply With Quote
Old
  (#6)
boodikon
Ancient Guru
 
boodikon's Avatar
 
Videocard: Leadtek 8800 GTS 640mb (600 core)
Processor: Amd 64 X2 4400+ @ 2.4 Ghz 2mb cache
Mainboard: Asus A8N SLI-Deluxe Nforce4
Memory: 4 x 512mb PC4000 Crucial Ballistix
Soundcard: Audigy 2 ZS
PSU: Tagan 480w silent psu
Default 03-30-2013, 15:45 | posts: 2,825 | Location: England

I like it; the Unreal engine is going in the right direction.
   
Reply With Quote
Old
  (#7)
CPC_RedDawn
Ancient Guru
 
CPC_RedDawn's Avatar
 
Videocard: 3GB HD7970OC/2GB HD7770
Processor: 4770K@4.5GHz/Q6600@3.6GHz
Mainboard: Z87-GD65 / P5K PREMIUM
Memory: 16GB@1866MHz/4GB@1066MHz
Soundcard: Creative SoundBlaster Z
PSU: 1200W/900W
Default 03-30-2013, 15:47 | posts: 6,044 | Location: Wolverhampton/United Kingdom

It's all well and good having flashy graphics, but what I want is proper optimization: better performance as well as better graphics. They can push as many particles and textures as they want, but until we actually have hardware that can run it properly, WITH a power-hungry OS in the background, I won't be happy.

Also, videos of tech demos are never something to go by. The footage has been recorded to a video file that will be compressed and capped at 24 or 30 fps, and this may not even be the demo running in full detail... they could have turned settings down to maintain a fluid frame rate.

What I want to see is a real-time demo: record on a camera someone actually playing it, or show the demo running on current PC hardware with all the bells and whistles turned on, along with the frame rates it gets. It's that simple. If they showed us that, I'd be impressed.
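The point about recorded footage is just frame-time arithmetic: a capture capped at 30 fps shows one frame every ~33 ms no matter how fast the demo actually renders. A quick illustrative sketch (the numbers are hypothetical, not measurements from the demo):

```python
def visible_fps(render_fps: float, capture_cap: float = 30.0) -> float:
    """Frame rate actually visible in a recording capped at capture_cap fps."""
    return min(render_fps, capture_cap)

# A demo rendering at 45 fps and one rendering at 120 fps look identical
# in a 30 fps video file:
assert visible_fps(45) == visible_fps(120) == 30.0

# Per-frame time budget at common capture rates, in milliseconds:
for fps in (24, 30, 60):
    print(f"{fps} fps -> {1000 / fps:.1f} ms per frame")
```

Which is why only an uncapped live run with a visible frame counter says anything about real performance.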
   
Reply With Quote
Old
  (#8)
thatguy91
Ancient Guru
 
Videocard: HIS R9-280x Iceq X2 Turbo
Processor: i5-3570K
Mainboard: Asrock Z77 Extreme6
Memory: DDR3-2400 2x8GB
Soundcard: ALC898 + Microlab FC-730
PSU: Enermax Platimax 750W
Default 03-30-2013, 17:12 | posts: 4,152 | Location: Australia

Quote:
Originally Posted by Denial View Post
It was shown on a single GTX 680. And why would we see lousy ports? The PS4 has PC hardware, and the 720 is rumored to as well. It's not even a port anymore.
Hmmm, yes and no. Comparing PS4, Xbox Next, and PC (Windows) would be like comparing Linux and Windows, and saying that a program moved over from Linux to Windows isn't ported. Same hardware, but the same issues as with console ports (potentially buggy and slow).

If anything, simpler porting makes better graphics on PC less likely. Games ported from Xbox Next would be the DirectX-based games, and games from PS4 the OpenGL ones. It means we probably won't get much extra graphical polish added to Xbox Next and PS4 ports, and they'll probably do the same old trick of porting the textures directly, or upsampling them. I highly doubt they'd do things properly and make textures for PC that they downsample for consoles. Noooo, that makes too much sense!
   
Reply With Quote
Old
  (#9)
CronoGraal
Ancient Guru
 
CronoGraal's Avatar
 
Videocard: XFX Radeon HD6870 1GB
Processor: Intel Q9450 @ 3.4ghz
Mainboard: Gigabyte X48-DQ6
Memory: 2x2GB Corsair Dominator
Soundcard: ASUS Xonar Essence STX
PSU: Corsair 750W TX
Default 03-30-2013, 17:58 | posts: 3,843 | Location: Sweden

-insert bandwagon console bashing post here-
   
Reply With Quote
Old
  (#10)
Koniakki
Maha Guru
 
Koniakki's Avatar
 
Videocard: Palit 780Ti JS@1280/7500
Processor: Intel 3770k@4.6G's
Mainboard: Asrock Z77 OC Formula
Memory: Kingston P-X 16GB@2400Mhz
Soundcard: Asrock SPDIF-OUT
PSU: XFX BLACK PRO 750W GOLD
Default 03-30-2013, 19:26 | posts: 1,377 | Location: Inside My Thoughts..

WOW?? Holy moly. Nice demo... Really nice demo... REALLY, REALLY NICE DEMO!! Did I say already how nice this demo is?
   
Reply With Quote
 
Old
  (#11)
Chillin
Ancient Guru
 
Chillin's Avatar
 
Videocard: Gigabyte GTX 560 930/2300
Processor: i5-2500K@4.4GHz 1.2v H60
Mainboard: Asrock Z77 Pro4
Memory: G.Skill 2X4GB DDR3-1600
Soundcard: X-Fi XtremeGamer+Z506 5.1
PSU: Corsair TX 750w v2
Default 03-30-2013, 20:20 | posts: 6,440 | Location: Chilling

Quote:
Originally Posted by Denial View Post
It was shown on a single GTX680. And why would we see lousy ports? The PS4 is and the 720 is rumored to both have PC hardware. It's not even a port anymore.
Because this thing ran on a system with a GTX 680 (which is far stronger than the GPU in the PS4), an i7 (which is another world away from the pathetic CPU in the PS4) and 16GB of RAM + 4GB (safe to assume) on the GTX680; not to mention probably an SSD, etc.

Good luck with that.
   
Reply With Quote
Old
  (#12)
Koniakki
Maha Guru
 
Koniakki's Avatar
 
Videocard: Palit 780Ti JS@1280/7500
Processor: Intel 3770k@4.6G's
Mainboard: Asrock Z77 OC Formula
Memory: Kingston P-X 16GB@2400Mhz
Soundcard: Asrock SPDIF-OUT
PSU: XFX BLACK PRO 750W GOLD
Default 03-30-2013, 20:42 | posts: 1,377 | Location: Inside My Thoughts..

Quote:
Originally Posted by Chillin View Post
Because this thing ran on a system with a GTX 680 (which is far stronger than the GPU in the PS4), an i7 (which is another world away from the pathetic CPU in the PS4) and 16GB of RAM + 4GB (safe to assume) on the GTX680; not to mention probably an SSD, etc.

Good luck with that.
Sad, but probably true...
   
Reply With Quote
Old
  (#13)
Denial
Ancient Guru
 
Denial's Avatar
 
Videocard: EVGA GTX 980
Processor: i7-3770K
Mainboard: ASUS Maximus 5 Formula
Memory: 16GB Corsair DDR3 2133
Soundcard: ZxR & HD800 Lyr/PC350SE
PSU: Seasonic 1000w
Default 03-30-2013, 22:11 | posts: 7,240 | Location: Above Earth in a Big Rocket Ship

Quote:
Originally Posted by thatguy91 View Post
Hmmm, yes and no. Comparing PS4, Xbox Next, and PC (Windows) would be like comparing Linux and Windows, and saying that a program moved over from Linux to Windows isn't ported. Same hardware, but the same issues as with console ports (potentially buggy and slow).

If anything, simpler porting makes better graphics on PC less likely. Games ported from Xbox Next would be the DirectX-based games, and games from PS4 the OpenGL ones. It means we probably won't get much extra graphical polish added to Xbox Next and PS4 ports, and they'll probably do the same old trick of porting the textures directly, or upsampling them. I highly doubt they'd do things properly and make textures for PC that they downsample for consoles. Noooo, that makes too much sense!
Linux to Windows ports are only an issue when you're going from Direct3D to OpenGL. id has no problem porting all its OpenGL games to Linux with excellent performance. Further, the 720 will presumably use Direct3D for its console development. If it indeed uses x86 hardware and a standard GPU architecture, it will literally be a PC. Microsoft's development tools are the best as far as I'm concerned, and that means we'll see excellent ports.

Also, I disagree with your last paragraph entirely. What reason was there beforehand to put better graphics into a game? Regardless, it costs money, and that's all that matters.

Quote:
Originally Posted by Chillin View Post
Because this thing ran on a system with a GTX 680 (which is far stronger than the GPU in the PS4), an i7 (which is another world away from the pathetic CPU in the PS4) and 16GB of RAM + 4GB (safe to assume) on the GTX680; not to mention probably an SSD, etc.

Good luck with that.
It's 2.5 TF vs 1.84 TF, roughly a 26% gap. The CPU is largely irrelevant in consoles, as the rendering pipeline is highly optimized for the hardware and there are no background tasks or other items eating CPU cycles.

The amount of system RAM hardly matters for games, as all textures being rendered are stored on the GPU. The PS4's GPU has access to probably 6GB (I figure at least 2 will be used by other things) vs 4GB on the 680 (and I think far more users have the 2GB version).

Plus, you have no idea what the framerate of the demo is. For all you know it's running at 100 fps here, and scaled down you'd still get 75 fps, which is more than playable, especially since most console gamers are used to around 30 fps.
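The TFLOPS comparison above depends on which GPU you treat as the baseline. A quick sketch of both readings, using the figures quoted in this thread (2.5 TFLOPS for the GTX 680, 1.84 for the PS4 GPU; these are the thread's numbers, not official specs):

```python
gtx680_tflops = 2.5   # figure quoted in this thread
ps4_tflops = 1.84     # figure quoted in this thread

# PS4 shortfall relative to the GTX 680:
deficit = (gtx680_tflops - ps4_tflops) / gtx680_tflops   # ~0.264 -> ~26%

# Equivalently, the GTX 680's advantage over the PS4:
advantage = (gtx680_tflops - ps4_tflops) / ps4_tflops    # ~0.359 -> ~36%

print(f"PS4 is ~{deficit:.0%} behind; the 680 is ~{advantage:.0%} ahead")
```

Either way the gap is real but not an order of magnitude, which is the crux of the disagreement here.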
   
Reply With Quote
Old
  (#14)
tsunami231
Ancient Guru
 
tsunami231's Avatar
 
Videocard: EVGA 660gtx sig2
Processor: i7 920 CNPS10X Quiet
Mainboard: Evga x58 SLI LE
Memory: 3x2gb Dominator@1600 6Gb
Soundcard: Realtek HD Audio
PSU: Antec Truepower 750
Default 03-30-2013, 22:36 | posts: 3,607 | Location: USA

Quote:
Originally Posted by Hilbert Hagedoorn View Post
Epic showed off its Unreal Engine 4 today at the Game Developers Conference. ...

Unreal Engine 4 'Infiltrator' Demo

Next gen for PCs, sure; next gen for consoles, doubtful. We have yet to see, on current consoles, the visuals that were initially shown for Killzone 2; there was a night-and-day difference between what was initially shown and what we actually got when the game came out.
   
Reply With Quote
Old
  (#15)
The Chubu
Maha Guru
 
The Chubu's Avatar
 
Videocard: MSi GTX560 TwinFrozrII OC
Processor: i5 2500K stock
Mainboard: Asus P8P67-M Pro
Memory: 16Gb Patriot G2 1333Mhz
Soundcard: Onboard Realtek
PSU: Satellite SL-8600EPS 600w
Default 03-31-2013, 00:50 | posts: 2,541 | Location: Look out!

Could somebody please clean the simulated dirt off the simulated lens of the simulated camera? Thank you.
   
Reply With Quote
Old
  (#16)
Chillin
Ancient Guru
 
Chillin's Avatar
 
Videocard: Gigabyte GTX 560 930/2300
Processor: i5-2500K@4.4GHz 1.2v H60
Mainboard: Asrock Z77 Pro4
Memory: G.Skill 2X4GB DDR3-1600
Soundcard: X-Fi XtremeGamer+Z506 5.1
PSU: Corsair TX 750w v2
Default 03-31-2013, 05:00 | posts: 6,440 | Location: Chilling

Quote:
Originally Posted by Denial View Post
Linux to Windows ports are only an issue when you're going from Direct3D to OpenGL. id has no problem porting all its OpenGL games to Linux with excellent performance. Further, the 720 will presumably use Direct3D for its console development. If it indeed uses x86 hardware and a standard GPU architecture, it will literally be a PC. Microsoft's development tools are the best as far as I'm concerned, and that means we'll see excellent ports.

Also, I disagree with your last paragraph entirely. What reason was there beforehand to put better graphics into a game? Regardless, it costs money, and that's all that matters.

It's 2.5 TF vs 1.84 TF, roughly a 26% gap. The CPU is largely irrelevant in consoles, as the rendering pipeline is highly optimized for the hardware and there are no background tasks or other items eating CPU cycles.

The amount of system RAM hardly matters for games, as all textures being rendered are stored on the GPU. The PS4's GPU has access to probably 6GB (I figure at least 2 will be used by other things) vs 4GB on the 680 (and I think far more users have the 2GB version).

Plus, you have no idea what the framerate of the demo is. For all you know it's running at 100 fps here, and scaled down you'd still get 75 fps, which is more than playable, especially since most console gamers are used to around 30 fps.
Which is exactly why we already saw a scaled-down version of the previous tech demo, Elemental, to get it to run on the PS4:

http://www.youtubemultiplier.com/515...n-pc-left-.php



(top is PC):
[IMG]http://www.*********.com/wp-content/uploads/2013/02/ue4.jpg[/IMG]
[IMG]http://www.*********.com/wp-content/uploads/2013/02/Elemental-PCvsPS4.jpg[/IMG]

Quote:
As we can clearly see, the PC version features more particles effects than the PS4 version. Of course some might say that the PS4 version still features enough particles, however one of the things that really impressed us on the Elemental Tech Demo – back in 2012 – was the amount of its particles effects and the fact that it could easily handle them.

Lighting also seems to be better on the PC. Not only that, but pay attention to the Knight's eyes. On the PC version, you can clearly see that his eyes are 'burning'. That effect, though, is nowhere to be found on the PS4 version. Due to the lack of such a feature, the Knight on the PS4 does not look as frightening as the one on the PC. As Reddit user 'ForHomeUseOnly' also notes, 'there isn't a shadowed area in the corner wall area by the door, and it looks like there isn't any light bouncing going on, and missing shadowing in some areas.'

In addition, specular highlights – or the shininess of the character – seem better on the PC. The PS4 version has very 'plastic and blown out reflections'. Furthermore, the Knight seems to be getting less indirect lighting on the PS4 version. As we can easily spot, there is some blue indirect lighting from the environment on the Knight's armor in the PC tech demo, something that is missing on the PS4.

Textures also seem of a lower quality. This can be easily spotted on the Knight’s armor, the ice chunks and the door. Ah yes, the door looks really awful on the PS4 version of the Elemental demo. The PS4 version also lacks the DOF effect that was present on the PC version, though some might be glad that Epic Games did not use it. Smoke effects are also decreased to a minimum, something that can be easily spotted in the second comparison shot.

You can easily tell that the PS4 version is basically a downgrade from the PC version of Elemental. As a means of proof, we’ve included both tech demos below. We should also note that Elemental was running on a single GTX680 on the PC. We also know that Elemental was running at 1080p and 60fps on the PC. Epic Games has not revealed the resolution and the framerate of the PS4 version, though we expect it to be at 1080p and 30fps.

And this is 6-9 months before the console even launches.

And the CPU does matter, even on a console; why would it magically cease to be an impediment on a console vs. in a PC game? Even AnandTech tweeted disappointment with the CPU side.

Last edited by Chillin; 03-31-2013 at 05:07.
   
Reply With Quote
Old
  (#17)
Denial
Ancient Guru
 
Denial's Avatar
 
Videocard: EVGA GTX 980
Processor: i7-3770K
Mainboard: ASUS Maximus 5 Formula
Memory: 16GB Corsair DDR3 2133
Soundcard: ZxR & HD800 Lyr/PC350SE
PSU: Seasonic 1000w
Default 03-31-2013, 06:11 | posts: 7,240 | Location: Above Earth in a Big Rocket Ship

Quote:
Originally Posted by Chillin View Post
Which is exactly why we already saw a scaled-down version of the previous tech demo, Elemental, to get it to run on the PS4:

And this is 6-9 months before the console even launches.

And the CPU does matter, even on a console; why would it magically cease to be an impediment on a console vs. in a PC game? Even AnandTech tweeted disappointment with the CPU side.
One is running D3D, the other OpenGL; there are bound to be differences in graphics between the two, especially when it comes to lighting, as the shaders are totally different. Also, why would the textures be limited on PS4? It has nearly the same texture bandwidth as the GTX 680 and access to way more memory. Anyway, we don't even know how long Epic has had to optimize their engine for the PS4. AMD's SIMD architecture barely gets utilized properly now: a 7970, despite having significantly more shader performance than a 680, performs similarly. Once these games are optimized to fill all those SIMDs, the performance gap will close.

As for the CPU, it hardly matters on a PC either. Look at AMD vs Intel: AMD, despite performing worse in pretty much everything, matches Intel when the load is GPU-bottlenecked, which is going to be the case in pretty much everything next-gen. On a console it matters even less, especially considering they are offloading more and more onto specialized chips (like the PS4's encoding unit or its GPU-based physics).

I'm also pretty sure that the engineers who have been designing CPUs and GPUs for the last 15 or so years know how to properly strike a balance between the two, especially at AMD, which has been dominating in the APU area.

I mean, obviously these consoles aren't going to be as good as PCs. But they aren't terrible, and they are more than capable of running this demo at a decent framerate. And because the hardware is so similar to a PC's, ports off the consoles should be less buggy and perform better. Hopefully they'll take the time, as they used to, to implement better control schemes for mouse and keyboard.

Last edited by Denial; 03-31-2013 at 06:15.
   
Reply With Quote
Old
  (#18)
Angushades
Member Guru
 
Videocard: Evga 680SLI EK COOLED
Processor: I7 3770K @4.5 EK COOLED
Mainboard: Asus P8Z77-V Deluxe
Memory: 8GB G.Skill-2133 CL9
Soundcard: Auzentech X-Fi Forte
PSU: Enermax 1250W
Default 03-31-2013, 07:33 | posts: 129 | Location: Australia

Quote:
Originally Posted by Denial View Post
One is running D3D, the other OpenGL; there are bound to be differences in graphics between the two, especially when it comes to lighting, as the shaders are totally different. Also, why would the textures be limited on PS4? It has nearly the same texture bandwidth as the GTX 680 and access to way more memory. Anyway, we don't even know how long Epic has had to optimize their engine for the PS4. AMD's SIMD architecture barely gets utilized properly now: a 7970, despite having significantly more shader performance than a 680, performs similarly. Once these games are optimized to fill all those SIMDs, the performance gap will close.

As for the CPU, it hardly matters on a PC either. Look at AMD vs Intel: AMD, despite performing worse in pretty much everything, matches Intel when the load is GPU-bottlenecked, which is going to be the case in pretty much everything next-gen. On a console it matters even less, especially considering they are offloading more and more onto specialized chips (like the PS4's encoding unit or its GPU-based physics).

I'm also pretty sure that the engineers who have been designing CPUs and GPUs for the last 15 or so years know how to properly strike a balance between the two, especially at AMD, which has been dominating in the APU area.

I mean, obviously these consoles aren't going to be as good as PCs. But they aren't terrible, and they are more than capable of running this demo at a decent framerate. And because the hardware is so similar to a PC's, ports off the consoles should be less buggy and perform better. Hopefully they'll take the time, as they used to, to implement better control schemes for mouse and keyboard.
BS, buddy. Is that why AMD CPUs can't keep up with their own GPUs? The CPU matters a lot; it feeds the GPU. And don't think of bad console ports as an example, either.
   
Reply With Quote
Old
  (#19)
-Tj-
Ancient Guru
 
-Tj-'s Avatar
 
Videocard: ZOTAC GTX780 OC AmpFan
Processor: i7 4770K OC 4.7GHz @1.28v
Mainboard: ASUS Z87 Deluxe
Memory: Crucial BLE 16GB 2400MHz
Soundcard: Creative X-Fi Titanium HD
PSU: Chieftec NTRO88+ 650W
Default 03-31-2013, 07:47 | posts: 8,318 | Location: Urban`Jungle

Well, Jaguar will be a lot better than the Bulldozer/Vishera architecture; it also won't have two shared threads per module, but each module with its own thread. If Bulldozer had this approach it would crush current SB/IB.
   
Reply With Quote
Old
  (#20)
Angushades
Member Guru
 
Videocard: Evga 680SLI EK COOLED
Processor: I7 3770K @4.5 EK COOLED
Mainboard: Asus P8Z77-V Deluxe
Memory: 8GB G.Skill-2133 CL9
Soundcard: Auzentech X-Fi Forte
PSU: Enermax 1250W
Default 03-31-2013, 07:49 | posts: 129 | Location: Australia

Quote:
Originally Posted by -Tj- View Post
Well, Jaguar will be a lot better than the Bulldozer/Vishera architecture; it also won't have two shared threads per module, but each module with its own thread. If Bulldozer had this approach it would crush current SB/IB.
Yeah, key words there: "it would"... Where is it? Living in theory.
OMG, if it had this and it did that... oh wait, it doesn't exist.

Last edited by Angushades; 03-31-2013 at 07:53.
   
Reply With Quote
Old
  (#21)
-Tj-
Ancient Guru
 
-Tj-'s Avatar
 
Videocard: ZOTAC GTX780 OC AmpFan
Processor: i7 4770K OC 4.7GHz @1.28v
Mainboard: ASUS Z87 Deluxe
Memory: Crucial BLE 16GB 2400MHz
Soundcard: Creative X-Fi Titanium HD
PSU: Chieftec NTRO88+ 650W
Default 03-31-2013, 07:54 | posts: 8,318 | Location: Urban`Jungle

Quote:
Originally Posted by Angushades View Post
Yeah, key words there: "it would"... Where is it? Living in theory.
OMG, if it had this and it did that... oh wait, it doesn't exist.
It's in Jaguar, and some bits are in Steamroller...
   
Reply With Quote
Old
  (#22)
Angushades
Member Guru
 
Videocard: Evga 680SLI EK COOLED
Processor: I7 3770K @4.5 EK COOLED
Mainboard: Asus P8Z77-V Deluxe
Memory: 8GB G.Skill-2133 CL9
Soundcard: Auzentech X-Fi Forte
PSU: Enermax 1250W
Default 03-31-2013, 08:03 | posts: 129 | Location: Australia

OK buddy.
   
Reply With Quote
Old
  (#23)
Chillin
Ancient Guru
 
Chillin's Avatar
 
Videocard: Gigabyte GTX 560 930/2300
Processor: i5-2500K@4.4GHz 1.2v H60
Mainboard: Asrock Z77 Pro4
Memory: G.Skill 2X4GB DDR3-1600
Soundcard: X-Fi XtremeGamer+Z506 5.1
PSU: Corsair TX 750w v2
Default 03-31-2013, 08:10 | posts: 6,440 | Location: Chilling

Quote:
Originally Posted by -Tj- View Post
Well, Jaguar will be a lot better than the Bulldozer/Vishera architecture; it also won't have two shared threads per module, but each module with its own thread. If Bulldozer had this approach it would crush current SB/IB.
Jaguar is in the same league as Atom, Vishera would eat it alive. It's not even supposed to be in the same benchmark graph as Vishera or Ivy Bridge.
   
Reply With Quote
Old
  (#24)
Chillin
Ancient Guru
 
Chillin's Avatar
 
Videocard: Gigabyte GTX 560 930/2300
Processor: i5-2500K@4.4GHz 1.2v H60
Mainboard: Asrock Z77 Pro4
Memory: G.Skill 2X4GB DDR3-1600
Soundcard: X-Fi XtremeGamer+Z506 5.1
PSU: Corsair TX 750w v2
Default 03-31-2013, 08:14 | posts: 6,440 | Location: Chilling

Quote:
Originally Posted by Denial View Post
One is running D3D, the other OpenGL; there are bound to be differences in graphics between the two, especially when it comes to lighting, as the shaders are totally different. Also, why would the textures be limited on PS4? It has nearly the same texture bandwidth as the GTX 680 and access to way more memory. Anyway, we don't even know how long Epic has had to optimize their engine for the PS4. AMD's SIMD architecture barely gets utilized properly now: a 7970, despite having significantly more shader performance than a 680, performs similarly. Once these games are optimized to fill all those SIMDs, the performance gap will close.

As for the CPU, it hardly matters on a PC either. Look at AMD vs Intel: AMD, despite performing worse in pretty much everything, matches Intel when the load is GPU-bottlenecked, which is going to be the case in pretty much everything next-gen. On a console it matters even less, especially considering they are offloading more and more onto specialized chips (like the PS4's encoding unit or its GPU-based physics).

I'm also pretty sure that the engineers who have been designing CPUs and GPUs for the last 15 or so years know how to properly strike a balance between the two, especially at AMD, which has been dominating in the APU area.

I mean, obviously these consoles aren't going to be as good as PCs. But they aren't terrible, and they are more than capable of running this demo at a decent framerate. And because the hardware is so similar to a PC's, ports off the consoles should be less buggy and perform better. Hopefully they'll take the time, as they used to, to implement better control schemes for mouse and keyboard.

Again, I want to know where this myth that the CPU doesn't matter at high resolutions and graphics settings came from. It's been debunked so many times it's almost criminal to keep repeating it.

Here's a bunch of benchmarks at 1080p (the same resolution this console is targeted at):

[benchmark images]

The problem is that console ports up till now have nearly always been like this:

[benchmark image]

But that's 100% a problem of the game itself not being properly coded by the devs. Games that are properly coded end up looking like the ones I quoted.

And now keep in mind that Jaguar is far weaker than all of the CPUs on that list; even eight of its cores wouldn't come close to many of them.
   
Reply With Quote
Old
  (#25)
-Tj-
Ancient Guru
 
-Tj-'s Avatar
 
Videocard: ZOTAC GTX780 OC AmpFan
Processor: i7 4770K OC 4.7GHz @1.28v
Mainboard: ASUS Z87 Deluxe
Memory: Crucial BLE 16GB 2400MHz
Soundcard: Creative X-Fi Titanium HD
PSU: Chieftec NTRO88+ 650W
Default 03-31-2013, 08:27 | posts: 8,318 | Location: Urban`Jungle

Quote:
Originally Posted by Chillin View Post
Jaguar is in the same league as Atom, Vishera would eat it alive. It's not even supposed to be in the same benchmark graph as Vishera or Ivy Bridge.
Maybe, but it has a more efficient architecture than Vishera; it has no thread sharing, and each module can do more...
   
Reply With Quote
Copyright (c) 1995-2014, All Rights Reserved. The Guru of 3D, the Hardware Guru, and 3D Guru are trademarks owned by Hilbert Hagedoorn.