Discussion in 'Videocards - AMD Radeon Drivers Section' started by trocio2, Mar 7, 2014.
I know all about Khronos, as do most people here.
I suggest you not speak up when you haven't a clue what others are talking about.
If you don't have anything positive to contribute to the discussion, keep your opinions to yourself.
Since you clearly are unable to read context, I will restate my assertion:
OGL and Khronos are not any faster at implementation than MS.
Historically OGL feature parity with DX has lagged behind, although that has improved recently.
OpenGL (OpenGL ES) gives very good results even on weak tablet GPUs and CPUs.
I used an Amlogic 8726-MX based tablet. It includes a Mali-400 MP2.
If it's true that it has 5 GFLOPS per core, its total power is 10 or 20 GFLOPS (2 or 4 cores)... the resolution was 1024x600.
And Mali GPUs are internally MP2, MP4, MP8, and so on. They always use multiple GPU cores, but there is no problem like with SLI or CrossFire (maybe because they are engineered into one chip, not connected externally).
I played games on it like that, and tablet firmware updates had an effect as if the hardware itself had changed.
I have an iGPU 15x more powerful and a dGPU 30x more powerful than that tablet, but I'm not satisfied. Driver problems... API problems... new GPU problems...
Well, playing on a 7" screen at 1024x600 at a feature level lower than DX9_3 is not that much of a problem.
- 1st: that screen has 3.375x fewer pixels to render than 1080p
- 2nd: those games aren't really using anything GPU-intensive, as they are made for below-average tablets whose authors want to sell them (they look good only due to the small screen relative to the resolution)
- 3rd: I have a 2048x1536 screen on my tablet and it's a pain to play many games at this native resolution
- 4th: MP2/4/8 does not mean it's multi-GPU like SLI/CFX; it's like having one GPU with more shaders and ROPs
And in the end, comparing tablet CPU vs. GPU compute power against PC parts: average tablets have something like a 20x worse GPU-to-CPU ratio than PCs.
(Meaning the CPU can feed the GPU, but in most cases the GPU is way too slow on a tablet, so there is no big need to remove CPU overhead... only to prolong battery life.)
You could probably have Doom 3 running on a tablet if you dropped the four 8k x 8k and 4k x 4k textures per monster model down to 256x256, disabled lightmaps, shadowmaps, and bumpmaps, and baked that information directly into the textures. And that's a 10-year-old game which would have to be crippled considerably to run well on a tablet.
- There is simply a day-and-night difference in the workload developers place upon desktop PCs and tablets.
- The only thing that's the same: if you get low-end hardware, do not expect anything at all to run smoothly.
Is your 2048x1536 tablet a new iPad? I've seen the new iPad. No game is painful on it: always smooth FPS, no stuttering, no slowdown (the same goes for Android tablets with good firmware/hardware).
And tablet GPUs are at the start of their evolution. PC GPUs and APIs are much older and much more mature.
Perfectly smooth UI effects on the iOS desktop.
Doom was built for desktop PCs, not designed for tablet OpenGL ES.
I remember the Nvidia FX 5200 GPUs on Windows. Doom 3 had the best graphics/performance combination when it was released, and it was totally OpenGL. When Doom 3 came out (especially when its technology demos were released), there was no DirectX game comparable to it.
And Wolfenstein: The New Order is coming... we will see what OpenGL is capable of with the new standards.
Mantle is for laptops and weak PCs... nobody wants Mantle, unless you want to play games on a tablet.
Nope, that is where it shines most, but Mantle (and the supposed new DX) just lets you get more out of an old CPU. I'm on 2x290 with an old i7-920@4GHz; I know games are hitting the CPU bottleneck and I have to replace it. Now imagine a new API that keeps my CPU/mobo/memory viable for several years to come. The thing is, CPU speed has hardly mattered in recent years: even if you get one twice as fast, the only place a casual user will notice is gaming. And once that is covered too, by not wasting resources on API overhead, we can skip upgrading anything but the videocard for many, many years. Now who would want that?
You can't use all the OpenGL extensions in an engine if other vendors don't have those extensions or alternatives; that's like saying Mantle gives you no benefit on Intel or Nvidia cards, or that BF4 doesn't have 9x more draw calls than the DX11.1 version.
There aren't going to be a lot more draw calls in anything but tech demos unless all major vendors support it, so either Mantle for Nvidia and Intel too, or just a better OpenGL implementation from AMD and Intel.
To give a real-world example of the driver problems around OpenGL:
You can't have native WebGL on Windows in almost any browser because some vendors' OpenGL drivers have so many bugs that devs ended up translating all WebGL calls to DirectX (using ANGLE) to work around them. ANGLE gives you lower performance (on vendors that have a well-implemented OpenGL), it's not compatible with all WebGL functions, and it's slower at compiling shaders.
But ANGLE works the same way on all GPUs, while native WebGL works only in some cases.
You can ignore part of the market, and Carmack would too, just to show how good it can get. Then he would simply turn the effect off for AMD and replace it with another low-cost one.
Because it's a fundamental OpenGL feature that you can decide, based on the available feature level, what gets rendered on the hardware-accelerated path; you can allow CPU emulation, choose another effect as a replacement, or turn it off completely. No need to limit your imagination.
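That kind of feature-level decision usually starts with an extension query. A minimal sketch in C (the extension names are only examples): the subtle part is that a naive strstr() match is wrong, because "GL_ARB_texture" would false-positive inside "GL_ARB_texture_float", so you have to match whole space-delimited tokens.

```c
#include <string.h>

/* Return 1 if `name` appears as a whole, space-delimited token in the
 * extension string `list` (the format returned by
 * glGetString(GL_EXTENSIONS) on pre-GL3 contexts). A plain strstr()
 * would false-positive on prefixes, e.g. "GL_ARB_texture" matching
 * inside "GL_ARB_texture_float". */
static int has_extension(const char *list, const char *name) {
    size_t len = strlen(name);
    const char *p = list;
    while ((p = strstr(p, name)) != NULL) {
        int starts = (p == list || p[-1] == ' ');
        int ends   = (p[len] == '\0' || p[len] == ' ');
        if (starts && ends)
            return 1;
        p += len;
    }
    return 0;
}
```

An engine then picks its path per effect: if the extension is present, use the hardware version; otherwise swap in a cheaper replacement or disable it, exactly as described above.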
You know, there is even trouble finding a tech demo showing those improvements of OpenGL over DirectX.
And no, Mantle does not deliver 9x more draw calls in BF4, but it works. Once paired with a poor CPU, you still get better graphics and performance than BF3 or BF4 in DX mode.
You can see some of the reality here (a video about how it could work, not about how it actually does).
Then you can see what today's available APIs do in action in UE4.
It ends with a massive number of particles. Display lists help, command queues help, geometry instancing helps. But that's far from enough, and far from revolutionary.
And the point is: even if you use every trick and optimization DX/OGL has, you would still not get CPU overhead down enough.
The problem with DX12 is... my current graphics card may not support it. My current operating system doesn't support it either. The only way I would switch to DX12/Windows 9 is if I could get full feature support on my current card. Too many unknowns. If MS somehow makes it backwards compatible with old DX11 cards, then Mantle is closer to death.
However, my current graphics card/OS kind of should support Mantle eventually.
Ah, I was misunderstanding that post; my response was a bit of a low blow :roll:
I have actually programmed with OpenGL. Of the 3 vendors, AMD has always been the best for me. Any time somebody has an issue with OGL on AMD, it's because the developer only tested their code on Nvidia. Do you want to know how Nvidia manages to ship drivers supporting the newest OpenGL on the very day the new version is announced? It's because they cut corners. They don't follow the OpenGL spec correctly. Programmers then write malformed code which the Nvidia driver will accept and run just fine, but which won't work at all for anybody else.
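A classic illustration of the kind of code that slips through a lenient compiler (a hypothetical fragment, but the spec rules it violates are real):

```glsl
// GLSL ES 1.00 fragment shader: strictly invalid in two ways,
// yet historically accepted by some lenient desktop compilers.
varying vec2 uv;
uniform sampler2D tex;

float brightness = 1;   // spec: no implicit int -> float conversion
                        // in GLSL ES 1.00; must be written as 1.0

void main() {
    // spec: an ES fragment shader must declare a default float
    // precision (e.g. "precision mediump float;") or qualify
    // every float variable, which this shader never does
    gl_FragColor = texture2D(tex, uv) * brightness;
}
```

A shader like this appears to "work" during development on the lenient driver, then fails to compile on a strict, spec-conforming one, and the strict vendor gets blamed.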
I have worked with id Tech 3, 4, and whatever you want to call the mix of 4 and 5 used for Doom 3 BFG Edition. I thought Carmack was a pretty cool dude and a great developer, but after working with that code and seeing the bold statements he makes, he comes across as more of an ass than anything.
If Nvidia were actually trying to improve OpenGL, they'd submit their "super amazing extensions" as ARB or EXT and NOT NV. That's just monopolizing the market.
Don't even get me started on your FUD about AMD and Linux. You, like many people, are extremely out of date on that. GCN performance on the open-source radeon driver was terrible at first, yes, but for a good month now that has completely changed. Go have a look at Nvidia's open-source driver on Linux; it's hilarious.
HOW does AMD suck at OpenCL? Bitcoin/Litecoin/etc. The end.
I can't even continue trying to correct everything that's incorrect in your post, so I'll sum it up with this: not one thing in that post is correct.
As I said above, I'm sorry, I misunderstood what you were saying in your post.
No problem, sorry I overreacted...
We are so close to the big day...
I don't think you overreacted at all; in retrospect I was being kind of a :tool:, and you had every right to react the way you did.
Could someone give me a link to the live stream of GDC 2014 where DirectX 12 will be announced?
They're starting in 55 min, right?
My Google skills are only showing me this.
Here are the highlights:
Sounds like most DX11 cards will support DX12; not everything in DX12, but a good portion, and the vendors will release a list of which cards are supported.
My guess is that the 5000 series and up will get support.
The release date is Holiday 2015, with a preview this year.
Also, early access will open up at some point, but there's no mention of when or how to get it.
They say laptops are supported too, which is cool, but... I don't trust AMD. They surely won't care about laptops, just like with Mantle. What a shame...
Let's all just pray they do. I don't have a laptop so it doesn't affect me in any way, but I hope this will change with DX12.