Emulating OpenGL 2.0 for Intel 915GM graphics?

Discussion in 'Videocards vs General Purpose - NVIDIA Ageia PhysX' started by Seestern, Nov 15, 2005.

  1. Seestern

    Seestern Member

    Messages:
    12
    Likes Received:
    0
    Hi,
I need OpenGL 2.0 for my laptop with the Intel 915GM graphics chipset, but Intel's driver only supports 1.4. Does anyone know a way to emulate OpenGL 2.0 with a software tool or something similar?
     
  2. Glidefan

    Glidefan Don Booze Staff Member

    Messages:
    12,332
    Likes Received:
    1
    GPU:
    GTX 1070 | 8600M GS
You could try the latest version of 3DAnalyzer, but I don't know if that would help much. If you manage to sort of emulate the 2.0 extensions, it will be slow. Or you might be able to trick the system into reporting that it supports 2.0 without actually providing any of the 2.0 features. Why do you need 2.0 on your laptop? Perhaps there is a workaround, or something else you could use that doesn't need OpenGL 2.0.
     
  3. Seestern

    Seestern Member

    Messages:
    12
    Likes Received:
    0
I tried it with 3DAnalyzer, but with no success. OpenGL is not really the problem; I wonder why it does not work with DirectX.

I want to use RenderMonkey from ATI for shader programming.

Knowing this now, it would have been better to buy a laptop with an ATI/NVIDIA card.
     
  4. realdude19

    realdude19 Active Member

    Messages:
    51
    Likes Received:
    0
    GPU:
    9600GT 512mb PCIx16 2.0
Yeah, that sucks. I have a Dell Latitude D500 that my dad gave me, and it could use some help too. When you were looking for OpenGL stuff, did you find anything on overclocking these things?
     

  5. {HLH}

    {HLH} Guest

You can't emulate OpenGL 2.0.

OpenGL 2.0 brings in the shading language; you need a video card capable of Shader Model 2.0/3.0.

The Intel chip is a fixed-function chip and cannot handle OpenGL 2.0.

It should be capable of OpenGL 1.5, though; it's strange that it isn't.
Blame Intel for not using decent integrated graphics.
     
  6. Seestern

    Seestern Member

    Messages:
    12
    Likes Received:
    0
The Intel 915GM chip supports only OpenGL 1.4, but it does support Pixel and Vertex Shader 2.0; it is a DirectX 9 part. However, it has no hardware T&L engine.
You can use RenderMonkey if you install the DirectX SDK from MS; with the SDK, the DX9 shaders can be emulated within RM.

I got a message from Intel regarding this problem:

    "Our graphics controllers are discrete controllers embedded into motherboards and laptops as an integrated and cost-effective video solution. We are not competing in the graphics market nor we comment or compare our products with those of third party manufacturers."
     
  7. Mannerheim

    Mannerheim Ancient Guru

    Messages:
    4,749
    Likes Received:
    1
    GPU:
    nvidia GTX 1070 mobile
Wow. Even the old NVIDIA Vanta from the '90s supports OpenGL 1.4 :D
     
  8. {HLH}

    {HLH} Guest

As far as I remember, no Intel graphics chip supports pixel shading.
     
  9. Dojomann

    Dojomann Ancient Guru

    Messages:
    3,628
    Likes Received:
    0
    GPU:
    GTX 275 896MB
Yeah, they do. The two latest ones, the 915 and 945, have PS 2.0. I should know; I use it.
     
  10. {HLH}

    {HLH} Guest

Uh-huh... they won't go that fast then, considering the CPU has to do it, which means it's just software accelerated...
     

  11. Dojomann

    Dojomann Ancient Guru

    Messages:
    3,628
    Likes Received:
    0
    GPU:
    GTX 275 896MB
Nah, the pixel shader is done in hardware; only the vertex shader is done by the CPU.
     
  12. {HLH}

    {HLH} Guest

I doubt it. If T&L is done via the rasterizer, then PS will be too, as the pixel shader depends on some T&L results.
     
  13. Dr. Vodka

    Dr. Vodka Ancient Guru

    Messages:
    3,803
    Likes Received:
    9
    GPU:
    Sapphire R9 290 Tri-X
IMO, Intel's design works, but the pixel shader support is mostly marketing. Without a T&L engine, all that vertex data will eat CPU cycles, and the PS engine will perform worse as a result. (AFAIK PS data is rendered in the pixel pipes; it doesn't use transform calls but lighting calls, which are handled in the PS pipeline. Think of the Transform as the vertex shader and the Lighting as the pixel shader.) That integrated solution is good for CS 1.6, AOE2, light DX7 games, and Windows use.
     
  14. {HLH}

    {HLH} Guest

Thanks, Doc :)

So in the end it's all marketing BS, and an Intel chip couldn't possibly have enough power to render a shader... lmao, I'd like to see it play NFS:MW HAHAHA
     
  15. nirvek

    nirvek Member

    Messages:
    44
    Likes Received:
    0
    GPU:
    Mobile Intel 915GM
Hmm.

Yeah, the software vertex shader lowers performance. Could part of the vertex shader code be emulated by the pixel shader? Many games still don't use PS, but many have a heavy load from vertex shader calculations... sorry for my bad English.
     

  16. {HLH}

    {HLH} Guest

The GeForce4 MX 440 has one vertex pipeline, so you're correct there.
     
  17. nirvek

    nirvek Member

    Messages:
    44
    Likes Received:
    0
    GPU:
    Mobile Intel 915GM
I had a GF4 MX 4000; it had T&L and a hardware vertex shader. But now I have a laptop with a mobile i915GM, which has Pixel Shader 2.0 in hardware but a vertex shader only in software. So I was asking whether the software vertex shader emulation could use the pixel shader to calculate something. Maybe performance would increase, since the CPU can't calculate everything (physics, AI, the rest of the game, and all the vertex shader calculations).
     
  18. nirvek

    nirvek Member

    Messages:
    44
    Likes Received:
    0
    GPU:
    Mobile Intel 915GM
I mean: could the software vertex shader use the pixel shader for FPU calculations, for lighting, bump mapping, and other related tasks?
     
  19. {HLH}

    {HLH} Guest

Nope, they are two different mechanisms.
     
  20. nirvek

    nirvek Member

    Messages:
    44
    Likes Received:
    0
    GPU:
    Mobile Intel 915GM
But we can write our own programs for the pixel shader, so why would there be anything the CPU can do that the pixel shader can't?
     