eFX Shader Injector Support Thread

Discussion in 'Games, Gaming & Game-demos' started by Crosire, Oct 22, 2013.

  1. conan2k

    conan2k Guest

    Messages:
    43
    Likes Received:
    0
    GPU:
    Gigabyte 780GTX
    Splash screen setting

I've also noticed that the GENERAL section has disappeared from eFX.ini, and the respective old ini keys no longer work when added manually.

    Crosire, do you plan to add them back? In particular, the setting to disable the splash screen.
     
  2. Crosire

    Crosire Member Guru

    Messages:
    164
    Likes Received:
    0
    GPU:
    -
    You too are using an old version. OpenGL support was broken in that one.
     
  3. Crosire

    Crosire Member Guru

    Messages:
    164
    Likes Received:
    0
    GPU:
    -
The splash screen will be changed to something shown in-game (similar to the one in ENB) in the near future, so I no longer saw anything useful in those settings (the menu was removed completely too).

About the game problem you described earlier: I can't really explain that one. Does it work with InjectSMAA? (That would make bug finding a lot easier if it does.)

I'm even more confused about the render target creation error that happens for some (not all) users, in BF4 for example. That call should never fail for the first backbuffer retrieved from the swapchain. This is something I need to research; hopefully I will find a game where it can be reproduced.
     
    Last edited: Nov 25, 2013
  4. conan2k

    conan2k Guest

    Messages:
    43
    Likes Received:
    0
    GPU:
    Gigabyte 780GTX
Unfortunately InjectSMAA is a no-go on that machine -- it's an NVIDIA Optimus-enabled laptop. I'll test this on a desktop PC once my new video card arrives.
     

  5. kurtferro

    kurtferro Guest

    Messages:
    115
    Likes Received:
    1
    GPU:
    SAPPHIRE NiTRO R9 390 WB
Try renaming it to dxgi.dll.
     
  6. Boulotaur2024

    Boulotaur2024 Guest

    Messages:
    142
    Likes Received:
    0
    GPU:
    AMD HD7950
    Sample GLSL shader does not work on AMD hardware

The provided, untouched 'shaderOPENGL.fx' shader crashes with the following error:

I'm assuming 'sd' is a typo, so I deleted it, and:

Turns out you can't use the keyword 'struct' this way in GLSL (it's not spec-compliant) with AMD drivers, which are stricter than NVIDIA's, and rightly so:
    http://stackoverflow.com/questions/18553791/glsl-fragment-shader-struct-out

I don't know what to use as a replacement, though. Some people say to use a vec4 instead, others a vec3... I have no idea; I'm not a GLSL guru at all :|
EDIT: this compiles fine, but it doesn't process the image at all (while the picture should be purple-ish, shouldn't it?):

    uniform sampler2D texColor <
        string type = "input";
    >;
    uniform sampler2D texTarget <
        string type = "output";
    >;

    // --------------------------------------------------------- \\

    vec4 pos;
    vec2 tex;

    // --------------------------------------------------------- \\

    void PostProcess_VS()
    {
        gl_Position = pos;
    }
    void PostProcess_PS_Target(out vec4 FragColor0 : 0)
    {
        // 0 --> Rendertarget 1
        // 1 --> Rendertarget 2
        // ...
        FragColor0 = texture2D(texColor, tex) * vec4(tex, 1.0, 1.0);
    }
    void PostProcess_PS(out vec4 FragColor)
    {
        vec4 color = texture2D(texTarget, tex);
        color.a = 1.0f;

        FragColor = color;
    }

    // --------------------------------------------------------- \\

    technique PostProcess
    {
        pass p0 < string output0 = "texTarget"; >
        {
            VertexShader = compile 420 PostProcess_VS();
            FragmentShader = compile 420 PostProcess_PS_Target();
        }
        pass p1
        {
            VertexShader = compile 420 PostProcess_VS();
            FragmentShader = compile 420 PostProcess_PS();
        }
    }
     
    Last edited: Nov 26, 2013
  7. Crosire

    Crosire Member Guru

    Messages:
    164
    Likes Received:
    0
    GPU:
    -
Yeah, forgot about that; I might add support for structures in the future. You can replace them the same way you would in HLSL: just write the two parameters directly, but don't forget the "in" keyword. My parser will take care of the rest and translate it into something the GLSL compiler can read (because that one doesn't even allow function parameters for the "main()" entry point). It would look like this:

    void MyShader(in vec4 pos, in vec2 tex, out vec4 colorOut) { }
     
  8. Boulotaur2024

    Boulotaur2024 Guest

    Messages:
    142
    Likes Received:
    0
    GPU:
    AMD HD7950
Err, I'm trying my best here, but I can't seem to make the shader modify the image. It does compile fine, but it has no effect on the picture.

Am I doing something wrong? My guess is that the vertex shader is not properly initialized, because it doesn't change the picture as it should :/
Can you please write up a working sample shader without structures? You can test it out on FurMark; it's OpenGL. I want the damn thing to be purple-ish :bang:

EDIT: I thought structures were the problem per se. I was wrong: they are allowed, just not as an input parameter in the vertex shader declaration. So the best I could do was replace them with vec4 and vec2 parameters for pos and tex respectively:

    void PostProcess_VS(in vec4 pos : 1, in vec2 tex, out VS_OUTPUT OUT)
    {
        gl_Position = pos;

        OUT.pos = pos;
        OUT.tex = tex;
    }
So this compiles fine even with structures. But it still doesn't work, of course... :)
     
    Last edited: Nov 26, 2013
  9. Crosire

    Crosire Member Guru

    Messages:
    164
    Likes Received:
    0
    GPU:
    -
You are missing the concept behind vertex and pixel shaders here :)
The vertex shader isn't of much use for post-processing shaders, as we just want to draw a predefined fullscreen quad without changing any of the vertices. But that shader is the first one executed in the chain, and it gets all its information from the application. All following shaders get their parameters from the previous shader, which means the values the vertex shader outputs are then used by the pixel shader, etc.

Now, you specified those shaders like this:
VertexShader: void main(in vec4 pos);
PixelShader: void main(in vec2 tex, out vec4 FragColor0);

This means the vertex shader takes a "vec4" as input at the first position (which is the position, in the case of eFX). But it does not output anything (except writing to the global "gl_Position" variable, which isn't of much use for us now). Your pixel shader expects a "vec2" as input, but it won't receive anything, because the vertex shader didn't output such a value.
To fix that, you need to write them as follows (where VS is the vertex shader and PS the pixel shader):

    Code:
    void VS(in vec4 posIn : 1, in vec2 texIn, out vec2 texOut)
    {
    	gl_Position = posIn;
    	texOut = texIn;
    }
    void PS(in vec2 texOutFromVS, out vec4 color)
    {
    	color = texture2D(texColor, texOutFromVS) * vec4(texOutFromVS, 1.0, 1.0);
    }
Note that what HLSL calls a pixel shader is called a "fragment shader" in GLSL, but it's the same thing.
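For reference, outside of eFX's translation layer the same vertex-to-fragment hand-off is written in plain GLSL with an "out" variable in the vertex shader whose name and type match an "in" variable in the fragment shader. This is a minimal sketch of that pairing, not eFX code; the attribute locations and the `texColor` binding are assumed to come from the host application:

```glsl
// Vertex shader (#version 420)
#version 420
layout(location = 0) in vec4 posIn;
layout(location = 1) in vec2 texIn;
out vec2 texCoord;               // must match the fragment shader's "in vec2 texCoord"

void main()
{
    gl_Position = posIn;
    texCoord = texIn;            // hand the coordinate down the pipeline
}
```

```glsl
// Fragment shader (#version 420)
#version 420
uniform sampler2D texColor;      // assumed to be bound by the host application
in vec2 texCoord;                // interpolated value from the vertex shader
out vec4 fragColor;

void main()
{
    fragColor = texture(texColor, texCoord) * vec4(texCoord, 1.0, 1.0);
}
```

If the names or types of the paired "out"/"in" variables differ, a strict compiler or the linker will reject the program, which is exactly the mismatch described above.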

EDIT: Try the following: http://pastebin.com/5Ums2Ayx
I removed the use of render targets and any structs there (although I just read your edit, so it seems they do work for the vertex shader output).
     
    Last edited: Nov 26, 2013
  10. Boulotaur2024

    Boulotaur2024 Guest

    Messages:
    142
    Likes Received:
    0
    GPU:
    AMD HD7950
Now it makes FurMark crash (script) (dump). The same goes for every OpenGL app I've tried it on.

EDIT: It no longer crashes when I reintroduce the structures, but there's still no effect: http://pastebin.com/pfg8qNxY. Guess I'm still missing something...
     
    Last edited: Nov 26, 2013

  11. Johnny5srv

    Johnny5srv Guest

    Messages:
    200
    Likes Received:
    0
    GPU:
    nvidia SLI gtx670 2GB
I'm getting pink screenshots using eFX/SweetFX, taken with the Print Screen button. Any ideas as to why? If more info is needed, let me know.
     
  12. bwana

    bwana Member

    Messages:
    14
    Likes Received:
    1
    GPU:
    780GTX
I got this version of eFX and SweetFX to work in Win 8.1:

post 1133 of this thread

"www dot overclock dot net/t/1296721/how-to-anti-aliasing-injection-fxaa-smaa-and-sweetfx/1133"

The trouble is, the cursor leaves trails forever, and the shader produces a dithered look, even when I set dither to 0.

The only other package of eFX and SweetFX that works in Win 8.1 for the game Ghost Recon Advanced Warfighter (a DX9 title) is the previous version,

eFX (v2.0 Alpha 1.9.30) + SweetFX 1.5.1, which creates a ground-glass appearance.
     
  13. writer21

    writer21 Guest

    Messages:
    68
    Likes Received:
    0
    GPU:
    EVGA SC ACX 780 sli 3gb
How do I change the settings for eFX 2.0? There was a config for SweetFX to change SMAA, sharpen, etc., but I don't see it with eFX 2.0.
     
  14. fighting4fun

    fighting4fun Guest

    Messages:
    18
    Likes Received:
    0
    GPU:
    Zotac GTX 760 AMP!
With the new one, nothing is working... maybe you have a hint for me. Without SweetFX I get your eFX sign, but I can't swap settings. And is there a workaround to use SweetFX presets with your eFX (without importing the SweetFX injectors)?
     
  15. Crosire

    Crosire Member Guru

    Messages:
    164
    Likes Received:
    0
    GPU:
    -
@Johnny5srv: Please update to the latest version. This was a bug in an older build.

@fighting4fun:

eFX is a standalone injector; you need to add the SweetFX shaders yourself (or use a prebundled version, which has not yet been created for the latest alpha).
It should be enough to download an older eFX/SweetFX bundle and replace the eFX DLLs (d3d9.dll, dxgi.dll, or whatever they are called) with the new ones. Read the readme for instructions, and for why to rename eFX32/64.dll to one of these: d3d9.dll, dxgi.dll, opengl32.dll, ...
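The rename step described above boils down to copying the eFX DLL into the game folder under the loader name the game imports. A sketch of that step; the paths here are illustrative placeholders, not taken from the readme:

```shell
#!/bin/sh
# Illustrative sketch of the DLL rename step; adjust paths to your install.
GAME_DIR="/path/to/game"            # folder that contains the game's executable

# Pick the target name matching the API the game uses:
#   d3d9.dll     -> Direct3D 9
#   dxgi.dll     -> Direct3D 10/11
#   opengl32.dll -> OpenGL
cp eFX64.dll "$GAME_DIR/dxgi.dll"   # 64-bit game; use eFX32.dll for 32-bit
```

The game then picks up the renamed copy instead of the system DLL when it starts, which is how the injector hooks in.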
     
    Last edited: Dec 10, 2013

  16. fighting4fun

    fighting4fun Guest

    Messages:
    18
    Likes Received:
    0
    GPU:
    Zotac GTX 760 AMP!
I know... but it only works 3 out of 5 times, and I don't know why. After a restart it didn't work anymore :D and I don't know why.

So I have to use RadeonPro until there is a stable solution =(
     
  17. Crosire

    Crosire Member Guru

    Messages:
    164
    Likes Received:
    0
    GPU:
    -
Not sure how often I have to repeat this. I cannot help or fix bugs without a log and a description of what's not working. "It doesn't work" unfortunately doesn't give me much information :)
     
  18. fighting4fun

    fighting4fun Guest

    Messages:
    18
    Likes Received:
    0
    GPU:
    Zotac GTX 760 AMP!
There is no edit in the log...

I start BF4 64-bit (yes, I use the 64-bit indicator) and it works... and the next time it doesn't. Same configuration, no edit in the config... it's very strange...

I will try to make a new eFX + SweetFX package, and maybe it will work flawlessly...
     
  19. Crosire

    Crosire Member Guru

    Messages:
    164
    Likes Received:
    0
    GPU:
    -
That is indeed strange. Are you sure you run the game with the same executable each time? Is anything else different between runs? What did you name the DLL?
     
  20. padolamap

    padolamap Guest

    Messages:
    42
    Likes Received:
    0
    GPU:
ATI RADEON 6990 4GB
What are the chances of combining Boulotaur2024's files with eFX, so that effects like the Gaussian blur work in DX11? Thanks.
     
