Guru3D.com Forums

  (#76)
Angrycrab
Master Guru
 
Angrycrab's Avatar
 
Videocard: Titan XP EVGA Hybrid Kit
Processor: i7 7700k @ 5.1GHz
Mainboard: Asus Maximus IX Apex
Memory: GSkill 16GB@4133MHz CL17
Soundcard: On Board
PSU: Corsair 850watt
Default Yesterday, 12:04 | posts: 249

Quote:
Originally Posted by dr_rus View Post
HDMI 2.1 is a transport specification; HDR (or anything else, really) is implemented on top of the transport specification as a separate protocol for data transmission. Not sure why your link is saying that HDMI 2.1 has something to do with HDR; DV is implemented and working fine over any HDMI really, or DP for that matter. Same is true for HDR10, HDR10+ or HDR12: the transmitter and receiver devices must know of this protocol to exchange over it via the HDMI or DP transport, that's pretty much all.

HDMI 2.1 is important for other reasons, like the increase in bandwidth which will make possible the use of HDR10/DV at 4K@60Hz (and probably 4K@120Hz), and the apparent requirement to support adaptive sync, although it remains to be seen if it will be a requirement or an optional extension again. In any case, this is all rather moot for us here, as we have the option of using DP 1.4, which already has considerably higher bandwidth than HDMI 2.0.
Basically what HDMI 2.1 will do is eliminate the need for any additional hardware for dynamic metadata processing (Dolby Vision metadata parser/composer). So in a sense there won't be any need for the industry to change the current HDR10/DV standards. It will also add a "Game Mode VRR" (variable refresh rate) feature, which might put an end to G-Sync and FreeSync.

That's all hopefulness at this point.

Last edited by Angrycrab; Yesterday at 12:30.
   
 
  (#77)
EdKiefer
Maha Guru
 
Videocard: MSI 970 Gaming 4G
Processor: i5-3570k@4.5ghz
Mainboard: Asus P8Z77V pro
Memory: 16Gb-DDR3 1600
Soundcard:
PSU: Corsair XT650
Default Yesterday, 12:28 | posts: 1,236

Quote:
Originally Posted by Minotaur View Post
It's working fine here now, and has for a few driver versions so far!
BTW, the link doesn't work. I bet it's a test with older drivers and versions of BF1 back from Dec/Jan? A recent test would be interesting to see...

Because things have definitely changed here; it seems smoother than it was before. The FPS swing is a lot lower too, and more consistent. Poor G-Sync doesn't kick in now, no lows lol
Yeah, this forum won't allow the full name of the site, but if you copy/paste it into Google it will be the first result, even with the **** in the name.
It's about a week old and shows no improvements; some cards are even slower or have worse frametimes.

http://www.e_teknix.com/battlefield-...ance-analysis/
Remove the "_" and it should work.
   
  (#78)
slickric21
Maha Guru
 
slickric21's Avatar
 
Videocard: 1080 hybrid 2150mhz/Gsync
Processor: 4770k 4.8core 4.3cache
Mainboard: MSI Z87 DG65
Memory: 16Gb Gskill TriX 2666mhz
Soundcard: Realtek 1150
PSU: Corsait GS 800
Default Yesterday, 12:51 | posts: 2,405 | Location: U.K

Quote:
Originally Posted by rla1999 View Post
How's this driver performing with these games: The Witcher 3, GTA 5, Infinite Warfare, Wildlands? Better than 378.78?
Exactly the same
   
  (#79)
rla1999
Newbie
 
Videocard: NVIDIA 1080 TI FE
Processor: I7 5930K
Mainboard: Gigabyte X99 Gaming G1
Memory: DDR4 / 64GB's
Soundcard: Creative Sound Core3D
PSU: 1000W Rosewill Extreme
Default Yesterday, 13:00 | posts: 10 | Location: Fort Lauderdale, Florida

Quote:
Originally Posted by RealNC View Post
I never notice any difference here. Every time I upgrade the driver, it's neither faster nor slower (unless you consider a 2 FPS difference important), nor smoother, nor anything.

I seem to be in the minority :-P
I typically don't check too closely, but I see people do, so I asked. Seems like this is a driver I can skip.
   
 
  (#80)
rla1999
Newbie
 
Videocard: NVIDIA 1080 TI FE
Processor: I7 5930K
Mainboard: Gigabyte X99 Gaming G1
Memory: DDR4 / 64GB's
Soundcard: Creative Sound Core3D
PSU: 1000W Rosewill Extreme
Default Yesterday, 13:02 | posts: 10 | Location: Fort Lauderdale, Florida

Quote:
Originally Posted by slickric21 View Post
Exactly the same
Thanks for the info
   
  (#81)
benjamin
Newbie
 
Videocard: gigabyte 1060 d5 6g rev1
Processor: i54690k h100igtx
Mainboard: msi z97 g55 sli
Memory: hyperx fury ddr3 1866
Soundcard: on board
PSU: corsair 750
Default Yesterday, 13:15 | posts: 7 | Location: uk

Quote:
Originally Posted by rla1999 View Post
How's this driver performing with these games: The Witcher 3, GTA 5, Infinite Warfare, Wildlands? Better than 378.78?
About the same, mate, but I thought that 378.78 was a big improvement. I had no problems with The Witcher 3, but I was having a bit of trouble with GTA 5 (still playable); since 378.78, though, GTA 5 is smooth, the best it's been for me. 378.78 has been the best driver for me since I got my 1060, and 378.92 is just as good. I had a lower score in 3DMark Fire Strike, which I think is just a rough guide anyway, but I'll take that any day for smoother gameplay. Good driver for me, mate; hope it is just as good for you.
   
  (#82)
RealNC
Master Guru
 
RealNC's Avatar
 
Videocard: EVGA GTX 980 Ti FTW
Processor: Intel Core i5 2500K
Mainboard: MSI P67A-C43
Memory: DDR3 16GB
Soundcard: Asus Xonar D1, JBL Spot
PSU: Corsair HX650
Default Yesterday, 13:27 | posts: 613

Quote:
Originally Posted by Angrycrab View Post
It will also add a "Game Mode VRR" (variable refresh rate) feature, which might put an end to G-Sync and FreeSync.
How so?
   
  (#83)
dr_rus
Maha Guru
 
dr_rus's Avatar
 
Videocard: GTX 1080 GRP
Processor: i7-6850K
Mainboard: Sabertooth X99
Memory: 64 GB DDR4
Soundcard: SB X-Fi Ti
PSU: CM V1200 Platinum
Default Yesterday, 13:55 | posts: 1,468

Quote:
Originally Posted by Angrycrab View Post
Basically what HDMI 2.1 will do is eliminate the need for any additional hardware for dynamic metadata processing (Dolby Vision metadata parser/composer). So in a sense there won't be any need for the industry to change the current HDR10/DV standards. It will also add a "Game Mode VRR" (variable refresh rate) feature, which might put an end to G-Sync and FreeSync.

That's all hopefulness at this point.
You will still need additional h/w for HDR signal processing; HDMI by itself won't (and shouldn't, really) be able to handle this. You will need it because you'd still want to support Dolby Vision (or some other HDR standard), for example, which isn't part of the HDMI spec and never will be.

Quote:
Originally Posted by RealNC View Post
How so?
"VRR" is what's called Adaptive Sync in the VESA specification for DisplayPort 1.2a+, and it's essentially the same thing that FreeSync is built upon. The big question still is how HDMI will handle the spec, as requiring all HDMI devices to support adaptive sync may be a bit extreme (not all devices need it), so it may end up the same as VESA's Adaptive Sync in DP: an optional extension which you may use or not. In that case it will be pretty much the same as it is now with FreeSync and G-Sync; the biggest difference would be that TVs and consoles would get such support, and that may push NV into supporting it in addition to their own, more advanced G-Sync.
   
  (#84)
RealNC
Master Guru
 
RealNC's Avatar
 
Videocard: EVGA GTX 980 Ti FTW
Processor: Intel Core i5 2500K
Mainboard: MSI P67A-C43
Memory: DDR3 16GB
Soundcard: Asus Xonar D1, JBL Spot
PSU: Corsair HX650
Default Yesterday, 14:32 | posts: 613

Quote:
Originally Posted by dr_rus View Post
"VRR" is what's called Adaptive Sync in the VESA specification for DisplayPort 1.2a+, and it's essentially the same thing that FreeSync is built upon. The big question still is how HDMI will handle the spec, as requiring all HDMI devices to support adaptive sync may be a bit extreme (not all devices need it), so it may end up the same as VESA's Adaptive Sync in DP: an optional extension which you may use or not. In that case it will be pretty much the same as it is now with FreeSync and G-Sync; the biggest difference would be that TVs and consoles would get such support, and that may push NV into supporting it in addition to their own, more advanced G-Sync.
The question was how it will "put an end to FreeSync and G-Sync." It's an HDMI spec, not a DisplayPort spec. Also, FreeSync didn't put an end to G-Sync, simply because NVIDIA refuses to support it. The only one who can put an end to G-Sync is NVIDIA, and they don't intend to do so.
   
  (#85)
dr_rus
Maha Guru
 
dr_rus's Avatar
 
Videocard: GTX 1080 GRP
Processor: i7-6850K
Mainboard: Sabertooth X99
Memory: 64 GB DDR4
Soundcard: SB X-Fi Ti
PSU: CM V1200 Platinum
Default Yesterday, 14:57 | posts: 1,468

Quote:
Originally Posted by RealNC View Post
The question was how it will "put an end to FreeSync and G-Sync." It's an HDMI spec, not a DisplayPort spec. Also, FreeSync didn't put an end to G-Sync, simply because NVIDIA refuses to support it. The only one who can put an end to G-Sync is NVIDIA, and they don't intend to do so.
As I've said, the appearance of adaptive sync in the HDMI spec will certainly lead to most consumer devices which use HDMI (TVs, consoles, receivers, projectors, etc.) supporting the spec, and this will force NV to support it in their output devices too, as it would be a big loss of value otherwise.

Doesn't mean that they'll stop developing G-Sync, of course; only that they'll support "G-Sync" over the adaptive sync spec too, without the h/w G-Sync module (FreeSync is just that: AMD's s/w implementation of VESA adaptive sync). Technically they already do this for VESA adaptive sync in notebooks; it remains to be seen, though, whether they'll extend this to all HDMI 2.1 devices or keep a whitelist of supported devices.
   
  (#86)
rla1999
Newbie
 
Videocard: NVIDIA 1080 TI FE
Processor: I7 5930K
Mainboard: Gigabyte X99 Gaming G1
Memory: DDR4 / 64GB's
Soundcard: Creative Sound Core3D
PSU: 1000W Rosewill Extreme
Default Yesterday, 17:27 | posts: 10 | Location: Fort Lauderdale, Florida

Quote:
Originally Posted by benjamin View Post
About the same, mate, but I thought that 378.78 was a big improvement. I had no problems with The Witcher 3, but I was having a bit of trouble with GTA 5 (still playable); since 378.78, though, GTA 5 is smooth, the best it's been for me. 378.78 has been the best driver for me since I got my 1060, and 378.92 is just as good. I had a lower score in 3DMark Fire Strike, which I think is just a rough guide anyway, but I'll take that any day for smoother gameplay. Good driver for me, mate; hope it is just as good for you.
Good to know. Looks like I'll be staying with 378.78. Thanks.
   
  (#87)
CK the Greek
Maha Guru
 
CK the Greek's Avatar
 
Videocard: 2x970G1 SLI,Gsync,3DVsn2
Processor: i5 4670K @4.4Ghz H2O H110
Mainboard: GA Z87X-UD5H
Memory: G.skill F3-2400C10D16GTX
Soundcard: Premium 5.1 Snd Sys
PSU: Corsair RM1000x
Default Yesterday, 20:33 | posts: 1,138 | Location: Greece

Quote:
Originally Posted by The Goose View Post
The Division and Arma3 are a bit choppy with these
About The Division: try deleting everything except the bindings file from the game's folder in your Documents. You will have to reconfigure your settings (except key bindings), though. Some people seem to have solved their issues this way after updating to a newer driver.
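For anyone doing this regularly, here's a minimal Python sketch of the idea. The settings path and bindings filename below are guesses for illustration (check your own install), and it moves the files into a backup subfolder rather than deleting them outright, so you can roll back:

```python
from pathlib import Path
import shutil

# Assumed locations; both are guesses, not confirmed paths.
SETTINGS_DIR = Path.home() / "Documents" / "My Games" / "Tom Clancy's The Division"
KEEP = {"keybindings.cfg"}

def reset_settings(settings_dir, keep, backup_name="settings_backup"):
    """Move everything except the files in `keep` into a backup subfolder,
    so the game regenerates fresh settings on the next launch."""
    backup = settings_dir / backup_name
    backup.mkdir(exist_ok=True)
    moved = []
    # Snapshot the listing first so we don't iterate while mutating the dir.
    for entry in sorted(settings_dir.iterdir()):
        if entry.name in keep or entry == backup:
            continue
        shutil.move(str(entry), str(backup / entry.name))
        moved.append(entry.name)
    return moved
```

Calling `reset_settings(SETTINGS_DIR, KEEP)` then returns the list of files it set aside; restoring is just moving them back out of `settings_backup`.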
   
  (#88)
-Tj-
Ancient Guru
 
-Tj-'s Avatar
 
Videocard: ZOTAC GTX980Ti Amp!Omega
Processor: Intel i7 4770K OC 4.7GHz
Mainboard: ASUS Z87 Deluxe 2103
Memory: DDR3 G.skill 16GB 2400MHz
Soundcard: X-Fi Titanium HD @Bose A5
PSU: Nitro88+ 650W 52A
Default Yesterday, 20:50 | posts: 13,528 | Location: Proxima \/82

Quote:
Originally Posted by CK the Greek View Post
About The Division: try deleting everything except the bindings file from the game's folder in your Documents. You will have to reconfigure your settings (except key bindings), though. Some people seem to have solved their issues this way after updating to a newer driver.
Interesting; I started to get stutter at 1.6 and DX12, so I'll try this trick.

I remember it helping with other games before.
   
  (#89)
Memorian
Maha Guru
 
Memorian's Avatar
 
Videocard: GTX 1070 FTW@2.1/9.1Ghz
Processor: i5 3570K@4.4Ghz
Mainboard: Gigabyte Z77X UD3H
Memory: 16GB TridentX 2400C10
Soundcard: M2Tech HiFace DAC
PSU: TT 750W
Default Yesterday, 21:49 | posts: 2,502 | Location: Hellas

Win10 RTM (15063/Creators Update). Alt-tabbing out of a game breaks the custom color settings from the NV CP.
   
  (#90)
RealNC
Master Guru
 
RealNC's Avatar
 
Videocard: EVGA GTX 980 Ti FTW
Processor: Intel Core i5 2500K
Mainboard: MSI P67A-C43
Memory: DDR3 16GB
Soundcard: Asus Xonar D1, JBL Spot
PSU: Corsair HX650
Default Today, 05:21 | posts: 613

Quote:
Originally Posted by dr_rus View Post
As I've said, the appearance of adaptive sync in the HDMI spec will certainly lead to most consumer devices which use HDMI (TVs, consoles, receivers, projectors, etc.) supporting the spec, and this will force NV to support it in their output devices too, as it would be a big loss of value otherwise.

Doesn't mean that they'll stop developing G-Sync, of course; only that they'll support "G-Sync" over the adaptive sync spec too, without the h/w G-Sync module (FreeSync is just that: AMD's s/w implementation of VESA adaptive sync). Technically they already do this for VESA adaptive sync in notebooks; it remains to be seen, though, whether they'll extend this to all HDMI 2.1 devices or keep a whitelist of supported devices.
I connect my monitor through DisplayPort. Computers in general will be using DisplayPort primarily.

So, again, how will this put an end to FreeSync and G-Sync? How can an HDMI-only spec put an end to a DisplayPort spec?

Note: "put an end to", not "complement". "Put an end to", last time I checked, means it will kill it, make it disappear from the market.
   
  (#91)
dr_rus
Maha Guru
 
dr_rus's Avatar
 
Videocard: GTX 1080 GRP
Processor: i7-6850K
Mainboard: Sabertooth X99
Memory: 64 GB DDR4
Soundcard: SB X-Fi Ti
PSU: CM V1200 Platinum
Default Today, 12:06 | posts: 1,468

Quote:
Originally Posted by RealNC View Post
I connect my monitor through DisplayPort. Computers in general will be using DisplayPort primarily.

So, again, how will this put an end to FreeSync and G-Sync? How can an HDMI-only spec put an end to a DisplayPort spec?

Note: "put an end to", not "complement". "Put and end to", last time I checked, means it will kill it, make it disappear from the market.
It won't, that's pretty obvious. G-Sync is more than a standard.
   
  (#92)
Bloodred217
Master Guru
 
Videocard: 2x GTX 1080 G1 Gaming 8GB
Processor: i7 4790K @ 4.7GHz
Mainboard: Gigabyte Z97X-Gaming GT
Memory: 2x8GB 2133MHz CL10
Soundcard: Musical Fidelity V-DAC II
PSU: Corsair RM1000i
Default Today, 12:22 | posts: 321

I don't think it will kill G-Sync (sadly; I don't actually like proprietary solutions and how NVIDIA is refusing to support industry standards). I believe the idea here is that if some mandatory variable refresh rate functionality were imposed via HDMI on every new display with the corresponding HDMI version, NVIDIA might feel pressured to implement this feature as well; they may not want to fall behind AMD and the consoles, for instance, by not supporting new HDMI displays.

I doubt anything like that will happen, though. I would be surprised for VRR to be mandatory in HDMI: it makes sense for games and PCs, but doesn't seem too useful for movie playback at all, so it wouldn't make sense to force it into video playback devices and TVs which aren't intended for gaming at all.

FreeSync/VESA Adaptive Sync cannot put this sort of pressure on NVIDIA, as it's not a mandatory part of the spec and its support is currently limited to AMD. Despite being part of the standard, it exists today as an AMD-only competitor to G-Sync.
   
  (#93)
dr_rus
Maha Guru
 
dr_rus's Avatar
 
Videocard: GTX 1080 GRP
Processor: i7-6850K
Mainboard: Sabertooth X99
Memory: 64 GB DDR4
Soundcard: SB X-Fi Ti
PSU: CM V1200 Platinum
Default Today, 14:02 | posts: 1,468

Quote:
Originally Posted by Bloodred217 View Post
I don't think it will kill G-Sync (sadly; I don't actually like proprietary solutions and how NVIDIA is refusing to support industry standards). I believe the idea here is that if some mandatory variable refresh rate functionality were imposed via HDMI on every new display with the corresponding HDMI version, NVIDIA might feel pressured to implement this feature as well; they may not want to fall behind AMD and the consoles, for instance, by not supporting new HDMI displays.

I doubt anything like that will happen, though. I would be surprised for VRR to be mandatory in HDMI: it makes sense for games and PCs, but doesn't seem too useful for movie playback at all, so it wouldn't make sense to force it into video playback devices and TVs which aren't intended for gaming at all.

FreeSync/VESA Adaptive Sync cannot put this sort of pressure on NVIDIA, as it's not a mandatory part of the spec and its support is currently limited to AMD. Despite being part of the standard, it exists today as an AMD-only competitor to G-Sync.
As I've said already, NV does support VESA adaptive sync: in notebooks, where it makes business sense for them. Generally, I struggle to think of any industry standard which NV doesn't support; could you provide any examples?
   
  (#94)
Elajitz
Master Guru
 
Elajitz's Avatar
 
Videocard: GTX 1080 TI 11GB
Processor: i7 5820K 4.0Ghz OC
Mainboard: ASUS X99-A, Socket 2011-3
Memory: Crucial DDR4 2133MHz 32G
Soundcard: Razer Adaro
PSU: Corsair 1000W RM Modular
Default Today, 15:15 | posts: 579 | Location: Sweden

About my earlier post about Battlefield 1:
it seems to be MSI Afterburner that makes the game crash in DX12...
Not the driver, as I thought at first!



And this is what Event Viewer says:

<Event xmlns="http://schemas.microsoft.com/win/2004/08/events/event">
  <System>
    <Provider Name="Application Error" />
    <EventID Qualifiers="0">1000</EventID>
    <Level>2</Level>
    <Task>100</Task>
    <Keywords>0x80000000000000</Keywords>
    <TimeCreated SystemTime="2017-03-21T14:17:22.232778600Z" />
    <EventRecordID>53508</EventRecordID>
    <Channel>Application</Channel>
    <Computer>AsusX99A</Computer>
    <Security />
  </System>
  <EventData>
    <Data>bf1.exe</Data>
    <Data>1.0.49.28890</Data>
    <Data>58ad6a16</Data>
    <Data>dxgi.dll</Data>
    <Data>10.0.14393.953</Data>
    <Data>58ba5daf</Data>
    <Data>c0000005</Data>
    <Data>00000000000074a0</Data>
    <Data>16a0</Data>
    <Data>01d2a24dd706bc27</Data>
    <Data>C:\Program Files (x86)\Origin Games\Battlefield 1\bf1.exe</Data>
    <Data>C:\Windows\SYSTEM32\dxgi.dll</Data>
    <Data>393253fb-db31-437b-b546-3f7d35e81c8a</Data>
    <Data />
    <Data />
  </EventData>
</Event>
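If you want to pull the interesting fields out of a record like this programmatically, the <Data> elements of an "Application Error" (Event ID 1000) record are positional: application name, app version, app timestamp, faulting module, module version, module timestamp, exception code, fault offset, and so on. A minimal Python sketch (the XML constant is the record above, trimmed; `summarize_crash` is just an illustrative helper, not a real API):

```python
import xml.etree.ElementTree as ET

# The crash record above, trimmed to the fields we actually read.
EVENT_XML = """\
<Event xmlns="http://schemas.microsoft.com/win/2004/08/events/event">
  <System>
    <Provider Name="Application Error" />
    <EventID Qualifiers="0">1000</EventID>
  </System>
  <EventData>
    <Data>bf1.exe</Data>
    <Data>1.0.49.28890</Data>
    <Data>58ad6a16</Data>
    <Data>dxgi.dll</Data>
    <Data>10.0.14393.953</Data>
    <Data>58ba5daf</Data>
    <Data>c0000005</Data>
    <Data>00000000000074a0</Data>
  </EventData>
</Event>
"""

NS = {"ev": "http://schemas.microsoft.com/win/2004/08/events/event"}

def summarize_crash(xml_text):
    """Extract the key fields from an Event ID 1000 record.

    The <Data> elements carry no names; their order is fixed:
    app name, app version, app timestamp, faulting module,
    module version, module timestamp, exception code, fault offset.
    """
    root = ET.fromstring(xml_text)
    data = [d.text for d in root.findall("./ev:EventData/ev:Data", NS)]
    return {
        "application": data[0],
        "faulting_module": data[3],
        "exception_code": data[6],   # c0000005 = access violation
        "fault_offset": data[7],
    }

info = summarize_crash(EVENT_XML)
print(f"{info['application']} crashed in {info['faulting_module']} "
      f"with exception {info['exception_code']} at offset {info['fault_offset']}")
```

Run against the record above, it reports bf1.exe crashing in dxgi.dll with exception c0000005 (an access violation), which matches the "crash in DX12" symptom.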

Last edited by Elajitz; Today at 15:17.
   
  (#95)
khanmein
Maha Guru
 
khanmein's Avatar
 
Videocard: EVGA GTX 1070 SC ACX 3.0
Processor: Intel® Core™ i5-4460
Mainboard: ASRock H97 Pro4
Memory: 16GB DDR3 Kingston CL11
Soundcard: Realtek ALC892
PSU: Seasonic X-750 (KM3)
Default Today, 15:23 | posts: 1,077 | Location: Batu Pahat, Johor, Malaysia

^^ BF1 performed well in DX11 for NV cards.

https://www.hardocp.com/article/2017...eo_card_review
   
  (#96)
-Tj-
Ancient Guru
 
-Tj-'s Avatar
 
Videocard: ZOTAC GTX980Ti Amp!Omega
Processor: Intel i7 4770K OC 4.7GHz
Mainboard: ASUS Z87 Deluxe 2103
Memory: DDR3 G.skill 16GB 2400MHz
Soundcard: X-Fi Titanium HD @Bose A5
PSU: Nitro88+ 650W 52A
Default Today, 15:23 | posts: 13,528 | Location: Proxima \/82

Wow, either the 3DMark API test got a small uplift or this driver is tweaked more; either way I got my best score yet, especially in DX11. This is with a 4.6GHz OC; all my older tests were at 4.7GHz.

Vulkan owns too


DirectX 11 Multi-threaded draw calls per second
2 439 599
DirectX 11 Single-threaded draw calls per second
2 586 663
DirectX 12 draw calls per second
21 212 018
Vulkan draw calls per second
24 469 431

http://www.3dmark.com/compare/aot/19...7136/aot/47269
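To put those numbers in perspective, here's a quick sketch that normalizes the quoted results against the DX11 single-threaded figure; the ratios are just arithmetic on the numbers posted above:

```python
# Draw-call throughput from the 3DMark run quoted above (calls per second).
results = {
    "DX11 multi-threaded": 2_439_599,
    "DX11 single-threaded": 2_586_663,
    "DX12": 21_212_018,
    "Vulkan": 24_469_431,
}

# Use DX11 single-threaded as the baseline and print relative throughput.
baseline = results["DX11 single-threaded"]
for api, calls in results.items():
    print(f"{api:>21}: {calls:>10,} draw calls/s ({calls / baseline:.2f}x vs DX11 ST)")
```

On these figures DX12 comes out at roughly 8.2x and Vulkan at roughly 9.5x the DX11 single-threaded rate, with Vulkan about 15% ahead of DX12.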
   
  (#97)
Blackfyre
Maha Guru
 
Blackfyre's Avatar
 
Videocard: MSI GTX 1070 Gaming X
Processor: 4790K @ 4.6Ghz @ 1.220v
Mainboard: Gigabyte Z97X-Gaming GT
Memory: 16Gb @2400Mhz 11-13-13-30
Soundcard: SoundBlaster Z + AD900X
PSU: EVGA SuperNOVA 1000W G2
Default Today, 16:20 | posts: 853 | Location: Australia

Quote:
Originally Posted by -Tj- View Post
Wow, either the 3DMark API test got a small uplift or this driver is tweaked more; either way I got my best score yet, especially in DX11. This is with a 4.6GHz OC; all my older tests were at 4.7GHz.

Vulkan owns too


DirectX 11 Multi-threaded draw calls per second
2 439 599
DirectX 11 Single-threaded draw calls per second
2 586 663
DirectX 12 draw calls per second
21 212 018
Vulkan draw calls per second
24 469 431

http://www.3dmark.com/compare/aot/19...7136/aot/47269
I just ran it too. They added Vulkan support!

I'm running it at 4K:

http://www.3dmark.com/3dm/18794129

EDIT:

I'm 100% sure they've added HDR support for all the tests, without it affecting scores. I tested Time Spy & Fire Strike and my TV automatically switches to HDR mode. They look almost like shiny new tests.


Last edited by Blackfyre; Today at 16:32.
   
  (#98)
dr_rus
Maha Guru
 
dr_rus's Avatar
 
Videocard: GTX 1080 GRP
Processor: i7-6850K
Mainboard: Sabertooth X99
Memory: 64 GB DDR4
Soundcard: SB X-Fi Ti
PSU: CM V1200 Platinum
Default Today, 16:52 | posts: 1,468

Quote:
Originally Posted by -Tj- View Post
Wow, either the 3DMark API test got a small uplift or this driver is tweaked more; either way I got my best score yet, especially in DX11. This is with a 4.6GHz OC; all my older tests were at 4.7GHz.

Vulkan owns too


DirectX 11 Multi-threaded draw calls per second
2 439 599
DirectX 11 Single-threaded draw calls per second
2 586 663
DirectX 12 draw calls per second
21 212 018
Vulkan draw calls per second
24 469 431

http://www.3dmark.com/compare/aot/19...7136/aot/47269
What's surprising is to see NV cards taking the lead in Vulkan, while AMD cards seem to be more comfortable submitting draw calls through DX12.
   
  (#99)
ManuelG
NVIDIA Rep
 
ManuelG's Avatar
 
Videocard: Geforce GTX TitanX Pascal
Processor: Intel Core i7 5960X
Mainboard: Alienware Area 51 R2
Memory: 32GB DDR4
Soundcard: Creative Sound Blaster Zx
PSU: Alienware 1200W
Default Today, 18:51 | posts: 525 | Location: Santa Clara, California

Quote:
Originally Posted by genbrien View Post
My friend got that problem with his 1070.
He got an error message.

Can you ask your friend to fill out the feedback form below?

http://surveys.nvidia.com/index.jsp?...f07694a40f8ac6

Also, anyone else who has an out-of-memory error message in Mass Effect Andromeda, please fill it out as well. Thanks.
   
Powered by vBulletin®
Copyright ©2000 - 2017, Jelsoft Enterprises Ltd.
vBulletin Skin developed by: vBStyles.com
Copyright (c) 1995-2014, All Rights Reserved. The Guru of 3D, the Hardware Guru, and 3D Guru are trademarks owned by Hilbert Hagedoorn.