
Windows: Line-Based vs. Message Signaled-Based Interrupts ...

Discussion in 'Operating Systems' started by mbk1969, May 7, 2013.

  1. Corrupt^

    Corrupt^ Ancient Guru

    Messages:
    6,869
    Likes Received:
    237
    GPU:
    ASUS 1080GTX STRIX
    Then I'll just put it back. I don't upgrade drivers that regularly anyway, only if a game has horrible performance (and nvidia fixes it) or if a new tweaking feature is added.

    Either way, it works; the system seems a bit snappier in general, loading pages in browsers, etc. Still need to test it in games.

    Have to admit though, when running DPC latency checks or LatencyMon, the nvidia driver was one of the ones reporting the highest latencies, together with the audio and network drivers.

    Network drivers seem a lot better in Windows 8 compared to 7 though; it's almost as if a lot of the tweaks from Von Dach's thread have been applied by default in Windows 8.

    Running Wireshark on an idle system, there's also a lot more going on on the NIC in the background on W7 compared to W8.

    EDIT: Scrap "a bit snappier". I knew nvidia drivers had been getting a bit laggier and more bloated lately compared to 2 or 3 years ago, but damn, every visual change on my screen just pops up so smoothly now.
     
    Last edited: May 27, 2013
  2. mbk1969

    mbk1969 Ancient Guru

    Messages:
    7,347
    Likes Received:
    3,856
    GPU:
    GeForce GTX 1070
    By the highest latencies of the nvidia, audio and network drivers you mean the highest DPC routine execution time, right? I think DPC execution time doesn't depend on the device IRQ mode, because the DPC routine does the actual work of the driver, which is the same either way: servicing video, audio and network requests.

    The highest ISR routine execution time can depend on the device IRQ mode, though, since the ISR is the routine that handles the interrupt request...
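The two-stage flow described above can be sketched as a toy model (plain Python, nothing Windows-specific; the device names and timings are made up purely for illustration):

```python
from collections import deque

# Toy model of Windows interrupt servicing: the ISR runs at high priority,
# does the bare minimum, and defers the driver's real work to a DPC queue
# that is drained afterwards at a lower IRQL.
dpc_queue = deque()
log = []

def isr(device):
    # The ISR only acknowledges the interrupt and schedules a DPC.
    log.append(f"ISR: {device}")
    dpc_queue.append(device)

def drain_dpcs():
    # The DPC routine does the driver's actual work (servicing requests).
    while dpc_queue:
        device = dpc_queue.popleft()
        log.append(f"DPC: {device}")

# Two interrupts arrive back-to-back; both ISRs finish before any DPC runs,
# which is why DPC time is unaffected by how the interrupt was delivered.
isr("nic")
isr("audio")
drain_dpcs()
print(log)  # ['ISR: nic', 'ISR: audio', 'DPC: nic', 'DPC: audio']
```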
     
    Last edited: May 27, 2013
  3. tsunami231

    tsunami231 Ancient Guru

    Messages:
    9,369
    Likes Received:
    288
    GPU:
    EVGA 1070Ti Black
    Switched my GTX 660 to MSI mode; we'll see if it makes any difference.

    Should PCI-E root ports be changed to MSI if the GPU is in MSI mode?
     
    Last edited: May 28, 2013
  4. mbk1969

    mbk1969 Ancient Guru

    Messages:
    7,347
    Likes Received:
    3,856
    GPU:
    GeForce GTX 1070
    On the home rig (Intel chipset) I switched the PCI-E ports manually along with the other devices. On the work rig (AMD chipset) the PCI-E ports were switched automatically after I switched the devices connected to them. On the home rig I don't see any performance difference, but on the work rig the difference is huge. Try switching the PCI-E ports too, but if you have a powerful rig you most probably won't see big improvements.
     

  5. Corrupt^

    Corrupt^ Ancient Guru

    Messages:
    6,869
    Likes Received:
    237
    GPU:
    ASUS 1080GTX STRIX
    Never checked those; I guess they might have been pretty high as well. Overall, anything visual-related just feels smoother.
     
  6. tsunami231

    tsunami231 Ancient Guru

    Messages:
    9,369
    Likes Received:
    288
    GPU:
    EVGA 1070Ti Black
    Did you do the PCI Express ports too? I'm all for smoother visuals. Right now I just have the NIC, SATA and GPU in MSI mode; trying to decide whether I should put the onboard sound card and PCI-E ports in that mode too.
     
  7. mbk1969

    mbk1969 Ancient Guru

    Messages:
    7,347
    Likes Received:
    3,856
    GPU:
    GeForce GTX 1070
    I asked only because LatencyMon v5 doesn't show the latencies of drivers. It shows DPC routine execution times on the "Drivers" tab (along with the ISR and DPC counts), and it shows the highest ISR execution time (along with the name of the driver with the highest ISR routine execution time) on the "Stats" tab.
     
  8. tsunami231

    tsunami231 Ancient Guru

    Messages:
    9,369
    Likes Received:
    288
    GPU:
    EVGA 1070Ti Black
    It does show the latency of drivers, per se, just not of the drivers as a whole: it shows the individual files within said drivers. That being said:

    I stopped looking at LatencyMon 5, or really all of them, because any time I saw anything over 500 µs I got the urge to start messing around with stuff, and I don't see it as worth it anymore. Unless you're getting massive stuttering in games or popping/clicking audio dropouts, it's not worth messing with HPET or other settings that may affect latency.

    As for the MSI stuff, I only changed my GPU to this mode and I can't say I see a difference, but then I only really play MMOs (SWTOR, RO2) these days and they're not the best games to judge performance by. The last time I played Deus Ex: Human Revolution, the game ran at 60 fps, smooth as butter.

    Though the following comes to mind: if MSI mode is supported and is a better way of running things, why aren't devices running in that mode by default where it's supported?

    SATA and NIC were the only things running in those modes, which leads me to believe it's the drivers that have to set them. Would that mean the Intel RST drivers and the Realtek NIC drivers set those modes by default, while the Nvidia drivers and Intel chipset drivers do not?

    I mean, I get that everything should have its own IRQ and not share them, and for the most part very few things share IRQs, and there are no conflicts between the pieces that do.

    Though I would be curious to see if there are any real-world benefits that can be documented.

     
    Last edited: May 29, 2013
  9. mbk1969

    mbk1969 Ancient Guru

    Messages:
    7,347
    Likes Received:
    3,856
    GPU:
    GeForce GTX 1070
    DPC execution time adds to the latency of process execution, yes. But DPC execution time doesn't depend on the interrupt mode of the device and its driver, because: the device's interrupt occurs -> the device driver's ISR routine executes and schedules a DPC -> the device driver's DPC routine executes and does the needed operation. I wrote to Corrupt^ in exactly this context: latencies dictated by the current interrupt mode (line- or message-based).

    If I were a driver developer I would give you a smart answer to that question. But as an ordinary developer I can say that it is one of the rules of development: do not touch well-working code. I can assume that driver developers don't have enough desire/motivation to implement new features or to choose the new MSI mode as the default one.

    Edit: Also, switching to MSI mode may be one available workaround in case an IRQ is shared. I don't know whether it is possible in modern Windows to change the IRQ setting manually as it was in the old days. All the controls are there, in the driver properties dialog, but they are disabled.
     
    Last edited: May 29, 2013
  10. xodius80

    xodius80 Master Guru

    Messages:
    764
    Likes Received:
    9
    GPU:
    GTX780Ti
    Hi, I've noticed something: my SATA is auto-configured to MSI, BUT I checked in the registry and found out that the DWORD value wasn't enabled.

    Did you manually enable yours?
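For reference, the switch being discussed here lives under the device's key in the registry: `HKLM\SYSTEM\CurrentControlSet\Enum\<device instance>\Device Parameters\Interrupt Management\MessageSignaledInterruptProperties`, with a DWORD named `MSISupported` set to 1. A small sketch that just builds that path for a given device instance ID (the instance ID below is made up; this is only a string helper, so it runs anywhere):

```python
def msi_registry_path(instance_id: str) -> str:
    """Build the registry path where MSISupported lives for a PCI device.

    instance_id is the device instance path as shown in Device Manager
    (the one used in the example below is made up).
    """
    return (r"HKLM\SYSTEM\CurrentControlSet\Enum"
            "\\" + instance_id +
            r"\Device Parameters\Interrupt Management"
            r"\MessageSignaledInterruptProperties")

# Hypothetical instance ID purely for illustration.
path = msi_registry_path(r"PCI\VEN_8086&DEV_1C02&SUBSYS_00000000\3&11583659&0&FA")
print(path)
# Setting DWORD "MSISupported" = 1 under this key (and rebooting) is what
# requests MSI mode for the device.
```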
     

  11. tsunami231

    tsunami231 Ancient Guru

    Messages:
    9,369
    Likes Received:
    288
    GPU:
    EVGA 1070Ti Black
    No, mine was already enabled. My guess is it was enabled when I installed Intel RST, or it always was, since I only enabled it for my GPU. I'm going to leave the rest as is.
     
  12. Corrupt^

    Corrupt^ Ancient Guru

    Messages:
    6,869
    Likes Received:
    237
    GPU:
    ASUS 1080GTX STRIX
    Tried to switch my ASUS sound card; it didn't work, it reverted back to line-based immediately.

    For AHCI or RAID controllers, it was set to line-based, but installing the newer Intel RST drivers switched it to MSI.
     
    Last edited: Jun 1, 2013
  13. tsunami231

    tsunami231 Ancient Guru

    Messages:
    9,369
    Likes Received:
    288
    GPU:
    EVGA 1070Ti Black
    Well, that confirms my suspicion that it's the drivers that put it in that mode.
     
  14. AudigyMaster

    AudigyMaster Member

    Messages:
    25
    Likes Received:
    0
    GPU:
    nVidia Quadro 3700M@1GB
    Looks like my posts evaporated with no prior notice.

    Well...
     
  15. mbk1969

    mbk1969 Ancient Guru

    Messages:
    7,347
    Likes Received:
    3,856
    GPU:
    GeForce GTX 1070

  16. Corrupt^

    Corrupt^ Ancient Guru

    Messages:
    6,869
    Likes Received:
    237
    GPU:
    ASUS 1080GTX STRIX
    As another user and I asked before the server issues:

    We would still like to get that application that allows us to set all the different devices to MSI (preferably one by one, so I can test), even if an AV is being a pain in the ass (I'll just turn off the real-time scanner first and then add it to the allowed list).
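No such tool surfaced in the thread, but a dry-run sketch of what one would have to do is simple enough (plain Python; the device list and the "registry" are simulated dictionaries, so nothing here touches a real machine):

```python
# Dry-run sketch of a "toggle MSI per device" tool. The real thing would
# write the MSISupported DWORD under each device's Interrupt Management
# registry key and require a reboot; here the registry is just a dict.
fake_registry = {
    "GPU":  {"MSISupported": 0},
    "NIC":  {"MSISupported": 1},   # NIC/SATA often arrive MSI-enabled already
    "SATA": {"MSISupported": 1},
}

def enable_msi(registry, device):
    """Flip one device to MSI mode; returns True if a change was made."""
    key = registry[device]
    if key["MSISupported"] == 1:
        return False  # already in MSI mode, nothing to do
    key["MSISupported"] = 1
    return True

# Go device by device, recording which ones actually needed switching.
changed = [d for d in fake_registry if enable_msi(fake_registry, d)]
print(changed)  # ['GPU'] - only the GPU needed switching
```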
     
  17. pipes

    pipes Member Guru

    Messages:
    162
    Likes Received:
    0
    GPU:
    gtx 1080 ti oc lab
  18. drbaltazar

    drbaltazar Banned

    Messages:
    416
    Likes Received:
    0
    GPU:
    amd 7950 3gb
    @tsunami: you ask why this isn't on by default? Microsoft, that's why. You are just trying to enable it, which is relatively easy. But MS suggests setting MSI to one message per physical core, while it's defaulted to one per CPU socket. Why? Again, because of MS: MS doesn't allow a driver to set more than one MSI message per socket, yet suggests one per physical core. For a company like Intel, setting things to one per physical core is relatively easy, but not for a company like AMD or Nvidia. And nowadays MS is so hell-bent on saving power that they push for one interrupt per socket. In the end, when properly tweaked, LatencyMon is likely to look great. Don't ask me why; the idea of MSI is to reduce the numbers (this includes page faults; you shouldn't get a lot of page faults, and if you do, something is wrong somewhere).

    So what about the industry in all this? Microsoft can enable this and probably does, since the performance gains are very nice; they don't have a lot of hardware, so it's easy for them. But for a hardware maker it is harder. The simplest way would be a small program that asks the user what hardware they have and sets it. I sure would love to see testers bother enabling the various MSI settings and setting the value to one MSI message per CPU core. On an FX we're talking 8 cores, which means 8 messages. I suspect this might help those with, say, two or more 7990s. I left AMD a while back, so I don't know.

    You have to remember that everything in a computer is controlled by interrupts. If I come to your house and try to speak to you, and you don't answer because you are too busy, I'll be waiting at the door a long while for your acknowledgement, all the while others are also trying to communicate with you. This is more of an issue these days because people are more social: they record and stream at the same time, which puts huge pressure on the interrupt system. Even with MSI enabled, only one CPU core will be used for interrupts by default, so what happens if you have 38 other interrupts? Yep, they have to wait in line. But if you have one MSI message per CPU core, the odds of all of them being busy are very low. I suspect this will be fully enabled and set to the proper MS-recommended value in the Xbox One; it is, after all, meant to be a very interactive box.
     
  19. mbk1969

    mbk1969 Ancient Guru

    Messages:
    7,347
    Likes Received:
    3,856
    GPU:
    GeForce GTX 1070
    @drbaltazar

    In any case, for me the MSI-based logic (mechanics) looks like a big improvement over the line-based one.
     
  20. drbaltazar

    drbaltazar Banned

    Messages:
    416
    Likes Received:
    0
    GPU:
    amd 7950 3gb
    Yeah, and Asus already released a new driver. Sadly, on the Intel front it isn't enabled by default, but I suspect the problem lies at the mobo end. Maybe upcoming hardware from various makers will have this properly set for desktop. True, even though it isn't properly set most of the time (yeah, even this month's patch from Asus), going from line-based IRQ to MSI alone fixes most issues. But streamers are out of luck.
     
