Not an unusual situation. Drivers for disk subsystems are, I am told, a tricky proposition because there is a constant battle between getting the job done and not messing up the rest of the system's performance. RAID is even worse, depending on how much of the necessary calculation (what RAID level are you running?) is handed off to hardware and how much has to be handled by the driver software. Such drivers are difficult to do well and there are very few people with the necessary experience; those with a good track record in this field are sought after and costly.
Your example is one of the best arguments for leaving a multimedia computer as simple as possible as the performance ‘out of the box’ is likely to be as good as it gets. As soon as you start adding hardware and software complications you are on the road to ruin!
DPC latency is just how long a driver takes to respond to a DPC event, and has essentially no relation to how complex the RAID calculations are. RAID and storage drivers as a whole tend to be very fast, simply because storage needs to respond quickly. Even fakeraid, with its CPU-stealing ways.
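Proper DPC measurement needs a kernel-level tool like LatencyMon or xperf, but you can get a rough feel for scheduling hiccups from user space by timing how far timer wakeups overshoot their target. A minimal sketch (the interval and sample count are arbitrary choices, not anything a real tool uses):

```python
import time

def measure_timer_jitter(interval_s=0.001, samples=2000):
    """Sleep for a fixed interval repeatedly and record how far each
    wakeup overshoots the target. Large outliers here hint at the same
    kind of stalls (DPCs, ISRs, scheduler preemption) that latency
    checkers flag, though this cannot attribute them to a driver."""
    overshoots = []
    for _ in range(samples):
        start = time.perf_counter()
        time.sleep(interval_s)
        elapsed = time.perf_counter() - start
        overshoots.append(max(0.0, elapsed - interval_s))
    return overshoots

jitter = measure_timer_jitter(samples=200)
print(f"worst wakeup overshoot: {max(jitter) * 1000:.3f} ms, "
      f"average: {sum(jitter) / len(jitter) * 1000:.3f} ms")
```

On a quiet system the worst overshoot stays small; a badly behaved driver under load tends to show up as occasional multi-millisecond spikes.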
In modern times there are also very few true RAID controller manufacturers left: LSI/Avago/Broadcom, Marvell, and Adaptec/Microsemi/PMC-Sierra. Essentially everyone else selling a RAID "controller" is using some form of fakeraid/softraid with extra boot firmware. Ironically, it's the guys with the big, expensive true RAID controllers that have some of the slowest drivers, since a longer DPC call is fine if the overall storage/network performance goes up.
I have a system drive and a pair of drives in a RAID 1 mirror. In all honesty, reloading the computer was probably for the best. Even before dealing with the audio popping and figuring out Foobar, I was dealing with recurring blue screens caused by a Windows update. Turns out the new update no longer played well with the memory timings I had set in the BIOS (that was a fun one to track down!). However, getting to that conclusion led me to do all manner of loading and rolling back drivers, tweaking BIOS settings, among other things. So getting back to a good one-and-done baseline will most likely help prevent other issues. And I learned a thing or two extra about Windows that might help down the road; I didn't even know DPC latency was a thing until I happened to stumble upon it during one of my searches.
Happens, and it's a *huge* part of why nobody considers overclocked systems a valid basis for judging software/driver stability. PS: step 1 of diagnosing any sort of reproducible error is to disable any and all overclocks.
Mmm... I'm not at all convinced by the idea of running a RAID1 internally within a Windows machine (even as a data-only drive). As an external NAS, yes – but as an internal drive within Windows? Simply because a far more likely failure mode than a drive fault is that Windows itself will mess up the filing system, and your RAID will then faithfully create two faulty copies. Such a scenario is all the more likely if the system is struggling with latency issues.
My instinct is to run your second drive as a standard solo (no RAID, no problematic high-latency drivers) and use the other RAID partner drive as a weekly-or-so backup.
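That weekly one-way mirror can be as simple as a scheduled script. A minimal sketch (the paths are hypothetical, and a real setup would use something battle-tested like robocopy's mirror mode instead):

```python
import filecmp
import shutil
from pathlib import Path

def mirror_backup(source: Path, dest: Path) -> None:
    """One-way mirror: copy new/changed files from source to dest,
    then delete anything in dest that no longer exists in source.
    Unlike RAID1, corruption only propagates when you next run it,
    so you have a window to notice and recover."""
    dest.mkdir(parents=True, exist_ok=True)
    # Copy new and updated files (rglob yields parents before children,
    # so directories exist before their contents are copied).
    for src_file in source.rglob("*"):
        target = dest / src_file.relative_to(source)
        if src_file.is_dir():
            target.mkdir(exist_ok=True)
        elif not target.exists() or not filecmp.cmp(src_file, target, shallow=True):
            shutil.copy2(src_file, target)
    # Remove anything deleted from the source, deepest paths first.
    for dst_file in sorted(dest.rglob("*"), reverse=True):
        if not (source / dst_file.relative_to(dest)).exists():
            if dst_file.is_dir():
                dst_file.rmdir()
            else:
                dst_file.unlink()
```

Run it weekly via Task Scheduler (or cron) and the second drive stays a lagging copy rather than an instant mirror of any mistake.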
RAID1 is perfectly fine even for Windows. It's just a bit pointless, because odds are you'll break something in the OS install, accidentally delete something, or upgrade the drive before a drive actually fails.
In my case, I have one particular driver that spits out crazy DPCs when I load it up: my Mellanox 40Gig network card. Changing the Windows scheduler to prefer background tasks rather than foreground tasks seems to help, but I have yet to run serious benchmarks, so don't take my word as absolute truth.
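For anyone wanting to flip that scheduler preference without clicking through System Properties: it's the Win32PrioritySeparation registry value. The path is the standard one; my understanding is that 0x18 corresponds to "Background services" and 0x26 to "Programs" in the UI, but verify before relying on it. A Windows-only sketch (run elevated, then reboot or log off):

```python
# Assumption: 0x18 = favour background services, 0x26 = favour programs.
# This mirrors the "Adjust for best performance of" radio buttons in
# System Properties > Advanced > Performance Settings.
import winreg

KEY_PATH = r"SYSTEM\CurrentControlSet\Control\PriorityControl"

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, KEY_PATH, 0,
                    winreg.KEY_SET_VALUE) as key:
    winreg.SetValueEx(key, "Win32PrioritySeparation", 0,
                      winreg.REG_DWORD, 0x18)  # background services
```

The same change via the GUI is safer if you're unsure; the registry route is just handy for scripting it across machines.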