A few weeks ago, I rebuilt a recently decommissioned server to run as an infrastructure test and development rig at home. I installed Windows Server 2008 R2, enabled the Hyper-V role and all was good until I started to configure my networks, during which I experienced a “blue screen of death” (BSOD) – never a good thing on your virtualisation host, especially when it does the same thing again on reboot:
“Oh dear, my freshly built Windows Server 2008 R2 machine has just thrown 3 BSODs in a row… after running normally for an hour or so :-(“
The server is a Dell PowerEdge 840 (a small workgroup server that I bought a couple of years ago) with 8GB RAM and a quad core Xeon CPU. The hardware is nothing special – but fine for my infrastructure testing – and it had been running with Windows Server 2008 Hyper-V since new (with no issues) but this was the first time I’d tried R2.
I have 3 network adapters in the server: a built-in Broadcom NetXtreme Gigabit card (which I’ve reserved for remote access) and 2 Intel PRO/100s (for VM workloads). Ideally I’d use Gigabit Ethernet cards for the VM workload too, but this is only my home network and they were what I had available!
Trying to find out the cause of the problem, I ran WhoCrashed, which gave me the following information:
This was likely caused by the following module: efe5b32e.sys
Bugcheck code: 0xD1 (0x0, 0x2, 0x0, 0xFFFFF88002C4A3F1)
Error: DRIVER_IRQL_NOT_LESS_OR_EQUAL
Dump file: C:\Windows\Minidump\020410-15397-01.dmp
file path: C:\Windows\system32\drivers\efe5b32e.sys
product: Intel(R) PRO/100 Adapter
company: Intel Corporation
description: Intel(R) PRO/100 Adapter NDIS 5.1 driver
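For what it’s worth, bugcheck 0xD1 means a driver touched memory at too high an IRQL: reading those four parameters, this was a read (0x0) of address 0x0 at IRQL 2 (DISPATCH_LEVEL) from code at 0xFFFFF88002C4A3F1. If you want to dig deeper than WhoCrashed goes, the Debugging Tools for Windows will analyse the same minidump – a sketch, assuming the tools and symbols are set up:

rem Open the minidump in the command-line kernel debugger, run the
rem automated crash analysis and quit (windbg -z ... works the same way)
kd -z C:\Windows\Minidump\020410-15397-01.dmp -c "!analyze -v; q"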
That confirmed that the issue was with the Intel NIC driver, which sounded right as, after enabling the Hyper-V role, I connected an Ethernet cable to one of the Intel NICs and got a BSOD each time the server came up. If I disconnected the cable, no BSOD. Back to the twitters:
“Does anyone know of any problems with Intel NICs and Hyper-V R2 (that might cause a BSOD)?”
I switched the in-box (Microsoft) drivers for some (older) Intel ones. That didn’t fix things, so I switched back to the latest drivers. Eventually I found that the issue was caused by the checkbox for “Allow management operating system to share this network adapter” and that, if the NIC was live and I selected this, I could reproduce the error:
“Found the source of yesterday’s WS08R2 Hyper-V crash… any idea why enabling this option http://twitpic.com/11b64y would trip a BSOD?”
Even though I could work around the issue (because I don’t want to share a NIC between the parent partition and the children anyway – I have the Broadcom NIC for remote access) it seemed strange that this behaviour should occur. There was no NIC teaming involved and the server was still a straightforward UK installation (aside from enabling Hyper-V and setting up virtual networks).
Based on suggestions from other Virtual Machine MVPs I also:
- Flashed the NICs to the latest release of the Intel Boot Agent (these cards don’t have a BIOS).
- Updated the Broadcom NIC to the latest drivers too.
- Attempted to turn off jumbo frames, but the option was not available in the properties, so I could rule that out.
Thankfully, @stufox (from Microsoft in New Zealand) saw my tweets and was kind enough to step in to offer assistance. It took us a few days, thanks to timezone differences and my work schedule, but we got there in the end.
First up, I sent Stu a minidump from the crash, which he worked on with one of the Windows Server kernel developers. They suggested running the Driver Verifier (verifier.exe) against the various physical network adapters (and against vmswitch.sys). More details of this tool can be found in Microsoft knowledge base article 244617, but the response to the verifier /query command was as follows:
09/02/2010, 23:19:33
Level: 000009BB
RaiseIrqls: 0
AcquireSpinLocks: 44317
SynchronizeExecutions: 2
AllocationsAttempted: 152850
AllocationsSucceeded: 152850
AllocationsSucceededSpecialPool: 152850
AllocationsWithNoTag: 0
AllocationsFailed: 0
AllocationsFailedDeliberately: 0
Trims: 41047
UnTrackedPool: 141544
Verified drivers:
Name: efe5b32e.sys, loads: 1, unloads: 0
CurrentPagedPoolAllocations: 0
CurrentNonPagedPoolAllocations: 0
PeakPagedPoolAllocations: 0
PeakNonPagedPoolAllocations: 0
PagedPoolUsageInBytes: 0
NonPagedPoolUsageInBytes: 0
PeakPagedPoolUsageInBytes: 0
PeakNonPagedPoolUsageInBytes: 0
Name: ndis.sys, loads: 1, unloads: 0
CurrentPagedPoolAllocations: 6
CurrentNonPagedPoolAllocations: 1926
PeakPagedPoolAllocations: 8
PeakNonPagedPoolAllocations: 1928
PagedPoolUsageInBytes: 984
NonPagedPoolUsageInBytes: 1381456
PeakPagedPoolUsageInBytes: 1296
PeakNonPagedPoolUsageInBytes: 1381968
Name: b57nd60a.sys, loads: 1, unloads: 0
CurrentPagedPoolAllocations: 0
CurrentNonPagedPoolAllocations: 3
PeakPagedPoolAllocations: 0
PeakNonPagedPoolAllocations: 3
PagedPoolUsageInBytes: 0
NonPagedPoolUsageInBytes: 188448
PeakPagedPoolUsageInBytes: 0
PeakNonPagedPoolUsageInBytes: 188448
Name: vmswitch.sys, loads: 1, unloads: 0
CurrentPagedPoolAllocations: 1
CurrentNonPagedPoolAllocations: 18
PeakPagedPoolAllocations: 2
PeakNonPagedPoolAllocations: 24
PagedPoolUsageInBytes: 108
NonPagedPoolUsageInBytes: 50352
PeakPagedPoolUsageInBytes: 632
PeakNonPagedPoolUsageInBytes: 54464
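For reference, Driver Verifier is driven from the command line: based on the verified driver list above, the commands would have looked something like this (a sketch following the standard verifier.exe syntax described in KB 244617 – illustrative, not a transcript of our session):

rem Enable standard verification for the suspect drivers (takes effect after a reboot)
verifier /standard /driver efe5b32e.sys ndis.sys b57nd60a.sys vmswitch.sys

rem Display the current verification statistics (the output shown above)
verifier /query

rem Switch verification off again once the debugging is done
verifier /reset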
To be honest, I haven’t a clue what half of that verifier output means but the guys at Microsoft did – and they also asked me for a kernel dump (Dirk A D Smith has written an article at Network World that gives a good description of the various types of memory dump: minidump; kernel; and full). Transmitting this file caused some issues (it was 256MB in size – too big for e-mail) but it compressed well, and 7-Zip allowed me to split it into chunks to get under the 50MB file size limit on Windows Live SkyDrive. Using this, Stu and his kernel developer colleagues were able to see that there is a bug in the Intel driver I’m using – but it turns out there is another workaround too: turning off Large Send Offload in the network adapter properties. Since I did this, the server has run without a hiccup (as I would have expected).
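If you ever need to move a dump around the same way, 7-Zip’s command line can compress and split into volumes in a single pass. A quick sketch – dump.7z is an arbitrary name and MEMORY.DMP is the default kernel dump location, so adjust to taste:

rem Compress the kernel dump and split the archive into 50MB volumes
7z a -v50m dump.7z C:\Windows\MEMORY.DMP

rem The recipient reassembles and extracts from the first volume
7z x dump.7z.001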
“Thanks to @stufox for helping me fix the BSOD on my Hyper-V R2 server. Turned out to be an Intel device driver issue – I will blog details”
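On the Large Send Offload workaround: it’s a per-adapter setting on the Advanced tab of the NIC’s properties in Device Manager, stored under the network adapter class key in the registry. The keyword name varies from driver to driver (NDIS 6 drivers use standardised names like *LsoV1IPv4; older drivers use their own), so the search below is just a way of hunting for likely candidates rather than a guaranteed recipe:

rem Search the network adapter class key for Large Send Offload keywords
reg query "HKLM\SYSTEM\CurrentControlSet\Control\Class\{4D36E972-E325-11CE-BFC1-08002BE10318}" /s /f "Lso"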
It’s good to know that Hyper-V was not at fault here: sure, it shows that a rogue device driver can bring down a Windows system but that’s hardly breaking news – the good thing about the Hyper-V architecture is that I can easily update network device drivers. And, let’s face it, I was running enterprise-class software on a workgroup server with some old, unsupported, hardware – you could say that I was asking for trouble…
It’s disappointing to see Eric Gray take this blog post out of context in his post (pingback in comment 1)… if only he’d read how VMware’s architecture limits the hardware I can actually use… still, what can I expect from a VMware employee who claims to offer “informed virtualization criticism”… at least he doesn’t claim to be objective I guess.
The same error: bugcheck D1, and C:\Windows\system32\drivers\efe5b32e.sys mentioned. And the same config: Intel(R) PRO/100 Adapter configured as in your picture http://twitpic.com/11b64y
Thanks a lot, I shall try your recipe. And by the way, have you heard whether Intel has fixed this error in the newest drivers?
Unfortunately, there do not seem to be any newer drivers, as this NIC is now out of support.
The Microsoft version of the driver is 8.0.47.1. Intel’s one is 8.0.47. It seems they are the same. I was using 8.0.43 from Intel when I turned on sharing on the network card and the server began crashing. I switched back to the Microsoft driver and the server works pretty well.
Brilliant! Thanks Mark.
Fixed my problem which was with an old HP NetServer NIC (a rebadged Intel).
Thanks for posting this, I experienced the same issue with a test setup. This solved my problem. Thanks
THANK YOU! My home 2K8 R2 server’s Hyper-V kept crashing whenever I stressed the network on the VM guests – it turned out to be an old Intel PRO/100 adaptor and the Large Send Offload issue.
Your blog pointed me to the answer…
Thanks again
Paul Adams
Hi,
What an excellent post! I had exactly the same issue as you: I am also running an environment at home built using Hyper-V and some outdated hardware, like that Intel dual-port network interface.
Greetings and Thanks!
Vesa
5 years later, I am using some old unsupported hardware with the same rogue driver. Thanks!!!