In my main home PC, I have a 3Ware 9650SE-8LPML, to which are attached four 6TB WD Red drives.
About two weeks ago, it decided that drive 3 had a problem and that it had to rebuild the RAID. I've left the computer running ever since, but the rebuild never gets beyond around 8% or 9% before dropping back to 0% and starting over.
[Screenshot: RAID1.PNG — controller status showing the rebuild in progress]
According to the SMART data reported by the controller, there is nothing wrong with the drive itself:
[Screenshot: RAID2.PNG — SMART data for the drive as reported by the controller]
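For anyone wanting to cross-check the controller's SMART summary from the Ubuntu side, smartmontools can query drives behind a 3ware card directly using its 3ware passthrough. Here's a minimal sketch of pulling the attributes that matter for rebuild loops; the port number 3 and the /dev/twa0 device node are assumptions, so adjust them to match your setup:

```python
#!/usr/bin/env python3
"""Pull the SMART attributes relevant to rebuild loops from a drive
behind a 3ware controller, via smartmontools' 3ware passthrough."""
import subprocess

# Assumptions: the card is /dev/twa0 and the suspect drive is on port 3.
DEVICE = "/dev/twa0"
PORT = 3

# Attributes worth watching: 5/197/198 implicate the drive itself,
# while 199 (UDMA CRC errors) implicates the cable or connector.
WATCH = {
    "5": "Reallocated_Sector_Ct",
    "197": "Current_Pending_Sector",
    "198": "Offline_Uncorrectable",
    "199": "UDMA_CRC_Error_Count",
}

out = subprocess.run(
    ["smartctl", "-A", "-d", f"3ware,{PORT}", DEVICE],
    capture_output=True, text=True,
).stdout

for line in out.splitlines():
    fields = line.split()
    # Attribute rows in smartctl's table start with the numeric ID.
    if fields and fields[0] in WATCH:
        # RAW_VALUE is the last column of the attribute table.
        print(f"{WATCH[fields[0]]:>24}: raw={fields[-1]}")
```

Non-zero pending or reallocated sectors would point at the drive despite the controller's clean bill of health, while a climbing CRC count would point at the breakout cable.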
I wondered whether Windows was doing something that prevented the rebuild from completing, so I left the PC sitting on the GRUB boot menu that appears after the RAID card initialises (this PC dual-boots Windows 10 and Ubuntu from partitions on an SSD that is entirely separate from the RAID controller and connected directly to the motherboard) for a day. But when I came back it was the same story: rebuilding at 2%, then back to zero a few hours later. A 6TB drive should take roughly 1.5 to 2 days to rebuild completely, so something is clearly wrong.
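Since the box already boots Ubuntu, one way to pin down exactly when the rebuild resets is to log the percentage over time with 3ware's tw_cli tool. A rough sketch, assuming tw_cli is installed and the array is unit u0 on controller c0 (the table layout varies by firmware, so the regex is deliberately loose):

```python
#!/usr/bin/env python3
"""Poll tw_cli and log the rebuild percentage with timestamps, so a
reset back to 0% can be matched against a time or percentage."""
import re
import subprocess
import time
from datetime import datetime

# Assumptions: controller c0, unit u0; check `tw_cli show` for yours.
CMD = ["tw_cli", "/c0/u0", "show"]
INTERVAL = 300  # seconds between polls

while True:
    out = subprocess.run(CMD, capture_output=True, text=True).stdout
    # Grab the first percentage in the unit status table (%RCmpl).
    m = re.search(r"(\d+)%", out)
    pct = m.group(1) if m else "?"
    print(f"{datetime.now().isoformat(timespec='seconds')}  rebuild={pct}%",
          flush=True)
    time.sleep(INTERVAL)
```

Run it as `python3 rebuild_log.py | tee rebuild.log` and check whether the resets happen at a consistent percentage (suggesting an unreadable region at a fixed point on one of the source drives) or at a consistent interval (suggesting a timeout or link reset).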
The problem has to be either the RAID controller, the mini-SAS to SATA breakout cable, or the drive itself; logically, I can't think of any other possible cause. The card's diagnostics claim the drive is OK, and given that it's only two years and 11,000 power-on hours old and has been properly ventilated throughout, it certainly should be. Has anyone else come across this, and what turned out to be the cause?
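In the meantime, one way to split cable from drive without swapping hardware is to run an extended SMART self-test on the suspect drive and watch the result: a failing self-test points at the drive itself, while a clean pass alongside a climbing CRC error count (attribute 199 from the earlier sketch) points at the cable. A sketch with the same assumed port and device node as above; note a 6TB drive's extended test typically takes several hours:

```python
#!/usr/bin/env python3
"""Kick off an extended SMART self-test on the suspect drive through
the 3ware passthrough and report the result when it finishes."""
import subprocess
import time

DEVICE, PORT = "/dev/twa0", 3  # assumptions, as in the earlier sketch

def smartctl(*args):
    return subprocess.run(
        ["smartctl", *args, "-d", f"3ware,{PORT}", DEVICE],
        capture_output=True, text=True,
    ).stdout

# Start the long (extended) offline self-test.
print(smartctl("-t", "long"))

# Poll the self-test log until the in-progress entry clears.
while True:
    time.sleep(600)
    log = smartctl("-l", "selftest")
    if "in progress" not in log.lower():
        print(log)
        break
```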