
    How to know when a drive in a btrfs RAID is going bad?

      There seems to be tons of information about what to do once you already know a drive is going bad: mount the array as degraded, add the new drive, remove the old one, yadda yadda yadda.
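
      For reference, this is roughly the replacement sequence I keep seeing recommended. The device names and mount point below are just placeholders for my setups:

          # mount the surviving device with the degraded option
          mount -o degraded /dev/sdb /mnt/raid1

          # add the replacement drive to the filesystem
          btrfs device add /dev/sdc /mnt/raid1

          # remove the missing (failed) device; btrfs re-replicates
          # its copies onto the remaining drives
          btrfs device delete missing /mnt/raid1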

      How does btrfs report that a drive is bad to begin with? I'm having trouble tracking down a solid answer.

      Does btrfs even report anything, or is it up to me to use SMART to monitor the drives?

      Right now I have two btrfs RAID1 setups, each on a different machine, but I don't know how to monitor the drives' health. Should I be relying on something btrfs can report, or just on SMART?
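
      In case a concrete setup helps: here's the sort of check I'm thinking of running from cron on each machine, assuming btrfs device stats (which I gather needs kernel 3.5 or newer) and smartctl are the right tools. The mount point and device names here are placeholders:

          #!/bin/sh
          # error counters btrfs keeps per device
          # (read/write/flush errors, corruption, generation mismatches)
          btrfs device stats /mnt/raid1

          # SMART overall health verdict, plus the attributes
          # that usually flag a dying disk
          for dev in /dev/sda /dev/sdb; do
              smartctl -H "$dev"
              smartctl -A "$dev" | grep -Ei 'reallocated|pending|uncorrect'
          done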