linux-raid.vger.kernel.org archive mirror
From: Konrad Rzepecki <krzepecki@dentonet.pl>
To: linux-raid@vger.kernel.org
Cc: NeilBrown <neilb@suse.de>
Subject: Re: Disk identity crisis on RAID10 recovery (3.1.0)
Date: Tue, 22 Nov 2011 13:22:36 +0100	[thread overview]
Message-ID: <4ECB940C.6010707@dentonet.pl> (raw)
In-Reply-To: <20111122221522.3f525fc6@notabene.brown>

On 22.11.2011 12:15, NeilBrown wrote:
> You can probably get your data back... but really you should have
> asked for help as soon as strange things started happening!

Possibly, but I don't like to bother others when I don't need to. And in
the beginning I didn't connect this behavior with the kernel, but with
some hardware disk issues.

Moreover, reporting bugs on the list is, hmm... inconvenient. Please
bring Bugzilla back online ASAP.

> If you have all important data backed up then just upgrade to 3.1.2
> and make the array again from scratch.

Probably this is the best way...
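
If restoring from backup, recreating the array would look roughly like the sketch below. It is only a sketch based on the --examine output further down (near=2 layout, 512K chunk, 0.90 metadata, eight raid devices, preferred minor 1); the device order and the use of sdb2/sdc2 to fill the two failed slots are my assumptions, so every value must be checked against your own superblocks before running anything destructive.

```shell
# Sketch only, NOT a tested recipe -- recreating the RAID10 from
# scratch destroys the old contents, so only do this with backups.
# Layout (n2), chunk (512K), metadata (0.90) and device count are
# taken from the --examine output below; the slot assignment for
# sdb2/sdc2 is an assumption.
mdadm --stop /dev/md1
mdadm --create /dev/md1 --metadata=0.90 --level=10 --layout=n2 \
      --chunk=512 --raid-devices=8 \
      /dev/sda2 /dev/sdb2 /dev/sdc2 /dev/sdd2 \
      /dev/se2  /dev/sdf2 /dev/sdg2 /dev/sdh2
```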

> If you want to try to recover the array please report the output of
> "mdadm --examine" on all of the devices.

...but I have pasted the output of these commands below. If some of the
system survives, I can bring it back up faster. Of course, only if it
isn't too complicated and doesn't take too much of your time to describe.

mdadm --examine /dev/sda2
/dev/sda2:
           Magic : a92b4efc
         Version : 0.90.00
            UUID : 0f01ef9e:ab0117cf:3d186b3c:53958f34 (local to host (none))
   Creation Time : Wed Aug 17 09:34:04 2011
      Raid Level : raid10
   Used Dev Size : 1463946752 (1396.13 GiB 1499.08 GB)
      Array Size : 5855787008 (5584.51 GiB 5996.33 GB)
    Raid Devices : 8
   Total Devices : 7
Preferred Minor : 1

     Update Time : Tue Nov 22 11:43:58 2011
           State : clean
  Active Devices : 6
Working Devices : 7
  Failed Devices : 2
   Spare Devices : 1
        Checksum : e8423abc - correct
          Events : 155878

          Layout : near=2
      Chunk Size : 512K

       Number   Major   Minor   RaidDevice State
this     0       8        2        0      active sync   /dev/sda2

    0     0       8        2        0      active sync   /dev/sda2
    1     1       0        0        1      faulty removed
    2     2       0        0        2      faulty removed
    3     3       8       50        3      active sync   /dev/sdd2
    4     4       8       66        4      active sync   /dev/sde2
    5     5       8       82        5      active sync   /dev/sdf2
    6     6       8       98        6      active sync   /dev/sdg2
    7     7       8      114        7      active sync   /dev/sdh2
    8     8       8       34        8      spare   /dev/sdc2


sdb is back, but it is a new, empty disk.


mdadm --examine /dev/sdc2
/dev/sdc2:
           Magic : a92b4efc
         Version : 0.90.00
            UUID : 0f01ef9e:ab0117cf:3d186b3c:53958f34 (local to host (none))
   Creation Time : Wed Aug 17 09:34:04 2011
      Raid Level : raid10
   Used Dev Size : 1463946752 (1396.13 GiB 1499.08 GB)
      Array Size : 5855787008 (5584.51 GiB 5996.33 GB)
    Raid Devices : 8
   Total Devices : 7
Preferred Minor : 1

     Update Time : Tue Nov 22 11:43:58 2011
           State : clean
  Active Devices : 6
Working Devices : 7
  Failed Devices : 2
   Spare Devices : 1
        Checksum : e8423ae6 - correct
          Events : 155878

          Layout : near=2
      Chunk Size : 512K

       Number   Major   Minor   RaidDevice State
this     8       8       34        8      spare   /dev/sdc2

    0     0       8        2        0      active sync   /dev/sda2
    1     1       0        0        1      faulty removed
    2     2       0        0        2      faulty removed
    3     3       8       50        3      active sync   /dev/sdd2
    4     4       8       66        4      active sync   /dev/sde2
    5     5       8       82        5      active sync   /dev/sdf2
    6     6       8       98        6      active sync   /dev/sdg2
    7     7       8      114        7      active sync   /dev/sdh2
    8     8       8       34        8      spare   /dev/sdc2


mdadm --examine /dev/sdd2
/dev/sdd2:
           Magic : a92b4efc
         Version : 0.90.00
            UUID : 0f01ef9e:ab0117cf:3d186b3c:53958f34 (local to host (none))
   Creation Time : Wed Aug 17 09:34:04 2011
      Raid Level : raid10
   Used Dev Size : 1463946752 (1396.13 GiB 1499.08 GB)
      Array Size : 5855787008 (5584.51 GiB 5996.33 GB)
    Raid Devices : 8
   Total Devices : 7
Preferred Minor : 1

     Update Time : Tue Nov 22 11:43:58 2011
           State : clean
  Active Devices : 6
Working Devices : 7
  Failed Devices : 2
   Spare Devices : 1
        Checksum : e8423af2 - correct
          Events : 155878

          Layout : near=2
      Chunk Size : 512K

       Number   Major   Minor   RaidDevice State
this     3       8       50        3      active sync   /dev/sdd2

    0     0       8        2        0      active sync   /dev/sda2
    1     1       0        0        1      faulty removed
    2     2       0        0        2      faulty removed
    3     3       8       50        3      active sync   /dev/sdd2
    4     4       8       66        4      active sync   /dev/sde2
    5     5       8       82        5      active sync   /dev/sdf2
    6     6       8       98        6      active sync   /dev/sdg2
    7     7       8      114        7      active sync   /dev/sdh2
    8     8       8       34        8      spare   /dev/sdc2


mdadm --examine /dev/sde2
/dev/sde2:
           Magic : a92b4efc
         Version : 0.90.00
            UUID : 0f01ef9e:ab0117cf:3d186b3c:53958f34 (local to host (none))
   Creation Time : Wed Aug 17 09:34:04 2011
      Raid Level : raid10
   Used Dev Size : 1463946752 (1396.13 GiB 1499.08 GB)
      Array Size : 5855787008 (5584.51 GiB 5996.33 GB)
    Raid Devices : 8
   Total Devices : 7
Preferred Minor : 1

     Update Time : Tue Nov 22 11:43:58 2011
           State : clean
  Active Devices : 6
Working Devices : 7
  Failed Devices : 2
   Spare Devices : 1
        Checksum : e8423b04 - correct
          Events : 155878

          Layout : near=2
      Chunk Size : 512K

       Number   Major   Minor   RaidDevice State
this     4       8       66        4      active sync   /dev/sde2

    0     0       8        2        0      active sync   /dev/sda2
    1     1       0        0        1      faulty removed
    2     2       0        0        2      faulty removed
    3     3       8       50        3      active sync   /dev/sdd2
    4     4       8       66        4      active sync   /dev/sde2
    5     5       8       82        5      active sync   /dev/sdf2
    6     6       8       98        6      active sync   /dev/sdg2
    7     7       8      114        7      active sync   /dev/sdh2
    8     8       8       34        8      spare   /dev/sdc2


mdadm --examine /dev/sdf2
/dev/sdf2:
           Magic : a92b4efc
         Version : 0.90.00
            UUID : 0f01ef9e:ab0117cf:3d186b3c:53958f34 (local to host (none))
   Creation Time : Wed Aug 17 09:34:04 2011
      Raid Level : raid10
   Used Dev Size : 1463946752 (1396.13 GiB 1499.08 GB)
      Array Size : 5855787008 (5584.51 GiB 5996.33 GB)
    Raid Devices : 8
   Total Devices : 7
Preferred Minor : 1

     Update Time : Tue Nov 22 11:43:58 2011
           State : clean
  Active Devices : 6
Working Devices : 7
  Failed Devices : 2
   Spare Devices : 1
        Checksum : e8423b16 - correct
          Events : 155878

          Layout : near=2
      Chunk Size : 512K

       Number   Major   Minor   RaidDevice State
this     5       8       82        5      active sync   /dev/sdf2

    0     0       8        2        0      active sync   /dev/sda2
    1     1       0        0        1      faulty removed
    2     2       0        0        2      faulty removed
    3     3       8       50        3      active sync   /dev/sdd2
    4     4       8       66        4      active sync   /dev/sde2
    5     5       8       82        5      active sync   /dev/sdf2
    6     6       8       98        6      active sync   /dev/sdg2
    7     7       8      114        7      active sync   /dev/sdh2
    8     8       8       34        8      spare   /dev/sdc2


mdadm --examine /dev/sdg2
/dev/sdg2:
           Magic : a92b4efc
         Version : 0.90.00
            UUID : 0f01ef9e:ab0117cf:3d186b3c:53958f34 (local to host (none))
   Creation Time : Wed Aug 17 09:34:04 2011
      Raid Level : raid10
   Used Dev Size : 1463946752 (1396.13 GiB 1499.08 GB)
      Array Size : 5855787008 (5584.51 GiB 5996.33 GB)
    Raid Devices : 8
   Total Devices : 7
Preferred Minor : 1

     Update Time : Tue Nov 22 11:43:58 2011
           State : clean
  Active Devices : 6
Working Devices : 7
  Failed Devices : 2
   Spare Devices : 1
        Checksum : e8423b28 - correct
          Events : 155878

          Layout : near=2
      Chunk Size : 512K

       Number   Major   Minor   RaidDevice State
this     6       8       98        6      active sync   /dev/sdg2

    0     0       8        2        0      active sync   /dev/sda2
    1     1       0        0        1      faulty removed
    2     2       0        0        2      faulty removed
    3     3       8       50        3      active sync   /dev/sdd2
    4     4       8       66        4      active sync   /dev/sde2
    5     5       8       82        5      active sync   /dev/sdf2
    6     6       8       98        6      active sync   /dev/sdg2
    7     7       8      114        7      active sync   /dev/sdh2
    8     8       8       34        8      spare   /dev/sdc2



mdadm --examine /dev/sdh2
/dev/sdh2:
           Magic : a92b4efc
         Version : 0.90.00
            UUID : 0f01ef9e:ab0117cf:3d186b3c:53958f34 (local to host (none))
   Creation Time : Wed Aug 17 09:34:04 2011
      Raid Level : raid10
   Used Dev Size : 1463946752 (1396.13 GiB 1499.08 GB)
      Array Size : 5855787008 (5584.51 GiB 5996.33 GB)
    Raid Devices : 8
   Total Devices : 7
Preferred Minor : 1

     Update Time : Tue Nov 22 11:43:58 2011
           State : clean
  Active Devices : 6
Working Devices : 7
  Failed Devices : 2
   Spare Devices : 1
        Checksum : e8423b3a - correct
          Events : 155878

          Layout : near=2
      Chunk Size : 512K

       Number   Major   Minor   RaidDevice State
this     7       8      114        7      active sync   /dev/sdh2

    0     0       8        2        0      active sync   /dev/sda2
    1     1       0        0        1      faulty removed
    2     2       0        0        2      faulty removed
    3     3       8       50        3      active sync   /dev/sdd2
    4     4       8       66        4      active sync   /dev/sde2
    5     5       8       82        5      active sync   /dev/sdf2
    6     6       8       98        6      active sync   /dev/sdg2
    7     7       8      114        7      active sync   /dev/sdh2
    8     8       8       34        8      spare   /dev/sdc2
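
For what it's worth, every surviving member above reports the same Events counter (155878), which is the field mdadm uses to decide whether superblocks agree when assembling. A small sketch for pulling that field out of saved --examine output (the extract_events helper is hypothetical, just for illustration):

```shell
# Sketch: extract the Events counter from "mdadm --examine" output.
# Comparing the value across members shows whether the superblocks
# agree; here every member reports 155878.
extract_events() {
    grep 'Events :' | sed 's/.*Events : //'
}

# Sample input, copied from the superblock dump of /dev/sda2.
extract_events <<'EOF'
     Update Time : Tue Nov 22 11:43:58 2011
           State : clean
          Events : 155878
EOF
```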

-- 
    Konrad Rzepecki

Thread overview: 8+ messages
2011-11-22 10:15 Disk identity crisis on RAID10 recovery (3.1.0) Konrad Rzepecki
2011-11-22 11:15 ` NeilBrown
2011-11-22 12:22   ` Konrad Rzepecki [this message]
2011-11-22 14:50     ` Konrad Rzepecki
2011-11-22 20:00     ` NeilBrown
2011-11-23  7:25       ` [OT] " Konrad Rzepecki
2011-11-23  7:44         ` David Brown
2011-11-23  7:49         ` NeilBrown
