
Issue with Live Upgrade on ZFS root in LDOM

807559 · Oct 31 2009
Ok, I've got a T5120 with an existing and working LDOM which is using a mkfile in ZFS for its root disk. I'm working on converting the root disk for this LDOM over to a zvol. So in the control domain I've done the following (a rough sketch of the commands follows the list):

- Created a new and larger zvol under the same ZFS file system where the existing mkfile lives
- Attached the zvol to the virtual disk server
- Attached the virtual disk to the working LDOM (and it shows up correctly in the LDOM under format)
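
For reference, the control-domain side was roughly along these lines (the zvol path, volume name, vds service name and domain name below are placeholders, not my exact names):

# zfs create -V 16g rpool/ldom1_root16g                                    <--- new, larger zvol
# ldm add-vdsdev /dev/zvol/dsk/rpool/ldom1_root16g root16g@primary-vds0    <--- export it through the virtual disk server
# ldm add-vdisk vdisk_root16g root16g@primary-vds0 myldom                  <--- attach it to the guest domain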

In the LDOM I've done the following (again, a rough sketch follows the list):

- Relabeled the new disk with a SMI label
- Touched up the partitioning so that s0 is what I'll be booting from
- Created a new zpool called rpool1_16g using the s0 partition of the new disk
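
Inside the LDOM, the relabel and pool creation went roughly like this (format is interactive, so this is only a sketch; c0d1 is the new virtual disk):

# format -e c0d1                      <--- label, choosing 0 (SMI) for the label type, then adjust the slices
# zpool create rpool1_16g c0d1s0      <--- pool on the s0 slice only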

I'm using lucreate to drive this process. When I run lucreate to put a bootable root onto the s0 partition of the new disk, I get the dreaded error:

ERROR: ZFS pool <rpool1_16g> does not support boot environments

My question is why ... OH WHY !!!???

This is while in the running LDOM:
zpool create rpool1_16g c0d1s0 <--- Using disk partition here ... not the entire disk
format c0d1 <<END
p
p
END
selecting c0d1
[disk formatted, no defect list found]
/dev/dsk/c0d1s0 is part of active ZFS pool rpool1_16g. Please see zpool(1M).

FORMAT MENU:
disk - select a disk
type - select (define) a disk type
...
Current partition table (original):
Total disk cylinders available: 453 + 2 (reserved cylinders)

Part      Tag    Flag     Cylinders       Size            Blocks
  0       root    wm       8 - 452       15.64GB    (445/0/0) 32808960
  1       swap    wu       4 -   7      144.00MB    (4/0/0)     294912
  2     backup    wu       0 - 452       15.93GB    (453/0/0) 33398784
  3 unassigned    wm       0              0         (0/0/0)          0
  4 unassigned    wm       0              0         (0/0/0)          0
  5 unassigned    wm       0              0         (0/0/0)          0
  6 unassigned    wm       0              0         (0/0/0)          0
  7 unassigned    wm       0              0         (0/0/0)          0

partition>
# zpool list
NAME          SIZE   USED  AVAIL  CAP  HEALTH  ALTROOT
rpool        7.94G  5.40G  2.54G   68%  ONLINE  -
rpool1_16g   15.6G   111K  15.6G    0%  ONLINE  -
# zpool status rpool1_16g
  pool: rpool1_16g
 state: ONLINE
 scrub: none requested
config:

        NAME        STATE     READ WRITE CKSUM
        rpool1_16g  ONLINE       0     0     0
          c0d1s0    ONLINE       0     0     0

errors: No known data errors
# lucreate -n S10u8zfs -p rpool1_16g
Analyzing system configuration.
ERROR: ZFS pool <rpool1_16g> does not support boot environments
