Resizing a Sparse Bundle Fails in 10.5.5?
Posted November 13th, 2008 @ 11:51pm by Erik J. Barzeski
Having learned that the Drobo slows down only when it hits 95% capacity, not 90%, I decided to up the maximum size of my backup drive slightly (I originally sized it at 1225 GB; 1.33 TB * 1024 GB/TB * 0.95 ≈ 1293, so I settled on 1275 GB to be safe). I figured it would also be a good test should I someday increase the storage space of my Drobo and want to expand my Time Machine disk's size. I used the resize command:
% hdiutil resize -size 1275g Bunny_0017f202b9ec.sparsebundle
Should work, right? Oops:
hdiutil: resize failed - error -5341
FAIL! The image was created based on the ReadMe left by Time Tamer:
% hdiutil create -size 1225g -fs HFS+J -volname "Time Machine Bunny Backup" /Volumes/Mimzy/Bunny_0017f202b9ec.sparsebundle
Disk Utility also offers the ability to resize volumes, but it fails as well. For some users it won't report an error, but the disk will simply remain the old size.
FAIL!
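As a quick sanity check on the 95% arithmetic in the footnote above, the same calculation can be done with a shell one-liner (numbers are the ones from this post):

```shell
# 1.33 TB of usable Drobo space, expressed in GB, times the 95%
# slowdown threshold: the largest "safe" image size before the
# Drobo bogs down. printf "%d" truncates the fraction.
awk 'BEGIN { printf "%d\n", 1.33 * 1024 * 0.95 }'   # -> 1293
```

Hence sizing the image at 1275 GB leaves a little headroom under that 1293 GB ceiling.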
To further test this, I created a 10 MB image and attempted to increase the size to 20 MB. Same results. In Disk Utility, it didn't even give me the option to size upwards - only down.
Here's the verbose output from another attempt I made to go to 1250 GB:
% hdiutil resize -size 1250g Bunny_0017f202b9ec.sparsebundle -verbose
DIBackingStoreInstantiatorProbe: interface 0, score 100, CBSDBackingStore
DIBackingStoreInstantiatorProbe: interface 1, score 1000, CBundleBackingStore
DIBackingStoreInstantiatorProbe: interface 2, score -1000, CRAMBackingStore
DIBackingStoreInstantiatorProbe: interface 3, score -1000, CCarbonBackingStore
DIBackingStoreInstantiatorProbe: interface 4, score -1000, CDevBackingStore
DIBackingStoreInstantiatorProbe: interface 5, score -1000, CCURLBackingStore
DIBackingStoreInstantiatorProbe: interface 6, score -1000, CVectoredBackingStore
DIBackingStoreInstantiatorProbe: selecting CBundleBackingStore
DIBackingStoreInstantiatorProbe: interface 0, score 100, CBSDBackingStore
DIBackingStoreInstantiatorProbe: interface 1, score -1000, CBundleBackingStore
DIBackingStoreInstantiatorProbe: interface 2, score -1000, CRAMBackingStore
DIBackingStoreInstantiatorProbe: interface 3, score 100, CCarbonBackingStore
DIBackingStoreInstantiatorProbe: interface 4, score -1000, CDevBackingStore
DIBackingStoreInstantiatorProbe: interface 5, score -1000, CCURLBackingStore
DIBackingStoreInstantiatorProbe: interface 6, score -1000, CVectoredBackingStore
DIBackingStoreInstantiatorProbe: selecting CBSDBackingStore
DIFileEncodingInstantiatorProbe: interface 0, score -1000, CMacBinaryEncoding
DIFileEncodingInstantiatorProbe: interface 1, score -1000, CAppleSingleEncoding
DIFileEncodingInstantiatorProbe: interface 2, score -1000, CEncryptedEncoding
DIFileEncodingInstantiatorProbe: nothing to select.
DIFileEncodingInstantiatorProbe: interface 0, score -1000, CMacBinaryEncoding
DIFileEncodingInstantiatorProbe: interface 1, score -1000, CAppleSingleEncoding
DIFileEncodingInstantiatorProbe: interface 2, score -1000, CEncryptedEncoding
DIFileEncodingInstantiatorProbe: nothing to select.
DIFileEncodingInstantiatorProbe: interface 0, score -1000, CUDIFEncoding
DIFileEncodingInstantiatorProbe: nothing to select.
DIFileEncodingInstantiatorProbe: interface 0, score -1000, CSegmentedNDIFEncoding
DIFileEncodingInstantiatorProbe: interface 1, score -1000, CSegmentedUDIFEncoding
DIFileEncodingInstantiatorProbe: interface 2, score -1000, CSegmentedUDIFRawEncoding
DIFileEncodingInstantiatorProbe: nothing to select.
DIDiskImageInstantiatorProbe: interface 0, score 0, CDARTDiskImage
DIDiskImageInstantiatorProbe: interface 1, score 0, CDiskCopy42DiskImage
DIDiskImageInstantiatorProbe: interface 2, score 0, CNDIFDiskImage
DIDiskImageInstantiatorProbe: interface 3, score -1000, CUDIFDiskImage
DIDiskImageInstantiatorProbe: interface 5, score 0, CRawDiskImage
DIDiskImageInstantiatorProbe: interface 6, score -100, CShadowedDiskImage
DIDiskImageInstantiatorProbe: interface 7, score -100, CSparseDiskImage
DIDiskImageInstantiatorProbe: interface 8, score 1000, CSparseBundleDiskImage
DIDiskImageInstantiatorProbe: interface 9, score -1000, CCFPlugInDiskImage
DIDiskImageInstantiatorProbe: interface 10, score -100, CWrappedDiskImage
DIDiskImageInstantiatorProbe: selecting CSparseBundleDiskImage
DIDiskImageNewWithBackingStore: CSparseBundleDiskImage
DIDiskImageNewWithBackingStore: instantiator returned 0
hdiutil: resize failed - error -5341
The "imageinfo" option on hdiutil (% hdiutil imageinfo Bunny_0017f202b9ec.sparsebundle) gives me this:
Format: UDSB
Backing Store Information:
    Name: Bunny_0017f202b9ec.sparsebundle
    URL: file://localhost/Volumes/Mimzy/Bunny_0017f202b9ec.sparsebundle/
    Class Name: CBundleBackingStore
Format Description: sparse
Checksum Type: none
partitions:
    partition-scheme: GUID
    block-size: 512
    burnable: false
    partitions:
        0:
            partition-length: 1
            partition-synthesized: true
            partition-hint: MBR
            partition-name: Protective Master Boot Record
            partition-start: 0
        1:
            partition-length: 1
            partition-synthesized: true
            partition-hint: Primary GPT Header
            partition-name: GPT Header
            partition-start: 1
        2:
            partition-length: 32
            partition-synthesized: true
            partition-hint: Primary GPT Table
            partition-name: GPT Partition Data
            partition-start: 2
        3:
            partition-length: 6
            partition-synthesized: true
            partition-hint: Apple_Free
            partition-name:
            partition-start: 34
        4:
            partition-length: 409600
            partition-name: EFI System Partition
            partition-number: 1
            partition-hint-UUID: C12A7328-F81F-11D2-BA4B-00A0C93EC93B
            partition-UUID: AD2E4395-91AE-4668-A439-3855995A2522
            partition-hint: C12A7328-F81F-11D2-BA4B-00A0C93EC93B
            partition-start: 40
            partition-filesystems:
                FAT32: EFI
        5:
            partition-length: 2568339376
            partition-name: disk image
            partition-number: 2
            partition-hint-UUID: 48465300-0000-11AA-AA11-00306543ECAC
            partition-UUID: ECA15023-F8AC-435E-BEF1-17F79D54DAF0
            partition-hint: Apple_HFS
            partition-start: 409640
            partition-filesystems:
                HFS+:
        6:
            partition-length: 105939031
            partition-synthesized: true
            partition-hint: Apple_Free
            partition-name:
            partition-start: 2568749016
        7:
            partition-length: 32
            partition-synthesized: true
            partition-hint: Backup GPT Table
            partition-name: GPT Partition Data
            partition-start: 2674688047
        8:
            partition-length: 1
            partition-synthesized: true
            partition-hint: Backup GPT Header
            partition-name: GPT Header
            partition-start: 2674688079
Properties:
    Partitioned: false
    Software License Agreement: false
    Compressed: no
    Kernel Compatible: false
    Encrypted: false
    Checksummed: false
    Checksum Value:
Size Information:
    Total Bytes: 1369440296960
    Compressed Bytes: 1369440296960
    Total Non-Empty Bytes: 0
    Sector Count: 2674688080
    Total Empty Bytes: 1369440296960
    Compressed Ratio: 1
Class Name: CSparseBundleDiskImage
Segments:
    0: /Volumes/Mimzy/Bunny_0017f202b9ec.sparsebundle
Resize limits (per hdiutil resize -limits): 1901729024 2568339376 34359738368
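If I'm reading them right, the three "Resize limits" numbers (min, current, max) are 512-byte sector counts. On that assumption, the current limit works out to the original 1225 GB creation size, and the maximum to a 16 TB ceiling well above my 1275 GB target, which makes the refusal to grow the image look even more like a bug. A quick check (the sector-size interpretation is mine, not anything hdiutil documents in its output):

```shell
# Current and maximum resize limits from the imageinfo dump,
# interpreted as 512-byte sectors (an assumption on my part).
cur=2568339376
max=34359738368
echo $(( cur * 512 / 1024 / 1024 / 1024 ))          # current limit in GiB (~1225)
echo $(( max * 512 / 1024 / 1024 / 1024 / 1024 ))   # max limit in TiB
```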
Any ideas?
Posted 14 Nov 2008 at 2:19pm #
I can confirm this. I can reduce the size of a sparse bundle disk image, but I can't increase it. 🙁
Time to file a bug report.
Posted 10 Mar 2009 at 12:07pm #
How can I reduce the size of my Time Capsule sparsebundle image to 250g so that it would leave me space on my Time Capsule's 500 GB hard drive for other personal files? My sparsebundle image is 200 GB now, and I tried to restore it via Disk Utility into a new "test.sparsebundle"... but it always fails. The system tells me that there is not enough free space. How come?
Can somebody help me please.
I followed those two webpages, unsuccessfully:
http://www.macosxhints.com/article.php?story=20071108020121567
http://discussions.apple.com/thread.jspa?threadID=1635356
Thanks
Posted 14 Nov 2008 at 2:35pm #
What happens if you try the resize on a local disk rather than on the Drobo?
Posted 14 Nov 2008 at 2:41pm #
[quote comment="50775"]What happens if you try the resize on a local disk rather than on the Drobo?[/quote]
The tests with the smaller (i.e. under 1.2 TB) images were done on a local SATA disk with plenty (650+ GB) of free space. I could have been clearer in saying this above.
Posted 14 Nov 2008 at 6:48pm #
It's surprising to learn about hdiutil's behavior -- only allowing reducing a sparsebundle's size, not increasing it. It certainly seems like a bug.
I would suggest setting a large size in the first place -- kind of "set it and forget it." The sparsebundle doesn't consume much space and will grow up to its maximum size. Unlike creating a partition, a sparsebundle does not allocate space when it is created.
A Time Tamer-created 298g (max size) sparsebundle takes 155 MB.
Cutting and pasting the hdiutil command from Time Tamer's README to create a 2024g (max size) sparsebundle takes 434.6 MB.
To me, it's diminishing returns to worry about saving a few hundred MB on a huge disk.
Posted 14 Nov 2008 at 9:38pm #
[quote comment="50780"]I would suggest setting a large size in the first place - kind of "set it and forget it."[/quote]
With all due respect, that misses the point. If I expand my Drobo, I may want to increase the size of the disk image.
Posted 15 Nov 2008 at 11:32pm #
[quote comment="50781"]With all due respect, that misses the point. If I expand my Drobo, I may want to increase the size of the disk image.[/quote]
Perhaps I don't understand your objection. You could create a sparsebundle that will grow up to Drobo's maximum of 16TB (many years away until disk drives get considerably larger). As you expand Drobo by adding or upsizing drives, the available room for Time Machine increases. This is an extreme example to illustrate the point -- or is it?
The "cost" of this approach is the additional size of the initial sparsebundle. It seems to me that you can make the tradeoffs that best fit your needs.
This is a pragmatic approach to a complex situation. Are you aware of why Apple's hdiutil doesn't let you increase the size of a sparse bundle?
Perhaps you have another concern that I don't clearly understand?
Posted 16 Nov 2008 at 8:05am #
[quote comment="50807"]You could create a sparsebundle that will grow up to Drobo's maximum of 16TB (many years away until disk drives get considerably larger).[/quote]
That's where you seem to misunderstand. Why would I do that when the Drobo's going to fail (basically) when it hits 95% full? Time Machine will think it has 14.x TB of free space and won't do any pruning. You can't even manually prune. As soon as you hit the capacity that actually fills the drive (~1.3 TB), you're hosed.
[quote comment="50807"]This a pragmatic approach to a complex situation. Are you aware of why Apple's hdiutil doesn't let you increase the size of a sparse bundle?[/quote]
I believe it doesn't because it's a bug that's been fixed in a later version of Mac OS X.
Your solution is hardly what I would call "pragmatic," and I'm not sure where we're missing the communication juncture here. The goal is to create a sparsebundle sized under 95% and then to increase the size of that sparsebundle as your Drobo grows in size. Creating a 16 TB disk image makes no sense at all.
Posted 16 Nov 2008 at 5:05pm #
[quote comment="50807"]Your solution is hardly what I would call "pragmatic," and I'm not sure where we're missing the communication juncture here. The goal is to create a sparsebundle sized under 95% and then to increase the size of that sparsebundle as your Drobo grows in size. Creating a 16 TB disk image makes no sense at all.[/quote]
If Apple has fixed the bug, that should solve the problem and give you the control you want.
Both Time Machine and Drobo have properties that when used together create a difficult situation for users. The Time Tamer script provides a way out -- except for the hdiutil bug reported on above. If that gets fixed in a Leopard update, or Snow Leopard, then this situation will resolve itself over time.
Posted 16 Nov 2008 at 9:04pm #
[quote comment="50812"]If Apple has fixed the bug, that should solve the problem and give you the conrol you want.[/quote]
Yeah, when we get the bug fix. Could be six months from now. 😛
Posted 16 Nov 2008 at 10:39pm #
Related topic, Erik: what are your thoughts about other backup solutions? rsync? SuperDuper!? Amazon S3?
Posted 17 Nov 2008 at 7:32am #
[quote comment="50817"]Related topic, Erik: what are your thoughts about other backup solutions? rsync? SuperDuper!? Amazon S3?[/quote]
I clone my main drive with psync nightly to a bootable backup. Beyond answering your question that briefly, I'd prefer to keep this post and the comments about resizing a sparse bundle failing in 10.5.5.
Posted 29 Nov 2008 at 7:39pm #
I came across this thread searching for a general sparsebundle upsize tip, and didn't bother trying the hdiutil resize after reading the comments etc. So I'm currently trying a workaround - I used hdiutil to create a second, larger sparsebundle, and right now SuperDuper 2.5 is happily cloning the smaller contents to the larger disk. SuperDuper didn't allow me to select the disk images, but once they were both mounted in the Finder, I could select the source and target volumes. Crossing my fingers - this looks like it will take a bit since I have backups in the source dating back to March.
-- Marc
Posted 02 Dec 2008 at 2:38pm #
[...] this problem and the many others I've had with my Drobo1, about a week ago I shuffled things around in order to [...]
Posted 24 Feb 2009 at 1:37pm #
This post is a bit old, but I'm curious if anyone's found a solution to upsize a sparsebundle.
Seems a bit dumb that you can't upsize easily.
Posted 24 Feb 2009 at 3:00pm #
After posting here and a bit more fiddling, thanks to this post (http://sadilek.blogspot.com/2008/06/resizing-sparse-bundle-image-for-time.html), this is what worked for me:
1. Open Finder and mount the backup volume (DON'T open the sparse image itself).
2. In terminal enter:
hdiutil resize -size 170g /Volumes/VolumeNameHere/SparseImageNameHere (I used 170g for 170 GB; you can put whatever you want, just make sure it's smaller than your actual volume)
3. Hit enter and grab some coffee. (This only took about 5 min for me, going from 80 GB to 170 GB.)
4. You can mount the sparse image and check disk utility to verify the new size.
Hope this helps someone!
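The steps above, gathered into a small sketch. The volume and image names are placeholders, and since hdiutil only exists on OS X, the DRY_RUN guard just prints each command instead of running it -- set DRY_RUN=0 on a real Mac:

```shell
#!/bin/sh
# Sketch of the upsize recipe above. BUNDLE and the 170g target size
# are placeholders -- substitute your own path and size.
BUNDLE="/Volumes/VolumeNameHere/SparseImageNameHere.sparsebundle"
DRY_RUN=${DRY_RUN:-1}   # default to printing; set DRY_RUN=0 to execute

run() {
  if [ "$DRY_RUN" = "1" ]; then echo "$@"; else "$@"; fi
}

# Steps 2-3: resize the (not-yet-mounted) sparse bundle in place.
run hdiutil resize -size 170g "$BUNDLE"
# Step 4: mount it again so Disk Utility can confirm the new size.
run hdiutil attach "$BUNDLE"
```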
Posted 01 Oct 2009 at 6:25pm #
I've tried to use hdiutil resize -size 250g /Volumes/VolumeNameHere/SparseImageNameHere
but I get back:
hdiutil: resize: failed. Resource temporarily unavailable (35)
Any ideas? 😕
Posted 11 Mar 2009 at 2:12am #
I spent nearly 7 hours trying to work this out on the weekend - and I finally cracked it!
It's all about the partition layouts...
http://isnot.tv/text/368/using-hditul-to-resize-disk-images/
Posted 09 Apr 2009 at 1:35am #
Hi Shai/Aaron,
The resize using hdiutil command worked for me.
Thanks
Posted 02 Jul 2009 at 12:59am #
Ok, we're up to 10.5.7 in OS X. Does resizing now work, or is this still a problem?
Posted 29 Sep 2009 at 12:09pm #
As of 10.5.7, Disk Utility resize still does not work, but the hdiutil resize command does.
Posted 01 Oct 2009 at 7:33am #
I've been using the com.apple.TimeMachine.plist MaxSize attribute. I'm not 100% sure that it works, but I think it might be. It appears to me that it is measured in GB (contrary to some sources on the internet that guess it is in KB or MB). It's a more convenient way to go if it works; it would be good if some other people gave it a shot.
Posted 22 Nov 2010 at 12:51am #
[quote comment="55620"]I've tried to use hdiutil resize -size 250g /Volumes/VolumeNameHere/SparseImageNameHere
but I get back:
hdiutil: resize: failed. Resource temporarily unavailable (35)
Any ideas? 😕[/quote]
A year later, but maybe this will help someone else.
I got the above error because I still had Disk Utility running with the sparsebundle selected. Quitting Disk Utility allowed me to do the resize via hdiutil.
Posted 17 Aug 2011 at 10:09am #
Almost another year later. I encountered the "Resource temporarily unavailable" message when doing 'hdiutil convert' on a disk image. I couldn't figure it out because I was calling 'diskutil unmount' prior to the convert. Turns out I needed to call 'diskutil eject' instead. Conversion is now successful.
Posted 19 Feb 2013 at 11:47am #
Hello,
I tried so many times, and each time it failed... until I tried this:
I had a 2 TB sparsebundle image, and I wanted to limit its size to 600 GB. When I did it in one step, it failed, but if I shrank it in two steps, it was OK!
=> First, I resized to 1 TB
=> Second, I resized to 600 GB
It works!
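For what it's worth, the two-step shrink described above would look something like this. The bundle path is a placeholder, and the DRY_RUN guard prints rather than runs the commands, since hdiutil is OS X-only (set DRY_RUN=0 on a real Mac):

```shell
#!/bin/sh
# Two-step shrink of an oversized sparse bundle (2 TB -> 600 GB),
# per the comment above. BUNDLE is a placeholder path.
BUNDLE="/Volumes/TimeCapsule/backup.sparsebundle"
DRY_RUN=${DRY_RUN:-1}   # default to printing; set DRY_RUN=0 to execute

run() {
  if [ "$DRY_RUN" = "1" ]; then echo "$@"; else "$@"; fi
}

run hdiutil resize -size 1t "$BUNDLE"    # first step: 2 TB -> 1 TB
run hdiutil resize -size 600g "$BUNDLE"  # second step: 1 TB -> 600 GB
```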