invalid compression level error during restore

Post here if you encounter an unexpected error when using the software. For installation problems, use another forum.

Moderator: feffer

invalid compression level error during restore

Postby bcalder » Sun Oct 24, 2004 6:55 pm

Hello everyone,
I am using partimage v0.6.4 on Debian sarge/kernel 2.4.24. Using
the curses GUI, I saved a FAT32 partition image to my backup
hard drive in gzip format. The image name was
hda7.partimage.000. I told partimage to quit after the
image was successfully created, which it did.

When it came time to restore it, using either the GUI
or the command line, I get this error:
"invalid compression level for
<path-to-image>/hda7.partimage.000", and the same when I run
"partimage imginfo <path-to-image>/hda7.partimage.000".

Are there any options to verify the validity of the
image and/or to restore all or part of it?
tar and gunzip don't seem to like the suffix when I try to use them to restore.
bcalder
 

Postby Viktor » Tue Dec 28, 2004 1:37 pm

gzip -t <filename>
But I had the same problem with gzip, although the image seems to be correct (I did not wait for gzip to finish, but there were no errors). I did not find any solution, so I switched to bzip2.
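A minimal sketch of that integrity check, using a throwaway file (all filenames here are examples, not the original image names):

```shell
# gzip -t exits 0 only when the compressed stream is intact.
echo "test data" > payload
gzip -f payload                          # produces payload.gz
gzip -t payload.gz && echo "archive OK"

# A truncated copy fails the same test:
head -c 5 payload.gz > broken.gz
gzip -t broken.gz 2>/dev/null || echo "archive corrupt"
```

Note that gzip -t only validates the gzip stream itself; partimage does its own header checks on top, so a stream that passes here can still trigger partimage's error.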
Viktor
 

fixing the "compression level" bug - workaround

Postby jon » Mon Aug 15, 2005 6:15 am

Uncompress the image and restore from the uncompressed file:

e.g. if you were restoring from myimage.gz, then do:
gunzip myimage.gz
which will create a file called myimage

and then in partimage restore from:
myimage

Works for me.
I think it's an early kernel/gzip problem with files over 2 GB in size; in any case it's very annoying.
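A runnable sketch of this workaround with a stand-in file (the image name and device below are examples; the partimage line is commented out since it needs a real partition):

```shell
# Create a stand-in for a compressed partimage file:
echo "fake image data" | gzip > myimage.gz

# Decompress in place; this removes myimage.gz and leaves myimage.
# It temporarily needs enough free disk space for the full image.
gunzip myimage.gz
ls -l myimage

# Then restore from the uncompressed file, e.g.:
#   partimage restore /dev/hda7 myimage
```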

jon
 

still getting this stupid error

Postby jon » Tue Nov 01, 2005 12:00 pm

Still getting 'invalid compression level', though admittedly I have only tried with gzip.
It's a bug, no doubt about it, and unfortunately uncompressing the image file to use it is not always an option (I'm juggling partitions at the moment because of this bug).

Does anybody have any ideas about what is causing it?

jon
 

Postby Guest » Tue Nov 08, 2005 10:09 am

Hi.
I found a "solution" (which works for me).
Instead of creating a single file, which can reach 3-5 GB (and caused the same error when I tried to restore), I chose to create several files of 650-700 MB each. Problem solved.
My case:
I was trying to back up a 20 GB NTFS partition with 5 GB used. Many times I chose to create the image as a single file, which was about 3 GB. (Shame on me! I never thought to split the image file.) Every time I tried to restore, it responded with "invalid compression level" and the image was unusable (after imaging, I never tried to split that file; each time I just tried to restore from that big 3 GB file).
After a few weeks (can you imagine? It took me three weeks to think of trying the other options :( :idea: ) I tried again: this time I chose to split the image, and it works just fine. No more error message, the image is OK, and I can restore from it. I used gzip.
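The splitting idea can be sketched as follows. The partimage flags in the comment (`-z1` for gzip, `-V` for volume size in MB) are from my reading of the 0.6.x help text and should be checked against `partimage --help` on your build; the runnable part below just demonstrates with `split` that fixed-size volumes rejoin losslessly:

```shell
# Hypothetical save command producing ~650 MB gzip volumes
# (hda7.img.000, hda7.img.001, ...):
#   partimage -z1 -V650 save /dev/hda7 hda7.img

# The same split/rejoin idea with standard tools:
printf 'x%.0s' $(seq 1 3000) > bigfile   # small stand-in for an image
split -b 1000 -d bigfile part.           # part.00, part.01, part.02
cat part.* > rejoined                    # volumes concatenate losslessly
cmp bigfile rejoined && echo "volumes rejoin OK"
```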
Guest
 

Postby Guest » Wed Dec 28, 2005 8:32 pm

Here is my solution to the 'invalid compression level' error. Make sure your image file has the .gz extension at the end. After I renamed my image to image.gz (or whatever), it worked fine.
Guest
 

Postby Guest » Thu Jan 19, 2006 10:25 pm

I've experienced the same trouble. I'm still asking because nobody has
paid attention to something so important. Maybe it is better to spend some money
on commercial backup software? :(
Guest
 

STILL getting invalid compression level >2 GB

Postby jonster » Wed Mar 08, 2006 1:44 am

I am STILL getting "invalid compression level" if the file is over 2 GB.

This is with 0.6.4, with both client and server compiled on a Fedora Core 3 machine with updated libs and a 2.6.12 kernel.

Can this PLEASE be fixed somehow?

Cheers
Jon
jonster
 
Posts: 4
Joined: Wed Mar 08, 2006 1:32 am

Fixed!

Postby Xavier » Wed Jun 07, 2006 2:14 am

I was getting the same error message as in the subject of this post. I made the original backups using Knoppix 3.9 via partimage 3.4. My backup files were both > 2 GB, which partimage seemed not to like. To solve it, I did:

gunzip -c imagefile | partimage restore partition stdin


I have no idea why partimage would create a file it can't understand, but that's the wonder of stuff you get for free ;-)
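A runnable sketch of this streaming approach with a stand-in file (names are examples; the actual partimage invocation is commented out since it needs a real partition):

```shell
# Stand-in compressed image:
echo "image payload" | gzip > image.000.gz

# gunzip -c decompresses to stdout without touching the input file,
# so no temporary uncompressed copy is needed on disk:
gunzip -c image.000.gz > restored
cat restored

# The real restore pipes that stream into partimage's stdin, e.g.:
#   gunzip -c image.000.gz | partimage restore /dev/hda7 stdin
```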
Xavier
 
Posts: 1
Joined: Wed Jun 07, 2006 2:03 am

Postby Hazmat » Thu Jul 20, 2006 5:05 pm

Just got this error myself and came looking for information.

This appears to occur when the compressed image file is larger than 2 GB.

This is most certainly a bug in partimage. My guess is that something on the decompression side of the code is not using large file support (e.g. using 32-bit integers or fstat instead of 64-bit integers and/or the large-file fstat64).

Probably a relatively easy fix, where are the devs?
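A quick way to probe the large-file angle, assuming a Linux shell with getconf (the compile flag in the comment is the standard glibc mechanism, not something confirmed about partimage's build):

```shell
# FILESIZEBITS reports how many bits the filesystem uses for file
# sizes; 64 means files over 2 GB are representable at the FS level:
getconf FILESIZEBITS .

# The program must also be built with 64-bit offsets, e.g. compiled
# with -D_FILE_OFFSET_BITS=64; otherwise calls such as stat() fail
# with EOVERFLOW on files of 2 GiB or larger.
```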
Hazmat
 
Posts: 1
Joined: Thu Jul 20, 2006 4:59 pm

Postby jorge_angel_cg » Tue Jul 25, 2006 7:06 am

This appears to occur when the compressed image file is larger than 2 GB and is stored on an NTFS filesystem.

I have a 2.9 GB image of an NTFS filesystem, saved on an NTFS filesystem:

# partimage imginfo image.gz
I get a segmentation fault. :cry:
But if I copy this image to a FAT32 filesystem:
# partimage imginfo image.gz
I get the correct information. :P


Knoppix 2.6.17
partimage 0.6.4
jorge_angel_cg
 
Posts: 4
Joined: Tue Jul 18, 2006 1:02 pm

Postby alister » Tue Sep 05, 2006 7:35 am

Well, let's see: if we want the devs to fix this, we have to make an accurate bug report.

Some people have the image on an NTFS partition, and others, like me, do not. In both cases partimage can segfault, so that is not a prerequisite for the bug to appear.

Some people have the image with a .gz at the end, and others, like me, do not. In both cases partimage can segfault, so that is not a prerequisite for the bug to appear.

Some people have images smaller than 2 GB, and others, like me, have images of 2 GB or more. I'm not so sure whether partimage can segfault with <2 GB images.

Is everybody SURE that partimage does not segfault with smaller images?

Since it seems to be a problem in partimage's internal decompression handling, I think the ONLY way to work around this with 100% success is to use gunzip and a pipe as proposed above, or to gunzip the image beforehand (which takes a lot of hard disk space). Other solutions didn't work for everybody.

Dear François and Franck, say something please, we are awaiting your words :)
alister
 
Posts: 3
Joined: Tue Sep 05, 2006 6:52 am

SOLVED

Postby nadavkav » Sun Feb 11, 2007 6:39 pm

It worked for me :-)

I renamed imagefile.000 to imagefile.000.gz and ran:
gunzip imagefile.000.gz

This produced the uncompressed imagefile.000 (~8 GB; three files in all: .000, .001, .002)
and deleted the .gz files.

Later, issuing the restore command went smoothly :-)

Thanks for the help!
nadavkav
 
Posts: 1
Joined: Sun Feb 11, 2007 6:33 pm

Re: Fixed!

Postby thawn » Fri Nov 02, 2007 11:13 am

I had the same problem, but renaming the image file from image.000 to image.000.gz and following Xavier's instructions helped:

Xavier wrote: I was getting the same error message as in the subject of this post. I made the original backups using Knoppix 3.9 via partimage 3.4. My backup files were both > 2 GB, which partimage seemed not to like. To solve it, I did:

gunzip -c imagefile | partimage restore partition stdin

I have no idea why partimage would create a file it can't understand, but that's the wonder of stuff you get for free ;-)

I wholeheartedly agree that a program mostly used for backups should not create files it cannot read on its own!

I'll update to a later version and check again, once the already running restore is finished...

Some info on my system:
partimage version: 0.6.4-r3
kernel:2.6.20-gentoo-r8
filesystem: ext3 (both target and the filesystem the image was stored in)
thawn
 
Posts: 1
Joined: Fri Nov 02, 2007 10:55 am

Postby jubenpa » Tue Jan 08, 2008 2:52 pm

When the image file has been created in NTFS filesystem, the partition that contains the images must be in NTFS or FAT filesystem too.
That works for me.
jubenpa
 
Posts: 1
Joined: Tue Jan 08, 2008 2:43 pm
