I'm trying to compress large amounts of data, sometimes in the 100 GB range, and when I run the routine I wrote, the output file appears to be exactly the same size as before. Has anyone else had this problem with GZipStream?

My code is as follows:
byte[] buffer = BitConverter.GetBytes(StreamSize);
FileStream LocalUnCompressedFS = File.OpenWrite(ldiFileName);
LocalUnCompressedFS.Write(buffer, 0, buffer.Length);
GZipStream LocalFS = new GZipStream(LocalUnCompressedFS, CompressionMode.Compress);
buffer = new byte[WriteBlock];
UInt64 WrittenBytes = 0;
while (WrittenBytes + WriteBlock < StreamSize)
{
    fromStream.Read(buffer, 0, (int)WriteBlock);
    LocalFS.Write(buffer, 0, (int)WriteBlock);
    WrittenBytes += WriteBlock;
    OnLDIFileProgress(WrittenBytes, StreamSize);
    if (Cancel)
        break;
}
if (!Cancel)
{
    double bytesleft = StreamSize - WrittenBytes;
    fromStream.Read(buffer, 0, (int)bytesleft);
    LocalFS.Write(buffer, 0, (int)bytesleft);
    WrittenBytes += (uint)bytesleft;
    OnLDIFileProgress(WrittenBytes, StreamSize);
}
LocalFS.Close();
fromStream.Close();
StreamSize is an 8-byte UInt64 value holding the size of the file. I write these 8 bytes raw to the start of the file so that I know the original file's size. WriteBlock has a value of 32 KB (32768 bytes). fromStream is the stream the data comes from, in this case a FileStream. Could the 8 bytes in front of the compressed data be causing the problem?
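For what it's worth, a raw 8-byte size prefix in front of the gzip data is harmless as long as the reader consumes those 8 bytes before handing the stream to GZipStream. A minimal round-trip sketch of that pattern (the payload and temp file here are illustrative, not from the original code):

```csharp
using System;
using System.IO;
using System.IO.Compression;

class PrefixDemo
{
    static void Main()
    {
        byte[] payload = new byte[100000]; // compressible demo data (all zeros)
        string path = Path.GetTempFileName();

        // Write: 8-byte raw length prefix, then gzip data on the same FileStream.
        using (var fs = File.Create(path))
        {
            fs.Write(BitConverter.GetBytes((ulong)payload.Length), 0, 8);
            using (var gz = new GZipStream(fs, CompressionMode.Compress))
                gz.Write(payload, 0, payload.Length);
        }

        // Read: consume the prefix first, then let GZipStream take over.
        using (var fs = File.OpenRead(path))
        {
            byte[] prefix = new byte[8];
            fs.Read(prefix, 0, 8);
            ulong originalSize = BitConverter.ToUInt64(prefix, 0);

            using (var gz = new GZipStream(fs, CompressionMode.Decompress))
            {
                byte[] restored = new byte[originalSize];
                int total = 0, n;
                // Read can return fewer bytes than requested, so loop until done.
                while (total < restored.Length &&
                       (n = gz.Read(restored, total, restored.Length - total)) > 0)
                    total += n;
                Console.WriteLine(total == payload.Length ? "ok" : "size mismatch");
            }
        }
        File.Delete(path);
    }
}
```

So the prefix itself is not the cause; the reader just has to skip it before decompressing.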
I tested compression with the following code and it ran without problems on both 7 GB and 12 GB files (both known in advance to compress "well"). Does this version work for you?
const string toCompress = @"input.file";
var buffer = new byte[1024*1024*64];
using(var compressing = new GZipStream(File.OpenWrite(@"output.gz"), CompressionMode.Compress))
using(var file = File.OpenRead(toCompress))
{
    int bytesRead;
    // Read returns the number of bytes actually read (0 at end of file),
    // so write exactly that many bytes each pass.
    while ((bytesRead = file.Read(buffer, 0, buffer.Length)) > 0)
    {
        compressing.Write(buffer, 0, bytesRead);
    }
}
Have you looked at the file?
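One way to check: a gzip stream begins with the magic bytes 0x1F 0x8B (RFC 1952), so inspecting the first bytes of the output file (after any raw size prefix you wrote yourself) tells you whether compression happened at all. A minimal sketch, using a temp file for illustration:

```csharp
using System;
using System.IO;
using System.IO.Compression;

class MagicCheck
{
    static void Main()
    {
        string path = Path.GetTempFileName();
        using (var gz = new GZipStream(File.Create(path), CompressionMode.Compress))
            gz.Write(new byte[1024], 0, 1024);

        byte[] head = new byte[2];
        using (var fs = File.OpenRead(path))
            fs.Read(head, 0, 2);

        // 0x1F 0x8B is the gzip magic number (RFC 1952).
        bool isGzip = head[0] == 0x1F && head[1] == 0x8B;
        Console.WriteLine(isGzip ? "gzip header found" : "not a gzip stream");
        File.Delete(path);
    }
}
```

If the magic bytes are missing right where the compressed data should start, the data was written uncompressed.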
The GZipStream class cannot decompress data that results in more than 8 GB of uncompressed data.

You may need to find a different library that supports your needs, or try breaking the data into chunks of <= 8 GB that can safely be "stitched" together.
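A sketch of that chunking idea, with sizes shrunk for illustration (in practice the chunk limit would be <= 8 GB; the SplitCompress helper and the file-naming scheme are hypothetical, not part of any library): each chunk becomes its own .gz file, and decompressing the pieces in order and concatenating the output reconstructs the original.

```csharp
using System;
using System.IO;
using System.IO.Compression;

class ChunkedCompress
{
    // Compress `input` into numbered .gz pieces, each covering at most
    // `chunkSize` bytes of uncompressed data. Returns the piece count.
    static int SplitCompress(string input, string outPrefix, long chunkSize)
    {
        var buffer = new byte[64 * 1024];
        int pieceIndex = 0;
        using (var src = File.OpenRead(input))
        {
            while (src.Position < src.Length)
            {
                long remaining = Math.Min(chunkSize, src.Length - src.Position);
                using (var gz = new GZipStream(
                    File.Create($"{outPrefix}.{pieceIndex}.gz"),
                    CompressionMode.Compress))
                {
                    while (remaining > 0)
                    {
                        int n = src.Read(buffer, 0,
                            (int)Math.Min(buffer.Length, remaining));
                        if (n == 0) break; // unexpected end of stream
                        gz.Write(buffer, 0, n);
                        remaining -= n;
                    }
                }
                pieceIndex++;
            }
        }
        return pieceIndex;
    }

    static void Main()
    {
        string input = Path.GetTempFileName();
        File.WriteAllBytes(input, new byte[250000]); // demo data
        // 100 KB chunks for the demo; a stand-in for the 8 GB limit.
        int pieces = SplitCompress(input, input, 100000);
        Console.WriteLine($"{pieces} pieces"); // 250 KB in 100 KB chunks -> 3
        File.Delete(input);
    }
}
```

To restore, decompress each piece in index order and append the plain output to one file.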