.net - Is it possible to stream a group of files to a browser as they're being compressed or have the client compress them?


Background info: I'm using the .NET Framework and MVC.

Here's the dilemma: I'm using a service to open a group of files (stored in SQL Server). There's a delay in how long it takes the service to open an entire file, and it's directly proportional to the size of the file. I'm taking the file and streaming it to the web browser from the web app. As you can imagine, this isn't scalable, since the browser times out on files over 500 MB (it takes too long before the streaming even starts). The solution we're using is called "chunking" the data: I take 64 KB pieces of data from the service and stream them to the browser right away.
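
For reference, a minimal sketch of what that chunked pass-through looks like in an MVC controller (simplified; the IFileService/GetChunk names are placeholders for the real service call):

using System.Web.Mvc;

public interface IFileService
{
    // Placeholder for the real service call: returns the next chunk of the
    // file (64 KB in our case), or an empty array once the file is fully read.
    byte[] GetChunk(string filename);
}

public class FileController : Controller
{
    private readonly IFileService service;

    public FileController(IFileService service)
    {
        this.service = service;
    }

    public void Download(string filename)
    {
        Response.ContentType = "application/octet-stream";
        Response.AddHeader("Content-Disposition", "attachment; filename=\"" + filename + "\"");
        Response.BufferOutput = false; // stream to the client instead of buffering the whole file

        byte[] chunk;
        while ((chunk = service.GetChunk(filename)).Length > 0)
        {
            Response.OutputStream.Write(chunk, 0, chunk.Length);
            Response.Flush(); // push each chunk out as soon as we have it
        }
    }
}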

This works great for a single file. However, I have a requirement that if there are multiple files, they need to be compressed into a single file. The problem is that compression would need to download all of the files in whole from the service before it could start streaming the compressed package. I think I know the answer to this question, but I'll ask anyway: is there a way to stream a group of files while they're being compressed? I highly doubt it, since the compression algorithm would need to be able to see the files in whole. Alternatively, is there a JavaScript package out there that might be able to capture the files individually (as they're streaming) and compress them once the streaming is done? I'd appreciate any advice on this!

There does seem to be a package for zipping on the client side, JSZip. Note that you'd also need Downloadify to create the file on the user's computer. It doesn't look cross-browser supported though, and the amount of data you'd be throwing around in JS on the client could cause issues.

Instead of sending a zip file, what about streaming a different archive format, such as a TAR file or an ISO file? These contain the files' meta-data alongside the file data.
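
As a rough illustration of the idea (this sketch uses the System.Formats.Tar APIs from newer .NET versions rather than the .NET Framework the question targets, and assumes the service can expose each file as a stream whose length is known up front, since the tar header records the size; openRead is a hypothetical stand-in for that call):

using System;
using System.Formats.Tar;
using System.IO;

public static class TarStreaming
{
    public static void WriteTar(Stream output, string[] filenames, Func<string, Stream> openRead)
    {
        using (var writer = new TarWriter(output, TarEntryFormat.Pax, leaveOpen: true))
        {
            foreach (var filename in filenames)
            {
                using (var fileStream = openRead(filename))
                {
                    // Each entry is written out as soon as its stream is available,
                    // so the archive can start reaching the client immediately.
                    var entry = new PaxTarEntry(TarEntryType.RegularFile, filename)
                    {
                        DataStream = fileStream
                    };
                    writer.WriteEntry(entry);
                }
            }
        }
    }
}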

Alternatively, borrow the solution used by the 7digital and Bleep record music stores: zip the files on the server to a temporary directory while presenting a page to the user. The page uses a piece of JS on the client side to poll the server until the whole file is ready to download, at which point the download can start as normal.
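
A rough sketch of that flow on the server side (ZipBuilder.BuildZip is a hypothetical helper that pulls the files from the service and zips them to the temporary path; the client-side JS just polls the Status action until ready is true, then navigates to Download):

using System;
using System.IO;
using System.Threading.Tasks;
using System.Web.Mvc;

public class ArchiveController : Controller
{
    [HttpPost]
    public ActionResult Start(string[] filenames)
    {
        string id = Guid.NewGuid().ToString("N");
        string zipPath = Path.Combine(Path.GetTempPath(), id + ".zip");

        // Hypothetical helper that fetches the files from the service and
        // writes the finished zip to zipPath; run in the background so this
        // request returns immediately.
        Task.Run(() => ZipBuilder.BuildZip(filenames, zipPath));

        return Json(new { id });
    }

    public ActionResult Status(string id)
    {
        // Real code would validate id and track "in progress" vs "done"
        // properly instead of relying on the file's existence.
        string zipPath = Path.Combine(Path.GetTempPath(), id + ".zip");
        return Json(new { ready = System.IO.File.Exists(zipPath) }, JsonRequestBehavior.AllowGet);
    }

    public ActionResult Download(string id)
    {
        string zipPath = Path.Combine(Path.GetTempPath(), id + ".zip");
        return File(zipPath, "application/zip", "files.zip");
    }
}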

Update:

I noticed that if you download a directory from the Dropbox website, the download starts without knowing the full file size - which indicates it starts sending data before it has finished creating the archive. A further read of the zip file format and the DEFLATE algorithm suggests you can start generating compressed data and streaming it to the client before you have the full file data from the service.

The following code is an untested and simplified example (using DotNetZip class names):

// Stream the zip to the client
using (var zipStream = new ZipOutputStream(Response.OutputStream))
{
    foreach (var filename in filenames)
    {
        // Write the file header
        ZipEntry entry = new ZipEntry(filename);
        zipStream.PutNextEntry(entry);

        // Write the file chunks
        byte[] chunk;
        while ((chunk = service.GetChunk(filename)).Length > 0)
        {
            zipStream.Write(chunk, 0, chunk.Length);
        }
    }

    // Write the zip file directory to complete the file
    zipStream.Finish();
}
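
One caveat if this is hosted in ASP.NET (an assumption on my part): the response is buffered by default, so for the archive to actually reach the client incrementally you'd want to leave it unbuffered and without a Content-Length, which matches the Dropbox behaviour noted above - something along these lines:

public void DownloadAll(string[] filenames)
{
    Response.ContentType = "application/zip";
    Response.AddHeader("Content-Disposition", "attachment; filename=\"files.zip\"");
    Response.BufferOutput = false; // stream as the archive is generated; no Content-Length is sent

    // ... the ZipOutputStream loop from the example above goes here ...
}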

If you want the files to be compressed further (which may be the case if you give the compressor larger blocks), but still want data streaming as soon as possible, and you know that data arrives from the service faster than your application can send it to the client, you could implement some sort of exponential buffer within the foreach loop:

int chunksPerWrite = 1; // better if defined outside of the foreach loop
byte[] chunk;
var chunks = new List<byte[]>();
while ((chunk = service.GetChunk(filename)).Length > 0)
{
    chunks.Add(chunk);

    if (chunks.Count >= chunksPerWrite)
    {
        // Combine the chunks into one array - copying logic not included
        byte[] megaChunk = CombineAllChunks(chunks);
        zipStream.Write(megaChunk, 0, megaChunk.Length);
        chunks.Clear(); // don't resend chunks that have already been written
        chunksPerWrite *= 2; // or chunksPerWrite++ for linear growth
    }
}

// Cut for brevity - combine the remaining chunks and write them to zipStream.
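
For completeness, the omitted combining logic might look something like this (CombineAllChunks is just the placeholder name used above):

using System;
using System.Collections.Generic;

static byte[] CombineAllChunks(List<byte[]> chunks)
{
    // Work out the total size, then copy each chunk into one contiguous array.
    int totalLength = 0;
    foreach (var chunk in chunks)
        totalLength += chunk.Length;

    var combined = new byte[totalLength];
    int offset = 0;
    foreach (var chunk in chunks)
    {
        Buffer.BlockCopy(chunk, 0, combined, offset, chunk.Length);
        offset += chunk.Length;
    }
    return combined;
}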

My reading of the zip specification suggests there's a limit to how much data can be compressed in a single go, but I can't work out what that limit is (it might depend on the data?). I'd be interested to hear from anyone who knows the spec better...

If you find you need to roll your own for some reason, zip files also have a plain storage mechanism with no compression engine involved, which makes them much easier to write if you're not concerned about bandwidth.
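
If bandwidth really isn't a concern, here's a rough sketch of writing the entries uncompressed with the built-in System.IO.Compression types (available from .NET Framework 4.5; getChunk stands in for the chunked service call described in the question):

using System;
using System.IO;
using System.IO.Compression;

static void WriteStoredZip(Stream output, string[] filenames, Func<string, byte[]> getChunk)
{
    // ZipArchiveMode.Create works on a non-seekable output stream, so this can
    // write straight to the response as data arrives from the service.
    using (var archive = new ZipArchive(output, ZipArchiveMode.Create, leaveOpen: true))
    {
        foreach (var filename in filenames)
        {
            // NoCompression: the file bytes are written without being compressed.
            var entry = archive.CreateEntry(filename, CompressionLevel.NoCompression);
            using (var entryStream = entry.Open())
            {
                byte[] chunk;
                while ((chunk = getChunk(filename)).Length > 0)
                {
                    entryStream.Write(chunk, 0, chunk.Length);
                }
            }
        }
    }
}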

