Support may ask local admins to provide large files, such as tcpdump output or a sizeable log file. Depending on the file's size, we can either generate a compressed archive or split the file into multiple smaller files.
I have published another article on generating compressed archives, so if the compressed archive is not too large you can share it with support directly; otherwise you can always split the archive as well.
This article demonstrates how to split and combine files from a Linux terminal using a few simple command-line utilities.
- Have a look at the size of the target file: du -h /path-to-filename
- Split the file into blocks of a predefined size with the split utility: split -b 100M /path-to-file file-initials. (the last argument, the output prefix, is optional; the -b flag is also optional, the default being 1000 lines per piece)
- Concatenate the pieces and redirect the result into a new file, yes, it's that simple: cat file-initials.?? > new-file-name.extension (or cat x* if you used the default prefix)
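The steps above can be sketched end to end as follows; the file name bigfile.log and the prefix part. are placeholders, and the sample file is generated just for the demonstration:

```shell
# Create a ~5 MiB sample file to stand in for the large log.
head -c 5242880 /dev/urandom > bigfile.log

# Check its size.
du -h bigfile.log

# Split into 1 MiB pieces named part.aa, part.ab, ...
split -b 1M bigfile.log part.

# Reassemble: cat the pieces (the shell expands them in order)
# and redirect into a new file.
cat part.?? > bigfile-restored.log

# Verify the reassembled file matches the original byte for byte.
cmp bigfile.log bigfile-restored.log && echo "files match"
```

The ?? glob matches the two-letter suffixes split generates, and because the shell sorts glob matches alphabetically, the pieces are concatenated in the right order.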
We can also split the contents of a directory directly with the zip utility (note: this can't be done with gzip, since gzip only compresses single files and has no notion of archiving or splitting):
- zip -s 2g -r target-name.zip /path-to-dir (this creates a split archive of the directory with pieces no bigger than 2 GB; note that the last piece is the .zip file itself, while the others are named .z01, .z02, and so on)
- Recombine with the following: zip -F target-name.zip --out new-target.zip (note: the -F option fixes up the offset entries written by the -s split option, producing a single-file archive)