Space optimization for the UNIX sort -m command?
I am trying to run a consolidation test in which thousands of very large pre-sorted files are merged in one mega sort-merge. At the moment 1000 files are being merged (68M to 106M each).
I do not have enough hard drive space for the input, the temporary intermediates, and the output all at the same time. Is there a way to destructively sort all these files using sort, removing each input as it is consumed? I am currently using:
    sort -T /media/WD_Book/tmp --compress-program=gzip -g -k 6 -m *.rand.tab > /media/WD_Book/output/merged.rand.tab
(The files are named 0001.rand.tab through 1000.rand.tab, and the sort key is in the 6th column [thus -k 6 and -g] in exponential notation.)
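As a minimal sketch of what -g buys here (the rows and the sample.tab filename below are fabricated, not from the real data): -g parses the whole key as a floating-point number, while -n stops at the first non-digit and so misreads exponential notation.

    # Fabricated tab-separated rows; the key is the 6th column.
    printf 'a\tb\tc\td\te\t2e+02\n'  > sample.tab
    printf 'f\tg\th\ti\tj\t5e-03\n' >> sample.tab

    sort -g -k 6 sample.tab   # 5e-03 (0.005) correctly sorts before 2e+02 (200)
    sort -n -k 6 sample.tab   # -n reads only "2" and "5", so it misorders them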
I know it is possible to run sort in place, but the man page specifically says that it will not work for -m.
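To make that caveat concrete, a sketch assuming GNU coreutils sort: a plain sort reads all of its input before opening the -o output file, so sorting one file onto itself is safe, but with -m the output file may be opened before the inputs have been fully read.

    # Safe: sort finishes reading 0001.rand.tab before -o truncates it.
    sort -g -k 6 -o 0001.rand.tab 0001.rand.tab

    # Not safe: with -m, merged.rand.tab may be truncated for output
    # while it is still being read as an input.
    sort -g -k 6 -m -o merged.rand.tab merged.rand.tab 0002.rand.tab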
Answer

Exchanging space for time, basically: you can merge one file at a time with the result of the previous merge, and remove each already-merged file as you go. Again, try to back up your data before trying this ;-) (Warning: destructive, it wipes the input files.)

    touch merged.rand.tab    # zero-size result file for the first merge
    for file in [0-9]*.rand.tab; do
        # the [0-9]* glob cannot match merged.rand.tab or result.rand.tab
        sort -k 6 -g -m merged.rand.tab "$file" > result.rand.tab
        rm -f merged.rand.tab
        mv result.rand.tab merged.rand.tab
        rm "$file"           # if you are really short on space
    done
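A usage note on the loop above: this variant (my own tweak, not part of the original answer) deletes an input only after its merge step has succeeded, so an interrupted or failed run leaves the remaining inputs intact.

    touch merged.rand.tab
    for file in [0-9]*.rand.tab; do
        sort -k 6 -g -m merged.rand.tab "$file" > result.rand.tab &&
            mv result.rand.tab merged.rand.tab &&
            rm "$file" ||
            { echo "merge failed on $file; stopping" >&2; break; }
    done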