Warning: this script starts one shell per chunk; for very large files, that could be hundreds of shells. Here is a script I wrote for this purpose. On a 4-processor machine, it improved sorting performance by 100%!

#!/bin/ksh

MAX_LINES_PER_CHUNK=1000000
ORIGINAL_FILE=$1
SORTED_FILE=$2
CHUNK_FILE_PREFIX=$ORIGINAL_FILE.split.
SORTED_CHUNK_FILES=$CHUNK_FILE_PREFIX*.sorted

usage ()
{
    echo Parallel sort
    echo usage: psort file1 file2
    echo Sorts text file file1 and stores the output in file2
    echo Note: file1 will be split in chunks up to $MAX_LINES_PER_CHUNK lines
    echo and each chunk will be sorted in parallel
}

# test if we have two arguments on the command line
if [ $# != 2 ]
then
    usage
    exit
fi

# Cleanup any leftover files
rm -f $SORTED_CHUNK_FILES > /dev/null
rm -f $CHUNK_FILE_PREFIX* > /dev/null
rm -f $SORTED_FILE

# Splitting $ORIGINAL_FILE into chunks ...
split -l $MAX_LINES_PER_CHUNK $ORIGINAL_FILE $CHUNK_FILE_PREFIX

# Sort each chunk in a background shell, then wait for all of them
for file in $CHUNK_FILE_PREFIX*
do
    sort $file > $file.sorted &
done
wait

# Merging chunks to $SORTED_FILE ...
sort -m $SORTED_CHUNK_FILES > $SORTED_FILE

# Cleanup any leftover files
rm -f $SORTED_CHUNK_FILES > /dev/null
rm -f $CHUNK_FILE_PREFIX* > /dev/null
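The same split / sort-in-parallel / merge idea can be sketched on a tiny sample. The file names below (sample.txt, sample.split.*) are made up for illustration; the steps mirror the script's logic at a small scale:

```shell
# Minimal sketch of the split/sort/merge approach above,
# using a hypothetical 4-line input file.
printf 'banana\napple\ncherry\ndate\n' > sample.txt

# Split into 2-line chunks (produces sample.split.aa, sample.split.ab)
split -l 2 sample.txt sample.split.

# Sort each chunk in a background shell, then wait for all of them
for f in sample.split.*
do
    sort "$f" > "$f.sorted" &
done
wait

# Merge the pre-sorted chunks (-m merges without re-sorting)
sort -m sample.split.*.sorted > sample.sorted
cat sample.sorted

# Cleanup
rm -f sample.txt sample.split.* sample.sorted
```

Running this prints the four lines in order: apple, banana, cherry, date. The key point is `sort -m`, which only merges its already-sorted inputs, so the expensive sorting work happens in the parallel background jobs.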