Is there a way to parallelize a bash for loop?


I have a simple script that pulls SMART data from a series of hard drives and writes it to a timestamped log file, which is later parsed for the relevant data.

filename="filename$( date '+%y_%m_%d_%h%m' ).txt" in {a,b,c,d,e,f,g,h,i,j,k,l,m,n,o,p} smartctl -a /dev/sd$i >> /path/to/location/$filename done  

Since this takes several seconds to run, I'd like to find a way to parallelize it. I've tried appending '&' to the end of the smartctl line in the loop, but that causes the text file to be written haphazardly as sections finish, rather than sequentially and in a readable manner. Is there a way to fork separate processes for each drive and pipe the output into an orderly text file?

Also, I assume the setting of the filename variable would have to be moved inside the loop in order for the forks to be able to access it. That causes an issue if the script runs long enough to roll over to a new minute (or two): the output becomes sequentially datestamped fragments rather than one contiguous file.
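For reference, the backgrounded version I tried looks roughly like this (a sketch, using the same hypothetical path as above); each smartctl is forked off and finishes in its own time, so the appends land in whatever order the drives respond:

filename="filename$( date '+%y_%m_%d_%H%M' ).txt"
for i in {a..p}; do
    smartctl -a /dev/sd$i >> /path/to/location/$filename &    # forked; appends interleave as drives finish
done
wait    # block until all forks complete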

With GNU Parallel you can do this:

parallel -k 'smartctl -a /dev/sd{}' ::: a b c d e f g h i j k l m n o p > path/to/output

The -k option keeps the output in order. Add -j 8 if you want to run, say, 8 at a time; otherwise it runs one job per CPU core at a time. Or use -j 16 if you want to run them all at once...

parallel -j 16 -k 'smartctl ...

Of course, if you're in bash you can do this too:

parallel -j 16 -k 'smartctl -a /dev/sd{}' ::: {a..p} > path/to/output
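If GNU Parallel isn't available, a plain-bash sketch of the same idea (the temp-directory handling is just an illustration) is to fork one job per drive into its own file, then concatenate in drive order once everything has finished. Note the filename only needs to be set once, before forking; background jobs inherit it, so there's no minute-rollover problem:

filename="filename$( date '+%y_%m_%d_%H%M' ).txt"    # set once; every fork inherits it
tmpdir=$(mktemp -d)
for i in {a..p}; do
    smartctl -a /dev/sd$i > "$tmpdir/$i" &    # one file per drive, so concurrent writes never mix
done
wait                                          # block until every fork has finished
cat "$tmpdir"/{a..p} > /path/to/location/$filename    # stitch together in drive order
rm -r "$tmpdir"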
