What is the best utility/library/strategy with Python to copy files across multiple computers?


I have data stored in folders across several computers. Many of the folders contain 40-100 GB of files ranging from 500 KB to 125 MB in size. There are 4 TB of files that need to be archived, and I need to build a unified metadata system based on the metadata stored on each computer.

All the systems run Linux, and I want to use Python. What is the best way to copy the files and archive them?

We have programs that analyze the files and fill the metadata tables, and they all run in Python. I need to figure out a way to copy the files without data loss and to ensure the files have been copied successfully.

We have considered using rsync and unison, running them via subprocess.Popen, but these are sync utilities; we only need to copy once, but copy properly. Once the files are copied, the users will move to the new storage system.
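For example, a minimal sketch of how we would drive rsync from Python (the paths and host name are placeholders; it assumes rsync and the existing passwordless SSH setup on both ends):

```python
import subprocess

def rsync_copy(src_dir, remote_host, dest_dir):
    """Copy a directory tree to a remote host with rsync over SSH.

    -a preserves permissions, timestamps and the directory layout;
    --checksum makes rsync compare file contents instead of size/mtime.
    """
    cmd = [
        "rsync", "-a", "--checksum",
        "-e", "ssh",                        # rides on the passwordless SSH keys
        src_dir.rstrip("/") + "/",          # trailing slash: copy the contents of src_dir
        "{}:{}".format(remote_host, dest_dir),
    ]
    # check_call raises CalledProcessError if rsync exits non-zero,
    # so a failed transfer is not silently ignored.
    subprocess.check_call(cmd)

# hypothetical usage:
# rsync_copy("/data/folder01", "archive-server", "/archive/folder01")
```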

My worries are: 1) when the files are copied there should be no corruption, and 2) the copying must be efficient, though there are no hard speed expectations. The LAN is 10/100, with some ports being gigabit.
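For concreteness, this is the kind of verification we want to be able to do after the copy; a rough sketch with the standard library (the paths are placeholders, and the manifest format matches what sha1sum -c expects on the destination):

```python
import hashlib
import os

def sha1_of_file(path, chunk_size=1024 * 1024):
    """SHA-1 hex digest of a file, read in chunks so the 125 MB files stay out of memory."""
    digest = hashlib.sha1()
    with open(path, "rb") as fh:
        for chunk in iter(lambda: fh.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def write_manifest(root_dir, manifest_path):
    """Walk root_dir and record 'digest  relative/path' for every file,
    in the format understood by 'sha1sum -c'."""
    with open(manifest_path, "w") as manifest:
        for dirpath, _dirnames, filenames in os.walk(root_dir):
            for name in filenames:
                full = os.path.join(dirpath, name)
                rel = os.path.relpath(full, root_dir)
                manifest.write("{}  {}\n".format(sha1_of_file(full), rel))

# hypothetical usage:
# write_manifest("/data/folder01", "/data/folder01.sha1")
# then on the destination: cd /archive/folder01 && sha1sum -c folder01.sha1
```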

Are there any scripts that can be incorporated, or any suggestions? The computers have SSH keys set up (ssh-keygen), so passwordless connections are possible.

The directory structure will be maintained on the new server, similar to that of the old computers.

I would look at the Python Fabric library. This library streamlines the use of SSH, and if you are concerned about data integrity you can use SHA-1 or another hash algorithm to create a fingerprint for each file before transfer, then compare the fingerprint values generated at the initial and final destinations. All of this can be done using Fabric.
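A rough sketch of the verification step, assuming Fabric 2.x's Connection API (Connection.run for the remote side, Connection.local for the local side) and that sha1sum is available on both machines; the host and paths are placeholders:

```python
from fabric import Connection

def verify_copy(local_path, host, remote_path):
    """Compare SHA-1 fingerprints of a file on this machine and on the remote host."""
    conn = Connection(host)  # reuses the passwordless SSH keys already in place
    # sha1sum prints "<digest>  <filename>"; keep only the digest
    local_digest = conn.local("sha1sum {}".format(local_path), hide=True).stdout.split()[0]
    remote_digest = conn.run("sha1sum {}".format(remote_path), hide=True).stdout.split()[0]
    return local_digest == remote_digest

# hypothetical usage:
# if not verify_copy("/data/folder01/file.dat", "archive-server",
#                    "/archive/folder01/file.dat"):
#     print("fingerprint mismatch, re-copy the file")
```

Any file whose fingerprints do not match can simply be re-copied before the users are pointed at the new storage system.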

