s3cmd failed too many times

Posted on Jan 24, 2022

Question

I used to be a happy s3cmd user. However, recently, when I try to transfer a large zip file (~7 GB) to Amazon S3, I get this error:

$> s3cmd put thefile.tgz s3://thebucket/thefile.tgz

....
 20480 of 7563176329     0% in    1s    14.97 kB/s  failed
WARNING: Upload failed: /thefile.tgz ([Errno 32] Broken pipe)
WARNING: Retrying on lower speed (throttle=1.25)
WARNING: Waiting 15 sec...
thefile.tgz -> s3://thebucket/thefile.tgz  [1 of 1]
  8192 of 7563176329     0% in    1s     5.57 kB/s  failed
ERROR: Upload of 'thefile.tgz' failed too many times. Skipping that file.

I am using the latest s3cmd on Ubuntu.

Why does this happen, and how can I solve it? If it can't be fixed, what alternative tool can I use?

Answer

As of 2014, the aws cli can upload big files, so it can be used in place of s3cmd. Amazon S3 caps a single PUT request at 5 GB, so a ~7 GB file has to go up as a multipart upload; the AWS CLI handles that switch automatically for large files.

http://docs.aws.amazon.com/cli/latest/userguide/cli-chap-getting-set-up.html has install and configure instructions, or you can often just run:

$ wget https://s3.amazonaws.com/aws-cli/awscli-bundle.zip
$ unzip awscli-bundle.zip
$ sudo ./awscli-bundle/install -i /usr/local/aws -b /usr/local/bin/aws
$ aws configure

followed by

$ aws s3 cp local_file.tgz s3://thereoncewasans3bucket

will get you satisfactory results.
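As a rough back-of-the-envelope sketch of why multipart upload is the right fit here: at the AWS CLI's default multipart chunk size of 8 MiB (adjustable with `aws configure set default.s3.multipart_chunksize`), the file size reported in the error output above splits into a part count comfortably inside S3's limits.

```shell
# Sketch: how many parts the AWS CLI would upload for this file,
# assuming its default 8 MiB multipart chunk size.
FILE_SIZE=7563176329                      # bytes, from the s3cmd output above
CHUNK_SIZE=$((8 * 1024 * 1024))           # 8 MiB default chunk
PARTS=$(( (FILE_SIZE + CHUNK_SIZE - 1) / CHUNK_SIZE ))
echo "$PARTS parts"                       # well under S3's 10,000-part cap
```

Each part is also well under the 5 GB single-request ceiling, which is the limit a plain single PUT of this file would run into.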