
How to Upload Frequently Updated Files on Amazon S3 Without Data Loss?

To ensure files are uploaded without data loss, Bucket Explorer calculates and sends a hash of every file to Amazon S3.
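The exact mechanism is internal to Bucket Explorer, but the idea is easy to illustrate. The following minimal sketch (Python with boto3, not Bucket Explorer's own code; the bucket and key names are hypothetical) computes an MD5 digest of the file and sends it as the Content-MD5 header, so S3 rejects the request if the bytes it received do not match:

    import base64
    import hashlib

    import boto3

    def upload_with_md5(path, bucket, key):
        """Upload a file with a Content-MD5 header so S3 can verify the bytes."""
        with open(path, "rb") as f:
            data = f.read()
        # S3 expects the base64-encoded binary MD5 digest in Content-MD5.
        md5_b64 = base64.b64encode(hashlib.md5(data).digest()).decode("ascii")
        s3 = boto3.client("s3")
        # If the received bytes do not hash to md5_b64, S3 rejects the upload.
        s3.put_object(Bucket=bucket, Key=key, Body=data, ContentMD5=md5_b64)

    upload_with_md5("access.log", "my-backup-bucket", "logs/access.log")

Note that the hash describes the file as it was read; if the file keeps changing while the upload is in progress, the data and the hash can diverge, which is exactly the failure mode discussed below.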

If you experience data loss during upload, for example because a file is modified locally while it is being uploaded, you need to change the default settings in the Bucket Explorer Configuration. With the default setting, Bucket Explorer uploads your data directly to the destination bucket without copying it to a temp folder first.

For example:
If you are a webmaster and want to back up your web server (IIS / Apache) logs to S3, keep in mind that the logs are updated with every user request, so the files change frequently. Uploading files of this kind to S3 for backup exposes you to data loss. To avoid this problem, Bucket Explorer can upload the files without corruption; all that is required is a small change in the Bucket Explorer Configuration.

How to change Bucket Explorer Configuration?
Latest Version and Versions After 2009.10
  1. Run Bucket Explorer.
  2. Click Tools -> Advance Preferences.
  3. Click the Queue Setting tab.
  4. Set the "Copy to temp before uploading" combo box to "True".
  5. Click the OK button.
  6. From now on, any file you upload is copied to the temp folder before the upload operation starts.
If this property is set to FALSE, files are not copied to Bucket Explorer's Temp folder before being uploaded to S3. If it is set to TRUE, every upload first copies the file to Bucket Explorer's Temp folder and then uploads it to S3 from there (see the sketch below).
NOTE: Setting the property to TRUE helps avoid data loss, but for large files the extra copy can take noticeable time.
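Conceptually, the TRUE setting snapshots the file before uploading it. Here is a minimal sketch of that copy-to-temp pattern (Python with boto3; an illustration under assumed names, not Bucket Explorer's actual implementation):

    import os
    import shutil
    import tempfile

    import boto3

    def upload_via_temp_copy(path, bucket, key):
        """Copy a changing file to a temp snapshot, then upload the snapshot."""
        fd, tmp_path = tempfile.mkstemp(prefix="be-upload-")
        os.close(fd)
        try:
            # The local copy gives the upload a stable source, even if the
            # original (e.g. a live web server log) keeps growing meanwhile.
            shutil.copy2(path, tmp_path)
            boto3.client("s3").upload_file(tmp_path, bucket, key)
        finally:
            os.remove(tmp_path)  # discard the snapshot once the upload finishes

The trade-off is the one in the NOTE above: the snapshot costs extra disk space and copy time proportional to the file size, which is why the copy step can be slow for large files.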
