FileZoomer http://filezoomer.com The easy way to store your files at Amazon S3

Beta Version 0.9 Adds S3 Lifecycle, Versioning, Glacier, Batch Processing http://filezoomer.com/2012/12/beta-version-0-9-add-s3-lifecycle-versioning-glacier-batch-processing/ Thu, 20 Dec 2012 14:59:54 +0000 Steve

The newest version of FileZoomer adds support for several new Amazon Web Services S3 capabilities, including:

Object Life Cycle: specify that files be deleted or moved to low-cost AWS Glacier storage after a set number of days or after a certain date.

Versioning: Turned on at the bucket level, versioning means that even if you upload multiple updates to a file, all previous versions are saved. The newest version shows up as usual, but if you right-click the file and choose “Show Versions,” all the prior versions will be displayed, and they can then be downloaded.

It’s important to know that with the current version of S3 these two features — Object Life Cycle and Versioning — are mutually exclusive. If you turn on Versioning you can’t also use Life Cycle rules, and if you are using Life Cycle you can’t turn on Versioning.

The new version of FileZoomer also includes a Batch Processing option. After interactively defining a batch process using “File…Batch Configuration”, you can later initiate that process in FileZoomer with “File…Run Batch”. This makes it easy, for instance, to update a folder and its contents with all new and updated files since the last time the batch upload was run.

Using a pure batch-processing version of the FileZoomer Java “jar” file, along with a configuration file you have created interactively, you can also do things like schedule an unattended run of an upload job.

For more details on these new features see the individual posts for Object Life Cycle, Versioning, and Batch Processing.

Use S3 Object Life Cycle to Automatically Migrate Files to Amazon Glacier, or Delete Files Based on Date http://filezoomer.com/2012/12/use-s3-object-life-cycle-to-automatically-migrate-files-to-amazon-glacier-or-delete-files-based-on-date/ Thu, 20 Dec 2012 14:57:01 +0000 Steve

AWS recently announced Object Life Cycle support for S3, which enables automated actions to be performed on objects based on a specific date or on a time interval having elapsed. There are currently two actions available:

  • Delete the objects (files)
  • Move the objects to low-cost Amazon Glacier storage

To automatically delete files, specify a number of days or a specific date; the objects will be deleted once that time is reached.

The migrate-to-Glacier option moves objects to the AWS Glacier system for low-cost archival storage. Files are moved to Glacier after the specified number of days have elapsed or when a specific date is reached.

Object life cycle support is based on the object prefix, better known as the folder path.

A bucket may contain folder paths with different rules. For example, your BACKUP bucket might contain a sqldb/backup folder path and a sqldb/logs path. The sqldb/backup path could have a rule to move its objects to Glacier after 10 days and delete them after 365 days. The sqldb/logs folder might have a rule to simply delete the logs after 60 days.
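The example above can be written down concretely. The sketch below expresses the same two rules as the kind of structure an S3 SDK accepts for a bucket lifecycle configuration (the bucket and prefix names are the hypothetical ones from the example; verify the exact field names against the SDK you actually use):

```python
def backup_lifecycle_rules():
    """Lifecycle rules for the hypothetical BACKUP bucket: archive
    sqldb/backup to Glacier after 10 days and delete after 365 days;
    delete sqldb/logs after 60 days."""
    return {
        "Rules": [
            {
                "ID": "archive-then-expire-backups",
                "Prefix": "sqldb/backup",
                "Status": "Enabled",
                "Transition": {"Days": 10, "StorageClass": "GLACIER"},
                "Expiration": {"Days": 365},
            },
            {
                "ID": "expire-logs",
                "Prefix": "sqldb/logs",
                "Status": "Enabled",
                "Expiration": {"Days": 60},
            },
        ]
    }
```

A real client would submit this structure to S3; here it only illustrates how a prefix, a Glacier transition, and an expiration fit together in one configuration.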

There are three key things to know about the Object Life Cycle Migrate to Glacier option:

  • It’s cheaper. As of this writing, standard S3 pricing starts at $0.095 per GB. Glacier costs $0.010 per GB, nearly an order of magnitude cheaper.
  • It takes time to get the files back to S3, about three to five hours after you “initiate recovery”.
  • Object Life Cycle (both Glacier migration and automatic delete) is currently incompatible with S3 Versioning. You can’t do both.

Other things to be aware of:

When you tell S3 to migrate files to Glacier, the only way to access them is through S3. There is no vault support, and you cannot use other Glacier tools to access your files. You don’t sign up for Glacier and you don’t get a separate bill for Glacier. S3 just takes care of it, and your S3 bill goes down.

When S3 displays files that have been moved to Glacier, there will be an indication that the file is in Glacier. To access a file in Glacier you must use an S3 request to recover it; in FileZoomer you right-click the file and choose “Initiate Recovery”. File recovery takes 3-5 hours. Once a file is recovered it is available for a limited time; in FileZoomer you specify the number of days to keep it available. To make the recovery permanent you must copy the recovered file to another file in the S3 bucket. If you don’t, the recovered file will be moved back to Glacier.

Configuring Object Life Cycle

  • Navigate to the bucket and folder path for which you want to create a life cycle rule.
  • Select the file drop down menu and then select the Object Life Cycle menu option.
The Object Life Cycle Configuration display will appear.

The display will show the current bucket and the current folder path in the bucket. It will also show the life cycle rules for the current bucket.

Save Rules saves the displayed rules for the current bucket. If any rules are added or deleted you must use the Save Rules button to finalize the changes.

To delete a rule, select it and click Delete Selected Rule. When you confirm with OK, the rule is removed from the rule display, but it has not yet been removed from the bucket. Use Save Rules to apply the displayed rules to the bucket.

Use Add Rule to create a new rule.

The Add Rule dialog will appear. Verify the bucket and path displayed are the bucket and path to which you want to apply the rule.

Name the rule using Rule ID. Use a meaningful description.

Check “Use Transition Rule” to migrate files to Glacier. Check “Use Expiration Rule” to automatically delete files. You can do both: for instance, you could migrate files to Glacier after 90 days and then have them deleted automatically after 365 days.

Set the number of days after file upload to apply the transition and/or expiration rules.

You can also specify a date in the future when the migration or deletion will occur. The date must be in the format yyyy-mm-dd.
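Because the dialog requires that exact form, a quick way to sanity-check a date value is to parse it with the matching format string (a generic sketch, not FileZoomer code):

```python
from datetime import datetime

def parse_rule_date(text):
    """Parse a lifecycle rule date in the required yyyy-mm-dd format;
    raises ValueError if the text does not match."""
    return datetime.strptime(text, "%Y-%m-%d").date()
```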

Add Rule To List saves the new rule.

FileZoomer Batch Processing Automates Repetitive S3 Upload Tasks http://filezoomer.com/2012/12/filezoomer-batch-processing-automates-repetitive-s3-upload-tasks/ Thu, 20 Dec 2012 14:49:49 +0000 Steve

A major new feature of FileZoomer Beta 0.9 is “Batch Processing,” designed to facilitate and automate repetitive S3 transfer tasks.

We needed to upload web site backups and logs to S3 from Linux systems, and the only access we had from Linux to S3 at the time was a Perl module that did not handle large files well. S3 access code from FileZoomer was reused to create a command-line Java jar file that could run under the scheduling system provided by the operating system. Being Java-based, it could be used on Windows and OS X systems as well.

Manually creating the configuration file was soon replaced by an interactive GUI component in FileZoomer itself. After that it was a small step to also add a “Run Batch” command to FileZoomer so that common, repetitive tasks could be easily initiated from the application itself, when that is easier or more useful than using a scheduler to run an actual batch job. The ability to do interactive configuration and easily initiate regular “housekeeping” tasks (like backing up new and changed files in a Documents folder) soon made it popular around the office for Windows and Mac use.

Now it’s available to FileZoomer users. It’s a powerful but somewhat complicated tool, so if you use it, take some time to understand it.

We’ll describe here how to get started using Batch Processing, using as our example the most obvious and common reason to use it — uploading and updating a folder to S3. And remember that it works with all the other new S3 features, most notably Object Life Cycle migration to Glacier for low-cost file archiving.

To start, choose “Batch Configuration” from the File menu:

Then Highlight “Add New Configuration File” and click “OK”. Later after you’ve created one or more configurations, you will see them listed for selection.

Next, specify all the important details about your configuration:

Notice the current bucket and S3 path is filled in for you. Browse to select where you want log files to be saved. Then “Add New Action” to specify what your batch configuration will accomplish:

This example shows an Upload to S3 of all files modified (or created) since the last time the job ran. This means the first time it runs all the files in the selected path will be uploaded. We’ve chosen “Server Side Encryption” so the files are “encrypted at rest” at AWS. We want to process subfolders. We’re not going to compress on upload, which would cause files to be stored in zip format (if you are concerned about saving space and related charges, consider setting up a migration to Glacier). Uncompress on Download is checked so that IF you used FileZoomer compression the application would uncompress files on a later download.

Browse for the local path of the folder and files to be processed, and click “OK”.

Be sure to give the configuration a descriptive name, notice your new configuration is now listed, and BE SURE TO “SAVE CONFIG.”

Now you are ready to “Run Batch” from the File menu:

Select the desired configuration (there could be more than one) and click “OK”:

FileZoomer will tell you it’s “Running Batch Configuration”, but if you need details on what’s transpiring you will need to look at the log file.

There are more options available for Batch Processing.

In addition to the “Run Batch” command, you can also run a batch configuration on a schedule or from a command line.

FileZoomer Batch Processing Provides Powerful Options http://filezoomer.com/2012/12/filezoomer-batch-processing-provides-powerful-options/ Thu, 20 Dec 2012 14:48:50 +0000 Steve

An earlier post introduced the new FileZoomer Batch Processing feature and walked through the configuration steps needed to prepare to do a “Run Batch” using a single, but very common, batch processing option. Here we go into more detail on all the available options and what they offer:

When you add a new configuration file (File…Batch Configuration…Add New Configuration File) and then “Add a New Action” you see this dialog that contains all the options available:

Let’s go through all the options starting with Select Action:

  • Upload will upload files from the Local Path on the PC to the current S3 bucket and path.
  • Download will download files from the current S3 bucket and path to the Local Path on the PC.
  • Sync will synchronize the current S3 bucket and path and the Local Path on your PC. Sync will not process subfolders; it only handles files in the specified folder itself. Files that exist only on S3 will be copied to the PC. Files that exist only on the PC will be copied to S3. Files that exist in both places will be compared, with the newer version replacing the older. That means if you delete a file on your PC that is also on S3, it will be copied back from S3. Likewise, if you delete a file on S3 that is also on your PC, it will be copied back to S3. Sync will NOT resolve update conflicts: if an S3 file is updated from a source other than where FileZoomer Batch Processing is being used, and the same file is also updated on the local PC, a FileZoomer Batch Sync will simply overwrite the older file. It will NOT merge changes to the file from multiple sources. Sync does “All Files” regardless of other settings.
  • Prune is a specialized action that offers a way to control the number of files in an S3 bucket and path. Unlike Object Life Cycle, which removes files (via Delete or Glacier migration) based on a number of days elapsed, Prune deletes files based on the “Prune File Count”, retaining the most recent Prune-File-Count files in the current S3 bucket and path and deleting the rest. Prune should usually not be used on a bucket with Versioning enabled, as the two features work at cross-purposes. Note that when doing a “Prune” action, other parameters are ignored as not relevant (Local Path, Process Subfolders, Encryption, Compression, All Files, Files Modified Today, Files Modified Since Last Run). If you don’t have a clear idea of why you need Prune, it’s best not to use it.
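Prune’s retention policy is easy to state precisely. The sketch below is our own illustration (not FileZoomer’s implementation): it keeps the Prune-File-Count most recently modified objects and returns the keys that would be deleted.

```python
def prune_candidates(objects, keep_count):
    """objects: iterable of (key, last_modified) pairs.
    Return the keys to delete so that only the keep_count most
    recently modified objects remain."""
    ranked = sorted(objects, key=lambda pair: pair[1], reverse=True)
    return [key for key, _ in ranked[keep_count:]]
```

For example, with a Prune File Count of 2, the oldest of three backups is the only deletion candidate.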
Effect of All Files, Files Modified Today, and Files Modified Since Last Run:
  • The default is All Files, which means all files will be processed regardless of date and time stamps. On a Download this could mean, for example, that older files on S3 replace updated files on the local PC. This is always used for “Sync”, and ignored for “Prune”.
  • Files Modified Today uses the date on the PC for Upload. It is not recommended for use with Download, and is ignored for “Sync” and “Prune”.
  • Files Modified Since Last Run works the same as All Files for the first run. After that it looks at the date and time stamp on the files and only processes those files created or modified since the last run. IMPORTANT: on Windows and OS X, if you move an older file into the local path its date is not changed, and if it was created or modified before the Last Run date it will NOT be uploaded using this option.
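The “Files Modified Since Last Run” selection is essentially a walk of the local path comparing each file’s modification time to the last run’s timestamp. A generic sketch (not FileZoomer’s code):

```python
import os

def files_modified_since(root, last_run_ts, include_subfolders=True):
    """Return paths under root whose modification time (seconds since
    the epoch) is newer than last_run_ts."""
    selected = []
    for dirpath, dirnames, filenames in os.walk(root):
        if not include_subfolders:
            dirnames[:] = []  # do not descend into subfolders
        for name in filenames:
            path = os.path.join(dirpath, name)
            if os.path.getmtime(path) > last_run_ts:
                selected.append(path)
    return selected
```

This also shows the caveat above: a file copied in with an old modification time is skipped, because only the timestamp is consulted.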
Effect of Server Side Encryption, Compress on Upload, Process Sub Folders and Uncompress on Download:
  • Server Side Encryption, when checked, tells S3 to store the file encrypted (“encryption at rest”). Technically all this does is protect files in the event AWS discards or otherwise loses control of a drive containing your files without first destroying the files. Note that when files are being uploaded or downloaded using FileZoomer, SSL (encrypted transmission) is also used. Neither requires any key management on the user’s part.
  • Process Sub Folders, when checked, includes files in subfolders (and will create a subfolder as needed) when doing an Upload or a Download. This option is ignored for Sync and Prune.
  • Compress on Upload, if checked, zips files before uploading and marks the files as being “FileZoomer Zipped”. When paired with “Uncompress on Download”, any file zipped by FileZoomer will be unzipped. Other S3 clients will simply see the files as regular zip files. Files already in a compressed format (e.g. jpeg, png, mp3, and actual zip files) will not be compressed. This option makes uploads complete more quickly and saves a bit on S3 cost.
  • Uncompress on Download, when checked, will unzip any files marked on S3 as having been “FileZoomer Zipped”. Otherwise it has no effect. It’s a good idea to leave this checked.

Local Path is required (except for “Prune”) and identifies the folder on the PC to include in processing.

Once your configuration is created, you can use the “Run Batch” option, or set it up to run from a command line, or on a schedule.


Run or Schedule FileZoomer Batch Using Command Line or .BAT file http://filezoomer.com/2012/12/run-or-schedule-filezoomer-batch-using-command-line-or-bat-file/ Thu, 20 Dec 2012 14:47:24 +0000 Steve

Previous posts on the new FileZoomer Batch Option have shown how to interactively create a batch configuration file, and start a batch process from within FileZoomer.

You can also use the true batch version of the Java jar file and run it from a command line, batch file, or terminal session on any operating system that supports such commands (and of course they all do). This makes it easy to create a configuration file interactively but run the process unattended, including on a server or other machine different from the one used to create the configuration file.

When you save a configuration file you will notice that it tells you where that config file was saved, and its name:

You can also check the box to put the path in the clipboard. You can also get this info anytime by clicking “Show Config File” on the Batch Processing Configuration page:

Next, get a copy of FileZoomerBatch.jar and its associated lib folder (the link is to a zip file; unzip it and put it where you want to execute it). Here’s the zip file’s MD5 if you want to confirm it.

Then use the appropriate technique for your OS to execute FileZoomerBatch.jar, remembering to provide the location of the config file. For instance, a Windows .BAT file located in the same directory as FileZoomerBatch.jar and the config file (.mcfg) would have a command that looked like this:

java -jar FileZoomerBatch.jar "xxxxxx.mcfg"

(where xxxxxx is the name you gave your config file).
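The same invocation can be assembled from any scripting language or scheduler. A minimal sketch (the jar and config file names are placeholders, not shipped defaults):

```python
def batch_command(jar_path, config_path):
    """Build the command line that runs a FileZoomer batch job; pass
    the result to subprocess.run(), or embed the equivalent string in
    a cron entry or Task Scheduler action."""
    return ["java", "-jar", jar_path, config_path]

# e.g. subprocess.run(batch_command("FileZoomerBatch.jar", "nightly.mcfg"), check=True)
```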

If you are executing on a Linux or Mac OS X system, use the appropriate equivalent (a shell script or cron entry, for example). If you aren’t comfortable with that, consider executing the batch process from within the FileZoomer app with “Run Batch”.


Use S3 Bucket Versioning to Track File Updates, Recover Older Versions of Files http://filezoomer.com/2012/12/use-s3-bucket-versioning-to-track-file-updates-recover-older-versions-of-files/ Thu, 20 Dec 2012 14:46:15 +0000 Steve

FileZoomer now supports Amazon S3 Versioning, which keeps track of all versions of files in a bucket and allows you to display and retrieve older versions of updated files. There are three important things to know about this S3 feature:

  • Versioning is set for the whole bucket.
  • Once set you can’t turn Versioning back off for that bucket and its files. Versioning can, however, be suspended.
  • Versioning is incompatible with the Object Life Cycle feature (automatic deletion or migration to Glacier).

So plan ahead before enabling Versioning for a bucket.

To turn on Versioning, navigate to the bucket and use “File…Bucket Versioning”.

Check “Enable Versioning” and click OK.

To access earlier versions of a particular file, right-click the file and choose “Show Versions”.

 

The version list will initially display the current version at the top, along with all previous versions. Right-click to Download or Delete a version of the file.

 


Beta version 0.8 adds the ability to upload files added to a directory after a specific date and time http://filezoomer.com/2012/01/beta-version-0-8-adds-the-ability-to-upload-files-added-to-a-directory-since-a-specific-date/ Tue, 17 Jan 2012 21:02:00 +0000 Steve

The newest version of FileZoomer includes an option to upload only those files within a directory that have been updated after a specific date and time.  This allows you to keep a complete version of a local folder in your S3 account without the need to constantly re-upload everything.

As an example, let’s say I want to upload the “Documents” folder from my Mac.  I’d open and log into FileZoomer, click the upload button, navigate to the “Documents” folder and select it to upload the entire folder (with sub-folders) into FileZoomer.  Because my Documents folder is pretty large, it’s going to take awhile to complete.

However, the next time I want to update the contents of my Documents folder, I’ll just log into FileZoomer and right-click the folder containing my documents.  That brings up the following menu:

Right-clicking the folder name brings up this menu

If I select the “Upload by Date” option I’ll see the following:

Upload by date dialog box

The date of the last upload to this folder is displayed so you can either use it or enter your own date.  Clicking OK will then launch an upload of every file that’s been updated after the date displayed in the dialog.

Check it out and let us know how you like it in the comments.

Latest Release Adds Support for Two More S3 Regions http://filezoomer.com/2011/12/latest-release-adds-support-for-two-more-s3-regions/ Tue, 20 Dec 2011 20:01:16 +0000 Steve

The latest release of FileZoomer adds support for the most recently added Amazon S3 regions: US West (Oregon) and South America (Sao Paulo).

Because FileZoomer is a Java Webstart Application there is nothing you need to do to update your software. The next time you run FileZoomer it will automatically update to the most current version.


Calculate storage used across all buckets in your Amazon S3 account http://filezoomer.com/2011/07/calculate-storage-used-across-all-buckets-in-your-amazon-s3-account/ Fri, 22 Jul 2011 18:32:17 +0000 Steve

FileZoomer beta 0.6 has been released, and it can now generate a report that shows the total amount of storage you’re using in an Amazon S3 account across all of your S3 buckets.  We’re also providing a high-level estimate of the monthly cost of that storage from Amazon.  The estimate uses US Standard pricing and is based on standard storage costs, so it may not be accurate for your particular location, but it should provide a useful ballpark figure.
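The estimate itself is simple arithmetic: total bytes converted to gigabytes, times a flat per-GB monthly price. A sketch of that calculation (the default price is illustrative and dated; check current AWS pricing, and note that real billing is tiered):

```python
def estimate_monthly_cost(total_bytes, price_per_gb=0.095):
    """Rough monthly S3 storage cost: size in GB times a flat
    US Standard per-GB price; tiered volume discounts are ignored."""
    gigabytes = total_bytes / float(1024 ** 3)
    return gigabytes * price_per_gb
```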

To generate the report from within FileZoomer, just click the View menu and then Usage Report.

Click the menu item for view, then click "usage report"

 

The resulting report will look a bit like this, but hopefully you are making better use of your S3 account than we are with this test account.

S3 Storage Report

 

As you can see, this test account has only a few files and folders and contains only 141,692,710 bytes of storage, or less than 140 MB.

Let us know in the comments if there are any questions on how to use the report, or if you have any suggestions on improving it.

FileZoomer Beta 0.5 Released with File Compression Support http://filezoomer.com/2011/05/filezoomer-beta-0-5-released-with-file-compression-support/ Tue, 24 May 2011 14:08:27 +0000 Steve

The user community has spoken, and file compression, the feature you’ve requested most, has been added to FileZoomer.

The benefits of compressing files are probably obvious, but we’ll list them anyway: compressing files reduces bandwidth costs, transfer times, and storage costs.  FileZoomer is now not only considerably faster; using it to manage your S3 account will actually save you money on your monthly Amazon S3 bill.

File compression is an account option that can be turned on and off as you wish.  The default setting for file compression is off, so you’ll need to turn it on to see it work.  To turn on file compression, just launch FileZoomer and log in to your S3 account, then click the “preferences” menu item.  Click “Zip Compressible Files”, and once the checkmark is there all compressible files will be compressed during uploads and while stored, and uncompressed during downloads.  You’ll see a new .zip extension added to each compressed file so that you can easily identify compressed files.

Downloading files that were compressed during upload with FileZoomer will automatically uncompress them.  If you choose to use another utility to download your files you’ll be able to unzip them using standard zip tools.

One of the things we use S3 for is to store log files, which are highly compressible.  In our testing of daily log file uploads we’re seeing transfer speed improvements of nearly 80% with a corresponding reduction in space used to store the files.

File types that are inherently already compressed, for instance jpeg, mp3, zip, and most video files, will not get compressed as they are not “compressible”.
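A skip-list keyed on file extension is one simple way to implement that rule. The sketch below is our own illustration; the extension set is illustrative, not FileZoomer’s actual list:

```python
import os

# Formats that are already compressed and not worth zipping again
ALREADY_COMPRESSED = {".jpg", ".jpeg", ".png", ".gif", ".mp3",
                      ".mp4", ".avi", ".zip", ".gz"}

def is_compressible(filename):
    """Return True when a file is worth zipping before upload, i.e.
    its extension is not a known already-compressed format."""
    ext = os.path.splitext(filename)[1].lower()
    return ext not in ALREADY_COMPRESSED
```

Log files pass the check (and compress very well, as noted above), while media and archive files are uploaded as-is.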

Please let us know what you think of this feature and what other features you’d like to see added.

 

 

