FileZoomer » FileZoomer http://filezoomer.com The easy way to store your files at Amazon S3 Thu, 10 Mar 2016 18:59:48 +0000 en hourly 1 http://wordpress.org/?v=3.3.1

Beta Version 0.9 Adds S3 Lifecycle, Versioning, Glacier, Batch Processing http://filezoomer.com/2012/12/beta-version-0-9-add-s3-lifecycle-versioning-glacier-batch-processing/ Thu, 20 Dec 2012 14:59:54 +0000 Steve

The newest version of FileZoomer adds support for several new Amazon Web Services S3 capabilities, including:

Object Life Cycle: specify that files be deleted or moved to low-cost AWS Glacier storage after a set number of days or after a certain date.

Versioning: Turned on at the bucket level, versioning means that even if you upload multiple updates to a file, all previous versions are saved. The newest version shows up as usual, but if you right-click the file and choose “Show Versions,” all the prior versions will be displayed, and they can then be downloaded.

It’s important to know that with the current version of S3 these two features — Object Life Cycle and Versioning — are mutually exclusive. If you turn on Versioning you can’t also use Life Cycle rules, and if you are using Life Cycle you can’t turn on Versioning.

The new version of FileZoomer also includes a Batch Processing option. After interactively defining a batch process using “File…Batch Configuration”, you can later initiate that process in FileZoomer with “File…Run Batch”. This makes it easy, for instance, to update a folder and its contents with all new and updated files since the last time the batch upload was run.

Using a pure batch-processing version of the FileZoomer Java “jar” file, along with a configuration file you have created interactively, you can also do things like schedule an unattended run of an upload job.

For more details on these new features see the individual posts for Object Life Cycle, Versioning, and Batch Processing.

Use S3 Object Life Cycle to Automatically Migrate Files to Amazon Glacier, or Delete Files Based on Date http://filezoomer.com/2012/12/use-s3-object-life-cycle-to-automatically-migrate-files-to-amazon-glacier-or-delete-files-based-on-date/ Thu, 20 Dec 2012 14:57:01 +0000 Steve

AWS recently announced Object Life Cycle support for S3, which enables automated actions to be performed on objects on a specific date or after a time interval has elapsed. There are currently two actions available:

  • Delete the objects (files)
  • Move the objects to low-cost Amazon Glacier storage

To automatically delete files, specify a number of days or a specific date; the objects will be deleted once the specified time is reached.

The migrate-to-Glacier option moves objects to the AWS Glacier system, archiving them in cheaper storage. Files are moved to Glacier after the specified number of days has elapsed or a specific date is reached.

Object life cycle support is based on the object prefix, better known as the folder path.

A bucket may contain folder paths with different rules. For example your BACKUP bucket might contain a sqldb/backup folder path and a sqldb/logs path. The sqldb/backup path could have a rule to move the objects to Glacier after 10 days and delete the objects after 365 days. The sqldb/logs folder may have a rule to simply delete the logs after 60 days.
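As an illustration of what such rules look like at the S3 API level, here is a minimal sketch in Python of the example configuration above. The bucket and prefixes come from the example; the rule IDs are made up, and FileZoomer itself builds these requests in Java — with the boto3 SDK, the dict below could be passed to `put_bucket_lifecycle_configuration`.

```python
def build_lifecycle_rules():
    """Build the lifecycle rules from the example above:
    sqldb/backup -> Glacier after 10 days, deleted after 365 days;
    sqldb/logs   -> deleted after 60 days."""
    return {
        "Rules": [
            {
                "ID": "backup-archive-then-expire",  # illustrative name
                "Prefix": "sqldb/backup",
                "Status": "Enabled",
                "Transitions": [{"Days": 10, "StorageClass": "GLACIER"}],
                "Expiration": {"Days": 365},
            },
            {
                "ID": "logs-expire",  # illustrative name
                "Prefix": "sqldb/logs",
                "Status": "Enabled",
                "Expiration": {"Days": 60},
            },
        ]
    }

# With boto3 this configuration could be applied roughly as:
#   s3 = boto3.client("s3")
#   s3.put_bucket_lifecycle_configuration(
#       Bucket="BACKUP", LifecycleConfiguration=build_lifecycle_rules())
```

Note how each rule is scoped by its prefix, which is what lets one bucket carry different retention policies for different folder paths.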

There are three key things to know about the Object Life Cycle Migrate to Glacier option:

  • It’s cheaper. As of this writing, standard S3 pricing starts at $0.095 per GB. Glacier costs $0.010 per GB, nearly an order of magnitude cheaper.
  • It takes time to get the files back to S3, about three to five hours after you “initiate recovery”.
  • Object Life Cycle (both Glacier migration and automatic delete) is currently incompatible with S3 Versioning. You can’t do both.

Other things to be aware of:

When you tell S3 to migrate files to Glacier, the only way to access them is through S3. There is no vault support, and you cannot use other Glacier tools to access your files. You don’t sign up for Glacier and you don’t get a separate bill for Glacier; S3 just takes care of it, and your S3 bill goes down.

When displaying files in S3 that have been moved to Glacier there will be an indication that the file is in Glacier. To access a file in Glacier you must use an S3 request to recover the file(s). In FileZoomer you right-click and choose “Initiate Recovery”. File recovery takes 3-5 hours. Once a file is recovered it is available for a limited time; in FileZoomer you specify the number of days to keep it available. To make the recovery permanent you must copy the recovered file to another file in the S3 bucket. If you don’t, the recovered file will be moved back to Glacier.
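The recovery flow above can be modeled in a few lines (a sketch, not FileZoomer’s actual code; the bucket and key in the comment are placeholders). With boto3, the dict would be passed to `restore_object`, and the completion check mirrors the `x-amz-restore` header S3 returns on a HEAD request:

```python
def build_restore_request(days):
    """'Initiate Recovery' is an S3 restore request: S3 keeps a readable
    copy of the Glacier-archived object for `days` days, after which the
    object reverts to Glacier-only storage."""
    if days < 1:
        raise ValueError("restore period must be at least one day")
    return {"Days": days}

def restore_complete(restore_header):
    """Interpret the x-amz-restore header from a HEAD request:
      None                                       -> no recovery initiated
      'ongoing-request="true"'                   -> still recovering (3-5 hours)
      'ongoing-request="false", expiry-date=...' -> ready to download"""
    return restore_header is not None and 'ongoing-request="false"' in restore_header

# With boto3 (bucket and key are placeholders):
#   s3.restore_object(Bucket="BACKUP", Key="sqldb/backup/db.dump",
#                     RestoreRequest=build_restore_request(7))
```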

Configuring Object Life Cycle

  • Navigate to the bucket and folder path for which you want to create a life cycle rule.
  • Select the File drop-down menu and then select the Object Life Cycle menu option.
The Object Life Cycle Configuration display will appear.

The display will show the current bucket and the current folder path in the bucket. It will also show the life cycle rules for the current bucket.

Save Rules saves the displayed rules for the current bucket. If any rules are added or deleted you must use the Save Rules button to finalize the changes.

To delete a rule, select the rule and click Delete Selected Rule. When you confirm with OK, the rule is removed from the rule display, but it has not yet been removed from the bucket. Use Save Rules to apply the displayed rules to the bucket.

Use Add Rule to create a new rule.

The Add Rule dialog will appear. Verify the bucket and path displayed are the bucket and path to which you want to apply the rule.

Name the rule using Rule ID. Use a meaningful description.

Check “Use Transition Rule” to migrate files to Glacier. Check “Use Expiration Rule” to automatically delete files. You can do both. For instance you could migrate files to Glacier after 90 days and then have them deleted automatically after 365 days.

Set the number of days after file upload to apply the transition and/or expiration rules.

You can also specify a date in the future when the migration or deletion will occur. The date must be in the format yyyy-mm-dd.
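The date handling can be sketched like this — illustrative names only, not FileZoomer’s code — showing the yyyy-mm-dd format check and the choice between a transition (Glacier) and an expiration (delete) rule:

```python
from datetime import datetime

def build_date_rule(rule_id, prefix, action, when):
    """Build a date-based rule. `when` must be yyyy-mm-dd; anything else
    raises ValueError, mirroring the dialog's format requirement."""
    date = datetime.strptime(when, "%Y-%m-%d")  # rejects e.g. 12/31/2013
    rule = {"ID": rule_id, "Prefix": prefix, "Status": "Enabled"}
    if action == "transition":    # migrate to Glacier on that date
        rule["Transitions"] = [{"Date": date, "StorageClass": "GLACIER"}]
    elif action == "expiration":  # delete on that date
        rule["Expiration"] = {"Date": date}
    else:
        raise ValueError("action must be 'transition' or 'expiration'")
    return rule
```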

Add Rule To List saves the new rule.

FileZoomer Batch Processing Automates Repetitive S3 Upload Tasks http://filezoomer.com/2012/12/filezoomer-batch-processing-automates-repetitive-s3-upload-tasks/ Thu, 20 Dec 2012 14:49:49 +0000 Steve

A major new feature of FileZoomer Beta 0.9 is “Batch Processing,” designed to facilitate and automate repetitive S3 transfer tasks.

We needed to upload web site backups and logs to S3 from Linux systems, and the only access from Linux to S3, at the time, was a Perl module that did not handle large files well. S3 access code from FileZoomer was reused to create a command-line Java jar file that could run using the scheduling system provided by the operating system. Being Java-based, it could be used on Windows and OS X systems as well.

Manually creating the configuration file was soon replaced by an interactive GUI component in FileZoomer itself. After that it was a small step to also add a “Run Batch” command to FileZoomer so that common, repetitive tasks could be easily initiated from the application itself, when that is easier or more useful than using a scheduler to run an actual batch job. The ability to do interactive configuration and easily initiate regular “housekeeping” tasks (like backing up new and changed files in a Documents folder) soon made it popular around the office for Windows and Mac use.

Now it’s available to FileZoomer users. It’s a powerful but somewhat complicated tool, so if you use it, take some time to understand it.

We’ll describe here how to get started using Batch Processing, using as our example the most obvious and common reason to use it — uploading and updating a folder to S3. And remember that it works with all the other new S3 features, most notably Object Life Cycle migration to Glacier for low-cost file archiving.

To start, choose “Batch Configuration” from the File menu:

Then highlight “Add New Configuration File” and click “OK”. Later, after you’ve created one or more configurations, you will see them listed for selection.

Next, specify all the important details about your configuration:

Notice the current bucket and S3 path are filled in for you. Browse to select where you want log files to be saved. Then “Add New Action” to specify what your batch configuration will accomplish:

This example shows an Upload to S3 of all files modified (or created) since the last time the job ran. This means the first time it runs, all the files in the selected path will be uploaded. We’ve chosen “Server Side Encryption” so the files are “encrypted at rest” at AWS. We want to process subfolders. We’re not going to compress on upload, which would cause files to be stored in zip format (if you are concerned about saving space and related charges, consider setting up a migration to Glacier). Uncompress on Download is checked so that if you used FileZoomer compression, the application will uncompress files on a later download.

Browse for the local path of the folder and files to be processed, and click “OK”.

Be sure to give the configuration a descriptive name, notice your new configuration is now listed, and BE SURE TO “SAVE CONFIG.”

Now you are ready to “Run Batch” from the File menu:

Select the desired configuration (there could be more than one) and click “OK”:

FileZoomer will tell you it’s “Running Batch Configuration”, but if you need details on what’s transpiring you need to look at the log file.

There are more options available for Batch Processing.

In addition to the “Run Batch” command, you can also run a batch on a schedule or from a command line.

FileZoomer Batch Processing Provides Powerful Options http://filezoomer.com/2012/12/filezoomer-batch-processing-provides-powerful-options/ Thu, 20 Dec 2012 14:48:50 +0000 Steve

An earlier post introduced the new FileZoomer Batch Processing feature and walked through the configuration steps needed to prepare to do a “Run Batch” using a single, but very common, batch processing option. Here we go into more detail on all the available options and what they offer:

When you add a new configuration file (File…Batch Configuration…Add New Configuration File) and then “Add New Action”, you see this dialog that contains all the options available:

Let’s go through all the options starting with Select Action:

  • Upload will upload files from the Local Path on the PC to the current S3 bucket and path.
  • Download will download files from the current S3 bucket and path to the Local Path on the PC.
  • Sync will synchronize the current S3 bucket and path and the Local Path on your PC. Sync will not process subfolders; it only processes files in the specified folder itself. Files that exist only on S3 will be copied to the PC. Files that exist only on the PC will be copied to S3. Files that exist in both places will be compared, with the newer version replacing the older. That means if you delete a file on your PC that is also on S3, it will be copied back from S3. Likewise, if you delete a file on S3 that is also on your PC, it will be copied back to S3. Sync will NOT resolve update conflicts. For instance, if an S3 file is updated from a source other than where FileZoomer Batch Processing is being used, and the same file is also updated on the local PC, then a FileZoomer Batch Sync will overwrite the older file. It will NOT merge changes to the file from multiple sources. Sync does “All Files” regardless of other settings.
  • Prune is a specialized action that offers a way to control the number of files in an S3 bucket and path. Unlike Object Life Cycle, which removes files (via Delete or Glacier migration) based on a number of days elapsed, Prune deletes files based on the “Prune File Count”, retaining the most recent Prune-File-Count files in the current S3 bucket and path and deleting the rest. Prune should usually not be used on a bucket with Versioning enabled, as the two features work at cross-purposes. Note that when doing a “Prune” action, other parameters are ignored as not relevant (Local Path, Process Subfolders, Encryption, Compression, All Files, Files Modified Today, Files Modified Since Last Run). If you don’t have a clear idea of why you need Prune, it’s best not to use it.
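The Prune selection rule described above can be modeled in a few lines (a sketch, not FileZoomer’s implementation; objects are (key, last-modified) pairs, as an S3 listing would supply):

```python
def prune_candidates(objects, prune_file_count):
    """Keep the newest `prune_file_count` objects; return the keys Prune
    would delete (everything older than the newest N)."""
    newest_first = sorted(objects, key=lambda obj: obj[1], reverse=True)
    return [key for key, _last_modified in newest_first[prune_file_count:]]
```

For example, with five dated backups and a Prune File Count of 3, the two oldest keys are returned for deletion, and nothing is deleted when fewer than three objects exist.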
Effect of All Files, Files Modified Today, and Files Modified Since Last Run:
  • The default is All Files, which means all files will be processed regardless of date and time stamps. On a Download this could mean, for example, that older files on S3 replace updated files on the local PC. This is always used for “Sync“, and ignored for “Prune“.
  • Files Modified Today uses the date on the PC for Upload. Not recommended for use with Download, and ignored for “Sync” and “Prune“.
  • Files Modified Since Last Run works the same as All Files for the first run. After that it looks at the date and time stamp on the files and only processes those files created or modified since the last run. IMPORTANT: on Windows and OS X, if you move an older file into the local path the date is not changed, and if it is a file created or modified before the Last Run date it will NOT be uploaded using this option.
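The selection logic above, including the moved-file caveat, can be modeled like this (files are (name, mtime) pairs; a sketch, not the actual implementation):

```python
def modified_since_last_run(files, last_run):
    """Select files whose modification time is later than the last run.
    Note the caveat above: a file moved into the folder keeps its
    original mtime, so an old file moved in after the last run is NOT
    selected, even though it is new to this folder."""
    return [name for name, mtime in files if mtime > last_run]
```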
Effect of Server Side Encryption, Compress on Upload, Process Sub Folders and Uncompress on Download:
  • Server Side Encryption, when checked, tells S3 to store the file encrypted (“encryption at rest”). Technically all this does is protect files in the event AWS discards or otherwise loses control of a drive containing your files without first destroying the files. Note that when files are being uploaded or downloaded using FileZoomer, SSL (encrypted transmission) is also used. Neither requires any key management on the user’s part.
  • Process Sub Folders, when checked, includes files in subfolders (and will create a subfolder as needed) when doing an Upload or a Download. This option is ignored for Sync and Prune.
  • Compress on Upload, if checked, zips files before uploading and marks the files as being “FileZoomer Zipped“. When paired with “Uncompress on Download”, any file zipped by FileZoomer will be unzipped. Other S3 clients will simply see the files as regular zip files. Files already in a compressed format (e.g. jpeg, png, mp3, and actual zip files) will not be compressed. This option makes uploads complete more quickly and saves a bit on S3 cost.
  • Uncompress on Download, when checked, will unzip any files marked on S3 as having been “FileZoomer Zipped“. Otherwise it has no effect. It’s a good idea to leave this checked.
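The compress-on-upload decision amounts to an extension check; a sketch of that logic (the extension list is illustrative, based on the formats the text names, and is not FileZoomer’s actual list):

```python
import os

# Formats that are already compressed -- zipping them again saves nothing.
# Illustrative set, based on the examples given in the text.
ALREADY_COMPRESSED = {".jpg", ".jpeg", ".png", ".mp3", ".zip"}

def should_zip(filename):
    """Compress on Upload applies only to files whose format is not
    inherently compressed; the check is case-insensitive."""
    extension = os.path.splitext(filename.lower())[1]
    return extension not in ALREADY_COMPRESSED
```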

Local Path is required (except for “Prune”) and identifies the folder on the PC to include in processing.

Once your configuration is created, you can use the “Run Batch” option, or set it up to run from a command line, or on a schedule.

Beta version 0.8 adds the ability to upload files added to a directory after a specific date and time http://filezoomer.com/2012/01/beta-version-0-8-adds-the-ability-to-upload-files-added-to-a-directory-since-a-specific-date/ Tue, 17 Jan 2012 21:02:00 +0000 Steve

The newest version of FileZoomer includes an option to upload only those files within a directory that have been updated after a specific date and time.  This allows you to keep a complete version of a local folder in your S3 account without the need to constantly re-upload everything.

As an example, let’s say I want to upload the “Documents” folder from my Mac. I’d open and log into FileZoomer, click the upload button, navigate to the “Documents” folder and select it to upload the entire folder (with sub-folders) into FileZoomer. Because my Documents folder is pretty large, it’s going to take a while to complete.

However, the next time I want to update the contents of my Documents folder, I’ll just log into FileZoomer and right-click the folder containing my Documents folder. That brings up the following menu:

Right-clicking the folder name brings up this menu

If I select the “Upload by Date” option I’ll see the following:

Upload by date dialog box

The date of the last upload to this folder is displayed so you can either use it or enter your own date.  Clicking OK will then launch an upload of every file that’s been updated after the date displayed in the dialog.

Check it out and let us know how you like it in the comments.

FileZoomer Beta 0.5 Released with File Compression Support http://filezoomer.com/2011/05/filezoomer-beta-0-5-released-with-file-compression-support/ Tue, 24 May 2011 14:08:27 +0000 Steve

The user community has spoken, and file compression, the feature you’ve requested most, has been added to FileZoomer.

The benefits of compressing files are probably obvious, but we’ll list them anyway. Compressing files reduces bandwidth costs, reduces transfer times, and reduces storage costs. FileZoomer is now not only considerably faster, but using it to manage your S3 accounts will actually save you money on your monthly Amazon S3 bills.

File compression is an account option that can be turned on and off as you wish. The default setting for file compression is off, so you’ll need to turn it on to see it work. To turn on file compression, just launch FileZoomer and log in to your S3 account, then click the “preferences” menu item. Click “Zip Compressible Files” and once the checkmark is there, all compressible files will be compressed during uploads, downloads, and while stored. You’ll see a new .zip extension added to each compressed file so that you can easily identify compressed files.

Downloading files that were compressed during upload with FileZoomer will automatically uncompress them.  If you choose to use another utility to download your files you’ll be able to unzip them using standard zip tools.

One of the things we use S3 for is to store log files, which are highly compressible.  In our testing of daily log file uploads we’re seeing transfer speed improvements of nearly 80% with a corresponding reduction in space used to store the files.

File types that are inherently already compressed, for instance jpeg, mp3, zip, and most video files, will not get compressed as they are not “compressible”.

Please let us know what you think of this feature and what other features you’d like to see added.

 

 

Amazon’s Cloud Drive uses S3 Storage, but it’s definitely NOT S3 http://filezoomer.com/2011/04/amazons-cloud-drive-uses-s3-storage-but-its-definitely-not-s3/ Mon, 11 Apr 2011 18:42:18 +0000 Steve

Chris Brogan wrote a scathing post about Amazon’s new Cloud Drive product that caught my eye.  He makes some excellent points about the closed nature of Cloud Drive and concludes that it just “needs to get better – please”.

Any file storage site that’s been developed by someone else has been built to meet the needs and vision of those who designed and developed it. But if you’ve used any of these services, you know they’re never done quite the way you’d do it, right?

That’s one of the great things about Amazon’s S3 service. It’s just storage, and the rest is up to you. Sure, it’s hard to use. Yes, the login credentials are overly complex. But it is secure – and it’s yours to use AS YOU PLEASE.

That’s the best thing about S3. It was designed to be used by anyone who wants to use it. They don’t make it easy for non-technicians to use; they just make it available.

The fact that they left it as basic as they did leaves an opportunity for people to develop applications like FileZoomer that make using Amazon S3 as simple as using a local drive on your computer.

If you try Cloud Drive and hate it (I tried it too, and wasn’t crazy about it), just keep on using S3. It’s not free, but it’s really cheap, and you have complete control over your files and can use whatever user interface is right for you.

And if you haven’t tried FileZoomer with your S3 account, go ahead and download it while it’s still in beta and use it for free.

FileZoomer is now Available for Beta Testing http://filezoomer.com/2011/03/filezoomer-is-now-available-for-beta-testing/ Mon, 21 Mar 2011 17:08:04 +0000 administrator

Beta testing for FileZoomer is officially open so now’s the time to get signed up and start using FileZoomer to manage your Amazon S3 account.

We’ve had a good response from Beta Testers so far, but we need more comments, so please, if you have an S3 account, download FileZoomer and let us know what you think.

Moving Files Between Buckets Using FileZoomer http://filezoomer.com/2011/03/moving-files-between-buckets-using-filezoomer/ Tue, 15 Mar 2011 15:04:53 +0000 administrator

FileZoomer provides an extremely easy way to move files between the different buckets of your account. To move files between buckets, just take the following steps:

  1. Highlight the file(s) you’d like to move
  2. Click Edit in the FileZoomer menu
  3. Select “Move to other S3 Folder”
  4. Change the current Bucket to the destination bucket you want to move the files to
  5. Navigate to the destination folder within that bucket
  6. Click Edit in the File Zoomer menu
  7. Select “Paste” and the files will be moved

Yeah – a seven-step process doesn’t seem easy, but these instructions are painfully detailed. Once you’ve done it once it completely makes sense, and you’ll be able to do it easily. The important thing to remember is to use the Edit button in the FileZoomer menu rather than right-clicking, which isn’t supported for moving files between folders.

It’s important to note that when the files are being moved they’re actually being moved within the S3 system so moving dozens of files can take a bit of time.
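Under the covers, an S3 “move” is a server-side copy followed by a delete, which is why moving dozens of files takes time. A sketch of the steps such a move performs (bucket and prefix names are placeholders; with boto3, each step maps to a `copy_object` followed by a `delete_object`):

```python
def plan_move(keys, source_bucket, dest_prefix):
    """Return the (copy_source, destination_key) pairs an S3 move
    performs: each object is copied server-side to the destination,
    then the original is deleted."""
    steps = []
    for key in keys:
        basename = key.rsplit("/", 1)[-1]
        dest_key = f"{dest_prefix.rstrip('/')}/{basename}" if dest_prefix else basename
        steps.append((f"{source_bucket}/{key}", dest_key))
    return steps

# With boto3, each step would run roughly as:
#   s3.copy_object(Bucket=dest_bucket, Key=dest_key,
#                  CopySource={"Bucket": source_bucket, "Key": key})
#   s3.delete_object(Bucket=source_bucket, Key=key)
```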

 

 

A logo winner – at least for now http://filezoomer.com/2011/02/a-logo-winner-at-least-for-now/ Mon, 28 Feb 2011 21:55:22 +0000 administrator

We have decided on a version of the third logo in the previous post, with a smaller image and no drop shadow as the “current” winner in the logo contest.  First prize is the satisfaction of a job well done.  Second prize is the same, just not quite as much satisfaction.  The current winner is now in use on the site.

We’re getting closer to a beta test release date for the first version of the product.  If you’ve got an Amazon S3 account and you’re willing to help us out with some testing please enter your email address on the home page and we’ll let you know when we’re ready to begin Zooming files to S3.
