
AWS S3 & PHP & Gaufrette

As your application grows, there comes a time when you have to share your data between more than one app server. There are lots of tools that let you sync or replicate data between multiple servers, but nowadays it is usually a better choice to store your data outside of your “app server”.

In order to achieve this, you should use a storage service. It can be an FTP server or an external service with its own API, like Amazon Web Services Simple Storage Service (better known as S3), Azure Blob Storage, Rackspace Cloud Files or any of the many other solutions available.

You can compare these services and decide which one fits your situation best. One of my concerns is to avoid vendor lock-in on one of the most critical parts of any app: its data. We can solve that by using a “filesystem abstraction layer” while we develop our app. We don’t even need to use any of those vendors at the beginning; we can just use a local folder adapter and, when our app grows, switch our storage to one of the services mentioned above.

Now I’m going to introduce you to a magical package called Gaufrette that implements the filesystem abstraction layer and the adapters to work with the major storage services.

First of all, we need to add these Composer packages to our project.
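A minimal require block for composer.json; the version constraints are illustrative (adjust them to the versions current when you install):

```json
{
    "require": {
        "knplabs/gaufrette": "~0.1",
        "aws/aws-sdk-php": "~2.7"
    }
}
```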

Then install them with composer update.

We need an AWS user and keys to connect to S3. For this, we should log in to the AWS console and go to the IAM service.

https://console.aws.amazon.com/iam/home?region=eu-west-1#groups

  1. Create a user
  2. Attach the AmazonS3FullAccess policy to it
  3. Store both keys in a secure place

Once we have our user and keys, we just need to create an S3 bucket to store our files, in the AWS console for S3. At this step you can also use the S3 web interface to upload some files for testing.

Listing all files and dirs

To list all files, you just need to pass your credentials to the S3Client factory, pass the client to the AwsS3 adapter along with the bucket you want to connect to, and decorate that adapter with the Filesystem.

Then you can retrieve all the keys stored in that bucket.
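A minimal sketch of the steps above (bucket name and credentials are placeholders; this assumes the SDK v2 S3Client::factory API the post refers to):

```php
<?php

require __DIR__ . '/vendor/autoload.php';

use Aws\S3\S3Client;
use Gaufrette\Adapter\AwsS3;
use Gaufrette\Filesystem;

// Build the S3 client from our IAM user's keys.
$client = S3Client::factory(array(
    'key'    => 'YOUR_AWS_KEY',
    'secret' => 'YOUR_AWS_SECRET',
));

// Point the adapter at our bucket and decorate it with the Filesystem.
$adapter    = new AwsS3($client, 'my-bucket-name');
$filesystem = new Filesystem($adapter);

// Retrieve every key stored in the bucket.
foreach ($filesystem->keys() as $key) {
    echo $key, PHP_EOL;
}
```

Because all the later examples talk to the same filesystem, they assume a `$filesystem` built like this one.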

We can see the files we have in our bucket.

Listing directory files

As we can see in the previous example, we get the list of all files stored in our bucket. If we have a huge number of files, we can end up managing really long lists. The solution I found for that is to sandbox our adapter to a directory before creating our filesystem.
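A sketch of that sandboxing, using the 'directory' option the AwsS3 adapter accepts (the "images" directory and bucket name are placeholders; $client is the S3Client from the listing example):

```php
<?php

use Gaufrette\Adapter\AwsS3;
use Gaufrette\Filesystem;

// Restrict the adapter to the images/ directory of the bucket,
// so the filesystem only ever sees keys under that prefix.
$adapter    = new AwsS3($client, 'my-bucket-name', array(
    'directory' => 'images',
));
$filesystem = new Filesystem($adapter);

// keys() now only returns the keys that live under images/.
$keys = $filesystem->keys();
```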

Uploading a file

We know how to list the files in a folder; now we want to upload a file there. The filesystem has a method called write with the following parameters.

The $key parameter is the filename.
The $content parameter is the file content.
The $overwrite parameter is a flag that allows us to overwrite the file. If the file already exists and we set $overwrite to false, we will get an exception.

Here is an example that uploads a PNG image.
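A sketch of the upload (logo.png is a placeholder filename; $filesystem is built as in the listing example):

```php
<?php

// Read the local PNG into memory.
$content = file_get_contents(__DIR__ . '/logo.png');

// Write it to the bucket under the key "logo.png".
// The third argument set to true allows overwriting an existing file;
// with false, writing over an existing key throws an exception.
$bytesWritten = $filesystem->write('logo.png', $content, true);

echo sprintf('Wrote %d bytes', $bytesWritten), PHP_EOL;
```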

Reading a file

To read a file, we just need to call the read method. If the file doesn’t exist it returns null; otherwise it returns the file contents.
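A sketch of the read (note that, depending on the Gaufrette version, reading a missing key may instead throw an exception, so guarding with has() is the safer pattern):

```php
<?php

// Check for the key first to avoid surprises across Gaufrette versions.
if ($filesystem->has('logo.png')) {
    $content = $filesystem->read('logo.png');
    file_put_contents(__DIR__ . '/logo-copy.png', $content);
} else {
    echo 'logo.png does not exist in the bucket', PHP_EOL;
}
```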

“Debugging” / Inspecting

During the previous examples I ran into a couple of issues, and I needed to debug or inspect the requests I sent to and received from S3 to figure out what was going on. To achieve that, I installed an HTTP proxy called Fiddler (http://www.telerik.com/fiddler) that lets me inspect any HTTP connection, much like the Chrome inspector.

Then I just forced the S3 client to use HTTP instead of HTTPS and to go through my local proxy.

Reading a file – Issues

HTTP/1.1 404 Not Found

The first issue I had was reading files that actually exist in S3 but getting a 404 Not Found error. I solved it by adding the region to the config in the S3Client factory.

Add 'region' => 'eu-west-1' // in your S3Client::factory config

HTTP/1.1 403 Forbidden

The second issue I had was a 403 Forbidden error when reading files. I solved it by attaching the AmazonS3FullAccess policy to the user I use to connect to S3, in the AWS IAM service (https://console.aws.amazon.com/iam/).

Now you have the basic examples of how to connect to the AWS S3 service. If you have any questions or issues, please leave a comment.
