This article was first published on the Aeeiee Blog.
Why AWS?
AWS has a ton of amazing services to help in the software development process: services that cache static assets, like CloudFront, and services that scale with the traffic to your application, like Elastic Beanstalk (which comes with a load balancer by default). At Aeeiee, we're huge fans of AWS. Our servers run on EC2 instances, and we even use AWS CodeCommit for version control. AWS provides high availability and has been the go-to cloud solutions provider for millions of developers for some time now.
In this article, we'll explore one of AWS's most popular services - S3. You'll learn how to use S3 to store and manage digital assets on your WordPress site.
Let's jump into it!
Creating an AWS account
Creating an AWS account is pretty straightforward. Follow the link here to sign up. You'll need a credit card to get started. Amazon uses it to verify your identity and keeps it on file for when they need to bill you. The steps in this tutorial are covered under the free tier. You can also delete the bucket you've created when you're done to ensure your credit card is not charged.
Getting your S3 Credentials
On your AWS Dashboard, search for S3 in the search bar.
Next, click the Create bucket button.
Give your bucket a name. We'll call ours aeeiee-test. You can leave the default region. Ours is currently set to EU (London) - EU West 2. You can leave the other options as they are.
To get your Access Key ID and Secret Access Key, click on your username in the top right of the dashboard.
Then, select the My Security Credentials option. Scroll down to the Access keys (Access Key ID and Secret Access Key) section and click to expand it. Click the Create New Access Key button to create a new access key. You'll be able to see your Secret Access Key here. You can download the file containing your keys to your computer, or simply copy and paste the keys somewhere secure for safekeeping. Once you exit this popup, you may be unable to retrieve your Secret Access Key again.
Setting up the AWS JS SDK for use in WordPress
Next, we're going to create a Plugin to handle our uploads. To avoid running into issues, we've found that the best way to load the AWS S3 SDK is to link to the version hosted by AWS rather than self-hosting a copy of it on our own servers.
In your plugins directory, create a new folder - aeeiee-s3-uploader. Inside this folder, create a new file. We'll call ours index.php. Also, create two additional files: aeeiee-s3.js and aeeiee-s3-views.php. The JavaScript file will hold all the JS code to handle uploading files to our S3 bucket while the aeeiee-s3-views.php file will handle displaying HTML content on the page.
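If you prefer the command line, the folder and files described above can be created like this (a sketch - adjust `wp-content/plugins` to the actual path of your WordPress install):

```shell
# Create the plugin folder and its three files
# (adjust wp-content/plugins to match your WordPress install's path)
mkdir -p wp-content/plugins/aeeiee-s3-uploader
touch wp-content/plugins/aeeiee-s3-uploader/index.php
touch wp-content/plugins/aeeiee-s3-uploader/aeeiee-s3.js
touch wp-content/plugins/aeeiee-s3-uploader/aeeiee-s3-views.php
```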
We will create a plugin file with the information below. This ensures that WordPress can correctly detect and load our plugin and make it available to us on the Plugins page.
<?php
/**
* Aeeiee S3 Plugin
*
* @author Aeeiee Inc.
*
* @wordpress-plugin
* Plugin Name: Aeeiee S3 Plugin
* Description: This plugin allows us to upload files to an Amazon S3 bucket.
* Author: Aeeiee Inc.
* Author URI: https://www.aeeiee.com
* Version: 1.0
* Requires PHP: 7.2
*/
Next, we enqueue the AWS JS SDK script using the link provided by AWS.
add_action('admin_enqueue_scripts', function () {
// loads the AWS SDK
wp_enqueue_script('aeeiee-aws-sdk', 'https://sdk.amazonaws.com/js/aws-sdk-2.828.0.min.js');
});
Walkthrough
With our scripts enqueued, we need to create a new page when our plugin is activated. When a user navigates to this page, they'll be presented with an upload form. Once they hit the upload button, the file they've selected will be uploaded to our S3 bucket.
Creating a Page and enqueuing Scripts
First, we'll create a new page when the plugin is activated, using the add_menu_page API provided by WordPress. See the code below.
We want to ensure that our JavaScript files are enqueued only on the pages where we need them, to avoid conflicts with other plugins. To do that, we use the $hook_suffix variable that's automatically passed to the admin_enqueue_scripts hook callback. The $hook_suffix tells us what page we're on, so we can load our JavaScript file only on the pages we want.
We also use the wp_localize_script API in WordPress to pass our AWS keys as variables from PHP to our JavaScript file.
NB: Replace YOUR ACCESS KEY ID and YOUR SECRET ACCESS KEY with the relevant values you obtained when you created your access keys.
add_action('admin_menu', 'aeeiee_s3_uploader');
function aeeiee_s3_uploader()
{
// add a new page to the menu
add_menu_page('Aeeiee S3 Page', 'Aeeiee S3 Uploader', 'manage_options', 'aeeiee-s3-uploader', 'aeeiee_s3_callback', 'dashicons-chart-pie');
// enqueue the JS scripts in the admin page
add_action('admin_enqueue_scripts', 'aeeiee_s3_enqueue_scripts');
}
function aeeiee_s3_enqueue_scripts($hook_suffix)
{
// if on the uploader page in the admin section, load the JS file
if ($hook_suffix === 'toplevel_page_aeeiee-s3-uploader') {
wp_enqueue_script('aeeiee-js', plugins_url('/aeeiee-s3.js', __FILE__));
$aws_vars = array(
'accessKeyId' => "YOUR ACCESS KEY ID",
'secretAccessKey' => "YOUR SECRET ACCESS KEY",
);
// pass AWS Keys from the server to the client
wp_localize_script('aeeiee-js', 'vars', $aws_vars);
}
}
With our JavaScript file enqueued, it's time to hook up our views. Still within the index.php file, add the following:
function aeeiee_s3_callback(){
include_once 'aeeiee-s3-views.php';
}
In the aeeiee-s3-views.php file, add the code to display the HTML on the frontend.
<section>
    <label for="file-uploader"><strong>Upload a file to our S3 Bucket!</strong></label>
    <div>
        <input type="file" id="file-uploader" />
        <button id="start-upload">Start Upload</button>
    </div>
    <div>
        <p class="message"> </p>
    </div>
    <ul id="s3-objects"></ul>
</section>
We've also added a paragraph tag with a class of message that we can use to inform users about the status of their upload, and an empty list with an id of s3-objects that we'll use later to display the objects in the bucket.
Uploading the Files to S3
Finally, for the main bit. We will write this part in jQuery. Head into your aeeiee-s3.js file. The first step is to initialize the SDK with our keys.
// initialize AWS SDK
var s3 = new AWS.S3({
accessKeyId: aws_vars.accessKeyId,
secretAccessKey: aws_vars.secretAccessKey,
region: 'eu-west-2'
});
const bucketName = "aeeiee-test";
Here's how the upload is going to work.
When a user selects a file, we will display a "Starting your file Upload to AWS S3...." message.
When the upload is done, we will once again inform the user by displaying a "File successfully uploaded to S3" message.
Here's the complete code for the JavaScript upload process.
jQuery(document).ready(function ($) {
const fileUploadInput = $("#file-uploader");
const messageSection = $(".message");
$("#start-upload").on("click", function () {
const file = fileUploadInput[0].files[0];
messageSection.html("Starting your file Upload to AWS S3....");
var upload = new AWS.S3.ManagedUpload({
service: s3,
params: {
Body: file,
Bucket: bucketName,
Key: file.name,
},
});
// start the upload
upload.send(function (err, data) {
if (err) {
console.log("Error", err.code, err.message);
alert("There was an error uploading the file, please try again");
} else {
messageSection.html("File successfully uploaded to S3");
}
});
});
});
Listing files in the bucket and downloading files
The AWS SDK provides high-level APIs that allow us to perform a number of actions on buckets. There's a listObjects API that allows us to list the objects in a bucket.
In our case, we will also be using the getSignedUrl API to generate URLs that expire in 2 minutes. These URLs will be attached to the objects from our bucket when displaying them on the frontend. This way, if a user clicks on a link within 2 minutes, the file (object) will be downloaded to their machine.
// Grab the list element on the page where we'll display the bucket's objects
const objectsSection = jQuery("#s3-objects");

// Call S3 to obtain a list of the objects in the bucket
s3.listObjects({ Bucket: bucketName }, function (err, data) {
if (err) {
console.log("Error", err);
} else {
console.log("Success", data.Contents);
data.Contents.map((content) => {
objectsSection.append(
`<li><a href="${getPresignedURL(content.Key)}">${
content.Key
}</a></li>`
);
});
}
});
function getPresignedURL(key) {
return s3.getSignedUrl("getObject", {
Bucket: bucketName,
Key: key,
Expires: 120,
});
}
Handling CORS errors
You'll most likely run into CORS errors when trying to connect to your AWS bucket from your local machine/server. To resolve this, go back to your AWS console and head to the Buckets section of the admin area. Click on the name of your bucket - in our case, we've called it aeeiee-test. Then click the Permissions tab and scroll down to the CORS section. Add the following JSON code.
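The exact rules will depend on your setup; a configuration matching the description would look something like this (note that S3 only accepts GET, PUT, POST, DELETE, and HEAD in AllowedMethods, and https://aeeieetests.local is our local dev host - substitute your own):

```json
[
    {
        "AllowedHeaders": ["*"],
        "AllowedMethods": ["GET", "POST", "PUT"],
        "AllowedOrigins": ["https://aeeieetests.local"],
        "ExposeHeaders": []
    }
]
```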
In our case, we're giving permissions to our local development server - https://aeeieetests.local - to read from our bucket (GET) and upload objects to our bucket (POST). You will need to change the AllowedOrigins value to use your own local host environment. Our dev server can also make PUT or PATCH requests. In production, you may want to alter the permissions given to users accessing your buckets from different environments.
You can read more on CORS in the AWS documentation here.
Security
AWS protects your bucket from unauthorised access by enforcing that buckets without public access must grant explicit permissions to the hosts/domains that access them.
On the other hand, with our current setup, our AWS credentials are exposed to the client and can be accessed from the browser console. AWS frowns on this. The way we solve this problem internally is by creating a temporary bucket using the process described in this article, and then moving the files from there to a permanent bucket using PHP on the server. This way, our credentials are never exposed on the client side. We'll explain the exact process for this in a future article.
Versioning
When the same object is uploaded to your bucket more than once, AWS will automatically overwrite the previous object in the bucket. To avoid this, you can ensure that your files have unique file names before starting the upload to your S3 bucket.
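One simple way to generate unique names is to prefix each key with a timestamp before starting the upload (a sketch - makeUniqueKey is our own hypothetical helper, not part of the SDK):

```javascript
// Hypothetical helper: prefix the file name with the current timestamp
// so repeated uploads of the same file get distinct keys in the bucket.
function makeUniqueKey(fileName) {
  return `${Date.now()}-${fileName}`;
}

// Then, in the ManagedUpload params, use it as the object key:
// Key: makeUniqueKey(file.name)
```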
The complete version of the code used in this example is available here.