Ozan Guner
The Cloud Resume Challenge

About The Challenge

This project was completed in accordance with the Cloud Resume Challenge, which was created by @forrestbrazeal.

My Solution Repository Link

Contents:

1 - About My Journey
2 - Website Architecture Overview
3 - AWS Certification
4 - HTML & CSS
5 - Frontend
6 - Backend
7 - Continuous Integration / Continuous Deployment (CI/CD)
8 - Final Words

1 - About My Journey

Before the Cloud Resume Challenge, I had already built my personal website as part of the 100 Days of Code - Python challenge by Angela Yu. While doing the Python challenge, I discovered my passion for automation as I built bots such as an Instagram Follower Bot and a LinkedIn Automatic Job Application Bot. I then started thinking, "Where can I use my prior job experience and my skills in Python and networking to automate things?" That is when I started looking into DevOps, and consequently found out about this challenge.

2 - Website Architecture Overview

[Architecture diagram of the website's frontend and backend AWS resources]

3 - AWS Certification

When I was looking into what skills, tools, and certifications I needed to move from IT into the Cloud and DevOps space, people on the r/DevOps subreddit suggested I get the AWS Solutions Architect Associate certification, among many other things, including the Cloud Resume Challenge. I started studying for the certification without knowing it is the first requirement of the Cloud Resume Challenge, and earned it in March. When I decided to start the Cloud Resume Challenge, you can imagine how glad I was to cross off the first requirement right off the bat. 😄

4 - HTML & CSS

The next 2 requirements were to build a basic resume website using HTML & CSS. Luckily, as I previously mentioned, I had already built a fabulous resume website for myself, so I reused what I had for these steps. With that, I had knocked out the first 3 requirements within the first 5 minutes!

5 - Frontend

S3

For the next requirement, I created an S3 bucket to host my static resume website in the AWS Cloud, using S3's static website hosting feature. During creation I wrote the bucket policy so that any visitor viewing the website has public read access to it.
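For reference, a public-read bucket policy for this step looks roughly like the following; the bucket name is a placeholder, and your policy may differ in the `Sid` or scope:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "PublicReadGetObject",
      "Effect": "Allow",
      "Principal": "*",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::YOUR-BUCKET-NAME/*"
    }
  ]
}
```

The `/*` on the resource ARN is important: it grants read access to the objects in the bucket, not to the bucket itself.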

CloudFront

After the bucket was created, it was time to secure it. I created a new CloudFront distribution, added my S3 bucket as the CloudFront origin, provisioned an HTTPS certificate, and configured the necessary settings to get the distribution operational.
And voilà! However ugly my CloudFront distribution domain name was (sorry, Amazon, it's the truth!), I was able to reach my website through it.

Domain Name

The last item required for the frontend infrastructure was to get a domain name for the website, so Route 53 could route incoming traffic to a prettier domain name.

Even though I already had ozanguner.me as my website, I wanted a new domain name for this project: I had already paid for 2 years of hosting for my original website and wanted to keep using that service there. So I got a free domain name from Freenom, ozangunercloud.ga.

Route 53

I created a new hosted zone in Route 53 for the domain name I got from Freenom. Then I copied the nameservers from that hosted zone over to Freenom, so that any requests made to ozangunercloud.ga would be forwarded to the Amazon nameservers by Freenom.

However, I still had to define in the hosted zone where the Amazon nameservers should route requests coming to ozangunercloud.ga. For this purpose, I added a new A record pointing ozangunercloud.ga to the CloudFront distribution I created earlier. This way, Freenom delegates to Route 53, Route 53 resolves to CloudFront, and CloudFront serves from S3, completing the chain.
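As a sketch, that alias A record can be expressed as a Route 53 change batch like the one below. The distribution domain name is a placeholder; `Z2FDTNDATAQYW2` is the fixed hosted zone ID that all CloudFront alias targets use:

```json
{
  "Changes": [
    {
      "Action": "UPSERT",
      "ResourceRecordSet": {
        "Name": "ozangunercloud.ga",
        "Type": "A",
        "AliasTarget": {
          "HostedZoneId": "Z2FDTNDATAQYW2",
          "DNSName": "dxxxxxxxxxxxx.cloudfront.net",
          "EvaluateTargetHealth": false
        }
      }
    }
  ]
}
```

An alias record is Route 53's answer to the fact that a plain A record cannot point at a domain name, only at IP addresses.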

Testing

I finished the frontend part of the challenge pretty quickly, so when I tested for the first time, the nameserver settings had not taken effect yet. About an hour of waiting later, I could reach my static website through ozangunercloud.ga, and with that the frontend part of the challenge was complete!

6 - Backend

Javascript and API Gateway

The challenge suggests that you use Infrastructure as Code (IaC) to create your backend resources. However, as I was going through the requirements step by step, I created the backend resources through the AWS Console instead. It helped a lot to visualize what I was doing and why I was doing it. I started off by creating an API through the API Gateway console.

I connected API Gateway to a template Hello World Lambda function using the Lambda proxy integration. Then I moved on to the seventh requirement: JavaScript code to display the number of visitors on my website. This part was tricky, as I had minimal exposure to JavaScript, so I stopped here and went back to learn the fundamentals. After a couple of days of studying JS basics, I was able to come up with a JavaScript function that made calls to my Lambda function through API Gateway. At least, that is what I thought at the time.

Getting API Gateway, the Lambda function, and the JavaScript function to work harmoniously was easily the hardest part of the whole challenge. I spent a long time trying to figure out why the JavaScript code was showing "undefined" as the output instead of "Hello World". I tested from different ends to narrow the problem down to its source: if I got the "Hello World" output when I called the function's API endpoint directly in the browser, then something had to be wrong with the JavaScript code I wrote; if it was the other way around, API Gateway or the Lambda function had to be the source. After a full day of struggling, and by full day I mean 15 hours, I got my JavaScript function to display "Hello World"!
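A common culprit in this setup is the shape of the Lambda proxy response: with proxy integration, the function must return a `statusCode`, `headers`, and a string `body`, and the CORS headers must be present or the browser-side JavaScript sees nothing useful. A minimal sketch of a well-formed handler (the exact headers I used may have differed):

```python
import json

def lambda_handler(event, context):
    # With Lambda proxy integration, API Gateway passes the raw request
    # in `event` and expects exactly this response shape back.
    return {
        "statusCode": 200,
        "headers": {
            # Without a CORS header, the browser blocks the response and
            # the frontend fetch ends up logging `undefined`.
            "Access-Control-Allow-Origin": "*",
            "Content-Type": "application/json",
        },
        # The body must be a string, so serialize any JSON payload.
        "body": json.dumps({"message": "Hello World"}),
    }
```

Testing the endpoint directly in the browser bypasses CORS entirely, which is why the two test paths described above can disagree.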

DynamoDB and Lambda Function

Alright, now that I had a working JS function making calls to my Lambda function, it was time to build a DynamoDB table to store the number of visitors to my website. A new Lambda function would update the visitor count and return the data to the JS function I wrote earlier.

I created a basic DynamoDB table through the console that had only ID as its primary key, and I created a new item, "count". Now I needed to write a new Lambda function to read (GET) this item and update it under certain circumstances, namely when someone visits my website.

First, I gave my Hello World function full access to my DynamoDB table. Then I started reading the boto3 documentation (really helpful!) to figure out how to write my Lambda function. With the help of the Hello World function template and the documentation, I quickly figured out how to write the required function.

However, it took me quite some time to figure out why I could not write the number of visitors to the "count" item unless I created the key manually from the console first. Incrementing the value through the function worked perfectly if I manually created a visitor_count key in my DynamoDB table and set it to 0, but if there was no visitor_count key, the function errored out.

I struggled for about 2 hours before I figured out that I needed to use ADD together with a condition expression instead of SET.
After changing the expression, if there was no visitor_count key, the Lambda function created it.
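The heart of the fix is the ADD update expression, which creates the attribute when it is missing and atomically increments it otherwise. A sketch of what the update amounts to, assuming a table whose partition key is the string `ID` and taking the table handle as a parameter (the table name and key values are illustrative):

```python
from decimal import Decimal

def increment_visitor_count(table):
    # `table` is a boto3 DynamoDB Table resource, e.g.
    # boto3.resource("dynamodb").Table("visitor-counter").
    # ADD creates visitor_count (starting from 0) if the attribute is
    # missing and atomically adds :inc otherwise, unlike SET, which
    # fails when the referenced attribute does not exist yet.
    response = table.update_item(
        Key={"ID": "count"},
        UpdateExpression="ADD visitor_count :inc",
        ExpressionAttributeValues={":inc": Decimal(1)},
        ReturnValues="UPDATED_NEW",
    )
    # DynamoDB hands numbers back as Decimal; convert before serializing.
    return int(response["Attributes"]["visitor_count"])
```

Passing the table in as a parameter also makes the update logic easy to exercise with a stub table, without AWS credentials.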

7 - CI/CD

The CI/CD part is where the actual automation takes place. What I had done so far only got my website deployed on AWS resources; for any future modification or recreation of the stack, CI/CD is necessary.

Creating a Test Function

Before building the CI/CD pipeline for the stack, I had to write a test function to make sure API Gateway and the Lambda function were working properly. This is necessary because even though you may have a working stack at the beginning, as you work on it and make changes, something may break, and you do not want your CI/CD pipeline to push something broken to production.

For this reason I wrote a Python test function. It makes a GET request to the API endpoint of my Lambda function and, once it receives a response, checks that the required headers are in the response, along with the status code and the response type. I also wanted to test the increment functionality, so within the test function I made 2 separate calls to the API and compared the visitor_count values in the bodies. If the second value was 1 higher than the first, the test passed.
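A sketch of such a smoke test, using only the standard library; the endpoint URL is a placeholder, and the author's actual test function may be structured differently:

```python
import json
import urllib.request

# Placeholder endpoint; substitute your own API Gateway invoke URL.
API_URL = "https://example.execute-api.us-east-1.amazonaws.com/Prod/count"

def fetch_count(url=API_URL):
    """Make one GET call and return (status, headers, parsed JSON body)."""
    with urllib.request.urlopen(url) as resp:
        body = json.loads(resp.read().decode())
        return resp.status, dict(resp.headers), body

def check_response(status, headers, body):
    """Structural checks: status code, CORS header, payload shape."""
    assert status == 200
    assert "Access-Control-Allow-Origin" in headers
    assert isinstance(body["visitor_count"], int)

def check_increment(first_body, second_body):
    """Two consecutive calls should differ by exactly one visitor."""
    assert second_body["visitor_count"] == first_body["visitor_count"] + 1
```

Splitting the assertions out from the network call keeps the checks themselves testable without hitting the live API.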

Building the CI/CD Pipeline

Before creating the CI/CD pipeline, I translated my backend resources (the DynamoDB table, API Gateway, and the Lambda function) into a SAM template, otherwise known as infrastructure as code. Then I uploaded everything to GitHub and started building the pipeline with GitHub Actions. This was my first experience with GitHub Actions, but it was pretty straightforward. YAML is so descriptive that, even though a workflow file may look like code at first glance, it reads almost like normal sentences.

I found a workflow template I could use and made small tweaks to it.

I defined 2 Jobs in GitHub Actions workflow:

  1. Run the test function.
  2. If the first job succeeds, deploy the stack and update the S3 bucket with the most recent website files.
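The two jobs above can be sketched as a workflow like this; the job names, file paths, and deploy commands are illustrative, and `needs:` is what makes the deploy conditional on the tests passing:

```yaml
name: deploy-resume
on:
  push:
    branches: [main]

jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - run: pip install pytest && pytest tests/
  deploy:
    needs: test          # runs only if the test job succeeds
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - run: sam build && sam deploy --no-confirm-changeset
      - run: aws s3 sync ./website s3://YOUR-BUCKET-NAME
```

GitHub Actions skips the `deploy` job automatically when any job listed in `needs:` fails, which is exactly the conditional deployment behavior described below.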

After those small tweaks, the CI/CD pipeline was complete. To make sure everything worked as expected, I intentionally made changes that would cause the test function to fail, to verify that my conditional deployment in GitHub Actions was taking effect.

After the test I confirmed that it was all working as expected, and with that the Cloud Resume Challenge was complete!

8 - Final Words

This challenge is definitely a worthwhile time investment if you are looking to gain more Cloud experience. However, it is for audacious, perseverant, and resourceful individuals who are willing to go the extra mile to complete it. There were many times when I spent hours troubleshooting something that was not working, without any results. You have to persevere through those moments until you get it done.

In the end, the struggle taught me a lot about CORS, boto3, reading JS error messages in the browser, and much more. I got to use many of the services I had learned about while studying for the certification.

I want to express my deepest gratitude to @forrestbrazeal for creating this challenge and to @loujaybee for creating a video series to help people taking on this challenge, which allowed me to keep my sanity 😄.

So what is next for me? Well, if you are in the Cloud business, you know there is always more to learn. I am planning to get more hands-on experience with Linux and Terraform, and I am definitely planning to do more Cloud projects.

Thank you for reading. Feel free to check out my resume website:
ozangunercloud.ga
