Cloud Resume Challenge on AWS
Hosting my full-stack resume on AWS

Introduction
I work as a Technical Manager (Hosting Services Group), and I came across the AWS Cloud Resume Challenge recently during a discussion with a friend.
Forrest Brazeal built this passionate community of cloud engineers looking to improve their skills and get more hands-on with real workloads. I joined the Cloud Resume Challenge Discord, read nearly every blog post, and went over every resume for inspiration and direction.
1. Certifications
I found this challenge after I had already passed two certifications.

The AWS certifications are intense and no easy feat, and a great learning experience for anyone trying to break into the cloud world. You will learn a ton about AWS from them, but nothing beats hands-on experience building projects. This is where the Cloud Resume Challenge comes in.
Let's Start the Challenge:
My Website (resume.vinothkannan-ramamurthi.pro) Architecture

2. Build Frontend (HTML / CSS)
I created my resume as an HTML webpage and used CSS for styling. I downloaded a template from the internet and modified it.
3. Host the Website and Configure the Domain
With my resume ready, I needed to configure the domain and DNS for both my blog and my resume website. I purchased the domain vinothkannan-ramamurthi.pro and routed everything through Route 53.

If you intend to use your domain with various services offered by AWS, then you need to use Route 53 and point your domain to AWS.
Create Hosted Zone:
- From AWS console, go to "Route 53" and select "Create Hosted Zone"
- Enter the domain name you want to point to AWS (it can also be a sub-domain). Select "Public hosted zone" and click "Create hosted zone"
- Once the hosted zone is created, copy the nameserver values.

Update the NS records at your domain registrar. I purchased my domain from Porkbun.
Log in to Porkbun, select the domain, click Edit NS (Nameservers), and paste in the nameserver values from the Route 53 hosted zone.
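If you prefer to script this step, the hosted zone can also be declared with CloudFormation. A minimal sketch (the logical ID ResumeHostedZone is just an illustration):

Resources:
  ResumeHostedZone:                       # hypothetical logical ID
    Type: AWS::Route53::HostedZone
    Properties:
      Name: vinothkannan-ramamurthi.pro   # the domain registered at Porkbun
Outputs:
  NameServers:
    # the same nameserver values you would otherwise copy from the console
    Value: !Join [', ', !GetAtt ResumeHostedZone.NameServers]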

4. Build Backend
Used API Gateway, Lambda, and DynamoDB to record the visitor count. A JavaScript function in the static webpage makes a GET call to a REST endpoint in API Gateway. The endpoint triggers a Lambda function that increments a counter stored in a DynamoDB table and returns the incremented value as the response to the API call. (You can use SAM for the API, but I created it manually first to test.)

Step 1: Create the DynamoDB table and its initial item.
- Table name: crc_vtable
- Partition key: vapp (String); the counter item uses the key value vapcount
- Attribute: lcount (Number), which holds the visit count
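The same table can also be declared in a CloudFormation/SAM template (see section 5). A minimal sketch, where the logical ID VisitorTable is an illustration; note that CloudFormation creates the table but not items, so the initial vapcount item (lcount = 0) still has to be put in separately:

Resources:
  VisitorTable:                      # hypothetical logical ID
    Type: AWS::DynamoDB::Table
    Properties:
      TableName: crc_vtable
      BillingMode: PAY_PER_REQUEST   # on-demand capacity
      AttributeDefinitions:
        - AttributeName: vapp
          AttributeType: S           # String partition key
      KeySchema:
        - AttributeName: vapp
          KeyType: HASH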
Step 2: Create the Lambda function. I used the Python code below. (Create a role with full access to DynamoDB and assign it to the Lambda function.)
import boto3

dynamodb = boto3.resource('dynamodb')
table = dynamodb.Table('crc_vtable')

def lambda_handler(event, context):
    # Read the current visit count from the table
    response = table.get_item(Key={'vapp': 'vapcount'})
    count = response['Item']['lcount']

    # Increment the count (DynamoDB returns numbers as Decimal)
    new_count = int(count) + 1

    # Write the incremented count back to the table
    table.update_item(
        Key={'vapp': 'vapcount'},
        UpdateExpression='set lcount = :c',
        ExpressionAttributeValues={':c': new_count},
        ReturnValues='UPDATED_NEW'
    )

    # Returned to the GET caller as the API response
    return {'count': new_count}
Step 3: API Gateway configuration. I used a REST API and created a GET method, pointed it to the Lambda function, enabled CORS, and deployed the API. Now I can call the API from JavaScript to update and pull the visitor count.
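I enabled CORS from the console, but for reference the equivalent setting in a SAM template (see section 5) is a few lines in the Globals section. A sketch, with the allowed origin left wide open as an assumption:

Globals:
  Api:
    Cors:
      AllowOrigin: "'*'"              # or lock this down to your own domain
      AllowMethods: "'GET,OPTIONS'"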
Step 4: JavaScript
// GET API request (JavaScript)
async function get_visitors() {
    // call the POST API request function (optional)
    // await post_visitor();
    try {
        let response = await fetch('https://apigateway-url.com/api_updatecount', {
            method: 'GET',
        });
        let data = await response.json();
        document.getElementById("visitorscount").innerHTML = data['count'] + " visits.";
        console.log(data);
        return data;
    } catch (err) {
        console.error(err);
    }
}

get_visitors();
Added the tag below to my HTML.
<p id="visitorscount"> </p>
5. Infrastructure as Code (IaC)
Now that I had everything working, it was time to use SAM to deploy the resources I needed for the project as infrastructure as code. This was my favorite part, because SAM makes it extremely easy to deploy your services as infrastructure as code (AWS CloudFormation behind the scenes).
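A minimal sketch of what such a SAM template can look like for this backend. The logical IDs, the python3.9 runtime, and the backend/ code path are illustrative assumptions, not my exact setup:

AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31
Resources:
  VisitorCountFunction:              # hypothetical logical ID
    Type: AWS::Serverless::Function
    Properties:
      Handler: app.lambda_handler    # assumes the Python code above lives in backend/app.py
      Runtime: python3.9
      CodeUri: backend/
      Policies:
        - DynamoDBCrudPolicy:        # SAM policy template scoped to one table
            TableName: crc_vtable
      Events:
        GetCount:
          Type: Api                  # creates the REST GET endpoint in API Gateway
          Properties:
            Path: /api_updatecount
            Method: get

With a template like this in place, sam build followed by sam deploy --guided stands up the whole backend.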
6. CI/CD (GitHub Repo / GitHub Actions)
Finally, with my website fully functioning, I used GitHub Actions to set up CI/CD. I spent some time reading GitHub's documentation to see how everything should be set up. Following their docs, I made my backend redeploy (using a handy SAM GitHub Action) once the tests passed; a sketch of such a workflow follows below. Next, and this is the most useful part, I set up my frontend GitHub repo to automatically push to S3! This is really useful if you are anything like me and like to change things often: I just make a change locally and GitHub Actions pushes it to S3.
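A minimal sketch of such a backend workflow (the test step is omitted, and the branch name and the us-east-1 region are assumptions). The official aws-actions/setup-sam action installs the SAM CLI on the runner:

name: Deploy Backend
on:
  push:
    branches:
      - master
jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - uses: aws-actions/setup-sam@v2       # installs the SAM CLI
      - uses: aws-actions/configure-aws-credentials@v2
        with:
          aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          aws-region: us-east-1              # assumed region
      - run: sam build
      - run: sam deploy --no-confirm-changeset --no-fail-on-empty-changeset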
Steps followed for the frontend repo: (You should create a service account in AWS IAM with programmatic access only and attach an S3 access policy.)
I installed Git on Windows. Downloaded Git from Git Windows.
Created a repo on GitHub. GitHub Signup
Once you are done with those two simple steps, you are ready to create your first public repo for your frontend code.

I launched Git Bash on my system to clone the empty repo and push the files through GitHub repo --> GitHub Action --> S3.
Used the command below to clone my empty repo crc_frontend_vino.
Example
#git clone https://github.com/vinothkannan-ramamurthi/crc_frontend_vino.git
- Copied my website's static content to the repo directory and executed the git commands below.
#mkdir .github
#mkdir .github/workflows
#touch .github/workflows/main.yml --> GitHub Action file to push the changes to S3
- Edit the main.yml file and add the content below. Make sure you log in to GitHub and add the secret parameters you created for the service account in AWS IAM.
GitHub --> Repo --> Settings --> Secrets (env parameters: AWS_S3_BUCKET: your bucket name, AWS_ACCESS_KEY_ID: the user's access key ID, and AWS_SECRET_ACCESS_KEY: the user's secret access key)
name: Upload Website
on:
  push:
    branches:
      - master
jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@master
      - uses: jakejarvis/s3-sync-action@master
        with:
          args: --acl public-read --follow-symlinks --delete
        env:
          AWS_S3_BUCKET: ${{ secrets.AWS_S3_BUCKET }}
          AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
          AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
- Add, commit, and push your files to the repo using the commands below. (Note: You should generate a personal access token: Settings --> Developer settings --> Personal access tokens.)
#git add -A
#git commit -a ( in the editor that opens, write a commit message and verify the new files are listed )
#git push origin master
Finally, you can see the changes flow through GitHub repo --> GitHub Action (workflow) --> S3 --> WWW.