Deploy AWS Lambda Function using S3 Bucket and GitHub actions

Lambda Functions, Node.js, and GitHub Actions series

AWS Lambda accepts a zip file as a deployment package. The file includes your application code and its dependencies, subject to size limits. For example, if your deployment package is larger than 50 MB, you must upload the code to an Amazon S3 bucket instead of uploading it directly.

In this article we are going to learn how to create pipelines in GitHub, upload files to S3, and deploy the application code to a Lambda function. If you are new to AWS or GitHub itself, refer first to my previous posts:

  1. Lambda functions challenge

  2. NodeJS Express and AWS Lambda Functions

For this article, you will need:

  • One AWS credential with permissions to deploy the Lambda function and read S3 objects.

  • One AWS credential with permissions to create objects in S3.

  • A GitHub repository (we need GitHub Actions)

AWS Environment

Permissions

We need two users, each with specific permissions and policies for a different purpose: one to interact with the S3 service and the other to update the Lambda function. Both are necessary for using the AWS CLI in our GitHub Actions workflows.

The first user needs permission to create objects in the S3 service, with the following policies:
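As a sketch, a minimal inline policy for this user could look like the following. This is an assumption based on what the "aws s3 cp" step below needs, not the exact policy from the original setup; replace "YOUR_BUCKET_NAME" with your bucket name:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowUploadToBucket",
      "Effect": "Allow",
      "Action": ["s3:PutObject"],
      "Resource": "arn:aws:s3:::YOUR_BUCKET_NAME/*"
    }
  ]
}
```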

The second user needs permission to update our Lambda function's source code, with the following policies:
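Again as a hedged sketch, a minimal policy for this user could look like the one below. The "s3:GetObject" statement reflects the "read S3 objects" requirement mentioned earlier, since "aws lambda update-function-code" pulls the package from the bucket; the "YOUR_*" placeholders are assumptions to fill in with your own values:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowLambdaCodeUpdate",
      "Effect": "Allow",
      "Action": ["lambda:UpdateFunctionCode"],
      "Resource": "arn:aws:lambda:YOUR_REGION:YOUR_ACCOUNT_ID:function:YOUR_FUNCTION_NAME"
    },
    {
      "Sid": "AllowReadPackageFromBucket",
      "Effect": "Allow",
      "Action": ["s3:GetObject"],
      "Resource": "arn:aws:s3:::YOUR_BUCKET_NAME/*"
    }
  ]
}
```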

Lambda Function and ApiGateway

We need a Lambda function with an API Gateway attached and running:

*Copy the API endpoint. We will need it later.*

Change the lambda function handler

We should change the Lambda function's runtime settings to point at our custom handler (in our case, "app/index.handler"):

GitHub Repository Secrets

After setting up the AWS environment, create the Actions secrets for the AWS security credentials in the GitHub repository (Repository Settings -> Secrets and Variables -> Actions -> New repository secret):

For the Lambda function:

  • AWS_LAMBDA_FUNCTION_NAME: Lambda Function name

  • AWS_LAMBDA_REGION: Lambda Function region

  • AWS_LAMBDA_USER_ACCESS_KEY_ID: Access key id for Lambda Function user credential

  • AWS_LAMBDA_USER_SECRET_ACCESS_KEY: Secret access key for Lambda Function user credential

For the S3 bucket:

  • AWS_S3_BUCKET_NAME: S3 bucket name

  • AWS_S3_BUCKET_REGION: S3 bucket region

  • AWS_S3_BUCKET_USER_AWS_ACCESS_KEY_ID: Access key id for S3 user credential

  • AWS_S3_BUCKET_USER_AWS_SECRET_ACCESS_KEY: Secret access key for S3 user credential

Workflows

A workflow is a collection of job definitions that are executed concurrently and/or sequentially. A job has several steps, which are executed sequentially.

The project must have a folder named ".github/workflows". This folder should contain all the workflow files that you need.

In our case, we will have two workflows:

Workflow 1: Build and Upload

The "Build and Upload" workflow is configured to run when the push event is triggered. It has one job with a set of five steps executed sequentially. The steps are described below:

  1. Generate Tag Version: Generates a unique string with the format "yyyymmdd-hhmmss" e.g. 20230216-130920. The step output is an input for the "Create Zip File" and "Create Git Tag" steps.

  2. Create Zip File: Creates a package zip file named "yyyymmdd-hhmmss.zip" with the project source files e.g. 20230216-130920.zip

  3. Configure AWS Credentials: This step requires the bucket user's secrets to configure the credentials for the AWS CLI. Ref: GitHub action documentation

  4. Upload to S3 Bucket: Takes the zip file and uploads it to the S3 bucket using the "aws s3 cp" command. Ref: Command documentation

  5. Create Git Tag: Creates a git tag named "yyyymmdd-hhmmss" e.g. 20230216-130920
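The tag-generation step can be sketched locally as a quick sanity check. This is a slight variation on the workflow's command: it uses a single date call and the -u flag (an assumption on my part, to pin the timestamp to UTC), while the workflow uses the runner's local time:

```shell
# Generate a tag in the same "yyyymmdd-hhmmss" format the workflow uses.
TAG=$(date -u +%Y%m%d-%H%M%S)
echo "Tag Version: $TAG"   # e.g. 20230216-130920
```

Because the tag doubles as the zip file name and the git tag, the second-level precision keeps consecutive pushes from colliding.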

Workflow 2: Deploy to Lambda Function

The "Deploy to Lambda Function" workflow is configured to run on the workflow_dispatch event using custom-defined inputs.

It has one job with a set of four steps executed sequentially. The steps are described below:

  1. Check Git Tag: The custom input named "version" expects a value equal to an existing tag in the repo. This step makes a request to the GitHub REST API and checks that the tag exists. Ref.: GitHub action documentation

  2. Configure AWS Credentials: This step requires the Lambda user's secrets to configure the credentials for the AWS CLI. Ref.: GitHub action documentation

  3. Check S3 Bucket: This step executes the "aws s3api wait object-exists" command to check that an object named "<version>.zip" exists in the S3 bucket. Ref: Command documentation

  4. Update Source Code: This step executes the "aws lambda update-function-code" command to update the Lambda source code with the object from the S3 bucket. Ref: Command documentation

Project files

In your project create the following files and folders:

.github/workflows/build.yml: Workflow for building and uploading our code to the S3 bucket.

.github/workflows/deploy.yml: Workflow for updating the Lambda function with our code from the S3 bucket.

app/index.js: Lambda Function Handler.

README.md: Readme :)

Let's put some code into the files:

app/index.js:

exports.handler = async (event) => {
  const response = {
    statusCode: 200,
    body: JSON.stringify("AWS Lambda and S3 Bucket")
  };
  return response;
}
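Before wiring up the pipeline, you can smoke-test the handler's response shape locally with plain Node.js, no AWS involved. The snippet below inlines a copy of the handler so it runs standalone; in the real project you would require it from app/index.js:

```javascript
// Inlined copy of the handler from app/index.js, for a standalone local check.
const handler = async (event) => {
  const response = {
    statusCode: 200,
    body: JSON.stringify("AWS Lambda and S3 Bucket")
  };
  return response;
};

// Invoke it the way Lambda would, with an (empty) event object.
handler({}).then((res) => {
  console.log(res.statusCode);       // 200
  console.log(JSON.parse(res.body)); // AWS Lambda and S3 Bucket
});
```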

build.yml:

name: 'Build and upload to S3 Bucket'

on:
  push:
    branches:
      - master

jobs:
  Build_and_Upload:
    permissions:
      actions: write
      contents: write
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - name: 'Generate Tag Version'
        id: generate_tag_version
        run: |
          TAG=$(date +%Y%m%d)-$(date +%H%M%S)
          echo "tag=$TAG" >> $GITHUB_OUTPUT
          echo "Tag Version: $TAG" >> $GITHUB_STEP_SUMMARY

      - name: 'Create Zip File'
        id: 'create_package'
        run: |
          PACKAGE_NAME="${{ steps.generate_tag_version.outputs.tag }}.zip"
          echo "package_name=$PACKAGE_NAME" >> $GITHUB_OUTPUT
          zip -r $PACKAGE_NAME .


      - name: 'Configure AWS Credentials'
        uses: aws-actions/configure-aws-credentials@v1-node16
        with:
          aws-access-key-id: ${{ secrets.AWS_S3_BUCKET_USER_AWS_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.AWS_S3_BUCKET_USER_AWS_SECRET_ACCESS_KEY }}
          aws-region: ${{ secrets.AWS_S3_BUCKET_REGION }}


      - name: 'Upload to S3 Bucket'
        run: aws s3 cp ${{ steps.create_package.outputs.package_name }} s3://${{ secrets.AWS_S3_BUCKET_NAME }}


      - uses: actions/github-script@v6
        name: 'Create Git Tag'
        with:
          script: |
            github.rest.git.createRef({
              owner: context.repo.owner,
              repo: context.repo.repo,
              ref: 'refs/tags/${{ steps.generate_tag_version.outputs.tag }}',
              sha: context.sha
            })

deploy.yml:

name: 'Deploy to Lambda Function'

on:
  workflow_dispatch:
    inputs:
      version:
        description: 'Repo Tag Name (Ex.: yyyymmdd-hhmmss)'
        type: string
        required: true

jobs:
  Deploy:
    runs-on: ubuntu-latest
    timeout-minutes: 10
    steps:
      - uses: actions/github-script@v6
        name: 'Check Git Tag'
        with:
          script: |
            const refResponse = await github.rest.git.getRef({
              owner: context.repo.owner,
              repo: context.repo.repo,
              ref: 'tags/${{ inputs.version }}'
            });

      - name: 'Configure AWS Credentials'
        uses: aws-actions/configure-aws-credentials@v1-node16
        with:
          aws-access-key-id: ${{ secrets.AWS_LAMBDA_USER_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.AWS_LAMBDA_USER_SECRET_ACCESS_KEY }}
          aws-region: ${{ secrets.AWS_LAMBDA_REGION }}

      - name: 'Check S3 Bucket'
        run: |
          aws s3api wait object-exists \
            --bucket ${{ secrets.AWS_S3_BUCKET_NAME }} \
            --key ${{ inputs.version }}.zip \
            --cli-read-timeout 60 \
            --cli-connect-timeout 60

      - name: 'Update Source Code'
        run: |
          echo "Deploy Version: ${{ inputs.version }}" >> $GITHUB_STEP_SUMMARY
          echo "Update Lambda Function with ${{ inputs.version }}.zip file" >> $GITHUB_STEP_SUMMARY
          aws lambda update-function-code \
            --function-name=${{ secrets.AWS_LAMBDA_FUNCTION_NAME }} \
            --s3-bucket=${{ secrets.AWS_S3_BUCKET_NAME }} \
            --s3-key=${{ inputs.version }}.zip \
            --cli-read-timeout 60 \
            --cli-connect-timeout 60

Commit and push the changes to the repo. After that, go to the Actions tab in the repo and check that the workflow is running:

Now, visit the tag list in your repo. It should have a new tag like this:

Do the same with the S3 bucket: go to the AWS Console -> S3 and check that the bucket has a new object:

Notice that the repo tag and the object in the bucket have the same name, in this case "20230214-051405" (copy your tag name; we will need it soon).

Next, we are going to deploy our source code to the Lambda function. Go to the Actions tab, select the "Deploy to Lambda Function" workflow, run it manually, and fill in the form with the tag name value:

In your browser, visit the API Gateway endpoint. You should see this message:

Now it is time to create a new project version. Edit the app/index.js file and change the response message:

exports.handler = async (event) => {
  const response = {
    statusCode: 200,
    body: JSON.stringify("AWS Lambda and S3 Bucket - New version")
  };
  return response;
}

Commit and push the changes to the repo. Wait until the workflow finishes, then visit the tag list page. The repository should have a new tag, and the S3 bucket a new object.

Finally, run the "Deploy to Lambda Function" workflow with your new version and check the Lambda function. It will respond with the new message:

Congratulations! We now have workflows that create packages and deploy the Lambda function, integrated with the AWS S3 service 👏 👏.

Conclusion

You can get the source files from my GitHub repository.

With these workflows you can create versions and store the package file in the S3 bucket for every version. You can also deploy any version of your project at any time, which is especially useful for testing versions or rolling back... :)

Advice: Be careful when you upload files to the S3 bucket. It can get expensive, because it is easy to accumulate a lot of versions in the bucket: every new version involves uploading a new zip file. Don't forget to remove old versions and keep your storage up to date.

References

https://docs.aws.amazon.com/cli/latest/reference/s3api/wait/object-exists.html

https://docs.aws.amazon.com/cli/latest/reference/s3/

https://docs.github.com/en/actions/managing-workflow-runs/manually-running-a-workflow

https://docs.aws.amazon.com/lambda/latest/dg/gettingstarted-package.html

https://docs.github.com/en/actions/learn-github-actions/understanding-github-actions

Cover Image Background by wirestock on Freepik
