Help with the AWS CLI

I've been trying to use the AWS CLI with GitHub Actions. I'm using this workflow https://github.com/actions/aws/tree/master/cli to run an `aws s3 sync` command. Does anyone have any working examples? Here's an example of what I've been trying:

jobs:
  library-update:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/aws/cli@master
      - with:
        - args: s3 sync location1 location2 --acl public-read
        - secrets:
          - AWS_ACCESS_KEY_ID
          - AWS_SECRET_ACCESS_KEY

Personally, I would recommend using a Python script instead of the AWS CLI action, since the latter is only supported on Linux, and its documentation apparently hasn't been fully updated to reflect the YAML syntax. In fact, the last commit removed the example YAML syntax.

I use the boto3 module for Python, which you can install like so:

    - name: Install python modules
      run: pip install boto3

I then follow up by running a script something like this:

    - name: Upload artifact
      working-directory: ${{ runner.workspace }}/build
      shell: bash
      env:
        AWS_ACCESS_KEY_ID: ${{ secrets.aws_access_key_id }}
        AWS_SECRET_ACCESS_KEY: ${{ secrets.aws_secret_access_key }}
        ARTIFACT_PATTERN: HighFidelity-Beta-*.exe
        # BUCKET_NAME and UPLOAD_PREFIX also need to be set here,
        # since the script below reads them from the environment
      run: python "$GITHUB_WORKSPACE/tools/ci-scripts/upload.py"

Specifying bash for the shell means you can use the $ syntax for environment variables regardless of platform.

Where the script contents look like this:

    import os
    import glob
    import boto3

    # Bucket and key prefix come from the workflow environment
    bucket_name = os.environ['BUCKET_NAME']
    upload_prefix = os.environ['UPLOAD_PREFIX']
    S3 = boto3.client('s3')

    # Find the build artifacts matching the pattern in the working directory
    path = os.path.join(os.getcwd(), os.environ['ARTIFACT_PATTERN'])
    files = glob.glob(path, recursive=False)
    for archive_file in files:
        file_path, file_name = os.path.split(archive_file)
        S3.upload_file(os.path.join(file_path, file_name), bucket_name,
                       upload_prefix + '/' + file_name)
This ends up being more portable, IMO, although on macOS builds you have to use pip3 and python3 instead of pip and python.

This does presuppose that you've done a checkout of some repository containing your script; otherwise you'd have to find a way to express the Python logic directly in the workflow. Also, it looks like boto3 doesn't have a sync command, so you might have to reproduce that logic yourself.
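Since boto3 has no built-in sync, a minimal sketch of the "skip unchanged files" part of that logic might look like this. This is a hypothetical helper, not boto3 API: it compares by file size only, and assumes you've already fetched the remote listing (e.g. with the client's `list_objects_v2`) into a plain dict:

```python
import os

def files_to_upload(local_dir, remote_sizes):
    """Return relative paths under local_dir that are missing from the
    remote listing or whose size differs -- a rough stand-in for the
    change detection `aws s3 sync` does.

    remote_sizes maps S3 key (path relative to the prefix) -> size in bytes.
    """
    changed = []
    for root, _dirs, names in os.walk(local_dir):
        for name in names:
            full = os.path.join(root, name)
            # Use forward slashes for S3 keys regardless of platform
            key = os.path.relpath(full, local_dir).replace(os.sep, "/")
            if remote_sizes.get(key) != os.path.getsize(full):
                changed.append(key)
    return sorted(changed)

# In the upload script above you would then do something like:
# for key in files_to_upload("build", remote_sizes):
#     S3.upload_file(os.path.join("build", key), bucket_name,
#                    upload_prefix + "/" + key)
```

A real sync would also compare timestamps or ETags and handle deletions, but size comparison covers the common "only upload what changed" case.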


AWS officially provides GitHub Actions:

https://github.com/aws-actions/configure-aws-credentials

  1. Get your AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY (on AWS, for an IAM user).

  2. Set your access key and secret on GitHub:

  • repository > Settings > Secrets
  • add a new secret
  • example names are AWS_ACCESS_KEY_ID_MAY and AWS_SECRET_ACCESS_KEY_MAY

  3. Use them in your workflow:

${{ secrets.YOUR_SECRETS_NAME }}

example:

    - name: Configure AWS Credentials
      uses: aws-actions/configure-aws-credentials@v1.0.1
      with:
        aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID_MAY }}
        aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY_MAY }}
        aws-region: ap-northeast-2  # aws seoul region
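Once the credentials step has run, later steps in the same job can call the AWS CLI directly (it comes preinstalled on GitHub-hosted runners), which covers the `s3 sync` from the original question — `location1` and `location2` here are placeholders for your actual source and destination:

```yaml
    - name: Sync to S3
      run: aws s3 sync location1 location2 --acl public-read
```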

@mohammed-salam, if it’s still relevant …

I’ve created a GitHub Action that installs the AWS CLI on a Linux runner, according to a given version, so you might find it useful - unfor19/install-aws-cli-action

This is how you use it -

- id: install-aws-cli
  uses: unfor19/install-aws-cli-action@v1
  with:
    version: 1
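With the CLI installed by the action above, a follow-up step can then run `aws` commands as usual — for example, a quick sanity check that the requested major version was installed:

```yaml
- name: Verify AWS CLI
  run: aws --version
```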