Downloading Images using getBlob/getContent and transferring to S3

I’ve tried both the getContent and getBlob endpoints to fetch an image stored in the repository. Both return a binary buffer that is not base64 encoded, and when I save it to a local file or to S3, the image is corrupt/unusable. I’ve used hexdump to inspect the binary buffer and it doesn’t match the bytes of the original image (local file).

My getBlob call to GitHub looks like this:

const { request } = require('@octokit/request')

const { data } = await request('GET /repos/:owner/:repo/git/blobs/:sha', {
  headers: {
    authorization: `token ${token}`,
    accept: 'application/vnd.github.v3.raw'
  },
  owner: owner,
  repo: repository,
  sha: fileSha
})

Less ideally, I also tried getContent:

getFetchTestImage: async function (config) {
  const { request } = require('@octokit/request')
  const opts = {
    owner: config.ownerName,
    repo: config.repository,
    ref: 'master',
    headers: {
      authorization: `token ${config.token}`,
      accept: 'application/vnd.github.v3.raw'
    }
  }

  const { data } = await request('GET /repos/:owner/:repo/contents/media/images/S3-Upload.png', opts)
  console.log(data)
  return data
},

Is there something I’m missing? Are image buffers from the GitHub API different, and do they need additional params or encoding/decoding work?

I’m using the following:

"@octokit/request": "^5.1.0",

UPDATE: using application/vnd.github.v3+json as my accept header works.

This snippet helped.

I now have a JSON shape with a base64-encoded content field. For my own edification: if I send accept: 'application/vnd.github.v3.raw', what is that binary, and is it usable?

Or should I always default to the JSON shape and decode the content blob myself?
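For reference, here’s the JSON-shape path I’m leaning toward: decode the base64 content field into a Buffer, then hand that Buffer to S3. This is only a sketch; the bucket/key names are made up, and the putObject usage assumes aws-sdk v2.

```javascript
// Build S3 upload params from the contents-API JSON payload.
// `githubJson` is the `data` returned with accept: application/vnd.github.v3+json;
// its `content` field is base64-encoded (the `encoding` field says so).
function toS3Params (githubJson, bucket, key) {
  return {
    Bucket: bucket,
    Key: key,
    Body: Buffer.from(githubJson.content, githubJson.encoding || 'base64'),
    ContentType: 'image/png'
  }
}

// Usage sketch (bucket name is hypothetical, aws-sdk v2 assumed):
//   const AWS = require('aws-sdk')
//   await new AWS.S3().putObject(toS3Params(data, 'my-bucket', 'S3-Upload.png')).promise()
```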

Cheers!