LocalStack permissions prohibit S3 client from writing a file to the bucket

  amazon-web-services, boto3, docker, localstack, python

I run LocalStack in a Docker container (docker-compose.yaml):

localstack:
    image: localstack/localstack:latest
    ports:
      - '4563-4599:4563-4599'
      - '8055:8080'
    environment:
      - SERVICES=s3
      - DEBUG=1
      - DATA_DIR=/tmp/localstack/data
      - DEFAULT_REGION=us-east-1
    volumes:
      - './.localstack:/tmp/localstack'
      - '/var/run/docker.sock:/var/run/docker.sock'
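As a side note, with this port mapping both the legacy per-service S3 port (4572) and the newer single edge port (4566) are exposed to the host; which one is active depends on the LocalStack image version. Assuming the AWS CLI is installed, a quick sanity check against the running container might look like this (the port and the dummy key values are assumptions for a typical local setup):

```shell
# LocalStack accepts any non-empty credentials, but the CLI/boto3
# refuse to sign requests if none are set at all.
export AWS_ACCESS_KEY_ID=test
export AWS_SECRET_ACCESS_KEY=test
export AWS_DEFAULT_REGION=us-east-1

# Create and list the bucket against the LocalStack endpoint
# (use port 4572 instead if your image runs in legacy service mode).
aws --endpoint-url=http://localhost:4566 s3 mb s3://mybucket
aws --endpoint-url=http://localhost:4566 s3 ls
```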

and try to upload a file with this upload function:

import logging
import boto3
from botocore.exceptions import ClientError

# Module-level logger (missing in the original snippet, which made
# the except branch raise NameError instead of logging the failure)
logger = logging.getLogger(__name__)

class ClientS3(object):
    def __init__(self):
        self.session = boto3.session.Session()
        self.client = self.session.client('s3')

    def upload_file(self, filename, bucket, key=None):
        if not key:
            key = filename
        try:
            self.client.upload_file(
                Filename=filename,
                Bucket=bucket,
                Key=key
            )
        except ClientError as e:
            logger.error(f'Failed to upload file: {str(e)}')

and call it in the API:

def test_upload_file_to_s3(filename: str):
    s3cl = ClientS3()
    s3cl.upload_file(filename, "mybucket")

After running this code, LocalStack returns the error:

boto3.exceptions.S3UploadFailedError: Failed to upload testfile.txt to mybucket/testfile.txt: An error occurred (InvalidAccessKeyId) when calling the PutObject operation: The AWS Access Key Id you provided does not exist in our records.

What did I do wrong, and where should the proper credentials go in docker-compose?

Source: Docker Questions
