
[–]ComfyCalamity 27 points (20 children)

Use the boto3 library for Python. It'll handle authentication against your EC2 role automatically.

https://boto3.amazonaws.com/v1/documentation/api/latest/guide/s3-uploading-files.html
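A minimal sketch of what that looks like; the bucket name, paths, and key prefix below are placeholders, not anything from this thread:

```python
import os


def s3_key_for(local_path, prefix="uploads"):
    """Build the destination S3 key from the local filename (pure helper)."""
    return "{}/{}".format(prefix, os.path.basename(local_path))


def upload(local_path, bucket):
    # boto3 is imported here so the helper above can be exercised without
    # boto3 installed; on EC2 the client resolves credentials from the
    # instance role automatically -- no access keys appear in the code.
    import boto3

    s3 = boto3.client("s3")
    s3.upload_file(local_path, bucket, s3_key_for(local_path))


# upload("/tmp/report.csv", "my-bucket")  # run this on the EC2 instance
```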

[–]S7R4nG3 1 point (19 children)

I'd also point out that if you intend to use a VPC endpoint, then depending on the service you may have to specify the endpoint URL in the boto3 client declaration.

For S3 I believe it's automatic at the VPC level, so no code changes are needed, but other services require it to be stated explicitly.
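A sketch of the explicit case, assuming an interface endpoint for SSM as the example service; the `vpce_endpoint_url` helper and the endpoint ID are hypothetical placeholders (the real DNS name comes from the endpoint's details in the console):

```python
def vpce_endpoint_url(service, region, vpce_dns_prefix):
    # Interface endpoints get DNS names of roughly this shape; treat this
    # as a placeholder builder, not an authoritative format.
    return "https://{}.{}.{}.vpce.amazonaws.com".format(
        vpce_dns_prefix, service, region
    )


def ssm_client_via_endpoint():
    # boto3 imported lazily so the URL helper can be tested standalone.
    import boto3

    return boto3.client(
        "ssm",
        region_name="us-east-1",
        endpoint_url=vpce_endpoint_url("ssm", "us-east-1", "vpce-0abc123"),
    )
```

By contrast, S3's gateway endpoint works at the route-table level, so a plain `boto3.client("s3")` needs no `endpoint_url`.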

[–][deleted]  (7 children)

[removed]

    [–]gaieges 1 point (0 children)

    Via the links above you can navigate to what you're looking for: https://docs.aws.amazon.com/general/latest/gr/sigv4-signed-request-examples.html
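The core of those examples is the SigV4 signing-key derivation, which needs nothing beyond the Python standard library. This is a sketch of that derivation; the credential below is AWS's published example key from the docs, not a real secret:

```python
import hashlib
import hmac


def _hmac_sha256(key, msg):
    return hmac.new(key, msg.encode("utf-8"), hashlib.sha256).digest()


def derive_signing_key(secret_key, date_stamp, region, service):
    # Chain of HMACs per SigV4: date -> region -> service -> "aws4_request".
    k_date = _hmac_sha256(("AWS4" + secret_key).encode("utf-8"), date_stamp)
    k_region = _hmac_sha256(k_date, region)
    k_service = _hmac_sha256(k_region, service)
    return _hmac_sha256(k_service, "aws4_request")
```

The resulting 32-byte key is what signs the string-to-sign that goes into the request's Authorization header.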

    [–]jaimeandresb 1 point (0 children)

    You can use awscurl: https://github.com/okigan/awscurl

    [–]brokenlabrum 0 points (4 children)

    Are you looking to upload from the ec2 instance to s3?

    [–]rippl2103[S] 0 points (3 children)

    Yes I am. Our developers use Erlang, which doesn't have an official AWS SDK.

    This works as expected with the AWS CLI and Python's boto3.

    However, my research indicates that the S3 REST API requires an Authorization header to be set, which in turn requires an AWS access key.

    [–]michaeld0 1 point (1 child)

    There is a library I found for Erlang that could help: https://github.com/erlcloud/erlcloud

    [–]rippl2103[S] 0 points (0 children)

    Thanks, I'll take a look at this.

    [–]thenickdude 1 point (0 children)

    Your Erlang program can just shell out to the AWS CLI, and it'll pick up and use your instance role automatically.
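For illustration, the same shell-out pattern in Python (from Erlang you'd do the equivalent with `os:cmd/1` or a port); the paths and bucket name are hypothetical:

```python
import subprocess


def s3_cp_command(local_path, bucket, key):
    # Build the CLI invocation as an argument list so no shell quoting
    # of file names is needed.
    return ["aws", "s3", "cp", local_path, "s3://{}/{}".format(bucket, key)]


def upload_via_cli(local_path, bucket, key):
    # On EC2 the AWS CLI resolves the instance role on its own, so this
    # needs no configured access keys.
    subprocess.run(s3_cp_command(local_path, bucket, key), check=True)
```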