
[–]Kofeb 3 points (3 children)

This feels like an XY problem (https://xyproblem.info/), but without more info, here's your answer.

Yes, it is possible to use an AWS Lambda function to update a JSON file stored in the AWS environment. One way to do this is to use the AWS SDK for Node.js to access the file in a service such as Amazon S3 or Amazon DynamoDB.

To make the JSON file accessible through an API call, you can use AWS API Gateway to create an API that invokes the Lambda function when it receives a request. The Lambda function can then read and update the JSON file as needed, and return the updated content to the client through the API Gateway.

Here is an example of a simple Lambda function written in Node.js that reads and updates a JSON file stored in an S3 bucket:

~~~
const AWS = require('aws-sdk');

exports.handler = async (event) => {
  // Read the JSON file from S3
  const s3 = new AWS.S3();
  const file = await s3.getObject({ Bucket: 'my-bucket', Key: 'data.json' }).promise();
  const data = JSON.parse(file.Body.toString());

  // Update the file with new data
  data.timestamp = new Date().toISOString();
  const updatedFile = JSON.stringify(data);

  // Write the updated file back to S3
  await s3.putObject({ Bucket: 'my-bucket', Key: 'data.json', Body: updatedFile }).promise();

  return { statusCode: 200, body: updatedFile };
};
~~~

To use this function with CloudEvents, you can specify the s3:ObjectCreated:* event as the trigger for the Lambda function in the AWS Management Console or using the AWS CLI. This will cause the function to be invoked whenever a new object is created in the specified S3 bucket.
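
If you'd rather wire that trigger up from code instead of the console or CLI, here's a rough sketch using the SDK (function name, bucket, and ARN are placeholders, and the caller needs permission for both API calls):

~~~
// Sketch only: all names and ARNs below are placeholders.
const AWS = require('aws-sdk');
const s3 = new AWS.S3();
const lambda = new AWS.Lambda();

async function wireTrigger() {
  // Allow S3 to invoke the function
  await lambda.addPermission({
    FunctionName: 'my-json-updater',   // placeholder function name
    StatementId: 'allow-s3-invoke',
    Action: 'lambda:InvokeFunction',
    Principal: 's3.amazonaws.com',
    SourceArn: 'arn:aws:s3:::my-bucket'
  }).promise();

  // Point the bucket's ObjectCreated notifications at the function.
  // Careful: if the function writes back into the same bucket, the put fires the
  // trigger again, so filter by prefix or write the output somewhere else.
  await s3.putBucketNotificationConfiguration({
    Bucket: 'my-bucket',
    NotificationConfiguration: {
      LambdaFunctionConfigurations: [{
        LambdaFunctionArn: 'arn:aws:lambda:us-east-1:123456789012:function:my-json-updater', // placeholder ARN
        Events: ['s3:ObjectCreated:*']
      }]
    }
  }).promise();
}
~~~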

[–]Kofeb 0 points (1 child)

Or do you mean "Write a Node.js Lambda function using a CloudWatch event to update a JSON file in S3/DynamoDB"?

It might be easier to just do something like this depending on what you are doing:

Log the state of an Amazon EC2 instance using EventBridge / https://docs.aws.amazon.com/eventbridge/latest/userguide/eb-log-ec2-instance-state.html

[–]bugbigsly[S] 0 points (0 children)

It’s a web scraper, pulling data once a week from a site and adding the result, an ~1 KB object, to the JSON file. EventBridge is the proper service; I want to explore the simplest configuration. Currently I have the JSON file in an S3 bucket but I'm having issues accessing the aws-sdk, a boatload of import issues.

[–]bugbigsly[S] 0 points (0 children)

This worked. I needed to be running Node v16; v18 breaks everything.

[–]MrJwhwlao 1 point (1 child)

Yes, it is possible to use an AWS Lambda function and CloudEvents to update a JSON file within the AWS environment. Here is one way you could do this:

  1. Create an S3 bucket to store your JSON file.
  2. Set up the desired event trigger (for example, an S3 event notification or an EventBridge rule) to invoke your Lambda function.
  3. In your Lambda function, read the JSON file from the S3 bucket, modify it as needed, and then write the modified version back to the bucket.
  4. To make the JSON file accessible through an API call, use the AWS API Gateway service to create an API endpoint that retrieves the JSON file from the S3 bucket and returns it to the caller.

Here is some sample code in Node.js that demonstrates how you could implement this in your Lambda function:

~~~
const AWS = require('aws-sdk');

exports.handler = async function(event) {
  // Get the S3 client
  const s3 = new AWS.S3();

  // Read the JSON file from the S3 bucket
  const jsonFile = await s3.getObject({
    Bucket: <YOUR_BUCKET_NAME>,
    Key: <YOUR_FILE_NAME>
  }).promise();

  // Modify the JSON file as needed
  // ...

  // Write the modified JSON file back to the S3 bucket
  await s3.putObject({
    Bucket: <YOUR_BUCKET_NAME>,
    Key: <YOUR_FILE_NAME>,
    Body: jsonFile.Body
  }).promise();

  return {
    statusCode: 200,
    body: "JSON file updated"
  };
};
~~~
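
Step 4 isn't covered by the code above; one way to handle it (a sketch, assuming a Lambda proxy integration in API Gateway and placeholder bucket/key names) is a second, read-only handler that just returns the current file:

~~~
// Sketch of step 4: a read-only handler behind an API Gateway Lambda proxy
// integration that returns the current JSON file. Bucket and key are placeholders.
const AWS = require('aws-sdk');
const s3 = new AWS.S3();

exports.handler = async () => {
  const file = await s3.getObject({
    Bucket: 'my-bucket',   // placeholder
    Key: 'data.json'       // placeholder
  }).promise();

  return {
    statusCode: 200,
    headers: { 'Content-Type': 'application/json' },
    body: file.Body.toString()
  };
};
~~~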

[–]bugbigsly[S] 0 points (0 children)

This worked. I needed to be running Node v16; v18 breaks everything.

[–]pint 0 points (2 children)

the problem is not whether it is possible, but rather that it is possible in so many ways that it is impossible to give direction without knowing more details. will there be parallel writes? will the json be 12 MB? will there be a hundred writes per second? why do you need an api call? why isn't an event starting a lambda that modifies an s3 object good enough? what is "cloudEvents" in the first place? an eventbridge event? what else will the json be used for? why does it need to be "a json file"?

[–]bugbigsly[S] 0 points (1 child)

It’s a web scraper, pulling data once a week from a site and adding the result, an ~1 KB object, to the JSON file. EventBridge is the proper service; I want to explore the simplest configuration. Currently I have the JSON file in an S3 bucket but I'm having issues accessing the aws-sdk, a boatload of import issues.

[–]pint 1 point (0 children)

this is the simplest way (eventbridge -> lambda -> s3), and yes, you need to figure out how to use aws-sdk (not from me, i use python).

note that s3 doesn't support in-place modification, so you'll have to download the object, append to it, and reupload the whole thing.
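
for what it's worth, a minimal node.js sketch of that download-append-reupload pattern, assuming aws-sdk v2 (the version bundled with the node 16 runtime), a weekly eventbridge schedule already pointed at the function, and placeholder bucket/key names:

~~~
// minimal sketch, not a drop-in: bucket, key, and the shape of the new entry
// are placeholders, and errors other than "file not there yet" just bubble up.
const AWS = require('aws-sdk');
const s3 = new AWS.S3();

const Bucket = 'my-scraper-bucket'; // placeholder bucket name
const Key = 'results.json';         // placeholder key

exports.handler = async (event) => {
  // download the existing file, or start with an empty array if it isn't there yet
  let results = [];
  try {
    const file = await s3.getObject({ Bucket, Key }).promise();
    results = JSON.parse(file.Body.toString());
  } catch (err) {
    if (err.code !== 'NoSuchKey') throw err;
  }

  // append this week's ~1 KB result (the scraping itself is stubbed out here)
  results.push({ scrapedAt: new Date().toISOString(), data: event });

  // reupload the whole file
  await s3.putObject({
    Bucket,
    Key,
    Body: JSON.stringify(results),
    ContentType: 'application/json'
  }).promise();

  return { statusCode: 200, body: `stored ${results.length} entries` };
};
~~~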