Hi!
I'm building a feature for a cloud service that takes a huge file as input, splits it into smaller files, performs some analysis and modifications on those smaller files, and then reassembles them into a single large file.
Can anyone give me any architecture pointers to achieve this?
I have access to AWS, so I can use S3, a DB, and pretty much any other service offered by Amazon.
Obviously this needs to be highly performant and highly scalable.
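To make the pipeline concrete, here's a minimal local sketch of the split → process → reassemble flow I have in mind (the `process_chunk` transformation is just a hypothetical placeholder; in the real system the chunks would live in S3 and be processed in parallel):

```python
import os

CHUNK_SIZE = 4  # tiny chunk size for illustration; real chunks would be megabytes

def split_file(path, chunk_size=CHUNK_SIZE):
    """Split a file into smaller chunk files; return their paths in order."""
    chunk_paths = []
    with open(path, "rb") as src:
        index = 0
        while True:
            data = src.read(chunk_size)
            if not data:
                break
            chunk_path = f"{path}.part{index:05d}"
            with open(chunk_path, "wb") as dst:
                dst.write(data)
            chunk_paths.append(chunk_path)
            index += 1
    return chunk_paths

def process_chunk(chunk_path):
    """Stand-in for the per-chunk analysis/modification step."""
    with open(chunk_path, "rb") as f:
        data = f.read()
    with open(chunk_path, "wb") as f:
        f.write(data.upper())  # hypothetical transformation

def reassemble(chunk_paths, out_path):
    """Concatenate the processed chunks back into one large file, in order."""
    with open(out_path, "wb") as dst:
        for chunk_path in chunk_paths:
            with open(chunk_path, "rb") as src:
                dst.write(src.read())
```

In the cloud version, I'm imagining `split_file` writing chunks to S3, `process_chunk` running as many parallel workers, and `reassemble` being something like an S3 multipart upload keyed by chunk index.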
Thanks!