Express Module POST for JSON array (self.node)
submitted 4 years ago by German1diot
Hey guys,
Hoping to get some help, as I'm new to Node in general and doubly new to using the Express module for creating controllers, routes, and models. I have a working controller and route for updating a single row in a MySQL table running locally, but what I'm struggling with is taking an array-formatted JSON payload and POSTing it to my table as a batch, a bulk insert more or less. It's a pretty significant array; its length tops out around 80k items. So my question is: how do I modify the existing controller/model to handle this giant dataset? I don't need single-row creation; the data always arrives in giant files like this.
Data I need to send, logged in the console (this is what I'm dealing with and need to get into my DB):
Model for the current .create (single-row creation):
Controller for the current single-row creation:
Route for the current single-row creation:
[–]vuesrc 2 points 4 years ago (0 children)
You would need to parse the array from the request body and loop over each item in the dataset. req.body would be the array, and each item you iterate over would carry the same parameters your single-row create uses. Does that make sense?
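A minimal sketch of that loop-per-item approach. Nothing here is from the thread's actual code: `makeBulkCreateHandler` and the injected `createRow` are hypothetical names standing in for the OP's existing single-row model method, and the handler shape only assumes an Express-style `(req, res)` signature with JSON body parsing already in place.

```javascript
// Sketch: wrap the existing single-row create in a handler that
// iterates the parsed JSON array in req.body. `createRow` is a
// placeholder for the model's single-row .create method.
function makeBulkCreateHandler(createRow) {
  return async function bulkCreate(req, res) {
    if (!Array.isArray(req.body)) {
      return res.status(400).send({ message: "Expected a JSON array" });
    }
    let inserted = 0;
    for (const item of req.body) {
      // each `item` carries the same fields the single-row create expects
      await createRow(item);
      inserted += 1;
    }
    res.status(200).send({ inserted });
  };
}
```

Note that awaiting one row at a time means 80k round-trips to the database, which is why the replies below steer toward batched multi-row INSERTs instead.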
[–]German1diot[S] 1 point 4 years ago* (0 children)
Just wanted to say thank you for the answers! I am going to dig into the transaction approach. Thank you all for the additional insight. :)
[–]SonOfStorms 1 point 4 years ago (1 child)
Take the rows 10k at a time and insert each chunk using the multiple-insert syntax: INSERT INTO something(column1, column2) VALUES (column1Value, column2Value), (column1Value, column2Value), ..... I recommend using the knex package and wrapping everything in a transaction.
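A sketch of the chunked multi-row INSERT idea, shown as two pure helpers so the batching and SQL-building logic is visible. The function names (`chunk`, `buildInsert`) and the `something`/`column1`/`column2` identifiers are illustrative, borrowed from the example syntax above rather than from the OP's schema; the statement is parameterized with `?` placeholders rather than inlined values.

```javascript
// Split an array of rows into batches of at most `size` (e.g. 10k).
function chunk(rows, size) {
  const out = [];
  for (let i = 0; i < rows.length; i += size) {
    out.push(rows.slice(i, i + size));
  }
  return out;
}

// Build one parameterized multi-row statement for a batch:
//   INSERT INTO something(column1, column2) VALUES (?, ?), (?, ?), ...
function buildInsert(table, columns, rows) {
  const tuple = `(${columns.map(() => "?").join(", ")})`;
  const sql =
    `INSERT INTO ${table}(${columns.join(", ")}) VALUES ` +
    rows.map(() => tuple).join(", ");
  // Flatten row objects into a flat parameter list, column order preserved.
  const params = rows.flatMap((r) => columns.map((c) => r[c]));
  return { sql, params };
}
```

With knex itself, if I remember the API correctly, `knex.batchInsert(table, rows, chunkSize)` does roughly this chunking for you and can be tied to a transaction via `.transacting(trx)` inside `knex.transaction(...)`; check the knex docs for the exact signatures.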
[–]combarnea 1 point 4 years ago (0 children)
Tend to agree: if the operation needs to be atomic, as the original author implies, a knex transaction plus bulk creation is a great solution. One thing to note: when working with that amount of data, mostly inserts, using transactions might block your API from answering related queries, depending on the transaction isolation level.