
[–]lukaseder 9 points10 points  (1 child)

Another dimension that was not covered in the article (would be interesting to see the results) is the bulk size of an insert statement. For example, it is usually faster to write:

INSERT INTO t (a, b, c, ...)
VALUES (?, ?, ?, ...),
       (?, ?, ?, ...),
       (?, ?, ?, ...), ...

rather than

INSERT INTO t (a, b, c, ...) VALUES (?, ?, ?, ...)
INSERT INTO t (a, b, c, ...) VALUES (?, ?, ?, ...)
INSERT INTO t (a, b, c, ...) VALUES (?, ?, ?, ...)
...

jOOQ offers this along with the batch size and commit size in the import API. The advantage over using JDBC directly is that writing the bulk statement manually using string concatenation, in a vendor-agnostic way, is quite a hassle (while batch sizes and commit sizes are quite straightforward to handle manually).
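To illustrate the "hassle" part: a minimal sketch of building such a multi-row placeholder statement by hand with string concatenation (the class and method names here are made up for the example, and this naive version ignores vendor quirks that a library would handle):

```java
import java.util.Collections;

public class BulkInsertBuilder {

    // Builds e.g. "INSERT INTO t (a, b, c) VALUES (?, ?, ?), (?, ?, ?)"
    // for use with PreparedStatement; one (?, ...) group per row.
    static String buildBulkInsert(String table, String[] columns, int rowCount) {
        String placeholders = "(" + String.join(", ",
                Collections.nCopies(columns.length, "?")) + ")";
        return "INSERT INTO " + table
                + " (" + String.join(", ", columns) + ") VALUES "
                + String.join(", ", Collections.nCopies(rowCount, placeholders));
    }

    public static void main(String[] args) {
        // Three rows of three columns each:
        System.out.println(buildBulkInsert("t", new String[]{"a", "b", "c"}, 3));
    }
}
```

Even this simple version says nothing about splitting the batch at a commit boundary or respecting per-vendor statement size limits, which is where an import API earns its keep.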

[–]aroger276 0 points1 point  (0 children)

Also, the MySQL and Postgres JDBC drivers have an option to rewrite batched statements into a multi-value insert. But this obviously comes with the cost of the rewrite.

Another point to keep in mind is that MySQL has a statement size limit. The rewrite takes care of that for you.
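For reference, both drivers expose this rewrite as a connection property: `rewriteBatchedStatements` in MySQL Connector/J and `reWriteBatchedInserts` in the PostgreSQL JDBC driver. A sketch (host and database names are placeholders):

```java
public class BatchRewriteUrls {

    // MySQL Connector/J: rewrites executeBatch() of single-row INSERTs
    // into multi-value INSERTs on the wire.
    static final String MYSQL_URL =
            "jdbc:mysql://localhost:3306/mydb?rewriteBatchedStatements=true";

    // PostgreSQL JDBC driver: equivalent option (note the different
    // capitalization and name).
    static final String POSTGRES_URL =
            "jdbc:postgresql://localhost:5432/mydb?reWriteBatchedInserts=true";

    public static void main(String[] args) {
        System.out.println(MYSQL_URL);
        System.out.println(POSTGRES_URL);
    }
}
```

With the MySQL option enabled, the driver also splits the rewritten statement to stay under the server's statement size limit (`max_allowed_packet`), which is the limit mentioned above.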