java - Fastest 'update' in JDBC with PreparedStatement and executeBatch
I have a Java program that, in some circumstances, must update a large number of records in a database (for example, 100,000). It does this by creating a PreparedStatement and using the addBatch technique, as in this snippet:

connection.setAutoCommit(false);
PreparedStatement ps = connection.prepareStatement("UPDATE myTable SET colName = ? WHERE id = ?");
for (...) { // this loop can run 100,000 times
    colValue = ...
    id = ...
    ps.setString(1, colValue);
    ps.setString(2, id);
    ps.addBatch();
}
ps.executeBatch();
connection.commit();
Is this the best (fastest) way to update 100,000 records in JDBC?
Can someone recommend a better way?
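For reference, one common refinement of the snippet above is to flush the batch in fixed-size chunks instead of queueing all 100,000 updates before a single executeBatch() call; this bounds client-side memory and lets the driver start sending work earlier. A minimal sketch, assuming the same myTable(id, colName) layout; the Map of updates, the chunk size, and the method name are illustrative, not from the original code:

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.util.Map;

public class ChunkedBatchUpdate {

    // Applies the id -> colValue updates in batches of batchSize (e.g. 1000).
    static void updateInChunks(Connection connection,
                               Map<String, String> updates,
                               int batchSize) throws SQLException {
        connection.setAutoCommit(false);
        String sql = "UPDATE myTable SET colName = ? WHERE id = ?";
        try (PreparedStatement ps = connection.prepareStatement(sql)) {
            int count = 0;
            for (Map.Entry<String, String> e : updates.entrySet()) {
                ps.setString(1, e.getValue()); // new column value
                ps.setString(2, e.getKey());   // row id
                ps.addBatch();
                if (++count % batchSize == 0) {
                    ps.executeBatch();         // flush a full chunk
                }
            }
            ps.executeBatch();                 // flush the remainder
            connection.commit();
        } catch (SQLException ex) {
            connection.rollback();
            throw ex;
        }
    }
}

Whether chunking actually beats one giant batch depends on the driver and the database; it is worth measuring both against the bulk-load approach described in the answer below.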
Try this as a benchmark:
- Use your database's built-in bulk-extract tool to dump the entire table to a flat file. All rows, all columns.
- Drop the table (or rename it).
- Read the flat file and write a new one, applying the updates as you go (see the sketch below).
- Use the bulk-load utility that comes with your database to rebuild the entire table from the new file.
- After the reload, re-create the indexes.
You may find this is faster than any SQL solution. We stopped using UPDATE statements for data warehouse loads because extract -> process flat file -> bulk load was so much faster than SQL.
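To make step 3 of that list concrete, here is a minimal sketch that rewrites a delimited extract, substituting the second field for any id present in an in-memory map. The file names, the comma delimiter, and the two-column id,colName layout are assumptions for the example; the real extract format depends on your database's unload tool:

import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.Map;

public class FlatFileUpdate {

    // Copies extract.csv to reload.csv, replacing colName for every id found in 'updates'.
    static void applyUpdates(Map<String, String> updates) throws IOException {
        try (BufferedReader in = Files.newBufferedReader(Paths.get("extract.csv"));
             BufferedWriter out = Files.newBufferedWriter(Paths.get("reload.csv"))) {
            String line;
            while ((line = in.readLine()) != null) {
                String[] fields = line.split(",", -1);   // assumed layout: id,colName
                String newValue = updates.get(fields[0]);
                if (newValue != null) {
                    fields[1] = newValue;                // apply the update
                }
                out.write(String.join(",", fields));
                out.newLine();
            }
        }
    }
}

The rewritten file is then what you feed to the database's bulk loader (for example COPY in PostgreSQL or LOAD DATA INFILE in MySQL) before re-creating the indexes.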