java - Fastest 'update' on JDBC with PreparedStatement and executeBatch


I have a Java program that, in some circumstances, must update a large number of records in a database (for example, 100,000).

It currently does this with the following snippet, creating a PreparedStatement and using the addBatch technique:

  connection.setAutoCommit(false);
  PreparedStatement ps = connection.prepareStatement(
      "UPDATE myTable SET colName = ? WHERE id = ?");
  for (...) { // this loop can run 100,000 times
      String colValue = ...;
      String id = ...;
      ps.setString(1, colValue);
      ps.setString(2, id);
      ps.addBatch();
  }
  ps.executeBatch();
  connection.commit();
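For reference, a runnable sketch of the same approach, with one common refinement: flushing the batch every N rows instead of buffering all 100,000 statements at once, which keeps driver memory bounded. The table and column names come from the question; the class name, the `BATCH_SIZE` value, and the `Map`-based input are illustrative assumptions.

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.util.Map;

class BatchUpdater {
    // Flush every BATCH_SIZE rows so the driver does not buffer all 100,000 at once.
    // 1,000 is an assumed starting point; tune it against your driver and database.
    static final int BATCH_SIZE = 1000;

    static void updateInBatches(Connection conn, Map<String, String> updates)
            throws SQLException {
        conn.setAutoCommit(false);
        try (PreparedStatement ps = conn.prepareStatement(
                "UPDATE myTable SET colName = ? WHERE id = ?")) {
            int pending = 0;
            for (Map.Entry<String, String> e : updates.entrySet()) {
                ps.setString(1, e.getValue()); // colValue
                ps.setString(2, e.getKey());   // id
                ps.addBatch();
                if (++pending == BATCH_SIZE) { // flush a full chunk
                    ps.executeBatch();
                    pending = 0;
                }
            }
            if (pending > 0) {
                ps.executeBatch();             // flush the remainder
            }
        }
        conn.commit();
    }

    // Number of executeBatch round trips for n rows, useful when reasoning
    // about how BATCH_SIZE affects the total number of driver calls.
    static int flushCount(int n) {
        return n / BATCH_SIZE + (n % BATCH_SIZE == 0 ? 0 : 1);
    }
}
```

With `BATCH_SIZE = 1000`, updating 100,000 rows takes 100 `executeBatch` calls instead of one enormous buffered batch. Some drivers also batch more efficiently when given a hint (for example, MySQL Connector/J's `rewriteBatchedStatements=true` connection property), so it is worth checking your driver's documentation.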

Is this the best (fastest) way to update 100,000 records with JDBC?

Can someone recommend a better way?

Try this as a benchmark:

  1. Use your database's built-in bulk-extract tool to dump the entire table to a flat file: all rows, all columns.

  2. Drop the table (or rename it).

  3. Use (or write) a simple flat-file program that produces a new file with the updates applied.

  4. Use the bulk-load utility that comes with your database to rebuild the entire table from the new file.

  5. After the reload, add the indexes back.
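Step 3 above can be sketched as a simple sequential rewrite. This is a minimal sketch, assuming a two-column comma-separated extract of the form `id,colName` per line; the class name, file layout, and `Map`-based updates are illustrative assumptions, not part of the original answer.

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.Reader;
import java.io.Writer;
import java.util.Map;

class FlatFileUpdater {
    // Rewrites an "id,colName" extract, replacing colName wherever the id
    // appears in the updates map. A sequential pass like this touches each
    // row exactly once, which is why the extract/reload approach can beat
    // row-by-row SQL updates.
    static void applyUpdates(Reader in, Writer out, Map<String, String> updates)
            throws IOException {
        BufferedReader reader = new BufferedReader(in);
        String line;
        while ((line = reader.readLine()) != null) {
            int comma = line.indexOf(',');
            String id = line.substring(0, comma);
            String value = updates.getOrDefault(id, line.substring(comma + 1));
            out.write(id + "," + value + "\n");
        }
        out.flush();
    }
}
```

Real extracts usually have more columns and may need quoting rules, but the shape stays the same: stream the old file in, stream the new file out, then hand the result to the bulk loader.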

You may find this is faster than any SQL solution. We stopped using UPDATE statements for our data warehouse because the extract -> process flat file -> bulk load pipeline was so much faster than SQL updates.

