When using JDBC to insert data into a MySQL database, especially when continuously inserting large batches of rows (say 100,000 at a time), how can we improve efficiency?
Two methods of the Statement interface in the JDBC API are particularly noteworthy: addBatch() and executeBatch(), which together provide batch processing of data.
However, note that you first need to disable auto-commit on the connection with connection.setAutoCommit(false), and then call connection.commit() after the batch has been executed.
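Before the full test program, here is a minimal sketch of that pattern. It assumes the same db_test table and credentials used in the test below; the periodic executeBatch() call every 1,000 rows is an optional refinement to keep the in-memory batch bounded and is not part of the original test program.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class BatchInsertSketch {
    public static void main(String[] args) throws Exception {
        String url = "jdbc:mysql://localhost:3306/test";
        try (Connection conn = DriverManager.getConnection(url, "root", "123456")) {
            conn.setAutoCommit(false); // switch to manual commit
            String sql = "INSERT INTO db_test (param1,param2,param3,param4,param5) VALUES (?,?,?,?,?)";
            try (PreparedStatement psts = conn.prepareStatement(sql)) {
                for (int i = 0; i < 100000; i++) {
                    psts.setString(1, i + "param1");
                    psts.setString(2, i + "param2");
                    psts.setString(3, i + "param3");
                    psts.setString(4, i + "param4");
                    psts.setString(5, i + "param5");
                    psts.addBatch(); // queue this row
                    if ((i + 1) % 1000 == 0) {
                        psts.executeBatch(); // flush every 1,000 rows so the batch stays small
                    }
                }
                psts.executeBatch(); // flush any remaining rows
            }
            conn.commit(); // commit once at the end
        }
    }
}

The full test program used to measure the timings follows.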
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.util.Date;

public class MysqlBatchUtil {
    private String sql = "INSERT INTO db_test (param1,param2,param3,param4,param5) VALUES (?,?,?,?,?)";
    private String connectStr = "jdbc:mysql://localhost:3306/test";
    private String username = "root";
    private String password = "123456";

    private void doStore() throws ClassNotFoundException, SQLException {
        Class.forName("com.mysql.jdbc.Driver");
        // Uncomment the next line for the improved test described below;
        // leave it commented out for the plain batch-insert test.
        // connectStr += "?useServerPrepStmts=false&rewriteBatchedStatements=true";
        Connection conn = DriverManager.getConnection(connectStr, username, password);
        conn.setAutoCommit(false); // switch to manual commit
        int count = 0;
        PreparedStatement psts = conn.prepareStatement(sql);
        Date begin = new Date();
        for (int i = 0; i <= 100000; i++) {
            psts.setString(1, i + "param1");
            psts.setString(2, i + "param2");
            psts.setString(3, i + "param3");
            psts.setString(4, i + "param4");
            psts.setString(5, i + "param5");
            psts.addBatch(); // queue the row in the batch
            count++;
        }
        psts.executeBatch(); // execute the whole batch
        conn.commit();       // commit manually
        Date end = new Date();
        System.out.println("Quantity=" + count);
        System.out.println("runtime=" + (end.getTime() - begin.getTime()));
        conn.close();
    }

    public static void main(String[] args) {
        try {
            new MysqlBatchUtil().doStore();
        } catch (ClassNotFoundException e) {
            e.printStackTrace();
        } catch (SQLException e) {
            e.printStackTrace();
        }
    }
}

Test results:
Quantity = 100001
Running time = 4725
That is a little over 100,000 rows, executed in 4,725 ms, i.e. roughly 4.7 seconds.
This is still not particularly efficient and falls short of the desired result, so further improvement is needed.
Additional parameters can also be appended to the MySQL JDBC connection string:
rewriteBatchedStatements=true
By default the MySQL Connector/J driver does not rewrite batched statements; this parameter turns the rewriting on, so the driver combines the batched INSERTs into multi-row statements before submitting them to the database (as illustrated after these parameter notes).
useServerPrepStmts=false
With server-side prepared statements disabled (useServerPrepStmts=false), the driver uses client-side prepared statements (com.mysql.jdbc.PreparedStatement) to assemble the SQL locally: the parameter placeholders are replaced with the actual values, and the final SQL text is sent to the database.
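To make the combined effect of these two parameters concrete, here is an illustration (not the driver's literal output) of how the batch from the test program is rewritten: instead of sending 100,001 single-row statements, the driver sends a much smaller number of multi-row INSERTs of the form

INSERT INTO db_test (param1,param2,param3,param4,param5) VALUES
  ('0param1','0param2','0param3','0param4','0param5'),
  ('1param1','1param2','1param3','1param4','1param5'),
  ('2param1','2param2','2param3','2param4','2param5'),
  ...

and, because useServerPrepStmts=false, the parameter values have already been substituted into the SQL text on the client side before it is sent.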
As a small improvement, append the following parameters to the connection string (that is, uncomment the corresponding line in doStore() above):
connectStr += "?useServerPrepStmts=false&rewriteBatchedStatements=true";
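With these parameters appended, the full connection string used by the test program becomes:
jdbc:mysql://localhost:3306/test?useServerPrepStmts=false&rewriteBatchedStatements=true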
The test results are as follows:
Quantity = 100001
Running time = 1213
With the same amount of data, this run took only 1,213 ms, about 1.2 seconds, so the processing efficiency has improved considerably.
The above is a test of the efficiency of batch-inserting 100,000 rows into a MySQL database with JDBC. I hope it is helpful. If you have any questions, please leave a message and I will reply as soon as possible. Thank you for your support of the Wulin.com website!