Preface
The previous articles focused on optimizing the back-end service database and on multi-threaded parallel processing, with pseudocode examples of the logic before and after the changes. Of course, optimization never ends. As the saying goes, the previous generation plants the trees so the next can enjoy the shade; since we developers stand on the shoulders of giants, we should write better-optimized programs.
SpringBoot Development Case: JdbcTemplate Batch Operation
SpringBoot Development Case: CountDownLatch Multitasking Parallel Processing
Refactoring
In theory, the more threads, the faster the program may run. In practice, however, we have to account for the cost of creating and destroying the threads themselves, and we also need to protect the operating system's own resources. We therefore usually limit the number of threads to a certain range, and that is exactly the role a thread pool plays. A small sketch of this idea follows.
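A minimal sketch of that idea (not from the original article; the pool size of 3 and the 10 placeholder tasks are arbitrary choices): 3 worker threads handle 10 tasks, so at most 3 threads ever exist and each one is reused instead of being created and destroyed per task.

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class BoundedThreadsDemo {
    public static void main(String[] args) throws InterruptedException {
        // Only 3 worker threads are ever created; the remaining tasks wait in the pool's queue
        ExecutorService pool = Executors.newFixedThreadPool(3);
        for (int i = 1; i <= 10; i++) {
            final int taskNo = i;
            pool.execute(() -> System.out.println(
                    Thread.currentThread().getName() + " runs task " + taskNo));
        }
        pool.shutdown();                              // stop accepting new tasks
        pool.awaitTermination(10, TimeUnit.SECONDS);  // wait for the queued tasks to finish
    }
}

Running it shows the same three thread names (pool-1-thread-1 to pool-1-thread-3) being reused across all ten tasks.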
Program logic
[Figure: Multitasking parallel + thread pool processing]
Where one picture can explain the problem, words should be kept to a minimum. Of course, the underlying principles still need to be remembered and understood.
Java thread pool
Java provides four common types of thread pools through the Executors factory class, namely:
newCachedThreadPool: a cached thread pool that reuses idle threads and creates new ones as needed;
newFixedThreadPool: a fixed-size thread pool that reuses a fixed number of threads;
newScheduledThreadPool: a pool that supports delayed and periodic task execution;
newSingleThreadExecutor: a single-worker-thread pool that executes tasks one by one in submission order.
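As a minimal sketch (using only the standard java.util.concurrent.Executors factory methods; the printed tasks are placeholders), the four pools can be created and used like this:

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public class ExecutorsDemo {
    public static void main(String[] args) {
        ExecutorService cached = Executors.newCachedThreadPool();                 // creates threads on demand, reuses idle ones
        ExecutorService fixed = Executors.newFixedThreadPool(4);                  // at most 4 worker threads, extra tasks queue up
        ScheduledExecutorService scheduled = Executors.newScheduledThreadPool(2); // supports delayed / periodic execution
        ExecutorService single = Executors.newSingleThreadExecutor();             // tasks run strictly one after another

        cached.execute(() -> System.out.println("cached task"));
        fixed.execute(() -> System.out.println("fixed task"));
        scheduled.schedule(() -> System.out.println("scheduled task"), 1, TimeUnit.SECONDS);
        single.execute(() -> System.out.println("single task"));

        cached.shutdown();
        fixed.shutdown();
        scheduled.shutdown();
        single.shutdown();
    }
}

Note that the pools created this way use unbounded queues (or an unbounded thread count), which is exactly why the code below builds a ThreadPoolExecutor by hand with a bounded queue.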
Advantages
Compared with creating a new thread for every task, a thread pool reuses already-created threads, which reduces the overhead of thread creation and destruction, improves response speed (a task can start without waiting for a thread to be created), and makes the number of concurrent threads easier to manage and monitor.
Code implementation
Method 1 (CountDownLatch)
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

/**
 * Multitasking parallel + thread pool statistics
 * Created on April 17, 2018
 */
public class StatsDemo {

    final static SimpleDateFormat sdf = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");
    final static String startTime = sdf.format(new Date());

    /**
     * IO-intensive tasks  = roughly 2 * number of CPU cores (typical examples: database access, file upload/download, network transfers)
     * CPU-intensive tasks = roughly number of CPU cores + 1 (typical examples: complex algorithms)
     * Mixed tasks         = tune according to machine configuration and task complexity
     */
    private static int corePoolSize = Runtime.getRuntime().availableProcessors();

    /**
     * public ThreadPoolExecutor(int corePoolSize, int maximumPoolSize, long keepAliveTime,
     *                           TimeUnit unit, BlockingQueue<Runnable> workQueue)
     * corePoolSize    specifies the number of core threads
     * maximumPoolSize specifies the maximum number of threads
     * keepAliveTime and TimeUnit specify how long an idle thread beyond the core size may stay alive
     * workQueue is the pool's buffer queue; tasks that have not yet been executed wait here
     *
     * Monitor the queue length and make sure the queue is bounded.
     * An improperly sized pool slows processing down, hurts stability, and can exhaust memory:
     * with too few threads the queue keeps growing and consumes too much memory,
     * while too many threads slow the whole system down through frequent context switching, with the same end result.
     * The queue length is crucial: it must be bounded so that an overwhelmed pool can temporarily reject new requests.
     * The pools created by the Executors factory methods default to an unbounded LinkedBlockingQueue.
     */
    private static ThreadPoolExecutor executor = new ThreadPoolExecutor(corePoolSize, corePoolSize + 1,
            10L, TimeUnit.SECONDS, new LinkedBlockingQueue<Runnable>(1000));

    public static void main(String[] args) throws InterruptedException {
        CountDownLatch latch = new CountDownLatch(5);
        // Submit the tasks with the execute method
        executor.execute(new Stats("Task A", 1000, latch));
        executor.execute(new Stats("Task B", 1000, latch));
        executor.execute(new Stats("Task C", 1000, latch));
        executor.execute(new Stats("Task D", 1000, latch));
        executor.execute(new Stats("Task E", 1000, latch));
        latch.await(); // wait for every task to finish
        System.out.println("All statistical tasks completed at: " + sdf.format(new Date()));
        executor.shutdown(); // let the pool threads exit so the JVM can terminate
    }

    static class Stats implements Runnable {
        String statsName;
        int runTime;
        CountDownLatch latch;

        public Stats(String statsName, int runTime, CountDownLatch latch) {
            this.statsName = statsName;
            this.runTime = runTime;
            this.latch = latch;
        }

        public void run() {
            try {
                System.out.println(statsName + " do stats begin at " + startTime);
                // Simulate the task execution time
                Thread.sleep(runTime);
                System.out.println(statsName + " do stats complete at " + sdf.format(new Date()));
                latch.countDown(); // one task finished, decrement the counter
            } catch (InterruptedException e) {
                e.printStackTrace();
            }
        }
    }
}
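One cautionary note on Method 1 (an addition, not part of the original code): if a task throws an exception before reaching countDown(), latch.await() blocks forever. CountDownLatch also provides an await overload with a timeout, which a production version might prefer:

// Wait at most 30 seconds; returns false if the counter has not reached zero by then
if (!latch.await(30, TimeUnit.SECONDS)) {
    System.out.println("Some statistical tasks did not finish in time");
}

Moving latch.countDown() into a finally block inside run() guards against the same problem.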
Method 2 (Future)

import java.text.SimpleDateFormat;
import java.util.ArrayList;
import java.util.Date;
import java.util.List;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.Future;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

/**
 * Multitasking parallel + thread pool statistics
 * Created on April 17, 2018
 */
public class StatsDemo {

    final static SimpleDateFormat sdf = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");
    final static String startTime = sdf.format(new Date());

    /**
     * IO-intensive tasks  = roughly 2 * number of CPU cores (typical examples: database access, file upload/download, network transfers)
     * CPU-intensive tasks = roughly number of CPU cores + 1 (typical examples: complex algorithms)
     * Mixed tasks         = tune according to machine configuration and task complexity
     */
    private static int corePoolSize = Runtime.getRuntime().availableProcessors();

    /**
     * See Method 1 for the notes on the ThreadPoolExecutor constructor and on keeping the work queue bounded.
     */
    private static ThreadPoolExecutor executor = new ThreadPoolExecutor(corePoolSize, corePoolSize + 1,
            10L, TimeUnit.SECONDS, new LinkedBlockingQueue<Runnable>(1000));

    public static void main(String[] args) throws InterruptedException {
        List<Future<String>> resultList = new ArrayList<Future<String>>();
        // submit runs the task asynchronously and returns a Future
        resultList.add(executor.submit(new Stats("Task A", 1000)));
        resultList.add(executor.submit(new Stats("Task B", 1000)));
        resultList.add(executor.submit(new Stats("Task C", 1000)));
        resultList.add(executor.submit(new Stats("Task D", 1000)));
        resultList.add(executor.submit(new Stats("Task E", 1000)));
        // Traverse the task results
        for (Future<String> fs : resultList) {
            try {
                // future.get() blocks the main thread until the asynchronous task returns its result
                System.out.println(fs.get());
            } catch (InterruptedException e) {
                e.printStackTrace();
            } catch (ExecutionException e) {
                e.printStackTrace();
            }
        }
        // Initiates an orderly shutdown: previously submitted tasks are executed, but no new tasks are accepted.
        // Calling it again after the pool is already shut down has no additional effect.
        executor.shutdown();
        System.out.println("All statistical tasks completed at: " + sdf.format(new Date()));
    }

    static class Stats implements Callable<String> {
        String statsName;
        int runTime;

        public Stats(String statsName, int runTime) {
            this.statsName = statsName;
            this.runTime = runTime;
        }

        public String call() {
            try {
                System.out.println(statsName + " do stats begin at " + startTime);
                // Simulate the task execution time
                Thread.sleep(runTime);
            } catch (InterruptedException e) {
                e.printStackTrace();
            }
            // Return the task result instead of recursing into call() again
            return statsName + " do stats complete at " + sdf.format(new Date());
        }
    }
}
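As a side note (an alternative the original article does not use), ExecutorService.invokeAll submits a whole collection of Callable tasks in one call and returns only after all of them have completed, which removes the manual submit loop. A self-contained sketch with placeholder tasks:

import java.util.Arrays;
import java.util.List;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class InvokeAllDemo {
    public static void main(String[] args) throws InterruptedException, ExecutionException {
        ExecutorService executor = Executors.newFixedThreadPool(3);
        // Each Callable simulates one statistics task and returns its name when done
        List<Callable<String>> tasks = Arrays.asList(
                () -> { Thread.sleep(1000); return "Task A done"; },
                () -> { Thread.sleep(1000); return "Task B done"; },
                () -> { Thread.sleep(1000); return "Task C done"; });
        // invokeAll blocks until every task has completed, then returns their Futures
        List<Future<String>> futures = executor.invokeAll(tasks);
        for (Future<String> f : futures) {
            System.out.println(f.get()); // results are already available at this point
        }
        executor.shutdown();
    }
}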
Execution time
The code above is demonstration pseudo-code; the following are real test logs from processing 2,000+ student records (times are in milliseconds).
2018-04-17 17:42:29.284 INFO Test record 81e51ab031eb4ada92743ddf66528d82 - Single-threaded sequential execution, time spent: 3797
2018-04-17 17:42:31.452 INFO Test record 81e51ab031eb4ada92743ddf66528d82 - Multi-threaded parallel tasks, time spent: 2167
2018-04-17 17:42:33.170 INFO Test record 81e51ab031eb4ada92743ddf66528d82 - Multi-threaded parallel tasks + thread pool, time spent: 1717
That is all the content of this article. I hope it is helpful to your learning, and I hope you will continue to support Wulin.com.