When processing large files, frequent reads and writes through an ordinary FileInputStream, FileOutputStream, or RandomAccessFile are slow, because each operation has to go out to external storage (disk). The following comparison experiment illustrates this.
package test;

import java.io.BufferedInputStream;
import java.io.FileInputStream;
import java.io.FileNotFoundException;
import java.io.IOException;
import java.io.RandomAccessFile;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;

public class Test {
    public static void main(String[] args) {
        // 1. Plain FileInputStream: one read() call per byte.
        try {
            FileInputStream fis = new FileInputStream("/home/tobacco/test/res.txt");
            int sum = 0;
            int n;
            long t1 = System.currentTimeMillis();
            try {
                while ((n = fis.read()) >= 0) {
                    sum += n;
                }
            } catch (IOException e) {
                e.printStackTrace();
            }
            long t = System.currentTimeMillis() - t1;
            System.out.println("sum:" + sum + " time:" + t);
        } catch (FileNotFoundException e) {
            e.printStackTrace();
        }

        // 2. BufferedInputStream: reads are served from an in-memory buffer.
        try {
            FileInputStream fis = new FileInputStream("/home/tobacco/test/res.txt");
            BufferedInputStream bis = new BufferedInputStream(fis);
            int sum = 0;
            int n;
            long t1 = System.currentTimeMillis();
            try {
                while ((n = bis.read()) >= 0) {
                    sum += n;
                }
            } catch (IOException e) {
                e.printStackTrace();
            }
            long t = System.currentTimeMillis() - t1;
            System.out.println("sum:" + sum + " time:" + t);
        } catch (FileNotFoundException e) {
            e.printStackTrace();
        }

        // 3. MappedByteBuffer: the whole file is mapped into memory.
        MappedByteBuffer buffer = null;
        try {
            buffer = new RandomAccessFile("/home/tobacco/test/res.txt", "rw")
                    .getChannel().map(FileChannel.MapMode.READ_WRITE, 0, 1253244);
            int sum = 0;
            int n;
            long t1 = System.currentTimeMillis();
            for (int i = 0; i < 1253244; i++) {
                n = 0x000000ff & buffer.get(i); // mask to treat the byte as unsigned
                sum += n;
            }
            long t = System.currentTimeMillis() - t1;
            System.out.println("sum:" + sum + " time:" + t);
        } catch (FileNotFoundException e) {
            e.printStackTrace();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}

The test file is 1253244 bytes in size. Test results:
sum:220152087 time:1464   (FileInputStream)
sum:220152087 time:72     (BufferedInputStream)
sum:220152087 time:25     (MappedByteBuffer)
All three approaches produce the same sum, which confirms the data is read correctly. Next, remove the data-processing part (the summation) so that only the raw read time is measured:
package test;

import java.io.BufferedInputStream;
import java.io.FileInputStream;
import java.io.FileNotFoundException;
import java.io.IOException;
import java.io.RandomAccessFile;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;

public class Test {
    public static void main(String[] args) {
        // 1. Plain FileInputStream, with the summation commented out.
        try {
            FileInputStream fis = new FileInputStream("/home/tobacco/test/res.txt");
            int sum = 0;
            int n;
            long t1 = System.currentTimeMillis();
            try {
                while ((n = fis.read()) >= 0) {
                    //sum += n;
                }
            } catch (IOException e) {
                e.printStackTrace();
            }
            long t = System.currentTimeMillis() - t1;
            System.out.println("sum:" + sum + " time:" + t);
        } catch (FileNotFoundException e) {
            e.printStackTrace();
        }

        // 2. BufferedInputStream, with the summation commented out.
        try {
            FileInputStream fis = new FileInputStream("/home/tobacco/test/res.txt");
            BufferedInputStream bis = new BufferedInputStream(fis);
            int sum = 0;
            int n;
            long t1 = System.currentTimeMillis();
            try {
                while ((n = bis.read()) >= 0) {
                    //sum += n;
                }
            } catch (IOException e) {
                e.printStackTrace();
            }
            long t = System.currentTimeMillis() - t1;
            System.out.println("sum:" + sum + " time:" + t);
        } catch (FileNotFoundException e) {
            e.printStackTrace();
        }

        // 3. MappedByteBuffer, with the per-byte access commented out.
        MappedByteBuffer buffer = null;
        try {
            buffer = new RandomAccessFile("/home/tobacco/test/res.txt", "rw")
                    .getChannel().map(FileChannel.MapMode.READ_WRITE, 0, 1253244);
            int sum = 0;
            int n;
            long t1 = System.currentTimeMillis();
            for (int i = 0; i < 1253244; i++) {
                //n = 0x000000ff & buffer.get(i);
                //sum += n;
            }
            long t = System.currentTimeMillis() - t1;
            System.out.println("sum:" + sum + " time:" + t);
        } catch (FileNotFoundException e) {
            e.printStackTrace();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}

Test results:
sum:0 time:1458   (FileInputStream)
sum:0 time:67     (BufferedInputStream)
sum:0 time:8      (MappedByteBuffer)
As the results show, mapping part or all of a file into memory makes reading and writing much faster.
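Worth noting: much of the gap in the first two cases comes from issuing one read() call per byte. A minimal sketch (not part of the original experiment; the sample data and helper name are illustrative) of reading into a byte[] chunk at a time, which avoids per-byte calls in much the same way BufferedInputStream does:

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;

public class ChunkedReadDemo {
    // Sum all bytes of a stream in 8 KB chunks instead of one read() call per byte.
    static long sumBytes(InputStream in) throws IOException {
        byte[] buf = new byte[8192];
        long sum = 0;
        int n;
        while ((n = in.read(buf)) >= 0) {
            for (int i = 0; i < n; i++) {
                sum += buf[i] & 0xff; // same unsigned masking as the mapped-buffer version
            }
        }
        return sum;
    }

    public static void main(String[] args) throws IOException {
        byte[] data = {1, 2, 3, (byte) 250};
        System.out.println(sumBytes(new ByteArrayInputStream(data))); // prints 256
    }
}
```

The same chunked loop works on a FileInputStream; the in-memory stream here just keeps the sketch self-contained.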
This is because a memory-mapped file maps the on-disk file to a contiguous region of memory, which is then handled like a byte array. Reads and writes operate directly on that memory, and the operating system later writes the modified region back to the file on disk. This eliminates the intermediate system calls of frequent explicit disk access, greatly reducing read and write time.
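The write-back side of this can be sketched as follows. The example writes bytes through a MappedByteBuffer, calls force() to flush the mapped region to disk, then reads the file back through a separate RandomAccessFile to confirm the writes landed (the file name and size here are illustrative, not from the original article):

```java
import java.io.IOException;
import java.io.RandomAccessFile;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;

public class MappedWriteDemo {
    public static void main(String[] args) throws IOException {
        String path = "demo.bin"; // illustrative path in the working directory
        int size = 16;
        try (RandomAccessFile raf = new RandomAccessFile(path, "rw");
             FileChannel channel = raf.getChannel()) {
            MappedByteBuffer buffer = channel.map(FileChannel.MapMode.READ_WRITE, 0, size);
            for (int i = 0; i < size; i++) {
                buffer.put(i, (byte) i); // writes go to the mapped memory region
            }
            buffer.force(); // flush the modified region back to the file on disk
        }
        // Read back through ordinary file I/O to confirm the writes were persisted.
        try (RandomAccessFile raf = new RandomAccessFile(path, "r")) {
            System.out.println(raf.readByte()); // prints 0
            raf.seek(5);
            System.out.println(raf.readByte()); // prints 5
        }
    }
}
```

Note that even without force(), the OS eventually writes dirty mapped pages back; force() just makes the flush explicit, which matters for durability guarantees.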
That is all of the example code above for processing large files with memory mapping in Java. I hope it provides a useful reference, and I hope you will continue to support Wulin.com.