
What is the best way to read a log file with huge data?

I have created a logging tool which monitors changes to a file and reads all the data from the changed file. I followed these steps:

1. Register a file listener to detect changes; it gives me the name of the file that changed.
2. A regular expression reads the log4j data from the file, based on the specified pattern.
3. Read the data using a BufferedReader, which reads sequentially from the start of the file to the end, and put all of it into a list that maintains insertion order, so the latest logs end up at the end of the list.
4. Get the first 100 records and show them on the UI.

I wanted to know whether my approach is correct, or whether there is a way to read the file data faster. Logs can run to gigabytes, so reading the entire file is not efficient, and storing it all in a list will have a huge impact on memory. Any alternatives or suggestions? (A sketch of a bounded-memory version of these steps follows below.)
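For concreteness, here is a minimal sketch of steps 1-4 with bounded memory, using java.nio's WatchService as the file listener and a fixed-size ArrayDeque in place of an unbounded list. The directory path, the log4j line pattern, and the 100-record limit are illustrative assumptions, not taken from the original tool.

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.nio.file.*;
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.regex.Pattern;

public class LogWatcher {

    // Illustrative assumptions: adjust the directory and pattern to your setup.
    private static final Path LOG_DIR = Paths.get("/var/log/myapp");
    private static final Pattern LOG4J_LINE =
            Pattern.compile("^\\d{4}-\\d{2}-\\d{2} \\S+ +(TRACE|DEBUG|INFO|WARN|ERROR) .*");
    private static final int MAX_RECORDS = 100;

    public static void main(String[] args) throws IOException, InterruptedException {
        // Step 1: register a listener that reports which file changed.
        WatchService watcher = FileSystems.getDefault().newWatchService();
        LOG_DIR.register(watcher, StandardWatchEventKinds.ENTRY_MODIFY);

        while (true) {
            WatchKey key = watcher.take();              // blocks until something changes
            for (WatchEvent<?> event : key.pollEvents()) {
                if (event.kind() == StandardWatchEventKinds.OVERFLOW) {
                    continue;                           // context is not a Path here
                }
                Path changed = LOG_DIR.resolve((Path) event.context());
                for (String record : latestRecords(changed)) {
                    System.out.println(record);         // hand these to the UI instead
                }
            }
            key.reset();                                // re-arm the key for further events
        }
    }

    // Steps 2-4: scan the file once, but keep only the last MAX_RECORDS matching
    // lines, so memory stays bounded no matter how large the log file grows.
    private static Deque<String> latestRecords(Path file) throws IOException {
        Deque<String> window = new ArrayDeque<>(MAX_RECORDS);
        try (BufferedReader reader = Files.newBufferedReader(file)) {
            String line;
            while ((line = reader.readLine()) != null) {
                if (LOG4J_LINE.matcher(line).matches()) {
                    if (window.size() == MAX_RECORDS) {
                        window.removeFirst();           // evict the oldest record
                    }
                    window.addLast(line);
                }
            }
        }
        return window;
    }
}
```

The ArrayDeque replaces the unbounded list from step 3: insertion order is still preserved, but memory is capped at 100 records regardless of file size.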

Since you only need 100 records, I suggest reading only enough lines to find the data you need. You shouldn't be afraid of GBs of data: you have GBs of memory and disk, and you only need to display as much data as the user can see. I wouldn't assume you have a performance problem until you have measured it. You can realistically manage TBs of data in a reasonable time frame (with the right hardware).
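As a hedged sketch of "reading only enough lines": seek near the end of the file with RandomAccessFile and widen the window backwards until the requested number of lines is available, instead of streaming the whole file. The 64 KB step, the UTF-8 decoding, and the file path in the usage example are my assumptions, not the answerer's.

```java
import java.io.IOException;
import java.io.RandomAccessFile;
import java.nio.charset.StandardCharsets;
import java.util.ArrayDeque;
import java.util.Deque;

public class TailReader {

    // Return up to `wanted` final lines of the file without reading it all.
    // Assumes the tail region being decoded fits comfortably in memory.
    static Deque<String> tail(String path, int wanted) throws IOException {
        try (RandomAccessFile file = new RandomAccessFile(path, "r")) {
            long length = file.length();
            long step = 64 * 1024;                       // assumed window increment
            long start = length;

            while (true) {
                start = Math.max(0, start - step);
                file.seek(start);
                byte[] buf = new byte[(int) (length - start)];
                file.readFully(buf);
                String[] lines = new String(buf, StandardCharsets.UTF_8).split("\n");

                // Unless we reached the start of the file, the first piece is
                // probably a partial line cut by the seek, so don't count it.
                int firstComplete = (start == 0) ? 0 : 1;
                if (lines.length - firstComplete >= wanted || start == 0) {
                    Deque<String> result = new ArrayDeque<>();
                    for (int i = Math.max(firstComplete, lines.length - wanted);
                         i < lines.length; i++) {
                        result.addLast(lines[i]);
                    }
                    return result;
                }
            }
        }
    }

    public static void main(String[] args) throws IOException {
        // Hypothetical path; prints the last 100 lines of the log.
        tail("/var/log/myapp/app.log", 100).forEach(System.out::println);
    }
}
```

For log4j files specifically, the same backwards scan could match the record pattern instead of raw lines, so that multi-line entries such as stack traces stay grouped with their record.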
