
Storing vast amounts of "uptime" data for a website monitoring service

This is more of a general discussion than a code question.

I have a website monitoring platform where users can enter their website URL and we check it every X minutes, based on the customer's chosen interval. At each interval, an entry is stored as an UptimeCheck model in the Laravel 8 project, with the status recorded as either up or down.

If a customer has 20 monitors, and each checks every minute, then over a 30-day period that one customer accumulates 864,000 rows (20 × 1,440 checks per day × 30 days), approaching a million rows.
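For concreteness, the row count works out like this (a quick sketch of the arithmetic, assuming one check per minute per monitor):

```python
monitors = 20
checks_per_day = 60 * 24  # one check per minute, all day
days = 30

rows = monitors * checks_per_day * days
print(rows)  # → 864000
```

At larger intervals the number drops proportionally: five-minute checks would cut it to a fifth.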

My question really is: do I need to keep this many rows?

These rows are kept so that we can present a graph showing the average website uptime.

My thinking is that if I programmatically created some kind of SVG for each day and stored that in the table, I wouldn't need to store as many entries. My concern is: how would I then merge those SVG models into a single daily graph?

What kind of libraries could I use and how else might I approach this?

Unlike performance data, the trick for storing uptime data is simple: you don't store it. ;) Store DOWNTIME data instead. Register only unavailability events (when an outage started and when it ended) and extrapolate uptime when displaying reports.
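The idea above can be sketched as follows. This is a minimal, language-agnostic illustration (in Python rather than Laravel/PHP), assuming hypothetical downtime records stored as `(started_at, ended_at)` pairs, where `ended_at` is `None` for an outage that is still ongoing:

```python
from datetime import datetime, timedelta

def uptime_percentage(downtime_events, window_start, window_end):
    """Extrapolate uptime over a reporting window from stored downtime
    intervals, instead of summing millions of per-minute check rows."""
    window_seconds = (window_end - window_start).total_seconds()
    down_seconds = 0.0
    for started_at, ended_at in downtime_events:
        # Clip each outage to the reporting window before summing,
        # so outages that straddle the window edges count correctly.
        start = max(started_at, window_start)
        end = min(ended_at or window_end, window_end)
        if end > start:
            down_seconds += (end - start).total_seconds()
    return 100.0 * (window_seconds - down_seconds) / window_seconds

# Example: one 36-minute outage over a 30-day reporting window.
window_start = datetime(2024, 1, 1)
window_end = window_start + timedelta(days=30)
events = [
    (datetime(2024, 1, 10, 3, 0), datetime(2024, 1, 10, 3, 36)),
]
print(round(uptime_percentage(events, window_start, window_end), 3))  # → 99.917
```

A customer with 20 monitors and a handful of outages a month then stores a handful of rows instead of 864,000, and the daily graph can be drawn by evaluating this function per day rather than by merging stored SVGs.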

