
robots.txt from database for asp.net website

I have a custom-made CMS for a website and would like to generate the robots.txt file from contents stored in a database, rather than having to modify an actual file on disk.

I know that IIS would normally cache such a file as static content, so as long as it has not changed, it does not have to be re-read from disk on every request.

I was thinking of keeping the contents of what should be in robots.txt in memory (a static string or a singleton) and having an HttpHandler write out that content.

Is this a good approach, or is there a better way? Ideally I would like to mimic IIS's static-file caching as closely as possible.

I think you are pointing the right way.

You have full control over the response, so just handle it based on a last-modified date stored in your database. You can set a new ETag each time the file "changes":

// Generate a fresh ETag only when the contents change, and send it with the response.
Response.AddHeader("ETag", "\"" + Guid.NewGuid().ToString() + "\"");

This gives you all the control you need over the cache.
