
What's the best way to generate a sitemap?

I need to generate a sitemap for a website that could potentially have a very large amount of user-contributed content. I've read this tutorial: https://laravel-news.com/2016/09/laravel-sitemap/ It gives an example like this:

public function podcasts()
{
    // Pulls every active podcast into memory in a single query.
    $podcasts = Podcast::active()->orderBy('updated_at', 'desc')->get();
    return response()->view('sitemap.podcasts', [
        'podcasts' => $podcasts,
    ])->header('Content-Type', 'text/xml');
}

What I don't like is that it gets all the podcasts from the database at once. If you have 1 million records, that will slow everything down, and this function is called each time a web spider requests the sitemap.
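One way to avoid hitting the database on every spider request is to cap the query and cache the rendered XML. A minimal sketch, assuming Laravel's Cache facade and the tutorial's Podcast model (the one-hour TTL, the cache key, and the 50,000 cap are illustrative assumptions):

use Illuminate\Support\Facades\Cache;

public function podcasts()
{
    // Re-render at most once per hour; spiders in between get the cached XML.
    $xml = Cache::remember('sitemap.podcasts', 3600, function () {
        // Cap at 50,000 rows (the single-file sitemap limit) instead of loading every record.
        $podcasts = Podcast::active()->orderBy('updated_at', 'desc')->take(50000)->get();
        return view('sitemap.podcasts', ['podcasts' => $podcasts])->render();
    });

    return response($xml)->header('Content-Type', 'text/xml');
}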

If your site grows really large, should the sitemap include all the database records for, say, blog posts when there are 500,000+ of them, or just the latest 50,000, which is the maximum for one sitemap file? Why should I include all million blog posts and split them across multiple sitemaps if Google has already crawled them? I update my sitemap regularly, so there is no need to fetch every database record each time the sitemap is accessed; old posts won't get crawled again, so I might as well just get the latest 50,000.
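If you do want every post discoverable, the sitemap protocol handles this with a sitemap index: one small XML file listing the individual sitemap files, each capped at 50,000 URLs. A rough sketch of paginated routes, assuming a Post model and two hypothetical Blade views, sitemap.index and sitemap.posts:

// Sitemap index: one <sitemap> entry per 50,000-post page.
public function index()
{
    $pages = (int) ceil(Post::count() / 50000);
    return response()->view('sitemap.index', ['pages' => $pages])
        ->header('Content-Type', 'text/xml');
}

// One sitemap file per page; the offset keeps each query bounded.
public function posts(int $page)
{
    $posts = Post::orderBy('id')
        ->skip(($page - 1) * 50000)
        ->take(50000)
        ->get();
    return response()->view('sitemap.posts', ['posts' => $posts])
        ->header('Content-Type', 'text/xml');
}

Google fetches the per-page files on its own schedule, so unchanged old pages cost little; if deep offsets get slow at millions of rows, keyset pagination on id is the usual fix.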

Try this package; it lets you sort them by date. Carbon is here!
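For the date-based filtering this answer hints at, a minimal sketch with Carbon, assuming a Post model and an illustrative 30-day window:

use Carbon\Carbon;

// Only include posts touched recently; older URLs have already been crawled.
$posts = Post::where('updated_at', '>=', Carbon::now()->subDays(30))
    ->orderBy('updated_at', 'desc')
    ->get();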
