
What's the best way to generate a sitemap?

I need to generate a sitemap for a website that could potentially have a very large amount of user-contributed content. I've read this tutorial: https://laravel-news.com/2016/09/laravel-sitemap/ It gives an example like this:

public function podcasts()
{
    $podcast = Podcast::active()->orderBy('updated_at', 'desc')->get();
    return response()->view('sitemap.podcasts', [
        'podcasts' => $podcast,
    ])->header('Content-Type', 'text/xml');
}

What I don't like is that it fetches all the podcasts from the database at once. If you have 1 million records, that will slow everything down, and this function will be called every time a web spider requests the sitemap.
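A minimal sketch of how the controller above could sidestep both issues, assuming the same `Podcast` model and `sitemap.podcasts` view: cache the rendered XML so crawlers don't trigger a fresh query on every request, and cap the query at the 50,000-URL per-file limit. The cache key and one-hour TTL are illustrative choices, not requirements.

```php
use Illuminate\Support\Facades\Cache;

public function podcasts()
{
    // Render the sitemap at most once per hour instead of on every crawl.
    $xml = Cache::remember('sitemap.podcasts', 3600, function () {
        // Only the newest 50,000 entries — the per-file sitemap limit.
        $podcasts = Podcast::active()
            ->orderBy('updated_at', 'desc')
            ->limit(50000)
            ->get();

        return view('sitemap.podcasts', ['podcasts' => $podcasts])->render();
    });

    return response($xml)->header('Content-Type', 'text/xml');
}
```

With this shape, a crawler hitting the URL repeatedly only costs a cache read; the database is touched once per TTL window.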

If your site grows really large, should the sitemap include all the database records — say 500,000+ blog posts — or just the latest 50,000, which is the maximum for one sitemap file? Since I update my sitemap regularly, Google has already crawled the older posts, and old posts won't be crawled again, so there seems to be no need to fetch every record each time the sitemap is accessed. Why should I include all million blog posts and split them into multiple sitemaps, when I could just serve the latest 50,000?

Try this package; it enables you to sort them by date with Carbon. Here!
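If you do decide to index every post rather than only the latest 50,000, the standard approach is a sitemap index that points at paginated child sitemaps. A rough sketch in plain Laravel, where the `Post` model, route paths, controller name, and view names are all assumptions:

```php
// routes/web.php
Route::get('/sitemap.xml', [SitemapController::class, 'index']);
Route::get('/sitemap-posts-{page}.xml', [SitemapController::class, 'posts'])
    ->where('page', '[0-9]+');

// app/Http/Controllers/SitemapController.php
public function index()
{
    // One child sitemap per 50,000 posts (the per-file limit).
    $pages = (int) ceil(Post::count() / 50000);

    return response()
        ->view('sitemap.index', ['pages' => $pages])
        ->header('Content-Type', 'text/xml');
}

public function posts(int $page)
{
    // Stable ordering by primary key keeps each page's contents consistent
    // between crawls, so already-indexed pages rarely change.
    $posts = Post::orderBy('id')
        ->forPage($page, 50000)
        ->get();

    return response()
        ->view('sitemap.posts', ['posts' => $posts])
        ->header('Content-Type', 'text/xml');
}
```

Ordering by `id` instead of `updated_at` means only the last page ever gains new entries, so crawlers can skip the unchanged earlier pages based on their `lastmod` values.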
