
NextJS Dynamic Pages cannot be crawled

I'm using Next.js with Express.js as a custom server.

I already implemented custom routes as shown in the Next.js documentation ( https://nextjs.org/docs#custom-routes-using-props-from-url ), and I'm using getInitialProps for server-side rendering.
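For context, the custom-routes setup in the docs maps an incoming "pretty" URL to a page file plus query parameters, which are then passed to getInitialProps on the server. A minimal sketch of that mapping logic, assuming a hypothetical `/post/:id` route (the `resolveRoute` helper and the route pattern are illustrative, not from the question):

```typescript
// Sketch of what a custom Express route does for a Next.js page:
// map an incoming URL to a page file plus query params, so the page
// is server-rendered with those params. The "/post/:id" pattern and
// the resolveRoute helper are hypothetical examples.

interface Resolved {
  page: string;                  // the Next.js page to render
  query: Record<string, string>; // params handed to getInitialProps
}

function resolveRoute(path: string): Resolved | null {
  // Equivalent of:
  //   server.get('/post/:id', (req, res) =>
  //     app.render(req, res, '/post', { id: req.params.id }))
  const match = path.match(/^\/post\/([^/]+)$/);
  if (match) {
    return { page: "/post", query: { id: match[1] } };
  }
  return null; // fall through to the default Next.js request handler
}
```

If this mapping works, the HTML for `/post/123` is already rendered on the server, so the crawling problem is more likely on the crawler-configuration side than in the routing.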

I also used Screaming Frog SEO Spider to test whether my dynamic pages can be crawled. It only crawls the static pages, not the dynamic ones. I don't know if I'm doing something wrong; I just followed the documentation for custom routes.

I really need crawlers to reach these dynamic pages, because this affects the SEO of our website.

Thanks

There is a common SEO recommendation to avoid fully client-rendered websites. I am not an expert in Next.js or Express.js, but in general most crawlers don't handle dynamic websites well: to crawl them they need to execute JavaScript, which takes time and resources. As far as I know, Googlebot can render JavaScript, so it is possible that it crawls your website successfully even where simpler crawlers fail. Still, please do not build a pure SPA if SEO matters. As for Screaming Frog SEO Spider: as far as I know it can also render pages with Chromium, like Googlebot, but JavaScript rendering has to be enabled in its configuration. Please read its documentation.

For my project, I added a sitemap.xml.tsx page that lets the Google crawler discover all the available pages. For this to work, you have to be able to retrieve every dynamic page you want crawled and then generate the sitemap from that list.

I would follow along with the example given here: https://dev.to/timrichter/dynamic-sitemap-with-next-js-41pe on how to correctly implement the sitemap.
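The core of that approach can be sketched as a function that turns a list of absolute page URLs (static routes plus whatever dynamic slugs you fetch from your data source) into sitemap XML; in a pages-router project you would then return that string with a `text/xml` content type from `pages/sitemap.xml.tsx` via getServerSideProps. The `buildSitemapXml` name and the example URLs below are assumptions for illustration, not from the linked post:

```typescript
// Build sitemap XML from a list of absolute page URLs.
// In a real Next.js app you would fetch the dynamic slugs (e.g. from
// your database or CMS), build the full URL list, and serve this
// string from pages/sitemap.xml.tsx. Names and URLs are illustrative.

function buildSitemapXml(urls: string[]): string {
  const entries = urls
    .map((url) => `  <url>\n    <loc>${url}</loc>\n  </url>`)
    .join("\n");
  return (
    '<?xml version="1.0" encoding="UTF-8"?>\n' +
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n' +
    entries +
    "\n</urlset>"
  );
}
```

Because the `<loc>` entries list every dynamic URL explicitly, the crawler no longer has to discover those pages by executing JavaScript and following links.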

