
ASP.NET Core not serving robots.txt

In my Startup class's Configure method I have the following setup for a single-page application that is built separately:

app.UseStaticFiles();

app.Run(async (context) =>
{
    context.Response.ContentType = "text/html";
    await context.Response.SendFileAsync(Path.Combine(env.WebRootPath, "index.html"));
});

Then, in the wwwroot folder of the app I have a robots.txt file (at the same level as my JS/CSS files). The JS/CSS files are served fine by the static files middleware, but requests for robots.txt always fall through to the catch-all middleware.

How can I make it serve robots.txt as a static file too?

You need to register the AspNetCore.SEOHelper middleware in your Configure method, before the static-files and catch-all middleware you already have:

using AspNetCore.SEOHelper;

public void Configure(IApplicationBuilder app, IWebHostEnvironment env)
{
    // add this line before the middleware you already have
    app.UseRobotsTxt(env.ContentRootPath);

    app.UseStaticFiles();
}

You will need to install the AspNetCore.SEOHelper package from NuGet.
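Putting this together with the original SPA fallback, the full Configure method would look something like the sketch below. Note that UseRobotsTxt reads robots.txt from the path you pass it, so with env.ContentRootPath the file should sit in the project's content root rather than in wwwroot (this is how the AspNetCore.SEOHelper package documents it; adjust the path if your file lives elsewhere):

```csharp
using System.IO;
using AspNetCore.SEOHelper;
using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.Hosting;

public void Configure(IApplicationBuilder app, IWebHostEnvironment env)
{
    // Serve /robots.txt from the content root via AspNetCore.SEOHelper.
    app.UseRobotsTxt(env.ContentRootPath);

    // Serve JS/CSS and other assets from wwwroot.
    app.UseStaticFiles();

    // SPA fallback: any request not handled above gets index.html.
    app.Run(async context =>
    {
        context.Response.ContentType = "text/html";
        await context.Response.SendFileAsync(
            Path.Combine(env.WebRootPath, "index.html"));
    });
}
```

The ordering matters: middleware runs in registration order, so the robots.txt and static-files handlers must come before app.Run, which unconditionally terminates the pipeline.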
