
How to update robots.txt on pantheon.io - WordPress

How can I update robots.txt on the Live site of a Pantheon environment?

I have tried the following options: 1) via FTP, 2) via the WordPress SEO >> Tools page.

Are there any special steps I need to follow, given that it's a WordPress instance?

Nothing special. There are two options here:

  1. Create a robots.txt file locally. Add the desired statements. Upload to Pantheon via SFTP or Git.

  2. Pull down the existing robots.txt file from Pantheon, modify as necessary, and push back up via SFTP or Git.
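As a sketch of the Git route, the steps might look like the following. The repository here is a throwaway local one created just for illustration; in practice you would clone your site's actual repository using the URL shown under Connection Info on the Pantheon Dashboard, and the Disallow rules below are only placeholders:

```shell
# Stand-in for your site's repository (clone the real one from Pantheon).
mkdir -p pantheon-demo
git -C pantheon-demo init -q

# Create a minimal robots.txt; the rules below are only placeholders.
printf 'User-agent: *\nDisallow: /wp-admin/\n' > pantheon-demo/robots.txt

# Commit the file. Pushing (commented out here) would send it to the
# Dev environment of your Pantheon site.
git -C pantheon-demo add robots.txt
git -C pantheon-demo -c user.name=demo -c user.email=demo@example.com \
    commit -qm "Add robots.txt"
# git -C pantheon-demo push origin master
```

From there, the file still has to be deployed through the Dashboard workflow described below.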

In both cases, keep in mind that Pantheon enforces a workflow. You have the Dev, Test, and Live servers. When you push, whether by Git or SFTP, you are essentially pushing to the Dev environment. Note that if you choose to use SFTP, the Pantheon site must be in SFTP mode (not Git mode), and you should log into the Dev environment over SFTP. From there, you must deploy up to the Live environment. You do this via the Pantheon Dashboard.

EDIT: Since you are going the SFTP route, you will need to log in via SFTP to the Dev environment. Once logged in, upload the file to the /code directory, which is the root directory of the WordPress installation, so the file will end up at /code/robots.txt. After uploading, return to the Pantheon Dashboard, commit your changes, and deploy them through Dev, Test, and Live.

Hope this helps.

If you have no experience with PHP, or don't feel comfortable modifying your theme's code for whatever reason, the solution above should work perfectly.

Alternative PHP approach

If this is a site you are developing or maintaining and you feel comfortable modifying the theme, there is another approach that will save you time in the long run.

Filters to the Rescue!

If you are unfamiliar with hooks and filters in WordPress, I'll refer you to this article from the Treehouse blog or a quick Google search. The hook and filter system plays a fundamental part in how plugins like Yoast SEO function, allowing them, for example, to modify the output of the robots.txt file.

We can use this same robots_txt filter to modify the output of our site's robots.txt file without any external plugin or theme dependency. If you use git or svn to manage your theme or /wp-content/ directory, this approach lets you keep any modifications under version control.

The code below should live in your theme's functions.php file or another included PHP file of your choosing.

<?php
/**
 * Append custom rules to WordPress's virtual robots.txt output.
 *
 * @param string $output The default robots.txt output.
 * @return string The modified output.
 */
function so_robots_txt_50725645( $output ) {
    $output .= 'User-agent: *' . PHP_EOL;
    $output .= 'Disallow: /wp-includes/' . PHP_EOL;
    $output .= 'Disallow: /wp-content/uploads/' . PHP_EOL;

    return $output;
}

// Hook our function into the robots_txt filter.
add_filter( 'robots_txt', 'so_robots_txt_50725645', 10, 1 );

What's listed above is just an example; you could populate the $output variable with whatever content you want to appear on the robots.txt page. In this example we are appending new Disallow lines to the existing output via the .= operator.

After all operations have been completed, we return the modified $output and go on our way, never to worry about migrating pesky robots.txt files again.
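With the filter above in place, requesting /robots.txt from the site would return something along these lines. The first three lines are WordPress's usual defaults, which the filtered output is appended to; the exact defaults vary with the WordPress version and the site's search-engine visibility setting:

```text
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
User-agent: *
Disallow: /wp-includes/
Disallow: /wp-content/uploads/
```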
