This article covers how to apply an X-Robots noindex to a specific page via .htaccess, which should be a useful reference for anyone facing the same problem.

Question

Can I 'noindex, follow' a specific page using the X-Robots-Tag in .htaccess?

I've found some instructions for noindexing certain types of files, but I can't find instructions for noindexing a single page, and what I have tried so far hasn't worked.

This is the page I'm looking to noindex:

http://www.examplesite.com.au/index.php?route=news/headlines

This is what I have tried so far:

<FilesMatch "/index.php?route=news/headlines$">
 Header set X-Robots-Tag "noindex, follow"
</FilesMatch>

Thanks for your time.

Answer

It seems to be impossible to match request parameters (the query string) from within a .htaccess file. Here is a list of what you can match against: http://httpd.apache.org/docs/2.2/sections.html

It will be much easier to do this in your script. If you are running PHP, try:

header('X-Robots-Tag: noindex, follow');

You can easily build conditions on $_GET, REQUEST_URI, and so on.
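
For example, here is a minimal sketch of such a condition, assuming (as in the URL from the question) that the front controller receives the route as $_GET['route']:

<?php
// Minimal sketch: mark only the news/headlines route as "noindex, follow".
// Assumes the page is requested as index.php?route=news/headlines.
$route = isset($_GET['route']) ? $_GET['route'] : '';

if ($route === 'news/headlines') {
    // header() must be called before any output is sent to the browser.
    header('X-Robots-Tag: noindex, follow');
}

Note that the header() call has to run before the script produces any output, so this check belongs near the top of index.php (or in whatever bootstrap code runs first).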
