SO user: Brooke.
Search engines should respect the industry-standard robots.txt file, which can be used to block access to a post type. For example, you can block access to anything under example.com/deals.
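As a sketch, assuming the deals are served under the /deals/ path (as in the example URL above), the robots.txt rule could look like:

```
User-agent: *
Disallow: /deals/
```

Keep in mind that robots.txt is advisory: well-behaved crawlers honor it, but it does not actually prevent anyone from requesting those URLs.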
You can also go further and check $_SERVER['HTTP_USER_AGENT'] for bots. Something like:
$bot_list = array("Teoma", "alexa", "froogle", "Gigabot", "inktomi",
"looksmart", "URL_Spider_SQL", "Firefly", "NationalDirectory",
"Ask Jeeves", "TECNOSEEK", "InfoSeek", "WebFindBot", "girafabot",
"crawler", "www.galaxy.com", "Googlebot", "Scooter", "Slurp",
"msnbot", "appie", "FAST", "WebBug", "Spade", "ZyBorg", "rabaz",
"Baiduspider", "Feedfetcher-Google", "TechnoratiSnoop", "Rankivabot",
"Mediapartners-Google", "Sogou web spider", "WebAlta Crawler");
// A bot's name appears as a substring of its full user-agent string,
// so match with stripos() rather than an exact in_array() lookup.
foreach ($bot_list as $bot) {
    if (stripos($_SERVER['HTTP_USER_AGENT'], $bot) !== false) {
        wp_die("You are a robot, I don't like you so go away!");
    }
}
The bot list above comes from this good tutorial on bot detection.
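The same check can also be factored into a small standalone function so it is testable outside WordPress. This is only a sketch; the helper name `ua_is_bot` is my own and not from the tutorial:

```php
<?php
// Hypothetical helper: returns true if the user-agent string contains
// any of the known bot names, matched case-insensitively.
function ua_is_bot($user_agent, $bot_list) {
    foreach ($bot_list as $bot) {
        if (stripos($user_agent, $bot) !== false) {
            return true;
        }
    }
    return false;
}

// Usage with the $bot_list defined above:
// if (ua_is_bot($_SERVER['HTTP_USER_AGENT'], $bot_list)) {
//     wp_die("You are a robot, I don't like you so go away!");
// }
```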
Now, if you use private posts, users who visit them will get a 404 error. The same goes for bots: they'll see a 404 too. Whether a bot then indexes a 404 page depends on the bot, but most don't.
That said, if you're only linking deals out to other authors' sites, why use posts at all when the posts themselves can't be visited? This might be a better use case for links and link categories.