Programming Notes

A blocklist of junk spiders every website should ban!
2025-01-14

These junk spiders are useless to your site and only waste server bandwidth, so they are worth blocking outright.

Method 1: URL rewrite rules

Insert the following into the URL rewrite ("伪静态") configuration of the 宝塔 (BT) panel. Note that nginx's `~` operator is a case-sensitive regex match; use `~*` instead if you want the patterns matched case-insensitively.

if ($http_user_agent ~ AhrefsBot) { return 403; }
if ($http_user_agent ~ YandexBot) { return 403; }
if ($http_user_agent ~ MJ12bot) { return 403; }
if ($http_user_agent ~ DotBot) { return 403; }
if ($http_user_agent ~ RU_Bot) { return 403; }
if ($http_user_agent ~ Ezooms) { return 403; }
if ($http_user_agent ~ Yeti) { return 403; }
if ($http_user_agent ~ BLEXBot) { return 403; }
if ($http_user_agent ~ Exabot) { return 403; }
if ($http_user_agent ~ YisouSpider) { return 403; }
if ($http_user_agent ~ sandcrawlerbot) { return 403; }
if ($http_user_agent ~ ShopWiki) { return 403; }
if ($http_user_agent ~ Genieo) { return 403; }
if ($http_user_agent ~ Aboundex) { return 403; }
if ($http_user_agent ~ coccoc) { return 403; }
if ($http_user_agent ~ MegaIndex) { return 403; }
if ($http_user_agent ~ spbot) { return 403; }
if ($http_user_agent ~ SemrushBot) { return 403; }
if ($http_user_agent ~ TwengaBot) { return 403; }
if ($http_user_agent ~ SEOkicks-Robot) { return 403; }
if ($http_user_agent ~ WordPress) { return 403; }
if ($http_user_agent ~ BUbiNG) { return 403; }
if ($http_user_agent ~ PetalBot) { return 403; }
if ($http_user_agent ~ Adsbot) { return 403; }
if ($http_user_agent ~ NetcraftSurveyAgent) { return 403; }
if ($http_user_agent ~ Barkrowler) { return 403; }
if ($http_user_agent ~ serpstatbot) { return 403; }
if ($http_user_agent ~ MegaIndex\.ru) { return 403; }
if ($http_user_agent ~ DataForSeoBot) { return 403; }
if ($http_user_agent ~ Amazonbot) { return 403; }
if ($http_user_agent ~ ClaudeBot) { return 403; }
if ($http_user_agent ~ GPTBot) { return 403; }

=========================

Insert these rules before all of your existing rewrite rules!
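As a sketch, the per-bot `if` blocks above can also be consolidated into a single case-insensitive `map` (this assumes you can edit the main nginx configuration, which the 宝塔 panel allows; `$is_bad_bot` is a variable name chosen here, not something predefined):

```nginx
# One case-insensitive regex covering the same bots as the rules above.
# The map goes in the http{} block; the if goes inside each server{} block.
map $http_user_agent $is_bad_bot {
    default 0;
    "~*(AhrefsBot|YandexBot|MJ12bot|DotBot|RU_Bot|Ezooms|Yeti|BLEXBot|Exabot|YisouSpider|sandcrawlerbot|ShopWiki|Genieo|Aboundex|coccoc|MegaIndex|spbot|SemrushBot|TwengaBot|SEOkicks-Robot|WordPress|BUbiNG|PetalBot|Adsbot|NetcraftSurveyAgent|Barkrowler|serpstatbot|DataForSeoBot|Amazonbot|ClaudeBot|GPTBot)" 1;
}

server {
    # ... your existing server configuration ...
    if ($is_bad_bot) {
        return 403;
    }
}
```

Keeping the list in one place makes it easier to add or remove bots later, and `~*` also catches user agents that vary their capitalization.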


Method 2: create a robots.txt file and add the following:

User-agent: AhrefsBot
Disallow: /

User-agent: YandexBot
Disallow: /

User-agent: DotBot
Disallow: /

User-agent: RU_Bot
Disallow: /

User-agent: Yeti
Disallow: /

User-agent: BLEXBot
Disallow: /

User-agent: YisouSpider
Disallow: /

User-agent: sandcrawlerbot
Disallow: /

User-agent: Genieo
Disallow: /

User-agent: Aboundex
Disallow: /

User-agent: MegaIndex
Disallow: /

User-agent: spbot
Disallow: /

User-agent: TwengaBot
Disallow: /

User-agent: SEOkicks-Robot
Disallow: /

User-agent: BUbiNG
Disallow: /

User-agent: PetalBot
Disallow: /

User-agent: NetcraftSurveyAgent
Disallow: /

User-agent: Barkrowler
Disallow: /

User-agent: MegaIndex.ru
Disallow: /

User-agent: DataForSeoBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: GPTBot
Disallow: /
=======================

Method 1 blocks junk spiders outright: any matching request is refused with a 403 Forbidden response.

Method 2 simply tells them they are not welcome; robots.txt is advisory, so ill-behaved bots may ignore it, which is why the two methods work best together.
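To sanity-check which User-Agent strings Method 1 would reject, the matching logic can be mirrored in a small script. This is a hypothetical helper, not part of the panel; it reproduces nginx's case-sensitive `~` regex search using a representative subset of the bot names above:

```python
import re

# A representative subset of the bot name patterns from the rewrite rules.
BAD_BOTS = [
    "AhrefsBot", "YandexBot", "MJ12bot", "DotBot", "SemrushBot",
    "PetalBot", "DataForSeoBot", "Amazonbot", "ClaudeBot", "GPTBot",
]

def is_blocked(user_agent: str) -> bool:
    """Mimic nginx `if ($http_user_agent ~ Pattern)`: a case-sensitive
    regex search anywhere in the User-Agent header."""
    return any(re.search(pattern, user_agent) for pattern in BAD_BOTS)

print(is_blocked("Mozilla/5.0 (compatible; AhrefsBot/7.0; +http://ahrefs.com/robot/)"))  # True
print(is_blocked("Mozilla/5.0 (Windows NT 10.0; Win64; x64)"))                           # False
print(is_blocked("mozilla/5.0 (compatible; ahrefsbot/7.0)"))  # False: `~` is case-sensitive
```

The last case shows why `~*` (case-insensitive) may be preferable in the nginx rules: a bot that lowercases its name would otherwise slip through.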