# 1. Block the specific aggressive bots found in your logs
User-agent: Baiduspider
User-agent: DataForSeoBot
User-agent: PetalBot
Disallow: /

# 2. Protect dynamic and paginated URLs from ALL bots
# This stops the "deep crawling" that caused the high CPU usage
User-agent: *
Disallow: /*?page=*
Disallow: /*?year=*
Disallow: /*?type=*
Disallow: /search/
Crawl-delay: 5

# 3. Keep your existing massive block list below if you prefer,
# but ensure there is a "Disallow: /" under each group.
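You can sanity-check these rules before deploying them with Python's standard-library `urllib.robotparser`. A caveat: this parser does plain prefix matching and does not implement the `*` wildcard extension (major crawlers such as Googlebot and Bingbot do honor wildcards, and Googlebot ignores `Crawl-delay`), so only the per-bot blocks, the `/search/` prefix rule, and the delay value can be verified this way. The `example.com` URLs below are placeholders:

```python
from urllib import robotparser

# The robots.txt rules above, as they would be served from the site root.
ROBOTS_TXT = """\
User-agent: Baiduspider
User-agent: DataForSeoBot
User-agent: PetalBot
Disallow: /

User-agent: *
Disallow: /*?page=*
Disallow: /*?year=*
Disallow: /*?type=*
Disallow: /search/
Crawl-delay: 5
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# The named aggressive bots are blocked from the entire site.
print(rp.can_fetch("Baiduspider", "https://example.com/archive"))  # False

# All other crawlers are blocked from /search/ but allowed elsewhere.
print(rp.can_fetch("Googlebot", "https://example.com/search/q"))   # False
print(rp.can_fetch("Googlebot", "https://example.com/archive"))    # True

# The crawl delay is exposed to compliant clients.
print(rp.crawl_delay("Googlebot"))  # 5
```

Note that the parser silently treats the wildcard lines (`/*?page=*` etc.) as literal path prefixes, so they only take effect with crawlers that support the wildcard syntax.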