# Block malicious crawlers (and empty user agents)
if ($http_user_agent ~ "hubspot|CCBot|VelenPublicWebCrawler|Konturbot|my-tiny-bot|eiki|webmeup|ExtLinksBot|Go-http-client|Python|ZoominfoBot|MegaIndex\.ru|MauiBot|Amazonbot|ds-robot|intelx\.io|coccocbot|FeedDemon|Indy Library|Alexa Toolbar|AskTbFXTV|AhrefsBot|CrawlDaddy|CoolpadWebkit|Java|Barkrowler|Feedly|UniversalFeedParser|ApacheBench|Microsoft URL Control|Swiftbot|DuckDuckGo|ClaudeBot|ZmEu|oBot|jaunty|Python-urllib|lightDeckReports Bot|YYSpider|DigExt|MJ12bot|DotBot|heritrix|Bytespider|BLEXBot|serpstatbot|Ezooms|JikeSpider|InfoTigerBot|SemrushBot|DuckDuckGo-Favicons-Bot|ImagesiftBot|GPTBot|^$") {
    return 403;
}

# Deny access to sensitive files and directories
location ~ ^/(\.user\.ini|\.htaccess|\.git|\.svn|\.project|LICENSE|README\.md|package\.json|package-lock\.json|\.env) {
    return 404;
}

# Deny archive files (e.g. site backups) by extension
location ~ \.(zip|tar\.gz)$ {
    return 404;
}

# Allow cross-origin requests for web fonts so they display correctly on other origins
location ~* \.(eot|ttf|ttc|otf|woff|woff2|svg)$ {
    add_header Access-Control-Allow-Origin *;
}

# 1. Force port 80 over to 443. Blocks 1 and 2 are order-sensitive and must not be
#    swapped: redirect to https first, then to www.
if ($server_port !~ 443) {
    rewrite ^(/.*)$ https://$host$1 permanent;
}

# 2. Redirect the bare (non-www) domain to www
if ($host = "kd68.cn") {
    rewrite ^(/.*)$ https://www.$host$1 permanent;
}

# 3. Redirect old URL patterns that Baidu indexed (e.g.
#    https://www.hao366.net/tags-etagid6584-0.html) to the homepage so they
#    no longer break
if ($uri ~* "/tags-etagid|/tougao/|/wenda/|/post/") {
    rewrite ^(/.*)$ https://$host permanent;
}
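One gotcha worth checking: nginx's `~` operator performs a case-sensitive regex match against `$http_user_agent`, so a bot that lowercases its name slips through unless you use `~*` instead. The blocklist's behavior can be sanity-checked outside nginx with an equivalent regex; a minimal Python sketch (the pattern below is a hypothetical abbreviated subset of the full list above, not the complete rule):

```python
import re

# Abbreviated subset of the blocklist; "^$" catches empty user agents.
BAD_BOTS = re.compile(r"AhrefsBot|SemrushBot|GPTBot|ClaudeBot|Bytespider|MJ12bot|Python|^$")

def is_blocked(user_agent: str) -> bool:
    # Like nginx's "~", re.search here is case-sensitive: "ahrefsbot" would
    # NOT match. Use "~*" in nginx (re.IGNORECASE here) to block regardless
    # of case.
    return bool(BAD_BOTS.search(user_agent))

print(is_blocked("Mozilla/5.0 (compatible; AhrefsBot/7.0)"))  # True  -> nginx returns 403
print(is_blocked(""))                                         # True  -> empty UA blocked
print(is_blocked("Mozilla/5.0 (Windows NT 10.0) Firefox"))    # False -> request allowed
```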
When adding a new site, HTTPS should generally be mandatory, and addresses such as http://www.hao366.net and https://hao366.net should all be redirected to the single canonical https://www.hao366.net. In my testing, the order matters: add blocks 1 and 2 in exactly that order. If they are swapped, the browser is bounced through multiple redirects, wasting bandwidth.
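The order dependence comes from the `if` blocks running one after another inside a single server block, so a request to http://hao366.net needs two hops (http → https, then bare → www). An alternative that avoids `if` entirely is dedicated catch-all server blocks, each issuing a single 301. This is a sketch only, assuming `ssl_certificate` directives covering both hao366.net and www.hao366.net are configured elsewhere; it is not the configuration from this article:

```nginx
# Any http request, www or not, goes straight to the canonical name in one hop
server {
    listen 80;
    server_name hao366.net www.hao366.net;
    return 301 https://www.hao366.net$request_uri;
}

# https on the bare domain also reaches the canonical name in one hop
server {
    listen 443 ssl;
    server_name hao366.net;
    return 301 https://www.hao366.net$request_uri;
}
```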