The blog post at https://rknight.me/blog/blocking-bots-with-nginx/ is about blocking AI bots and web crawlers from accessing a website using Nginx configuration. Here are the key points covered in the post:

Motivation
- The author wants to block AI bots and crawlers from accessing his website, as he doesn't trust them to respect his robots.txt file.
- He was inspired by a previous post on blocking bots with Apache's .htaccess file.

Approach
- The author explores different ways to block bots in Nginx based on their user agent strings.
- He settles on a single if statement with a regular expression matching multiple bot user agents (a hedged sketch of such a rule appears after this summary).
- Matching requests are blocked with a 403 Forbidden response.

Implementation
- The author generates an Nginx configuration file (nginx.conf) with the Eleventy static site generator (see the Eleventy sketch below).
- He fetches a list of AI bot user agents from a GitHub repo and formats it for the Nginx config.
- The generated nginx.conf file is placed outside the website's public folder.
- Instructions are provided to include this file in the main Nginx config and verify that it works.

Testing
- The author tests the blocking by setting a bot user agent (ClaudeBot) in Chrome and confirming he receives a 403 Forbidden response (a curl equivalent is sketched below).

In summary, the post walks through the author's process of using Nginx configuration to identify and block requests from AI bots and crawlers.
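To make the approach concrete, here is a minimal sketch of the kind of rule the post describes. The specific bot names are illustrative assumptions; the post builds its list from an external source rather than hard-coding it:

```nginx
# Minimal sketch of a user-agent blocking rule (bot names are illustrative,
# not the post's actual list). Goes inside a server block.
if ($http_user_agent ~* "(GPTBot|ClaudeBot|CCBot|Bytespider)") {
    # Refuse the request outright with 403 Forbidden.
    return 403;
}
```

The `~*` operator makes the regex match case-insensitively, and a single alternation keeps the whole list in one `if` rather than one block per bot.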
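The post generates this snippet with Eleventy rather than maintaining it by hand. The sketch below shows one plausible shape, assuming a JavaScript class template and the `@11ty/eleventy-fetch` plugin; the source URL, output path, and list format are all assumptions, as the post's exact template and GitHub repo aren't reproduced here:

```js
// nginx.11ty.js — hypothetical Eleventy template that renders an Nginx
// bot-blocking snippet. The URL, permalink, and list format are assumptions.
const EleventyFetch = require("@11ty/eleventy-fetch");

module.exports = class {
  data() {
    return {
      permalink: "/nginx.conf",              // the post places the file outside the public folder
      eleventyExcludeFromCollections: true,  // keep it out of normal page collections
    };
  }

  async render() {
    // Assumed URL: a plain-text list of bot user agents, one per line.
    const list = await EleventyFetch(
      "https://raw.githubusercontent.com/example/ai-bots/main/list.txt",
      { duration: "1d", type: "text" }
    );
    // Join the names into a single regex alternation for the Nginx if statement.
    const bots = list.trim().split("\n").filter(Boolean).join("|");
    return `if ($http_user_agent ~* "(${bots})") {\n    return 403;\n}\n`;
  }
};
```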
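Once generated, the snippet has to be pulled into the live server configuration and exercised. A plausible include (the path is an assumption) and a curl check that mirrors the post's Chrome user-agent test:

```nginx
server {
    # ... existing site configuration ...
    include /var/www/site/nginx.conf;  # assumed path to the generated snippet
}
```

```sh
# Validate and reload Nginx after adding the include, then probe with a
# blocked user agent (substitute your own domain for example.com).
sudo nginx -t && sudo systemctl reload nginx
curl -I -A "ClaudeBot" https://example.com/   # expect an HTTP 403 response
```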