Learn how to protect your site from bad bots while allowing visits from safe user agents with our all-in-one WordPress security plugin Defender.
Your website’s security is under threat 24/7, whether from serious DDoS attacks, XSS attacks, SQL injections, or just annoying spam. Defender’s User Agent Banning not only gives your WordPress site robust, server-level protection against requests from bad user agents, but also frees up server resources for your legitimate traffic.
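Conceptually, user agent banning means matching each incoming request’s User-Agent string against a blocklist and rejecting matches before they consume further resources. Here is a minimal sketch of that idea in Python; the pattern list and function name are hypothetical illustrations, not Defender’s actual implementation:

```python
import re

# Hypothetical blocklist; a real plugin maintains and updates its own.
BANNED_AGENT_PATTERNS = [
    r"sqlmap",   # SQL injection tool
    r"nikto",    # vulnerability scanner
    r"^$",       # empty user agent (often a sign of a crude bot)
]

def is_banned(user_agent: str) -> bool:
    """Return True if the user agent matches any banned pattern."""
    return any(
        re.search(pattern, user_agent, re.IGNORECASE)
        for pattern in BANNED_AGENT_PATTERNS
    )
```

A server-level filter would call a check like this on every request and return an error response (e.g. HTTP 403) for banned agents, so the rest of the application never runs for them.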
And it’s all available at no cost (get it for free at wordpress.org).
In this article, we’ll cover:
Let’s dive in…
What Is a User Agent?
Let’s start with this definition from Wikipedia…
A user agent is any software, acting on behalf of a user, which retrieves, renders and facilitates end-user interaction with Web content.
Network servers, email clients, search engines, and web browsers are all examples of user agents.
In practice, a user agent identifies itself to a server with a “string” (i.e. a line of text). In other words, it’s a way of saying “Hello! This is who I am” to a web server.
A web browser, for example, includes a User-Agent field in its HTTP header identifying the browser and operating system to the web server (e.g. Chrome Browser Version 94.0.4606.61 on Windows 10).
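To make this concrete, here is how a client can attach an explicit User-Agent header to an outgoing HTTP request using Python’s standard library (the URL and the user agent value are illustrative placeholders):

```python
from urllib.request import Request

# A typical desktop Chrome user agent (illustrative values).
ua = ("Mozilla/5.0 (Windows NT 10.0; Win64; x64) "
      "AppleWebKit/537.36 (KHTML, like Gecko) "
      "Chrome/94.0.4606.61 Safari/537.36")

# Build a request carrying that identity; the URL is a placeholder.
req = Request("https://example.com/", headers={"User-Agent": ua})

# urllib stores header names in "Capitalized" form internally.
print(req.get_header("User-agent"))
```

Whatever value is set here is exactly what the web server sees, which is also why user agent strings on their own can be spoofed.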
The user agent string format for web browsers reads as follows:
Mozilla/[version] ([system and browser information]) [platform] ([platform details]) [extensions]
This allows each web browser to have its own distinctive user agent, and the contents of the User-Agent field can vary from browser to browser.
When I looked up my web browser’s user agent, for example, I got the following:
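The exact string depends on your browser and machine. As an illustration (assuming a Chrome-on-Windows setup, not necessarily the author’s actual result), a sample string can be split into the fields of the format shown above:

```python
import re

# An illustrative Chrome-on-Windows user agent string.
sample_ua = (
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64) "
    "AppleWebKit/537.36 (KHTML, like Gecko) "
    "Chrome/94.0.4606.61 Safari/537.36"
)

# Match the string against the documented format:
# Mozilla/[version] ([system and browser information])
#   [platform] ([platform details]) [extensions]
pattern = re.compile(
    r"Mozilla/(?P<version>\S+) "
    r"\((?P<system>[^)]*)\) "
    r"(?P<platform>\S+) "
    r"\((?P<details>[^)]*)\) "
    r"(?P<extensions>.*)"
)
m = pattern.match(sample_ua)
print(m.group("system"))      # Windows NT 10.0; Win64; x64
print(m.group("extensions"))  # Chrome/94.0.4606.61 Safari/537.36
```

You can find your own browser’s string by typing "what is my user agent" into a search engine or checking your browser’s developer tools.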
This information is useful to a web server because it allows the server to tailor its response to different web browsers and operating systems (e.g. send mobile pages to mobile browsers, serve different pages to different platforms, and even display “please upgrade your browser” messages to outdated browsers).
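A server-side decision like the ones just described can be sketched as a simple function that inspects the User-Agent string. The page names and substring checks below are hypothetical simplifications; real detection libraries are far more thorough:

```python
def choose_page(user_agent: str) -> str:
    """Pick a page variant based on the User-Agent string (simplified)."""
    if "Mobile" in user_agent or "Android" in user_agent:
        return "mobile.html"
    if "MSIE" in user_agent or "Trident" in user_agent:
        # Legacy Internet Explorer identifiers.
        return "upgrade-your-browser.html"
    return "desktop.html"
```

For example, an iPhone browser string containing "Mobile" would get the mobile page, while an old Internet Explorer string containing "MSIE" would get the upgrade notice.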
Good Bots vs Bad Bots
Most website owners want their content to be found on the web, especially by search engines like Google.
Google automatically discovers and scans websites by following links from one webpage to another, using programs called “crawlers”. Google’s main crawler is Googlebot.
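Crawlers announce themselves through their User-Agent strings, so a naive way to spot Google’s crawler is a substring check like the sketch below. Note this check alone is not reliable: user agent strings can be spoofed, and Google’s documentation recommends also verifying the requesting IP (e.g. via reverse DNS):

```python
def looks_like_googlebot(user_agent: str) -> bool:
    """Naive check: Googlebot names itself in its User-Agent string.

    User agents are trivially spoofed, so production code should also
    verify the requesting IP address, not just trust this string.
    """
    return "Googlebot" in user_agent

print(looks_like_googlebot(
    "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
))
```

This is exactly why tools that ban bad bots must take care to keep good crawlers like Googlebot on their allowlists.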
This article was written by Martin Aranovitch and originally published on WPMU DEV Blog.