November 22, 2023

What is robots.txt?

If you’re new to SEO, some of the more technical aspects of it can seem a bit confusing and overwhelming, especially if you’re not from a technical or web development background to begin with. Robots.txt is one such aspect of Technical SEO that causes more than a few head-scratches, so we got one of our Senior SEO Account Managers to take you through it. Let’s start with the basics: what exactly is robots.txt?

Robots.txt is a text file on your website that instructs web robots and crawlers on which pages they should or should not access. When it comes to search engine agents, the robots.txt file allows us to stop Googlebot, Bingbot, and other crawlers from accessing certain areas of your site, and better manage the crawl budget.

The robots.txt file is part of a number of tools that website owners and developers can use to implement the Robots Exclusion Protocol, alongside X-robots-tags, robots meta tags, and rel attributes.

Read on to find out more about how and why we use robots.txt files. 

How does robots.txt work?

Robots.txt is a simple text file, without any HTML markup. It is hosted on the web server, located at the root of your domain, and it is publicly accessible. If a website has a robots.txt file, you can find it by typing in the domain URL followed by /robots.txt.
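For example, for a site hosted at the placeholder domain example.com:

https://www.example.com/robots.txt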

Robots.txt is the first file that search crawlers read after reaching a domain. This file provides bots with information on how to crawl the website and which pages, resources, or folders they should not crawl. If the bots do not find a robots.txt file, or if the file does not contain any disallow directives, it is implied that they can crawl all the links found on the domain.

The file contains lines of text. Each line specifies a rule for one or more crawlers, allowing or disallowing their access to specific file paths on the domain. 

Wildcat’s robots.txt file indicates that all crawlers can access all URLs on the site. It also points to the specific location of the XML sitemap. 
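At the time of writing, the full file (reproduced in the Sitemaps section below) reads:

User-agent: *
Disallow:
Sitemap: https://wildcatdigital.co.uk/sitemap_index.xml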

What is robots.txt used for?

The main goal of the robots.txt file is to manage good* bot traffic and activity, so the crawl budget is used effectively and servers do not become overloaded. The most common uses of robots.txt include allowing and disallowing specific agents, directories, or files, and specifying the location of your sitemap.

*more detail on this point under Limitations

How to create a robots.txt file

Many website builders will create a robots.txt file by default. Here is how you can create your robots.txt file if your website does not already have one, and how you can optimise your existing file. 

Syntax

The robots.txt file is structured as a series of lines, where each line contains a single field specifying a user-agent, allow directive, disallow directive, or sitemap location. The order and grouping of these fields matter to how crawlers interpret the file. Below, we will outline the most important rules to follow when writing your robots.txt file.

User-agent

Defines the web crawler or user agent that the rule applies to. It can be a specific agent or a wildcard (*) for all agents.
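For example, a rule aimed only at Google’s main crawler would begin:

User-agent: Googlebot

while a wildcard applies the rule to every crawler:

User-agent: *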

Disallow

The disallow command is the most commonly used directive in robots.txt. It tells crawlers to omit certain areas of the site. It can be used to:

Block all crawlers from the entire site:

User-agent: *
Disallow: /

Block a specific file:

User-agent: *
Disallow: /wp-login.php

Block a directory and all of its contents:

User-agent: *
Disallow: /wp-admin/

Block a specific page:

User-agent: *
Disallow: /my-account/secret-info

Block dynamic URLs containing query parameters:

User-agent: *
Disallow: /shop/?query=*

Allow

The allow command does just that: it allows bots to access certain pages or directories. Because bots will always follow the most specific rule in the file, the allow directive can be used, for example, to allow access to a specific page within a disallowed directory, or to allow one crawler access while disallowing all others.

User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
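A sketch of the second case, granting Googlebot access while disallowing every other crawler:

User-agent: Googlebot
Allow: /

User-agent: *
Disallow: /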

Sitemaps

Adding a link to the XML sitemap in your robots.txt file helps crawlers find all your pages and understand which links on your site you deem most important.

User-agent: *
Disallow:
Sitemap: https://wildcatdigital.co.uk/sitemap_index.xml

Crawl delay*

The crawl delay directive can be used to tell a user agent to wait a specified number of seconds between crawl requests. This helps avoid overtaxing the server.
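For example, to ask Bingbot to wait ten seconds between requests:

User-agent: Bingbot
Crawl-delay: 10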

*While Bing and Yandex still recognise this directive, Google no longer does. However, the crawl frequency for Googlebot can be set through Google Search Console.

Field order and grouping in robots.txt

Understanding the logic robots use to read your file can help you write effective rules.

Directives are read in groups: every directive below a user-agent field applies to that user agent, until the next user-agent field starts a new group. Semrush’s robots.txt file, for example, opens with a single group whose rules apply to all user agents.
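An illustrative sketch of that structure (the paths here are placeholders, not Semrush’s actual rules):

User-agent: *
Disallow: /admin/
Disallow: /checkout/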

In a configuration like the one sketched below, all other bots follow the first group of rules, where no disallows are in place, while both Yahoo and Yandex follow the second group and will not crawl any of the pages on the domain.
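An illustrative reconstruction of that kind of setup (Slurp is Yahoo’s crawler):

User-agent: *
Disallow:

User-agent: Slurp
User-agent: Yandex
Disallow: /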

When more than one group matches a crawler, the crawler follows only the most specific one. In the sketch below, both groups apply to Googlebot (the wildcard disallow covers all bots), so Googlebot follows the second, more specific group and can crawl the site.
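A sketch of that situation; Googlebot matches its own group and ignores the wildcard group:

User-agent: *
Disallow: /

User-agent: Googlebot
Disallow: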

Google’s documentation gives a further example of precedence between rules within a group: when both an allow and a disallow rule match a URL, Google’s crawlers follow the most specific (longest) matching rule, which in this case is the allow directive.
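A sketch adapted from that example: for the URL /page, the allow rule applies, because /p is the longer, more specific match:

User-agent: *
Allow: /p
Disallow: /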

Wildcards also make pattern-based rules possible: * matches any sequence of characters, and $ anchors a rule to the end of a URL. The sketch below shows three common patterns: disallowing crawling of all dynamic shop search URLs, disallowing all URLs ending in .php, and disallowing the root URL without disallowing lower-level URLs like /root/file.
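Illustrative, reconstructed rules (the # lines are robots.txt comments):

User-agent: *
# Block all dynamic shop search URLs
Disallow: /shop/?query=*
# Block all URLs ending in .php
Disallow: /*.php$
# Block the root URL only; lower-level URLs like /root/file stay crawlable
Disallow: /$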

Robots.txt Limitations

For all its useful implementations, the robots.txt file has certain limitations that are important to know about before making any changes. 

Robots.txt does not enforce directives

It is important to note that the commands contained in the robots.txt file are directives, not rules. This means that malicious bots and crawlers can choose to ignore these directives. While you can rely on Google, Bing and most good bots to follow these directives, you must employ alternative methods to truly protect sensitive content on your website, like password-protecting files. 

Disallowed pages can be indexed

The disallow directives in the robots.txt file stop search engine crawlers from reading the content of the disallowed pages. However, if these pages are linked to from other crawlable pages, they may still be indexed and appear in search results.

Noindex directives in the robots.txt file are not supported by Google, and robots.txt directives should not be used to try to manipulate search results.

To reliably prevent certain pages from appearing in search results, use noindex robots directives on the pages themselves.
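A noindex directive lives in the page itself rather than in robots.txt, for example as a meta tag in the page’s head:

<meta name="robots" content="noindex">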

Need Help With Your robots.txt File?

Our team of technical SEO specialists at Wildcat Digital have a wealth of experience setting up websites for success. Checking that your robots.txt is set up correctly and following best practices is a key step in our technical audits and campaign planning. If you need help with your robots.txt or have any concerns about the indexing and crawling of your website, get in touch today. 

Post by

Miruna Hadu
