Customize robots.txt
The `robots.txt` file tells search engines which pages can, or can't, be crawled on a site. It contains groups of rules for doing so, and each group has three main components:
- The user agent, which notes which crawler the group of rules applies to. For example, `adsbot-google`.
- The rules themselves, which note specific URLs that crawlers can, or can't, access.
- An optional sitemap URL.
To learn more about `robots.txt` and rule-set components, refer to Google's documentation.
Shopify generates a default `robots.txt` file that works for most stores. However, you can add the `robots.txt.liquid` template to make customizations.
In this tutorial, you'll learn how you can customize the `robots.txt.liquid` template.
Requirements
Add the `robots.txt.liquid` template with the following steps:
- In the code editor for the theme you want to edit, open the Templates folder.
- Click Add a new template.
- Select robots.txt under the Create a new template for drop-down menu.
- Click Create template.
Resources
The `robots.txt.liquid` template supports only the following Liquid objects:
- `robots`
- `group`
- `rule`
- `user_agent`
- `sitemap`
Customize robots.txt.liquid
You can make the following customizations:
- Add a new rule to an existing group
- Add host-specific rules
- Remove a default rule from an existing group
- Add custom rules
The examples below use Liquid's whitespace control to maintain standard formatting.
While you can replace all of the template content with plain text rules, it's strongly recommended to use the provided Liquid objects whenever possible. The default rules are updated regularly to ensure that SEO best practices are always applied.
Add a new rule to an existing group
If you want to add a new rule to an existing group, then you can adjust the Liquid for outputting the default rules to check for the associated group and include your rule.
For example, you can use the following to block all crawlers from accessing pages with the URL parameter `?q=`:
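One way to do this, sketched below, is to loop over `robots.default_groups`, output each group's user agent, rules, and sitemap as usual, and append the extra `Disallow` line only to the catch-all `*` group:

```liquid
{% for group in robots.default_groups %}
  {{- group.user_agent }}

  {%- for rule in group.rules -%}
    {{ rule }}
  {%- endfor -%}

  {%- comment -%} Block all crawlers from URLs that contain the ?q= parameter {%- endcomment -%}
  {%- if group.user_agent.value == '*' -%}
    {{ 'Disallow: /*?q=*' }}
  {%- endif -%}

  {%- if group.sitemap != blank -%}
    {{ group.sitemap }}
  {%- endif -%}
{% endfor %}
```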
Add host-specific rules
If you're using multiple domains for different markets, then you can create host-specific rules using `request.host`. Implement host-specific rules only if you're using Shopify Markets and you have distinct domains or subdomains that require different crawling behavior per market.
For example, you could block crawling of English content on a French domain while maintaining the default rules:
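A sketch of this check, assuming a hypothetical French domain `example.fr` with English content under `/en/`; the condition sits inside the default-groups loop so the default rules still render:

```liquid
{% for group in robots.default_groups %}
  {{- group.user_agent }}

  {%- for rule in group.rules -%}
    {{ rule }}
  {%- endfor -%}

  {%- comment -%} Hypothetical: on the French domain, keep crawlers out of the English locale folder {%- endcomment -%}
  {%- if request.host == 'example.fr' and group.user_agent.value == '*' -%}
    {{ 'Disallow: /en/' }}
  {%- endif -%}

  {%- if group.sitemap != blank -%}
    {{ group.sitemap }}
  {%- endif -%}
{% endfor %}
```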
Remove a default rule from an existing group
If you want to remove a default rule from an existing group, then you can adjust the Liquid for outputting the default rules to check for that rule and skip over it.
For example, you can use the following to remove the rule blocking crawlers from accessing the `/policies/` page:
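A sketch of this approach, using the `directive` and `value` attributes of each rule to skip the `/policies/` disallow while outputting everything else unchanged:

```liquid
{% for group in robots.default_groups %}
  {{- group.user_agent }}

  {%- for rule in group.rules -%}
    {%- comment -%} Skip the default rule that blocks /policies/ {%- endcomment -%}
    {%- unless rule.directive == 'Disallow' and rule.value == '/policies/' -%}
      {{ rule }}
    {%- endunless -%}
  {%- endfor -%}

  {%- if group.sitemap != blank -%}
    {{ group.sitemap }}
  {%- endif -%}
{% endfor %}
```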
Add custom rules
If you want to add a new rule that's not part of a default group, then you can manually enter the rule outside of the Liquid for outputting the default rules.
Common examples of these custom rules, covered in the following sections, are:
- Blocking certain crawlers
- Allowing certain crawlers
- Adding extra sitemap URLs
Block certain crawlers
If a crawler isn't in the default rule set, then you can manually add a rule to block it.
For example, the following directive would allow you to block the `discobot` crawler:
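A sketch of the plain-text rule, placed after the Liquid that outputs the default rules (lines starting with `#` are comments in robots.txt):

```liquid
# Custom rule added after the default rules output
User-agent: discobot
Disallow: /
```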
Allow certain crawlers
Similar to blocking certain crawlers, you can also manually add a rule to allow search engines to crawl a subdirectory or page.
For example, the following directive would allow the `discobot` crawler:
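A sketch, again placed after the default rules output; `Allow: /` grants `discobot` access to the whole site, and you could narrow the path to a specific subdirectory instead:

```liquid
# Custom rule added after the default rules output
User-agent: discobot
Allow: /
```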
Add extra sitemap URLs
The following example, where `[sitemap-url]` is the sitemap URL, would allow you to include an extra sitemap URL:
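A sketch of the extra sitemap line, placed after the default rules output and keeping the `[sitemap-url]` placeholder from the text above:

```liquid
# Extra sitemap added after the default rules output
Sitemap: [sitemap-url]
```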