rule

A rule for the robots.txt file, which tells crawlers which pages can, or can't, be accessed.

A rule consists of a directive, which can be either Allow or Disallow, and a value, which is the associated URL path.

For example:

Disallow: /policies/

You can output a rule directly, instead of referencing each of its properties.
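For instance, in a robots.txt.liquid template you might render each rule of a group by outputting the rule directly. The sketch below assumes the surrounding robots object with its default_groups, and each group's user_agent and rules, as described in their own references; only the rule output itself is the subject of this page.

{% for group in robots.default_groups %}
{{ group.user_agent }}
{% for rule in group.rules %}
{{ rule }}
{% endfor %}
{% endfor %}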



Properties

directive
The directive of the rule.

value
The value of the rule.

{
"directive": "Disallow",
"value": "/*preview_script_id*"
}
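If you reference the properties individually instead, joining them with a colon and a space reproduces the same line that outputting the rule directly would give. A minimal sketch:

{{ rule.directive }}: {{ rule.value }}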