github.com/rabbouni145/gg@v0.47.1/docs/content/en/templates/robots.md (about) 1 --- 2 title: Robots.txt File 3 linktitle: Robots.txt 4 description: Hugo can generate a customized robots.txt in the same way as any other template. 5 date: 2017-02-01 6 publishdate: 2017-02-01 7 lastmod: 2017-02-01 8 categories: [templates] 9 keywords: [robots,search engines] 10 menu: 11 docs: 12 parent: "templates" 13 weight: 165 14 weight: 165 15 sections_weight: 165 16 draft: false 17 aliases: [/extras/robots-txt/] 18 toc: false 19 --- 20 21 To create your robots.txt as a template, first set the `enableRobotsTXT` value to `true` in your [configuration file][config]. By default, this option generates a robots.txt with the following content, which tells search engines that they are allowed to crawl everything: 22 23 ``` 24 User-agent: * 25 ``` 26 27 ## Robots.txt Template Lookup Order 28 29 The [lookup order][lookup] for the `robots.txt` template is as follows: 30 31 * `/layouts/robots.txt` 32 * `/themes/<THEME>/layouts/robots.txt` 33 34 {{% note %}} 35 If you do not want Hugo to create a default `robots.txt` or leverage the `robots.txt` template, you can hand code your own and place the file in `static`. Remember that everything in the [static directory](/getting-started/directory-structure/) is copied over as-is when Hugo builds your site. 36 {{% /note %}} 37 38 ## Robots.txt Template Example 39 40 The following is an example `robots.txt` layout: 41 42 {{< code file="layouts/robots.txt" download="robots.txt" >}} 43 User-agent: * 44 45 {{range .Pages}} 46 Disallow: {{.RelPermalink}} 47 {{end}} 48 {{< /code >}} 49 50 This template disallows all the pages of the site by creating one `Disallow` entry for each page. 51 52 [config]: /getting-started/configuration/ 53 [lookup]: /templates/lookup-order/ 54 [robots]: http://www.robotstxt.org/