To view your store's current robots.txt file, enter the following URL into your browser:

    https://<store url>/robots.txt

where <store url> is the base URL for your store. Commerce Cloud displays the contents of the current robots.txt file.

The default Commerce Cloud robots.txt file begins with a User-agent: * line, followed by a set of Disallow: entries and a Sitemap: line. User-agent: * means that the exclusion rules apply to all robots; you can replace the * (asterisk) with the name of a specific robot to exclude, for example, Google. Each Disallow: / entry indicates a page that robots should not visit. You should not remove any of the Disallow: entries from the default robots.txt file, though you might want to include additional pages that you want robots to ignore.

Note: The XML sitemap referenced by the Sitemap: line is an index of page URLs on your store that is available for crawling by search engines. It helps search engines to crawl your site more intelligently.

If you are testing your store and do not want any robots to crawl any pages, you might want your robots.txt file to look like this:

    User-agent: *
    Disallow: /

You cannot edit the robots.txt file in the administration UI. You must edit it with the Commerce Cloud Admin REST API. See Extending Oracle Commerce Cloud for information about the REST APIs.

To update the robots.txt file, issue a PUT request to /ccadmin/v1/merchant/robots. The body of the request must include the entire contents of the file, in text/plain format. If you run multiple sites within a single instance of Commerce Cloud, you must specify the site whose robots.txt file you are updating in the x-ccsite header of the PUT request; if you do not specify a site, the request updates the default site's robots.txt file. When you update the robots.txt file, it is not overwritten until the next PUT request is sent to /ccadmin/v1/merchant/robots.

The following example shows a PUT request that adds your error page to the list of pages for robots to ignore.
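Here is a minimal sketch of such a request in Python. The endpoint path, the text/plain body requirement, and the x-ccsite header come from the description above; the admin host name, the bearer token, the site ID, and the specific Disallow: entries are illustrative placeholders, not your store's actual values.

```python
# Sketch of the PUT request that replaces the robots.txt file.
# Assumes you have already obtained an Admin API bearer token;
# host, token, site ID, and file contents below are placeholders.
import requests

ADMIN_HOST = "https://myinstance-admin.example.com"  # hypothetical admin host
TOKEN = "<bearer token from the Admin API>"

# The body must be the ENTIRE desired robots.txt file, not a diff.
# "Disallow: /error" is the hypothetical error page being added.
ROBOTS_TXT = """\
User-agent: *
Disallow: /cart
Disallow: /checkout
Disallow: /error
Sitemap: https://www.example.com/sitemap.xml
"""

response = requests.put(
    f"{ADMIN_HOST}/ccadmin/v1/merchant/robots",
    headers={
        "Authorization": f"Bearer {TOKEN}",
        "Content-Type": "text/plain",
        "x-ccsite": "siteUS",  # omit this header to update the default site
    },
    data=ROBOTS_TXT.encode("utf-8"),
)
response.raise_for_status()
```

Because the request body replaces the whole file, start from the current contents and add the new Disallow: entry to them rather than sending only the new line.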
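To confirm that the update took effect, you can fetch the file the same way a robot would, using the public URL described at the start of this article. Another small sketch, again with a placeholder storefront URL:

```python
import requests

STORE_URL = "https://www.example.com"  # placeholder base URL for your store

# Robots read this same public URL, so this shows exactly what they see.
print(requests.get(f"{STORE_URL}/robots.txt").text)
```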