Last Updated on: 20th October 2023, 05:59 pm
“Failed: Robots.txt unreachable” in Google Search Console means that Googlebot is having trouble fetching your website’s robots.txt file. Here are detailed steps to diagnose and fix the issue.
Check the Existence and Accessibility of Your Robots.txt File
- Open a web browser and navigate to “www.yourwebsite.com/robots.txt” (Replace “yourwebsite.com” with your actual domain).
- You should see your robots.txt file with all its rules. If you don’t, check that the file was uploaded to your site’s root directory (the same level as your homepage).
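If the file loads in your browser, you can also sanity-check that its rules parse the way you expect. Here is a minimal sketch using Python’s built-in `urllib.robotparser`; the rules and the `yourwebsite.com` domain are placeholders for your own.

```python
import urllib.robotparser

# A minimal, typical robots.txt (your real rules will differ).
ROBOTS_TXT = """\
User-agent: *
Disallow: /private/
Allow: /

Sitemap: https://www.yourwebsite.com/sitemap.xml
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Verify the rules behave as intended for a generic crawler.
print(parser.can_fetch("*", "https://www.yourwebsite.com/"))           # True
print(parser.can_fetch("*", "https://www.yourwebsite.com/private/x"))  # False
```

Running this locally against a copy of your real file is a quick way to catch typos (for example, a stray `Disallow: /`) before Google does.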
Test Your Robots.txt File
- Log into your Google Search Console account, select your property, and select “Robots.txt Tester” under the “Legacy tools and reports” section.
- If the tester flags any errors, correct them; the tool highlights the problematic lines and explains each error.
Use URL Inspection in Google Search Console
- In GSC, click on “URL Inspection” and enter your robots.txt URL (www.yourwebsite.com/robots.txt).
- Click on “Test Live URL”. This will fetch the current URL from the website and display the HTTP response.
Check the Status Code
- Ideally you should see “200 OK”, which means Google can access the file. A “404 Not Found” means the robots.txt file wasn’t found at that URL. A “500 Internal Server Error” usually points to a server-side problem.
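The status codes above can be summarized in a small lookup helper. Note that, per Google’s documentation, a 404 for robots.txt is treated as “no restrictions” (crawling continues), while persistent 5xx errors can cause Google to stop crawling. The function below is a hypothetical helper, not a GSC API; its messages are an assumption about the most common causes.

```python
def diagnose_status(code):
    """Map an HTTP status code for /robots.txt to a likely cause."""
    if code == 200:
        return "OK: Google can read the file."
    if code in (301, 302, 307, 308):
        return "Redirect: make sure it resolves to a reachable robots.txt."
    if code == 404:
        return "Not found: upload robots.txt to the site root (Google treats this as allow-all)."
    if 400 <= code < 500:
        return "Client error: check permissions or security rules blocking Googlebot."
    if 500 <= code < 600:
        return "Server error: ask your host to investigate; persistent 5xx can halt crawling."
    return "Unexpected status: inspect the response manually."

print(diagnose_status(200))
print(diagnose_status(503))
```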
Check the Fetching Section
- If Google can fetch your robots.txt file, the fetched content will be shown in this section.
Check the Availability Section
- This will tell you if Google can reach your site or not.
Check the robots.txt Tester for Errors
- Go back to the “robots.txt Tester” and click “Submit”. Then click “See live robots.txt” to check if Google can see your latest robots.txt content.
Resubmit your Sitemap
- Sitemap errors can sometimes accompany robots.txt problems, and resubmitting prompts Google to recrawl your site.
- So, go back to your property in GSC, select “Sitemaps” from the sidebar. Enter your sitemap URL (typically www.yourwebsite.com/sitemap.xml), and then press “Submit”.
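Before resubmitting, it’s worth confirming your sitemap is well-formed XML, since a malformed file will simply fail again. A minimal sketch with Python’s standard library; the URLs are placeholders for your own pages.

```python
import xml.etree.ElementTree as ET

# A minimal sitemap.xml in the standard sitemaps.org format.
SITEMAP = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://www.yourwebsite.com/</loc></url>
  <url><loc>https://www.yourwebsite.com/about</loc></url>
</urlset>"""

ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
root = ET.fromstring(SITEMAP)  # raises ParseError if the XML is malformed
urls = [loc.text for loc in root.findall("sm:url/sm:loc", ns)]
print(len(urls))  # 2
```

To check your live sitemap, read its contents from your server and pass them to `ET.fromstring` the same way; a `ParseError` tells you the file is broken before you resubmit it in GSC.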
Check Your Server
- Contact your hosting provider and ask them to check if there’s an issue with your server.
Test with Other Tools
- Use tools like Screaming Frog to test if the robots.txt is accessible.
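Beyond dedicated crawlers, a short script can reproduce what a crawler does when it requests robots.txt. The sketch below demonstrates the check against a throwaway local server rather than your live site, so it runs anywhere; in practice you would call `check_robots_txt` (a hypothetical helper, not part of any library) with your real domain.

```python
import http.server
import threading
import urllib.error
import urllib.request

def check_robots_txt(base_url):
    """Fetch /robots.txt and report the status code and body a crawler would see."""
    url = base_url.rstrip("/") + "/robots.txt"
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            return resp.status, resp.read().decode("utf-8", "replace")
    except urllib.error.HTTPError as e:
        return e.code, ""
    except (urllib.error.URLError, TimeoutError):
        return None, ""  # unreachable: DNS, connection, or timeout failure

# Tiny local server standing in for your real site.
class _Handler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/robots.txt":
            body = b"User-agent: *\nAllow: /\n"
            self.send_response(200)
            self.send_header("Content-Type", "text/plain")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()
    def log_message(self, *args):
        pass  # keep the demo output quiet

server = http.server.HTTPServer(("127.0.0.1", 0), _Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()

status, body = check_robots_txt(f"http://127.0.0.1:{server.server_port}")
print(status)                 # 200
print("User-agent" in body)   # True
server.shutdown()
```

A `None` status from this helper corresponds to the “unreachable” condition GSC is reporting: the request never got an HTTP response at all, which points at DNS, firewall, or server availability rather than the file’s contents.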
After implementing fixes, remember to clear your cache before re-testing URLs in GSC. Also, be patient – hosting and server changes may take some time to propagate. Always back up your site before making changes. If these steps don’t resolve your issues, consider asking a web developer for help.