As of Monday, thousands of websites hosted on Siteground are living through an SEO nightmare: they are invisible to Google.
What seemed like an isolated error, with the occasional user complaining on Twitter, is turning out to be something much more widespread. I personally know people affected by the problem, and specialized SEO sites are already reporting on the matter.
The details can be seen at matttutt.me, where screenshots show the errors that appear when Google tries to crawl any of the websites hosted there.
The problem does not affect all Siteground hosting, but it does affect many of the sites hosted there. Pages that were indexed perfectly on November 6 were already beginning to show problems by Sunday.
Googlebot cannot retrieve the site's current robots.txt file, which prevents it from crawling the site at all. Sometimes it falls back to an older cached version of robots.txt, even one that still blocks pages that were once in development and no longer are.
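A quick way to see what Googlebot runs into is to request robots.txt yourself with a Googlebot-style User-Agent. Below is a minimal Python sketch using only the standard library; the site URL is a placeholder, and the status-to-behavior mapping loosely follows Google's documented robots.txt handling (2xx is parsed, 4xx is treated as "no restrictions", and a 5xx or unreachable file makes Google hold off on crawling, possibly falling back to a cached copy, which matches the symptoms described here):

```python
import urllib.request
import urllib.error

# Googlebot's published user-agent string.
GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")

def robots_outcome(status):
    """Rough mapping of a robots.txt HTTP status to how Google reacts:
    2xx -> parse the rules, 4xx -> crawl with no restrictions,
    5xx/unreachable -> hold off on crawling (possibly using a cached copy)."""
    if status is not None and 200 <= status < 300:
        return "parse rules"
    if status is not None and 400 <= status < 500:
        return "crawl everything"
    return "stop crawling"

def check_robots(site):
    """Fetch <site>/robots.txt with a Googlebot-style User-Agent and
    return (HTTP status or None, likely crawler behavior)."""
    url = site.rstrip("/") + "/robots.txt"
    req = urllib.request.Request(url, headers={"User-Agent": GOOGLEBOT_UA})
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            return resp.status, robots_outcome(resp.status)
    except urllib.error.HTTPError as e:
        return e.code, robots_outcome(e.code)
    except OSError:  # DNS failure, timeout, connection reset, etc.
        return None, "stop crawling"

# Example usage (placeholder URL):
# print(check_robots("https://example.com"))
```

If the script reports a 5xx status or no response at all for robots.txt, that alone is enough to stall Googlebot, even when the pages themselves load fine in a browser.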
Users have been complaining since Monday that the URL Inspection tool in Search Console shows Google cannot access their pages. Other tools, such as the Rich Results Test and the Mobile-Friendly Test, also fail.
The curious thing is that the issue is specific to Googlebot: the Inspect URL function in Bing Webmaster Tools shows no problem, and other Google tools, such as PageSpeed, do crawl the page. The problem affects only the Google bot.
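That "works for Bingbot, fails for Googlebot" pattern can be reproduced from the command line by fetching the same URL with different User-Agent headers and comparing the responses. A small sketch, assuming a placeholder URL; the Googlebot and bingbot strings follow each crawler's published format:

```python
import urllib.request
import urllib.error

# User-agent strings for the two crawlers discussed above.
USER_AGENTS = {
    "Googlebot": "Mozilla/5.0 (compatible; Googlebot/2.1; "
                 "+http://www.google.com/bot.html)",
    "Bingbot": "Mozilla/5.0 (compatible; bingbot/2.0; "
               "+http://www.bing.com/bingbot.htm)",
}

def status_for(url, user_agent):
    """Return the HTTP status the server gives this user agent,
    or None if the request fails entirely."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        return e.code
    except OSError:  # DNS failure, timeout, connection reset, etc.
        return None

def looks_ua_specific(statuses):
    """True when Googlebot is blocked or errors out while at least one
    other agent fetches the page fine -- the pattern described for the
    affected Siteground sites."""
    g = statuses.get("Googlebot")
    others = [s for name, s in statuses.items() if name != "Googlebot"]
    return (g is None or g >= 400) and any(s == 200 for s in others)

# Example usage (placeholder URL):
# statuses = {name: status_for("https://example.com/", ua)
#             for name, ua in USER_AGENTS.items()}
# print(statuses, looks_ua_specific(statuses))
```

One caveat: some servers decide by IP range rather than User-Agent alone, so a test like this from your own machine approximates, but cannot fully reproduce, what the real crawlers see.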
For the moment, Siteground has not given details about the problem; they do not see any errors on their side and say they are working to resolve it, as indicated on Twitter.
Let’s hope that those suffering from the problem are not badly affected from an SEO point of view, since more than three days have now passed since Monday without Google indexing their sites.