We have seen the message "Page indexed without content" in Google Search Console's Index Coverage report. Gary Illyes of Google says that when you see it, most of the time it is about "pages that are blocked by robots.txt." Google's own documentation, however, describes the error differently:
“Page indexed without content: This page appears in the Google index, but for some reason Google could not read the content. Possible reasons are that the page might be cloaked to Google or the page might be in a format that Google can’t index. This is not a case of robots.txt blocking. Inspect the page, and look at the Coverage section for details.”
Gary Illyes was then asked whether the error could be caused by heavy loading times or timeouts, to which he responded no. If it were a loading-time or timeout issue, you would likely see a soft 404 report instead. Gary states that "this error is really just for pages that are blocked by robots.txt."
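For illustration, a disallow rule like the following (path hypothetical) prevents Googlebot from crawling a page. If other sites link to that URL, Google can still index it without being able to read its content, which is the scenario Gary describes:

```
# robots.txt — hypothetical example
# Blocks all crawlers from the /private-page/ path;
# the URL can still be indexed if linked from elsewhere.
User-agent: *
Disallow: /private-page/
```

To keep a blocked page out of the index entirely, the page would instead need to be crawlable and carry a noindex directive.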
Here Are The Tweets
no, we would likely just not used those pages of they time out. maybe we'd report them as soft404, depending on whether they time out for Googlebot or rendering.
this error is really just for pages that are blocked by robots.txt
— Gary 鯨理／경리 Illyes (@methode) March 20, 2021