Google’s John Mueller said that Google’s website crawlers cannot see the content behind captchas.
John Mueller said that hiding content behind captchas is a bad SEO practice because Google's web crawlers cannot see what is behind them. Googlebot does not interact with anything when it crawls webpages. If the bot lands on a page where a captcha blocks the main content, it assumes the captcha is the only thing on the page.
Additionally, Googlebot does not fill out captchas, not even Google's own. As a result, although Google would still index the page, none of the content behind the captcha would be used for ranking. However, Mueller suggests marketers can safely use captchas as long as the main content remains readily accessible.
Mueller also outlined an option for website owners who want to block content with a captcha entirely while keeping it Google-friendly: serve Googlebot a different version of the page than what regular users see. This way the content still gets used for ranking while marketers can accomplish their captcha goals.
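As a rough illustration of that approach, the sketch below (hypothetical helper and names, not from the article) picks a page variant based on the request's User-Agent string. Note that User-Agent headers can be spoofed, so a production setup would also verify the crawler, for example via a reverse DNS lookup, as Google's documentation recommends.

```python
# Minimal sketch: serve the full page to Googlebot so the content can be
# indexed, and the captcha-gated page to everyone else. Hypothetical
# helper; production code should also verify the crawler's identity,
# since the User-Agent header alone can be faked.

GOOGLEBOT_TOKENS = ("Googlebot",)  # substrings identifying Google's crawler

def select_variant(user_agent: str) -> str:
    """Return 'full' for Googlebot, 'captcha' for regular visitors."""
    if any(token in user_agent for token in GOOGLEBOT_TOKENS):
        return "full"
    return "captcha"
```

A request handler would then render the template named by `select_variant(request_user_agent)`, keeping the indexable content and the captcha flow on separate variants of the same URL.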