While Restrict is great at restricting content for your users, restricting that same content for SEO bots and web crawlers can cost you quite a bit. Your website's content is what drives your search results, so blocking SEO bots and web crawlers from accessing it will inevitably push your website lower in the search results than you'd like.
However, Restrict offers a neat feature for bypassing the content restriction for non-human visitors.
Essentially, if you navigate to the Bots & Web Crawlers area of Restrict and flip the switch to Yes, all the restricted content on your website becomes visible to SEO bots and web crawlers. This applies to page/post content, widgets, menus, content partially protected with Restrict's shortcode, and even to Site shield when you use it to restrict access to the whole website for non-logged-in users. Of course, this doesn't affect content restriction for users in any way; the restriction criteria you have set for them are still enforced. In other words, if you enable this feature, the user experience on your website remains unchanged, and the only noticeable change will be in the search results related to your website.
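Restrict doesn't document exactly how it tells bots apart from human visitors, but features like this are commonly built on User-Agent matching: if the request's User-Agent identifies a known search crawler and the bypass switch is on, the restriction check is skipped. The following is a minimal sketch of that idea, in Python for illustration only; the crawler list, function names, and return values are all hypothetical, not Restrict's actual implementation.

```python
# Hypothetical list of crawler User-Agent substrings; a real plugin
# would typically match against a much larger, maintained list.
KNOWN_CRAWLERS = ["googlebot", "bingbot", "duckduckbot", "yandexbot", "baiduspider"]

def is_search_crawler(user_agent: str) -> bool:
    """Return True if the User-Agent string matches a known crawler."""
    ua = user_agent.lower()
    return any(bot in ua for bot in KNOWN_CRAWLERS)

def render_content(user_agent: str, restricted: bool, bypass_for_bots: bool) -> str:
    """Serve full content to crawlers when the bypass switch is on;
    otherwise enforce the restriction as usual."""
    if not restricted:
        return "full content"
    if bypass_for_bots and is_search_crawler(user_agent):
        return "full content"
    return "restricted notice"
```

With the bypass enabled, a Googlebot request to a restricted page would get the full content, while a regular browser with the same restriction settings would still see the restricted notice — which mirrors the behavior described above: crawlers index everything, users notice no difference.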