This question comes up a lot, and it isn't hard to run Fetch and Render from Search Console against a protected environment. What's the worst that can happen? Google could crawl the test server pages and index them, which isn't a big deal and is very easy to prevent with a noindex tag.
When you request a fetch and render, you'll get two requests: one from the render bot, the other from Googlebot, so you can compare what Google sees versus what a user sees. You could try to whitelist by the render bot's user agent (below), but that could change.
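If you did want to experiment with UA-based whitelisting despite that caveat, a minimal `.htaccess` sketch in Apache 2.2 syntax might look like the following (the matched string is an assumption based on the render bot UA shown below):

```apache
# Flag requests whose User-Agent mentions the Search Console render bot
SetEnvIf User-Agent "Google Search Console" render_bot
# Deny everyone, then allow only flagged requests back in
Order Deny,Allow
Deny from all
Allow from env=render_bot
```

IP whitelisting is still the safer bet, since a UA string can change without notice (and is trivially spoofed).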
Generally, across many tests, I've noticed Google crawls from the 66. block, so whitelisting that block should be good enough. See the test I performed on http://test.opensourceseo.org/ and the results below:
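As a sanity check before loosening anything, you can confirm that a logged crawler IP actually falls inside that block. Here's a quick Python sketch; the 66.249.64.0/19 range is an assumption based on commonly published Googlebot addresses, so verify with a reverse DNS lookup to googlebot.com before relying on it:

```python
import ipaddress

# Commonly published Googlebot range -- an assumption here; Google can
# change it, so double-check with a reverse DNS lookup to googlebot.com.
GOOGLEBOT_NET = ipaddress.ip_network("66.249.64.0/19")

def in_googlebot_block(ip: str) -> bool:
    """True if the address falls inside the whitelisted 66. block."""
    return ipaddress.ip_address(ip) in GOOGLEBOT_NET

print(in_googlebot_block("66.249.66.1"))    # expect True
print(in_googlebot_block("203.0.113.10"))   # expect False
```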
UA and IPs for Fetch and Render
Render bot UA and IP (IP from my last test, June 26 2017)
- UA: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko; Google Search Console) Chrome/41.0.2272.118 Safari/537.36
- IP: 188.8.131.52
General Fetch UA and IP (IP from my last test, June 26 2017)
- UA: Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)
- IP: 184.108.40.206
How to do this, step by step
1) Add a noindex to each page (either in the <head> or via HTTP header)
2) Whitelist the 66. block and deny all others (except the dev team's IP, of course)
3) Example .htaccess implementation (just serving an error page here):
ErrorDocument 403 /error.php
Order Deny,Allow
Deny from all
Allow from 66.
4) Double check you have the noindex tags 😉
5) Run your fetch and render tests, do not request crawl or indexing
6) When you’re done, remove the whitelist
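For step 1, either form of noindex works. For illustration (the header line assumes mod_headers is enabled on the server):

```apache
# Option A: in each page's <head>
#   <meta name="robots" content="noindex">
# Option B: as an HTTP header for everything, set from .htaccess
Header set X-Robots-Tag "noindex"
```

The header option is handy when you can't easily edit templates, and it also covers non-HTML files like PDFs.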
Want to see it in action?
http://test.opensourceseo.org/ is already set up. You can use the smart iframe hack featured on Screamingfrog.co.uk to request it from your own domain, or you can see it live in search results here (note: I purposely kept this subdomain indexable).
Whitelisting IPs should work for any test domain, but of course, there are plenty of other things to take into account. For example, if the test environment is already publicly available but protected by authentication, you would need to drop authentication for the pages you want to test and block all IPs while whitelisting Google for a short period of time.
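Rather than dropping authentication entirely, Apache 2.2 also lets you combine Basic auth with an IP whitelist so that either one satisfies the check. A sketch, with the realm name and password file path as placeholders:

```apache
# Placeholder realm and path -- adjust for your environment
AuthType Basic
AuthName "Staging"
AuthUserFile /path/to/.htpasswd
Require valid-user
# Allow the 66. block through without credentials
Order Deny,Allow
Deny from all
Allow from 66.
# Either a valid login OR a whitelisted IP gets in
Satisfy Any
```

That way the dev team keeps its password access while Google gets in by IP alone for the duration of the test.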