Hi @johnmu.com. One of our technicians asked whether they could upload a robots.txt file in the morning that blocks Googlebot, and a different one in the afternoon that allows it to crawl again. The website is extensive, and they're worried that crawling might overload the server. Do you think this would be a good practice?
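For context, this is roughly what the two files would look like, assuming the idea is to block Googlebot completely in the morning and allow it fully in the afternoon (the exact rules our technician has in mind may differ):

```
# Morning robots.txt (assumed) – block Googlebot entirely
User-agent: Googlebot
Disallow: /

# Afternoon robots.txt (assumed) – allow Googlebot to crawl everything
User-agent: Googlebot
Disallow:
```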

Comments