For a temporary fix you can write-protect the file (see https://wordpress.org/support/topic/blocked-cache-directory-from-google-bots-triggering-mobile-friendliness-issue/). That worked in our case: run chmod 444 robots.txt
after you have changed the file manually to your needs. The W3TC Dashboard will then show an error that it couldn’t complete actions, but in this case that actually tells you the write protection is working 😉
The problem is that if you used the generated robots.txt and modified it with other plugins, the physical file completely replaces it. You can copy over all the missing lines and write-protect the file, or revert to the previous version and delete the physical robots.txt so it works like before.
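For anyone who wants the exact commands, here is a minimal sketch of the write-protect workaround described above. The file path is an assumption — point it at the robots.txt in your own WordPress root:

```shell
# Assumed location of the physical robots.txt in your WordPress root
ROBOTS=robots.txt            # e.g. /var/www/html/robots.txt

# After manually editing robots.txt to your needs, make it read-only
# for everyone so W3TC (or any other plugin) cannot overwrite it:
chmod 444 "$ROBOTS"

# Verify: the permissions column should now read -r--r--r--
ls -l "$ROBOTS"

# Once the patched W3TC release is out, restore normal permissions:
# chmod 644 "$ROBOTS"
```

Note that the web server user can usually still delete a read-only file if it owns the containing directory, so this blocks overwrites rather than guaranteeing the file can never be removed.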
They need to remove this BS pronto.
Hello @endymion00 @galfom and @dziordzio
Thank you for reaching out and I am sorry about the issue you are experiencing.
This issue has been reported numerous times and we have already opened a GitHub issue for it.
The temporary solution is either to revert to the previous version of W3TC or to change the permissions of the robots.txt file.
Please refer to the existing topic for this issue, where the GitHub issue is posted:
https://wordpress.org/support/topic/w3tc-conflict-with-aioseo/
Thanks!
Hello @endymion00 @galfom @dziordzio
We have released a patch in version 2.1.8. We do apologize for any inconvenience. Please update and do let us know if there are any issues. We will be happy to assist you.
Thank you.
@vmarko it seems that if you have an empty robots.txt file (which is valid), it gets deleted when you clear the cache.
Hello @1stwebdesigns
Thank you for the information.
This will be fixed in the upcoming release; we will remove this behavior, as it has been causing issues for some users.
Thanks!