• Resolved — dziordzio (@dziordzio)


    Hey there,

    The latest update disallows crawling of the /cache/ directory, which produces a lot of Google Search Console (GSC) errors. Other plugins use this directory too, not only W3 Total Cache. Crawlers need access to it in order to crawl the site correctly. I’ve tried deleting these lines from the robots.txt file, but they’re re-added right away. Please fix this ASAP.
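    For reference, the robots.txt directives being complained about presumably look something like the following sketch (the exact path is an assumption; depending on the install it may instead be something like /wp-content/cache/):

    ```
    # Assumed form of the lines added by the plugin update
    User-agent: *
    Disallow: /cache/
    ```

    A Disallow rule like this blocks all compliant crawlers from the directory, which is what triggers the GSC coverage errors described above.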

Viewing 6 replies - 1 through 6 (of 6 total)
  • The topic ‘Robots.txt cache directory’ is closed to new replies.