Google is exploring alternatives or supplements to the 30-year-old robots.txt protocol for controlling crawling and indexing. “We believe it’s time for the web and AI communities to explore additional machine-readable means for web publisher choice and control for emerging AI and research use cases,” Google wrote. Google said it is inviting members of the web and AI communities to discuss a new protocol. Those emerging AI and research use cases are likely among the reasons Google is looking beyond robots.txt. What those methods and protocols may look like is unknown right now, but the discussion is happening.
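For context, robots.txt is the plain-text file at a site's root that publishers use today to signal crawler preferences. A minimal illustrative sketch (directive names per the Robots Exclusion Protocol, RFC 9309; the paths shown are hypothetical):

```
# Served at https://example.com/robots.txt
User-agent: Googlebot    # rules for Google's main crawler
Disallow: /private/      # ask it not to fetch anything under /private/

User-agent: *            # rules for all other crawlers
Allow: /
```

Directives like these are advisory rather than enforced, and they offer only coarse allow/disallow control, which is part of why the protocol may not map cleanly onto AI training and research use cases.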