Google has updated its open source robots.txt parser code on GitHub, as announced by Gary Illyes. The parser has been running in Google's production systems for some time, and the update has now been published on GitHub. The new release adds capabilities to the parser class that let users export parsing information about the robots.txt body, along with a new library for accessing that information, which Google Search Console has been using for a significant period without issues.
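The announcement does not spell out the new export API, but the library's long-standing matching interface gives a sense of how the parser is used. Below is a minimal sketch, assuming the googlebot::RobotsMatcher class from the google/robotstxt repository; the robots.txt body, user agent, and URL values are purely illustrative:

```cpp
#include <iostream>
#include <string>

#include "robots.h"  // from the google/robotstxt repository

int main() {
  // Illustrative robots.txt body; in practice this would be fetched
  // from a site's /robots.txt.
  const std::string robots_txt =
      "user-agent: FooBot\n"
      "disallow: /private/\n";

  const std::string user_agent = "FooBot";
  const std::string url = "https://example.com/private/page.html";

  // RobotsMatcher parses the robots.txt body and matches rules
  // against the given user agent and URL.
  googlebot::RobotsMatcher matcher;
  const bool allowed =
      matcher.OneAgentAllowedByRobots(robots_txt, user_agent, url);

  std::cout << (allowed ? "allowed" : "disallowed") << std::endl;
  return 0;
}
```

With the rules above, the sketch would print "disallowed", since the URL falls under the disallowed /private/ path for FooBot.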
When Google first released this parser, it was described as the open sourced C++ library used by its production systems for parsing and matching rules in robots.txt files. The library is roughly 20 years old and contains pieces of code written in the 1990s. Over time it has evolved, incorporating lessons from how webmasters actually write robots.txt files and addressing various corner cases. Where appropriate, those lessons have also been added to the Internet Draft for the Robots Exclusion Protocol.