This phrasing is very misleading. Bullet-pointing directly from "possibility" to "standard" implies standardization was the turning point at which it could start being used. But it was in massive use long before that; the standard is a side note that's barely relevant.
Here's a 2010 discussion of Google's explicit support, and I'm sure I could find earlier ones.
The thing Google did in 2019 was submit it as a proposed standard, which had nothing to do with adoption or with starting to recommend it. In that very post they said: "For 25 years, the Robots Exclusion Protocol (REP) has been one of the most basic and critical components of the web" and "The proposed REP draft reflects over 20 years of real world experience of relying on robots.txt rules, used both by Googlebot and other major crawlers, as well as about half a billion websites that rely on REP."
Thought of and discussed as a possibility in 1994.
Proposed as a standard in 2019.
Adopted as a standard in 2022.
Thanks, IETF.