GSC reports that Google is indexing URLs blocked by robots.txt.
Google is indexing URLs like:
/detroitchicago/portland.js?gcb=3&cb=69
/beardeddragon/iguana.js?cb=154
Blocking these files with robots.txt is not the way to keep them out of the index.
Google warns:
Important: For the noindex rule to be effective, the page or resource must not be blocked by a robots.txt file, and it has to be otherwise accessible to the crawler. If the page is blocked by a robots.txt file or the crawler can't access the page, the crawler will never see the noindex rule, and the page can still appear in search results, for example if other pages link to it.
Please apply a noindex rule to these Ezoic files. Since they are .js resources, that means serving an X-Robots-Tag: noindex HTTP header, and not blocking them in robots.txt so the crawler can actually see it.
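For reference, a minimal sketch of how a server could attach that header, assuming a Node/Express setup; the path prefixes and the "scripts" directory are placeholders, and Ezoic's actual serving stack will differ:

```ts
import express from "express";

const app = express();

// Attach "X-Robots-Tag: noindex" to every response under these
// (placeholder) script paths before the static handler runs, so
// Google drops the files from the index instead of being blocked
// from ever seeing the rule.
app.use(["/detroitchicago", "/beardeddragon"], (_req, res, next) => {
  res.setHeader("X-Robots-Tag", "noindex");
  next();
});

// Serve the script files themselves from a local "scripts" folder
// (stand-in for wherever the files actually live).
app.use(express.static("scripts"));

app.listen(3000, () => console.log("listening on :3000"));
```

With the header in place, the corresponding Disallow lines would also need to come out of robots.txt; otherwise Google never fetches the files and never sees the noindex, which is exactly the situation the quoted warning describes.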