I'm getting errors in Google Search Console for system pages that are blocked in my robots.txt file. Some examples are below.
How do I prevent this? (Is Google simply ignoring the robots.txt file?)
https://www.mysite.com.au/_mycart?tkn=cart&ts=1563862031566649
https://www.mysite.com.au/_myacct
https://www.mysite.com.au/_myacct/warranty
https://www.mysite.com.au/_myacct/nr_track_order
https://www.mysite.com.au/_mycart?tkn=cart&ts=1563504027155146
That doesn't sound like an error; it's exactly the behaviour robots.txt creates. Google is seeing links to those pages somewhere on your site, but when it attempts to crawl them it gets blocked by robots.txt. Search Console warns you about that just in case it's not what you intended. If blocking those pages is intentional, the warnings can be safely ignored.
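For reference, here is a minimal robots.txt sketch that would produce exactly those reports, assuming your blocking rules cover the cart and account paths shown in your examples (your actual file may differ):

User-agent: *
Disallow: /_mycart
Disallow: /_myacct

Note that robots.txt only stops crawling, not indexing: if other pages link to those URLs, Google can still list them. If your goal is to keep them out of the index entirely, the usual approach is to allow crawling and return a noindex signal instead, for example an X-Robots-Tag: noindex HTTP header on those paths.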