TL;DR: What is robots.txt?
It’s a plain text file that tells Googlebot, or any other search engine bot, whether it is allowed to request a given page or resource.
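For example, a minimal robots.txt might look like this (the paths here are purely hypothetical):

```
# Applies to all crawlers
User-agent: *
# Don't request anything under /private/
Disallow: /private/
# Block a single JavaScript file
Disallow: /assets/legacy.js
```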
Fortunately, the URL Inspection tool in Google Search Console points out all of a rendered page’s resources that are blocked by robots.txt.
But how can you tell whether a blocked resource is important from a rendering point of view?
You have two options: Basic and Advanced.
The basic option: in most cases, it’s a good idea to simply ask your developers about it. They created your website, so they should know it well.
Obviously, if a script is called content.js or productListing.js, it’s probably relevant to rendering and shouldn’t be blocked.
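If you’d rather check programmatically whether a given resource is blocked, Python’s standard library can parse robots.txt for you. Here’s a minimal sketch, assuming a hypothetical domain and script URLs:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical site; replace with your own domain.
ROBOTS_URL = "https://www.example.com/robots.txt"

# Fetch and parse the live robots.txt file.
parser = RobotFileParser()
parser.set_url(ROBOTS_URL)
parser.read()

# Hypothetical resources you suspect matter for rendering.
resources = [
    "https://www.example.com/assets/content.js",
    "https://www.example.com/assets/productListing.js",
]

for url in resources:
    allowed = parser.can_fetch("Googlebot", url)
    print(f"{url}: {'allowed' if allowed else 'blocked'} for Googlebot")
```

Keep in mind that urllib.robotparser applies the standard robots exclusion rules; Googlebot’s actual matching (for example, its wildcard handling) can differ slightly, so treat the output as a first pass rather than a definitive verdict.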
Unfortunately, as of now, URL Inspection doesn’t inform you about the severity of a blocked JS file. The previous Fetch and Render tool (in the old Google Search Console) had such an option: