I just ran a crawl using Screaming Frog, and it reports that the 21 crawled URLs have no issues. But the website obviously has common problems, like missing meta descriptions.
I’ve checked that the settings are at their defaults, and I tested other websites where it works fine, so the problem seems to be with this specific site. Since it’s a small site, I’ll handle the audit manually for now, e.g. with a quick script like the sketch below.
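For anyone in a similar spot, here is a minimal Python sketch for double-checking missing meta descriptions independently of Screaming Frog. It assumes the site exposes a standard `sitemap.xml`; the `SITEMAP_URL` value is a placeholder, not my actual site.

```python
# Minimal sketch: flag pages with a missing or empty meta description.
# Assumes a standard XML sitemap; SITEMAP_URL is a placeholder.
import requests
from bs4 import BeautifulSoup

SITEMAP_URL = "https://example.com/sitemap.xml"  # hypothetical URL

def sitemap_urls(sitemap_url):
    """Pull the <loc> entries out of a standard XML sitemap."""
    resp = requests.get(sitemap_url, timeout=10)
    resp.raise_for_status()
    soup = BeautifulSoup(resp.text, "html.parser")
    return [loc.get_text(strip=True) for loc in soup.find_all("loc")]

def missing_description(url):
    """Return True if the page has no meta description, or an empty one."""
    resp = requests.get(url, timeout=10)
    resp.raise_for_status()
    soup = BeautifulSoup(resp.text, "html.parser")
    tag = soup.find("meta", attrs={"name": "description"})
    return tag is None or not tag.get("content", "").strip()

if __name__ == "__main__":
    for url in sitemap_urls(SITEMAP_URL):
        if missing_description(url):
            print(f"Missing description: {url}")
```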
However, I’d like to understand why this is happening and how to fix it. Any ideas?
EDIT: The crawl isn’t blocked; it shows all the details for each URL (address, meta description content and length, directives, etc.). But the Overview tab in the right-hand window shows 0 for every issue, even though it displays the correct number of URLs.