How to Run a Technical SEO Audit With Screaming Frog (Step-by-Step)

Screaming Frog remains the gold standard for technical SEO audits because it crawls much the way Googlebot does — following links, respecting robots.txt, and surfacing the issues that actually tank your rankings. While web-based tools like SEMrush Site Audit are faster to set up, Screaming Frog gives you granular control over crawl settings, plus exports that let you dig deep into problems.

Most SEOs overthink technical audits. The reality is straightforward: crawl your site, identify what’s broken, prioritize fixes by impact, then validate with Google Search Console. The entire process takes 2-4 hours from initial crawl to an actionable fix list.

This walkthrough covers the exact steps I use for client audits, including crawl configuration, issue identification, and building a prioritized remediation plan. You’ll also see how Sitebulb and SEMrush handle the same audit workflow differently.

Step 1: Configure Your Crawl Settings

Before hitting “Start,” configure Screaming Frog to crawl like Googlebot. Poor crawl settings produce garbage data that wastes hours of analysis time.

Essential Configuration Settings

Navigate to Configuration > Spider and enable these settings:

– **Follow Internal Links**: Always enabled for full site audits
– **Follow External Links**: Disable unless checking outbound link health
– **Respect Robots.txt**: Enable to see what Google actually crawls
– **Check Images**: Enable to audit alt text and image optimization

Under Configuration > User-Agent, switch from Screaming Frog to Googlebot. This reveals pages that block the default crawler but allow Google.

For JavaScript-heavy sites, enable JavaScript rendering under Configuration > Rendering. Set it to “JavaScript” mode rather than “Text Only” — this crawls the rendered DOM, which is what Google indexes on JavaScript-driven pages.
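The user-agent switch matters because robots.txt rules are applied per crawler. A minimal sketch of that behavior, using Python’s standard `urllib.robotparser` and a hypothetical robots.txt that blocks the default spider token but not Googlebot:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt: blocks the Screaming Frog token from /private/
# but leaves Googlebot covered only by the permissive wildcard rule.
ROBOTS_TXT = """\
User-agent: Screaming Frog SEO Spider
Disallow: /private/

User-agent: *
Allow: /
"""

rp = RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# The default crawler is blocked, but the same URL is open to Googlebot,
# so crawling as Googlebot reveals pages the default settings would skip.
print(rp.can_fetch("Screaming Frog SEO Spider", "https://example.com/private/page"))
print(rp.can_fetch("Googlebot", "https://example.com/private/page"))
```

This is why a crawl with the default user-agent can report fewer pages than Google actually sees.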

Mobile-First Crawling

Switch the user-agent to Googlebot (Smartphone) under Configuration > User-Agent — it’s one of the preset options, so no custom string is needed. Google’s mobile-first indexing means your mobile site structure determines rankings, not desktop. Set a reasonable crawl rate under Configuration > Speed (roughly a 1-2 second delay between requests) if your server struggles under load. Better to crawl slowly than trigger rate limiting that skews your data.

Step 2: Execute the Crawl

Enter your domain and hit Start. For most sites under 10,000 pages, expect 15-30 minutes. Enterprise sites with 100k+ pages can take several hours.

Watch the crawl progress for red flags:

– **High 4xx/5xx response codes**: Server issues or widespread broken links
– **Massive redirect chains**: Indicates structural problems
– **Slow average response time**: Performance issues affecting the entire site

Don’t stop mid-crawl unless you spot critical configuration errors. Incomplete crawls produce incomplete data that misses important issues.
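Spotting the 4xx/5xx red flag is just a tally over status codes. A quick sketch over hypothetical (URL, status) pairs standing in for a crawl export:

```python
from collections import Counter

# Hypothetical rows mimicking a crawl export -- (URL, status code) pairs.
crawled = [
    ("https://example.com/", 200),
    ("https://example.com/old-page", 404),
    ("https://example.com/blog", 200),
    ("https://example.com/promo", 301),
    ("https://example.com/api", 500),
]

# Bucket status codes by class (2xx, 3xx, 4xx, 5xx) to spot red flags early.
buckets = Counter(f"{status // 100}xx" for _, status in crawled)
print(buckets)
```

If the 4xx or 5xx buckets are more than a few percent of the total, pause and investigate before analyzing anything else.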

Step 3: Identify Priority Technical Issues

Once crawling completes, tackle issues in order of SEO impact. Not every technical problem deserves immediate attention — focus on what actually hurts rankings and user experience.

Broken Internal Links (High Priority)

Open the Response Codes tab and filter for Client Error (4xx). Export this data and sort by “Inlinks” to prioritize the pages with the most broken internal links pointing at them. These hurt user experience and waste crawl budget. Start with 404s that have 10+ internal links — these represent navigation or content structure problems that affect multiple pages.
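The inlink-based triage takes a few lines once the export is in hand. This sketch assumes a hypothetical CSV with `url`, `status`, and `inlinks` columns (the tool’s real export headers differ):

```python
import csv
import io

# Hypothetical CSV standing in for a 4xx export with inlink counts.
EXPORT = """\
url,status,inlinks
https://example.com/gone,404,14
https://example.com/typo,404,2
https://example.com/old-promo,404,31
"""

rows = list(csv.DictReader(io.StringIO(EXPORT)))

# Highest inlink counts first: a 404 with many internal links pointing
# at it is a structural problem, not a one-off.
priority = sorted(rows, key=lambda r: int(r["inlinks"]), reverse=True)
urgent = [r["url"] for r in priority if int(r["inlinks"]) >= 10]
print(urgent)
```

The `>= 10` threshold mirrors the “10+ internal links” cutoff above; adjust it to your site’s scale.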

Redirect Chains and Loops (High Priority)

Navigate to Response Codes > Redirection (3xx). Look for redirect chains longer than 2 hops — these slow page loading and dilute link equity. Common patterns to flag:

– HTTP to HTTPS to www redirects (should be a single redirect)
– Category page redirects that chain through multiple URLs
– Redirect loops that create infinite cycles
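Chain and loop detection is a simple graph walk. A sketch over a hypothetical source-to-target redirect map (as you might assemble from a redirect export):

```python
# Hypothetical redirect map: source URL -> target URL.
redirects = {
    "http://example.com/": "https://example.com/",
    "https://example.com/": "https://www.example.com/",
    "https://www.example.com/": "https://www.example.com/home",
    "https://example.com/a": "https://example.com/b",
    "https://example.com/b": "https://example.com/a",  # loop
}

def trace(url, redirects, max_hops=10):
    """Follow redirects from `url`; return (hop_count, looped)."""
    seen = {url}
    hops = 0
    while url in redirects:
        url = redirects[url]
        hops += 1
        if url in seen:          # revisiting a URL means an infinite cycle
            return hops, True
        seen.add(url)
        if hops >= max_hops:     # safety cap for very deep chains
            break
    return hops, False

chains = {u: trace(u, redirects) for u in redirects}
flagged = [u for u, (hops, looped) in chains.items() if hops > 2 or looped]
print(flagged)
```

Here the three-hop HTTP-to-HTTPS-to-www-to-home chain and both halves of the loop get flagged; each flagged source should redirect straight to its final destination.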

Duplicate Content Issues (Medium Priority)

Check the Page Titles and Meta Description tabs for duplicates. Filter by “Duplicate” to see exact matches across multiple URLs. Focus on:

– **Homepage variants**: domain.com, domain.com/index.html, www.domain.com
– **Parameter-driven duplicates**: Filtering, sorting, or tracking parameters creating duplicate pages
– **Category/tag duplicates**: Similar content across different taxonomy pages
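The duplicate filter is essentially a group-by on the title string. A sketch over hypothetical (URL, title) pairs from a titles export:

```python
from collections import defaultdict

# Hypothetical (url, title) pairs standing in for a Page Titles export.
pages = [
    ("https://example.com/", "Acme Widgets | Home"),
    ("https://example.com/index.html", "Acme Widgets | Home"),
    ("https://example.com/shop", "Shop Widgets"),
    ("https://example.com/shop?sort=price", "Shop Widgets"),
    ("https://example.com/about", "About Acme"),
]

by_title = defaultdict(list)
for url, title in pages:
    by_title[title].append(url)

# Any title shared by two or more URLs is a duplicate worth reviewing.
duplicates = {t: urls for t, urls in by_title.items() if len(urls) > 1}
print(duplicates)
```

Note how the two flagged groups match the patterns above: a homepage variant (`/` vs `/index.html`) and a parameter-driven duplicate (`?sort=price`).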

Missing Optimization Elements (Medium Priority)

Review these tabs for optimization gaps:

– **Page Titles**: Missing or over 60 characters
– **Meta Descriptions**: Missing or over 160 characters
– **Images**: Missing alt text, especially for content images
– **H1**: Missing or multiple H1 tags per page
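These checks are plain length and count rules. A sketch using the 60/160-character thresholds from the list above (rules of thumb only — real SERP truncation is pixel-based, not character-based):

```python
TITLE_MAX, META_MAX = 60, 160  # rule-of-thumb character limits

def audit_page(title, meta_description, h1_tags):
    """Return a list of on-page issues for one crawled page."""
    issues = []
    if not title:
        issues.append("missing title")
    elif len(title) > TITLE_MAX:
        issues.append("title too long")
    if not meta_description:
        issues.append("missing meta description")
    elif len(meta_description) > META_MAX:
        issues.append("meta description too long")
    if len(h1_tags) == 0:
        issues.append("missing h1")
    elif len(h1_tags) > 1:
        issues.append("multiple h1 tags")
    return issues

# A page with an overlong title, no meta description, and two H1s:
print(audit_page("A" * 75, "", ["Welcome", "Welcome again"]))
```

Running `audit_page` over every row of a crawl export gives you the same gap list the tabs show, in a form you can sort and assign.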

Orphaned Pages (Low-Medium Priority)

Use Integration > Google Analytics or Search Console to identify pages getting organic traffic but not found during the crawl. These orphaned pages have no internal links but still receive search visibility.
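Under the hood this is a set difference: URLs your analytics knows about minus URLs the crawl reached. A sketch with hypothetical URL sets standing in for the two data sources:

```python
# URLs with recorded organic traffic (e.g. from Google Analytics),
# versus URLs discovered by following internal links in the crawl.
analytics_urls = {
    "https://example.com/",
    "https://example.com/blog/post-1",
    "https://example.com/landing/summer-sale",
}
crawled_urls = {
    "https://example.com/",
    "https://example.com/blog/post-1",
    "https://example.com/blog/post-2",
}

# Pages earning traffic that the crawler never reached = orphans.
orphans = analytics_urls - crawled_urls
print(orphans)
```

Each orphan needs either internal links added or a deliberate decision to leave it standalone (e.g. a paid landing page).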

Step 4: Validate with Google Search Console

Cross-reference Screaming Frog findings with Google Search Console data to confirm what Google actually sees versus what your crawler found. In GSC’s Coverage report (now labeled “Page indexing”), compare:

– **Valid pages count** vs. your crawled pages count
– **Excluded pages** that Screaming Frog found but Google ignores
– **Error pages** that match your 4xx/5xx findings

This validation catches crawl discrepancies and confirms which issues Google considers problems versus technical noise. Run Core Web Vitals reports for pages flagged as slow in Screaming Frog. Real user experience data from GSC trumps synthetic testing for prioritization.
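Once both datasets are exported as URL lists, the three comparisons reduce to set operations. A sketch with hypothetical sets standing in for the crawl export and GSC data:

```python
# Hypothetical URL sets: one from the crawl, two from GSC exports.
crawled = {
    "https://example.com/",
    "https://example.com/a",
    "https://example.com/thin",
    "https://example.com/broken",
}
gsc_valid = {
    "https://example.com/",
    "https://example.com/a",
    "https://example.com/only-in-gsc",
}
gsc_errors = {"https://example.com/broken"}

crawled_not_indexed = crawled - gsc_valid - gsc_errors  # Google may be excluding these
indexed_not_crawled = gsc_valid - crawled               # orphan candidates
confirmed_errors = crawled & gsc_errors                 # both tools agree: fix first
```

The `confirmed_errors` bucket is the safest place to start, since Google and your crawler agree those pages are broken.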

Step 5: Build Your Prioritized Fix List

Organize findings into an actionable spreadsheet with these columns:

– **Issue Type**: Broken links, duplicates, missing elements, etc.
– **Affected URLs**: Specific pages with problems
– **Priority**: High/Medium/Low based on traffic and ranking impact
– **Effort**: Hours required to fix
– **Owner**: Who implements the fix
| Issue Type | Priority | Typical Fix Time | Impact on Rankings |
| --- | --- | --- | --- |
| Broken internal links | High | 1-2 hours | Direct crawl budget waste |
| Redirect chains | High | 2-4 hours | Page speed and link equity loss |
| Duplicate titles | Medium | 4-8 hours | Keyword cannibalization |
| Missing alt text | Low | 2-6 hours | Image search visibility |
Focus on high-priority items that take under 4 hours to fix. These quick wins often produce immediate ranking improvements while building momentum for larger technical projects.
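The quick-wins-first ordering can be expressed as a simple sort key: highest priority first, then cheapest fix first. The weights and sample rows below are illustrative assumptions, not fixed rules:

```python
# Assumed weights: map the High/Medium/Low labels to sortable numbers.
PRIORITY_WEIGHT = {"High": 3, "Medium": 2, "Low": 1}

issues = [
    {"issue": "Broken internal links", "priority": "High", "effort_hours": 2},
    {"issue": "Redirect chains", "priority": "High", "effort_hours": 3},
    {"issue": "Duplicate titles", "priority": "Medium", "effort_hours": 6},
    {"issue": "Missing alt text", "priority": "Low", "effort_hours": 4},
]

def score(item):
    # Higher priority first; among equals, cheaper fixes first.
    return (-PRIORITY_WEIGHT[item["priority"]], item["effort_hours"])

fix_order = sorted(issues, key=score)

# High-priority items under 4 hours = the quick wins called out above.
quick_wins = [i["issue"] for i in fix_order
              if i["priority"] == "High" and i["effort_hours"] <= 4]
print(quick_wins)
```

The same logic works as a formula column in the spreadsheet; the point is that the ordering is mechanical once priority and effort are filled in.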

Alternative Tools: Sitebulb vs SEMrush Site Audit

Sitebulb excels at visual reporting and automated issue prioritization. Its hints system guides non-technical users through fixes, while interactive charts make client presentations easier. However, it costs more than Screaming Frog and offers fewer granular data export options.

SEMrush Site Audit runs entirely in-browser and connects directly to Search Console for streamlined reporting. The automated scheduling and progress tracking work well for agencies managing multiple client audits. The downside is less control over crawl settings and no offline analysis capability.
| Tool | Best For | Price | Key Advantage |
| --- | --- | --- | --- |
| Screaming Frog | Deep technical audits | $259/year | Granular control and exports |
| Sitebulb | Client reporting | $35/month | Visual reports and hints |
| SEMrush Site Audit | Ongoing monitoring | $119/month | Automated scheduling |

Verdict

| Use Case | Best Tool | Why |
| --- | --- | --- |
| First-time technical audit | Screaming Frog | Complete data control and detailed exports for analysis |
| Client presentations | Sitebulb | Visual reports that non-technical stakeholders understand |
| Ongoing site monitoring | SEMrush Site Audit | Automated crawls and progress tracking over time |
| Budget-conscious freelancers | Screaming Frog | Flat annual fee vs. monthly subscriptions |

FAQ

How often should I run technical SEO audits?

Quarterly for most sites, monthly for large sites with frequent content updates. Major site changes (redesigns, migrations, new CMS) require immediate post-launch audits regardless of schedule.

Can I audit sites with millions of pages in Screaming Frog?

The paid version removes the free tier’s URL cap and, with database storage mode enabled, handles crawls in the hundreds of thousands of URLs. Beyond that, use sampling (crawl sections of the site) or switch to enterprise tools like Botify or Lumar (formerly DeepCrawl) that are built for very large sites.

Should I fix every issue Screaming Frog identifies?

No. Focus on issues affecting your highest-traffic pages first. A broken link on a page with no traffic shouldn’t take priority over duplicate titles on your category pages.

How do I handle JavaScript-heavy sites in technical audits?

Enable JavaScript rendering in Screaming Frog’s configuration. For complex single-page applications, supplement with Google Search Console data to see what Google actually indexes versus what renders in the crawler.

Want help picking the right SEM tool stack?

If reading reviews and comparing tools is starting to feel like its own job, we can help you cut through the noise faster. A working SEO will look at your situation and tell you what stack actually fits.

Get in touch

Disclosure: Some links in this post are affiliate links. If you sign up through them, we earn a commission at no extra cost to you. Here’s how that works.