Most SEO conversations focus on content: keywords, backlinks, and publishing strategies. But content can only perform as well as the technical foundation underneath it. If search engines can't efficiently crawl, index, and understand your pages, no amount of great writing will help them rank.
A technical SEO audit is the systematic process of identifying and fixing the infrastructure issues that prevent your site from reaching its organic search potential. Done properly, an audit surfaces hidden problems — from misconfigured robots.txt files to broken redirect chains to missing structured data — that silently bleed rankings every day without any obvious symptom.
This guide walks you through a complete technical SEO audit: what it covers, why each element matters, a step-by-step process you can follow for any website, and an interactive checklist you can use to track your progress. Whether you're auditing your own site or working through a client's, you'll have a clear, actionable framework by the end.
What This Guide Covers
A technical SEO audit examines six core areas: crawlability (can search engines reach your pages?), indexability (will they include your pages in search results?), site speed (do pages load fast enough?), mobile friendliness (does your site work on phones?), structured data (do search engines understand your content type?), and security (is your site safe and trustworthy?). This guide covers each in depth.
What Is a Technical SEO Audit?
A technical SEO audit is a comprehensive review of the non-content elements of a website that affect its visibility in search engines. While content SEO focuses on what a page says, technical SEO focuses on whether search engines can access, process, and trust the page in the first place.
Think of it this way: if your website were a library, content SEO would be the quality of the books, and technical SEO would be whether the library doors are open, whether books are properly catalogued, whether the aisles are accessible, and whether the building passes safety inspections. Even the best books go unread if the library can't be found or entered.
Technical audits are not one-time events. Best practice is to run a full technical audit quarterly, with lightweight monitoring running continuously. As sites grow — adding pages, changing CMS platforms, running experiments — new technical issues emerge. Regular auditing catches these before they compound.
Why Technical SEO Is the Foundation of Rankings
Google processes hundreds of ranking signals, but all of them depend on one prerequisite: the page must be crawled and indexed. A technically broken page is invisible to search engines regardless of its content quality, backlink profile, or on-page optimization. Technical SEO is upstream of everything else.
Beyond the crawl-and-index prerequisite, Core Web Vitals — Google's framework of page experience metrics — are explicit ranking factors. Sites that load slowly, shift layouts unexpectedly, or are unresponsive to user input face measurable ranking penalties. These are technical issues, not content issues.
Similarly, Google's structured data requirements directly affect whether your pages can earn rich results — the enhanced search listings with star ratings, FAQs, how-to steps, and recipe details that dramatically improve click-through rates. Structured data is a technical implementation, and errors in that implementation silently prevent rich result eligibility.
For sites of any meaningful scale — typically 100+ pages — technical debt accumulates faster than most teams realize. Redirect chains lengthen. Duplicate content proliferates. Orphaned pages lose internal link equity. A systematic audit is how you find and fix these issues before they become ranking liabilities.
Run a Technical SEO Audit Now — Free
RankPath crawls your site and checks 50+ technical SEO signals automatically. Get a full report in under two minutes, no account required.
Start Free Audit →

Core Elements of a Technical SEO Audit
A comprehensive technical SEO audit covers six interconnected areas. Each area has a set of specific checks; RankPath's crawler evaluates over 50 of them automatically. Here is what each area examines and why it matters.
1. Crawlability
Crawlability refers to a search engine's ability to discover and access your pages. Before any other optimization can have an effect, your site must be crawlable. The most common crawlability issues include:
- robots.txt misconfiguration: The `robots.txt` file instructs crawlers which paths to visit and which to avoid. A misconfigured file can accidentally block important pages — or entire sections of your site — from being crawled. This is one of the most impactful and commonly missed errors. Always verify that your `robots.txt` at `/robots.txt` is not blocking CSS, JavaScript files, or key content pages.
- Missing or malformed sitemap.xml: An XML sitemap is a roadmap for search engine crawlers, listing the URLs you want indexed along with metadata like last modification date and update frequency. A missing sitemap doesn't prevent crawling, but a well-maintained sitemap accelerates it — especially for large sites or sites with low internal linking.
- Crawl errors: These include 404 (Not Found) responses that waste crawl budget and potentially break user experience, and server errors (5xx) that signal infrastructure problems. Monitor crawl error trends in Google Search Console regularly.
- Blocked resources: If your `robots.txt` blocks CSS or JavaScript files that search engines need to render your pages, Google may not be able to understand your content correctly. Fetch and render tools in Google Search Console can verify this.
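A robots.txt file can be sanity-checked before deployment with Python's standard-library parser, which applies basic allow/disallow matching (Google's own matcher adds wildcard handling beyond this). A minimal sketch, using a hypothetical file that accidentally blocks an assets directory:

```python
from urllib.robotparser import RobotFileParser

# A deliberately misconfigured robots.txt: it blocks /assets/, which would
# prevent crawlers from fetching the CSS/JS needed to render pages.
ROBOTS_TXT = """\
User-agent: *
Disallow: /admin/
Disallow: /assets/
""".splitlines()

rp = RobotFileParser()
rp.parse(ROBOTS_TXT)

# Paths a crawler would check before fetching
print(rp.can_fetch("*", "/blog/technical-seo-audit"))  # content page: crawlable
print(rp.can_fetch("*", "/assets/site.css"))           # rendering resource blocked!
print(rp.can_fetch("*", "/admin/login"))               # intentional block
```

Running this against your real `/robots.txt` (fetched with `rp.set_url(...)` and `rp.read()`) quickly confirms whether key content and rendering resources are reachable.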
2. Indexability
Indexability is whether search engines will include your crawled pages in their index. A page can be crawlable but intentionally or accidentally excluded from the index. Key indexability checks include:
- Noindex directives: Pages with `<meta name="robots" content="noindex">` or `X-Robots-Tag: noindex` HTTP headers are excluded from the search index. These are sometimes left in place from staging environments or incorrectly applied to production pages.
- Canonical tags: The `rel="canonical"` tag tells search engines which version of a page is the "master" copy when multiple URLs contain similar or identical content. Incorrect canonical tags can cause the wrong version of a page to be indexed, or consolidate ranking signals to a different URL than intended.
- Duplicate content: When the same or highly similar content exists at multiple URLs — due to URL parameters, trailing slashes, www vs. non-www, or HTTP vs. HTTPS variations — search engines must decide which version to index. Without canonical tags or 301 redirects, this decision is unpredictable and often wrong.
- Thin content: Pages with very little meaningful content (under 200-300 words for most page types) may be treated as low-quality and excluded from the index or demoted in rankings. Identify and either enrich or consolidate thin pages.
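The duplicate-content variants described above can be surfaced programmatically by collapsing each crawled URL to one canonical form and grouping matches. A sketch, assuming a policy of HTTPS, no www, no trailing slash, and a hypothetical list of tracking parameters to strip:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Illustrative parameter list -- adjust to the tracking params your site uses.
TRACKING_PARAMS = {"utm_source", "utm_medium", "utm_campaign", "ref"}

def canonical_form(url: str) -> str:
    """Collapse common duplicate-content URL variants into a single form.
    Assumes the canonical policy is https, no www, no trailing slash,
    no tracking parameters -- match this to your own site's policy."""
    scheme, netloc, path, query, _ = urlsplit(url)
    netloc = netloc.lower().removeprefix("www.")
    path = path.rstrip("/") or "/"
    kept = [(k, v) for k, v in parse_qsl(query) if k not in TRACKING_PARAMS]
    return urlunsplit(("https", netloc, path, urlencode(kept), ""))

variants = [
    "http://www.example.com/pricing/",
    "https://example.com/pricing",
    "https://example.com/pricing?utm_source=newsletter",
]
print({canonical_form(u) for u in variants})  # all three collapse to one URL
```

Any group with more than one live URL after normalization is a candidate for a canonical tag or a 301 redirect.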
3. Site Speed and Core Web Vitals
Page speed has been a ranking factor since 2010, and Google's Core Web Vitals — introduced in 2021 and now stable as ranking signals — have made performance a first-class SEO concern. The three Core Web Vitals are:
- Largest Contentful Paint (LCP): Measures how quickly the largest visible element on the page loads. Target: under 2.5 seconds. LCP is most commonly affected by large images, slow server response times, and render-blocking resources.
- Interaction to Next Paint (INP): Measures the page's overall responsiveness to user interactions. Target: under 200 milliseconds. High INP is typically caused by heavy JavaScript execution that blocks the main thread.
- Cumulative Layout Shift (CLS): Measures how much page elements shift unexpectedly during loading. Target: under 0.1. Common causes include images without explicit dimensions, web fonts loading, and dynamically injected content above the fold.
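The three thresholds above can be wrapped in a small classifier that mirrors the good / needs-improvement / poor buckets Google reports:

```python
# Published "good" and "poor" thresholds for the three Core Web Vitals.
# Values between the two bounds fall into "needs improvement".
THRESHOLDS = {
    "LCP": (2500, 4000),   # milliseconds
    "INP": (200, 500),     # milliseconds
    "CLS": (0.1, 0.25),    # unitless layout-shift score
}

def rate(metric: str, value: float) -> str:
    """Classify a Core Web Vitals field measurement."""
    good, poor = THRESHOLDS[metric]
    if value <= good:
        return "good"
    if value <= poor:
        return "needs improvement"
    return "poor"

print(rate("LCP", 1800))  # good
print(rate("INP", 350))   # needs improvement
print(rate("CLS", 0.31))  # poor
```

Note that Google rates a URL group at the 75th percentile of field measurements, so apply the classifier to p75 values rather than averages.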
Beyond Core Web Vitals, general performance improvements — enabling compression (gzip or Brotli), leveraging browser caching, minifying CSS and JavaScript, and optimizing images (WebP format, lazy loading, proper sizing) — improve both user experience and search engine crawl efficiency.
4. Mobile Friendliness
Google switched to mobile-first indexing for all sites in 2023, meaning the mobile version of your site is what Google primarily crawls and indexes. A site that works well on desktop but breaks or degrades on mobile faces significant ranking disadvantages. Key checks:
- Viewport meta tag: Every page must include `<meta name="viewport" content="width=device-width, initial-scale=1">`. Without it, browsers render pages at desktop width and users must zoom to read content.
- Tap target sizes: Buttons, links, and interactive elements should be at least 48×48 pixels to be reliably tappable on mobile devices. Cramped tap targets generate poor mobile usability signals.
- Text readability: Body text should be at least 16px on mobile to be readable without pinch-zooming. Larger sizes are an increasingly common best practice.
- Mobile-only content: Ensure that the content visible on mobile (what Google indexes) is the same as on desktop. Hiding content on mobile with CSS `display: none` may reduce what gets indexed.
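The viewport check above is easy to automate with Python's standard-library HTML parser. A sketch that flags pages missing a `width=device-width` viewport declaration:

```python
from html.parser import HTMLParser

class ViewportCheck(HTMLParser):
    """Records the viewport meta tag, if the page declares one."""
    def __init__(self):
        super().__init__()
        self.viewport = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name") == "viewport":
            self.viewport = a.get("content", "")

def has_mobile_viewport(html: str) -> bool:
    checker = ViewportCheck()
    checker.feed(html)
    return checker.viewport is not None and "width=device-width" in checker.viewport

good = '<head><meta name="viewport" content="width=device-width, initial-scale=1"></head>'
bad = '<head><title>No viewport here</title></head>'
print(has_mobile_viewport(good))  # True
print(has_mobile_viewport(bad))   # False
```

Run this over every template in your CMS rather than every page; viewport tags are almost always set at the template level.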
5. Structured Data
Structured data is machine-readable markup — almost always implemented as JSON-LD in the page's `<head>` or `<body>` — that explicitly tells search engines what type of content a page contains and provides key details about it. Correct structured data implementation enables rich results in Google Search, which can dramatically improve click-through rates.
- Schema types by page type: Blog posts and articles should use `BlogPosting` or `Article`. How-to guides should use `HowTo`. FAQ pages should use `FAQPage`. Product pages should use `Product`. Each schema type has required and recommended properties that must be correctly populated.
- Validation: Use Google's Rich Results Test to validate your structured data and confirm it is eligible for rich result display. Common errors include missing required properties, incorrect data types, and syntax errors in the JSON.
- Organization and BreadcrumbList: Sitewide schemas for your organization's name, URL, and logo (via `Organization`) and breadcrumb navigation (via `BreadcrumbList`) improve how Google understands your site's identity and structure.
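Required-property checks can be scripted as a first pass before the Rich Results Test. A minimal sketch validating a hypothetical Article JSON-LD block against a hand-maintained subset of required properties (Google's documentation is the authoritative source for each type):

```python
import json

# Illustrative subset of required properties per schema type -- consult
# Google's structured data documentation for the complete, current lists.
REQUIRED = {
    "Article": ["headline", "author", "datePublished", "publisher"],
    "FAQPage": ["mainEntity"],
}

def missing_properties(jsonld: str) -> list[str]:
    """Return the required top-level properties absent from a JSON-LD block."""
    data = json.loads(jsonld)
    required = REQUIRED.get(data.get("@type", ""), [])
    return [prop for prop in required if prop not in data]

snippet = """{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Technical SEO Audit Guide",
  "datePublished": "2024-01-15"
}"""
print(missing_properties(snippet))  # ['author', 'publisher']
```

This only catches missing keys, not wrong data types or nested-object errors, so it complements rather than replaces the Rich Results Test.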
6. Security
Security is a direct ranking signal and a trust indicator for users. HTTPS has been a ranking factor since 2014. Security issues also trigger browser warnings that dramatically increase bounce rates and destroy user trust. Key checks:
- HTTPS enforcement: All pages must be served over HTTPS. HTTP pages should 301-redirect to their HTTPS equivalents. Verify that the redirect is in place and that HTTPS is working across all subdomains used in production.
- SSL/TLS certificate validity: An expired or misconfigured SSL certificate causes browsers to display a security warning page before your content, effectively removing your site from organic reach for affected users.
- Mixed content: Pages served over HTTPS that load resources (images, scripts, stylesheets) over HTTP generate mixed content warnings. Modern browsers block many mixed content resources by default, which can break page functionality.
- Security headers: Headers like `X-Content-Type-Options`, `X-Frame-Options`, and `Content-Security-Policy` protect against clickjacking and injection attacks. While not direct ranking signals, they're technical best practices and contribute to trust signals.
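A sketch of the header review, assuming you have already captured the response headers (for example from browser devtools or `curl -I`); the recommended set here is illustrative:

```python
# Illustrative baseline of security headers worth checking for -- tailor
# this set to your own security policy.
RECOMMENDED = {
    "strict-transport-security",
    "x-content-type-options",
    "x-frame-options",
    "content-security-policy",
}

def missing_security_headers(headers: dict[str, str]) -> set[str]:
    """Report recommended security headers absent from a response.
    Header names are compared case-insensitively, as HTTP requires."""
    present = {name.lower() for name in headers}
    return RECOMMENDED - present

response_headers = {
    "Content-Type": "text/html; charset=utf-8",
    "X-Frame-Options": "DENY",
    "X-Content-Type-Options": "nosniff",
}
print(sorted(missing_security_headers(response_headers)))
# ['content-security-policy', 'strict-transport-security']
```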
The 7-Step Technical SEO Audit Process
With the six core audit areas in mind, here is a concrete, sequential process for conducting a technical SEO audit on any website.
Step 1: Set Up Your Audit Tools
Before crawling anything, ensure you have access to the right data sources. A thorough technical audit draws from multiple inputs:
- Google Search Console (GSC): GSC is free and provides authoritative data from Google about how it crawls and indexes your site. Set it up at search.google.com/search-console if you haven't already. You'll use it for crawl error reports, Core Web Vitals data, mobile usability issues, and manual action notifications.
- RankPath free audit tool: RankPath's crawler (/check) performs over 50 technical SEO checks on any URL — title tags, meta descriptions, heading structure, canonical tags, robots.txt, structured data, Open Graph tags, and more — in under two minutes without requiring account setup. Use it as your primary automated check.
- Google PageSpeed Insights: Run your key pages through PageSpeed Insights to get Core Web Vitals data with diagnostic breakdowns for both mobile and desktop performance.
With these three tools in place, you have access to the data needed for a thorough initial audit. For larger sites (10,000+ pages), a dedicated crawler like Screaming Frog or Semrush Site Audit adds value for bulk analysis.
Step 2: Crawl the Site
A complete site crawl is the foundation of any technical SEO audit. The crawl discovers all accessible pages, identifies broken links and redirect chains, and surfaces on-page technical issues at scale.
Start by running RankPath's free audit on your homepage and key landing pages. This gives you an immediate view of the most critical issues on your highest-value pages. For a site-wide crawl, use Google Search Console's URL Inspection tool or a dedicated crawler.
During the crawl, prioritize discovering:
- All 4xx and 5xx status code responses (broken pages and server errors)
- Redirect chains (URLs that redirect through 2+ intermediate URLs before reaching the final destination)
- Pages blocked by robots.txt or noindex directives — confirm these exclusions are intentional
- Pages absent from the sitemap.xml but discovered via internal links (orphaned or unregistered pages)
- Duplicate title tags and meta descriptions (common on large sites with templated content)
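Redirect chains in particular are easy to surface once a crawl has produced a URL-to-target map. A sketch with hypothetical URLs:

```python
def redirect_chain(start: str, redirects: dict[str, str], limit: int = 10) -> list[str]:
    """Follow a URL through a redirect map (url -> target, as collected by a
    crawl) and return every hop, stopping on loops or after `limit` hops."""
    chain = [start]
    while chain[-1] in redirects and len(chain) <= limit:
        nxt = redirects[chain[-1]]
        if nxt in chain:  # redirect loop -- record it and stop
            chain.append(nxt)
            break
        chain.append(nxt)
    return chain

# Hypothetical crawl output: /old moved twice before settling at /new-v2
redirects = {"/old": "/old-v1", "/old-v1": "/new", "/new": "/new-v2"}
chain = redirect_chain("/old", redirects)
print(chain)           # ['/old', '/old-v1', '/new', '/new-v2']
print(len(chain) - 1)  # 3 hops -- worth collapsing to a single 301
```

Any chain with two or more hops is a candidate for flattening: point the origin URL directly at the final destination.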
Document your findings systematically. A spreadsheet with columns for URL, status code, redirect target, title tag, and issue flags makes triage much easier in later steps.
Step 3: Analyze Crawl Results and Prioritize Issues
Raw crawl data must be translated into prioritized action items. Not all technical SEO issues have equal impact — a 5xx server error that affects your homepage is more urgent than a missing meta description on a low-traffic page.
Use this priority framework to triage your findings:
- Critical (fix immediately): Crawl blocks on important pages, 5xx server errors, HTTPS failures, duplicate content without canonicalization, noindex on pages you intend to rank.
- High (fix within 2 weeks): Missing or misconfigured canonical tags, broken internal links on key pages, 404 errors on linked pages, redirect chains longer than 2 hops, Core Web Vitals failures on key landing pages.
- Medium (fix within 1 month): Missing or thin structured data, non-optimized images, missing meta descriptions, mobile usability warnings on secondary pages, sitemap.xml missing some indexed pages.
- Low (fix when practical): Missing security headers, minor CLS issues on low-traffic pages, URL parameter handling, hreflang for international sites.
Create a prioritized backlog of fixes, assign owners, and set deadlines. A perfectly prioritized fix list with clear ownership is more valuable than a complete list with no accountability.
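The framework above can be encoded as a lookup table so crawl findings sort themselves into a backlog. The issue names here are illustrative, not a standard taxonomy:

```python
# Map issue types to the four priority tiers from the framework above.
PRIORITY = {
    "crawl_block_on_key_page": "critical",
    "server_error_5xx": "critical",
    "unintended_noindex": "critical",
    "broken_internal_link": "high",
    "redirect_chain": "high",
    "cwv_failure_landing_page": "high",
    "missing_structured_data": "medium",
    "missing_meta_description": "medium",
    "missing_security_header": "low",
}
ORDER = {"critical": 0, "high": 1, "medium": 2, "low": 3}

def triage(findings: list[str]) -> list[tuple[str, str]]:
    """Sort raw crawl findings into a fix-first backlog.
    Unknown issue types default to low priority."""
    labelled = [(issue, PRIORITY.get(issue, "low")) for issue in findings]
    return sorted(labelled, key=lambda pair: ORDER[pair[1]])

backlog = triage(["missing_meta_description", "server_error_5xx", "redirect_chain"])
print(backlog[0])  # ('server_error_5xx', 'critical') -- fix this first
```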
Step 4: Check Core Web Vitals and Page Performance
Core Web Vitals data is available in Google Search Console under the "Core Web Vitals" report, which aggregates field data (real user measurements) by URL group. This is more meaningful than lab data from PageSpeed Insights because it reflects how actual visitors experience your pages across all their devices and network conditions.
Start with the pages flagged as "Poor" in GSC's Core Web Vitals report — these are confirmed ranking liabilities. Then audit "Needs Improvement" pages before they deteriorate to "Poor."
For each underperforming page, diagnose the root cause:
- Slow LCP: Usually caused by a large hero image without proper optimization, a slow server response (TTFB > 800ms), or render-blocking scripts in the `<head>`. Fix by optimizing images to WebP, enabling a CDN, or deferring non-critical JavaScript.
- High INP: Caused by JavaScript that blocks the main thread during user interactions. Audit your JavaScript bundles, remove unused code, and consider code-splitting for large applications.
- High CLS: Almost always caused by images without explicit `width` and `height` attributes, or fonts triggering layout shifts when they load. Add explicit dimensions to all images and use `font-display: optional` or `font-display: swap` for web fonts.
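The CLS cause named above, images without explicit dimensions, can be detected statically. A sketch using Python's standard-library HTML parser on a hypothetical page fragment:

```python
from html.parser import HTMLParser

class ImageDimensionCheck(HTMLParser):
    """Collects <img> tags that lack explicit width/height attributes --
    the most common cause of Cumulative Layout Shift."""
    def __init__(self):
        super().__init__()
        self.unsized = []

    def handle_starttag(self, tag, attrs):
        if tag != "img":
            return
        a = dict(attrs)
        if "width" not in a or "height" not in a:
            self.unsized.append(a.get("src", "(no src)"))

html = """
<img src="/hero.webp" width="1200" height="630">
<img src="/team-photo.jpg">
<img src="/logo.svg" width="120">
"""
checker = ImageDimensionCheck()
checker.feed(html)
print(checker.unsized)  # ['/team-photo.jpg', '/logo.svg']
```

A production check would also account for CSS-sized and `aspect-ratio`-constrained images, which this static scan cannot see.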
RankPath's audit report flags many performance-related signals including image optimization, resource compression, and viewport configuration. These serve as a quick starting point before deeper PageSpeed diagnostics.
Step 5: Validate Structured Data
Structured data is one of the most overlooked areas of technical SEO, yet it has disproportionate impact on click-through rates. Rich results — enhanced SERP listings with star ratings, FAQ dropdowns, how-to steps, or article thumbnails — consistently outperform standard blue links in click-through rate.
For each key page type, verify:
- Is there a relevant JSON-LD schema implemented? (Article/BlogPosting for articles, Product for product pages, FAQPage for FAQ pages, HowTo for how-to guides, LocalBusiness for location pages)
- Does the schema pass Google's Rich Results Test without errors or warnings?
- Are all required properties populated with valid data? (e.g., `Article` requires `headline`, `author`, `datePublished`, and `publisher`)
- Is your sitewide `Organization` schema correctly implemented on the homepage?
RankPath checks for the presence and validity of multiple schema types — including Article, BlogPosting, FAQPage, and HowTo — as part of its 50+ SEO checks. It surfaces missing schema opportunities and flags known implementation errors, giving you a starting list for structured data improvements without manual page-by-page review.
Step 6: Review Security and Compliance
Security issues are both a direct ranking signal and an immediate user experience problem. A failed HTTPS check or expired SSL certificate can cause browsers to block your site entirely for users, producing a dramatic and immediate traffic drop.
Verify the following for your site:
- All HTTP URLs 301-redirect to HTTPS equivalents — check the homepage, internal links, and resources loaded by the page
- The SSL/TLS certificate is valid and covers all subdomains in use (wildcard certificate if necessary)
- No mixed content warnings — all resources loaded by HTTPS pages (images, scripts, stylesheets) are also served over HTTPS
- Security headers are present: at minimum, `X-Content-Type-Options: nosniff` and `X-Frame-Options: DENY`
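Mixed content can likewise be caught before deployment by scanning a page's HTML for `http://` resource references. A sketch, limited to tags that actually load subresources (plain anchor links to HTTP pages are navigation, not mixed content):

```python
from html.parser import HTMLParser

class MixedContentCheck(HTMLParser):
    """Finds http:// subresources referenced from a page intended to be
    served over HTTPS -- these trigger mixed content warnings or are
    blocked outright by modern browsers."""
    RESOURCE_TAGS = {"img", "script", "link", "iframe", "source", "video", "audio"}
    RESOURCE_ATTRS = {"src", "href"}

    def __init__(self):
        super().__init__()
        self.insecure = []

    def handle_starttag(self, tag, attrs):
        if tag not in self.RESOURCE_TAGS:
            return
        for name, value in attrs:
            if name in self.RESOURCE_ATTRS and value and value.startswith("http://"):
                self.insecure.append(value)

page = """
<link rel="stylesheet" href="https://cdn.example.com/site.css">
<script src="http://cdn.example.com/analytics.js"></script>
<img src="http://example.com/banner.png">
"""
checker = MixedContentCheck()
checker.feed(page)
print(checker.insecure)
# ['http://cdn.example.com/analytics.js', 'http://example.com/banner.png']
```

This static scan misses resources injected by JavaScript at runtime, so pair it with the browser's Network tab on key pages.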
You can verify HTTPS configuration and redirect behavior using browser developer tools (Network tab) or curl commands. For a quick automated check, RankPath's audit validates HTTPS status, detects common security header gaps, and flags mixed content on the audited page.
Step 7: Prioritize Fixes, Track Progress, and Schedule Re-Audits
A technical SEO audit is valuable only if its findings result in implemented fixes. This final step is where many audits fail — recommendations are written but never actioned, or are only partially addressed before attention moves elsewhere.
To ensure your audit results in real improvements:
- Create a fix-by-fix tracking document with columns for issue, affected URLs, priority level, assigned owner, and target resolution date. Share it with the full team — developers, content managers, and stakeholders — so everyone understands the scope and urgency.
- Verify fixes once implemented. Re-run RankPath's audit or use Google Search Console's URL Inspection tool to confirm that resolved issues are actually resolved, not just assumed to be.
- Schedule a re-audit: Run a full audit every quarter. Sites change constantly — new pages are published, CMS updates are deployed, redirects are added — and each change can introduce new technical issues. Continuous monitoring with an automated tool like RankPath's crawler gives you early warning before issues compound.
- Establish a technical SEO baseline. After your first audit and fix cycle, document your baseline scores. Use these as benchmarks for future audits to measure improvement and catch regressions.
For teams with limited developer bandwidth, focus ruthlessly on the critical and high-priority issues first. A perfect fix of five critical issues is more valuable than partial fixes across thirty low-priority items.
Interactive Technical SEO Audit Checklist
Use this checklist to track your progress through each audit area. Your progress is saved in your browser — you can return to this page and continue where you left off. Use the Print button to produce a physical copy for team walkthroughs.
Automate 50+ of These Checks
RankPath runs over 50 technical SEO checks on any URL automatically — including all the items above and more. Get a detailed report with specific fixes in under two minutes.
Run Free Audit Now →

Common Technical SEO Issues Found in Audits
Across the thousands of sites audited using RankPath's crawler, certain technical issues appear with striking regularity. Here are the most common findings, their ranking impact, and how to resolve them.
| Issue | Ranking Impact | Fix |
|---|---|---|
| Missing H1 tag | Moderate — weakens topical relevance signals on the page | Add a single, descriptive H1 containing the page's primary target keyword to every page |
| Duplicate title tags | High — prevents pages from differentiating for distinct queries | Write unique, keyword-optimized title tags for every page; template-based CMS sites often need a programmatic fix |
| Missing meta descriptions | Indirect — Google auto-generates snippets, often poorly | Write unique, compelling 150–160 character meta descriptions for all key pages |
| Broken internal links (404) | High — wastes crawl budget and breaks user experience | Identify with a site crawl; redirect broken URLs to the most relevant live page (301) |
| Missing image alt text | Moderate — misses image search traffic; hurts accessibility | Add descriptive, keyword-relevant alt text to all content images; decorative images should use alt="" |
| Redirect chains | Moderate — dilutes link equity; slows crawling and page load | Update the origin URLs to point directly to the final destination; eliminate intermediate redirects |
| No XML sitemap or outdated sitemap | Moderate for large sites — slower indexing of new content | Generate and maintain an XML sitemap; submit it in Google Search Console and update it dynamically when content changes |
| Missing Open Graph tags | Indirect — poor social sharing and AI citation signals | Add og:title, og:description, og:image, og:type, and og:url to all pages; use RankPath's OG Checker to validate |
| Slow server response time (TTFB >800ms) | High — directly impairs LCP Core Web Vitals score | Upgrade hosting, implement a CDN, enable server-side caching, or optimize database queries on dynamic sites |
| Missing structured data | Indirect — foregoes rich result eligibility, lower CTR | Add JSON-LD schemas appropriate to each page type; validate with Google's Rich Results Test |
Don't Chase Every Issue Equally
A technical SEO audit on a real website will surface dozens or hundreds of issues. Fixing them all simultaneously is rarely feasible. Use the priority framework from Step 3 to focus developer time on the changes with the highest ranking impact. A single critical fix — like removing an accidental noindex directive — can unlock more ranking improvement than resolving fifty low-priority warnings.
Tools for Technical SEO Auditing
The right toolset depends on your site size, team resources, and budget. Here's an honest comparison of the main options.
Google Search Console
Cost: Free
Best for: Real-world crawl and index data, Core Web Vitals field data, mobile usability reports
Limitation: Only shows data for your own verified sites; no competitor analysis; limited depth for large site crawls
RankPath
Cost: Free audit available at /check; paid plans for continuous monitoring
Best for: 50+ automated checks on any URL, fast first-pass audit, Open Graph and structured data validation, GEO citation signals
Limitation: Audits individual URLs; for full site crawls use with GSC or a dedicated crawler
Screaming Frog
Cost: Free up to 500 URLs; £259/year for unlimited
Best for: Deep site crawls, bulk URL analysis, redirect chain mapping, custom extraction
Limitation: Desktop application; requires technical familiarity; doesn't audit GEO or AI citation signals
Semrush / Ahrefs Site Audit
Cost: $100–$500+/month depending on tier
Best for: Enterprise sites, competitor analysis alongside technical auditing, integrated backlink and keyword data
Limitation: Significant monthly cost; complexity can be overkill for smaller sites or teams
For most sites — especially those under 10,000 pages — combining Google Search Console with RankPath's free audit covers the majority of critical technical checks without requiring paid tools or significant technical setup. Larger sites or agency workflows benefit from adding Screaming Frog for bulk crawl analysis alongside these core tools.
The Practical Starting Point
Run RankPath's free audit on your top 5–10 pages first. This takes five minutes and immediately surfaces the most common critical issues. Then use Google Search Console to investigate at scale. Add paid crawlers only when you need bulk analysis across thousands of URLs.
Frequently Asked Questions
How long does a technical SEO audit take?
The time depends heavily on site size and the depth of analysis. For a small site (under 100 pages), an automated audit with RankPath plus a manual review of Google Search Console data typically takes 2–4 hours. For a medium site (100–1,000 pages), expect a full audit to take one to two days, including documentation and prioritization. Enterprise-scale audits on sites with 10,000+ pages can take a week or more with a dedicated team. Automated tools like RankPath reduce the data collection phase to minutes, but analysis and fix planning still require human judgment.
How often should I run a technical SEO audit?
Run a comprehensive audit quarterly. Between full audits, use automated monitoring — either through RankPath's continuous monitoring or Google Search Console's alerts — to catch critical issues like crawl errors, manual actions, or Core Web Vitals regressions as they emerge. After major site changes (CMS migrations, redesigns, significant content additions), run an immediate audit regardless of your regular schedule. Technical issues introduced by site changes can compound rapidly if left undetected.
What's the difference between a technical SEO audit and a content SEO audit?
A technical SEO audit examines the infrastructure that determines whether search engines can access, index, and understand your site. It covers crawlability, indexability, page speed, mobile friendliness, structured data, and security. A content SEO audit examines whether your content targets the right keywords, covers topics with appropriate depth, aligns with search intent, and uses on-page elements (title tags, headings, internal links) effectively. Both are necessary for a complete SEO program, but technical issues must be resolved first — content optimization provides no benefit if pages can't be crawled or indexed.
Can I do a technical SEO audit without technical knowledge?
Yes, to a significant degree. Automated tools like RankPath identify technical issues in plain language without requiring you to understand the underlying mechanics. You'll know that "Your page has a missing H1 tag" or "Structured data errors found — 2 required fields missing" without needing to dig into raw HTTP headers or robots.txt syntax. However, resolving some issues — particularly server configuration, redirect setup, or CMS-level template changes — does require developer involvement. A useful approach: run the automated audit yourself, identify what needs fixing, then bring specific, prioritized issues to a developer with clear descriptions of the problem and its ranking impact.
What's the most impactful technical SEO fix for most websites?
Based on audit data across a wide range of sites, the three changes with the highest consistent impact are: (1) fixing broken internal links and redirect chains, which recovers wasted crawl budget and link equity; (2) resolving Core Web Vitals failures — particularly LCP — on key landing pages, which directly affects Google's page experience ranking signals; and (3) implementing or fixing structured data errors, which enables rich results that significantly improve click-through rates from search. If you're auditing your site for the first time and need to prioritize ruthlessly, address these three areas first.
What's the difference between a technical SEO audit and an SEO site audit?
The terms are often used interchangeably, but technically they differ in scope. A technical SEO audit focuses specifically on crawlability, indexability, performance, mobile usability, structured data, and security — the infrastructure layer. A full "SEO site audit" usually includes technical SEO plus content analysis, keyword gap analysis, backlink profile review, and competitive positioning. RankPath's audit is primarily a technical SEO audit with additional on-page signal analysis (title tags, meta descriptions, heading structure, Open Graph, and GEO citation readiness). For complete coverage, pair it with a keyword and content analysis in a platform like Ahrefs or Semrush.
How do I know if a technical SEO issue is actually affecting my rankings?
Correlation between a specific technical fix and a ranking improvement is hard to prove definitively, since many factors affect rankings simultaneously. The most reliable approach is to fix the issue, document the change with a date, and monitor your rankings and organic traffic in Google Search Console over the following 4–6 weeks. For high-impact issues — like removing an accidental noindex directive or fixing a crawl block — ranking recovery is often significant and relatively fast (2–4 weeks). For lower-impact issues like image optimization or structured data additions, the effect is more gradual and harder to attribute directly.
References
- Google Search Central: SEO Starter Guide
- web.dev: Core Web Vitals — LCP, INP, and CLS explained
- Google: Introduction to Structured Data
- Google Rich Results Test — validate your structured data
- Google: Introduction to robots.txt
- Google: XML Sitemaps Overview
- RankPath: SEO Ranking Factors — The Complete Checklist
- RankPath: GEO vs SEO — Understanding AI Search Optimization
- RankPath: The Complete Guide to Open Graph Tags