301 Redirect
Technical SEO - Intermediate
A permanent redirect from one URL to another, passing most link equity.
Used when moving pages to new URLs. Consolidates signals and is preferred over 302 for permanent moves.
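A 301 is easy to verify from a short script; the sketch below uses the Python requests library with a placeholder URL and simply prints the status code and the Location header of the old address.

    import requests

    # Request the old URL without following redirects so the raw response can be inspected
    resp = requests.head("https://example.com/old-page", allow_redirects=False)
    print(resp.status_code)                 # 301 for a permanent redirect
    print(resp.headers.get("Location"))     # the new URL the redirect points to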
Temporary redirect that doesn't pass link equity.
Used for temporary page moves or A/B testing. Should not be used for permanent URL changes.
Temporary redirect that preserves the request method (POST, GET).
Modern alternative to 302 redirects for temporary moves. Maintains HTTP method unlike 302.
Page not found error.
Too many 404s harm user experience and waste crawl budget. Skid identifies all 404 pages to help you fix or redirect them.
A strategic plan generated by AI showing month-by-month actions with expected KPIs.
Shows semantic and technical improvements organized by month, with expected KPIs to improve your website's SEO performance.
Content visible without scrolling.
Critical for First Contentful Paint and user experience. Should load quickly and contain key information.
LLM-powered analysis that translates technical SEO issues into business-focused action plans.
Provides clear priorities, effort estimates, and 6-month roadmaps to improve your website's SEO performance.
Descriptive text for images that helps search engines understand image content.
Improves accessibility for screen readers and helps images rank in image search.
Clickable text in a link.
Helps search engines understand link destination content. Should be natural and varied, avoiding over-optimization.
A comprehensive SEO analysis of a domain.
Includes crawling up to 2,000 pages, checking 200+ SEO checkpoints, running PageSpeed Insights, and generating AI-powered recommendations.
Mean ranking position for a keyword across all searches.
Lower numbers (closer to 1) are better. Tracked in Google Search Console. Monitor for ranking improvements.
Links from other websites pointing to your site.
Primary ranking factor in Google's algorithm. Quality and relevance matter more than quantity.
Percentage of visitors leaving after viewing only one page.
High bounce rates may indicate poor content match or bad user experience. Context matters - some pages naturally have high bounce rates.
Navigation trail showing page hierarchy (Home > Category > Product).
Helps users and search engines understand site structure. Can appear in search results with schema markup.
Links pointing to non-existent pages (404s).
Harm user experience and waste crawl budget. Should be fixed or removed regularly. Skid identifies broken internal links during audits.
An HTML link element that tells search engines which version of a page is the 'master' copy.
Prevents duplicate content issues. Critical for SEO health. Use <link rel='canonical' href='...'>; self-referential on unique pages.
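One quick way to confirm which canonical a page declares is to parse its HTML; a small sketch using requests and BeautifulSoup, with a placeholder URL:

    import requests
    from bs4 import BeautifulSoup

    html = requests.get("https://example.com/product?color=red").text
    soup = BeautifulSoup(html, "html.parser")
    tag = soup.find("link", attrs={"rel": "canonical"})
    # On a unique page this should point back to the page itself (self-referential)
    print(tag.get("href") if tag else "no canonical declared")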
Distributed network of servers that deliver content from locations closest to users.
Reduces latency and improves page load times globally. Critical for international sites and high-traffic websites.
Percentage of impressions that result in clicks.
Average CTR varies by position - #1 typically gets 25-40% CTR. Optimizing titles and descriptions improves CTR.
Measures visual stability - how much page content shifts unexpectedly during loading.
Good CLS is under 0.1. High CLS frustrates users and hurts rankings. Reserve space with width/height or aspect-ratio, preload fonts, and avoid inserting content above existing elements.
SEO strategy organizing content around pillar pages and related cluster pages.
Demonstrates topical authority and improves internal linking. Helps search engines understand your expertise in a subject area.
How recently content was updated.
Fresh, updated content can rank better for time-sensitive queries. Important for news, trends, and competitive topics.
Percentage of visitors completing desired action (purchase, signup, download).
Ultimate measure of SEO ROI beyond just traffic. Tracks business impact of SEO efforts.
Major changes to Google's ranking algorithm released several times per year.
Can significantly impact rankings. Require monitoring and potential site adjustments. Examples: Panda, Penguin, BERT, Core Updates.
Google's key metrics for user experience: Largest Contentful Paint (LCP), Cumulative Layout Shift (CLS), and Interaction to Next Paint (INP); lab tools such as Lighthouse report Total Blocking Time (TBT) as a proxy for interactivity.
These directly impact search rankings and user satisfaction. Good scores: LCP ≤ 2.5s, CLS ≤ 0.1, INP ≤ 200ms (TBT ≤ 200ms in lab tests).
The number of pages search engines will crawl on your site in a given timeframe.
Larger sites must optimize to ensure important pages get crawled regularly. Proper robots.txt and sitemap configuration helps maximize crawl budget efficiency.
How many clicks it takes to reach a page from the homepage.
Pages deeper than 3-4 clicks are harder for search engines to discover and may receive less crawl priority. Important content should be within 3 clicks of the homepage.
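Click depth can be computed from an internal link graph with a breadth-first search; a rough sketch assuming a links dict that maps each URL to the URLs it links to:

    from collections import deque

    def click_depth(links, home):
        depth = {home: 0}
        queue = deque([home])
        while queue:
            page = queue.popleft()
            for target in links.get(page, []):
                if target not in depth:          # first time reached = shortest path
                    depth[target] = depth[page] + 1
                    queue.append(target)
        return depth  # pages missing from the result are unreachable from the homepage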
Minimum CSS needed to render above-the-fold content.
Inlined in HTML to speed up initial page render. Rest of CSS loaded asynchronously. Improves FCP and LCP.
Performance score (0-100) for desktop devices.
While mobile is prioritized, desktop performance still matters for user experience and conversions.
Normal links that pass PageRank and link equity.
Most valuable for SEO as they contribute to ranking authority. Default link type unless nofollow is specified.
Third-party metric (Moz) predicting ranking potential on a 1-100 scale.
Based on backlink profile. Not used by Google but useful for competitor analysis and link building prioritization.
Identical or very similar content appearing on multiple URLs.
Confuses search engines and dilutes ranking potential. Canonical tags help resolve this.
How long users stay on your page before returning to search results.
Longer dwell time suggests content satisfies user intent. Google may use this as a ranking signal.
Serving pre-rendered HTML to search engines while serving JavaScript-heavy version to users.
Helps with JavaScript SEO but adds complexity. Consider SSR or SSG instead for better maintainability.
Experience, Expertise, Authoritativeness, Trustworthiness.
Google's quality evaluation framework. Critical for YMYL (Your Money or Your Life) topics. Demonstrating E-E-A-T improves rankings.
Processing data closer to users rather than in centralized servers.
Reduces latency for dynamic content and personalization. Used by modern CDNs and serverless platforms.
Filter system that creates multiple URL variations (common in e-commerce).
Can cause duplicate content and crawl budget issues if not managed properly. Use canonical tags or noindex on filter pages.
Selected search result displayed above organic results.
Appears as paragraph, list, table, or video. Provides direct answers to queries. Can significantly increase visibility.
Field data comes from real users (Chrome User Experience Report), while lab data comes from controlled tests (Lighthouse).
Both are shown in PSI. Field data reflects actual user experience but requires sufficient traffic. Lab data is consistent and reproducible but may not reflect all user conditions.
Measures when the first text or image is painted on the screen.
Good FCP is under 1.8 seconds. Faster FCP improves perceived loading performance and user experience.
Serving different content based on user location.
Important for international sites and local businesses. Configured through hreflang and Search Console.
Free tool showing how Google sees your site.
Provides search performance data, indexing status, mobile usability, and security issues. Essential for SEO monitoring.
Track PSI scores and SEO metrics over time to measure improvement.
Identify trends in your website's technical health and monitor progress across multiple audits.
The main heading tag on a page, indicating the primary topic.
Each page should have exactly one H1 that clearly describes the content.
Proper structure of H1-H6 tags on a page.
Should flow logically (H1 → H2 → H3) to help search engines and users understand content organization. Don't skip heading levels.
HTML attributes that tell search engines which language and regional version of a page to serve to different users.
Essential for international sites. Should be reciprocal and consistent.
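Hreflang annotations can be generated from a simple mapping of locales to URLs; an illustrative sketch where the locales and URLs are placeholders and x-default marks the fallback version:

    alternates = {
        "en-us": "https://example.com/en-us/",
        "de-de": "https://example.com/de-de/",
        "x-default": "https://example.com/",
    }
    # Every language version should carry the full, identical set of annotations (reciprocity)
    for lang, url in alternates.items():
        print(f'<link rel="alternate" hreflang="{lang}" href="{url}" />')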
Three-digit codes indicating how a server responded to a request.
2xx = success, 3xx = redirect, 4xx = client error, 5xx = server error. Skid tracks all status codes found during crawls.
Network protocol that allows multiple requests over single connection.
Improves page load speed through multiplexing, server push, and header compression. Most modern sites use HTTP/2.
Latest HTTP protocol using QUIC transport.
Provides faster connection establishment and better performance on unreliable networks. Gradually being adopted.
Compressing and formatting images for web delivery.
Includes choosing right formats (WebP, AVIF), sizing appropriately, and adding alt text. Critical for page speed and user experience.
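Format conversion itself can be scripted; a short sketch using the Pillow library to produce a WebP version of an image (file names are placeholders):

    from PIL import Image

    img = Image.open("hero.jpg")
    img.save("hero.webp", "WEBP", quality=80)   # typically much smaller than the JPEG original
    # Remember to also set width/height (or aspect-ratio) and alt text in the HTML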
Number of times your site appeared in search results.
High impressions with low clicks indicate poor titles/descriptions or wrong targeting. Track in Search Console.
Web pages that search engines can crawl, understand, and include in their search results.
Pages blocked by robots.txt or meta robots tags are non-indexable. Skid identifies indexability issues during audits.
Links from one page to another within the same website.
Proper internal linking distributes page authority and helps users navigate.
Designing websites to support multiple languages and regions.
Involves hreflang tags, proper URL structure, and localized content. Essential for global businesses.
Optimizing JavaScript-heavy sites to ensure search engines can crawl and render content properly.
Critical for modern web applications. Ensure content is server-rendered or properly hydrated for crawlers.
When multiple pages compete for the same keyword.
Confuses search engines and dilutes ranking potential. Should be consolidated or differentiated to target different aspects.
Process of finding terms people search for related to your business.
Involves analyzing search volume, competition, and user intent. Foundation of all SEO strategy.
Technique that defers loading of non-critical resources until they're needed.
Improves initial page load time by loading images and other resources below the fold only when user scrolls near them.
Measures loading performance - the time it takes for the largest visible content element to appear.
Good LCP is under 2.5 seconds. Affects both SEO and user experience. Improve by lowering TTFB, compressing/prioritizing images, inlining critical CSS, and lazy-loading non-critical media.
Google's automated tool that audits performance, accessibility, SEO, and best practices.
PSI uses Lighthouse under the hood. Scores range from 0-100. Provides actionable recommendations for improvement.
Value passed through links from one page to another.
Internal linking strategy distributes link equity to important pages. Influenced by anchor text and page authority.
Finding and fixing broken links to your site or unlinked brand mentions.
Recovers lost link equity and brand visibility. Effective link building strategy with high success rate.
Rate at which site acquires backlinks.
Sudden spikes can appear unnatural. Steady, consistent growth is healthier and more sustainable.
Examining server logs to understand how search engines crawl your site.
Reveals crawl budget usage, bot behavior, and technical issues. Advanced technique for large sites.
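A basic version of this is counting Googlebot hits per URL in an access log; a sketch assuming a common/combined log format where the request line is the first quoted field:

    from collections import Counter

    hits = Counter()
    with open("access.log") as log:                       # path is a placeholder
        for line in log:
            if "Googlebot" in line:
                try:
                    path = line.split('"')[1].split()[1]  # 'GET /page HTTP/1.1' -> /page
                    hits[path] += 1
                except IndexError:
                    continue                              # skip malformed lines
    for path, count in hits.most_common(10):
        print(count, path)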
Longer, more specific search phrases with lower volume but higher conversion rates.
Easier to rank for than broad, competitive keywords. Often have clearer user intent and better conversion potential.
Terms conceptually related to your main keyword.
Help search engines understand content context and topical relevance. Natural use improves content quality signals.
A brief summary (150-160 characters) that appears in search results.
While not a direct ranking factor, good descriptions improve click-through rates. Aim for clear value and CTA.
HTML meta tag that provides page-level instructions to search engines.
Can control indexing, link following, snippet display, and caching. Common values: noindex, nofollow, noarchive, nosnippet.
Performance score (0-100) for mobile devices.
Mobile-first indexing means this score is critical for SEO. Skid tracks this for every audit.
Google's practice of primarily using the mobile version of a site for indexing and ranking.
Mobile optimization is now mandatory, not optional. Mobile PSI scores directly impact rankings.
Links with rel='nofollow' attribute that don't pass PageRank.
Used for untrusted content, paid links, or user-generated content. Google may still use them for discovery.
Directive that tells search engines not to include a page in their index.
Used for duplicate content, thank you pages, or pages not meant for search results. Applied via robots meta tag or X-Robots-Tag HTTP header.
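Both delivery methods can be checked in a single request; a small sketch using requests and BeautifulSoup with a placeholder URL:

    import requests
    from bs4 import BeautifulSoup

    resp = requests.get("https://example.com/thank-you")
    header = resp.headers.get("X-Robots-Tag", "")
    meta = BeautifulSoup(resp.text, "html.parser").find("meta", attrs={"name": "robots"})
    content = meta.get("content", "") if meta else ""
    if "noindex" in header.lower() or "noindex" in content.lower():
        print("page is excluded from the index")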
Meta tags that control how content appears when shared on social media platforms.
Includes title, description, and image. Used by Facebook, LinkedIn, Twitter, and other platforms for rich social previews.
Visitors arriving from unpaid search results.
Primary SEO success metric. Should grow steadily with good SEO practices. Track in Google Analytics.
Pages without any internal links pointing to them.
These are difficult for search engines to discover and may not be indexed. Should be linked from sitemap or other pages to ensure discovery.
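Orphan pages can be surfaced by comparing the URLs listed in the sitemap with the URLs a crawl actually reached by following internal links; a sketch assuming both sets were collected elsewhere:

    sitemap_urls = {"https://example.com/", "https://example.com/a", "https://example.com/old-offer"}
    linked_urls = {"https://example.com/", "https://example.com/a"}   # reached via internal links

    orphans = sitemap_urls - linked_urls
    for url in sorted(orphans):
        print("orphan page (in sitemap, no internal links):", url)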
Google's tool for measuring website performance.
Skid integrates PSI to provide mobile and desktop performance scores (0-100) for every audited domain. PSI combines field data from the Chrome UX Report with Lighthouse lab results.
Splitting content across multiple pages.
Google no longer uses rel=next/prev as an indexing signal, so paginated or load-more content must remain reachable through crawlable links (with each page self-canonical) to ensure all content can be discovered.
Google feature showing related questions users search for.
Provides content opportunities and insight into user intent. Answering PAA questions can improve rankings.
High-impact, low-effort SEO improvements that can be implemented quickly.
Boost your site's performance and rankings with actionable recommendations that deliver fast results.
Multiple redirects in sequence (A→B→C).
Slows page load and wastes crawl budget. Should be fixed to redirect directly to final destination.
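A chain is easy to spot by letting an HTTP client follow redirects and inspecting its history; a sketch with requests and a placeholder URL:

    import requests

    resp = requests.get("https://example.com/a", allow_redirects=True)
    for hop in resp.history:                  # every intermediate redirect
        print(hop.status_code, hop.url)
    print(resp.status_code, resp.url)         # final destination
    if len(resp.history) > 1:
        print("redirect chain: point the first URL straight at the final destination")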
How JavaScript content is delivered: Client-Side Rendering (CSR), Server-Side Rendering (SSR), or Static Site Generation (SSG).
Affects how search engines see content. SSR and SSG are better for SEO than pure CSR.
Browser directives (preload, prefetch, preconnect, dns-prefetch) that optimize resource loading.
Helps browser anticipate and fetch resources earlier. Use carefully to avoid wasting bandwidth.
HTML tag that gives instructions to search engine crawlers about indexing and following links on specific pages.
Common values: noindex, nofollow, noarchive. Placed in the <head> section of HTML pages to control how search engines handle individual pages.
A file that tells search engine crawlers which pages they can or cannot access on your website.
Proper configuration ensures important pages are crawled while protecting sensitive areas. Skid respects robots.txt during crawling.
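Whether a given URL is crawlable can be checked with Python's standard-library parser; a small sketch with placeholder URLs:

    from urllib.robotparser import RobotFileParser

    rp = RobotFileParser("https://example.com/robots.txt")
    rp.read()
    print(rp.can_fetch("Googlebot", "https://example.com/admin/"))    # False if disallowed
    print(rp.can_fetch("Googlebot", "https://example.com/blog/post"))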
Collaborative vocabulary for structured data markup.
Allows search engines to understand content context and display rich results like ratings, prices, events, and more in search results.
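Structured data is usually embedded as JSON-LD inside a script tag; a sketch that builds a hypothetical Product snippet with Python's json module (all values are placeholders):

    import json

    product = {
        "@context": "https://schema.org",
        "@type": "Product",
        "name": "Example Widget",
        "aggregateRating": {"@type": "AggregateRating", "ratingValue": "4.6", "reviewCount": "87"},
        "offers": {"@type": "Offer", "price": "19.99", "priceCurrency": "USD"},
    }
    # Paste the output into the page inside <script type="application/ld+json"> ... </script>
    print(json.dumps(product, indent=2))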
An open-source Python framework for web scraping and crawling.
Skid uses Scrapy to efficiently crawl websites, respecting robots.txt and collecting technical SEO data like meta tags, titles, canonical links, and page structure.
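A stripped-down spider in that style might look like the sketch below; this is an illustrative example rather than Skid's actual code, and the domain is a placeholder:

    import scrapy

    class AuditSpider(scrapy.Spider):
        name = "audit"
        allowed_domains = ["example.com"]
        start_urls = ["https://example.com/"]

        def parse(self, response):
            # Collect a few SEO-relevant fields from every crawled page
            yield {
                "url": response.url,
                "title": response.css("title::text").get(),
                "meta_description": response.css('meta[name="description"]::attr(content)').get(),
                "canonical": response.css('link[rel="canonical"]::attr(href)').get(),
            }
            # Follow internal links to keep crawling
            for href in response.css("a::attr(href)").getall():
                yield response.follow(href, callback=self.parse)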
The reason behind a user's search query.
Four types: informational, navigational, commercial, and transactional. Matching intent is crucial for rankings.
Time To First Byte - how quickly a server responds to requests.
Fast TTFB (under 200ms) is crucial for good page speed and SEO. Affects LCP and overall user experience.
An XML file listing all important URLs on your website, helping search engines discover and index your content more efficiently.
Skid automatically checks for and analyzes your sitemap to understand your site structure and identify important pages.
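A sitemap can be read with the standard library; a small sketch that lists the URLs in a standard XML sitemap (the sitemap URL is a placeholder):

    import requests
    import xml.etree.ElementTree as ET

    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    xml = requests.get("https://example.com/sitemap.xml").content
    root = ET.fromstring(xml)
    for loc in root.findall("sm:url/sm:loc", ns):
        print(loc.text.strip())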
Page that returns 200 status but shows 'not found' content.
Search engines may treat as actual 404s. Should return proper 404 status code to avoid confusion.
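A rough way to flag candidates is to look for "not found" wording on pages that return 200; the URL and phrases below are placeholder heuristics, not a definitive test:

    import requests

    resp = requests.get("https://example.com/discontinued-item")
    phrases = ("page not found", "no longer available", "doesn't exist")
    if resp.status_code == 200 and any(p in resp.text.lower() for p in phrases):
        print("possible soft 404: returns 200 but looks like an error page")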
Measures how quickly content is visually displayed during page load.
Lower is better. Shows how quickly the page appears to load from a user's perspective. Combines FCP and visual completeness.
Code that helps search engines understand page content and display rich results.
Improves visibility and click-through rates. Can show stars, prices, events, and more in search results.
Measures interactivity - the total time the page is blocked from responding to user input.
Good TBT is under 200ms. High TBT means poor user experience on mobile. Split long tasks (>50ms), code-split, and defer non-critical scripts to reduce main-thread blocking.
Pages with little value to users - insufficient information, duplicate content, or auto-generated text.
Can result in ranking penalties. Google prefers comprehensive, original, valuable content.
Measures when a page becomes fully interactive and can reliably respond to user input.
Good TTI is under 3.8 seconds on mobile. Indicates when users can actually interact with the page, not just see it.
The HTML <title> element that appears in search results and browser tabs.
Optimal length is 50-60 characters. Skid checks for missing, duplicate, or overly long titles.
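Missing, overly long, and duplicate titles are straightforward to flag once titles have been collected per URL; a sketch assuming a pages dict of URL to title with placeholder data:

    from collections import Counter

    pages = {
        "https://example.com/": "Example Shop - Hand-made Widgets",
        "https://example.com/about": "Example Shop - Hand-made Widgets",   # duplicate
        "https://example.com/contact": "",                                 # missing
    }
    counts = Counter(pages.values())
    for url, title in pages.items():
        if not title:
            print("missing title:", url)
        elif len(title) > 60:
            print("title over 60 characters:", url)
        elif counts[title] > 1:
            print("duplicate title:", url)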
Low-quality or spammy backlinks that can harm your site's ranking.
Should be disavowed through Google Search Console to prevent negative SEO impact.
Query strings in URLs (e.g., ?id=123).
Can create duplicate content if not handled properly. Use canonical tags or parameter handling in Google Search Console.
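Tracking and session parameters can be stripped when normalizing URLs; a sketch using the standard library, where the parameter names treated as noise are assumptions:

    from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

    NOISE = {"utm_source", "utm_medium", "utm_campaign", "sessionid"}

    def normalize(url):
        parts = urlsplit(url)
        kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in NOISE]
        return urlunsplit(parts._replace(query=urlencode(kept)))

    print(normalize("https://example.com/item?id=123&utm_source=news"))
    # -> https://example.com/item?id=123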
The organization and format of your website's URLs.
Clean, descriptive URLs improve SEO and user experience. Avoid deep nesting and use keywords naturally.
Identifier that crawlers use to introduce themselves when accessing websites.
Different search engines use different user-agents (Googlebot, Bingbot, etc.). Websites can serve different content based on user-agent, though this should be done carefully.
The automated process of systematically browsing websites to discover and analyze web pages.
Skid uses Scrapy, a Python-based web scraping framework, to crawl up to 2,000 pages per domain and extract SEO-relevant data like meta tags, titles, canonical links, and page structure.
A file listing all URLs on your site with metadata about each page's importance and update frequency.
Submitted to Google Search Console to aid crawling. Keep it clean, current, and accessible at /sitemap.xml.
A sitemap file that lists other sitemap files.
Used for large sites with multiple sitemaps to stay within the 50,000 URL limit per sitemap. Helps organize and manage sitemaps for sites with thousands of pages.