An XML sitemap is a file that lists all the important URLs on your website, helping search engines discover and crawl your content efficiently. Without a sitemap, crawlers may miss new pages or deep content. Sitemaps are especially critical for large sites (1000+ pages), new domains with few inbound links, and sites with rich media content like videos or images.
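A minimal sitemap listing two pages looks like this (the domain and dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://yourdomain.com/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
  <url>
    <loc>https://yourdomain.com/blog/example-post</loc>
    <lastmod>2024-01-10</lastmod>
  </url>
</urlset>
```

Each `<url>` entry needs only a `<loc>`; `<lastmod>` is optional but helps crawlers prioritise recently changed pages.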
A valid XML sitemap must: follow the sitemap protocol schema (xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"), list only canonical, indexable URLs (not redirects or noindex pages), keep file size under 50MB uncompressed, and contain no more than 50,000 URLs per file. A lastmod date in W3C Datetime format (e.g. YYYY-MM-DD) is optional under the protocol but recommended, and should reflect genuine content changes. Use a sitemap index for larger sites.
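A minimal sketch of how the 50,000-URL limit translates into multiple sitemap files referenced by an index; `chunk_urls` and `sitemap_filenames` are illustrative helper names, not part of any standard library:

```python
import math

MAX_URLS_PER_FILE = 50_000  # sitemap protocol limit per file


def chunk_urls(urls, limit=MAX_URLS_PER_FILE):
    """Split a flat URL list into sitemap-sized chunks."""
    return [urls[i:i + limit] for i in range(0, len(urls), limit)]


def sitemap_filenames(urls, base="sitemap"):
    """One sitemap file per chunk, e.g. sitemap-1.xml, sitemap-2.xml."""
    n = max(1, math.ceil(len(urls) / MAX_URLS_PER_FILE))
    return [f"{base}-{i}.xml" for i in range(1, n + 1)]


# 120,000 URLs -> three files (50k + 50k + 20k), all listed in a sitemap index
urls = [f"https://yourdomain.com/page-{i}" for i in range(120_000)]
chunks = chunk_urls(urls)
```

Each generated file would then be listed as a `<sitemap><loc>…</loc></sitemap>` entry inside a `<sitemapindex>` document at the index URL you submit.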
Submit your sitemap in Google Search Console: go to Sitemaps → enter your sitemap URL (e.g. yourdomain.com/sitemap.xml) → Submit. You can also reference your sitemap in robots.txt with 'Sitemap: https://yourdomain.com/sitemap.xml' — this allows any crawler (not just Google) to discover it automatically without manual submission.
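In robots.txt, the Sitemap directive sits alongside your crawl rules and must use the full absolute URL (yourdomain.com is a placeholder):

```
User-agent: *
Allow: /

Sitemap: https://yourdomain.com/sitemap.xml
```

The directive is independent of any User-agent group, so a single line covers all crawlers that read the file.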
Your sitemap should update automatically whenever you publish or update content. Most CMS platforms (WordPress, Shopify, Webflow) generate dynamic sitemaps. If your sitemap is static, update it manually whenever you add or remove significant pages. Stale sitemaps that list deleted URLs or omit new pages waste crawl budget and slow Google's discovery of fresh content.
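A sketch of the staleness check described above: compare the URLs in your sitemap against the URLs actually live on the site. The function names and sample URLs are illustrative, not from any particular tool:

```python
def find_stale_entries(sitemap_urls, live_urls):
    """URLs still listed in the sitemap that no longer exist on the site."""
    return sorted(set(sitemap_urls) - set(live_urls))


def find_missing_entries(sitemap_urls, live_urls):
    """Live URLs the sitemap has not picked up yet."""
    return sorted(set(live_urls) - set(sitemap_urls))


sitemap = ["https://yourdomain.com/", "https://yourdomain.com/old-page"]
live = ["https://yourdomain.com/", "https://yourdomain.com/new-post"]

stale = find_stale_entries(sitemap, live)      # deleted page still listed
missing = find_missing_entries(sitemap, live)  # new post not yet listed
```

In practice the live URL list would come from your CMS database or a crawl of the site; a non-empty result in either direction means the sitemap needs regenerating.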