auxmeta / docs / getting-started
AuxMeta is an SEO monitoring and automation platform that crawls your websites, detects technical issues, generates sitemaps and robots.txt files, and monitors your pages continuously so you catch ranking-killing problems before they cost you traffic.
This documentation covers every feature available in AuxMeta — from adding your first site to configuring the white-label API. Use the sidebar to jump to any section.
10K+ websites monitored · 50M+ pages crawled · <24h average issue detection
Go to auxmeta.com/sign-up and create a free account. No credit card is required. You can sign up with Google or GitHub via secure OAuth, or with an email address.
After logging in, click Add Site in the dashboard. Enter your domain including the protocol:
```
https://yourdomain.com
```

AuxMeta validates the URL, confirms it is reachable, and immediately queues a discovery crawl. Within 2 minutes your site card will show the number of pages discovered, any critical errors found, and your Core Web Vitals snapshot.
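The protocol requirement can also be checked client-side before submitting. A minimal sketch of that validation, using a hypothetical `validate_site_url` helper (not AuxMeta's actual code):

```python
from urllib.parse import urlparse

def validate_site_url(raw: str) -> str:
    """Illustrative check mirroring what Add Site enforces: the URL
    must include an http(s) protocol and a hostname.
    (Hypothetical helper, not AuxMeta's implementation.)"""
    parsed = urlparse(raw.strip())
    if parsed.scheme not in ("http", "https"):
        raise ValueError("Include the protocol, e.g. https://yourdomain.com")
    if not parsed.netloc:
        raise ValueError("Missing domain name")
    # Normalize: drop any path/trailing slash so duplicates match
    return f"{parsed.scheme}://{parsed.netloc}"

print(validate_site_url("https://yourdomain.com/"))  # https://yourdomain.com
```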
The main dashboard surfaces five key panels:
auxmeta / docs / core-features
AuxMeta's crawler discovers every URL on your site by following internal links from your homepage and sitemap. It supports both standard (HTML) and JavaScript-rendered pages (headless crawl on Pro / Premium).
Two crawl modes are available from the Sites panel → Start Crawl:
After each crawl, AuxMeta generates a prioritized Issues list. Issues are classified into four severity levels:
URL Monitoring watches your highest-value pages between crawls. Configure it from Sites → Monitor URLs. AuxMeta fires an alert when:
Alerts are delivered via email or SMS (Pro). Configure your notification preferences under Settings → Notifications.
Scheduled Crawling removes the need to remember to run audits manually. From the Sites panel → Crawl Settings, configure:
The Reports section gives you a longitudinal view of your site's SEO health. Widgets include:
auxmeta / docs / tools
AuxMeta generates a clean, submission-ready sitemap.xml from your crawl data — only canonical 200-status URLs without noindex or robots.txt blocks.
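The filtering rules above can be sketched as a small function. This is an illustrative reimplementation, not AuxMeta's code, and the page-dict fields (`status`, `noindex`, `robots_blocked`, `canonical`) are assumed names:

```python
from xml.etree.ElementTree import Element, SubElement, tostring

def build_sitemap(pages: list[dict]) -> str:
    """Keep only canonical, 200-status URLs that are neither noindex'd
    nor blocked by robots.txt, then emit sitemap.xml markup.
    (Sketch; AuxMeta's crawl schema may differ.)"""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = Element("urlset", xmlns=ns)
    for page in pages:
        if page["status"] != 200 or page["noindex"] or page["robots_blocked"]:
            continue
        if page["url"] != page["canonical"]:
            continue  # skip non-canonical duplicates
        url = SubElement(urlset, "url")
        SubElement(url, "loc").text = page["url"]
    return tostring(urlset, encoding="unicode")

pages = [
    {"url": "https://yourdomain.com/", "canonical": "https://yourdomain.com/",
     "status": 200, "noindex": False, "robots_blocked": False},
    {"url": "https://yourdomain.com/admin/", "canonical": "https://yourdomain.com/admin/",
     "status": 200, "noindex": True, "robots_blocked": False},
]
print(build_sitemap(pages))  # only the homepage survives the filters
```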
```
Sitemap: https://yourdomain.com/sitemap.xml
```

The Robots.txt Generator provides a visual rule builder. Select the user-agent, define allow and disallow paths, and AuxMeta outputs a validated file you can deploy directly.
```
User-agent: *
Disallow: /admin/
Disallow: /private/
Allow: /
Sitemap: https://yourdomain.com/sitemap.xml
```

AuxMeta's AI analyzes your crawled pages against search intent signals and identifies quick wins and content gaps. For each suggestion you get a target keyword, the current best-ranking page, a content brief, and internal linking recommendations.
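The robots.txt rule builder described above reduces to assembling directive lines per user-agent. A minimal sketch with a hypothetical `build_robots_txt` helper that reproduces the example file:

```python
def build_robots_txt(user_agent: str, disallow: list[str], allow: list[str],
                     sitemap: str) -> str:
    """Assemble a robots.txt the way a visual rule builder would:
    one User-agent group, then Disallow/Allow paths, then the Sitemap
    reference. (Hypothetical helper, not AuxMeta's implementation.)"""
    lines = [f"User-agent: {user_agent}"]
    lines += [f"Disallow: {path}" for path in disallow]
    lines += [f"Allow: {path}" for path in allow]
    lines.append(f"Sitemap: {sitemap}")
    return "\n".join(lines) + "\n"

print(build_robots_txt("*", ["/admin/", "/private/"], ["/"],
                       "https://yourdomain.com/sitemap.xml"))
```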
Headless Crawl runs your pages in a real Chromium browser and captures the fully JavaScript-rendered DOM — essential for React, Next.js, Vue, Angular, lazy-loaded content, and accurate Core Web Vitals measurement.
auxmeta / docs / integrations
Connecting GSC enriches your AuxMeta reports with real traffic data — impressions, clicks, average position, and CTR for every page and keyword. To connect: Settings → Integrations → Google Search Console → Authorize.
AuxMeta integrates with Google Analytics 4 and other platforms via webhook. Once connected, AuxMeta can overlay traffic data on issue timelines — so you can see exactly when a crawl issue started affecting your traffic.
AuxMeta provides native plugins for WordPress and Webflow, and a generic webhook trigger for headless CMS platforms (Contentful, Sanity, Strapi) to automatically trigger a crawl on content publish.
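A headless-CMS webhook handler only needs to call the crawl-trigger endpoint (`POST /v1/sites/:id/crawls`, listed in the API section). A sketch that builds that request; the `site_id` mapping and the `reason` payload field are assumptions, not documented behavior:

```python
import json
import urllib.request

AUXMETA_API = "https://api.auxmeta.com/v1"

def crawl_request_for_publish(event: dict, api_key: str) -> urllib.request.Request:
    """Build the API call a CMS publish webhook would fire.
    Assumes the event carries the 'site_id' you mapped when wiring
    the webhook. (Sketch only; payload fields are assumptions.)"""
    url = f"{AUXMETA_API}/sites/{event['site_id']}/crawls"
    return urllib.request.Request(
        url,
        data=json.dumps({"reason": "content_publish"}).encode(),
        headers={"Authorization": f"Bearer {api_key}",
                 "Content-Type": "application/json"},
        method="POST",
    )

req = crawl_request_for_publish({"site_id": "site_123"}, "YOUR_API_KEY")
print(req.full_url)  # https://api.auxmeta.com/v1/sites/site_123/crawls
```

Pass the request to `urllib.request.urlopen` (with a real site ID and key) to actually trigger the crawl.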
auxmeta / docs / api
```
https://api.auxmeta.com/v1
```

```shell
curl https://api.auxmeta.com/v1/sites \
  -H "Authorization: Bearer YOUR_API_KEY"
```

```
GET    /v1/sites        # List all sites
POST   /v1/sites        # Add a site
GET    /v1/sites/:id    # Get a site
DELETE /v1/sites/:id    # Delete a site
```

```
POST /v1/sites/:id/crawls     # Trigger a crawl
GET  /v1/sites/:id/crawls     # List crawl history
GET  /v1/crawls/:crawl_id     # Get crawl results
```

```
GET /v1/sites/:id/issues                      # All issues
GET /v1/sites/:id/issues?severity=critical    # Filter by severity
GET /v1/sites/:id/issues?format=json          # Export as JSON
```

auxmeta / docs / account
| Feature | Free | Pro | Premium |
|---|---|---|---|
| Sites | 3 | 15 | 50 |
| Sitemap Generator | ✓ | ✓ | ✓ |
| Robots.txt Generator | ✓ | ✓ | ✓ |
| Manual Crawls | ✓ | ✓ | ✓ |
| Scheduled Crawls | — | ✓ | ✓ |
| AI SEO Suggestions | — | ✓ | ✓ |
| Alerts (Email/SMS) | — | ✓ | ✓ |
| Team Access | — | 5 users | Unlimited |
| Headless Crawl | — | — | ✓ |
| White-label API | — | — | ✓ |
| SLA Support | — | — | ✓ |
Pro accounts support up to 5 team members. Invite members from Settings → Team. Premium accounts add role-based permissions (Admin, Editor, Viewer), SSO via SAML/OIDC, and audit logs.
Generate API keys from the dashboard under API Keys. Each key is displayed once — save it immediately.
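With a key in hand, API calls are plain bearer-token HTTP. A sketch using Python's standard library to build an issues query; the endpoint paths come from the API section above, and the response format is not shown here:

```python
import urllib.request
from urllib.parse import urlencode

BASE = "https://api.auxmeta.com/v1"

def issues_request(site_id: str, api_key: str, **filters) -> urllib.request.Request:
    """Build a GET /v1/sites/:id/issues request with optional query
    filters such as severity=critical. (Sketch only; everything beyond
    the documented endpoint paths is an assumption.)"""
    url = f"{BASE}/sites/{site_id}/issues"
    if filters:
        url += "?" + urlencode(filters)
    return urllib.request.Request(
        url, headers={"Authorization": f"Bearer {api_key}"})

req = issues_request("site_123", "YOUR_API_KEY", severity="critical")
print(req.full_url)  # https://api.auxmeta.com/v1/sites/site_123/issues?severity=critical
```

Execute it with `urllib.request.urlopen(req)` once you substitute a real site ID and key.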
Start free — no credit card required. Connect your first site and run your first audit in under 2 minutes.