HTTP Status Codes for SEO
A complete reference to HTTP status codes: what each code means, how search engines handle it, and what to do about it.
What Are HTTP Status Codes?
HTTP status codes are three-digit numbers returned by a server in response to a browser's or crawler's request. They tell the client whether the request succeeded, was redirected, or failed — and why.
For SEO, status codes directly affect how search engines crawl, index, and rank your pages. Incorrect status codes can lead to wasted crawl budget, lost link equity, or pages disappearing from search results.
Quick Reference
2xx Success
The request was successfully received, understood, and accepted.
3xx Redirection
The client must take additional action to complete the request. Critical for SEO migration and URL management.
4xx Client Errors
The request contains bad syntax or cannot be fulfilled. These indicate problems on the client side.
5xx Server Errors
The server failed to fulfill a valid request. These impact crawl budget and can cause deindexing if persistent.
2xx Success
The request was successfully received, understood, and accepted.
200 OK
The request succeeded and the server returned the requested resource.
The ideal status code. Pages returning 200 are crawled and indexed normally.
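A quick way to verify what a URL actually returns is to read the status line directly. The sketch below is self-contained: it spins up a throwaway local server that plays the role of a healthy page, then checks it with Python's standard library. Against a live site you would point `http.client` at your own hostname instead.

```python
import http.client
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

class OkHandler(BaseHTTPRequestHandler):
    """Stand-in for a healthy page: always answers 200."""
    def do_GET(self):
        body = b"<h1>Hello</h1>"
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)
    def log_message(self, *args):  # silence per-request logging
        pass

server = HTTPServer(("127.0.0.1", 0), OkHandler)  # port 0 = pick a free port
threading.Thread(target=server.serve_forever, daemon=True).start()

conn = http.client.HTTPConnection("127.0.0.1", server.server_port)
conn.request("GET", "/")
resp = conn.getresponse()
print(resp.status, resp.reason)  # 200 OK
conn.close()
server.shutdown()
```

The same status check works for every code in this reference; only the server's answer changes.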
201 Created
The request succeeded and a new resource was created (typically after POST requests).
Not relevant for SEO crawling. Typically returned by APIs, not web pages.
204 No Content
The request succeeded but there is no content to return.
Not used for web pages. If a page returns 204, it won't be indexed.
3xx Redirection
The client must take additional action to complete the request. Critical for SEO migration and URL management.
301 Moved Permanently
The resource has been permanently moved to a new URL. Browsers and crawlers should update their references.
Passes ~95-100% of link equity. The SEO-recommended redirect for permanent URL changes, domain migrations, and HTTP-to-HTTPS switches.
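After a migration it is worth confirming that the old URL answers 301 and points exactly where you expect, without following the redirect. A minimal sketch using a throwaway local server; the `/old-page` and `/new-page` paths are made-up examples.

```python
import http.client
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

class MovedPermanently(BaseHTTPRequestHandler):
    """Stand-in for a migrated URL: every path 301s to /new-page."""
    def do_GET(self):
        self.send_response(301)
        self.send_header("Location", "/new-page")
        self.end_headers()
    def log_message(self, *args):
        pass

server = HTTPServer(("127.0.0.1", 0), MovedPermanently)
threading.Thread(target=server.serve_forever, daemon=True).start()

# http.client does not follow redirects, so we can inspect the hop itself.
conn = http.client.HTTPConnection("127.0.0.1", server.server_port)
conn.request("GET", "/old-page")
resp = conn.getresponse()
print(resp.status, resp.getheader("Location"))  # 301 /new-page
conn.close()
server.shutdown()
```

Inspecting the raw hop rather than the final page is what catches a 302 accidentally deployed where a 301 was intended.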
302 Found
The resource is temporarily available at a different URL. The original URL should still be used for future requests.
Does not consolidate link equity by default. Google may index either URL. Use only for genuinely temporary situations (A/B tests, geo-redirects).
304 Not Modified
The resource has not changed since the last request. The browser can use its cached version.
Positive for crawl efficiency. Tells Googlebot the page hasn't changed, saving crawl budget.
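The 304 flow depends on a validator such as an ETag: the server sends one with the full response, and a revisiting client echoes it back in `If-None-Match`. A sketch of both halves, with a hypothetical one-page server and an arbitrary `"v1"` version tag:

```python
import http.client
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

ETAG = '"v1"'  # hypothetical version tag for the page

class CachedPage(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.headers.get("If-None-Match") == ETAG:
            self.send_response(304)   # unchanged: headers only, no body
            self.end_headers()
        else:
            body = b"full page body"
            self.send_response(200)
            self.send_header("ETag", ETAG)
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
    def log_message(self, *args):
        pass

server = HTTPServer(("127.0.0.1", 0), CachedPage)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_port

# First visit: full 200 response carrying the validator.
conn = http.client.HTTPConnection("127.0.0.1", port)
conn.request("GET", "/")
first = conn.getresponse()
etag = first.getheader("ETag")
first.read()
conn.close()

# Revisit: echo the validator back; the server answers 304 with no body.
conn = http.client.HTTPConnection("127.0.0.1", port)
conn.request("GET", "/", headers={"If-None-Match": etag})
second = conn.getresponse()
print(first.status, second.status)  # 200 304
conn.close()
server.shutdown()
```

Serving correct validators is what lets large sites get recrawled frequently without burning bandwidth on unchanged pages.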
307 Temporary Redirect
Like 302, but the request method must not change (POST stays POST).
Same SEO behavior as 302. Google treats it as a temporary redirect.
308 Permanent Redirect
Like 301, but the request method must not change.
Same SEO behavior as 301. Google treats it as a permanent redirect and passes link equity.
4xx Client Errors
The request contains bad syntax or cannot be fulfilled. These indicate problems on the client side.
400 Bad Request
The server cannot process the request due to malformed syntax.
If a crawlable URL returns 400, Google will eventually drop it from the index. Fix the underlying issue.
401 Unauthorized
Authentication is required and has not been provided or has failed.
Google cannot crawl or index content behind authentication. Pages returning 401 are excluded from the index.
403 Forbidden
The server understood the request but refuses to authorize it.
Google treats 403 like 404 over time. If a page consistently returns 403, it will be dropped from the index.
404 Not Found
The requested resource could not be found on the server.
Google removes 404 pages from the index. Excessive 404s don't hurt your site's overall ranking, but they waste crawl budget. Fix or redirect valuable URLs.
410 Gone
The resource has been permanently removed and will not return.
Stronger signal than 404. Google deindexes 410 pages faster. Use when you deliberately remove a page with no replacement.
429 Too Many Requests
The client has sent too many requests in a given timeframe (rate limiting).
If Googlebot gets 429, it slows down crawling. Persistent 429s can prevent your pages from being crawled and indexed.
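A well-behaved client (a crawler, or your own audit tooling) should treat 429 the way Googlebot does: back off, honor `Retry-After` if present, and try again. A sketch against a toy server that rate-limits only the first request; the retry cap and attempt count are arbitrary demo values.

```python
import http.client
import threading
import time
from http.server import BaseHTTPRequestHandler, HTTPServer

class RateLimited(BaseHTTPRequestHandler):
    hits = 0
    def do_GET(self):
        type(self).hits += 1
        if type(self).hits == 1:          # throttle only the first request
            self.send_response(429)
            self.send_header("Retry-After", "1")
            self.end_headers()
        else:
            self.send_response(200)
            self.end_headers()
    def log_message(self, *args):
        pass

def fetch_with_backoff(host, port, path, max_attempts=3):
    """Retry on 429, waiting as long as Retry-After asks (capped for the demo)."""
    for _ in range(max_attempts):
        conn = http.client.HTTPConnection(host, port, timeout=5)
        conn.request("GET", path)
        resp = conn.getresponse()
        status = resp.status
        retry_after = resp.getheader("Retry-After")
        conn.close()
        if status != 429:
            return status
        time.sleep(min(float(retry_after or 1), 2))  # cap keeps the demo fast
    return status

server = HTTPServer(("127.0.0.1", 0), RateLimited)
threading.Thread(target=server.serve_forever, daemon=True).start()
status = fetch_with_backoff("127.0.0.1", server.server_port, "/")
print(status)  # 200 after one backed-off retry
server.shutdown()
```

If your own monitoring scripts hammer a rate-limited site without backing off, they can report phantom outages; respecting `Retry-After` avoids that.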
5xx Server Errors
The server failed to fulfill a valid request. These impact crawl budget and can cause deindexing if persistent.
500 Internal Server Error
A generic server error. The server encountered an unexpected condition.
Temporary 500s are tolerated. Persistent 500s cause Google to reduce crawl rate and eventually drop pages from the index.
502 Bad Gateway
The server received an invalid response from an upstream server (reverse proxy, load balancer).
Same as 500. Indicates infrastructure issues. Frequent 502s degrade crawl efficiency and risk deindexing.
503 Service Unavailable
The server is temporarily unable to handle requests (overloaded or under maintenance).
Google retries later. Short 503 outages are fine. Extended 503s (days/weeks) lead to deindexing. Use a Retry-After header to signal expected recovery time.
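During planned maintenance the site should answer 503 everywhere and tell crawlers when to come back. A minimal sketch of a maintenance handler; the 3600-second window is an arbitrary example, not a recommendation.

```python
import http.client
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

RETRY_AFTER_SECONDS = 3600  # arbitrary example: "back in an hour"

class Maintenance(BaseHTTPRequestHandler):
    """Answers every request with 503 plus a Retry-After hint."""
    def do_GET(self):
        body = b"Down for maintenance"
        self.send_response(503)
        self.send_header("Retry-After", str(RETRY_AFTER_SECONDS))
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)
    def log_message(self, *args):
        pass

server = HTTPServer(("127.0.0.1", 0), Maintenance)
threading.Thread(target=server.serve_forever, daemon=True).start()

conn = http.client.HTTPConnection("127.0.0.1", server.server_port)
conn.request("GET", "/")
resp = conn.getresponse()
print(resp.status, resp.getheader("Retry-After"))  # 503 3600
conn.close()
server.shutdown()
```

In production the same two headers would come from your web server or CDN config rather than application code; the point is that crawlers see 503 plus a recovery hint, not a 200 maintenance page.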
504 Gateway Timeout
The server did not receive a timely response from an upstream server.
Same as 502. Usually indicates slow backend or misconfigured timeouts. Persistent 504s hurt crawl budget.
Redirect Decision Guide
| Scenario | Recommended Code |
|---|---|
| Permanent URL change | 301 |
| Domain migration | 301 |
| HTTP to HTTPS | 301 |
| A/B test with different URL | 302 |
| Geo-based redirect | 302 |
| Page permanently removed, no replacement | 410 |
| Scheduled maintenance | 503 + Retry-After |
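When applying the table above, it also pays to check that each redirect resolves in a single hop, since chained redirects dilute equity and slow crawling. A sketch of a chain-follower, exercised against a toy server where a made-up `/old` path 301s to `/new`; it assumes same-host, path-only Location headers.

```python
import http.client
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

class ChainHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/old":           # hypothetical migrated URL
            self.send_response(301)
            self.send_header("Location", "/new")
            self.end_headers()
        else:
            self.send_response(200)
            self.end_headers()
    def log_message(self, *args):
        pass

def redirect_chain(host, port, path, max_hops=5):
    """Follow same-host redirects, recording (path, status) at each hop."""
    chain = []
    for _ in range(max_hops):
        conn = http.client.HTTPConnection(host, port, timeout=5)
        conn.request("GET", path)
        resp = conn.getresponse()
        chain.append((path, resp.status))
        location = resp.getheader("Location")
        conn.close()
        if resp.status in (301, 302, 307, 308) and location:
            path = location  # assumes path-only Location headers
        else:
            break
    return chain

server = HTTPServer(("127.0.0.1", 0), ChainHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
chain = redirect_chain("127.0.0.1", server.server_port, "/old")
print(chain)  # [('/old', 301), ('/new', 200)]
server.shutdown()
```

A chain longer than two entries means intermediate hops you should collapse into one direct 301.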
Key Takeaways for SEO
1. Always use 301 redirects for permanent URL changes. They pass nearly all link equity to the new URL.
2. Monitor 404 errors in Google Search Console. Redirect valuable pages; leave intentional 404s alone.
3. Use 410 (Gone) instead of 404 when you deliberately remove a page. Google deindexes 410s faster.
4. 5xx errors waste crawl budget. Fix them immediately. Persistent server errors can cause deindexing.
5. During maintenance, use 503 with Retry-After, never a 301 or 302 to a temporary page.