Ahrefs Site Audit Issue Descriptions

Published on May 2, 2026

404 page

Issue details

404 – Not Found is one of the most common 4xx errors and indicates that the requested URL does not exist.

Links pointing to 404 URLs are widely known as "broken links".

404 URLs on your website damage the user experience, as people cannot access the page or file via a link they click. Besides, internal links to 404 URLs create unnecessary "dead ends" for the search engine crawlers and can waste your crawl budget.

How to fix

Review the list of 404 URLs on your website. Click on the number of inlinks to a given 404 URL to access the list of pages that link to it.

You should review the internal outgoing links to all the 404 pages reported and either remove these links or replace them with relevant links to live pages.

Alternatively, you can set up the appropriate 301 redirects. This is especially important for 404 pages with a significant number of external backlinks.
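How you implement a 301 redirect depends on your server or CMS; purely as an illustration, here is a minimal sketch in Python's standard library, where a hypothetical redirect map sends removed paths to live pages:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical map of removed URLs to their live replacements.
REDIRECTS = {
    "/old-post": "/blog/new-post",
    "/deleted-category": "/",
}

def redirect_target(path):
    """Return the new location for a removed path, or None if unknown."""
    return REDIRECTS.get(path)

class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        target = redirect_target(self.path)
        if target is not None:
            # A 301 status plus a Location header tells browsers and
            # crawlers the move is permanent.
            self.send_response(301)
            self.send_header("Location", target)
            self.end_headers()
        else:
            self.send_response(404)
            self.end_headers()

# To try it locally:
# HTTPServer(("localhost", 8000), RedirectHandler).serve_forever()
```

In practice you would configure the same mapping in your web server or CMS rather than in application code; the point is that a 301 status plus a Location header is all a permanent redirect consists of.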

Learn more

4XX page

Issue details

4xx HTTP status codes indicate that the requested page or resource cannot be accessed. 401 - Unauthorized, 403 - Forbidden, 408 - Request Timeout, and 404 - Not Found are the most common "Client Errors".

4xx URLs damage the user experience on your website as people cannot access the page or file via a link they click. Besides, internal links to 4xx URLs create unnecessary "dead ends" for the search engine crawlers and can waste your crawl budget.

Pages on your website whose response code changes to 4xx will be removed from Google's index.

How to fix

Review the list of 4xx URLs. Click on the number of inlinks to a given 4xx URL to access the list of pages that link to it.

You should review the internal outgoing links to all the 4xx pages reported and either remove these links or replace them with relevant links to live pages.

Alternatively, you can set up the appropriate 301 redirects. This is especially important for moved or deleted pages on your website.

This will provide smooth crawlability for your website and help ensure a good user experience.

The HTTP 429 (Too Many Requests) response code may indicate that the crawling speed set in the crawl settings for your project is too high for a web server. Reduce it in the crawl settings and run a project re-crawl.

Learn more

Broken links

Issue details

Pages on your website that link to internal or external URLs returning 404 or 410 HTTP response codes. These links are widely known as "broken links".

Broken links on your website damage your visitors' browsing experience as people cannot access the page or file via a link they click. Besides, broken links create unnecessary "dead ends" for the search engine crawlers and can waste your crawl budget.

How to fix

Remove the broken links from the affected pages or replace them with links to other relevant live pages.

Additionally, you can set redirects for the deleted or moved pages, which is especially relevant for the pages with external backlinks.
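If you want to double-check link targets outside Site Audit, classifying them by HTTP status code is straightforward; a minimal sketch, with the status lookup injected as a callable so you can plug in urllib.request (or a stub, as below; all URLs are placeholders):

```python
def classify_link(url, fetch_status):
    """Classify a linked URL by the HTTP status its server returns.

    fetch_status is any callable mapping a URL to an int status code,
    e.g. one built on urllib.request in real use.
    """
    status = fetch_status(url)
    if status in (404, 410):
        return "broken"      # remove or replace this link
    if 300 <= status < 400:
        return "redirect"    # consider linking to the destination directly
    if 400 <= status < 600:
        return "error"       # other client/server errors worth reviewing
    return "ok"

# Example with a stubbed status lookup (placeholder URLs):
statuses = {
    "https://example.com/live": 200,
    "https://example.com/gone": 404,
    "https://example.com/moved": 301,
}
for url in statuses:
    print(url, classify_link(url, statuses.get))
```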

Learn more

Orphan page

Issue details

Orphan pages of a website have no incoming internal links.

Search engine crawlers can only discover such pages from the sitemap file or from external backlinks. Website visitors won't be able to reach them from any other page on your website.

How to fix

Check your website navigation and link architecture to make sure all relevant pages are easily accessible.

Learn more

Page has only one dofollow internal link

Issue details

Pages that only have one "dofollow" internal link.

The number of internal links pointing to a page is a signal to search engines about the relative importance of that page.

Besides, their anchor text helps search engines better understand the page's context.

How to fix

Make sure the most important pages on your website have at least a few internal "dofollow" links.

Ahrefs' guide to internal links for SEO.

Links to redirects

Issue details

Some URLs linked from your pages redirect to other URLs. For redirecting URLs on your own website, this is not a problem, although we recommend linking to the destination page directly.

However, a redirect on an external page you link to requires your attention.

How to fix

It is generally recommended to replace links to redirecting URLs on your website with direct links.

This is especially important when linking to external pages. You should manually review the external redirecting URLs linked from your site to make sure that the destination URL has relevant content.

Learn more

3XX redirect

Issue details

Even though Google has stated that any redirection method passes PageRank, Googlebot is not the only visitor to your website.

Redirects always require caution. They may hurt your website's performance, especially for mobile users, or confuse visitors.

How to fix

It is recommended to replace links to internal redirected URLs with direct links to the destination pages where possible.

Learn more

HTTP to HTTPS redirect

Issue details

URLs using HTTP protocol that redirect to HTTPS.

How to fix

It is recommended to use direct links to HTTPS versions of the pages on your website to avoid unnecessary redirects.
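As a sketch of the fix, internal links can be rewritten to their HTTPS versions in bulk, assuming you know which of your hostnames serve HTTPS (example.com below is a placeholder):

```python
from urllib.parse import urlsplit, urlunsplit

# Hostnames known to serve HTTPS (an assumption for this sketch).
HTTPS_HOSTS = {"example.com", "www.example.com"}

def upgrade_link(url):
    """Rewrite an http:// link to https:// for hosts known to support it."""
    parts = urlsplit(url)
    if parts.scheme == "http" and parts.hostname in HTTPS_HOSTS:
        # Keep netloc, path, query, and fragment; swap only the scheme.
        return urlunsplit(("https",) + tuple(parts)[1:])
    return url

print(upgrade_link("http://example.com/page"))  # upgraded to https
print(upgrade_link("http://other.org/page"))    # unknown host, left alone
```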

Pages to submit to IndexNow

Issue details

IndexNow is a free protocol that enables website owners to inform search engines about the latest content updates, additions, or removals on their sites. By notifying search engines about these changes, you can ensure that they are aware of the updates instantly, rather than waiting for their bots to crawl and discover the changes themselves.

This issue automatically selects pages that we recommend submitting to IndexNow. It includes:

  • Indexable pages with content changes
  • New indexable pages that were previously non-indexable or missing from the site
  • Pages that have been removed or redirected

How to fix

To submit pages to IndexNow using Site Audit, follow these steps:

  • Set up IndexNow in Site Audit by navigating to Project settings > Site Audit > Crawl settings and following the instructions in the IndexNow section there
  • Open this issue report and click the "IndexNow" button in its header to submit all listed pages
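For reference, IndexNow itself is a plain HTTP API: a single page is submitted with a GET request to a participating endpoint such as api.indexnow.org, passing the changed URL and your key. A minimal sketch (the key and page URL are placeholders):

```python
from urllib.parse import urlencode

INDEXNOW_ENDPOINT = "https://api.indexnow.org/indexnow"

def build_submission_url(page_url, key):
    """Build the GET request URL for a single-page IndexNow submission.

    page_url is the page that changed; key is your IndexNow API key
    (both are placeholders here).
    """
    return f"{INDEXNOW_ENDPOINT}?{urlencode({'url': page_url, 'key': key})}"

print(build_submission_url("https://example.com/updated-page", "your-key"))
# The actual submission is then a plain GET request to that URL,
# e.g. with urllib.request.urlopen.
```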

Learn more about IndexNow and how to set it up in Site Audit by reading our help article.

Robots.txt is not accessible

Issue details

The Site Audit bot could not access the robots.txt file on your website. We either got an HTTP code that denied us access to the file, or the server failed to satisfy the request. When this happens, our bot behaves conservatively and considers the entire domain disallowed from crawling.

The most likely reason is that your website is blocking our crawler from accessing it on the server side. This issue should not affect visitors, but it does mean that some crawlers may not be able to crawl your website.

How to fix

Check that you do not have a firewall or plugin blocking bots; if you do, whitelist our AhrefsSiteAudit user-agent on that system.

Please also add our IPs to the server's whitelist.
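Once the file is reachable again, you can sanity-check what it allows a given user agent with Python's standard urllib.robotparser; the rules below are hypothetical:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content: block one crawler, restrict the rest
# to everything outside /private/.
sample_rules = """
User-agent: BadBot
Disallow: /

User-agent: *
Disallow: /private/
""".splitlines()

rp = RobotFileParser()
rp.parse(sample_rules)

# The general crawler rules apply to AhrefsSiteAudit here.
print(rp.can_fetch("AhrefsSiteAudit", "https://example.com/"))
print(rp.can_fetch("AhrefsSiteAudit", "https://example.com/private/x"))
# BadBot matches its own entry and is blocked everywhere.
print(rp.can_fetch("BadBot", "https://example.com/"))
```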

External 3XX redirect

Issue details

Some external URLs linked from your site redirect to another URL.

A redirect on an external page may have been set up after you added a link to it from your website, so your link might now point to a different page.

How to fix

You should manually review the external redirecting URLs linked from your website to make sure the end page has relevant content.

Redirects always require caution. It is recommended to avoid redirects and use direct links to the destination pages where possible.

External 4XX

Issue details

Some external URLs your website links to result in a 4xx HTTP response code. These links are also known as "broken links".

They may harm the user experience for the visitors of your website.

How to fix

Review all the URLs reported and remove or replace the links pointing to them on your website.

Make sure your website has links to live pages only.

Indexable page not in sitemap

Issue details

Sitemaps help search engines crawl and index your site. If a page is important to your website and you want it to be easily discoverable by search engines, we recommend including it in your sitemap.

How to fix

Review the list of pages found. If there are relevant pages with unique and valuable content, include them in your sitemap.
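For reference, a sitemap entry is just a <url> element with a <loc> child inside a <urlset>. A minimal sketch that generates one for a few placeholder pages:

```python
from xml.etree.ElementTree import Element, SubElement, tostring

def build_sitemap(urls):
    """Build a minimal XML sitemap for the given page URLs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = Element("urlset", xmlns=ns)
    for url in urls:
        # Each page gets a <url> element with its address in <loc>.
        loc = SubElement(SubElement(urlset, "url"), "loc")
        loc.text = url
    return tostring(urlset, encoding="unicode")

# Placeholder pages for illustration.
print(build_sitemap(["https://example.com/", "https://example.com/about"]))
```

A real sitemap can also carry optional elements such as lastmod, and it should be referenced from robots.txt or submitted in the search engines' webmaster tools.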

X (Twitter) card missing

Issue details

Pages with no X (Twitter) card tags.

X card tags instruct X what information (title, description, image, etc.) to display whenever a URL to your page is shared.

If X cards are missing, X will pull data from relevant Open Graph tags.

How to fix

If you want your pages to look good in the X feed when shared, implement the available X card tags.

Please note that URLs inside X cards must be absolute and utilize the http:// or https:// protocols.

You can find more information about X cards here.

Open Graph tags missing

Issue details

Pages with no Open Graph tags.

Open Graph tags instruct social networks like Facebook, Pinterest, and LinkedIn what information (title, description, image, etc.) to display whenever a URL to your page is shared.

How to fix

Make sure your pages have Open Graph tags if you want them to look good in social feeds when shared.

Please note that URLs inside OG tags must be absolute and utilize the http:// or https:// protocols.

You can find more information on the Open Graph protocol here.

Open Graph tags incomplete

Issue details

Pages with one or more of the required Open Graph tags missing.

The four required Open Graph tags for every page are og:title, og:type, og:image, and og:url.

How to fix

Make sure your pages have all required OG tags if you want them to look good in social feeds when shared.

Please note that the URLs inside OG tags must be absolute and utilize the http:// or https:// protocols.

You can find more information on the Open Graph protocol here.
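A small parser can flag which of the four required tags a page is missing; a sketch using Python's html.parser, with hypothetical sample HTML:

```python
from html.parser import HTMLParser

REQUIRED = {"og:title", "og:type", "og:image", "og:url"}

class OGTagCollector(HTMLParser):
    """Collect og:* properties from <meta property="og:..."> tags."""

    def __init__(self):
        super().__init__()
        self.found = set()

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            prop = dict(attrs).get("property", "")
            if prop.startswith("og:"):
                self.found.add(prop)

def missing_og_tags(html):
    """Return the required OG tags absent from the given HTML, sorted."""
    collector = OGTagCollector()
    collector.feed(html)
    return sorted(REQUIRED - collector.found)

# Hypothetical page markup with two of the four required tags.
sample = '''<head>
<meta property="og:title" content="Example">
<meta property="og:image" content="https://example.com/img.png">
</head>'''
print(missing_og_tags(sample))  # ['og:type', 'og:url']
```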

Multiple H1 tags

Issue details

Pages that have more than one <h1> tag.

It is possible to have multiple <h1> tags on your pages.

John Mueller of Google mentioned that you can use as many <h1> tags on a page as you need, hinting that Google is smart enough to figure out your headings.

How to fix

To avoid any possible confusion for search engines, you should consider keeping the recommended header hierarchy on all of your pages and use only one <h1> tag on a page.

Title too short

Issue details

A short title may not describe the content of your page in the best possible way.

Google may even generate an improved title from anchors, on-page text, or other sources for its SERP.

See Google's recommendations on good titles.

How to fix

The generally recommended title length is between 50 and 60 characters (max 600 pixels). Longer titles will be truncated when they show up in the search results.

Review all the pages reported and consider writing longer titles.

Low word count

Issue details

Pages where the word count is less than 50.

Pages with low word count are not likely to give good coverage of the topic for the search engines.

How to fix

Although you don't always need to make your content very long, pages with little to no text might be hard for search engines to understand.

Make sure your word count is enough to cover a specific topic or to describe other content types on your page.

Title too long

Issue details

Longer titles will be truncated when they show up in the search results.

See Google's recommendations on good titles.

How to fix

The generally recommended title length is between 50 and 60 characters (max 600 pixels).

Review all the pages reported and consider shortening their titles.
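A simple character count can flag titles outside the recommended range; note that pixel width is what actually determines truncation, so treat this sketch only as a rough filter:

```python
def title_length_issue(title, min_len=50, max_len=60):
    """Return a rough verdict on a page title's character length.

    The 50-60 character range follows the guideline above; real
    truncation depends on rendered pixel width, not characters.
    """
    n = len(title)
    if n < min_len:
        return "too short"
    if n > max_len:
        return "too long"
    return "ok"

print(title_length_issue("Home"))  # too short
```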

Meta description too long

Issue details

Google sometimes uses the meta description tag content to generate snippets if it believes this gives users a more accurate description than can be taken directly from the page content.

Besides, Facebook, for example, will use the meta description content for link previews if the page has no og:description tag.

If Google decides to use the page meta description as a snippet, a long one can be truncated.

How to fix

A general recommendation today is to keep your page description between 110 and 160 characters, although Google can sometimes show longer snippets.

Google's recommendations on good descriptions

H1 tag missing or empty

Issue details

The <h1> tag is the top-level heading of a page. Although it is not as crucial as your page title, an <h1> heading is a strong component of your on-page SEO. It helps search engines better understand the content of your page and its overall topic.

How to fix

Each page should have its unique <h1> heading.

It is recommended to use only one <h1> tag per page.

Learn more

Meta description too short

Issue details

Google sometimes uses the meta description tag content to generate snippets if it believes this gives users a more accurate description than can be taken directly from the page content.

Besides, Facebook, for example, will use the meta description content for link previews if the page has no og:description tag.

A short meta description may not summarize the content of your page in the best possible way.

How to fix

A general recommendation today is to keep your page description between 110 and 160 characters, although Google can sometimes show longer snippets.

Google's recommendations on good descriptions