This page carries <meta name="robots" content="noindex, nofollow">, which prevents search engines from indexing the page and from following its links. This is the most restrictive robots directive combination. The page is also listed in sitemap.xml to trigger the E02 error: a noindex page should never appear in the sitemap, because doing so sends conflicting signals to search engines.
The meta name="robots" tag controls how search engine crawlers interact with a page. The noindex directive tells search engines not to include the page in their search results, while nofollow tells them not to follow any links on the page for ranking purposes. When both are combined, the page is effectively invisible to search engines and passes no link equity to the pages it links to.
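As a minimal sketch of how an auditing tool might detect these directives, the following uses Python's standard-library HTML parser to pull the robots directives out of a page. The RobotsMetaParser class is illustrative, not taken from any real crawler library:

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collect the directives from any meta name="robots" tag on a page."""

    def __init__(self):
        super().__init__()
        self.directives = set()

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        attrs = dict(attrs)
        if attrs.get("name", "").lower() == "robots":
            # The content attribute is a comma-separated directive list.
            content = attrs.get("content", "")
            self.directives.update(
                d.strip().lower() for d in content.split(",") if d.strip()
            )

html = '<html><head><meta name="robots" content="noindex, nofollow"></head></html>'
parser = RobotsMetaParser()
parser.feed(html)
print(sorted(parser.directives))  # ['nofollow', 'noindex']
```

A real crawler would also need to check the X-Robots-Tag HTTP header, which can carry the same directives outside the HTML.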
While there are legitimate uses for noindex (staging pages, internal search results, duplicate content management), having noindex on content pages that should be found by users through search is a critical SEO issue. Many websites accidentally deploy noindex directives site-wide during development and forget to remove them before launch, effectively making the entire site invisible to search engines.
Login pages, admin panels, thank-you pages after form submissions, paginated archive pages beyond the first page, tag and category pages with thin content, and print-friendly page versions are all reasonable candidates for noindex. However, product pages, service descriptions, blog articles, and landing pages should almost never carry noindex directives unless they are exact duplicates of canonical versions.
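The rule of thumb above can be expressed as a simple lookup. The page-type labels and the helper function here are assumptions for demonstration, not part of any standard:

```python
# Page types where noindex is a reasonable, deliberate choice (per the
# guidance above); everything else should normally be indexable.
NOINDEX_CANDIDATES = {
    "login", "admin", "thank-you", "paginated-archive",
    "thin-tag-page", "print-version",
}

def noindex_is_reasonable(page_type: str) -> bool:
    """Return True if a noindex directive is defensible for this page type."""
    return page_type in NOINDEX_CANDIDATES

print(noindex_is_reasonable("login"))         # True
print(noindex_is_reasonable("blog-article"))  # False
```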
Including a noindex page in the XML sitemap creates a direct contradiction: the sitemap tells search engines "please crawl and index this page" while the meta robots tag says "do not index this page." Search engines like Google have noted that this conflicting signal can lead to unpredictable behavior, with some engines respecting the noindex while others may attempt to index the page anyway. The best practice is to ensure that noindex pages are excluded from the sitemap entirely.
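The consistency check described above can be sketched with the standard library: parse the sitemap, intersect its URLs with the set of pages known (for example, from a crawl) to carry noindex, and flag every overlap. The sitemap string and the noindex set here are hypothetical example data:

```python
import xml.etree.ElementTree as ET

SITEMAP_XML = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/thank-you</loc></url>
</urlset>"""

# URLs observed during a crawl to carry a noindex robots directive.
noindex_urls = {"https://example.com/thank-you"}

# <loc> elements live in the sitemaps.org namespace, so register it.
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
root = ET.fromstring(SITEMAP_XML)
sitemap_urls = {loc.text.strip() for loc in root.findall(".//sm:loc", ns)}

# Any URL in both sets is the contradiction behind the E02 error.
conflicts = sitemap_urls & noindex_urls
for url in sorted(conflicts):
    print(f"E02: noindex page listed in sitemap: {url}")
```

Resolving the conflict means picking one signal: either drop the URL from the sitemap (if the noindex is intentional) or remove the noindex directive (if the page should rank).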