Google indexing

Started by skb, January 05, 2025, 11:03:50 PM


skb

Google emailed me to say that Search Console has identified some pages that are not being indexed because they are blocked by robots.txt.

Should this be ignored, or should changes be made to the file?

SMF 2.1.4 / TP 2.2.2

Arantor

Maybe? It's hard to tell without some details, like *which* URLs are affected. Cut off the domain part if necessary.

There are definitely URLs exposed to Google that shouldn't be indexed, both by default and according to various "best practice" guidance.
Holder of controversial views, all of which my own.


Maxheather

If the blocked pages are important for SEO or for visitors, update the robots.txt file so crawlers can reach them. Use the robots.txt report in Google Search Console (the old Robots.txt Tester has been retired) to identify which disallow rules are responsible, fix them, submit the updated file, and request reindexing. If the pages are intentionally blocked (e.g., private or admin pages), no action is needed.
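As a rough sketch of what such a file can look like, assuming a stock SMF install at the site root (the action names and sitemap URL below are illustrative placeholders, not recommended defaults):

User-agent: *
# Keep administrative and login pages out of search results
Disallow: /index.php?action=admin
Disallow: /index.php?action=login
# Leave the rest of the forum crawlable
Allow: /
Sitemap: https://www.example.com/sitemap.xml

Since robots.txt rules are prefix matches, a line like Disallow: /index.php?action=admin also blocks deeper admin URLs that start with the same prefix. The Page indexing report in Search Console lists exactly which URLs were reported as "Blocked by robots.txt", which is the quickest way to decide whether a given rule should stay or go.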