Googlebot errors since upgrade to 2.0

Started by Pudders, April 24, 2014, 04:34:00 PM


Pudders

Hi,

I recently paid someone to upgrade my forum to version 2.0, and since then I have been getting emails from Google every few days saying there are crawl errors. These are all similar, but the failure rate varies between 20% and 90%:

Quote: http://www.hairdyeforum.com/: Googlebot can't access your site
Over the last 24 hours, Googlebot encountered 589 errors while attempting to connect to your site. Your site's overall connection failure rate is 22.2%.

Quote: http://www.hairdyeforum.com/: Googlebot can't access your site
Over the last 24 hours, Googlebot encountered 389 errors while attempting to connect to your site. Your site's overall connection failure rate is 32.7%.
You can see more details about these errors in Webmaster Tools.

I have checked with my web host and they say:
Quote: I have verified the issue you are facing with http://www.hairdyeforum.com/ and found that the domain is resolving properly from the server.

http://www.dnswatch.info/dns/dnslookup?la=en&host=hairdyeforum.com&type=A&submit=Resolve


Current robots.txt:
User-agent: *
Sitemap: http://www.phphelp.com/sitemap.xml
Sitemap: http://www.phphelp.com/topicsitemap.xml

User-agent: *
Disallow: /Sources/*
Disallow: /Smileys/*
Disallow: /Packages/*
Disallow: /header/*
Disallow: /avatars/*
Disallow: /attachments/*
Disallow: /Themes/*
Disallow: /index.php?action=printpage*
Disallow: /index.php?action=post*
Disallow: /index.php?action=permissions*
Disallow: /index.php?action=pm*
Disallow: /index.php?action=help*
Disallow: /index.php?action=register*
Disallow: /index.php?action=login*
Disallow: /index.php?action=mlist*
Disallow: /index.php?action=search*
Disallow: /index.php?action=who*
Disallow: /index.php?*rss*
Disallow: /index.php?action=stats
Disallow: /index.php?PHPSESSID=
Disallow: /index.php?*wap*
Disallow: /index.php?*imode*
Disallow: /index.php?action=post*
Disallow: /index.php?action=activate*
Disallow: /index.php?action=reminder*
Disallow: /index.php?wap2*
Disallow: /index.php?action=profile*
Disallow: /*action=
Disallow: /*.new.html
Disallow: /*.msg
Disallow: /*.prev_next

I have a couple of other websites hosted on the same server and those are running as normal, with no errors reported; it's only this SMF one having issues, and again only since the upgrade. The robots.txt changed during the upgrade, and although I have copied the recommended robots.txt from this forum, it still gives these errors. I cannot work out what I am doing wrong, as I did not get all these errors before the 2.0 upgrade. Any suggestions?

kat

Have you checked your .htaccess files?

If you have any "deny" stuff, in those, that'll keep 'em out.

Pudders

Hi,

Thanks for the quick reply. I have had a look at the .htaccess file and it shows several pages of denied IP addresses :(
I am not sure whether these are the problem IP addresses (flagged as spammers by Stop Forum Spam) that I added in cPanel, or something that was added when the forum was upgraded. Should I just delete them all, or is there an IP range that Googlebot uses, so I can make sure none of those are on the list?

kat

I would assume that Google use a myriad of IP addresses, myself.
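For what it's worth, Google's published advice is not to maintain an IP list at all, but to verify a crawler with a reverse-then-forward DNS lookup. A rough sketch in Python (the googlebot.com/google.com hostname suffixes follow Google's documented convention; only verify_googlebot actually touches the network):

```python
import socket

# Per Google's crawler-verification docs, genuine Googlebot reverse-DNS
# names end in one of these domains.
GOOGLE_SUFFIXES = (".googlebot.com", ".google.com")

def looks_like_googlebot(hostname: str) -> bool:
    """Check whether a reverse-DNS hostname matches Google's crawler domains."""
    return hostname.rstrip(".").endswith(GOOGLE_SUFFIXES)

def verify_googlebot(ip: str) -> bool:
    """Reverse-DNS the IP, check the suffix, then forward-resolve the
    hostname and confirm it maps back to the same IP."""
    try:
        hostname = socket.gethostbyaddr(ip)[0]
        if not looks_like_googlebot(hostname):
            return False
        return ip in socket.gethostbyname_ex(hostname)[2]
    except (socket.herror, socket.gaierror):
        return False
```

The suffix check alone is not enough (anyone can claim a User-Agent, and a spoofed PTR record can claim a hostname), which is why the forward lookup back to the original IP is the deciding step.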

I don't know that mod, I'm afraid. I wouldn't think that it edits .htaccess. But, I really couldn't be certain, on that. :(

Maybe you could ask in the mod's support topic?

Pudders

Thanks - I think I will delete the IP addresses and see what happens. I have kept a backup of the original list of denied IPs, so I can always re-add them if needed, and I will keep an eye on it each time the Stop Forum Spam mod blocks a new IP.

One last thing, the .htaccess file contains the following:
Quote: php_value session.save_path "/home/hairdyef/public_html/generator/data/"

<Files 403.shtml>
order allow,deny
allow from all
</Files>

deny from (then lists about 50 IP addresses)
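If only the spam-block denies are removed, the rest of the file would presumably stay. A sketch of what would remain (assuming the session.save_path line is still wanted by the host's PHP setup):

```
# Keep PHP sessions in the host-specific directory (as in the original file)
php_value session.save_path "/home/hairdyef/public_html/generator/data/"

# Ensure the custom 403 error page itself can always be served
<Files 403.shtml>
order allow,deny
allow from all
</Files>

# The "deny from ..." lines for the ~50 blocked IPs would be deleted here
```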

Do I just delete everything and leave it as a blank/empty .htaccess file, or should there be something in there?

Thanks
Nickki

Kindred

BTW: All those disallow statements are really not needed...
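By way of illustration, a much smaller robots.txt along those lines might look like the sketch below. Note that the Sitemap URLs in the file quoted above point at phphelp.com, not this forum, so they would presumably need to point at the forum's own domain; the exact disallow paths here are assumptions, not a definitive list:

```
User-agent: *
Disallow: /Sources/
Disallow: /Packages/
Disallow: /*action=
Disallow: /*.msg

# Assumed: the sitemaps live at the forum's own domain, not phphelp.com
Sitemap: http://www.hairdyeforum.com/sitemap.xml
Sitemap: http://www.hairdyeforum.com/topicsitemap.xml
```

Also note that robots.txt only controls crawling, not connecting: a "Googlebot can't access your site" error means the connection itself failed, which points at server-side blocking (such as the .htaccess denies) rather than anything in robots.txt.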



However, if you look in google webmaster tools and figure out WHAT URLs are failing, we might have a better idea.

For example, you might have had a sitemap under SMF 1.1.x which was not reinstalled when you upgraded...
Слава Україні (Glory to Ukraine)

Please do not PM, IM or Email me with support questions.  You will get better and faster responses in the support boards.  Thank you.

"Loki is not evil, although he is certainly not a force for good. Loki is... complicated."
