Google question

StephanieCordray

New Member
I worked on a robots.txt for a customer to disallow certain areas of the site from Google's crawling and indexing... well, not just Google, but all bots. However, Google is ignoring the robots.txt and doing its own thing anyway. Has anybody else had this problem?
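For reference, a minimal robots.txt along those lines would look something like this (the /admin/ and /members/ paths here are hypothetical stand-ins for whatever areas are actually being blocked):

```
# Ask all well-behaved crawlers to stay out of these (hypothetical) areas
User-agent: *
Disallow: /admin/
Disallow: /members/
```

Note that the file has to sit at the site root (e.g. example.com/robots.txt), and crawlers may take a while to pick up changes. A misplaced file or a syntax slip is a common reason Google appears to "ignore" robots.txt, and a Disallow rule only discourages crawling; a URL can still show up in the index if other sites link to it.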

Google's been using up about 10GB/month on this particular site. Slurp was the worst culprit, though, topping out at about 13GB. The site is heavily trafficked and the bandwidth is consistently going through the roof, so any little bit of reduction would help them.
 

ian

Administrator
Staff member
I have never had that problem, but in all honesty, I can't see why you would want to prevent Google from crawling parts of your site that are viewable by the public.
I have the reverse problem of Google not crawling enough pages.
Bandwidth is fairly cheap these days; I would be looking at ways to better monetize the site so that they could afford the extra bandwidth.
Alternatively, cut down on the number and size of the graphics.
 

StephanieCordray

New Member
It's not the public places that I disallowed, :). I wouldn't do that. The best solution would be to prune the forum, but the customer doesn't want to do that. Images are another problem, but that's being worked on. As for monetizing the site, I'm not sure what else she could do, because she already has a lot going on with it.
 

ian

Administrator
Staff member
If it is a forum, then the non-public areas are not viewable by the bots either and are not generally indexed; at least, that is the case with vBulletin.
What forum software do they use?
 