
Should I disallow Googlebot from crawling slower pages?

You mentioned that site speed is a factor in ranking. On some pages, our site uses complex queries to return the user's request, giving a slow page time. Should we not allow Googlebot to index these pages to improve our overall site speed? @NeilTompkins Tommo, London, UK

Learn more about making the web faster: code.google.com
Have a question? Ask it in our Webmaster Help Forum: groups.google.com
Want your question to be answered on a video like this? Follow us on Twitter and look for an announcement when we take new questions: twitter.com
More videos: www.youtube.com
Webmaster Central Blog: googlewebmastercentral.blogspot.com
Webmaster Central: www.google.com
Video Rating: 5 / 5
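
For context, the mechanism the question is really asking about is robots.txt: a Disallow rule tells Googlebot not to crawl matching URLs (it controls crawling, not whether already-known URLs stay indexed). As a purely hypothetical sketch, with /search/ standing in for the slow, query-heavy pages, such a rule would look like:

    User-agent: Googlebot
    Disallow: /search/

As the video and the comments below conclude, though, blocking those pages keeps their content out of Google entirely, so the usual advice is to make them faster instead of hiding them.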

 

4 Responses

  1. Cheta Manuel

    Matt mentioned “not timing out”. Anyway, you might want to keep it under 3-4s. More than that and I won’t come back to your site.

  2. StramarkWouter

    @hireahitCA Did you watch the video? Matt does answer your question. From a user’s perspective you can test it in Google Analytics; it comes out to about 3.5 seconds for the US and Europe. (A rough way to read that timing directly in the browser is sketched after the comments.)
    If you are 400 milliseconds slower than your competitor, the user is already prone to leave your website.

  3. hireahitCA

    Hey Matt,

    What exactly is “slow” from Google’s perspective? 1s? 5s? At what point might one worry?

    From a user-experience perspective we’re reasonably happy, since only very specific queries are slow and they return real-time data that can’t be precomputed. Still, we’re considering keeping Googlebot away from that output entirely, in case one extremely slow script might hurt the ranking of the entire site.

  4. jadr

    🙂 In short? NO. If your pages are slow, you should fix that, not keep Google away 🙂
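
Following up on StramarkWouter’s point about checking load times in Google Analytics: the same figure can be read directly in the browser via the Navigation Timing API, which is the browser data that Analytics’ Site Speed reports draw on. A minimal sketch in TypeScript, run in the page itself; the 3.5 s threshold is just the average quoted in the comment, not an official Google cutoff:

    // Read the page's own load timing via the Navigation Timing API.
    // Timestamps are in milliseconds relative to the start of the navigation.
    window.addEventListener("load", () => {
      // Wait one tick so loadEventEnd has been recorded.
      setTimeout(() => {
        const [nav] = performance.getEntriesByType("navigation") as PerformanceNavigationTiming[];
        if (!nav) return;
        const ttfb = nav.responseStart;    // time to first byte: server work + network latency
        const loadTime = nav.loadEventEnd; // total page load time (roughly what Analytics reports)
        console.log(`TTFB: ${ttfb.toFixed(0)} ms, page load: ${loadTime.toFixed(0)} ms`);
        // ~3.5 s is the average cited in the comment above, not a ranking threshold.
        if (loadTime > 3500) {
          console.warn("Slower than the ~3.5 s average mentioned in the comments");
        }
      }, 0);
    });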
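
And on jadr’s point that the right move is to fix the slow pages rather than hide them from Googlebot: for the query-heavy pages in the original question, a common first step is to cache the expensive query result so repeat requests skip the database. A minimal in-memory sketch, assuming a hypothetical runSlowQuery() standing in for the complex query; it is an illustration, not a drop-in:

    // Cache results of an expensive query for a short time so that most
    // requests (including Googlebot's) are served from memory.
    type CacheEntry<T> = { value: T; expiresAt: number };

    const cache = new Map<string, CacheEntry<unknown>>();
    const TTL_MS = 60_000; // how long a cached result stays fresh (tune per page)

    async function cached<T>(key: string, compute: () => Promise<T>): Promise<T> {
      const hit = cache.get(key);
      if (hit && hit.expiresAt > Date.now()) {
        return hit.value as T; // fresh cache hit: no slow query
      }
      const value = await compute(); // the slow part runs only on a cache miss
      cache.set(key, { value, expiresAt: Date.now() + TTL_MS });
      return value;
    }

    // Usage (hypothetical): runSlowQuery() is a stand-in for the complex query from the question.
    // const results = await cached(`search:${term}`, () => runSlowQuery(term));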