
How can I get Google to index more of my Sitemap URLs?

Bill from Stuart, FL asks: “The Sitemap.xml file states there are 10000 URLs but only 1500 have been indexed. After numerous crawls it does not appear Google…
Video Rating: 4 / 5
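For reference, the Sitemap.xml file Bill mentions follows the standard sitemaps.org protocol. A minimal file listing a couple of his 10,000 URLs would look like this (the URLs here are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/page-1.html</loc>
    <lastmod>2012-05-01</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/page-2.html</loc>
  </url>
  <!-- one <url> entry per page, up to 50,000 per sitemap file -->
</urlset>
```

Note that submitting a sitemap only tells Google the URLs exist; as the video explains, it does not guarantee they will all be indexed.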


20 Responses

  1. Josué Rodríguez M

GOOD VIDEO!!!!

  2. PortalooSunset

    This is great info to have. Anyone else think this chap looks like Saif
    Al-Islam, Gaddafi’s son?

  3. infiltrator7777

Links. What a great answer, Matt. “Well allow me to retort” – if a 10,000+ page
site isn’t being indexed and Google relies on “links” to point to the site
content, what makes NEW and FRESH content part of Google? Answer –
nothing. It’s your way or the highway, Google.



  5. Abhik Biswas

So, it’s the backlinks. But backlinks to what? The homepage? Inner pages? Or
the inner-inner pages which are not yet indexed? We all know it’s almost
impossible to get backlinks for all the pages of a website with 10,000+
URLs in its sitemap.

  6. infiltrator7777

And BILL in Stuart, FL – to answer your question, here’s a little SEO secret –
YOU NEED TO TRAP THE BOT. Create ONE page that has ONE link to your dynamic
content and keep the bot from exiting to other static site pages. Once the
bot is on this dynamic content it will continue to grab as many pages as
your server can handle.

  7. aseohosting

Very interesting post, dude! I’ll be back for more of your posts! 😉

  8. The Book Stores Deal

Thank you for the video. Are there any specific links we should be looking
for, e.g. from directories, forums, blogs, social bookmarks? Which are the
most important?

  9. Leonard Iordache


  10. agapitoflores001

Informative! It helps a lot. Very well done.

  11. Kat Bader

A video sitemap helps.

  12. Rajesh Chaurasia

How often can we submit a sitemap to Google? Is there any restriction?

  13. wintogreen1

@infiltrator7777 – I’m a beginner, so pardon me if it’s a silly question; I’m
trying to understand you. I have a blog which is dynamic (i.e. new
content & new posts) and it has hyperlinks to static pages like
the home page – does that help trap the Google bot so it crawls better? Does
that sound correct? Am I understanding it correctly?

  14. zTub3

oh, you are compelling us to spam 😐

  15. personalchefsuresh

Simple yet good answer, Matt.

  16. Leonard Iordache

it has PageRank 4 and damn, it takes too long to get indexed :(( … 1 week
already… I pay a lot for AdWords every day, but I still need to be on the
first line in Google…. HELP AGAIN! 🙂

  17. infiltrator7777

@wintogreen1 Well, it probably won’t work well for a blog, since “dynamic”
here really means an ID-parameter URL, as on a product site (i.e.
product_id=xxxx). For a blog I would make all the pages static and use
keywords in the file names. If your blog is 1,000+ pages and all of them are
dynamic, Google most likely isn’t going to run parameter queries to grab all
the pages.

  18. Peter Jaap Blaakmeer

    Awesome, thanks for the info.

  19. Leonard Iordache

  20. Grahame Davies

In addition, I would check Webmaster Tools carefully – any redirect errors?
Any duplicate titles or meta information? Do you have a good structure
within your site to get to all those pages? I have 300,000 pages in
sitemaps and have improved my position dramatically by paying attention to
this. Grahame