    Search Engine Spiders - Crawl Limits..?

    Firstly, Happy New Year to everybody =]

    If you are still concerned that your important pages may not get indexed, then you can consider adding a sitemap to your website. A sitemap can be best described as an index page - it is a list of links to all of the pages within a site contained on one page. If you link to a sitemap from your homepage then it gives a robot easy access to all of the pages within your site. Just remember - robots typically can't follow more than 100 links from one page, so if your site is larger than this you may want to consider spreading your sitemap across several pages.
    OK, above is a snippet from an SEO newsletter I subscribe to. Usually I take them as a guide, but I have read similar articles and information numerous times over the past few months.

    Does anyone here think that splitting the sitemap up, so that no more than 100 links are displayed per page, is worth doing to improve SERPs? Will it have any benefit?

    If so, has anyone managed to do this yet? Or do you have any tips/ideas on how to get this done without having to do too much every time a page is added?

    TIA

    Ben
    www.bathroomexpress.co.uk

    #2
    Hi Ben

    I created a sitemap.txt file, uploaded it and registered it with Google 3 weeks ago. Google have now spidered every single page of my site and I am moving up through the ranks. The file has something like 250 URLs, all in one file.
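
    (For reference, a text sitemap is about as simple as it gets - one full URL per line and nothing else. The addresses below are placeholders:)

        http://www.example.com/
        http://www.example.com/products.html
        http://www.example.com/contact.html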

    Every time something changes on my site, I add the new URL and "resubmit" the sitemap. This has worked fine for me, so I will continue to do it.
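
    (Resubmitting by hand works, but Google's Sitemaps documentation also describes an HTTP "ping" that can be scripted. A minimal Python sketch - the sitemap address is a placeholder, and the ping URL comes from the docs of the time, so check it still holds:)

        import urllib.parse
        import urllib.request

        # Placeholder - replace with the real location of your sitemap file
        SITEMAP_URL = "http://www.example.com/sitemap.txt"

        # Ping endpoint from Google's Sitemaps documentation; verify against
        # the current docs in case it has moved
        PING = ("http://www.google.com/webmasters/sitemaps/ping?sitemap="
                + urllib.parse.quote_plus(SITEMAP_URL))

        with urllib.request.urlopen(PING) as resp:
            print(resp.getcode())  # 200 means the ping was accepted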

    One thing I have noticed is that Google is reporting errors on pages that do not exist on my new site. I believe it is reading them from cached pages it has stored. The list of pages is shrinking, and in time I expect there to be none.

      #3
      Have a look at this Google Sitemaps page (which does tend to overcomplicate things).

      http://www.google.com/webmasters/sit..._GB/about.html

      In a nutshell, what you need to do is:

      Use a third-party freeware product to crawl your site, generate the XML sitemap, FTP it to your site and ping the Google search engine. Download and install it from here.

      http://johannesmueller.com/gs/

      You will also need a Google email account. If you have one, log in here, or create one on the same page.

      https://www.google.com/webmasters/sitemaps/login

      Then follow the instructions.
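
      (For reference, the XML the tool produces follows Google's sitemap protocol. A minimal file looks something like this - the URLs and date are placeholders, and the exact schema namespace depends on the protocol version:)

          <?xml version="1.0" encoding="UTF-8"?>
          <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
            <url>
              <loc>http://www.example.com/</loc>
              <lastmod>2006-01-01</lastmod>
            </url>
            <url>
              <loc>http://www.example.com/products.html</loc>
            </url>
          </urlset>
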
      Regards
      David

        #4
        Cheers guys, I don't mean to sound ungrateful, but I have already been doing this and have seen hardly any progress in the last 3/4 months. I update the sitemap pretty much weekly and then resubmit it as well.

        The reason I asked about the Actinic sitemap was to use it as an addition to the Google sitemap, but I have read in a few places that the Google spiders/robots do not like pages with more than 100 links... so I was wondering whether it is easily possible to split the Actinic-generated sitemap so there are, say, 90 links per page to assist the spider.

        Will this still be relevant if I submit Google sitemaps... as in, do the spiders now ignore the on-page links and just go by the submitted sitemap?

        Cheers for your help thus far,

        Ben
        www.bathroomexpress.co.uk

          #5
          Hi - I'm afraid that I do not know of any way to split the sitemap info - sounds like one for the Actinic guys or the Perl experts.
          Regards
          David
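
          (For illustration, the splitting Ben describes could be scripted along these lines - a minimal Python sketch rather than Perl. The urls.txt input file and the sitemap page names are assumptions, not anything Actinic generates:)

              # Split a flat list of URLs (urls.txt, one per line) into linked
              # sitemap pages of at most 90 links each:
              # sitemap1.html, sitemap2.html, ...
              MAX_LINKS = 90

              with open("urls.txt") as f:
                  urls = [line.strip() for line in f if line.strip()]

              pages = [urls[i:i + MAX_LINKS]
                       for i in range(0, len(urls), MAX_LINKS)]

              for n, chunk in enumerate(pages, start=1):
                  with open(f"sitemap{n}.html", "w") as out:
                      out.write("<html><body><ul>\n")
                      for url in chunk:
                          out.write(f'<li><a href="{url}">{url}</a></li>\n')
                      # chain the pages so a spider can walk the whole set
                      if n < len(pages):
                          out.write(f'<li><a href="sitemap{n + 1}.html">Next</a></li>\n')
                      out.write("</ul></body></html>\n")

          Rerunning the script whenever a page is added would keep the split pages up to date without manual editing.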

            #6
            Ah, I see.

            It was just an idea anyway; I just wondered if anybody else had managed to pull it off.

            If anyone else here has any ideas, they would be greatly appreciated.

            TIA

            Ben
            www.bathroomexpress.co.uk

              #7
              I read the same article and was curious about the 'limit'. I thought at the time that they were perhaps referring to links that were not from the same domain - in other words, off-site links.

              I too have well over 100 links in my sitemap and, since starting to update my content at least weekly, I have found that new content appears in Google within just a couple of days now.

              Duncan R

                #8
                Sorry for the dumb question

                But why do we need to subscribe to Google Sitemaps at all? If their robots will crawl the Actinic sitemap, where is the plus?
                I have just spent hours generating, uploading, and working everything out to get a Google sitemap working. I hope they will take the site seriously now and list it properly...
                Gary Simpson
                www.tba
                Replacement blades, drills and cutters for your power tools.....

                  #9
                  I think the reason it may help is that you can specify to Google how often you expect the site content to change, and of course you are effectively doing some of the robot's work.

                  I submitted a sitemap some time ago and said that content would change weekly - which it does - and, coincidence or not, I see changes picked up faster nowadays.
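
                  (That frequency hint is the <changefreq> element, set per URL in the XML sitemap - for example:)

                      <url>
                        <loc>http://www.example.com/news.html</loc>
                        <changefreq>weekly</changefreq>
                      </url>

                  The protocol allows always, hourly, daily, weekly, monthly, yearly and never, and treats the value as a hint rather than a command.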

                    #10
                    I think the 100 limit refers to external links rather than a sitemap. Google looks on a page with 100+ external links as a possible link farm and hence ignores or penalises the site - if you have external links pages, you should ensure they do not have more than about 50 links each to avoid any penalties.

                      #11
                      The limit is with the links on the Actinic-generated sitemap, not the Google sitemap.

                      I want to cover every angle to get the best possible rankings, although if the workload exceeds the gain it is not worth it.
                      www.bathroomexpress.co.uk

                        #12
                        Search Engine Sitemaps

                        Hi Ben,
                        We are a web search engine optimisation business (http://www.jamcomm.com) working to raise the ranking of http://www.colourwash.co.uk, and we are finding the best way to drive rank is inbound links. The sitemap is really only for Google, and even then it's low priority.

                        Matthew
                        http://www.jamcomm.com
                        -----------------------------------------------
                        Matthew Brown
                        Jam Communications Ltd
                        http://www.jamcomm.com

                          #13
                          The sitemap is really only for Google, and even then it's low priority
                          I would disagree with this statement. A sitemap works for all search engines that use spiders to gather information about a site, not just Google. A sitemap is vital for any site, but particularly for sites with deep pages (i.e. more than four levels down), as spiders only follow links so far into a site. A sitemap gives them access to every page on the site one link from the home page.
