My client, who has been running an Actinic site for six months, is disappointed with the lack of sales and has asked me to investigate. In doing so I have discovered that there may be an indexing problem, despite the site having been submitted to a number of search engines, a Google sitemap being in place, and so on. One possible cause is the presence of different versions of the same files following an upload, with the newer ones included in the navigation system but not the older ones. For instance, there is Online_Catalogue_301_119.html and Online_Catalogue_301_35.html; the latter is from an earlier upload and the former is more recent.
We have also noticed that during an upload some files appear to be reordered, which raises the question: how can there be a stable set of pages to index, and what happens to the old, orphaned files?
My client is uploading rather than refreshing, so I would be grateful to learn what we are doing wrong.
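One quick way to gauge how bad the orphan problem is would be to compare the pages the Google sitemap actually lists against the catalogue pages sitting on the server. The sketch below is only illustrative: it assumes a local copy of the uploaded site, a standard sitemap.xml in the site root, and the Online_Catalogue_*.html naming seen above, so adjust the paths to suit.

import glob
import os
import xml.etree.ElementTree as ET

SITE_ROOT = "/path/to/local/site/copy"            # assumed location of a local copy of the uploaded site
SITEMAP = os.path.join(SITE_ROOT, "sitemap.xml")  # assumed sitemap file name and location

# Page names that the sitemap actually lists
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
listed = {loc.text.rsplit("/", 1)[-1]
          for loc in ET.parse(SITEMAP).findall(".//sm:loc", ns)}

# Catalogue pages physically present on the server but no longer in the sitemap
for path in sorted(glob.glob(os.path.join(SITE_ROOT, "Online_Catalogue_*.html"))):
    name = os.path.basename(path)
    if name not in listed:
        print("possible orphan (not in sitemap):", name)

Anything this flags is a candidate for the stale, un-navigable pages left behind by earlier uploads.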
In analysing the log files, I discovered that the view-cart Perl script had been hit 14 times per month on average, yet this has not translated into any sales. My next question is therefore whether those hits could have come from a robot rather than a user, and whether this file and others should be excluded in the robots.txt file.
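On the robot question, the raw access log should settle it: in the common Apache "combined" format the user agent is the last quoted field of each line, so listing the agents behind the view-cart hits will show whether they were shoppers or crawlers such as Googlebot. A rough sketch, with the log file name and the cart script path both assumptions to be replaced with the real ones:

import re
from collections import Counter

LOG_FILE = "access.log"                          # assumed log file name
CART_SCRIPT = "/cgi-bin/your_view_cart_script.pl"  # hypothetical path -- substitute the real view-cart script

agents = Counter()
with open(LOG_FILE, encoding="utf-8", errors="replace") as log:
    for line in log:
        if CART_SCRIPT in line:
            # In the Apache combined format the user agent is the last quoted field
            quoted = re.findall(r'"([^"]*)"', line)
            agents[quoted[-1] if quoted else "unknown"] += 1

for agent, hits in agents.most_common():
    print(hits, agent)

If the hits do turn out to be crawlers, the usual step is to keep the shopping scripts out of the crawl via robots.txt (note the file must be named robots.txt, not robot.txt, and sit in the web root). Assuming the Actinic Perl scripts live under /cgi-bin/, a minimal example would be:

User-agent: *
Disallow: /cgi-bin/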