Dec 8, 2008
In search engine optimization it is very important to have the search engines crawling your website regularly so they pick up new information. There are several ways to get the search engines to keep coming back to your website.
The best way by far is to keep adding content that is valuable to readers and useful to many different people. As a website adds content, the search engines love to come back and eat it up. Blogs are the best route, since a blog can ping the pinging directories, and the search engines are constantly watching those directories.
A WordPress blog can publish a new post, ping the pinging directories, and within minutes the search engines will crawl and index that page. In WordPress you can set which ping directories you wish to notify, and as soon as you publish your content it will automatically tell those directories that you have new content, letting the internet world know to come visit your blog for the new information.
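Under the hood, that notification is just an XML-RPC call to the ping service using the weblogUpdates.ping method. Here is a minimal sketch of what the ping looks like in Python, assuming Ping-O-Matic (WordPress's default ping service) as the endpoint; the blog name and URL are placeholders:

```python
import xmlrpc.client

# Ping-O-Matic is the ping service WordPress notifies out of the box.
# Swap in your own blog's name and URL (these are placeholders).
PING_SERVICE = "http://rpc.pingomatic.com/"
BLOG_NAME = "My Example Blog"
BLOG_URL = "http://www.example.com/"

server = xmlrpc.client.ServerProxy(PING_SERVICE)

# weblogUpdates.ping takes the blog's name and URL; the service then
# fans the "new content here" notice out to the directories watching it.
response = server.weblogUpdates.ping(BLOG_NAME, BLOG_URL)

# A typical reply is a struct along the lines of:
# {'flerror': False, 'message': 'Thanks for the ping.'}
print(response)
```

WordPress fires this same kind of ping for you on every publish, so you only need something like this if you are notifying services from outside WordPress.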
Getting high quality links on a consistent basis is extremely valuable for getting the search engines to visit your website often. Using social bookmarking sites that carry great authority in the eyes of the search engines is an excellent way to have the search engines find your website and come back to crawl it.
In Google Webmaster Tools you can set how you want Googlebot to crawl your site, choosing a slower, normal, or faster crawl rate. Webmaster Tools also has a feature that shows how often Googlebot visits your site and how many pages it crawls on average.
The ultimate goal is to have the search engines on your website constantly. Google is the number one search engine, and it is the one I would focus on keeping Googlebot busy with. There is also a place in Webmaster Tools where you can see information about your website, like how many pages are in the index, how many links point to it, related pages, and other information Google has about your site.
One of the items listed is the cache of the website. Clicking it shows what the website looked like the last time Googlebot came to crawl it. In the search engine results pages there is a link next to the domain of each listing that says CACHED; clicking that link shows the cached version of the site along with the date it was cached.
After I write and publish this post, it will ping the directories and invite Googlebot to come crawl and index the page.
You want the cache date of a website to be within the last 15 days. Look at the date shown on the cached page and count back: if your website has not been crawled within 15 days, you will want to add more content and get more links for Googlebot to see and follow back to your website.
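If you would rather not count the days by hand, the check is simple date arithmetic. Here is a small sketch in Python; the cache date shown is a made-up example, and in practice you would read it off the CACHED page for your own site:

```python
from datetime import date, timedelta

# Hypothetical cache date, read off the CACHED page for your site.
cache_date = date(2008, 11, 20)

# Googlebot should have been by within the last 15 days.
freshness_window = timedelta(days=15)
age = date.today() - cache_date

if age <= freshness_window:
    print(f"Cache is {age.days} days old: Googlebot is visiting often enough.")
else:
    print(f"Cache is {age.days} days old: add content and links to draw Googlebot back.")
```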
The big authority websites have Googlebot on them constantly because there is information changing on a constant basis. If you want to look like an authority website and boost your SEO efforts, you will want to make it so Googlebot and the other search engine crawlers are constantly on your website.
So make sure your cache dates are newer than 15 days.