Are you wondering how to create an effective sitemap for inclusion in search engines like Google and Yahoo? Would you like a simple piece of software that does the work for you with minimal knowledge and effort? If you're looking for a good tool to generate sitemaps, this is the best one I've found, and it's totally free. The tool is really simple to use and works even for massive sites.
Here are a couple of comments I found on the site.
Greetings from Google’s European development hub here in Zurich. We came across your site, and your Windows client for Sitemaps is quite cool and very Googley.
– google.com (8/14/2006)
I’ve tried lots of online, and downloaded, generators for google sitemap files and hadn’t found anything very good until ……. I found GSiteCrawler! GSiteCrawler beats all other generators, hands down! There isn’t a paid for generator anything like as good as GSiteCrawler. It’s the best! I’m so pleased with it after 2 days that I’ve made my donation. Thank you for a superb program.
– Malcolm (8/5/2007)
The tool can generate:

- a Google Sitemap file in XML format (of course :-)) – with or without the optional attributes like “change date”, “priority” or “change frequency”
- a text URL listing for other programs (or for use as a UrlList for Yahoo!)
- a simple RSS feed
- Excel / CSV files with URLs, settings and attributes like title, description and keywords
- a Google Base Bulk-Import file
- a ROR (Resources of Resources) XML file
- a static HTML sitemap file (with relative or absolute paths)
- a new robots.txt file based on your chosen filters
- … or almost any other type of file you want – the export function uses a user-adjustable, text-based template system

For more insight into your site, it also generates:
- a general site overview with the number of URLs (total, crawlable, still in the queue), the oldest URLs, etc.
- a listing of all broken URLs linked on your site (or otherwise inaccessible URLs found during the crawl)
- an overview of your site’s speed, with the largest pages, the slowest pages by total download time or download speed (unusually server-intensive pages), and the pages with the most processing time (many links)
- an overview of URLs leading to “duplicate content” – with the option of automatically excluding those pages from the Google Sitemap file

Additionally …
It can run on just about any Windows version from Windows 95 and up (tested on Windows Vista beta 1 and all server versions).
It can use local MS-Access databases, which can be re-used with other tools.
It can also use SQL Server or MSDE databases for larger sites (this requires a separate installation file).
It can be run in a network environment, splitting crawlers over multiple computers – sharing the same database (for both Access and SQL-Server).
It can be run automated, either locally on the server or on a remote workstation with automatic FTP upload of the sitemap file.
It tests for and recognizes non-standard file-not-found pages (without HTTP result code 404).
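That last check is typically done by fetching a deliberately nonsense URL first and remembering what the server sends back; any later page that answers 200 OK with that same body is treated as a “soft” not-found page. Here is a minimal sketch of that idea in Python – an illustration of the general technique, not GSiteCrawler's actual code:

```python
def is_soft_404(status_code, body, error_fingerprint):
    """Heuristic soft-404 check: the server answered 200 OK, but the body
    matches the page it serves for a deliberately bogus (random) URL.
    `error_fingerprint` is the body fetched for that bogus URL beforehand."""
    if status_code != 200:
        return False  # real HTTP errors are already recognizable by their code
    # Normalize whitespace so trivial formatting differences don't hide a match
    return " ".join(body.split()) == " ".join(error_fingerprint.split())

# Example with a made-up error page body
fingerprint = "<html>Sorry, we could not find that page.</html>"
print(is_soft_404(200, "<html>Sorry, we   could not find that page.</html>", fingerprint))  # True
print(is_soft_404(200, "<html>Welcome to the shop!</html>", fingerprint))  # False
```

A real crawler would fetch the fingerprint once per site and then compare every crawled page against it, so pages that “succeed” with an error message don't end up in the sitemap.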
Why not download it and check it out for yourself?
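If you're curious what the main output looks like: the Google Sitemap file mentioned at the top of the feature list follows the public Sitemap protocol – an XML `urlset` of URLs with optional `lastmod`, `changefreq` and `priority` fields. As a rough illustration (the URLs and values below are made-up placeholders, and this is not the tool's own code), such a file can be built with a few lines of Python:

```python
from xml.etree import ElementTree as ET

# Namespace defined by the Sitemap protocol (sitemaps.org)
NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(entries):
    """Build a sitemap XML string from (url, lastmod, changefreq, priority)
    tuples; the last three attributes are optional and may be None."""
    urlset = ET.Element("urlset", xmlns=NS)
    for loc, lastmod, changefreq, priority in entries:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        if lastmod:
            ET.SubElement(url, "lastmod").text = lastmod
        if changefreq:
            ET.SubElement(url, "changefreq").text = changefreq
        if priority is not None:
            ET.SubElement(url, "priority").text = str(priority)
    return ET.tostring(urlset, encoding="unicode")

# Example with placeholder URLs
print(build_sitemap([
    ("http://www.example.com/", "2006-08-14", "daily", 1.0),
    ("http://www.example.com/about.html", None, None, None),
]))
```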