This tool finds the location of a site's sitemap.xml and checks whether the URLs listed in it are working properly.
It reports the structure and location of the sitemap.xml file and retrieves the HTTP response code for every URL it contains.
After you enter your webpage URL (preferably the domain root), the tool fetches "robots.txt" and extracts the sitemap path from it.
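A minimal sketch of this lookup step, assuming the `requests` library is available; the function name `find_sitemap_url` is illustrative and not part of the tool itself:

```python
from urllib.parse import urljoin
import requests  # assumed HTTP client

def find_sitemap_url(domain_url: str) -> str | None:
    """Fetch robots.txt from the domain and return the first Sitemap: entry, if any."""
    robots_url = urljoin(domain_url, "/robots.txt")
    response = requests.get(robots_url, timeout=10)
    response.raise_for_status()
    for line in response.text.splitlines():
        # A robots.txt sitemap directive looks like: "Sitemap: https://mydomain.com/sitemap.xml"
        if line.lower().startswith("sitemap:"):
            return line.split(":", 1)[1].strip()
    return None

# Example: find_sitemap_url("https://mydomain.com") -> "https://mydomain.com/sitemap.xml"
```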
After the sitemap has been analyzed and every URL found in it has been visited, the results are displayed in two simple tables: one with basic information about the sitemap and one listing every URL together with its HTTP response code.
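The URL-checking step could look roughly like the sketch below, assuming the standard `<urlset>/<url>/<loc>` sitemap structure and the `requests` library; `check_sitemap_urls` is a hypothetical name, not the tool's actual code:

```python
import xml.etree.ElementTree as ET
import requests

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def check_sitemap_urls(sitemap_url: str) -> list[tuple[str, int]]:
    """Download the sitemap, visit every <loc> URL and collect its HTTP status code."""
    root = ET.fromstring(requests.get(sitemap_url, timeout=10).content)
    results = []
    for loc in root.findall(".//sm:loc", NS):
        url = loc.text.strip()
        status = requests.head(url, allow_redirects=True, timeout=10).status_code
        results.append((url, status))
    return results

# Simple table output: one row per URL with its response code.
for url, status in check_sitemap_urls("https://mydomain.com/sitemap.xml"):
    print(f"{status}\t{url}")
```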
An XML file that lists the URLs (pages, files or videos) a site wants search engines to index. Its location is usually specified in robots.txt with a line such as: "Sitemap: https://mydomain.com/sitemap.xml", and the file itself normally sits in the root folder of the site.
This file can also carry important metadata about each page, such as when the page was last updated and which alternate language versions of it are available.
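For illustration, here is a hypothetical minimal sitemap (the URLs and date are made up) showing both kinds of metadata, `<lastmod>` and `xhtml:link` language alternates, and how they can be read with Python's standard `xml.etree.ElementTree`:

```python
import xml.etree.ElementTree as ET

SAMPLE_SITEMAP = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>https://mydomain.com/page</loc>
    <lastmod>2023-05-01</lastmod>
    <xhtml:link rel="alternate" hreflang="de" href="https://mydomain.com/de/page"/>
  </url>
</urlset>"""

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9",
      "xhtml": "http://www.w3.org/1999/xhtml"}

root = ET.fromstring(SAMPLE_SITEMAP)
for url in root.findall("sm:url", NS):
    loc = url.findtext("sm:loc", namespaces=NS)
    lastmod = url.findtext("sm:lastmod", namespaces=NS)
    # hreflang attributes name the alternate language versions of the page
    alternates = [link.get("hreflang") for link in url.findall("xhtml:link", NS)]
    print(loc, lastmod, alternates)
```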