Make your website easier to find by simplifying search engine indexing.
Make a list of all URLs on your website that you want people to find via search engines and record the last-modified date of each.
URLs can include pages, posts, and categories, and should exclude pages with duplicate content, such as tag archives. Do not include URLs that are blocked by your robots.txt file, require a login to access, or are password-protected.
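One way to honor the robots.txt exclusion when building your URL list is to filter it programmatically. A minimal sketch using Python's standard-library `urllib.robotparser`; the robots.txt rules and URLs below are illustrative assumptions, not taken from any real site:

```python
# Sketch: drop URLs that robots.txt disallows before adding them to the sitemap.
# The robots.txt content and URL list are hypothetical examples.
from urllib import robotparser

robots_txt = """\
User-agent: *
Disallow: /private/
"""

rp = robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

candidate_urls = [
    "https://example.com/",
    "https://example.com/private/account",
]

# Keep only URLs that any crawler ("*") is allowed to fetch.
indexable = [u for u in candidate_urls if rp.can_fetch("*", u)]
print(indexable)
```

In practice you would point the parser at your live robots.txt with `set_url()` and `read()` instead of parsing an inline string.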
Decide if you will create one sitemap file or many sitemap files based on the number of URLs you have.
If you have more than 50,000 URLs, break your sitemap into multiple sitemap files. A common practice is to create separate sitemap files for pages, posts, and categories.
Code all your URLs in XML tags to create a sitemap.
Your XML code should look like this:
```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2020-09-17</lastmod>
    <changefreq>daily</changefreq>
    <priority>0.9</priority>
  </url>
</urlset>
```
Including the URL in `<loc>` tags is mandatory. [You can also include](https://www.sitemaps.org/protocol.html):

- The last modified date
- How often the page is modified
- How important the page is to your website as a whole

1. Save the file as ***page_sitemap.xml***.
2. If your sitemap file exceeds 50 MB or has more than 50,000 URLs, break it into multiple sitemaps. Save them as ***page_sitemap1.xml*** and so on.

## Create similar sitemaps for your posts and categories.

## Create a sitemap index file, include the links to the location you'll upload the sitemaps to, and save the file as sitemap_index.xml.

Your sitemap index file should be in this format:
```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://example.com/sitemap1.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://example.com/sitemap2.xml</loc>
  </sitemap>
</sitemapindex>
```

Replace example.com/sitemap1.xml and example.com/sitemap2.xml with the correct URLs to your sitemaps.
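If you prefer to generate the sitemap file rather than hand-code it, the structure above can be built with Python's standard-library `xml.etree.ElementTree`. A minimal sketch; the page entries are illustrative assumptions:

```python
# Sketch: build a minimal <urlset> sitemap from (URL, last-modified) pairs.
# The entries in `pages` are hypothetical examples.
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
pages = [("https://example.com/", "2020-09-17")]

urlset = ET.Element("urlset", xmlns=NS)
for loc, lastmod in pages:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = loc
    ET.SubElement(url, "lastmod").text = lastmod

# Serialize with an XML declaration, ready to save as page_sitemap.xml.
xml_bytes = ET.tostring(urlset, encoding="utf-8", xml_declaration=True)
print(xml_bytes.decode("utf-8"))
```

The same approach works for a sitemap index file: swap `urlset`/`url` for `sitemapindex`/`sitemap` elements.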
Alternatively, download Screaming Frog SEO Spider, install it on your computer, and run a crawl of your website.
- Click on Sitemaps to generate an XML sitemap.
- Tick PDFs but leave other checkboxes unchecked.
- On the Last Modified tab, select the Include tag option. Leave the other options as Screaming Frog sets them automatically.
- Click Next and save the sitemap as sitemap.xml on your computer.
Last edited by @hesh_fekry 2023-11-14T12:35:17Z