ClientProject #1654
openCR: Implement Multiple Sitemaps and Optimize Robots File
Start date:
01/13/2025
Due date:
01/14/2025
% Done:
100%
Estimated time:
8:00 h
Spent time:
Description
Objective
Create separate sitemaps for static and dynamic pages, remove unused URLs from the robots file, and ensure proper indexing for optimal SEO performance.
Task Checklist

Sitemap Creation

- Static Sitemap:
  - Generate a static sitemap for all static pages.
  - Ensure the sitemap is accessible at https://www.pqsjapan.jp/sitemap_.xml.
- Dynamic Sitemap:
  - Generate a dynamic sitemap for products and stock.
  - Ensure the sitemap follows a segmented structure (e.g., https://www.pqsjapan.jp/all-stock/sitemap/0.xml).
- Main Sitemap Index:
  - Create a sitemap index file listing all individual sitemaps (static and dynamic).
  - Ensure accessibility at https://www.pqsjapan.jp/sitemap_index.xml.
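Since the robots file is named `robots.ts`, the project appears to use the Next.js App Router, where a segmented dynamic sitemap can be generated with `generateSitemaps`; Next.js then serves each segment at `/<route>/sitemap/<id>.xml`, matching the structure above. A minimal sketch under that assumption (the segment size and the data-access helpers `countStockItems`/`fetchStockSlugs` are hypothetical, not from the ticket):

```typescript
// app/all-stock/sitemap.ts — Next.js App Router convention (assumed)
const BASE_URL = "https://www.pqsjapan.jp";
const SEGMENT_SIZE = 1000; // hypothetical number of URLs per sitemap segment

// Tell Next.js which segment ids exist; each is served at
// /all-stock/sitemap/<id>.xml.
export async function generateSitemaps(): Promise<{ id: number }[]> {
  const totalItems = await countStockItems(); // hypothetical data accessor
  const segments = Math.ceil(totalItems / SEGMENT_SIZE);
  return Array.from({ length: segments }, (_, id) => ({ id }));
}

// Build one segment's worth of URL entries.
export default async function sitemap({ id }: { id: number }) {
  const slugs = await fetchStockSlugs(id * SEGMENT_SIZE, SEGMENT_SIZE);
  return slugs.map((slug) => ({
    url: `${BASE_URL}/all-stock/${slug}`,
    lastModified: new Date(),
  }));
}

// --- hypothetical data layer, stubbed for illustration ---
async function countStockItems(): Promise<number> {
  return 2500; // placeholder count
}
async function fetchStockSlugs(offset: number, limit: number): Promise<string[]> {
  return Array.from({ length: limit }, (_, i) => `item-${offset + i}`);
}
```

The static sitemap can be a plain `app/sitemap.ts` exporting a fixed list of page URLs in the same entry shape.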
Robots File Update

- Unused URLs:
  - Identify and remove any unused or redundant URLs from robots.ts based on the provided Google Sheet.
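In the Next.js App Router, `app/robots.ts` exports a function whose return value is serialized to `/robots.txt`, so pruning unused URLs means trimming the `disallow` array in that object. A minimal sketch (the local type aliases mirror the Next.js `MetadataRoute.Robots` shape, and the `disallow` paths are placeholders; the real list comes from the Google Sheet):

```typescript
// app/robots.ts — shape follows the Next.js MetadataRoute.Robots convention
type RobotsRule = { userAgent: string; allow?: string[]; disallow?: string[] };
type Robots = { rules: RobotsRule[]; sitemap: string[] };

export default function robots(): Robots {
  return {
    rules: [
      {
        userAgent: "*",
        allow: ["/"],
        // Keep only paths that still exist; stale entries removed per the sheet.
        disallow: ["/api/", "/admin/"], // placeholder paths, not from the ticket
      },
    ],
    // Point crawlers at the main index, which links every sitemap.
    sitemap: ["https://www.pqsjapan.jp/sitemap_index.xml"],
  };
}
```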
Expected Deliverables

- Sitemaps:
  - A static sitemap available at https://www.pqsjapan.jp/sitemap_.xml.
  - A segmented dynamic sitemap available at https://www.pqsjapan.jp/all-stock/sitemap/0.xml.
  - A main sitemap index file linking all sitemaps at https://www.pqsjapan.jp/sitemap_index.xml.
- Robots File:
  - A clean and optimized robots.ts file, free of unused URLs, for better crawling efficiency.
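Next.js does not emit a sitemap index automatically, so one way to deliver `/sitemap_index.xml` is a route handler that assembles the sitemaps.org index document by hand. A sketch under that assumption (the handler path and segment count are hypothetical):

```typescript
// app/sitemap_index.xml/route.ts — hypothetical route handler
const BASE_URL = "https://www.pqsjapan.jp";

// Build a <sitemapindex> document listing the static sitemap plus
// each dynamic segment, per the sitemaps.org index format.
export function buildSitemapIndex(segmentCount: number): string {
  const urls = [
    `${BASE_URL}/sitemap_.xml`,
    ...Array.from(
      { length: segmentCount },
      (_, i) => `${BASE_URL}/all-stock/sitemap/${i}.xml`,
    ),
  ];
  const entries = urls
    .map((u) => `  <sitemap><loc>${u}</loc></sitemap>`)
    .join("\n");
  return (
    `<?xml version="1.0" encoding="UTF-8"?>\n` +
    `<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n` +
    `${entries}\n</sitemapindex>`
  );
}

// Serve the index as XML (segment count is a placeholder here).
export function GET(): Response {
  return new Response(buildSitemapIndex(3), {
    headers: { "Content-Type": "application/xml" },
  });
}
```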