How to Add a Sitemap to Anchor CMS

So you have an Anchor CMS site and you want to improve your search engine indexing and rankings? Look no further. This quick tutorial will have you adding a sitemap in no time at all!

Why a Sitemap in the First Place?

Good question. Google and other search engines crawl our sites regularly. They do this by following links on pages. When they crawl a page, they make note of all the links and add them to the list of pages to crawl. But sometimes there are no links to our site.

Learn about Sitemaps from Google Webmaster Tools…

Especially when we first launch a new website, nobody is linking to it. Even when links begin to pop up, it’s up to search engines to find those links and follow them. Instead of waiting for search engines to figure it out, we can give them a sitemap which includes all of the pages we want them to index.

Which Type of Sitemap Do We Want?

Traditionally, sitemaps were pages on a site, styled like any other page. Think of a standard content page and replace everything between the header and footer with a nested list of pages and posts on the site. The list items were usually page titles and linked to the page they represented.

The traditional version works, but it's next to useless to our visitors, and it's not the most efficient way for search engines to crawl and index our site. If your site isn't easy for your visitors to navigate, you have IA/UX work to do; a sitemap won't help. That leads us to the XML sitemap.

The following tutorial will show you how to add an XML sitemap to your Anchor-powered site. Once set up, the sitemap will be generated on the fly like any other page, so you never have to update it manually, but it will output in a simple, machine-readable format.

Why Do We Have to Do This?

Anchor doesn’t come with a sitemap out of the box. I’m sure as soon as version 1.0 gets released, and plugins are finally a reality, someone will make a sweet sitemap plugin for all of us. Until then, we need to add the functionality ourselves.

Fortunately, it’s a piece of cake.


Open up your Anchor site in your favorite editor. We’ll need to edit one file. That’s right, just one file needs to be edited to get a sitemap working on our site. How sweet is that?!

The file is located here: [root directory]/anchor/routes/site.php. Be careful: we don't want to mess with anything else in this file or our entire site might break. I'd recommend adding the following code at the very bottom of the file.

// Sitemap
Route::get('sitemap.xml', function() {
    $sitemap  = '<?xml version="1.0" encoding="UTF-8"?>';
    $sitemap .= '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">';

    // Pages
    $query = Page::where('status', '=', 'published');
    foreach($query->get() as $page) {
        $sitemap .= '<url>';
        $sitemap .= '<loc>' . full_url() . $page->slug . '</loc>';
        $sitemap .= '<changefreq>weekly</changefreq>';
        $sitemap .= '</url>';
    }

    // Posts
    $query = Post::where('status', '=', 'published')->sort(Base::table('posts.created'), 'desc');
    foreach($query->get() as $article) {
        $sitemap .= '<url>';
        $sitemap .= '<loc>' . Uri::full(Registry::get('posts_page')->slug . '/' . $article->slug) . '</loc>';
        $sitemap .= '<lastmod>' . date("Y-m-d", strtotime($article->created)) . '</lastmod>';
        $sitemap .= '</url>';
    }

    $sitemap .= '</urlset>';

    return Response::create($sitemap, 200, array('content-type' => 'application/xml'));
});

That’s it. Save that file and push it live. You now have a sitemap!

You can see, based on the comments, that we’re adding a new route for sitemap.xml, and adding pages and posts as entries to an XML structure. There are a few things we can change if we want to. We’ll see how in the next section, but you could be done here and your sitemap would be just fine.
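For reference, once the route is in place, visiting /sitemap.xml should return XML shaped roughly like this. The domain, slugs, and date below are placeholders, and the real output is one continuous string without the pretty-printed line breaks:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
    <url>
        <loc>http://yoursite.com/about</loc>
        <changefreq>weekly</changefreq>
    </url>
    <url>
        <loc>http://yoursite.com/blog/my-first-post</loc>
        <lastmod>2015-01-15</lastmod>
    </url>
</urlset>
```

Search engines don't care about the whitespace, so there's no need to pretty-print the real thing.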

The Next Steps

Improve Our Sitemap

Let’s improve our sitemap by modifying just a few things. First let’s add specific rules to the entries for our homepage and blog page. Since we previously marked our <changefreq> as weekly for pages, search engines may not look at our home page or blog as frequently as we might like.

Since these pages are generally updated more frequently, we want to make note of that. We also want to increase their priority. The default priority is 0.5, so we only need to specify it when it differs from that. For more info on the sitemap protocol and the available options, see the official documentation at sitemaps.org.

Note: If you’re using the homepage as your posts page, don’t include the // Blog page section of the following code.

// Home page
$sitemap .= '<url>';
$sitemap .= '<loc>' . full_url() . '</loc>';
$sitemap .= '<changefreq>daily</changefreq>';
$sitemap .= '<priority>0.9</priority>';
$sitemap .= '</url>';

// Blog page
$sitemap .= '<url>';
$sitemap .= '<loc>' . Uri::full(Registry::get('posts_page')->slug) . '</loc>';
$sitemap .= '<changefreq>daily</changefreq>';
$sitemap .= '<priority>0.8</priority>';
$sitemap .= '</url>';

Place that code just above the // Pages section. We want the homepage and blog page to show up at the top of our list, as they're the most important pages. You could also place the blog entry below the other pages, but it doesn't make much difference.

Now that we’ve included the home and blog pages manually, they will show up in our sitemap twice since the // Pages section will list them as regular pages too. So let’s make sure that doesn’t happen.

Replace the // Pages section with the following code:

    // Pages (except home & blog)
    $query = Page::where('status', '=', 'published')->where('slug', '!=', 'home')->where('slug', '!=', 'blog');
    foreach($query->get() as $page) {
        $sitemap .= '<url>';
        $sitemap .= '<loc>' . full_url() . $page->slug . '</loc>';
        $sitemap .= '<changefreq>weekly</changefreq>';
        $sitemap .= '</url>';
    }
All that’s really changed is the comment and the $query string. We’ve asked it to include all published pages that don’t have a slug of home or blog. This is where you’ll need to make changes if your slugs are different. Even if your homepage doesn’t show its slug, check to see what it is, because every page has a slug associated with it.

Submit Our Sitemap to Google

We have a sitemap! Yay. Now how will any search engines know about it?

Well we could add a link on our site, but that would be useless to everyone but search engines and even then, we’d have to wait for them to find it. A better option is to manually submit the sitemap URL to Google Webmaster Tools and Bing Webmaster Tools.

Here’s a quick step-by-step on how to do that:
Submit your sitemap to search engines

What about other search engines, and what if we don't want to set up accounts for webmaster tools?…

Robots.txt to the Automatic Rescue!

So you think there must be a better way in 2015 to have your sitemap be visible to any and all search engines that may benefit from it? You are a progressive thinker and a fantastic person.

This option has actually been around forever, but we can add a robots.txt file to our website’s root folder and specify where our sitemap (or sitemaps) lives. Create a text file with the following:

User-agent: *
Disallow:
Sitemap: http://yoursite.com/sitemap.xml

User-agent: * means the following lines are directed toward every robot that crawls this site. Disallow: left empty means we don't want to block crawling / indexing of any parts of our site; you can block specific directories or individual pages with lines like Disallow: /admin/. Finally, Sitemap: is pretty obviously sharing our super, awesome sitemap with the world.

Also obviously, you need to change the Sitemap: line to the correct URL for your site…

Uploading this file to our root folder means that the next time a search engine starts crawling our site, it will check for a robots.txt file, and follow our instructions. Now our sitemap is visible to search engines and we are done!

See my robots.txt file for an example.


The original code example I found came from @lsmoura. You can find it on GitHub, but I made a lot of improvements to it, so I highly recommend my version.

How Did I Do?

Was this tutorial helpful to you? Do you have any questions or problems with the code? Is there anything you’d do differently?

Let me know in the comments below. Thanks!