Can Google find dynamically generated single pages?

Hello,

I’ve got a blog single page. It has two views: an overview page and a detail page that is rendered if an ID is given in the URL. Pretty standard stuff.

Recently I noticed that these detail pages can’t be found on Google, probably because technically it’s all one page.

Is it possible to index these dynamically generated detail pages so that Google can find them?

Are links with the ID actually generated somewhere on the site (for example on the overview page)? If they exist, Googlebot should reach and crawl those subpages like normal pages.

Do those subpages have a unique title/description and unique content, so they are not considered duplicate content?

Check the source of the pages to make sure they don’t have a noindex meta tag by accident.
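This is the tag to look for in the detail page’s <head>; if it is there, Google will skip indexing the page even when it can crawl it:

<meta name="robots" content="noindex">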

Check robots.txt to make sure the subpages of that specific single page are not blocked.
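In robots.txt, a rule along these lines would block Googlebot from the detail URLs entirely (the /blog/ path is only an example, use whatever your detail URLs actually start with):

User-agent: *
Disallow: /blog/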

I believe only the main single page will appear in sitemap.xml, so you would need to create/extend a custom sitemap.xml if you want to include those “virtual” subpages.

We have added these to the sitemap.xml, which seems to help with getting them indexed. There is an “on_sitemap_xml_ready” event that you can hook into; loop over your posts and add each one like this:

use \Concrete\Core\Support\Facade\Events;

Events::addListener('on_sitemap_xml_ready', function ($event) {
    $app = \Concrete\Core\Support\Facade\Application::getFacadeApplication();
    $config = $app->make('config');
    $xmlDoc = $event->getSubject()['xmlDoc'];

    // get your blog list into $results here, however you do that

    $lastmod = new \DateTime();

    foreach($results as $result) {
        $xmlNode = $xmlDoc->addChild('url');
        $xmlNode->addChild('loc', $result->getPageLink());
        $xmlNode->addChild('lastmod', $lastmod->format("Y-m-d\TH:i:sP"));
        $xmlNode->addChild('changefreq', $config->get('concrete.sitemap_xml.frequency'));
        $xmlNode->addChild('priority', $config->get('concrete.sitemap_xml.priority'));
    }
});
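For what it’s worth, we register this listener in application/bootstrap/app.php, which is where concrete5 picks up custom event listeners; as far as I know a package’s on_start() works just as well.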

Hello,

I’ve modified the code like so:

$blogparent = Page::getByID(211);

$blogparentData = $db->fetchAll('SELECT Blog.*, Blog_nl.*, Blog.id AS urlID
    FROM Blog
    INNER JOIN Blog_nl ON Blog.id_nl = Blog_nl.id
    WHERE active = 1
    ORDER BY date DESC');

Events::addListener('on_sitemap_xml_ready', function ($event) {
    $app = \Concrete\Core\Support\Facade\Application::getFacadeApplication();
    $config = $app->make('config');
    $xmlDoc = $event->getSubject()['xmlDoc'];

    $lastmod = new \DateTime();

    foreach ($blogparentData as $page) {
        $xmlNode = $xmlDoc->addChild('url');
        $xmlNode->addChild('loc', $blogparent->getPageLink().'/'.$page['urlID'].'/'.$page['url']);
        $xmlNode->addChild('lastmod', $lastmod->format("Y-m-d\TH:i:sP"));
        $xmlNode->addChild('changefreq', $config->get('concrete.sitemap_xml.frequency'));
        $xmlNode->addChild('priority', $config->get('concrete.sitemap_xml.priority'));
    }
});

So basically I get a list of all the posts and then (I think) add the parent URL + child URL to the sitemap for each one. However, when I run it I get the error “Invalid argument supplied for foreach()”.
$blogparentData is not empty.

My colleague can’t figure it out either, since he hasn’t done this before. Could you point me in the right direction?

Kind regards

You have to do the query inside the function, otherwise $blogparentData doesn’t exist when the listener runs: a PHP closure doesn’t see variables from the surrounding scope unless you pass them in with use (...).
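A minimal sketch of that change, untested, keeping your hard-coded page ID, table names and getPageLink() call exactly as they are; I’m pulling $db out of the container here, adjust that if you get your connection some other way:

use Concrete\Core\Page\Page;
use Concrete\Core\Support\Facade\Application;
use Concrete\Core\Support\Facade\Events;

Events::addListener('on_sitemap_xml_ready', function ($event) {
    $app = Application::getFacadeApplication();
    $config = $app->make('config');
    $db = $app->make('database')->connection();
    $xmlDoc = $event->getSubject()['xmlDoc'];

    // Look up the parent page and the posts inside the closure,
    // so both variables exist in this scope when the event fires.
    $blogparent = Page::getByID(211);
    $blogparentData = $db->fetchAll('SELECT Blog.*, Blog_nl.*, Blog.id AS urlID
        FROM Blog
        INNER JOIN Blog_nl ON Blog.id_nl = Blog_nl.id
        WHERE active = 1
        ORDER BY date DESC');

    $lastmod = new \DateTime();

    foreach ($blogparentData as $page) {
        $xmlNode = $xmlDoc->addChild('url');
        $xmlNode->addChild('loc', $blogparent->getPageLink().'/'.$page['urlID'].'/'.$page['url']);
        $xmlNode->addChild('lastmod', $lastmod->format("Y-m-d\TH:i:sP"));
        $xmlNode->addChild('changefreq', $config->get('concrete.sitemap_xml.frequency'));
        $xmlNode->addChild('priority', $config->get('concrete.sitemap_xml.priority'));
    }
});

Alternatively you can keep the query where it is and pull the variables in with function ($event) use ($blogparent, $blogparentData) { ... }; either way the foreach then gets a real array.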

Thank you. We got it to work. The sitemap was immediately submitted to Google and now we’re waiting for it to index all the pages.