Concrete CMS Versions 10, 11... AI Evolution

We’ve been building concrete5/Concrete CMS websites for 15 years and have seen the platform mature into the stable and comprehensive application it is today.

The rise of AI search engines (ChatGPT, Perplexity, Google’s AI Overviews, etc.) is set to fundamentally change how people discover and consume web content. We’ve been asking ourselves: how will the websites we build need to evolve to respond to this shift?

The New Requirements for Web Visibility

Visibility in AI search requires comprehensive, authoritative content with rich context and semantic meaning. The next generation of websites will need to “speak” to AI in new ways:

  • Structured data is becoming critical - not just basic schema, but comprehensive entity relationships
  • Content authority and originality matter more than traditional SEO metrics
  • Semantic context is replacing keyword optimization as the primary ranking factor

Delivering robust structured data and supporting intelligent content optimization are likely to become key differentiators for any content management system.
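As one illustration of what “comprehensive entity relationships” might look like in practice, here is a JSON-LD block of the kind a page template could emit; the types and values are placeholders rather than a recommended schema.

```php
<?php
// Minimal sketch: emit schema.org JSON-LD that states the article's author,
// publisher and subject entities explicitly, instead of leaving AI crawlers
// to infer them from markup. All names and dates below are placeholders.
$jsonLd = [
    '@context'      => 'https://schema.org',
    '@type'         => 'Article',
    'headline'      => 'Concrete CMS and the AI search shift',
    'author'        => ['@type' => 'Organization', 'name' => 'Example Agency'],
    'publisher'     => ['@type' => 'Organization', 'name' => 'Example Agency'],
    'about'         => [
        ['@type' => 'Thing', 'name' => 'AI search'],
        ['@type' => 'SoftwareApplication', 'name' => 'Concrete CMS'],
    ],
    'datePublished' => '2025-01-01',
];
echo '<script type="application/ld+json">'
    . json_encode($jsonLd, JSON_UNESCAPED_SLASHES)
    . '</script>';
```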

Beyond Search: AI-Enhanced User Engagement

Beyond search visibility, AI provides exciting new opportunities to help users find and engage with website content through personalized experiences, intelligent navigation, and contextual recommendations.

Questions for the Concrete CMS Community

  1. What features should Concrete CMS prioritize to help site owners succeed in an AI-dominated search landscape?
  2. How can we make structured data more accessible to content creators who aren’t technical experts?
  3. Should Concrete CMS include AI-powered content analysis tools that help evaluate content for AI search readiness?
  4. What AI-driven user engagement features would add the most value for your clients?

A Strategic Opportunity

This isn’t just about keeping up with trends - it’s about ensuring the sites we build continue to be discoverable and valuable as search evolves. The Concrete CMS Roadmap is rather brief just now – is this an opportunity to build out some exciting new concepts for future versions?

Concrete CMS is exceptionally well-positioned for this evolution. The comprehensive API, robust architecture, and minimal dependency on external plugins create ideal conditions for integrated AI tools that can work seamlessly with website content to enhance both search visibility and user engagement.

What Are Your Thoughts?

  • How do you see the changing landscape affecting the websites you build?
  • What features would be most valuable to you and your clients?
  • Are you already making changes to prepare for AI search?
  • What challenges are you anticipating?

Looking forward to a great discussion!


The main AI requirements I am seeing are:

  1. RAG (Retrieval-Augmented Generation) - where an AI uses site content to answer questions.

  2. AI-generated content - such as telling an AI to write a blog post, or via an agent where an AI automatically generates a new blog post every week.

  3. Chatbots, which if not built with RAG can lack site-specific insight and lead to frustrated visitors.

I generally advise clients against unrestrained use of (2). Even when accurate, it tends to produce vanilla content which is either boring or advertises their laziness. It also promotes model collapse.

The existing Concrete site search is completely inadequate. Top of my list would be for Concrete to skip way ahead and offer a fully integrated RAG (1): just add the AI API keys and the rest is automatic, so similar complexity to adding Google Maps. I have been kicking around some ideas for extending my Search++ add-on into a RAG engine, but haven’t got into detailed implementation.

The main obstacle is that MySQL doesn’t support a vector store, so RAG requires an external database and hence considerable expertise to configure beyond pasting an API key into a dashboard page. If Concrete could be ported to Postgres, that would open up many possibilities.
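For illustration, here is a minimal sketch of the RAG flow, assuming the public OpenAI embeddings endpoint and brute-forcing similarity in plain PHP; that brute-force loop is exactly the part an external vector database would replace at scale.

```php
<?php
// Toy RAG sketch: embed each page, embed the visitor's question, and rank
// pages by cosine similarity; the top matches would then be passed to an LLM
// as context for its answer. Model name and endpoint are assumptions based
// on the public OpenAI embeddings API.
$apiKey = getenv('OPENAI_API_KEY');
$pages  = [
    '/services' => 'Plain-text content of the services page...',
    '/contact'  => 'Plain-text content of the contact page...',
];

function embed(string $text, string $apiKey): array
{
    $ch = curl_init('https://api.openai.com/v1/embeddings');
    curl_setopt_array($ch, [
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_POST           => true,
        CURLOPT_HTTPHEADER     => [
            'Content-Type: application/json',
            'Authorization: Bearer ' . $apiKey,
        ],
        CURLOPT_POSTFIELDS => json_encode([
            'model' => 'text-embedding-3-small',
            'input' => $text,
        ]),
    ]);
    $response = json_decode(curl_exec($ch), true);
    curl_close($ch);
    return $response['data'][0]['embedding'];
}

function cosine(array $a, array $b): float
{
    $dot = $na = $nb = 0.0;
    foreach ($a as $i => $v) {
        $dot += $v * $b[$i];
        $na  += $v * $v;
        $nb  += $b[$i] * $b[$i];
    }
    return $dot / (sqrt($na) * sqrt($nb));
}

// Index once; in practice you would re-embed whenever a page changes.
$index = [];
foreach ($pages as $path => $content) {
    $index[$path] = embed($content, $apiKey);
}

// Query: rank pages against the question and keep the best three paths.
$question = embed('What services do you offer?', $apiKey);
uasort($index, fn ($a, $b) => cosine($question, $b) <=> cosine($question, $a));
$topPages = array_slice(array_keys($index), 0, 3);
```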

On my own site, I have been using AI to generate thumbnails to go with articles. Just eye candy rather than accurate components. It works particularly well when told to use Lego. With that in mind I have also been kicking around the idea of integrating an AI image generator into my Snapshot addon. But that would be more of a fun project than a widely appealing application.
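For what it’s worth, the wiring for that is small. Here is a rough sketch of a thumbnail request; the endpoint, model name and parameters are assumptions based on the public OpenAI Images API, and any generator with an HTTP API would slot in the same way.

```php
<?php
// Toy sketch: ask an image model for a Lego-style thumbnail and save it to
// disk. In an add-on this file would then be imported into the file manager.
$apiKey = getenv('OPENAI_API_KEY');
$prompt = 'A Lego diorama illustrating an article about web accessibility';

$ch = curl_init('https://api.openai.com/v1/images/generations');
curl_setopt_array($ch, [
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_POST           => true,
    CURLOPT_HTTPHEADER     => [
        'Content-Type: application/json',
        'Authorization: Bearer ' . $apiKey,
    ],
    CURLOPT_POSTFIELDS => json_encode([
        'model'           => 'dall-e-3',
        'prompt'          => $prompt,
        'size'            => '1024x1024',
        'response_format' => 'b64_json',
    ]),
]);
$result = json_decode(curl_exec($ch), true);
curl_close($ch);

file_put_contents('thumbnail.png', base64_decode($result['data'][0]['b64_json']));
```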

Of course, these all miss the premise that the nature of web sites will change, so adding AI enhancements to a conventional web site is only an interim measure.


I’m very interested in the ideas that may come out of this discussion, so please don’t take my post as some type of final organizational position - these are thoughts and thoughts often change. I really think we’re at a time where there’s more questions than clear answers for everyone.

I agree with Katalysis that AI changes everything. Not just how we create content, but how we discover it, structure it, and turn it into something useful. Like JohntheFish, I’m not impressed with “auto-generate my blog posts” gimmicks. But using AI as a collaborator? It’s been transformative. Brainstorming, outlining, reworking tone, cleaning up clarity - it’s like having a brutally efficient junior editor who never sleeps. Case in point: this post started out twice as long and half as clear. Thanks ChatGPT. :wink:

The metaphor I keep using is: take a set of power tools back to a furniture shop in the 1600s. A skilled craftsman would panic as they saw how fast I could cut wood. The reality is, we still use hand tools in shops today. Sure, fewer people are employed in “sanding” - but sanding is a shit job. Power tools didn’t kill furniture design. You still need taste, judgment, and vision to make anything worth keeping. AI is the same.

So no, I’m not racing to add a “generate content” button to Concrete CMS. That’s table stakes; CKEditor already offers one. Everyone’s doing it. I’d welcome marketplace submissions for it, but I’m not spending my time making it easier to publish more vanilla sludge. Great content is still a human-led process. AI just helps you get there faster.

Structured data and semantic context are very interesting. If there’s something more we can be doing in the core to make the content people are creating (sometimes with AI) easier for AI to consume, we should absolutely be moving on that. Katalysis is right that we have the ability to make deep changes pretty quickly for an open source project because the core delivers so much consistently. I can’t help but admit we JUST added open graph tags to the core in a recent version update; we should have done that ages ago. If there are new standards we can embrace in a similar vein of accurately describing and organizing content in ways that AI crawlers will like, we should be discussing that and doing it faster than we’ve moved in the past. I’d love to get some consensus from the community on how we can create better metadata so AI can gracefully spider Concrete sites.

Changing the fundamental database structure of Concrete because we’re interested in vector search seems like a big shift. It’s also not clear to me that the types of sites being built with Concrete really warrant the core database being rethought. Populating another RAG database with information about a Concrete site seems far more believable to me. We’ve certainly used 3rd party search indexes in the past because as John points out, the core search in Concrete is very very bare bones. It’s hard for me to see us delivering a search experience that rivals what people expect from hosted closed source solutions with billions of dollars of investment in them. Perhaps a DB shift to Postgres is easier than I’m imagining and delivers more immediate rewards than I understand today?

I see some low hanging fruit for the marketplace that I’m kinda shocked people haven’t built yet:

  • The file chooser already has a sidebar and architecture that lets you connect it to 3rd party sources; we use it to connect to BrandCentral (our stand-alone DAM on Concrete) for army sites. Connecting popular AI image generators there seems like a no-brainer.
  • The description & meta-description attributes have event handlers and there is a bulk SEO tool built into the dashboard. Someone building an add-on that pipes the content of a page to an LLM and returns a short description for the page in those places seems like something I’d buy (a rough sketch follows this list).
  • Translation is a cumbersome process; it feels like an LLM could do a semi-acceptable job of translating block-level content into a different language for someone tasked with turning a recently built-out page tree from English to Spanish or what-have-you.
  • Better integration with a 3rd party search service would be delightful. We spend a lot of time tinkering with Elasticsearch and we still struggle to deliver what clients have learned to expect from Google.
  • I’m sure there are plenty more creative ideas for AI-powered extensions that we’d happily review and approve if they were submitted to the marketplace.
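As a rough sketch of the description idea above: the event name, getPageObject() and setAttribute() calls reflect my understanding of how recent core versions behave and should be verified, and getPagePlainText() is a hypothetical helper the add-on would need to implement (e.g. render the page and strip tags).

```php
<?php
// Sketch: when a page version is approved, send its text to a chat model and
// store the reply in the meta_description attribute. Concrete wiring here is
// an assumption; the HTTP call follows the public OpenAI chat-completions API.
use Concrete\Core\Support\Facade\Events;

Events::addListener('on_page_version_approve', function ($event) {
    $page      = $event->getPageObject();  // assumed accessor on the page event
    $plainText = getPagePlainText($page);  // hypothetical helper

    $ch = curl_init('https://api.openai.com/v1/chat/completions');
    curl_setopt_array($ch, [
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_POST           => true,
        CURLOPT_HTTPHEADER     => [
            'Content-Type: application/json',
            'Authorization: Bearer ' . getenv('OPENAI_API_KEY'),
        ],
        CURLOPT_POSTFIELDS => json_encode([
            'model'    => 'gpt-4o-mini',
            'messages' => [
                ['role' => 'system', 'content' => 'Write a meta description of under 160 characters for this page.'],
                ['role' => 'user',   'content' => $plainText],
            ],
        ]),
    ]);
    $reply = json_decode(curl_exec($ch), true);
    curl_close($ch);

    $page->setAttribute('meta_description', trim($reply['choices'][0]['message']['content']));
});
```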

That all said, there’s an entirely different angle we’ve been thinking about here at PortlandLabs.

AI is helping us write faster. But it hasn’t yet solved my biggest bottleneck in digital work today: collecting and organizing the actual content.

A lifetime ago, a client would hand you a Zip drive with everything you needed for a project. Now? You’re scraping info from email threads, Slack DMs, text messages, random PDFs, and screenshots of Facebook posts. I spend half the time on a project scrolling through my downloads directory or searching email for the correct file. This problem is larger than a web project manager’s needs. If you’ve ever led volunteers, a parent group, or a team activity that took some organizing you know the pain of dozens of people trying to collaborate across different communication channels.

A lot of my thinking is about the mess before the writing even starts: what is the process for actual content collection, and why is it still so manual?

So we’ve been building Voxera. It’s a platform that joins your group’s discussions (email, text, Slack, wherever) and quietly pulls together all the assets, links, images, videos, and decisions. Then AI helps organize and summarize everything into something you can actually use.

This extracted content may eventually get built into stand-alone websites (and here’s where this matters to Concrete). We’ll be building:

  • Sites that summarize group activity across platforms

  • Resource hubs that reflect community conversations

  • Planning sites powered by what’s actually been said and shared

  • Informational sites that feel alive, not archived

Go take a look at the mockups under the various solutions pages we’ve thrown together.

Concrete is uniquely suited for this. Its structure, security, and extensibility give it a major edge over flashier-but-flakier tools. We’re building a deep Voxera integration with Concrete so that once you’ve gathered and refined your content, it’s one click to publish a site that delivers what you need.

None of those themes are built; all of the resulting sites are things we’d expect to run on the Concrete SaaS hosting we offer today.

Enterprise CRM platforms and some high-end marketing tools have been aggregating content from multiple communication channels for years. It’s proven to be important to big sales and marketing teams, but no one has stopped to take the same type of tooling and offer it to an individual who has to manage communications for other reasons. My hope is that Voxera can do for that space what Trello did for Kanban boards in 2011, and that the surge of organizers looking to build a website with the knowledge they’ve extracted from conversations will help give Concrete CMS the clearer market and target use-cases it’s deserved for so long.


IMHO, a simple extension that suggests itself and would be easy to implement is to add automatic generation of an llms.txt file to the site, for better indexing by AI. The details are described here: https://llms.txt.org
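A minimal sketch of what that generation could look like, following the llms.txt proposal (H1 site name, one-line summary, then a markdown list of key pages); the page list here is hard-coded where a real add-on would walk the sitemap.

```php
<?php
// Sketch: build an llms.txt file from a site name, a summary and a page list.
// Values below are placeholders; a Concrete add-on would pull them from the
// sitemap and page attributes instead.
$site = [
    'name'    => 'Example Site',
    'summary' => 'A short plain-language description of what the site offers.',
    'pages'   => [
        ['title' => 'Services', 'url' => 'https://example.com/services', 'note' => 'What we do'],
        ['title' => 'Contact',  'url' => 'https://example.com/contact',  'note' => 'How to reach us'],
    ],
];

$lines = ['# ' . $site['name'], '', '> ' . $site['summary'], '', '## Pages', ''];
foreach ($site['pages'] as $page) {
    $lines[] = sprintf('- [%s](%s): %s', $page['title'], $page['url'], $page['note']);
}
file_put_contents('llms.txt', implode("\n", $lines) . "\n");
```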

Link is broken. I found https://llmstxt.org/ which I presume is the intended site. Then the first few links in that site to the FastHTML docs are broken…

On a positive note, the site does have an associated llms.txt of its own, which links to the markdown docs.

A few more items

  1. Auto-generation of tags and descriptions for uploaded assets
  2. Not really “Gen AI”, but some sort of layout design algorithm - think PowerPoint’s layout suggestions, where you select elements on your page and the system offers layout designs you can try out and revert from.
  3. A configurable AI assistant for the site that can answer questions and be pointed at just the site’s info, at specific information, or allowed to go beyond it (maybe people would want more than they provide on the site)
  4. Not really AI, but AI could be used for this: a site-wide check on grammar, spelling and accessibility compliance
  5. Auto-generation of alt text for images (accessibility) - see the sketch after this list
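For item 5, a rough sketch of asking a vision-capable model for alt text; the request shape follows the public OpenAI chat-completions API with an image_url content part, and hooking it into Concrete’s file upload flow is left out.

```php
<?php
// Sketch: send an image URL to a vision-capable chat model and get back a
// short alt text string. The image URL is a placeholder.
$imageUrl = 'https://example.com/application/files/photo.jpg';

$ch = curl_init('https://api.openai.com/v1/chat/completions');
curl_setopt_array($ch, [
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_POST           => true,
    CURLOPT_HTTPHEADER     => [
        'Content-Type: application/json',
        'Authorization: Bearer ' . getenv('OPENAI_API_KEY'),
    ],
    CURLOPT_POSTFIELDS => json_encode([
        'model'    => 'gpt-4o-mini',
        'messages' => [[
            'role'    => 'user',
            'content' => [
                ['type' => 'text', 'text' => 'Write concise alt text (max 125 characters) for this image.'],
                ['type' => 'image_url', 'image_url' => ['url' => $imageUrl]],
            ],
        ]],
    ]),
]);
$reply = json_decode(curl_exec($ch), true);
curl_close($ch);

$altText = trim($reply['choices'][0]['message']['content']);
```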

Besides using more precise HTML semantics, we also started to implement schema.org wherever possible.
As @frz mentioned, using LLMs for generating descriptions and meta tags is a no-brainer, and I think it would be easy to integrate with Hugging Face.
By the way, we have been using OG tags for a long time, so it’s nice that this is now implemented in the core.