Earlier this month, I had the privilege of attending Sitecore Symposium 2025, where I sat in on many different talks about what’s in the pipeline for Sitecore AI.
If you haven’t heard, Sitecore announced Sitecore AI as its next-generation digital experience platform. Sitecore AI, as the name suggests, puts AI at the forefront, giving us the tools to make each interaction personalized and connected for the user.
As part of the AI overhaul, I attended a talk presented by Liz Nelson, VP of Product and Technology at Sitecore, and Andrew Liu, Product Manager at Microsoft. The talk focused on the new Sitecore content service and what’s coming down the pipeline.
What is the Content Service?
The content service is the new-age replacement for Sitecore’s legacy content paradigm. It will become the common content and media engine for all of Sitecore AI, enabling easy and seamless integration across the suite.
This is a major change to the very foundation of Sitecore’s content management, and in my opinion, it’s era-defining.
There was a lot of great information in this session. To keep things concise, I’ll focus on just a few topics that really piqued my interest.
Speeding things up
I’ll start off with something strong: faster publishing. As I’m sure anyone who works with Sitecore can agree, publishing can be slow and painful. Today, large publishes can take a long time to run, especially when publishing to multiple languages or including related items. It’s not uncommon for an author to see the dreaded “Queued.” message in the publish dialog, not knowing how long they’ll have to wait until it’s their turn.
The content service looks to solve that, with some pretty impressive metrics to boot. Some of the claims were:
- 10,000 items published in 2 seconds
- Instant and scheduled content availability
I was pretty happy to see content availability addressed, and impressed by the amount of work that has gone into improving this aspect of the platform.
The publishing speed is way, way faster than what we tend to see now. In some cases, we’re talking about going from hours to seconds.
One of the main reasons these speeds are achievable is the adoption of a graph-based content structure and Azure Cosmos DB. This replaces the MSSQL databases and the legacy item model.
I’ll be talking more in-depth on Azure Cosmos DB in a future blog post, but all you need to know for now is that Azure Cosmos DB is a multi-model database service. It supports user-defined JSON records called documents, which are the successors to the traditional Sitecore SQL items.
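To make that a little more concrete, here’s a rough sketch of the kind of shape a document model enables. To be clear, this is purely my own guess for illustration, every name in it is invented, and nothing here is a confirmed Sitecore schema:

```typescript
// Purely illustrative: one way a content item could look as a JSON document.
// The property names are my own, not a confirmed Sitecore content service schema.
interface ContentDocument {
  id: string;                       // unique document id
  contentType: string;              // plays the role the template used to
  language: string;
  fields: Record<string, unknown>;  // the item's field values as plain JSON
  references: string[];             // ids of related documents (the "graph" part)
}

const heroBanner: ContentDocument = {
  id: "b2f6c1e4-0000-0000-0000-000000000000",
  contentType: "heroBanner",
  language: "en",
  fields: {
    title: "Welcome to our site",
    ctaText: "Learn more",
  },
  references: ["a1d3f2c9-0000-0000-0000-000000000000"],
};
```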
More control over content
Because of the data-structure shift, there was also an emphasis on how content becomes available. Traditionally, when a publish occurs, the data (content) is moved from the master database to the web database or Experience Edge. Now, content availability can instead be controlled through state changes.
You can think of a state as an authority that determines who can see your content and when. Content can flow through states like a workflow, going through approval before going live, but it could also be restricted to specific user groups, much like how we use the Security Editor today. The beauty of states is that you can fine-tune the availability of any content in any way you want.
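To illustrate the concept (and only the concept), here’s a minimal sketch of what a state check could look like. All of the names are made up for the example; Sitecore hasn’t shared an API for this yet.

```typescript
// Hypothetical model of "state as an authority": who can see content, and when.
type ContentState = "draft" | "inReview" | "approved" | "live";

interface Visibility {
  state: ContentState;
  allowedGroups?: string[]; // optional audience restriction, Security Editor style
}

// Decide whether a given user group can see content in its current state.
function canView(visibility: Visibility, userGroup: string): boolean {
  if (visibility.state !== "live") {
    return false; // not yet released outside the authoring environment
  }
  if (!visibility.allowedGroups || visibility.allowedGroups.length === 0) {
    return true; // no restriction: everyone can see it
  }
  return visibility.allowedGroups.includes(userGroup);
}
```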
You’ll also be able to schedule content availability. This is one of those age-old dilemmas with content in Sitecore. Setting up a scheduled content publishing system has always been a bit of a pain point for me personally. Seeing an emphasis on true scheduled content functionality was great news.

The screenshot we were shown demonstrated that not only can you make content instantly available, you can also schedule it to appear at a specific time, and even set a date to make it unavailable again.
Before, your only option was to mark an item as unpublishable and then republish it, and of course, to remember to actually do it!
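As a sketch of how that scheduling could work under the hood, here’s a tiny availability-window check. The field names (availableFrom, availableTo) are hypothetical, chosen just to mirror the from/to dates shown in the screenshot.

```typescript
// Hypothetical availability window; the actual field names are not confirmed.
interface AvailabilityWindow {
  availableFrom?: string; // ISO 8601 timestamp
  availableTo?: string;   // ISO 8601 timestamp
}

// Content is visible only between the two optional dates.
function isAvailable(window: AvailabilityWindow, now: Date = new Date()): boolean {
  const from = window.availableFrom ? new Date(window.availableFrom) : undefined;
  const to = window.availableTo ? new Date(window.availableTo) : undefined;
  if (from && now < from) return false;
  if (to && now >= to) return false;
  return true;
}

// Example: a campaign page that appears on launch day and drops off a month later.
console.log(isAvailable({ availableFrom: "2026-03-01T09:00:00Z", availableTo: "2026-04-01T00:00:00Z" }));
```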
A new content structure
I was really happy to see that the content structure is moving to JSON. Using a graph-based approach allows for flexible schemas. You can think of a schema like a template: a strongly defined object/item with a series of properties/fields that dictate its structure.
Moving to a JSON structure gives us as developers a lot of flexibility in how we structure content types and makes future updates much easier.
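As a purely hypothetical illustration of the “schema as template” idea, a content type defined as data might look something like this (all names invented for the example):

```typescript
// Illustrative only: a JSON-defined content type playing the role of a template.
interface FieldDefinition {
  name: string;
  type: "text" | "richText" | "number" | "date" | "reference";
  required?: boolean;
}

interface ContentTypeSchema {
  name: string;
  fields: FieldDefinition[];
}

const articleSchema: ContentTypeSchema = {
  name: "article",
  fields: [
    { name: "headline", type: "text", required: true },
    { name: "body", type: "richText" },
    { name: "publishDate", type: "date" },
    { name: "author", type: "reference" },
  ],
};
```

Because the schema itself is just data, adding or renaming a field later presumably becomes an edit to a JSON definition rather than a heavyweight template change, which is where I expect a lot of that flexibility to come from.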
Because it’s JSON, it will also be much, much easier to search and filter content that meets specific criteria. Add agentic capabilities and semantic search to the mix, and suddenly item lookup and retrieval becomes simpler, faster, and far more performant.
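And because the content is just JSON, filtering it boils down to writing a predicate over plain objects. A tiny, contrived example (again, not a real Sitecore API):

```typescript
// A deliberately simple illustration: search criteria become plain predicates
// over plain JSON documents.
const docs = [
  { id: "1", contentType: "article", fields: { headline: "Symposium recap", tags: ["sitecore", "ai"] } },
  { id: "2", contentType: "article", fields: { headline: "Publishing tips", tags: ["publishing"] } },
];

const aiDocs = docs.filter(
  (doc) => doc.contentType === "article" && doc.fields.tags.includes("ai")
);

console.log(aiDocs.map((doc) => doc.fields.headline)); // ["Symposium recap"]
```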
…and a lot more
They discussed a lot more about what’s in the pipeline, like git-style branching for content and full-text, semantic, visual, and deep search. Content branching could enable things like previewing big content releases or safely testing changes to content structures. Enhanced search should make it much easier to find specific pieces of content, whether by keywords, prompts, and the like!
So when’s it coming?
We were told that the rollout will happen in nine phases, seven of which should require no intervention from customers. It will be a phased, hybrid approach, where the base image of the CMS will be replaced by the content service.
The rollout phases are:
- Side by side
- Publishing & Delivery
- Media Library
- New Layout Engine
- Security Graph & Permissions
- Extensible Workflow Engine
- Updated Content Model
- Authoring Engine Replacement
- Base Image Deprecated
Two of the phases may require customer intervention, and those will surely be communicated when the time comes.
We don’t know exactly when this rollout will start. However, they did say we’d start to hear things in the first half of the new year. Personally, after hearing about what’s to come, I’m very optimistic about the future of Sitecore. I can’t wait to start digging into all of it!
If you’re interested in a part two talking about the rest of this talk, leave a comment below!
Until next time, happy Sitecoring!