Is it possible to sync data between the staging and production databases for all website content (so that staff don’t have to double their workload and build pages twice) without losing visitor-submitted information, such as form submissions and unregistered Conversations block posts?
The reason I ask is that the current project I am working on is a Concrete site where all content and pages are built on a staging environment, and then rebuilt on the live environment once they’ve been approved. I know that non-development work, like building pages, can be done on the live site and only published when it’s ready, reducing the duplication in work, but our management team is adamant that everything is done on a separate staging environment first before migrating it to production.
Is this at all possible, or is our current workflow the only option if pages must not be built and published directly from the live environment? The ultimate goal is that no one directly touches the live environment, aside from site visitors. If user-submitted data didn’t need to persist, this wouldn’t be a problem and we’d just clone our staging to live whenever changes are made.
Any insight on this would be appreciated. We’re open to using containers to streamline the process as well if that would help. We aren’t intimately familiar with the Concrete database schema and what we need to worry about when it comes to not overwriting user-submitted data on the live site (this absolutely cannot happen) when migrating changes from staging.
I don’t know whether your staging environment lives on the same server as production or on a separate one.
If it’s the same server, I’ve made the following shell script and job.
The copy script copies the entire database and all files from production to the staging area, then runs a find-and-replace on the database.php config file to point it at the staging database credentials.
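A minimal sketch of that same-server copy flow might look like the script below. All paths, database names, and credentials here are assumptions for illustration; the dump and rsync steps are shown commented out so the sketch runs without a live MySQL server, and a mock database.php is created just to demonstrate the find-and-replace step:

```shell
#!/bin/sh
# Sketch: copy production -> staging on the same server, then repoint
# the staging config. DB names, users, and paths are placeholders.
set -eu

STAGE_DB=concrete_stage
STAGE_ROOT=./staging-demo    # assumed staging web root

# 1) Dump production and load it into the staging database:
# mysqldump concrete_prod | mysql "$STAGE_DB"

# 2) Copy the file tree (uploads, packages, etc.):
# rsync -a --delete /var/www/prod/ "$STAGE_ROOT/"

# 3) Repoint staging's config at the staging DB. A mock config file
#    stands in for the real application/config/database.php here:
mkdir -p "$STAGE_ROOT/application/config"
cat > "$STAGE_ROOT/application/config/database.php" <<'PHP'
<?php
return [
    'default-connection' => 'concrete',
    'connections' => [
        'concrete' => [
            'driver' => 'c5_pdo_mysql',
            'server' => 'localhost',
            'database' => 'concrete_prod',
            'username' => 'prod_user',
            'password' => 'prod_pass',
        ],
    ],
];
PHP

# Find-and-replace the production DB info with staging values:
sed -i.bak \
    -e "s/concrete_prod/$STAGE_DB/" \
    -e "s/prod_user/stage_user/" \
    -e "s/prod_pass/stage_pass/" \
    "$STAGE_ROOT/application/config/database.php"

grep "'database'" "$STAGE_ROOT/application/config/database.php"
```

Running this as a cron job (or a Concrete automated job that shells out to it) keeps staging refreshed without anyone touching production by hand.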
Then you can install a job to either modify (anonymize) or delete the existing users.
The job can be adapted to filter which users get modified or deleted.
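As a sketch, the anonymization pass could be a small SQL script run against the freshly copied staging database. The column names below follow Concrete’s `Users` and `UserGroups` tables (`uID`, `uName`, `uEmail`, `gID`), but the “keep the admin plus a staff group” filter and the group ID are purely example assumptions; the actual `mysql` invocation is commented out so the sketch runs offline:

```shell
#!/bin/sh
# Sketch: generate an anonymization script for a staging copy of the DB.
# Assumes Concrete's Users/UserGroups schema; gID 5 as a "staff" group
# to preserve is a made-up example value.
set -eu

cat > anonymize-users.sql <<'SQL'
-- Scramble identifying data for everyone except the super user (uID 1)
-- and members of an assumed staff group. Blanking uPassword means no
-- copied production password can be used to log in to staging.
UPDATE Users
SET uName     = CONCAT('user', uID),
    uEmail    = CONCAT('user', uID, '@staging.invalid'),
    uPassword = ''
WHERE uID <> 1
  AND uID NOT IN (SELECT uID FROM UserGroups WHERE gID = 5);
SQL

# Apply it to the staging database (commented out for the sketch):
# mysql concrete_stage < anonymize-users.sql
echo "wrote anonymize-users.sql"
```

The `@staging.invalid` domain is deliberately unroutable, so if staging ever sends notification emails they can’t reach real visitors.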
You NEVER want raw production user data sitting on staging, especially if you have public registration enabled.
Anyway, if staging is on a different server, we have a custom deployer set up. I didn’t build that one; a colleague of mine did.
It does more than copy the data: it deploys new code, takes backups, runs upgrades, and copies the data from production to staging. It’s too complicated to explain in full here, but I’ll just say that a solution exists.
I suspect you are looking for something like Mainio Sync.
Unfortunately it is not compatible with recent core versions and is discontinued. However, if you are prepared to work on the code you may be able to get it updated.
For new pages you might look at the migration tool; if you don’t have a lot of custom blocks it might save you some time.
This looks to offer some of the functionality I am looking for, definitely. I mostly want to bring page data forward from one environment into another via the database, with everything else other than package installations and version updates being handled with source control, wherever possible. Do you see any issues there?
Also, this add-on is currently disabled. Is it still possible to purchase a license? We are currently still using the v8.x core, so it doesn’t need to be v9 compatible for our use case.
AFAIK, the Mainio Sync tool was discontinued at v7 because changes to the core at v8 prevented it from functioning and the work involved in updating it was disproportionate to the value of the project. I believe it was originally developed from the migration tool @pvernaglia refers to.
You can’t purchase it from the marketplace. You would need to contact Antti Hukkanen (Mainio Tech) to get a copy, then dig into the code to get it working for v8 or v9 sites.
Back at v7 it was a good tool and worked reliably within its scope of copying between sites with identical applications/packages.