It’s time for experimentation tools to integrate directly with the CMS instead of trying to imitate them
Marketing teams still need to experiment at the speed of ideas, not be slowed down by reliance on scarce engineering resources. That’s why we need a new solution that lets marketers build experiments self-sufficiently.
It’s time for experimentation tools to integrate directly with the CMS that manages the marketing website in the first place. The need for experimentation-specific visual editors disappears when teams can lean on more robust, general-purpose tools like Webflow or Contentful, which marketers already know well and which already integrate with the rest of their stack.
Modern web architecture is too dynamic for the presumptions of code written by visual editors
In short: visual editors don’t work reliably with some of the most-used development libraries on the web today. React sites, for example, commonly use CSS-in-JS and build tooling that regenerate class names (often into generic-looking hashes) on every re-deploy, breaking the experiment code generated by visual editors.
The root of this problem has always been present: the likelihood of conflicts between the production codebase for a website and the code “written” by marketing teams using a WYSIWYG tool. The latter is code layered on top of the former, meant to modify what a user would otherwise see on the page. This is all well and good, until the underlying code changes.
In the example of React’s dynamic selectors, experiments break because visual editors rely on targeting specific selectors to specify where changes should be made. Those selectors need to stay the same from the time the experiment is coded to when it concludes for things to work correctly. Whenever the site is updated, however, the build pipeline renames selectors across the site, and the result is that the code a visual editor generates is extremely prone to breaking… on every re-deploy.
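To make this concrete, here is a minimal sketch of the failure mode. The class names are invented examples of CSS-in-JS output, and the matcher is a crude stand-in for how a visual editor re-finds the element it was told to modify:

```typescript
// Selector classes a visual editor recorded against build #1's markup.
const selectorClasses = ["sc-bdfBwQ", "kDvEjq"];

// Crude stand-in for "does the recorded selector still match the page?"
function selectorMatches(html: string, classes: string[]): boolean {
  return classes.every((cls) => html.includes(cls));
}

const build1 = '<h1 class="sc-bdfBwQ kDvEjq">Welcome</h1>';
// After a re-deploy, the CSS-in-JS tooling regenerates its hashes:
const build2 = '<h1 class="sc-gsTCUz hYmQsP">Welcome</h1>';

console.log(selectorMatches(build1, selectorClasses)); // true
console.log(selectorMatches(build2, selectorClasses)); // false: the experiment change silently fails to apply
```

Nothing about the page's content changed between builds; only the generated class names did, and that alone is enough to orphan the visual editor's code.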
The broader change over the last few years is that modern web architectures, like Angular, Vue.js, or Next.js, produce underlying markup that changes far more, and far more often. Sometimes the resulting breakage is as innocuous as experiment changes no longer applying (causing data quality issues that may invalidate your results); other times it can wreak havoc, rendering your entire website inaccessible.
We heard repeatedly in our research that visual editors have been relegated to simple headline changes or like-for-like swapping of creative assets. One notable CRO agency even suggested that some background knowledge of coding was a prerequisite to productively using the visual editor itself, undoing the central premise of removing engineering bottlenecks.
Visual editor approaches slow down websites, damaging SEO efforts
Even beyond SEO, this slowdown in site performance can have a tangible negative impact on revenue.
Several firms have demonstrated such a causal link. In “Trustworthy Online Controlled Experiments” by Kohavi et al., an entire chapter is dedicated to the impact of site slowdowns. Among the examples cited, a 100 ms slowdown experiment at Amazon in 2006 decreased sales by 1%, and a 2012 study at Bing showed that every 100 ms improvement increased revenue by 0.6%. Enough companies have replicated some version of this experiment to make it industry-standard knowledge: slow site speed hurts your metrics.
Marketing teams loved WYSIWYG visual editors because they made it easy to change the website without going through the engineering team or jumping through other technical hurdles. In fact, during my time as a consultant in the experimentation space, I saw plenty of teams use their experimentation tool’s visual editor for general website editing, as if it were Squarespace or Webflow. Sometimes that activity outpaced the actual experiments being run.
There is one obvious problem with this picture: no A/B testing tool is a best-in-class solution for implementing global, long-lasting website changes.
There’s a different tool already in the marketer’s arsenal for these sorts of website changes: the CMS, purpose-built to make website changes accessible and performant. And unlike the slate of challenges facing visual editors in A/B testing tools, changes made inside a CMS like Contentful, Builder.io, or Webflow are not brittle “code on top of code”, nor are they third-party resources that slow down site performance.
We need to get experimentation tools out of the pretend CMS business, and start using the actual tool designed for website changes.
Instead of using a visual editor to build changes meant for an A/B test, the goal is for teams to build those changes directly in the CMS, and then flip a switch to turn the new changes into an experiment variation.
As an example, the Eppo <> Contentful integration requires a small one-time engineering setup, after which teams can use the entry ID of any piece of content in Contentful to specify experimental variations. The result is a scalable approach to no-code experimentation that leverages two best-in-class tools to do exactly what they do best.
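A hedged sketch of what that "flip a switch" step can look like in application code: each variation maps to a Contentful entry ID, and the page simply renders whichever entry the assigned variation points to. The variation names, entry IDs, and helper function below are illustrative assumptions, not Eppo's or Contentful's actual API surface:

```typescript
// Illustrative mapping from experiment variation to Contentful entry ID.
// Entry IDs here are placeholders, not real Contentful entries.
const variationToEntryId: Record<string, string> = {
  control: "1aB2cD3eF4gH5iJ6kL7mN8",   // the live hero content
  treatment: "9oP8qR7sT6uV5wX4yZ3aB2", // the new content built in Contentful
};

function entryIdForVariation(variation: string): string {
  // Unknown variation (e.g. the experiment has concluded)? Serve control.
  return variationToEntryId[variation] ?? variationToEntryId["control"];
}

console.log(entryIdForVariation("treatment")); // the new entry's ID
console.log(entryIdForVariation("holdout"));   // falls back to the control entry
```

Because the content for every variation lives in the CMS as first-class entries, there is no fragile selector-targeting layer: concluding the experiment is just pointing all traffic at the winning entry.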
The workflow boils down to 6 simple steps:
We built Eppo from the beginning as the only 100% data warehouse-native experimentation platform because we knew your A/B testing tool shouldn’t be a secondary form of data collection and storage, divorced from your existing source of truth. You should use best-in-class tools to warehouse your data, and layer on a best-in-class experimentation tool.
In the same way, your A/B testing tool shouldn’t be a second-class CMS either. You already have purpose-built tooling to manage your website, why add a brittle and outdated tool on top to do a poor imitation of it?
Deep integrations with CMS tools are the path forward for marketing teams to run A/B tests - enabling experimentation at scale, while staying resilient to the modern tech landscape. Because visual editors write code on top of code, the ability to scale the number of simultaneous experiments was always tightly restricted - how many layers of code adapting code can you pile on top before things start to break? Starting at the source of truth (the production codebase generated by the CMS) avoids this problem entirely.
At Eppo, we’re excited to be pioneering this approach, and we’re continuing to build our roadmap around marketing experimentation, with further integrations and features on the way. If you’re a marketing team ready for a new way of running no-code A/B tests, reach out today.