A/B Testing

A/B testing on bedrock is done with the help of Traffic Cop, a small JavaScript library that directs site traffic to the different variations in an a/b experiment and makes sure a visitor always sees the same variation.

It’s possible to test more than 2 variants.

Traffic Cop sends users into experiments, and we then use Google Analytics (GA) to analyze which variation is more successful. (Users with DNT enabled do not participate in experiments.)
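The core idea can be sketched as a weighted one-time choice that is then remembered. This is a simplified illustration, not Traffic Cop's actual source; the `'novariation'` name and function signature are assumptions:

```javascript
// Simplified sketch of how a visitor could be assigned a variation:
// pick once by weighted chance, then reuse the stored choice so a
// returning visitor always sees the same variation.
function chooseVariation(variations, stored, random) {
    if (stored) {
        return stored; // returning visitor keeps their original variation
    }
    var roll = random * 100; // random is a number in [0, 1)
    var cumulative = 0;
    for (var name in variations) {
        cumulative += variations[name]; // each value is a percentage of traffic
        if (roll < cumulative) {
            return name;
        }
    }
    // weights summing to less than 100 leave the remaining traffic here
    return 'novariation';
}
```

In the real library the stored choice persists in the browser (so the same visitor keeps seeing the same variation), and visitors with DNT enabled are excluded before any assignment happens.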

All a/b tests should have a Mana page detailing the experiment and recording the results.

Coding the variants

Traffic Cop supports two methods of a/b testing: executing different on-page JavaScript, or redirecting to the same URL with a query string appended. In bedrock we mostly use the redirect method, which makes testing easier.
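For illustration, the redirect method amounts to sending a chosen visitor to the same URL with the variation appended as a query string. Traffic Cop does this for you; the `v` parameter name here follows the examples later in this page:

```javascript
// Conceptual sketch of the redirect method: the visitor lands on the same
// page, with the chosen variation carried in the query string.
function variationUrl(url, variant) {
    // append with '?' or '&' depending on whether a query string exists
    var sep = url.indexOf('?') === -1 ? '?' : '&';
    return url + sep + 'v=' + variant;
}
```

For example, `variationUrl('/firefox/new/', 'a')` returns `/firefox/new/?v=a`.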

Create a variation view for the a/b test.

The view can handle the url redirect in one of two ways:

  1. the same page, with some different content based on the variation variable
  2. a totally different page

Content variation

Useful for small focused tests.

This is explained on the variation view page.

New page

Useful for large page changes where content and assets are dramatically different.

Create the variant page as you would any new page. Make sure it is noindexed and does not have a canonical URL:

{% block canonical_urls %}<meta name="robots" content="noindex,follow">{% endblock %}

Configure as explained on the variation view page.

Traffic Cop

Create a .js file in which you initialize Traffic Cop, and include it in the experiments block of the template that will be doing the redirection. Wrap the extra JS include in a switch:

{% block experiments %}
  {% if switch('experiment-berlin-video', ['de']) %}
    {{ js_bundle('firefox_new_berlin_experiment') }}
  {% endif %}
{% endblock %}


See the Traffic Cop section of the switch docs for instructions.

Recording the data


If you are measuring installs as part of your experiment be sure to configure custom stub attribution as well.

Including data-ex-variant and data-ex-name in the analytics reporting will add the test to an auto-generated report in GA. The values for these variables may be provided by the analytics team.

if (href.indexOf('v=a') !== -1) {
    // report the experiment name and variation to GA via the dataLayer
    window.dataLayer.push({
        'data-ex-variant': 'de-page',
        'data-ex-name': 'Berlin-Campaign-Landing-Page'
    });
} else if (href.indexOf('v=b') !== -1) {
    window.dataLayer.push({
        'data-ex-variant': 'campaign-page',
        'data-ex-name': 'Berlin-Campaign-Landing-Page'
    });
}

Make sure any buttons and interactions which are being compared as part of the test report into GA.


Write some tests for your a/b test. This could be simple or complex depending on the experiment.

Some things to consider checking:

  • Requests for the default (non variant) page call the correct template.
  • Requests for a variant page call the correct template.
  • Locales excluded from the test call the correct (default) template.