Mutual Exclusion
Many testing programmes want users who see one test to be excluded from others. This minimises extraneous variables affecting each test, so that any observed change in conversion rate can be attributed to the experiment itself.
In Webtrends Optimize, there are two ways in which we facilitate Mutual Exclusion logic:
Option 1: Use Projects
Whilst not yet surfaced clearly in our new interface, Projects are a key part of how tests are built, and come with a very useful feature - inherent Mutual Exclusions.
Within a given project, users can only fall into one experience - a single test, or a single target, or a single baseline, etc. So, imagine that in your project, you have the following tests:
- Test 1 - Mobile
- Test 2 - PPC users
- Test 3 - All users
Example 1 - Mobile + PPC user
If you had a Mobile user coming through a PPC campaign, they would be eligible for all three tests. But because these are in the same project, they will only fall into one: the first one they are eligible for. You can control the order in which tests are evaluated using Priority Weighting, as described below.
Example 2 - Desktop + PPC user
If you had a similar user to Example 1, but on a Desktop browser, they would not be eligible for the first test. So, in our evaluation order, we would attempt to evaluate all tests, but this user wouldn't meet the audience rules for the first one. They would therefore fall into the second test.
Example 3 - With Throttling
Imagine if you want a test setup which looks like this:
- Test 1 - Mobile - to 50% of eligible users
- Test 2 - PPC users - to 50% of eligible users
- Test 3 - All users - to whoever is left
Projects naturally have this ability, with no additional setup required. Best of all, users who do not meet throttling for one test are automatically evaluated for the next, meaning you don't lose out on any traffic.
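The cascade above can be sketched in plain JavaScript. This is an illustration of the behaviour only, not the Optimize tag's internal code; the evaluate function and the test objects below are hypothetical.

```javascript
// Illustrative sketch of a prioritised, mutually exclusive cascade with
// throttling. Not the Optimize tag's internal code; evaluate() and the
// test objects are hypothetical.
function evaluate(user, tests) {
    for (var i = 0; i < tests.length; i++) {
        var t = tests[i];
        // Skip tests whose audience rules the user doesn't meet.
        if (!t.audience(user)) continue;
        // Users who miss the throttle fall through to the next test.
        if (Math.random() * 100 >= t.throttle) continue;
        // First eligible test wins - the user sees nothing else.
        return t.name;
    }
    return null; // No test matched.
}

var tests = [
    { name: 'Test 1 - Mobile',    audience: function (u) { return u.device === 'mobile'; }, throttle: 50 },
    { name: 'Test 2 - PPC users', audience: function (u) { return u.source === 'ppc'; },    throttle: 50 },
    { name: 'Test 3 - All users', audience: function ()  { return true; },                  throttle: 100 }
];

// A Desktop + PPC user can only ever land in Test 2 or Test 3, never Test 1.
evaluate({ device: 'desktop', source: 'ppc' }, tests);
```

Note how a user who misses the 50% throttle on Test 2 is not lost: the loop continues and they land in Test 3 instead.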
How to create Mutual Exclusions in Projects
As mentioned, the folder-like structure of Projects is not yet obvious in our new UI. To get there for now, simply clone an existing test. All clones are created within the same Project, so they'll naturally fall into this way of working. From there, just adjust the Name and the Segment, and swap out the code for whatever you'd like.
Things to note
While it could be tempting to put all of your tests in a single project, please note that Projects retain Location information. So, any tests within a given project are all intended to run on the same set of pages.
Option 2: Using the Test History library
The Test History library is a lightweight JavaScript library, typically kept in Pre-Init (an integration is coming for easier enablement). When enabled, it lets you write simple logic to control when tests are allowed to run. Unlike Projects, because this handling is done purely in JavaScript, you can facilitate the most intricate logic, deciding at the very last moment (after polling/waiting, API calls, etc.) whether to allow someone into your test. Its correct usage depends on the Optimize Build Framework, which allows you to suspend and manually track Views.
The library is found below, to be placed in Pre-Init in your tag. Its usage is described as follows:
To check against a specific test you want to exclude - here's a snippet from the OBF post-render:
/*/////////////////////////////////////////////////////////////
| ENTRY POINT TO THE TEST
-------------------------------------------------------------*/
Run: function()
{
    // Optional: if you have anything immediate to check that would exclude users, do it here.
    if (someConditionThatWouldExcludeUsers) {
        Test.abort("User matched some condition that would exclude users");
        return;
    }

    // To abort if the user has seen a specific other test.
    if (WT.TestsHistory.seenTest('ta_30_HomepageLayoutAB')) {
        Test.abort('Visitor has seen test 30 - excluding from this one.');
        // Stop function execution
        return;
    }

    // Add this test to the History, so test 30 can pick up similar logic.
    WT.TestsHistory.addTest(Config.testAlias);

    Test.poll({
    // ...
To check against any/all tests:
/*/////////////////////////////////////////////////////////////
| ENTRY POINT TO THE TEST
-------------------------------------------------------------*/
Run: function()
{
    // If you have anything immediate to check that would exclude users, do it here.
    if (someConditionThatWouldExcludeUsers) {
        Test.abort("User matched some condition that would exclude users");
        return;
    }

    // To abort if the user has seen any test other than this one.
    if (WT.TestsHistory.getTests().replace(Config.testAlias, '').length) {
        Test.abort('Visitor has seen another test. Abort.');
        // Stop function execution
        return;
    }

    // Add this test to the History, so other tests can pick up similar logic.
    WT.TestsHistory.addTest(Config.testAlias);

    Test.poll({
    // ...
If you want to perform this check after polling/waiting for some conditions:
/*/////////////////////////////////////////////////////////////
| ENTRY POINT TO THE TEST
-------------------------------------------------------------*/
Run: function()
{
    Test.pageview.suspend("Waiting for cart");

    Test.poll({
        msg: 'Polling for cart',

        // Polling function
        when: function()
        {
            return document.querySelector('.header--cart-link .number-wrapper');
        },

        // Polling callback
        then: function()
        {
            // Run the checks you need to run.
            if (someConditionThatWouldExcludeUsers) {
                Test.abort("User matched some condition that would exclude users");
                return;
            }

            // Now check Tests History.
            // To abort if the user has seen any test other than this one.
            if (WT.TestsHistory.getTests().replace(Config.testAlias, '').length) {
                Test.abort('Visitor has seen another test. Abort.');
                // Stop function execution
                return;
            }

            // Add this test to the History, so other tests can pick up similar logic.
            WT.TestsHistory.addTest(Config.testAlias);
            // ...
Additional Info
Priority Weighting
This is found in the dashboard carousel, a few slides along. Simply drag tests/targets from the left to the right, in the order you wish for them to be evaluated.
WTO Tests History
The library, as described above. This requires tag 5.2 or newer.
try {
    // Utility to record which tests a user has seen.
    WT.TestsHistory = (function(){
        return {
            cookieName: '_wt.testsHistory',
            domain: '',

            // Record a test alias in the history cookie (no duplicates).
            addTest: function(pTest)
            {
                var tests = this.getTests();
                var testsArr = [];
                if (tests)
                {
                    testsArr = tests.split(',');
                }
                var rxTest = new RegExp('(^|,)' + pTest + '(,|$)', 'i');
                if (tests.match(rxTest)) return;
                testsArr.push(pTest);
                // Write the test to a cookie (expires in 90 days)
                WT.helpers.cookie.set(this.cookieName, testsArr.join(','), 90);
            },

            // Has the user seen the given test?
            seenTest: function(pTest)
            {
                var rxTest = new RegExp('(^|,)' + pTest + '(,|$)', 'i');
                var tests = this.getTests() || '';
                return Boolean(tests.match(rxTest));
            },

            // Comma-separated list of recorded test aliases.
            getTests: function()
            {
                return WT.helpers.cookie.get(this.cookieName) || "";
            }
        };
    }());
} catch(err) { if (document.cookie.match(/_wt\.bdebug=true/i)) console.log(err); }
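To see how the three methods behave together, here is a self-contained sketch. The in-memory cookie object stands in for WT.helpers.cookie, which the real Optimize tag provides, and the ta_31_CheckoutCTA alias is a hypothetical example.

```javascript
// Self-contained usage sketch. The in-memory "cookie" below is a stand-in
// for WT.helpers.cookie from the Optimize tag; aliases are examples only.
var WT = { helpers: { cookie: (function () {
    var store = {};
    return {
        set: function (name, value, days) { store[name] = value; },
        get: function (name) { return store[name]; }
    };
}()) } };

// The library body, condensed from above.
WT.TestsHistory = (function () {
    return {
        cookieName: '_wt.testsHistory',
        addTest: function (pTest) {
            var tests = this.getTests();
            var testsArr = tests ? tests.split(',') : [];
            var rxTest = new RegExp('(^|,)' + pTest + '(,|$)', 'i');
            if (tests.match(rxTest)) return; // already recorded
            testsArr.push(pTest);
            WT.helpers.cookie.set(this.cookieName, testsArr.join(','), 90);
        },
        seenTest: function (pTest) {
            var rxTest = new RegExp('(^|,)' + pTest + '(,|$)', 'i');
            return Boolean(this.getTests().match(rxTest));
        },
        getTests: function () {
            return WT.helpers.cookie.get(this.cookieName) || '';
        }
    };
}());

WT.TestsHistory.addTest('ta_30_HomepageLayoutAB');
WT.TestsHistory.seenTest('ta_30_HomepageLayoutAB'); // true
WT.TestsHistory.seenTest('ta_31_CheckoutCTA');      // false
```

Calling addTest a second time with the same alias is a no-op, so the cookie never accumulates duplicates.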