QA your projects
To ensure that each project performs as required, create a detailed project plan that outlines which elements are being changed and how the updated content should ultimately look and perform.
Our recommendation is to follow a QA process in 3 stages:
- Correct look and feel
- Segments are selected correctly
- Conversion tracking and custom data capture
To check individual projects, use our “Force Experiment Tool”.
A QA checklist is available to help you match the requirements in your project plan against the outcomes of the experiments.
1 Correct look and feel
Ensuring that the project is in line with your corporate guidelines, and looks as it should according to your project plan, is a matter of testing the content.
To activate the Force Experiment tool, go to the page URL that you wish to check and select Force Experiment from your bookmark bar.
For mobile devices you do not need the Force Experiment tool; instead, add “?_wt.mode=staging” to the end of the URL (or “&_wt.mode=staging” if the URL already contains a query string). You will then see an overview of the current projects running on that URL.
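If you are generating staging URLs for a list of pages, a small sketch like the following avoids the classic mistake of appending a second “?”. It uses Python's standard `urllib.parse`; the example URLs are illustrative only.

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def add_staging_param(url):
    """Append _wt.mode=staging, preserving any existing query string."""
    parts = urlsplit(url)
    query = parse_qsl(parts.query, keep_blank_values=True)
    query.append(("_wt.mode", "staging"))
    return urlunsplit(parts._replace(query=urlencode(query)))

print(add_staging_param("https://www.example.com/product"))
# -> https://www.example.com/product?_wt.mode=staging
print(add_staging_param("https://www.example.com/search?q=shoes"))
# -> https://www.example.com/search?q=shoes&_wt.mode=staging
```

Parsing and re-encoding the query string (rather than naive string concatenation) keeps existing parameters intact.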
From the top right-hand section of the navigation, select “Current Mode = Staging”.
You can then see if you are in the test or in the control by looking at the sections on the left-hand side in the body of the widget.
You can select a different test (or control) by clicking on another numbered button.
Using this tool across different devices and browsers will enable you to see not only how your content is displayed but also whether it functions as per the project plan.
Things to check
The following checks should be carried out across browsers and devices; your project plan should outline which devices and browsers are included.
This should only be done using actual devices; emulators running under browser extensions do not always behave correctly. Carry out the full series of checks on every device specified: just because something works in one browser on one device, don’t assume the same browser will behave identically on another device or operating system.
The visuals (transformations) for each project should be checked on each device and browser combination to ensure they’re in line with the mock-ups (comps) that were included in the project plan.
Functionality should also be checked to ensure everything is working correctly. If QA scenarios have been included within the plan then perform each of these scenarios to verify that the test passes.
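To make sure no combination is missed, it can help to expand the plan into an explicit test matrix before starting. The sketch below assumes hypothetical device, browser, and check lists; substitute the ones from your own project plan.

```python
from itertools import product

# Hypothetical lists -- take these from your project plan.
DEVICES = ["Desktop", "iPhone", "Android"]
BROWSERS = {"Desktop": ["Chrome", "Firefox", "Edge"],
            "iPhone": ["Safari"],
            "Android": ["Chrome"]}
CHECKS = ["visuals match comps", "functionality", "QA scenarios"]

def build_matrix():
    """Expand the plan into one row per device/browser/check combination."""
    rows = []
    for device in DEVICES:
        for browser, check in product(BROWSERS[device], CHECKS):
            rows.append({"device": device, "browser": browser,
                         "check": check, "result": "pending"})
    return rows

matrix = build_matrix()
print(len(matrix))  # 3 desktop browsers x 3 checks + 2 mobile combos x 3 checks = 15
```

Each row can then be marked pass/fail as you work through the devices, giving a simple record of QA coverage.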
2 Segments are selected correctly
If your project plan includes one or more segments, you should try to emulate accessing the page both from within and from outside the chosen segment. To do this, go to the Optimize dashboard, select the particular test you are interested in, and open the overview detail.
3 Conversion Tracking & Custom Data Capture
Conversion tracking (and custom data where applicable) should also be checked to ensure any click or page conversions that are in the test plan are triggering and tracking correctly. If the test is capturing additional custom data (for example, text from a field on the page), then make a data extract and check the CSV file to ensure it corresponds with the plan.
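The CSV check can be partly automated. The following sketch (column names are hypothetical; use those from your own extract) verifies that the expected columns are present and that every row has a value in each of them.

```python
import csv
import io

def check_extract(csv_text, required_columns):
    """Verify a data extract: required columns exist and every row
    has a non-empty value in each of them."""
    reader = csv.DictReader(io.StringIO(csv_text))
    missing = [c for c in required_columns if c not in (reader.fieldnames or [])]
    if missing:
        return False, f"missing columns: {missing}"
    for line_no, row in enumerate(reader, start=2):  # line 1 is the header
        for col in required_columns:
            if not (row[col] or "").strip():
                return False, f"empty {col!r} on line {line_no}"
    return True, "ok"

# Hypothetical extract: a click conversion plus a captured text field.
sample = "conversion,custom_field\nclick_cta,some text\nclick_cta,more text\n"
print(check_extract(sample, ["conversion", "custom_field"]))
# -> (True, 'ok')
```

Running this against each extract gives a quick pass/fail answer before you compare the captured values against the plan by hand.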
Once your project goes live, you should perform some basic post-launch checks (a subset of the above) to ensure experiments are being served correctly.