In this article, you’ll learn about making informed decisions based on campaign results:
VWO's SmartStats engine helps you interpret your test campaign results and make data-driven decisions to optimize your website. This article explains what each recommendation means and how to act accordingly.
When No Recommendation Has Arrived Yet
If VWO hasn't provided a recommendation, hold off on taking action. The campaign might still be collecting data to reach a statistically significant suggestion.
Performance-based Recommendations
Winning Recommendation
VWO tests focus on finding a winner as early as possible. As soon as the probability of improvement crosses the defined winner threshold (95% by default), the variation is considered a statistically significant winner.
Winning recommendations can often arrive earlier than the expected duration of the test when the actual uplift provided by a variation is larger than the defined Minimum Detectable Effect (MDE).
Note that in the case of multiple variations, the first variation to reach significance is the one with the highest uplift. Hence, VWO declares a winner as soon as one is found and does not wait for other potential winners.
Disable Recommendation
A disable recommendation is given if the probability of improvement drops below the disable threshold (5% by default). In this case, the variation is unlikely to outperform the baseline, so it can be disabled, sparing visitors from an underperforming experience.
Inconclusive
This recommendation appears when the maximum number of visitors is reached but no winner has been found.
If none of the variations achieved statistically significant improvement, consider these options:
- Reduce MDE and Extend Test: If the expected improvement is likely smaller than the configured MDE, extend the test duration and reduce the MDE to allow VWO to detect a smaller effect.
- Accept Baseline: If the baseline performs adequately, you can conclude the test and stick with the existing experience.
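The three performance-based outcomes above follow a threshold logic that can be sketched as a simple decision rule. This is a hypothetical illustration only; the function name, parameters, and defaults are assumptions, and VWO's actual Bayesian SmartStats engine is more sophisticated than a plain comparison:

```python
def recommend(prob_improvement, visitors, max_visitors,
              winner_threshold=0.95, disable_threshold=0.05):
    """Classify a variation using illustrative SmartStats-style thresholds."""
    if prob_improvement >= winner_threshold:
        return "winner"          # statistically significant winner
    if prob_improvement <= disable_threshold:
        return "disable"         # unlikely to beat the baseline
    if visitors >= max_visitors:
        return "inconclusive"    # visitor limit reached, no winner found
    return "keep running"        # still collecting data

# Example outcomes:
print(recommend(0.97, 4000, 10000))   # winner arrives before the visitor limit
print(recommend(0.03, 4000, 10000))   # disable: saves remaining visitors
print(recommend(0.60, 10000, 10000))  # inconclusive: limit reached, no winner
print(recommend(0.60, 4000, 10000))   # keep running: no recommendation yet
```

Note how a winner or disable recommendation can arrive well before the visitor limit, which is why tests with a large actual uplift conclude earlier than their expected duration.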
Experiment Vitals-based Recommendations
- Data Tracking: Double-check your test setup and ensure data is being tracked correctly.
- Conversion Tracking: Verify your conversion-tracking setup is functioning correctly. Check the metric setup and the events associated with it.
- Minimum Runtime: VWO recommends waiting at least 7 days before acting on a recommendation to account for potential weekly visitor behavior patterns.
- Confident in Unaffected Metrics: If you're certain that weekly variations don’t influence the tested metric, you can disregard the minimum-runtime vital and potentially receive a recommendation sooner.
- Longer-Term Variations: If you anticipate effects lasting longer than a week, wait until that period has passed before making a decision and ignore interim recommendations.
- Guardrails: One or more guardrails you set have been breached, meaning a variation underperformed on a metric you deemed crucial. If the variation is not disabled automatically, disable it and direct more traffic to the remaining variations for a faster conclusion.
- Experimentation Conduct: Issues like traffic allocation changes during the test campaign or re-enabling disabled variations can introduce bias. In such cases, it is recommended that you either clone the campaign (if you want to retain the existing data) or flush the data of the existing campaign and restart it.
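A guardrail breach like the one described above amounts to a variation's crucial metric falling too far below the baseline. A minimal sketch of that check, assuming a hypothetical relative-drop tolerance (the function and parameter names are illustrative, not VWO's API):

```python
def guardrail_breached(baseline_rate, variation_rate, max_drop=0.05):
    """Flag a breach when the variation's guardrail metric falls more than
    max_drop (relative, e.g. 0.05 = 5%) below the baseline. Illustrative only."""
    return variation_rate < baseline_rate * (1 - max_drop)

# A 10% relative drop in a crucial metric breaches a 5% guardrail:
assert guardrail_breached(0.20, 0.18) is True
# A 2.5% relative drop stays within tolerance:
assert guardrail_breached(0.20, 0.195) is False
```

In practice, a breached variation should be disabled so its traffic flows to the remaining variations.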
By understanding these recommendations and addressing any accompanying vitals, you can leverage VWO's SmartStats to make informed decisions that optimize your website or app.