Measuring Results
Last updated
Measuring experiment results accurately is critical for making informed decisions. Mtrix provides comprehensive tools to analyze your experiments.
On the reporting section of each experiment, you'll see both a chart view and a written reporting section.
Experiment-specific charts are designed to make it easy to compare variants for any event you fire. You can compare PageView, BeginCheckout, Purchase, or any custom event you're sending, and enrich the results using Filters, Segmentation, or Traits. From the top, you can change the time period or include a comparison period. Switching to funnel view shows where your visitors drop off.
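As a rough sketch of what the funnel view surfaces, the drop-off between consecutive steps can be derived from event counts. The function name and figures below are illustrative, not Mtrix's internals:

```python
def dropoffs(funnel):
    """Fraction of visitors lost between each consecutive funnel step."""
    return [
        (step, 1 - nxt / count)
        for (step, count), (_, nxt) in zip(funnel, funnel[1:])
    ]

# Illustrative event counts, not real data
steps = [("PageView", 10_000), ("BeginCheckout", 2_400), ("Purchase", 900)]
for step, rate in dropoffs(steps):
    print(f"{step} -> next step: {rate:.1%} drop-off")
```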
The written reporting section provides all the necessary key e-commerce metrics:
Confidence Scores: A score between 0 and 1 indicating the likelihood of each variant winning.
Total Visitors: Unique visitors the variant received
Conversion Rate: Percentage of visitors who completed a purchase
Next Page Visit: Number of people who view a second page after being bucketed into the experiment
Checkout Start: Number of people who viewed the checkout page
Checkout Complete: Number of people who completed the checkout process
Average Revenue: Total Revenue / Purchasers
Total Revenue: Revenue generated by the variant
Revenue Per User: Total Revenue / Total Visitors
Allocation: Use this input field to change each variant's traffic allocation.
Changing the allocation after a confidence score has been generated creates a historic tab and starts collecting data in a new bucket.
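The revenue metrics listed above follow directly from raw variant totals. A minimal sketch of those formulas (the function name and sample figures are hypothetical):

```python
def variant_metrics(total_visitors, purchasers, total_revenue):
    """Derive the key per-variant metrics from raw totals."""
    return {
        "conversion_rate": purchasers / total_visitors,      # purchasers as a share of visitors
        "average_revenue": total_revenue / purchasers,       # Total Revenue / Purchasers
        "revenue_per_user": total_revenue / total_visitors,  # Total Revenue / Total Visitors
    }

# Illustrative totals: 5,000 visitors, 150 purchasers, $7,500 revenue
m = variant_metrics(total_visitors=5_000, purchasers=150, total_revenue=7_500.0)
# conversion_rate: 0.03, average_revenue: 50.0, revenue_per_user: 1.5
```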
Understanding what your results mean is critical for making decisions:
Mtrix calculates statistical significance to help determine if results are reliable:
Green (95%+ confidence): Strong evidence the variant is different from control
Yellow (80-95% confidence): Some evidence, but not conclusive
Red (<80% confidence): Not enough evidence to be confident in the result
Required Sample: The calculated number of users needed for reliable results
Current Sample: How many users have been included so far
Progress Indicator: Visual representation of sample size progress
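One standard way such a confidence figure can be computed is a two-proportion z-test. Mtrix's exact statistical model isn't documented here, so treat this as an illustrative sketch of the general technique:

```python
from math import erf, sqrt

def significance(conv_a, n_a, conv_b, n_b):
    """Two-sided two-proportion z-test: confidence that conversion
    rates of two variants actually differ. conv_* are converting
    users, n_* are total visitors per variant."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)               # pooled conversion rate
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))  # standard error under H0
    z = abs(p_a - p_b) / se
    return erf(z / sqrt(2))                                 # confidence = 1 - p-value

# 6% vs 8% conversion on 2,000 visitors each clears the 95% bar
conf = significance(conv_a=120, n_a=2_000, conv_b=160, n_b=2_000)
```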
Mtrix helps you make informed decisions based on experiment results:
Follow this process to make experiment decisions:
Check Validity: Ensure the experiment ran properly without technical issues
Assess Significance: Determine if results have reached statistical significance
Analyze Impact: Calculate the potential business impact of implementing changes
Consider Segments: Look for important differences across user segments
Evaluate Secondary Metrics: Check for unintended consequences
Document Learning: Record insights regardless of outcome
For each experiment outcome, Mtrix offers appropriate next steps:
Clear Winner: Implement the winning variant permanently
Inconclusive Results: Modify and retest, or increase sample size
Multiple Good Options: Consider segmenting the experience for different users
All Negative Results: Revert to control and document learnings; Mtrix will ask you how the experiment went when you conclude it, so you'll always have the historical data on hand.
Share experiment results with stakeholders:
Automated Reports: Schedule PDF reports to be sent to stakeholders
Shareable Links: Create links to results dashboards for easy sharing
Presentation Mode: Simplified view optimized for presentations
Export Options: Download raw data in CSV/Excel format
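Once you've downloaded a CSV export, a few lines of Python are enough to summarize it. The column names below are illustrative; match them to your actual export:

```python
import csv

# Write a tiny sample export so the example is self-contained.
# Column names here are hypothetical, not Mtrix's exact schema.
sample = "variant,visitors,purchases\ncontrol,5000,150\nvariant-a,5000,180\n"
with open("experiment_results.csv", "w") as f:
    f.write(sample)

# Read the export back and compute a conversion rate per variant
with open("experiment_results.csv", newline="") as f:
    rows = list(csv.DictReader(f))

for row in rows:
    rate = int(row["purchases"]) / int(row["visitors"])
    print(f'{row["variant"]}: {rate:.2%} conversion')
```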