
Measuring Results


Measuring experiment results accurately is critical for making informed decisions. Mtrix provides comprehensive tools to analyze your experiments.


Reading the Reports

In the reporting section of each experiment, you'll see both a chart view and a written reporting section.

Experiment-specific charts are designed to make it easy to compare variants on any event fired. You can compare PageView, BeginCheckout, Purchase, or any custom event you're sending, and enrich the results and the chart using Filters, Segmentation, or Traits. From the top, you can change the time period for time-based filtering or include a comparison time period, and switching to the funnel view shows where your visitors drop off.
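
The events these charts compare come from your own tracking calls. The snippet below is only a minimal sketch of what that can look like; the package name, client constructor, and track() signature are assumptions for illustration, not the documented API (see the NPM Package section for the real reference).

```typescript
// Hypothetical Mtrix SDK usage -- the package name, constructor, and
// track() signature below are illustrative assumptions, not the
// documented API (see the NPM Package section for the real reference).
import { Mtrix } from "@mtrix/sdk"; // assumed package name

const mtrix = new Mtrix({ apiKey: "YOUR_API_KEY" });

// Standard e-commerce events the experiment charts can compare:
mtrix.track("PageView", { path: "/products/sneakers" });
mtrix.track("BeginCheckout", { cartValue: 129.9, currency: "USD" });
mtrix.track("Purchase", { orderId: "ord_123", revenue: 129.9, currency: "USD" });

// Custom events work the same way and appear alongside the built-in ones:
mtrix.track("NewsletterSignup", { source: "footer" });
```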

The written reporting section provides all the key e-commerce metrics:

  1. Confidence Scores: A score between 0 and 1 for the likelihood of each variant winning.

  2. Total Visitors: Unique visitors the variant received

  3. Conversion Rate: Total users who purchased, along with the actual conversion rate percentage

  4. Next Page Visit: Number of people who make a second page view after being bucketed into the experiment

  5. Checkout Start: Number of people who started checkout by viewing the checkout page

  6. Checkout Complete: Number of people who completed the checkout process

  7. Average Revenue: Total Revenue / Purchasers

  8. Total Revenue: Revenue generated by the variant

  9. Revenue Per User: Total Revenue / Total Visitors

  10. Allocation: Use this input field to alter each variant's allocation.

    1. Altering the allocation after a confidence score has been generated creates a historic tab and starts collecting data in a new bucket.

If you're not using Mtrix as your primary infrastructure, a page source map must be uploaded to Mtrix for page-based metrics to work.
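
The revenue metrics above are straightforward ratios. The sketch below shows how they relate to one another using made-up numbers; the types and helper functions are illustrative, not part of the Mtrix API.

```typescript
// Illustrative calculation of the written-report ratios with made-up
// numbers; these types and helpers are not part of the Mtrix API.
interface VariantStats {
  totalVisitors: number; // unique visitors the variant received
  purchasers: number;    // users who completed a purchase
  totalRevenue: number;  // revenue generated by the variant
}

const conversionRate = (s: VariantStats) => s.purchasers / s.totalVisitors;
const averageRevenue = (s: VariantStats) => s.totalRevenue / s.purchasers;    // Total Revenue / Purchasers
const revenuePerUser = (s: VariantStats) => s.totalRevenue / s.totalVisitors; // Total Revenue / Total Visitors

const variantB: VariantStats = {
  totalVisitors: 10_000,
  purchasers: 400,
  totalRevenue: 52_000,
};

console.log(conversionRate(variantB)); // 0.04 -> a 4% conversion rate
console.log(averageRevenue(variantB)); // 130  -> average revenue per purchaser
console.log(revenuePerUser(variantB)); // 5.2  -> revenue per visitor
```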

Interpreting Results

Understanding what your results mean is critical for making decisions:

Statistical Significance

Mtrix calculates statistical significance to help determine if results are reliable:

  • Green (95%+ confidence): Strong evidence the variant is different from control

  • Yellow (80-95% confidence): Some evidence, but not conclusive

  • Red (<80% confidence): Not enough evidence to be confident in the result
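
Mtrix doesn't document the exact test behind these bands on this page, but a common choice for conversion experiments is a two-proportion z-test. The sketch below shows that approach as an assumed illustration of how such a confidence level can be computed; it is not Mtrix's actual implementation.

```typescript
// Two-proportion z-test: one common way to derive the kind of confidence
// level behind the green/yellow/red bands. Assumed for illustration only.

// Standard normal CDF via the Abramowitz-Stegun erf approximation.
function normalCdf(z: number): number {
  const x = Math.abs(z) / Math.SQRT2;
  const t = 1 / (1 + 0.3275911 * x);
  const poly = ((((1.061405429 * t - 1.453152027) * t + 1.421413741) * t
    - 0.284496736) * t + 0.254829592) * t;
  const erf = 1 - poly * Math.exp(-x * x);
  return z >= 0 ? 0.5 * (1 + erf) : 0.5 * (1 - erf);
}

// Two-sided confidence that the variant's conversion rate differs from control's.
function confidenceLevel(
  controlConversions: number, controlVisitors: number,
  variantConversions: number, variantVisitors: number,
): number {
  const p1 = controlConversions / controlVisitors;
  const p2 = variantConversions / variantVisitors;
  const pooled = (controlConversions + variantConversions) /
                 (controlVisitors + variantVisitors);
  const se = Math.sqrt(pooled * (1 - pooled) *
                       (1 / controlVisitors + 1 / variantVisitors));
  const z = (p2 - p1) / se;
  return 1 - 2 * (1 - normalCdf(Math.abs(z)));
}

// 4.0% vs 4.6% conversion over 10,000 visitors each:
console.log(confidenceLevel(400, 10_000, 460, 10_000)); // ~0.96 -> "green"
```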

Expected vs. Actual Sample Size

  • Required Sample: The calculated number of users needed for reliable results (see the sketch after this list)

  • Current Sample: How many users have been included so far

  • Progress Indicator: Visual representation of sample size progress
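
The "Required Sample" figure typically comes from a power calculation. Mtrix doesn't publish its formula on this page, but the textbook approximation for a two-variant conversion test is sketched below (95% confidence, 80% power); treat it as an assumed illustration rather than the exact calculation Mtrix performs.

```typescript
// Textbook approximate sample size per variant for a two-proportion test.
// Shown for illustration -- not necessarily the exact calculation behind
// Mtrix's "Required Sample" figure.
function requiredSamplePerVariant(
  baselineRate: number,      // e.g. 0.04 for a 4% conversion rate
  minDetectableLift: number, // relative lift, e.g. 0.10 for +10%
  zAlpha = 1.96,             // 95% confidence, two-sided
  zBeta = 0.84,              // 80% power
): number {
  const variantRate = baselineRate * (1 + minDetectableLift);
  const pBar = (baselineRate + variantRate) / 2; // average conversion rate
  const delta = variantRate - baselineRate;      // absolute detectable difference
  return Math.ceil((2 * (zAlpha + zBeta) ** 2 * pBar * (1 - pBar)) / delta ** 2);
}

// Detecting a +10% relative lift on a 4% baseline needs roughly 39,000
// visitors per variant -- which is why small lifts take so long to verify.
console.log(requiredSamplePerVariant(0.04, 0.10));
```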

Making Decisions

Mtrix helps you make informed decisions based on experiment results:

Decision Framework

Follow this process to make experiment decisions:

  1. Check Validity: Ensure the experiment ran properly without technical issues

  2. Assess Significance: Determine if results have reached statistical significance

  3. Analyze Impact: Calculate the potential business impact of implementing changes (see the sketch after this list)

  4. Consider Segments: Look for important differences across user segments

  5. Evaluate Secondary Metrics: Check for unintended consequences

  6. Document Learning: Record insights regardless of outcome
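
For step 3, the impact analysis is usually simple arithmetic: project the observed difference in revenue per user onto your normal traffic. A minimal sketch, with made-up figures:

```typescript
// Illustrative projection of an experiment's business impact (step 3).
// All figures and names here are made up for the example.
function projectedMonthlyRevenueLift(
  monthlyVisitors: number,
  controlRevenuePerUser: number, // from the written report
  variantRevenuePerUser: number,
): number {
  return monthlyVisitors * (variantRevenuePerUser - controlRevenuePerUser);
}

// A $0.50 revenue-per-user improvement across 200,000 monthly visitors:
console.log(projectedMonthlyRevenueLift(200_000, 5.0, 5.5)); // 100000 ($/month)
```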

Outcome Actions

For each experiment outcome, Mtrix offers appropriate next steps:

  • Clear Winner: Implement the winning variant permanently

  • Inconclusive Results: Modify and retest, or increase sample size

  • Multiple Good Options: Consider segmenting the experience for different users

  • All Negative Results: Revert to control and document learnings; when you conclude an experiment, Mtrix asks how it went, so you'll always have the historical data on hand.

Results Sharing

Share experiment results with stakeholders:

  • Automated Reports: Schedule PDF reports to be sent to stakeholders

  • Shareable Links: Create links to results dashboards for easy sharing

  • Presentation Mode: Simplified view optimized for presentations

  • Export Options: Download raw data in CSV/Excel format