A/B Testing – Is the Difference Statistically Significant?

This calculator is based on the two-sided Chi-Square statistic with Yates Correction, and uses a confidence level of 95% to check whether the difference between two sets of results is statistically significant. See below for more information.
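The statistic the page names can be sketched in a few lines of Python. This is an illustration of the standard Yates-corrected formula for a 2x2 table of converted versus non-converted visitors, not the calculator's actual source code; the function name and the example numbers are my own.

```python
def yates_chi_square(visitors_a, goals_a, visitors_b, goals_b):
    """Two-sided Chi-Square statistic with Yates' continuity correction
    for a 2x2 table of converted vs. non-converted visitors."""
    a, b = goals_a, visitors_a - goals_a   # sample A: converted / not converted
    c, d = goals_b, visitors_b - goals_b   # sample B: converted / not converted
    n = a + b + c + d
    # Yates' correction subtracts n/2 from |ad - bc| before squaring
    numerator = n * (abs(a * d - b * c) - n / 2) ** 2
    denominator = (a + b) * (c + d) * (a + c) * (b + d)
    return numerator / denominator

# e.g. 50/1000 vs 80/1000 conversions gives a statistic of about 6.92,
# above 3.841, the 95% critical value for one degree of freedom
print(yates_chi_square(1000, 50, 1000, 80))
```

A statistic above 3.841 corresponds to significance at the 95% confidence level the calculator uses.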

 

Split Test A/B Calculator

Does your Landing Page, Email, PPC, Direct Mail, or other A/B split test
have a statistically significant winner?
Enter Your Test Data Below in Steps #1 and #2
     
 
For each of Sample 1 and Sample 2:

Step 1. Enter number of visitors
Step 2. Enter number of goals achieved

The calculator then reports each sample's Conversion Rate, the Confidence Level, and:

YOUR SPLIT A/B TEST RESULT

Explanation of Statistical Significance

When you run a test, results vary due to randomness. Statistical significance tells you how likely it is that randomness alone caused the difference between your A and B samples.

This calculator uses a confidence level of 95%. That means that if it declares a winner, a difference at least as large as the one observed would arise from randomness alone less than 5% of the time, so you can be confident that the two samples genuinely differ: i.e. the advert, subject line or email you are testing really is better.
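The 95% claim can be checked by simulation: run many A/A tests where both samples share the same true conversion rate, and count how often the test wrongly declares a winner. A hedged sketch in pure Python (the helper name, seed, and sample sizes are my own choices; the Yates correction makes the test conservative, so the false-winner rate typically comes out at or a little below 5%):

```python
import random

def yates_chi2(va, ga, vb, gb):
    """Yates-corrected Chi-Square statistic for a 2x2 conversion table."""
    a, b, c, d = ga, va - ga, gb, vb - gb
    n = a + b + c + d
    den = (a + b) * (c + d) * (a + c) * (b + d)
    if den == 0:
        return 0.0  # degenerate table (e.g. no conversions at all)
    return n * (abs(a * d - b * c) - n / 2) ** 2 / den

random.seed(1)
trials, visitors, p = 2000, 1000, 0.05   # identical true rate in both arms
false_winners = 0
for _ in range(trials):
    ga = sum(random.random() < p for _ in range(visitors))
    gb = sum(random.random() < p for _ in range(visitors))
    if yates_chi2(visitors, ga, visitors, gb) > 3.841:  # 95% critical value, df=1
        false_winners += 1
print(false_winners / trials)  # roughly 0.03-0.05: at most ~5% false winners
```

In other words, when there is no real difference, the calculator should call a winner only around one time in twenty or less, which is exactly what the 95% confidence level promises.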

For more information on statistical significance, check out the Wikipedia article on the topic.