
FV Decipher Support


FocusVision Knowledge Base

Report Testing

Overview

When conducting report testing, you analyze the way that simulated data flows through your survey to ensure that things like logic, terminates, and quotas are working properly. Before getting started, make sure that you’ve successfully completed a simulated data run.   

When testing, any simulated data run through your project mimics the path that actual respondents might take when the survey is live, and it displays in the reporting tools. The simulated data must obey the setup of the survey, respecting question logic, quotas, sample sources, languages, etc., which allows you to use the reporting tools for testing purposes.

Learn more: Simulated Testing 

For general testing purposes, it's good practice to let the data fall out naturally, without restrictions or custom test simulation configuration, in order to identify any issues or errors.

The Decipher platform offers a number of reporting tools to help you analyze your data. In this training, we're going to use those tools to perform quality assurance checks on our project before going live.

1: Crosstabs

The crosstabs reporting system by Decipher provides you with a quick and easy way to view all of the data in your survey. You can create various crosstabs and splits to dissect your data and conduct more specific side-by-side comparisons of the information.

You can utilize the crosstabs reporting tool to verify that various components of your survey are functioning as expected (quotas are assigning properly, logic is working, etc.) after running simulated data. The goal of testing with crosstabs is to verify that the base of the table matches the base of the crosstabs segment that you’re comparing it to. Take a look at the following documents to learn more:
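The base-matching idea can be illustrated outside the platform. The sketch below is plain Python with made-up field names (it does not reflect Decipher's actual data layout): it counts the qualified base per quota cell two ways and confirms the table base matches the segment base.

```python
from collections import Counter

# Hypothetical simulated-data records; the field names are illustrative,
# not Decipher's actual export format.
respondents = [
    {"status": "qualified", "region_quota": "North", "q1": "Yes"},
    {"status": "qualified", "region_quota": "North", "q1": "No"},
    {"status": "qualified", "region_quota": "South", "q1": "Yes"},
    {"status": "terminated", "region_quota": None, "q1": None},
]

# Base of the crosstab segment: qualified respondents in each quota cell.
segment_base = Counter(
    r["region_quota"] for r in respondents if r["status"] == "qualified"
)

# Base of a table shown to the same segment: every qualified respondent
# in the cell should have answered q1, so the two counts must match.
table_base = Counter(
    r["region_quota"]
    for r in respondents
    if r["status"] == "qualified" and r["q1"] is not None
)

assert table_base == segment_base  # a mismatch would point to a logic error
```

If a table's base came up short of the segment base, that would suggest a respondent reached the segment without seeing the question, which is exactly the kind of logic error this check is meant to surface.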

Save your crosstab reports for later! This helps to save time when reviewing the soft launch data.  

Learn More About Testing with Crosstabs: 

Learn More About Crosstabs:

1.1: The Crosstabs Checklist

Use the Crosstabs checklist below as a guideline on what types of errors to look for when you conduct report testing in the Crosstabs report.  

Click here to download the checklist.

Checking Crosstabs Data:

Compare the Questionnaire (QRE) to the Crosstabs Reporting Tool, paying special attention to quotas, logic terminates, and survey paths. You can use simulated data and custom banners in crosstabs to verify that the logic is working properly. 

Run the Crosstabs Total Report with the following settings:

- Split by: Total Frequencies
- Respondents: Qualified Only
- Filter: Total

Then verify the following:

- Verify the base for all tables with no condition logic.
- Verify that all terminated answer choices have 0 Total Respondents.
- Verify that Rating questions include the following: Top 2 / Bottom 2 / Avg. (Top 3 for scales over 7 points).
- Verify that 2D checkbox questions are grouped appropriately by columns or by rows.
- Verify that the appropriate virtual questions are in place and correct.

Create custom crosstabs to check logic from the QRE (see: Building a Crosstab):

- Check logic from the QRE.
- Verify that quota breakouts and quota tables match your segments.
- Verify that autofill elements are collecting data as expected.
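One item above asks you to verify Top 2 / Bottom 2 / Average summaries on rating questions. As a rough illustration of what those figures mean, here is how they are derived on a hypothetical 5-point scale (the data is made up):

```python
# Hypothetical responses to a 5-point rating question.
ratings = [5, 4, 4, 3, 2, 5, 1, 4]
scale_max = 5

# Top 2 box: respondents choosing the two highest points (4s and 5s here).
top2 = sum(1 for r in ratings if r >= scale_max - 1)

# Bottom 2 box: respondents choosing the two lowest points (1s and 2s here).
bottom2 = sum(1 for r in ratings if r <= 2)

# Average rating across all respondents.
avg = sum(ratings) / len(ratings)
```

When checking the report, the expectation is simply that these summary rows appear for each rating question and that their values are consistent with the raw frequencies.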

   

2: Field Report

The field report shows the health of your survey. This tool gives an overall view of what's happening with your project. All the data in the field report is real-time and keeps track of who is in the survey.

After running simulated data, you can use the Field Report to verify that quotas and terminates are working as desired. The field report indicates areas where respondents were not able to qualify for a quota and identifies any areas of high terminates.
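The terminate check amounts to tallying where respondents dropped out. The sketch below uses made-up terminate markers (not Decipher's actual term labels) to show the idea: a `None` marker stands in for an unspecified terminate, which would signal a programming error to investigate.

```python
from collections import Counter

# Illustrative terminate points recorded for simulated respondents;
# the marker names are hypothetical.
term_points = ["term_age", "term_age", "term_region", None, "term_age", None]

# Tally terminates per marker to spot areas of unusually high terminates.
tallies = Counter(t for t in term_points if t is not None)

# Any respondent terminated without a recorded marker is "unspecified".
unspecified = term_points.count(None)
```

A non-zero unspecified count is the kind of result the Terminate tab surfaces, and it means the survey's terminate logic should be reviewed immediately.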

Learn more: Checking for NQ’s/Unspecified Terminates 

Learn More About The Field Report:

2.1: The Field Report Checklist

Use the Field Report checklist below as a guideline on what types of errors to look for when you conduct report testing on the field report.  

Click here to download the checklist.

Checking Field Report Data:

This report allows you to monitor the progress of your survey before, during and after it has launched.

General

- Split the field report by list, segment, or variables if needed.

Quotas Tab

- Verify that quota limits and totals are set (if provided).
- Run simulated data to fill all buckets completely. Check all quota segments for any non-qualifying data; for more information, click here.
- Verify that all quota segments match the quota setup designated in the QRE.
- Check for non-qualified respondents on the quota sheet.

Terminate Tab

- Verify that the Terminate tab shows all possible terms. If you see "Unspecified Terminates," check your survey immediately for programming errors.

   

3: Soft Launch

Fielding a study is completed in two steps. The first is the soft launch, which is reached when 10% or less of the overall quota has completed the survey. At this point, a data check is done to ensure there are no problems with the invite, link, study, etc. After the data has been reviewed and it is confirmed that everything looks correct, the full launch, the second part of fielding a study, can begin.
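The 10% threshold is simple arithmetic; the sketch below uses a hypothetical quota figure to make it concrete.

```python
# Illustrative soft-launch arithmetic: field until 10% or less of the
# overall quota has completed, then pause for the data check.
overall_quota = 400                          # hypothetical completes target
soft_launch_cap = overall_quota * 10 // 100  # 10% of quota

def in_soft_launch(completes: int) -> bool:
    """True while completes are still within the 10% soft-launch window."""
    return completes <= soft_launch_cap
```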

The following section explores what to check when conducting a review of the soft launch data collected for your project.  

Review the soft launch data for your project when between 30 and 50 respondents have qualified and completed your survey. Collecting a larger base size before conducting the data check can give a more accurate picture of how the data falls out.

3.1: The Soft Launch Checklist

Use the Soft Launch checklist below as a guideline to review important parts of your survey and troubleshoot potential issues before proceeding with a full launch.  

Click here to download the checklist.

Checking Soft Launch Data:

Review important parts of your survey and troubleshoot potential issues before proceeding with a full launch. 

- Re-check critical logic and segments, and verify that totals match in the report and the quotas.
- Verify that virtuals are recording data properly.
- Review the terminates for the survey and take note of any particularly high terminates. Depending on the term, you may want to check with the sample provider to ensure that you're accurately targeting your sample.
- Review the dropouts for the survey and take note of any particularly low completion rates, as these may indicate an area of frustration or a programming error for respondents taking your survey. Depending on the dropout page, you may want to use manual testing to make sure that no issue is present.
- Verify that there is data in all fields of the report and that the counts in the report match the counts in the Field Report.
- Verify that all required link variables are being passed and recorded properly (check virtual questions, segments, etc.).
- Re-check the quotas. Verify that the setup and limits are correct according to the questionnaire.
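The quota re-check is a straight comparison between two sets of limits. The sketch below uses hypothetical quota cells and limits to show the shape of that comparison.

```python
# Hypothetical quota setup check: the limits configured in the platform
# should match the limits specified in the questionnaire (QRE).
qre_limits = {"North": 200, "South": 200}
configured_limits = {"North": 200, "South": 200}

# Collect any cell whose configured limit differs from (or is missing
# versus) the QRE; an empty result means the setup matches.
mismatches = {
    cell: (qre_limits[cell], configured_limits.get(cell))
    for cell in qre_limits
    if configured_limits.get(cell) != qre_limits[cell]
}
assert not mismatches  # any entry here needs fixing before full launch
```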

   