Analyzing Survey Results

Administrators can access survey performance metrics and question-level response data through the Analyze tab on published surveys. The Survey module's analytics support data-driven decision making through aggregate response visualization, conversion rate tracking, and flexible date range filtering, enabling trend analysis and assessment of survey effectiveness.

Survey analytics transform raw response data into actionable insight: automated calculations, visual charts, and response aggregation make it quick to gauge user sentiment, satisfaction levels, and feedback patterns across organizational communities.

Requirements

To analyze survey results, users must have one of the following:

  • Administrator System Role, or
  • Additional Role Settings: Enable Survey Management Access

Accessing Survey Analytics

Navigation to Survey Analytics

To view survey performance metrics and response data:

  1. Navigate to Setup > Create > Surveys
  2. Select a published survey from the Surveys Home list
  3. Click the Analyze tab
Note: The Analyze tab is only available for published surveys. Draft surveys do not display analytics because they are not accessible to respondents and cannot collect response data.

Analyze Tab Availability

Published surveys: Analyze tab displays with full metrics and response data

Draft surveys: Analyze tab is not present—no analytics are collected until publication

Archived surveys: Analyze tab remains accessible with all historical response data preserved for continued reporting and analysis

Overview Metrics

The Analyze tab begins with an analytics summary section displaying three key performance indicators that measure survey effectiveness and respondent engagement.

Date Range Filter

Location: Top of the Analyze tab, above the overview metrics

Purpose: Filter all analytics data (overview metrics and question-level results) to a specific timeframe

Date Range Options:

Preset ranges:

  • Today: Current calendar day
  • This Month: Current calendar month, from the 1st to the current date
  • This Quarter: Current calendar quarter
  • This Year: Current calendar year, from January 1 to the current date
  • Custom Range: User-defined start and end dates

Custom range selection:

  1. Click the Date Range dropdown
  2. Select Custom Range
  3. Use the date picker to select start date and end date
  4. Click Apply to filter analytics to the selected timeframe

Default view: Analytics typically default to a recent preset range (such as This Month) or show all-time data, depending on system configuration.

Filter scope: The selected date range applies to:

  • Overview metrics (Views, Conversion Rate, Submissions)
  • All question-level response aggregations and charts
  • Response lists for text-based questions

Views

Definition: Total count of survey link clicks during the selected date range

Counting behavior:

  • Every click on the survey link increments the Views counter
  • The same user opening the survey multiple times generates multiple Views
  • Opening the survey without completing it (closing or navigating away) still counts as a View
  • Each page load of the survey URL is tracked as a View

Interpretation:

  • High Views indicate strong link distribution and respondent interest
  • Views exceeding Submissions suggest completion barriers or browsing without intent to respond
  • The ratio of Views to Submissions highlights conversion optimization opportunities

Display: Large numeric value with "VIEWS" label

Conversion Rate

Definition: Percentage of survey Views that resulted in completed Submissions

Calculation: (Submissions ÷ Views) × 100

Example calculations:

  • 50 Submissions ÷ 100 Views = 50% Conversion Rate
  • 10 Submissions ÷ 100 Views = 10% Conversion Rate
  • 100 Submissions ÷ 150 Views = 66.67% Conversion Rate
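
Because these figures are often carried into spreadsheets or reports, it can help to reproduce the calculation offline. A minimal sketch in Python (the function is ours, not part of the module), using the example figures above:

    def conversion_rate(submissions: int, views: int) -> float:
        """Return the conversion rate as a percentage, rounded to two decimals."""
        if views == 0:
            return 0.0  # no Views recorded yet; avoid division by zero
        return round(submissions / views * 100, 2)

    print(conversion_rate(50, 100))   # 50.0
    print(conversion_rate(10, 100))   # 10.0
    print(conversion_rate(100, 150))  # 66.67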

Interpretation:

  • High conversion rates (typically 40%+) indicate effective survey design, clear questions, and appropriate length
  • Low conversion rates (below 20%) may indicate survey length issues, confusing questions, or technical problems
  • Conversion rate trends over time reveal survey fatigue or optimization success

Factors affecting Conversion Rate:

  • Survey length—shorter surveys typically achieve higher conversion
  • Question clarity—confusing questions increase abandonment
  • Technical issues—display problems or slow loading reduce completion
  • Required questions—excessive required questions increase abandonment risk
  • Respondent motivation—high engagement topics achieve better conversion

Display: Percentage value with "CONVERSION RATE" label

Submissions

Definition: Total count of completed survey responses during the selected date range

Counting behavior:

  • A Submission is counted when a respondent clicks the final submit button
  • Partial completions (users who start but don't submit) are not counted
  • Each authenticated user can submit once (excluding course-attached surveys with multiple submission rules)

Interpretation:

  • High Submissions indicate effective distribution and respondent engagement
  • Submissions trends over time reveal response collection patterns
  • Comparing Submissions across surveys identifies high-performing feedback instruments

Display: Large numeric value with "SUBMISSIONS" label

Using Overview Metrics Together

Scenario analysis examples:

High Views, Low Submissions, Low Conversion Rate:

  • Indication: Survey is being accessed but not completed
  • Possible causes: Survey too long, confusing questions, technical issues
  • Action: Review question clarity, reduce survey length, test on multiple devices

Low Views, High Conversion Rate:

  • Indication: Distribution is limited but engaged respondents complete the survey
  • Possible causes: Narrow distribution, highly motivated respondents
  • Action: Expand distribution channels to increase Views while maintaining conversion

High Views, High Submissions, High Conversion Rate:

  • Indication: Effective survey design with strong distribution
  • Status: Optimal performance—consider using as template for future surveys
  • Action: Continue monitoring and maintain current strategy

Increasing Views without Submission increase:

  • Indication: More exposure but conversion effectiveness declining
  • Possible causes: Survey fatigue, wrong audience, declining relevance
  • Action: Refresh survey content, verify target audience alignment, consider archiving

Question-Level Results

Below the overview metrics, the Analyze tab displays individual results for each question in the survey, presented in the order questions appear in the survey structure.

Common Question Display Elements

All question results include:

Question text: The full question as configured in the Survey Builder

Date range indicator: Text line showing the filtered timeframe (e.g., "Date Range: From 11/1/2025 To 11/30/2025")

Result visualization: Charts, lists, or statistical summaries appropriate to the question type

Choice-Based Question Analytics

Choice-based question types (Rating, Sentiment, Radio, Single Checkbox, Checkboxes, Dropdown, NPS) display aggregated response data with visual charts and statistical summaries.

Rating Questions

Display format:

  • Bar chart showing response distribution across the 0-10 scale
  • X-axis: Rating values from 0 to 10
  • Y-axis: Response count or percentage
  • Statistical summary may include average rating
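
Because the module presents these results as an on-screen chart rather than an exportable dataset, any offline statistic has to start from counts read off the chart. A minimal sketch in Python (the counts are hypothetical) computing the average rating from per-value counts:

    # Hypothetical per-value response counts (0-10), transcribed from the bar chart.
    counts = {0: 1, 1: 0, 2: 2, 3: 1, 4: 3, 5: 5, 6: 8, 7: 12, 8: 20, 9: 15, 10: 9}

    total = sum(counts.values())
    average = sum(value * n for value, n in counts.items()) / total
    print(f"{total} responses, average rating {average:.2f}")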

Interpretation:

  • Distribution patterns reveal satisfaction levels and consensus
  • Clustering at high values (8-10) indicates strong satisfaction
  • Clustering at low values (0-3) indicates dissatisfaction requiring attention
  • Spread across the scale suggests mixed experiences or an unclear question

Sentiment Questions

Display format:

  • Bar chart or pie chart showing percentage of each sentiment option
  • Total response count per sentiment
  • Color-coded sentiment indicators

Interpretation:

  • Dominant sentiment reveals overall emotional response
  • Sentiment distribution patterns identify consensus or polarization
  • Negative sentiment concentrations highlight concern areas

Radio Questions

Display format:

  • Bar chart showing response count or percentage for each option
  • Options listed with corresponding values
  • Percentage distribution across all options

Interpretation:

  • Highest percentage options indicate most common responses
  • Distribution patterns reveal preference concentrations
  • Unexpected distributions may indicate question confusion or need for option refinement

Single Checkbox Questions

Display format:

  • Simple percentage breakdown:
    • Percentage who checked the box
    • Percentage who left it unchecked
  • Total response count for each state

Interpretation:

  • High checked percentage for acknowledgments indicates compliance
  • Low checked percentage for opt-ins reveals limited interest
  • Percentage trends over time show changing attitudes or awareness

Checkboxes Questions (Multi-Select)

Display format:

  • Bar chart showing response count or percentage for each option
  • Note: Percentages may exceed 100% total because respondents can select multiple options
  • Total selection count per option
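
The percentages can exceed 100% in total because each option's share is computed against the number of respondents, not the number of selections. A short Python sketch (the responses and option names are invented) illustrating the arithmetic:

    # Hypothetical multi-select responses: each respondent picks any subset of options.
    responses = [
        {"Email", "Chat"},
        {"Email"},
        {"Email", "Chat", "Phone"},
        {"Phone"},
    ]

    respondents = len(responses)
    for option in ("Email", "Chat", "Phone"):
        picked = sum(option in r for r in responses)
        print(f"{option}: {picked / respondents:.0%}")
    # Email: 75%, Chat: 50%, Phone: 50% -- the three together total 175%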

Interpretation:

  • High-percentage options indicate common challenges or widely used features
  • Low-percentage options may indicate niche concerns or underutilized capabilities
  • Option combination patterns cannot be inferred from the per-option totals; the aggregate view does not show which options were selected together (see Data Limitations below)

Dropdown Questions

Display format: Identical to Radio questions

  • Bar chart showing response count or percentage for each option
  • Percentage distribution across all options

Interpretation: Same as Radio questions—focus on highest percentage responses and distribution patterns

NPS Questions

Display format:

  • Distribution chart showing percentage in each category:
    • Detractors (0-6)
    • Passives (7-8)
    • Promoters (9-10)
  • NPS score calculation: (% Promoters - % Detractors)
  • Detailed response count for each score value (0-10)
  • Follow-up question responses grouped by category

NPS Score Interpretation:

  • +50 to +100: Excellent—strong loyalty and recommendation likelihood
  • +10 to +49: Good—positive sentiment with room for improvement
  • 0 to +9: Needs attention—neutral sentiment, risk of churn
  • -100 to -1: Critical—negative sentiment requiring immediate action

Category distribution insights:

  • High Promoter percentage indicates strong satisfaction and loyalty
  • High Detractor percentage reveals serious dissatisfaction requiring intervention
  • High Passive percentage suggests satisfied but unenthusiastic users vulnerable to competitive alternatives

Follow-up responses: Text responses are grouped by category (Detractors, Passives, Promoters) enabling targeted analysis of sentiment drivers for each group
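
The score arithmetic is simple to verify by hand or in code. A minimal Python sketch (the scores are hypothetical) that buckets 0-10 responses into the three categories and applies the calculation defined above:

    def nps(scores: list[int]) -> int:
        """Net Promoter Score: % Promoters (9-10) minus % Detractors (0-6)."""
        detractors = sum(1 for s in scores if s <= 6)
        promoters = sum(1 for s in scores if s >= 9)
        return round((promoters - detractors) / len(scores) * 100)

    # Hypothetical responses: 5 Promoters, 3 Passives, 2 Detractors.
    print(nps([10, 9, 9, 10, 9, 7, 8, 7, 5, 3]))  # 50% - 20% = 30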

Text-Based Question Analytics

Text-based question types (Single-line Text, Multi-line Text) display individual responses in list format without aggregation or statistical analysis.

Single-Line Text Questions

Display format:

  • List of individual text responses
  • Each response displayed on a separate line
  • Response count displayed
  • "Show more" link if response volume exceeds initial display limit

Review approach:

  • Scan responses for common themes or patterns
  • Identify recurring words or phrases indicating trends
  • Note outlier responses that may reveal unique insights
  • Manual categorization may be needed for quantitative analysis
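
For large response volumes, a quick word-frequency pass can surface recurring themes before reading every entry. A minimal Python sketch, assuming the responses have been copied out of the Analyze tab by hand (the module offers no export):

    from collections import Counter
    import re

    # Hypothetical responses copied from the Analyze tab.
    responses = [
        "Faster search would help",
        "Search is slow on mobile",
        "Love the new dashboard",
    ]

    stopwords = {"the", "is", "on", "a", "would"}
    words = (w for r in responses for w in re.findall(r"[a-z']+", r.lower()))
    print(Counter(w for w in words if w not in stopwords).most_common(5))
    # [('search', 2), ('faster', 1), ('help', 1), ...]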

Multi-Line Text Questions

Display format:

  • List of individual text responses
  • Each response displayed in a text block
  • Response count displayed
  • "Show more" link for extensive response lists

Review approach:

  • Read responses for detailed qualitative insights
  • Identify common themes, suggestions, or concerns
  • Extract actionable recommendations from detailed feedback
  • Consider sentiment analysis or thematic coding for large response volumes

Data Limitations

No raw data export: The Magentrix Survey module does not provide raw data export or CSV download capabilities. All analytics are viewed exclusively through the Analyze tab interface.

No cross-tabulation: The system does not provide built-in cross-tabulation or correlation analysis between questions. Administrators must manually review response patterns to identify relationships.

Aggregate views only for choice questions: Choice-based questions show only aggregated summaries—individual respondent answers across multiple questions cannot be viewed together.

No respondent identification: Individual respondent identities are not visible in analytics results to maintain response confidentiality, even for authenticated users.

Date Range Filtering Strategies

Analyzing Specific Timeframes

Campaign-specific analysis:

  1. Note the start and end dates of a distribution campaign
  2. Set Date Range to Custom Range matching campaign dates
  3. Review metrics to assess campaign effectiveness

Month-over-month comparison:

  1. Set Date Range to a specific month (e.g., October 1-31)
  2. Note Views, Conversion Rate, and Submissions
  3. Change Date Range to the next month (e.g., November 1-30)
  4. Compare metrics to identify trends
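
Since the Analyze tab shows one date range at a time, the comparison itself happens outside the tool. A minimal Python sketch (the figures are hypothetical) computing month-over-month change from the noted metrics:

    october = {"views": 400, "submissions": 120}   # noted from the October range
    november = {"views": 500, "submissions": 130}  # noted from the November range

    for metric in ("views", "submissions"):
        change = (november[metric] - october[metric]) / october[metric] * 100
        print(f"{metric}: {change:+.1f}% month over month")
    # views: +25.0%, submissions: +8.3% -- conversion fell from 30% to 26%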

Event feedback analysis:

  1. Set Date Range to event dates (e.g., conference week)
  2. Review responses collected during event timeframe
  3. Isolate event-specific feedback from ongoing survey responses

Trend Analysis Approaches

Quarter-over-quarter trends:

  • Analyze Q1, Q2, Q3, Q4 separately
  • Compare Submissions volume and Conversion Rates
  • Identify seasonal patterns or declining engagement

Before and after comparisons:

  • Set Date Range to period before a change (e.g., portal update)
  • Note sentiment and satisfaction levels
  • Set Date Range to period after the change
  • Compare metrics to assess change impact

Rolling window analysis:

  • Analyze "Last 30 days" repeatedly over time
  • Track rolling metrics to identify emerging trends
  • Monitor Conversion Rate changes indicating survey fatigue or increased effectiveness

Best Practices for Analytics Review

Regular Monitoring Schedule

Weekly monitoring: For active surveys with high response volumes, review analytics weekly to:

  • Identify sudden Conversion Rate drops indicating issues
  • Monitor Submissions growth versus distribution efforts
  • Catch technical problems quickly

Monthly monitoring: For ongoing feedback surveys, conduct monthly reviews to:

  • Assess overall performance trends
  • Compare month-over-month metrics
  • Plan survey refresh or archiving timing

Campaign-specific monitoring: For time-limited surveys, monitor continuously during collection period to:

  • Optimize distribution timing based on response patterns
  • Identify low-performing distribution channels
  • Maximize response collection during campaign window

Conversion Rate Optimization

Establish baselines: Track initial Conversion Rate for new surveys to establish performance expectations

Identify drop-off patterns: If Conversion Rate declines over time:

  • Survey fatigue may be affecting completion
  • Consider refreshing survey content or distribution messaging
  • Evaluate if survey length or question complexity needs reduction

A/B testing approach: Create cloned surveys with variations:

  • Test different question orders
  • Compare one-question-per-page versus all-questions-on-one-page
  • Evaluate different Thank You page messaging
  • Deploy both versions and compare Conversion Rates
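
The module reports each variant's Conversion Rate but does not indicate whether a difference between variants is statistically meaningful. One common approach, not a module feature, is a two-proportion z-test on the variants' Submissions and Views. A minimal Python sketch with hypothetical counts:

    from math import sqrt

    def z_statistic(subs_a: int, views_a: int, subs_b: int, views_b: int) -> float:
        """Two-proportion z-statistic for the conversion rates of variants A and B."""
        p_a, p_b = subs_a / views_a, subs_b / views_b
        pooled = (subs_a + subs_b) / (views_a + views_b)
        se = sqrt(pooled * (1 - pooled) * (1 / views_a + 1 / views_b))
        return (p_a - p_b) / se

    # Hypothetical: variant A converts 45/100 Views, variant B converts 30/100.
    z = z_statistic(45, 100, 30, 100)
    print(f"z = {z:.2f}")  # |z| above ~1.96 suggests significance near the 95% level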

Question-Level Analysis

Identify consensus questions: Questions with strong clustering (80%+ selecting one option) indicate:

  • Clear respondent consensus
  • Possible question redundancy if results are consistently similar
  • Opportunity to remove from future surveys if not providing new insights

Identify divisive questions: Questions with even distribution across options may indicate:

  • Question confusion requiring clarification
  • Genuinely mixed experiences requiring deeper investigation
  • Opportunity for follow-up surveys exploring the division

Text response review priorities:

  • Prioritize Detractor follow-up responses in NPS questions for immediate action opportunities
  • Review Multi-line text responses for detailed improvement suggestions
  • Scan Single-line text for unexpected themes not covered in choice-based questions

Action Planning from Analytics

High-impact insights:

  • Prioritize issues mentioned in multiple text responses
  • Act on clear Detractor feedback from NPS follow-ups
  • Address low satisfaction ratings (0-3 on Rating questions)

Quick wins identification:

  • Look for specific feature requests mentioned repeatedly
  • Identify easy-to-fix issues with high response frequency
  • Prioritize actionable suggestions from text responses

Trend validation:

  • Compare survey results with other data sources (support tickets, usage analytics)
  • Validate concerning trends with follow-up surveys or interviews
  • Confirm positive trends before broad communication

Reporting Survey Insights

Executive summaries: Create concise summaries highlighting:

  • NPS score and trend direction
  • Top 3 themes from text responses
  • Critical action items from Detractor feedback
  • Key performance metrics (Submissions, Conversion Rate)

Stakeholder reports: Tailor analytics presentation to audience:

  • Product teams: Feature-specific feedback and improvement suggestions
  • Training teams: Course-specific satisfaction and clarity ratings
  • Support teams: Support quality ratings and common issues
  • Executive leadership: High-level trends and strategic implications

Action plans: Always include next steps with survey insights:

  • Specific actions to address identified issues
  • Timeline for implementation
  • Ownership assignments for action items
  • Follow-up survey plans to measure improvement

Archived Survey Analytics

Accessing Historical Data

Archived surveys retain full analytics access:

  1. Navigate to Setup > Create > Surveys
  2. Locate the archived survey (Status = Archived)
  3. Select the survey
  4. Click the Analyze tab
  5. All historical response data remains accessible
Note: Archived surveys cannot collect new responses—the link is inactive. Analytics show only responses collected before archiving.

Historical Comparison Uses

Year-over-year comparisons:

  • Compare current survey results with archived previous year's survey
  • Identify satisfaction trends over time
  • Measure program improvement effectiveness

Survey iteration assessment:

  • Compare archived survey version with current survey
  • Evaluate if changes improved Conversion Rate or response quality
  • Determine if survey evolution achieved intended goals

Baseline establishment:

  • Use archived surveys as performance baselines
  • Set targets for new surveys based on historical performance
  • Identify best-performing historical surveys for template development

Troubleshooting Analytics Issues

No Data Displaying

Possible causes:

  • Survey is still in Draft status (Analyze tab not available)
  • Date Range filter excludes all responses (adjust to broader range)
  • Survey was just published and no responses collected yet

Resolution: Verify the survey is Published, adjust the Date Range to This Year or a broader range, and confirm the survey link has been distributed

Unexpected Conversion Rate

Very low Conversion Rate (under 10%):

  • Review survey length—may be too long
  • Preview survey to check for technical display issues
  • Verify questions are clear and not confusing
  • Check that required questions are not excessive

Very high Views without corresponding Submissions:

  • May indicate that survey preview testing is being counted as Views
  • Could indicate technical submission errors
  • Review Thank You page configuration to ensure submission completes properly

Missing Expected Responses

Responses expected but Submissions show 0:

  • Verify survey is Published (not Draft)
  • Confirm survey link is correct and active
  • Test survey completion yourself to verify functionality
  • Check that users are not encountering technical errors

Analyzing survey results turns raw response data into actionable organizational insight. Through comprehensive metrics tracking, visual response aggregation, and flexible date range filtering, the Analyze tab supports data-driven decision making, trend identification, and continuous improvement.


Jump to Survey Module Checklist

<< Publishing and Sharing SurveysSurvey and Training Module Integration >>