About the Survey Module
The Survey module enables administrators to create feedback collection and sentiment measurement tools for portal users. Organizations can design custom surveys with multiple question types, collect responses through shared links or automated prompts, and analyze aggregated results to inform business decisions and improve user experiences across partner, customer, and employee communities.
The module turns traditional feedback processes into measurable experiences that provide real-time insight into user sentiment, satisfaction levels, and actionable feedback, while submission controls and automated tracking protect response integrity.
Core Functionality
Survey Design and Question Types
Flexible Question Architecture: Surveys support nine distinct question types that accommodate different feedback collection needs and measurement approaches. Each question type serves specific data collection purposes and provides appropriate response mechanisms for users.
Available Question Types:
- Rating: Numeric scale from 0-10 with configurable labels (e.g., "Strongly disagree" to "Strongly agree") for quantitative satisfaction measurement
- Sentiment: Preconfigured sentiment scale for quick emotional response capture
- Radio: Single-select options from a custom list for mutually exclusive choices
- Single Checkbox: Individual checkbox for acknowledgments, opt-ins, or binary responses
- Checkboxes: Multi-select options allowing users to choose multiple responses from a list
- Dropdown: Single-select dropdown menu for space-efficient choice presentation
- Single-line Text: Short free-form text input for brief responses like job titles or names
- Multi-line Text: Extended text area for detailed feedback, comments, or explanations
- Net Promoter Score (NPS): Standard 0-10 NPS scale with automated follow-up questions based on score category (Detractors 0-6, Passives 7-8, Promoters 9-10)
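The NPS bands above can be expressed as a small sketch. This is illustrative only (function names are hypothetical, not part of the module), but the bands and the standard NPS formula (% Promoters minus % Detractors) follow the widely used definition:

```python
def nps_category(score: int) -> str:
    """Classify a 0-10 response into the bands described above:
    Detractors 0-6, Passives 7-8, Promoters 9-10."""
    if not 0 <= score <= 10:
        raise ValueError("NPS score must be between 0 and 10")
    if score <= 6:
        return "Detractor"
    if score <= 8:
        return "Passive"
    return "Promoter"

def nps_score(scores: list[int]) -> float:
    """Overall NPS: percentage of Promoters minus percentage of Detractors."""
    cats = [nps_category(s) for s in scores]
    promoters = cats.count("Promoter") / len(cats)
    detractors = cats.count("Detractor") / len(cats)
    return round((promoters - detractors) * 100, 1)
```

For example, four responses of 10, 10, 0, and 8 yield 50% Promoters and 25% Detractors, for an NPS of 25.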
Question Configuration: Each question supports help text, placeholder text where applicable, and required field designation to ensure appropriate data collection and user guidance.
Survey Structure and Presentation
Page Skip Logic: Administrators configure how survey questions are displayed to respondents through two presentation modes that affect user experience and completion patterns.
Presentation Options:
- One question per page: Each question appears on its own screen with Next/Previous navigation, supporting focused attention on individual questions and reduced cognitive load for longer surveys
- All questions on one page: All questions display simultaneously with single-submission workflow, optimizing completion speed for shorter feedback forms
Page Skip Logic settings can be modified after survey publication and affect only new users who access the survey after the change, without impacting responses already in progress.
Survey Content Elements: Beyond questions, surveys include configurable title and description fields that provide context to respondents, and a customizable Thank You page with optional redirect functionality to guide users after submission.
Survey Lifecycle Management
Draft Status: New surveys begin as drafts, allowing administrators to build, test, and refine survey content without exposing incomplete work to respondents. Draft surveys support full editing capabilities including adding, removing, reordering, and modifying questions.
Publication Process: Publishing surveys activates them for respondent access via direct links and automated prompts. Once published, the survey structure becomes locked to protect response data integrity—administrators cannot add new questions, remove existing questions, or change question order. Published surveys can still be previewed, cloned, and archived as needed.
Archive Functionality: Archiving a published survey permanently deactivates its link, preventing new responses while preserving all collected data for historical analysis. Archived surveys remain viewable in the Analyze tab for continued reporting access but cannot be reactivated.
Response Tracking and Analytics
Submission Controls: The system enforces submission integrity through user-based tracking for authenticated portal users. Each authenticated user can submit a given survey only once, protecting data quality and preventing duplicate responses that could skew results.
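The one-submission-per-user rule can be sketched as a simple guard keyed on the user and survey. This is a minimal illustration of the behavior described above, not the module's actual implementation:

```python
# Illustrative in-memory store; the real system persists this server-side.
_submissions: set = set()  # holds (user_id, survey_id) pairs

def try_submit(user_id: str, survey_id: str) -> bool:
    """Record a submission unless this authenticated user has already
    submitted this survey; return False for a duplicate attempt."""
    key = (user_id, survey_id)
    if key in _submissions:
        return False  # duplicate blocked to protect data quality
    _submissions.add(key)
    return True
```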
Analytics Dashboard: The Analyze tab provides comprehensive survey performance metrics including:
- Views: Total count of survey link clicks, including multiple opens by the same user and incomplete sessions
- Conversion Rate: Percentage of views that resulted in completed submissions, calculated as (Submissions ÷ Views) × 100
- Submissions: Total number of completed survey responses
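The Conversion Rate metric above is a straightforward ratio. A minimal sketch (the zero-views behavior is an assumption for illustration, not documented module behavior):

```python
def conversion_rate(views: int, submissions: int) -> float:
    """Conversion Rate as defined above: (Submissions / Views) * 100."""
    if views == 0:
        return 0.0  # assumption: report 0% when the survey has no views yet
    return round(submissions / views * 100, 1)
```

So a survey with 200 views and 50 submissions shows a 25% conversion rate.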
Date Range Filtering: Analytics can be filtered by custom date ranges or preset periods (This Month, This Year, etc.) to analyze survey performance over specific timeframes.
Question-Level Results: Survey results display aggregated response data by question. Choice-based questions (Rating, Sentiment, Radio, Checkboxes, Dropdown, NPS) show charts and statistical summaries. Text-based questions (Single-line Text, Multi-line Text) display individual responses in list format.
Distribution and Access
Survey Link Sharing
Link Generation: Each survey has a unique URL that administrators can copy through the Get Link function. The system provides two link formats:
- Internal link: Relative URL for use within the portal environment
- Full path: Absolute URL including domain for external distribution
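The relationship between the two link formats can be illustrated with standard URL joining. The path and domain below are hypothetical examples; the real values come from the Get Link function:

```python
from urllib.parse import urljoin

# Hypothetical values for illustration only.
internal_link = "/surveys/q3-partner-feedback"   # relative URL, in-portal use
portal_base = "https://portal.example.com"       # portal domain

# The full path is the internal link resolved against the portal domain,
# suitable for external distribution.
full_path = urljoin(portal_base, internal_link)
```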
Open Access Model: Survey links are publicly accessible—anyone with the link can access and complete the survey without authentication or login requirements. This enables broad feedback collection from diverse audiences including non-portal users.
Link Usage in Portal Navigation: Survey links can be embedded directly in portal navigation through:
- Tabs: Create custom tabs that load surveys as destination pages, making feedback collection accessible from primary navigation
- Engagement Pages: Use the Tiles widget to create navigational shortcuts that link to surveys, integrating feedback prompts into custom landing pages and user experiences
Training Module Integration
Automated Survey Prompts: Surveys can be attached to courses and learning paths in the Training module, creating automated feedback collection immediately following course completion. When users complete training content with an attached survey, the system automatically displays a "Share Your Feedback" confirmation dialog.
User Response Options: The feedback prompt presents users with immediate choices:
- Yes: Opens the survey in a modal for immediate completion
- No Thanks: Closes the prompt and allows users to continue without submitting feedback
Course-Specific Tracking: When surveys are attached to multiple courses or learning paths, the system tracks submissions separately for each training item. Users can submit the same survey once per course or learning path they complete, with each submission tagged with the corresponding course/learning path identifier for analysis.
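The per-course tracking described above amounts to deduplicating on a composite key that includes the training item, not just the user and survey. A sketch under that assumption (class and method names are illustrative, not the actual schema):

```python
class TrainingSurveyTracker:
    """Allows one submission per user, per survey, per attached
    course or learning path, as described above."""

    def __init__(self) -> None:
        self._seen = set()  # (user_id, survey_id, training_item_id) triples

    def can_submit(self, user_id: str, survey_id: str, training_id: str) -> bool:
        return (user_id, survey_id, training_id) not in self._seen

    def record(self, user_id: str, survey_id: str, training_id: str) -> None:
        self._seen.add((user_id, survey_id, training_id))
```

With this key, a user who completes two courses that share the same attached survey may submit it once for each course.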
Integration Features
Engagement Pages Integration
Tiles Widget Integration: Surveys can be embedded in Engagement Pages using the Tiles widget, which enables administrators to create visual navigation elements that link directly to surveys. This integration supports strategic placement of feedback collection within custom portal experiences.
Tile Configuration for Surveys: When configuring tiles for survey access, administrators specify the survey URL in the URL field and configure the link target (Same Window or New Window) to control how users access the survey experience.
Module Integration Architecture
Training Module Connection: Survey integration with the Training module ties course completion directly to survey availability, connecting educational experiences with immediate feedback capture.
Data Consistency: Survey submission data maintains referential integrity with Training module course completion records through stored course/learning path identifiers, enabling cross-module reporting and analysis of training effectiveness.
Common Use Cases
Training and Course Feedback
Post-Training Assessment: Surveys attached to courses collect immediate feedback on training effectiveness, content clarity, instructor performance, and learning outcomes while experiences remain fresh in learners' minds.
Learning Path Evaluation: Multi-course learning path surveys gather comprehensive feedback on curriculum structure, progression logic, and overall program effectiveness to inform continuous training improvement.
User Experience and Satisfaction Measurement
Portal Experience Feedback: General satisfaction surveys distributed through Engagement Page tiles or standalone links collect broad user sentiment about portal functionality, content accessibility, and overall user experience quality.
Feature-Specific Feedback: Targeted surveys linked from specific portal sections or features gather focused feedback on discrete functionality, supporting iterative product improvement and feature prioritization.
Partner and Customer Engagement
Onboarding Experience Assessment: Surveys integrated into onboarding workflows collect feedback on activation processes, documentation clarity, and support effectiveness during critical early partnership or customer relationship phases.
Net Promoter Score Tracking: NPS surveys measure loyalty and recommendation likelihood across partner and customer populations, providing standardized metrics for relationship health monitoring and trend analysis.
Event and Program Evaluation
Event Feedback Collection: Post-event surveys shared via email or embedded in follow-up communications gather attendee feedback on content, logistics, networking opportunities, and overall event value.
Program Effectiveness Measurement: Surveys distributed to program participants assess satisfaction, outcome achievement, and areas for program enhancement to support continuous improvement cycles.
Best Practices for Implementation
Survey Design Considerations
Question Clarity: Use clear, unambiguous question text that respondents can understand without additional context. Include help text for complex questions requiring clarification or guidance.
Survey Length Balance: Balance comprehensive data collection needs with user attention span and completion willingness. Consider breaking extensive feedback needs into multiple focused surveys rather than single lengthy instruments.
Required vs. Optional Questions: Use required field designation strategically—mark only essential questions as required to reduce abandonment while ensuring critical data collection.
Question Type Selection: Choose question types appropriate to data collection needs: use Rating or NPS for quantitative measurement, Radio or Checkboxes for categorical responses, and text fields for detailed qualitative feedback.
Distribution Strategy
Audience Targeting: Consider survey audience when selecting distribution methods—use Training module integration for learner feedback, Engagement Page tiles for general portal users, and direct links for external audiences.
Timing Optimization: Distribute surveys when feedback will be most accurate and actionable—immediately after experiences for event feedback, during reflection periods for program assessment, or at regular intervals for ongoing satisfaction tracking.
Response Rate Optimization: Communicate survey purpose and value to encourage participation. Use Thank You page redirect functionality to acknowledge participation and guide users to next actions.
Implementation and Management
Pilot Testing: Test surveys with small user groups before broad deployment to identify confusing questions, technical issues, or unexpected response patterns that require adjustment.
Regular Review: Monitor survey analytics including conversion rates and question-level response patterns to identify opportunities for survey optimization and improved data collection effectiveness.
Archive Strategy: Establish clear criteria for survey archiving to prevent outdated or irrelevant surveys from continuing to collect responses that may not align with current organizational needs or measurement frameworks.
The Survey module provides feedback collection capabilities that support data-driven decision making, continuous improvement, and user experience optimization through flexible survey creation, distribution, and analysis.