Survey and Training Module Integration
The Survey module integrates with the Training Module to enable automated post-training feedback collection that captures learner satisfaction, course effectiveness, and improvement opportunities immediately after course or learning path completion. This integration transforms training from standalone educational activities into feedback-informed programs where survey data drives continuous improvement and validates training investment.
Survey and Training integration gives administrators automated feedback collection workflows that prompt learners for input while training experiences are fresh, course-specific response tracking for comparative analysis, and separate submission records for each training item to support targeted program optimization.
Integration Overview
Survey and Training Module integration enables administrators to attach published surveys to individual courses and learning paths, creating automatic post-completion feedback prompts that collect structured learner input without requiring manual survey distribution or learner navigation to separate feedback forms.
Integration Benefits
For learners:
- Immediate feedback opportunity when course experiences are fresh and detailed
- Convenient modal interface requiring no navigation away from course completion flow
- Optional participation maintaining learner autonomy and respecting time constraints
- Clear feedback channels demonstrating organizational commitment to training quality
For training administrators:
- Automated feedback collection eliminating manual survey distribution
- Course-specific response tracking enabling comparative effectiveness analysis
- Consistent feedback timing ensuring all learners receive prompts at optimal moments
- Reduced administrative overhead through system-managed survey presentation
For organizations:
- Data-driven training improvement through structured learner feedback
- Training effectiveness validation supporting investment justification
- Continuous program enhancement based on learner satisfaction trends
- Quality assurance mechanisms ensuring training meets learner needs
Attaching Surveys to Courses
Prerequisites
Before attaching surveys to courses:
Survey must be published: Only published surveys appear in course survey dropdown menus. Draft and archived surveys are not available for Training attachment.
Course must exist: The course must be created and configured in the Training Module before survey attachment.
Appropriate permissions: User must have Training Module management permissions to configure course settings.
Attachment Process for Courses
To attach a survey to an individual course:
- Navigate to the Training Module course management interface
- Locate and select the course requiring feedback collection
- Access the Course Settings or General configuration section
- Locate the Survey dropdown field
- Click the Survey dropdown to display available published surveys
- Select the desired survey from the list
- Click Save or Update to apply the survey attachment
- The survey becomes automatically associated with the course
Field location: The Survey dropdown is typically located in the Course Settings section alongside other course configuration options such as Price, Request URL, Category, and Display Settings.
Field type: Lookup dropdown populated with all published surveys from the Survey module.
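Under the hood, the attachment amounts to the course record holding a reference to one published survey. The sketch below illustrates that relationship with a hypothetical admin API call; the endpoint, field names, and the `surveyId` reference are assumptions for illustration, not the product's documented API.

```typescript
// Hypothetical shape of a course record after a survey is attached.
// Field names are illustrative; the actual schema may differ.
interface CourseSettings {
  courseId: string;
  title: string;
  category?: string;
  price?: number;
  surveyId?: string; // reference to a published survey; absent when no survey is attached
}

// Sketch: attach a published survey to a course via an assumed admin endpoint.
async function attachSurveyToCourse(courseId: string, surveyId: string): Promise<void> {
  const response = await fetch(`/api/training/courses/${courseId}`, {
    method: "PATCH",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ surveyId }), // only published surveys should be referenced here
  });
  if (!response.ok) {
    throw new Error(`Failed to attach survey ${surveyId} to course ${courseId}`);
  }
}
```

In a model like this, limiting the dropdown to published surveys keeps the stored reference pointing at a survey learners can actually complete.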
User Experience After Attachment
Once a survey is attached to a course, the following workflow activates:
Completion trigger:
- Learner completes all course requirements (lessons, quizzes, passing thresholds)
- Course status changes to "Completed" in the learner's training record
- System detects the completion and identifies the survey attached to the course
"Share Your Feedback" prompt:
- A "Share Your Feedback" confirmation dialog displays immediately after completion
- Dialog appears on the course information page or completion confirmation screen
- Learner sees two response options:
- Yes: Opens the survey in a modal overlay for immediate completion
- No Thanks: Closes the dialog and allows the learner to continue without feedback
Survey modal presentation (when learner clicks "Yes"):
- Survey opens in a modal window overlaying the current page
- Learner completes the survey within the modal interface
- Upon submission, the modal closes and learner returns to the course page
- Survey submission is recorded with course identifier metadata
Optional participation:
- Learners can select "No Thanks" to decline the feedback prompt
- Declining does not affect course completion status
- Learners who decline are not automatically prompted again for that course
- No negative consequences for declining feedback participation
Note: Detailed user interface flows and modal behavior are subject to further documentation updates as additional integration specifics are clarified.
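At a high level, though, the prompt behaves like a conditional check on course completion: if the completed course references a survey that this learner has not yet answered for this course, the "Share Your Feedback" dialog is shown. The sketch below illustrates that logic; every name in it (`CompletionEvent`, `hasSubmission`, `showFeedbackPrompt`, `openSurveyModal`) is assumed for illustration.

```typescript
// Illustrative types; the platform's actual event and record names may differ.
interface CompletionEvent {
  learnerId: string;
  courseId: string;
  attachedSurveyId?: string; // present only when a survey is attached to the course
}

// Assumed lookup: has this learner already submitted this survey for this course?
declare function hasSubmission(learnerId: string, surveyId: string, courseId: string): Promise<boolean>;
// Assumed UI hooks for the "Share Your Feedback" dialog and survey modal.
declare function showFeedbackPrompt(): Promise<"yes" | "no-thanks">;
declare function openSurveyModal(surveyId: string, context: { courseId: string }): Promise<void>;

async function onCourseCompleted(event: CompletionEvent): Promise<void> {
  // No attached survey: nothing to prompt.
  if (!event.attachedSurveyId) return;

  // One submission per learner per survey per course; skip the prompt if already submitted.
  if (await hasSubmission(event.learnerId, event.attachedSurveyId, event.courseId)) return;

  // Optional participation: "No Thanks" simply closes the dialog.
  const choice = await showFeedbackPrompt();
  if (choice === "yes") {
    // The submission is recorded with the course identifier as metadata.
    await openSurveyModal(event.attachedSurveyId, { courseId: event.courseId });
  }
}
```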
Attaching Surveys to Learning Paths
Attachment Process for Learning Paths
To attach a survey to a learning path:
- Navigate to the Training Module learning path management interface
- Locate and select the learning path requiring feedback collection
- Access the Learning Path Settings configuration section
- Locate the Survey dropdown field in the Learning Path Settings area
- Click the Survey dropdown to display available published surveys
- Select the desired survey from the list
- Click Save or Update to apply the survey attachment
- The survey becomes automatically associated with the learning path
Field location: The Survey dropdown is typically located in the Learning Path Settings section alongside other configuration options such as Price, Request URL, Self-Register toggle, and Language settings.
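Assuming the same hypothetical admin API used in the course sketch above, attaching a survey to a learning path differs only in the resource being updated:

```typescript
// Sketch only: the endpoint and field name mirror the course example and are not documented API.
async function attachSurveyToLearningPath(learningPathId: string, surveyId: string): Promise<void> {
  const response = await fetch(`/api/training/learning-paths/${learningPathId}`, {
    method: "PATCH",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ surveyId }),
  });
  if (!response.ok) {
    throw new Error(`Failed to attach survey ${surveyId} to learning path ${learningPathId}`);
  }
}
```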
User Experience for Learning Path Surveys
The learning path survey workflow follows the same pattern as course surveys:
Completion trigger:
- Learner completes all courses within the learning path
- Learning path status changes to "Completed"
- System detects the learning path completion and identifies the attached survey
Survey prompt and completion: Same "Share Your Feedback" dialog and modal interface as course surveys
Timing difference: Survey prompt appears after completing all courses in the learning path rather than after individual course completions, enabling holistic program feedback rather than course-specific input.
Course-Specific Response Tracking
Submission Rules by Survey Type
Survey submission tracking varies based on whether surveys are standalone or attached to Training content:
Standalone surveys (not attached to Training):
- Authenticated portal users can submit once per survey total
- Example: User completes "General Satisfaction Survey" and submits once; cannot resubmit
Course-attached surveys:
- Authenticated portal users can submit once per survey per course
- Example: "Training Feedback" survey is attached to both "Product Basics" course and "Advanced Features" course
- User completes "Product Basics" and submits survey
- User later completes "Advanced Features" and can submit the same survey again
- System tracks separate submissions for each course
Learning path-attached surveys:
- Authenticated portal users can submit once per survey per learning path
- Example: Same survey attached to two different learning paths allows two separate submissions
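Put differently, submissions are effectively deduplicated on a composite key of user, survey, and (when present) the attached course or learning path. The following sketch illustrates that rule; the record shape and helper names are assumptions, not the platform's actual data model.

```typescript
// Assumed submission record; the real storage model may differ.
interface SurveySubmission {
  userId: string;
  surveyId: string;
  courseId?: string;       // set when the survey was answered for a specific course
  learningPathId?: string; // set when the survey was answered for a specific learning path
}

// Build the deduplication key: standalone surveys key on (user, survey);
// training-attached surveys additionally key on the course or learning path.
function submissionKey(s: SurveySubmission): string {
  return [s.userId, s.surveyId, s.courseId ?? "", s.learningPathId ?? ""].join("|");
}

// A new submission is allowed only if no existing submission shares the same key.
function canSubmit(existing: SurveySubmission[], candidate: SurveySubmission): boolean {
  const key = submissionKey(candidate);
  return !existing.some((s) => submissionKey(s) === key);
}
```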
Metadata Tagging
Each survey submission from Training Module integration includes system-generated metadata identifying the source training item:
Course submissions: Include a course identifier linking the response to the specific course
Learning path submissions: Include a learning path identifier linking the response to the specific program
Analytical value: Metadata tagging enables:
- Course-specific response filtering in survey analytics
- Comparative analysis of feedback across multiple courses using the same survey
- Training item effectiveness comparison based on learner satisfaction patterns
- Targeted improvement efforts based on course-specific feedback themes
Note: The specific mechanisms for accessing course-specific response data within survey analytics are subject to further documentation updates.
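As a conceptual illustration of what the tagging enables (the response record and field names below are assumptions, not the analytics module's documented export format), course-tagged responses can be grouped and compared per course:

```typescript
// Assumed response record exported from survey analytics; field names are illustrative.
interface TaggedResponse {
  surveyId: string;
  courseId?: string;          // present for course-attached submissions
  learningPathId?: string;    // present for learning path-attached submissions
  satisfactionRating: number; // e.g. a 1-5 overall satisfaction score
}

// Group responses for one survey by the course they were submitted against,
// then compute an average satisfaction score per course for comparison.
function averageSatisfactionByCourse(responses: TaggedResponse[]): Map<string, number> {
  const byCourse = new Map<string, number[]>();
  for (const r of responses) {
    if (!r.courseId) continue; // skip standalone or learning path submissions
    const ratings = byCourse.get(r.courseId) ?? [];
    ratings.push(r.satisfactionRating);
    byCourse.set(r.courseId, ratings);
  }
  const averages = new Map<string, number>();
  for (const [courseId, ratings] of byCourse) {
    averages.set(courseId, ratings.reduce((a, b) => a + b, 0) / ratings.length);
  }
  return averages;
}
```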
Survey Selection Strategies
Single Survey for Multiple Courses
Approach: Attach the same survey to multiple related courses to enable comparative analysis
Use cases:
- Standard post-course feedback survey attached to all courses in a program
- Common effectiveness measurement across different training topics
- Consistent feedback instrument enabling cross-course comparison
- Simplified survey management with single survey instrument
Benefits:
- Standardized feedback collection across training portfolio
- Comparable metrics facilitating identification of high- and low-performing courses
- Reduced administrative overhead managing single survey rather than many
- Aggregated response volumes improving statistical reliability
Considerations:
- Questions must be generic enough to apply to all courses
- Course-specific questions require separate, targeted surveys
- Response volumes may accumulate rapidly, requiring regular monitoring
Course-Specific Surveys
Approach: Create dedicated surveys for individual courses requiring targeted feedback
Use cases:
- New courses requiring detailed feedback on specific content elements
- Specialized training with unique assessment needs
- Courses undergoing iterative development requiring focused improvement data
- High-value programs justifying dedicated feedback collection
Benefits:
- Targeted questions addressing course-specific content and objectives
- Detailed feedback enabling precise improvement identification
- Clear association between feedback and specific training investment
- Specialized question types aligned with unique course characteristics
Considerations:
- Increased survey management overhead with multiple instruments
- Lower response volumes per survey, potentially limiting statistical reliability
- Survey proliferation may complicate cross-course effectiveness comparison
Learning Path Program Surveys
Approach: Attach surveys to learning paths for holistic multi-course program feedback
Use cases:
- Certification programs requiring program-level effectiveness assessment
- Onboarding journeys with multiple courses requiring overall experience feedback
- Skill development tracks needing comprehensive program evaluation
- Multi-tier training progressions requiring sequencing and coherence feedback
Benefits:
- Holistic feedback on program structure, sequencing, and overall effectiveness
- Assessment of learning path coherence rather than isolated course quality
- Reduced survey fatigue with a single survey per program rather than one per course
- Strategic program-level insights supporting curriculum design decisions
Considerations:
- Questions should focus on program-level themes rather than individual course details
- Timing delay between starting and completing learning paths may affect feedback freshness
- Course-specific improvement insights require supplementary course-level surveys
Survey Design for Training Integration
Question Design Considerations
Focus on actionable feedback: Design questions that produce responses administrators can act upon to improve training quality
Balance quantitative and qualitative: Include rating questions for metrics tracking and text questions for detailed improvement suggestions
Keep surveys brief: Respect learner time with focused surveys (typically 5-10 questions) that don't discourage completion
Align questions with course objectives: Ask about learning outcome achievement, content clarity, and objective satisfaction rather than peripheral topics
Recommended Question Types
Overall satisfaction (Rating or NPS):
- "How satisfied are you with the overall [course/learning path] experience?"
- "How likely are you to recommend this [course/learning path] to colleagues?" (NPS)
Content effectiveness (Rating):
- "How would you rate the clarity of the training materials?"
- "How relevant was the course content to your role?"
Learning confidence (Rating):
- "How confident are you applying the skills learned in this training?"
- "Rate your ability to perform the key tasks covered in this course."
Specific aspects (Radio or Checkboxes):
- "Which aspects of the course were most valuable?" (Checkboxes)
- "What was your primary challenge completing this training?" (Radio)
Open-ended improvement (Multi-line Text):
- "What suggestions do you have for improving this [course/learning path]?"
- "What additional topics or resources would enhance this training?"
Survey Naming Conventions
Generic surveys: Use names indicating multi-course applicability
- Examples: "Standard Course Feedback", "Post-Training Survey", "Course Evaluation"
Course-specific surveys: Include course name or identifier
- Examples: "Product Fundamentals Feedback", "Advanced Sales Training Survey"
Learning path surveys: Include program name and scope indicator
- Examples: "Partner Onboarding Program Evaluation", "Certification Track Feedback"
Configuration Best Practices
Survey Publication Timing
Publish before attachment: Always publish surveys before attempting to attach them to courses or learning paths—draft surveys do not appear in Training Module dropdown menus
Test before broad deployment: Attach surveys to pilot courses and test the full learner workflow before rolling out to entire training catalogs
Coordinate with course launch: Ensure surveys are published and attached before course publication to activate feedback collection from first learners
Attachment Strategy
Start selectively: Attach surveys to high-priority courses or new content requiring feedback rather than surveying all training immediately
Monitor response rates: Track survey completion percentages to identify learner willingness to provide feedback and adjust survey length or questions accordingly
Review and refine: Regularly review survey response patterns and refine questions to improve feedback quality and actionable insight generation
Archive outdated surveys: When refreshing survey questions, archive old surveys and create new versions to maintain data integrity for historical comparison
Learner Communication
Set expectations: Communicate survey purpose and approximate completion time in course descriptions or welcome materials
Emphasize value: Explain how feedback drives training improvement to encourage learner participation
Acknowledge feedback: Share improvements made based on learner feedback to demonstrate survey value and encourage future participation
Respect optional nature: Clearly communicate that surveys are optional and declining has no negative consequences for learners
Integration with Other Modules
Journey Builder Connection
Surveys attached to Training courses can complement Journey Builder Course Completion steps:
Workflow example:
- Journey includes Course Completion step requiring specific course
- Learner completes course (Journey step marks complete)
- Course has attached survey prompting for feedback
- Journey may include subsequent Survey Submission step requiring feedback before phase completion (separate from automatic course survey prompt)
Note: Course-attached surveys differ from Journey Builder Survey Submission steps—course surveys are optional automatic prompts while Journey steps can require survey completion for journey progression.
Rewards Module Synergy
Organizations may combine Training and Survey integration with Rewards Module incentives:
Potential approach (subject to Rewards Module capabilities):
- Learners earn points for course completion
- Additional points awarded for survey completion
- Combined incentives encourage both learning and feedback provision
Note: Specific Rewards Module integration details are covered in separate Rewards Module documentation.
Troubleshooting
Survey Not Appearing in Dropdown
Possible causes:
- Survey is in Draft status (must be Published)
- Survey is Archived (not available for attachment)
- User lacks permissions to view surveys
Resolution: Verify survey status is Published, check user permissions for Survey module access
Learners Not Seeing Survey Prompt
Possible causes:
- Course completion requirements not fully satisfied
- Survey was not attached before learner completed course
- Browser or system technical issues preventing modal display
Resolution: Verify course completion status, confirm survey attachment timing, test survey prompt in different browsers
Survey Submitted But Not Counted
Possible causes:
- Learner previously submitted the same survey for the same course
- Technical submission error not recording response
Resolution: Verify submission rules (one per user per course), check survey analytics for submission recording, test submission workflow
Note: Detailed troubleshooting workflows and integration behavior specifics are subject to further documentation updates as additional technical details are confirmed.