Publishing and Sharing Surveys

Administrators can publish surveys to activate them for respondent access and distribute survey links through multiple channels including direct URL sharing, portal navigation integration, and Training module attachment. The Survey module provides flexible distribution mechanisms that support diverse feedback collection strategies while maintaining response integrity and enabling comprehensive analytics tracking.

Survey publication transforms draft surveys into active data collection instruments with locked structural configurations that protect response data quality while enabling continued customization of post-submission experiences and presentation settings.

Requirements

To publish and share surveys, users must be assigned a security role with one of the following permissions:

  • Administrator System Role, or
  • Additional Role Settings: Enable Survey Management Access

Survey Lifecycle Overview

Surveys progress through distinct lifecycle states that affect respondent access, administrative capabilities, and data collection status:

Draft Status

Characteristics:

  • Initial state for all newly created surveys
  • Full editing capabilities including adding, removing, and reordering questions
  • Survey link is not active—respondents cannot access the survey
  • No response tracking or analytics collection
  • Enables comprehensive testing and refinement before deployment

Purpose: Draft status provides a safe development environment where administrators can build, configure, and perfect survey content without exposing incomplete work to respondents or affecting data collection integrity.

Published Status

Characteristics:

  • Survey link becomes active—respondents can access and complete the survey
  • Question structure is locked—cannot add, remove, or reorder questions
  • Response tracking begins—Views, Conversion Rate, and Submissions are recorded
  • Analyze tab displays with performance metrics and question-level results
  • Thank You page and Page Skip Logic remain editable
  • Cannot be reverted to Draft status

Purpose: Published status activates surveys for data collection while protecting response data integrity; the structural lockdown prevents mid-collection changes that could invalidate comparative analysis.

Archived Status

Characteristics:

  • Survey link becomes inactive—respondents receive an error when attempting to access
  • All collected response data is preserved
  • Analyze tab remains accessible for historical reporting
  • Survey cannot be reactivated or unarchived
  • Survey remains visible in Surveys Home list with Archived status indicator

Purpose: Archived status permanently deactivates surveys that are no longer relevant for active data collection while maintaining all historical response data for continued analysis and reporting.
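
The lifecycle is strictly one-way: Draft to Published to Archived, with no reverse transitions. The minimal sketch below summarizes these rules; the names (SurveyStatus, can_transition) are illustrative only and are not part of the product.

  from enum import Enum

  class SurveyStatus(Enum):
      DRAFT = "Draft"
      PUBLISHED = "Published"
      ARCHIVED = "Archived"

  # Draft -> Published -> Archived, one direction only.
  ALLOWED_TRANSITIONS = {
      SurveyStatus.DRAFT: {SurveyStatus.PUBLISHED},
      SurveyStatus.PUBLISHED: {SurveyStatus.ARCHIVED},
      SurveyStatus.ARCHIVED: set(),   # archived surveys cannot be reactivated
  }

  def can_transition(current: SurveyStatus, target: SurveyStatus) -> bool:
      return target in ALLOWED_TRANSITIONS[current]

  assert can_transition(SurveyStatus.DRAFT, SurveyStatus.PUBLISHED)
  assert not can_transition(SurveyStatus.PUBLISHED, SurveyStatus.DRAFT)   # cannot revert to Draft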

Publishing Surveys

Publication Process

To publish a draft survey and activate it for respondent access:

  1. Navigate to Setup > Create > Surveys
  2. Select the draft survey you want to publish
  3. On the Survey Builder page, click the Publish button in the top-right corner
  4. A confirmation dialog appears:
    • Message: "Are you sure you want to publish the survey? When published, users will be able to access it with the link."
    • Actions: Yes (confirm publication) or Cancel (return without publishing)
  5. Click Yes to confirm publication
  6. The survey status changes from Draft to Published
  7. The survey link becomes immediately active for respondent access
  8. Analytics tracking begins recording Views, Conversion Rate, and Submissions

What Changes After Publishing

Activated capabilities:

  • Survey link becomes functional and publicly accessible
  • Analyze tab appears with performance metrics and date range filtering
  • Views counter begins tracking survey link clicks
  • Submissions counter records completed responses
  • Conversion rate calculation displays (Submissions ÷ Views × 100)
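
For reference, the conversion rate reported on the Analyze tab is the share of link views that result in submissions. A small illustrative calculation (the helper function is hypothetical, not a product API):

  # Conversion Rate = Submissions / Views x 100, as described above.
  def conversion_rate(views: int, submissions: int) -> float:
      if views == 0:
          return 0.0   # no link clicks recorded yet
      return submissions / views * 100

  print(conversion_rate(views=200, submissions=50))   # 25.0 (percent)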

Locked capabilities:

  • Question structure becomes read-only
  • Cannot add new questions to the survey
  • Cannot delete existing questions from the survey
  • Cannot reorder questions within the survey
  • Cannot change question types for existing questions
  • Survey tab Question Types panel is hidden
  • Edit and Delete buttons are removed from question cards

Editable capabilities that remain:

  • Thank You page content (Featured Image, Thank You Header Text, Thank You Body Text, Thank You Action configuration)
  • Page Skip Logic setting (changes affect new users only)
  • Survey name (through More Actions > Edit Settings)

Publication Considerations

Finalize before publishing: Ensure all questions are complete, properly ordered, and thoroughly tested before publishing. Structural changes after publication require cloning the survey and creating a new version.

Preview validation: Before publishing, always use More Actions > Preview to experience the survey as respondents will, validating question clarity, flow logic, and Thank You page presentation.

Communication planning: Prepare distribution channels (email campaigns, portal announcements, training attachments) before publishing to maximize immediate response collection.

Analytics preparation: Understand baseline expectations for Views and Submissions to establish performance benchmarks for conversion rate monitoring.

Generating and Copying Survey Links

Accessing the Survey Link

To obtain the survey URL for distribution:

  1. Open the survey (draft or published)
  2. Click More Actions (⋯ button) in the top-right corner
  3. Select Get Link
  4. The Get Survey Link panel appears

Survey Link Panel Components

URL text field:

  • Displays the complete survey URL
  • Format: /survey/take/[survey-identifier] (internal link) or full domain path
  • URL is automatically generated when the survey is created

Copy button:

  • Copies the current link format to clipboard
  • Click Copy to capture the URL for pasting into distribution channels

Link format toggle (arrow next to Copy button):

  • Internal link: Relative URL format for use within the portal environment
    • Format: /survey/take/[survey-identifier]
    • Use case: Embedding in portal tabs, Engagement Pages, or internal communications
  • Full path: Absolute URL format including the complete domain
    • Format: https://[your-portal-domain]/survey/take/[survey-identifier]
    • Use case: External distribution via email, external websites, or non-portal channels
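
As an illustration of the two formats, the sketch below assembles both link variants from a portal domain and survey identifier. The helper names and the example domain are hypothetical; in practice, copy the URL directly from the Get Survey Link panel.

  # Illustrative construction of the two link formats described above.
  def internal_link(survey_identifier: str) -> str:
      # Relative path for in-portal embedding
      return f"/survey/take/{survey_identifier}"

  def full_path_link(portal_domain: str, survey_identifier: str) -> str:
      # Absolute URL for external distribution
      return f"https://{portal_domain}{internal_link(survey_identifier)}"

  print(internal_link("abc123"))                           # /survey/take/abc123
  print(full_path_link("portal.example.com", "abc123"))    # https://portal.example.com/survey/take/abc123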

Link Distribution Channels

Email campaigns: Include survey links in email communications to targeted audiences, support signature blocks, or automated notification workflows.

Portal navigation: Embed surveys in tabs or Engagement Pages for permanent placement in portal navigation structures.

External websites: Share full path links on external websites, social media platforms, or third-party applications.

Training communications: Include links in course welcome messages, supplementary materials, or completion notifications for manual feedback collection outside automatic Training integration.

Embedding Surveys in Portal Navigation

Creating Survey Tabs

Surveys can be integrated into portal primary navigation through custom tab creation:

To create a survey tab:

  1. Navigate to Setup > Tabs
  2. Click New to create a new tab
  3. Configure tab properties:
    • Tab Name: Descriptive name for navigation display (e.g., "Share Feedback", "Training Evaluation")
    • URL: Enter the survey URL (internal link format recommended)
    • Target: Select whether the survey opens in the same window or new window
  4. Configure tab sharing permissions to control which user groups see the tab
  5. Save the tab configuration
  6. The survey becomes accessible directly from portal navigation
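
For planning purposes, the tab properties above can be captured as a simple record before they are entered in Setup > Tabs. The structure and values below are purely illustrative; the configuration itself is done through the UI.

  # Example values for a survey tab (illustrative only).
  survey_tab = {
      "tab_name": "Share Feedback",                # navigation display name
      "url": "/survey/take/abc123",                # internal link format recommended
      "target": "same_window",                     # or "new_window"
      "shared_with": ["Customers", "Partners"],    # user groups that see the tab
  }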

Use cases for survey tabs:

  • Ongoing feedback collection requiring permanent navigation presence
  • General satisfaction surveys for continuous sentiment monitoring
  • Support quality surveys linked from help sections
  • Feature request surveys for product feedback collection

Integrating Surveys in Engagement Pages

Surveys can be embedded in custom Engagement Pages using the Tiles widget for strategic placement within tailored portal experiences:

To add a survey tile:

  1. Navigate to Setup > Create > Engagement Pages
  2. Select an existing Engagement Page or create a new page
  3. In the page configuration, add a Tiles widget
  4. Click the + icon to create a new tile
  5. Configure tile properties:
    • Title: Descriptive title for the tile (e.g., "Share Your Feedback", "Rate This Experience")
    • Icon: Select an appropriate icon from the library or upload a custom icon
    • URL: Enter the survey URL (internal link or full path)
    • Link Target: Choose Same Window or New Window
  6. Click Apply to save the tile widget configuration
  7. Save the Engagement Page
  8. The survey becomes accessible through the tile on the custom page
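
A survey tile follows the same pattern. The record below restates the tile properties from the steps above with illustrative values; use the internal link for in-portal placement or the full path when the page links outside the portal context.

  # Example values for an Engagement Page survey tile (illustrative only).
  survey_tile = {
      "title": "Share Your Feedback",
      "icon": "feedback",                                           # library icon or custom upload
      "url": "https://portal.example.com/survey/take/abc123",       # internal link or full path
      "link_target": "new_window",                                  # Same Window or New Window
  }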

Use cases for Engagement Page survey integration:

  • Custom landing pages with contextual feedback prompts
  • Post-event pages with event-specific survey tiles
  • Training completion pages with course feedback surveys
  • Feature announcement pages with immediate feedback collection

Engagement Pages documentation reference: For comprehensive guidance on Engagement Pages configuration including Tiles widget setup, banner customization, and widget management, refer to the Engagement Pages documentation in the project knowledge base.

Training Module Integration

Attaching Surveys to Courses

Surveys can be attached to individual courses in the Training module to enable automated post-training feedback collection:

Configuration process:

  1. Navigate to the Training module course configuration
  2. In the course settings, locate the Survey dropdown field
  3. Select a published survey from the available options
  4. Save the course configuration
  5. The survey becomes automatically associated with the course

User experience: When learners complete a course with an attached survey, the system automatically displays a "Share Your Feedback" confirmation dialog immediately after course completion.

Response options:

  • Yes: Opens the survey in a modal overlay for immediate completion
  • No Thanks: Closes the dialog and allows the user to continue without providing feedback

Attaching Surveys to Learning Paths

Surveys can be attached to learning paths to collect comprehensive feedback on multi-course programs:

Configuration process:

  1. Navigate to the Training module learning path configuration
  2. In the learning path settings, locate the Survey dropdown field
  3. Select a published survey from the available options
  4. Save the learning path configuration
  5. The survey becomes automatically associated with the learning path

User experience: The "Share Your Feedback" prompt displays after learners complete all courses in the learning path, enabling holistic program feedback collection.

Course-Specific Response Tracking

When the same survey is attached to multiple courses or learning paths, the system maintains separate submission tracking for each training item:

Submission rules:

  • Standalone survey: Authenticated users can submit once per survey total
  • Course-attached survey: Authenticated users can submit once per survey per course
  • Learning path-attached survey: Authenticated users can submit once per survey per learning path

Example scenario:

  • Survey "Training Feedback" is attached to both "Product Basics" course and "Advanced Features" course
  • User completes "Product Basics" and submits the survey
  • User later completes "Advanced Features" and can submit the same survey again
  • System tags each submission with the corresponding course identifier
  • Analytics can differentiate responses by course association

Data tagging: Each survey submission from Training module integration includes metadata identifying the course or learning path, enabling course-specific response analysis and comparative training effectiveness assessment.
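
These rules amount to a single deduplication key: the survey identifier plus, for Training-attached surveys, the course or learning path identifier, evaluated per authenticated user. The sketch below is a simplified model of that behavior, not the product's internal schema.

  from typing import Optional

  submissions = set()   # {(survey_id, training_item_id, user_id)}

  def can_submit(user_id: str, survey_id: str, training_item_id: Optional[str] = None) -> bool:
      return (survey_id, training_item_id, user_id) not in submissions

  def record_submission(user_id: str, survey_id: str, training_item_id: Optional[str] = None) -> None:
      submissions.add((survey_id, training_item_id, user_id))

  # Same user, same survey, two different courses: both submissions are allowed.
  record_submission("user-1", "training-feedback", "product-basics")
  print(can_submit("user-1", "training-feedback", "advanced-features"))   # True: new course, same survey
  print(can_submit("user-1", "training-feedback", "product-basics"))      # False: already submitted for this course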

Training Integration Considerations

Survey timing: The survey prompt appears immediately after course completion, while the training experience is fresh in learners' minds, which optimizes response quality and accuracy.

Optional vs. required: Survey completion is typically optional—learners can decline the feedback prompt without affecting course completion status or training progression.

Survey selection: Only published surveys appear in the course and learning path survey dropdown menus. Draft and archived surveys are not available for Training attachment.

Cross-module coordination: Training administrators and survey administrators should coordinate to ensure appropriate surveys are available for Training attachment and that survey questions align with course content and learning objectives.

Note: Detailed Training integration workflows and troubleshooting guidance are subject to further documentation updates. The information provided reflects current general integration behavior.

Survey Access Control

Public Link Access

Access model: Survey links are publicly accessible without authentication requirements.

Implications:

  • Anyone with the survey link can access and complete the survey
  • No portal login or user account is required
  • Enables broad feedback collection from external audiences including non-portal users
  • Supports anonymous feedback collection for sensitive topics

Use cases for public access:

  • Market research surveys distributed to external audiences
  • Event feedback collection from attendees without portal accounts
  • Customer satisfaction surveys shared via email to non-portal customers
  • General feedback collection from website visitors

Authenticated User Tracking

Submission controls for authenticated users:

  • Portal users who are logged in when accessing surveys are tracked by user account
  • Each authenticated user can submit a given survey once (surveys attached to courses or learning paths follow the per-item rules described above)
  • Repeat submission attempts display an error or completion message
  • User identity is not visible in survey results to maintain response confidentiality

Submission controls for anonymous users:

  • Users who are not logged into the portal when accessing surveys are not tracked by identity
  • Technical submission limitations for anonymous users are subject to browser-based controls
  • Note: Detailed anonymous access behavior is subject to further documentation updates

No Role-Based Access Restrictions

Surveys do not integrate with Security Role or User Group permissions for access control. Distribution control is managed exclusively through link sharing—users who receive the link can access the survey regardless of their security role or group membership.

Access control strategies:

  • Selective distribution: Share survey links only with intended audiences through targeted email campaigns or specific portal pages
  • Portal navigation segmentation: Place survey tabs or Engagement Page tiles on pages with role-based access restrictions
  • Training attachment targeting: Leverage Training module course and learning path sharing permissions to control which users complete courses that trigger survey prompts

Preview and Testing

Survey Preview Function

To experience the survey as respondents will before publishing or distributing:

  1. Open the survey in the Survey Builder
  2. Click More Actions (⋯ button)
  3. Select Preview
  4. The survey opens in preview mode with URL format: /survey/take?id=[survey-id]&mode=preview
  5. Complete the survey to validate question flow, Thank You page presentation, and overall user experience

Preview mode characteristics:

  • Displays the survey exactly as respondents see it
  • Includes all questions with configured help text and placeholders
  • Shows the Thank You page with configured content and actions
  • Does not record responses—preview submissions are not included in analytics
  • Can be accessed repeatedly for iterative testing
  • Available for both draft and published surveys
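
Because preview submissions are excluded from analytics, the distinction can be pictured as a simple check on the mode=preview query parameter in the preview URL. This is an illustrative sketch of the behavior described above, not the module's actual implementation.

  # Preview URLs carry mode=preview; such submissions are not counted in analytics.
  from urllib.parse import urlparse, parse_qs

  def is_preview(url: str) -> bool:
      query = parse_qs(urlparse(url).query)
      return query.get("mode", [""])[0] == "preview"

  print(is_preview("/survey/take?id=42&mode=preview"))   # True  -> not recorded
  print(is_preview("/survey/take/42"))                   # False -> counted in Views and Submissions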

Preview use cases:

  • Validating question clarity and survey flow before publication
  • Testing Page Skip Logic settings (one question per page vs. all questions on one page)
  • Confirming Thank You page messaging and redirect functionality
  • Experiencing the survey from a respondent perspective for quality assurance

Pilot Testing Strategy

Before broad survey deployment, conduct pilot testing with small user groups:

Pilot testing process:

  1. Publish the survey to activate the link
  2. Share the survey link with a small group of representative users (5-10 respondents)
  3. Request that pilot users complete the survey and provide feedback on:
    • Question clarity and interpretation
    • Survey length and completion time
    • Technical issues or display problems
    • Thank You page effectiveness
  4. Monitor analytics for initial conversion rate patterns
  5. Review question-level response patterns for unexpected results
  6. Make necessary adjustments:
    • Edit Thank You page content if needed
    • Change Page Skip Logic if completion patterns indicate issues
    • For structural changes, clone the survey and create a revised version
  7. Proceed with broad distribution after pilot validation

Pilot testing benefits:

  • Identifies confusing questions before they affect large response volumes
  • Validates technical functionality across different devices and browsers
  • Provides initial data for conversion rate benchmarking
  • Reduces risk of costly mid-collection structural changes requiring survey cloning

Best Practices

Publication Timing

Thorough preparation: Complete all question design, Thank You page configuration, and preview testing before publishing. Structural lockdown after publication limits correction options.

Stakeholder review: Have survey content reviewed by stakeholders or subject matter experts before publication to ensure alignment with organizational objectives and data collection goals.

Distribution coordination: Synchronize survey publication with planned distribution campaigns to maximize immediate response collection and minimize lag between publication and outreach.

Link Management

Descriptive naming: Use clear, descriptive survey names that help administrators identify survey purpose when copying links or managing multiple active surveys.

Link format selection: Use internal links for portal-embedded surveys and full path links for external distribution to ensure appropriate URL functionality in each context.

Link tracking: Maintain a record of where survey links are distributed (email campaigns, portal pages, external sites) to facilitate response source analysis and distribution effectiveness assessment.

Portal Integration Strategy

Strategic tab placement: Create survey tabs for ongoing feedback collection that requires permanent navigation access; temporary or event-specific surveys are better suited to direct links.

Engagement Page design: Integrate survey tiles into contextually relevant Engagement Pages where feedback collection aligns naturally with page purpose and user workflow.

Navigation hierarchy: Position survey tabs and tiles appropriately within navigation structures to balance visibility with primary portal functionality and avoid overwhelming users with feedback prompts.

Training Integration Optimization

Survey-course alignment: Ensure survey questions align specifically with course content and learning objectives to collect actionable training effectiveness data.

Timing consideration: Leverage automatic post-completion prompts to capture feedback while training experiences are fresh, maximizing response quality and accuracy.

Survey reuse: Use the same survey across similar courses to enable comparative analysis of training effectiveness across different content or instructor approaches.

Distribution and Communication

Clear communication: When sharing survey links, clearly communicate survey purpose, estimated completion time, and how feedback will be used to encourage participation.

Audience targeting: Distribute surveys to appropriate audiences through relevant channels rather than broad, unfocused distribution that may dilute response quality.

Follow-up strategy: Plan follow-up communications for low initial response rates, but avoid excessive reminders that may reduce response quality through survey fatigue.

Ongoing Management

Link validation: Periodically test survey links to ensure continued functionality, especially for surveys embedded in static content or external websites.

Archive timing: Archive surveys promptly when data collection periods end to prevent continued responses that may not align with current measurement timeframes or organizational needs.

Analytics monitoring: Regularly review Conversion Rate metrics to identify surveys with low completion percentages requiring optimization or redistribution strategy adjustment.

Publishing and sharing surveys provides essential distribution capabilities for targeted data collection, broad audience reach, and seamless portal integration. Flexible link generation, navigation embedding, and Training module automation activate surveys for comprehensive response collection.


Jump to Survey Module Checklist

<< Configuring Survey Questions | Analyzing Survey Results >>