
The race to deliver the best agentic AI development platform is heating up, and ElevenLabs has just taken a bold step forward with its new Conversational AI solution. With features like realistic voice synthesis, seamless integration capabilities, and advanced turn-taking, ElevenLabs has positioned itself as a leader in the field of interactive AI agents.

For eLearning developers, this represents a game-changing opportunity to elevate online training experiences. The ElevenLabs platform makes it simple to embed conversational AI widgets into environments like Articulate Storyline 360, enabling dynamic, personalised interactions within your eLearning projects.

This tutorial will guide you through the process of integrating ElevenLabs’ Conversational AI into your Storyline 360 projects.

ElevenLabs Takes The Real-Time Agentic AI Crown (For Now!)

While OpenAI’s Realtime API broke ground in October 2024 with its real-time audio capabilities, it still has notable limitations for production-ready deployment. Issues like incomplete audio delta data and subpar voice quality have left room for competitors to innovate.

Enter ElevenLabs, with its robust range of features that make it easy to get started even with limited technical know-how:

  • Low-latency, high-fidelity voices – the entire ElevenLabs voice library is available to choose from, a far wider selection than OpenAI’s very limited offering.
  • Multilingual support in 31 languages, enabling global deployment (with the ability to switch languages mid-conversation promised soon!)
  • External function calls, allowing agents to retrieve data or perform actions in real time, opening the door for integration solutions like our own eLearning Magic Toolkit!
  • Seamless integration options, including a quick and easy widget embed code for direct use in web or eLearning environments (see the example snippet just after this list).
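
For reference, the embed code ElevenLabs generates is essentially a custom HTML element plus a script tag, along the lines of the snippet below (the agent-id value here is just a placeholder). Later in this tutorial we will recreate exactly this structure with JavaScript inside Storyline:

<elevenlabs-convai agent-id="YOUR_AGENT_ID_HERE"></elevenlabs-convai>
<script src="https://elevenlabs.io/convai-widget/index.js" async type="text/javascript"></script>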

If you’re an Articulate Storyline 360 developer, you might be wondering whether it’s possible to use your agent embed code directly in your eLearning activities to deliver immersive, interactive learning assistants that learners can call upon at any moment.

Well, as you’ve likely guessed by the title of this blog post, that is indeed possible! And I’d like to show you how:

Step 1 – Create Your Agent on ElevenLabs.io

Begin by logging into your ElevenLabs account and creating your AI agent. This process is intuitive:

  • Define a First Message and System Prompt to guide your agent’s conversational focus (a simple example follows this list).
  • Choose Gemini 1.5 Flash as the LLM for optimal low-latency responses.
  • Upload your organisation’s knowledge base or documentation to tailor the agent’s knowledge to your specific needs.
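
As a purely illustrative example, a first message and system prompt for a hypothetical onboarding assistant might look something like this (adapt both to your own course and audience):

First Message: "Hi! I'm your onboarding assistant. Ask me anything about this induction course or our company policies."

System Prompt: "You are a friendly onboarding assistant embedded in an eLearning module. Answer questions using only the uploaded knowledge base, keep responses short and conversational, and encourage the learner to return to the activity once their question is answered."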

Step 2 – Retrieve Your Agent ID

Locate the Agent ID below your agent’s name or in the Widget Embed Code section under the Widget tab. Copy this ID—it’s needed for the integration in Storyline.

Step 3 – Designate a Placeholder in Storyline

  • Open your Storyline project and create a new rectangle on your slide or on a dedicated layer.
  • Set the rectangle’s transparency to 99% and remove any border.
  • Resize and position the rectangle where you want the AI widget to appear—by default, the widget will display in the bottom-right corner of this area.

Step 4 – Set Alternative Text (data-acc-text value) for the Placeholder

Right-click on the rectangle, select Accessibility, and set the Alt Text (or data-acc-text) value to agentarea. This will allow your JavaScript code to identify the placeholder element.
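
If you want to sanity-check this later, you can open your browser’s developer console on the published slide and run the same lookup the integration script uses (this assumes, as the script below does, that Storyline publishes the Alt Text as a data-acc-text attribute):

// Run in the browser console on the published slide
document.querySelector('[data-acc-text="agentarea"]');
// If this returns null, double-check the Alt Text value on the rectangle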

Step 5 – Add JavaScript to Embed the Widget

  • Create an Execute JavaScript trigger in Storyline, set to run when the timeline starts on the layer.
  • Copy and paste the following code into the JavaScript editor:
var agentId = "YOUR_AGENT_ID_HERE"; // Replace this value with your ElevenLabs Agent ID

// Find the Storyline placeholder element via its alt text (data-acc-text) value
var divArea = document.querySelector('[data-acc-text="agentarea"]');

if (!divArea) {
    console.error("Div with data-acc-text='agentarea' not found.");
    return; // Safe here because Storyline runs trigger JavaScript inside a function
}

// Position the widget container relative to the placeholder rectangle
divArea.style.position = "relative";
var widgetContainer = document.createElement("div");
widgetContainer.id = "AI_Agent";

// Apply styles to make the widget container cover the entire parent div
widgetContainer.style.position = "absolute";
widgetContainer.style.top = "0";
widgetContainer.style.left = "0";
widgetContainer.style.width = "100%";
widgetContainer.style.height = "100%";
widgetContainer.style.zIndex = "10"; // Adjust as needed to ensure it appears above other elements

// Insert the ElevenLabs custom element into the container and attach it to the placeholder
widgetContainer.innerHTML = `<elevenlabs-convai agent-id="${agentId}"></elevenlabs-convai>`;
divArea.appendChild(widgetContainer);

// Dynamically load the ElevenLabs script that defines the <elevenlabs-convai> element
var script = document.createElement("script");
script.src = "https://elevenlabs.io/convai-widget/index.js";
script.type = "text/javascript";
script.async = true;

// Append the script to the document body
document.body.appendChild(script);

Replace "YOUR_AGENT_ID_HERE" with your ElevenLabs Agent ID, keeping the quotation marks intact.

Step 6 – Publish and Upload Your Project

  • Publish your Storyline project for Web output.
  • Upload the published content to a web server, LMS, or localhost environment like XAMPP. Note: ElevenLabs scripts won’t run in Review 360 or directly from local files.

Once uploaded, launch your project to see the ElevenLabs Conversational AI widget in action! Users will be able to engage with your AI agent directly within the eLearning activity.

Final Thoughts

The ElevenLabs Conversational AI platform represents a significant leap forward for integrating agentic AI into eLearning. By following the steps outlined above, you can create truly interactive and dynamic learning experiences, complete with personalised voice agents capable of answering questions and engaging learners in meaningful ways.

At Discover eLearning, we’re excited to see how these advancements transform the learning landscape. With tools like the ElevenLabs platform and our ongoing work on the eLearning Magic Toolkit plugin for WordPress, we aim to make cutting-edge AI technology accessible and impactful for eLearning developers everywhere.

Very soon we will be launching version 2.11 of our plugin, which will deliver audio input and output capabilities via the Create Chat Completions API using the brand-new gpt-4o-audio-preview model from OpenAI. The potential of audio within Chat Completions is huge, opening the door to truly agentic-style learning assistants within eLearning that can both perform actions inside learning activities and help with data generation, processing, and storage (back into WordPress as part of the wider variable save/retrieval system that the eLM Toolkit provides).
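
To give a flavour of what that looks like under the hood, here is a minimal sketch of an audio-capable Chat Completions request using OpenAI’s Node.js SDK (the model name and options follow OpenAI’s documentation at the time of writing; this is an illustrative sketch, not the exact code used in the eLM Toolkit):

// Minimal sketch: requesting a spoken reply from gpt-4o-audio-preview via Chat Completions
import OpenAI from "openai";

const openai = new OpenAI(); // Reads OPENAI_API_KEY from the environment

const response = await openai.chat.completions.create({
  model: "gpt-4o-audio-preview",
  modalities: ["text", "audio"],            // Ask for both a transcript and spoken audio
  audio: { voice: "alloy", format: "wav" }, // Voice and format for the audio reply
  messages: [
    { role: "user", content: "Summarise the key safety steps from this module in two sentences." }
  ],
});

// The reply includes base64-encoded audio alongside its transcript
console.log(response.choices[0].message.audio.transcript);
const wavBuffer = Buffer.from(response.choices[0].message.audio.data, "base64");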

Stay tuned for more tutorials and updates as we continue to explore the incredible potential of ElevenLabs and agentic AI!

Chris Hodgson

Chris Hodgson is an award-winning Digital Learning Solutions Developer and Director of Discover eLearning Ltd. He supports organisations of all sizes to implement engaging, meaningful, and impactful eLearning solutions. Chris has over 15 years’ experience working in both private and public sector learning and development.
