
How To Use Context Documents To Fine Tune Prompts To ChatGPT


Note: This feature requires eLearning Magic Toolkit v2.3 or higher. Make sure you have the latest version of the plugin installed.

With the eLearning Magic Toolkit v2.3 release, we have introduced a Context Document upload feature, which allows you to fine-tune any prompt sent to the ChatGPT ‘Create Chat Completions’ API by using the contents of your document as strict context for the prompt.

This means that the answers delivered by the ChatGPT API within your Storyline project will always be drawn from, and written based on, the contents of that document only.

If the information requested from the AI (for example, via a question or prompt the user has written and submitted to ChatGPT) cannot be found anywhere within the document, then the model is instructed to always return ‘I’m sorry but I don’t have the answer to that.’
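Because the fallback reply is a fixed string, you can branch on it in Storyline. Below is a minimal sketch (isFallbackReply is a hypothetical helper, not part of the toolkit) that detects the fallback so a trigger could, for example, show a "try rephrasing your question" layer:

```javascript
// Hypothetical helper (not part of the toolkit): detect the fixed
// fallback reply so your Storyline triggers can branch on it.
var FALLBACK_REPLY = "I'm sorry but I don't have the answer to that.";

function isFallbackReply(replyText) {
    // Trim and compare case-insensitively, since the model's output
    // can vary slightly in spacing or capitalisation.
    return replyText.trim().toLowerCase() === FALLBACK_REPLY.toLowerCase();
}
```

You could call this on the value stored in StorylineAnswerTextVariable and set a true/false Storyline variable from the result to drive a layer change.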

(Note – ChatGPT is not perfect, so there is no absolute guarantee that a user could not somehow jailbreak their way out of the strict rules imposed by the context document system. Nevertheless, our thorough testing of this solution indicates that this feature works very well. Even so, we advise rigorous testing before deploying your learning experience.)

Step 1 – Uploading your context document to the eLearning Magic Toolkit

Within the eLearning Magic Toolkit settings screen, which you will find linked in your WP-Admin > Settings menu, there is now a section at the bottom of the page for uploading your context documents.

Currently the system supports both Word and PDF document uploads.

When the file has been uploaded, you will see it displayed in the table, along with the ID code which has been dynamically generated to identify the document within the system:

[Screenshot: Context Document Uploads settings]

Step 2 – Updating your Execute JavaScript code in Storyline to utilise the Context Document fine-tuning feature

Let’s now look at the code to use in order to send a prompt to the Create Chat Completions API with a context document:

var player = GetPlayer();
var prompt_text = player.GetVar("StorylineTextVariable");

var messages = [];

var system_message = { role: "system", content: "You are a helpful and knowledgeable AI chatbot." };
var user_prompt00 = { role: "user", content: "Please answer the following question: " + prompt_text };

messages.push(system_message);
messages.push(user_prompt00);

var sendData = {
    'nonce': parent.storylineMagicNonce,
    'value': JSON.stringify( messages ),
    'api': 'chatCompletion',
    'jsonresponse': 'false',
    'contextfile': 'WQVX960',
};

sendData = JSON.stringify( sendData );

const myHeaders = new Headers();
myHeaders.append("Content-Type", "application/json");
myHeaders.append("X-WP-Nonce", parent.storylineMagicRestNonce);

async function openai_req() {
    fetch(parent.storylineMagicEndpoint, {
        method: 'POST',
        headers: myHeaders,
        body: sendData
    })
    .then(res => res.json())
    .then(data => {
        if ( Object.prototype.hasOwnProperty.call(data.data, 'data') ) {
            var gpt_content = data.data.data.choices[0].message.content;
            gpt_content = gpt_content.trim();
            player.SetVar("StorylineAnswerTextVariable", gpt_content);
        } else {
            // The response did not contain the expected data, so log it for debugging
            console.error('Unexpected response:', data);
        }
    })
    .catch(error => {
        console.error('Error fetching data:', error);
    });
}

openai_req();

Using the example code above, we are asking the Create Chat Completions API to answer a question written by the user, stored in a Storyline project variable named StorylineTextVariable. We include the document ID in the var sendData section of the code; here it links to our document containing data on the most recent winner of the Turner Prize (something ChatGPT could not know about, as it is a very recent event).
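If you reuse this trigger across several slides or courses, it can help to factor the request body into a small helper so that only the document ID changes between triggers. This is just a sketch based on the example above; buildSendData is a hypothetical name, and the field names are the ones shown in the code:

```javascript
// Hypothetical helper: build the request body for a given context
// document ID, using the same fields as the example trigger above.
function buildSendData(messages, contextFileId, nonce) {
    return JSON.stringify({
        'nonce': nonce,
        'value': JSON.stringify(messages),
        'api': 'chatCompletion',
        'jsonresponse': 'false',
        'contextfile': contextFileId
    });
}
```

In the trigger you would then call buildSendData(messages, 'WQVX960', parent.storylineMagicNonce) and pass the result as the fetch body.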

The result of this is that any question sent to the Create Chat Completions API from your Articulate Storyline content will only return a response if the answer is known and found within the document.

This makes the solution ideal for creating, for example, organisation-specific AI-enhanced eLearning experiences, where answers to questions, or content dynamically generated by the AI, need to be drawn strictly from your own approved documentation.

Note – All written content contained within the assigned document will be transmitted along with your prompt to the OpenAI platform for processing, in order for the Create Chat Completions API to return a fine-tuned response to the prompt based on the information contained within the document.

You must therefore only use documentation which you would be comfortable having uploaded to OpenAI, and recognise that extremely large documents will incur a large token spend against your OpenAI platform account with each prompt request that is sent.
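To get a rough feel for that token spend before uploading a document, you can use OpenAI's published rule of thumb of roughly four characters per token for English text. This is an approximation only; actual tokenisation varies by content and model:

```javascript
// Rough estimate only: English text averages about 4 characters per
// token, so a document's character count gives a ballpark per-request cost.
function estimateTokens(text) {
    return Math.ceil(text.length / 4);
}

// A 40,000-character document adds roughly 10,000 tokens to every
// prompt request, since the whole document is sent each time.
```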

Read the rest of our Knowledge Base pages to make the most of eLearning Magic Toolkit.
