How to use in Blueprints
Every installation comes with demo content you can learn from. I will be using those examples here to demonstrate how to use the plugin.
Decide which type of dialogue integration you want to use. Algorithmic is best for NPCs and characters that have traits and names, manual is best for one-off use cases, and quest integration is best for mission design.
Each Dialogue Context entry includes preset fields that need to be filled out; some may be left blank if they are not important. However, Dialogue Purpose is always required, as it defines the topic for the AI to discuss. Note: Leave the Past Dialogue History blank. It can be filled in at runtime through the "Store Dialogue History" node.
To use the plugin with a quest system, assign an integer to each map entry that aligns with your quest progress. The progression does not need to be linear: for instance, if your main story has an actor that first appears at quest progress 45, that actor's entries can start at 45.
To manage quest progress, first set up integers or strings, depending on how your system tracks progress. Fill in the details as you would with a manual dialogue context. If the fields are hidden, click the dropdown arrow next to the progress number or text to reveal them.
In your quest dialogue setup, use the "Get Integer/String Text/Voice Dialogue State" node from the component and input your quest progress. Be sure to pass the result through a "Generate Dialogue Prompt" node to obtain the fallback option and the return value (to be used when generating dialogue).
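The quest lookup described above can be modeled in plain C++. The plugin's nodes are Blueprint-only, so the struct fields and function name below are illustrative assumptions, not the plugin's actual API; the sketch just shows the idea of keying dialogue contexts to a quest-progress integer, with an empty default when no entry matches.

```cpp
#include <cassert>
#include <map>
#include <string>

// Hypothetical model of a Dialogue Context entry. In the plugin these fields
// live in the component's details panel; Purpose is always required.
struct DialogueContext {
    std::string Purpose;   // topic for the AI to discuss
    std::string Fallback;  // returned if generation fails
};

// Mirrors "Get Integer Text Dialogue State": returns the entry keyed to the
// current quest progress, or a default (empty) context when no entry exists.
DialogueContext GetDialogueStateForProgress(
    const std::map<int, DialogueContext>& QuestMap, int Progress)
{
    auto It = QuestMap.find(Progress);
    if (It != QuestMap.end()) return It->second;
    return DialogueContext{}; // no entry for this progress value
}
```

Because the map is keyed directly on the progress value, entries do not have to be contiguous: a map containing only key 45 works fine for an actor who first appears at that point in the story.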
Set up the manual dialogue context entries in the component details.
Use the "Get Manual Dialogue Context" node from the component with the correct entry index.
Combine this with the "Generate Dialogue Prompt" node to produce the prompt string needed for the Generate Text Dialogue Async node.
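The three manual steps above can be sketched in C++. The entry fields and the prompt layout here are assumptions for illustration only; the plugin builds its own prompt format internally. The point is simply that an entry is fetched by index and its filled-in fields are assembled into a prompt string, with blank fields skipped.

```cpp
#include <cassert>
#include <string>
#include <vector>

// Hypothetical model of "Get Manual Dialogue Context" + "Generate Dialogue Prompt".
struct ManualContext {
    std::string Purpose;        // always required
    std::string CharacterName;  // optional preset field
    std::string History;        // left blank at edit time; filled at runtime
};

std::string GenerateDialoguePrompt(const std::vector<ManualContext>& Entries,
                                   std::size_t Index)
{
    if (Index >= Entries.size()) return ""; // wrong entry index
    const ManualContext& C = Entries[Index];
    std::string Prompt = "Purpose: " + C.Purpose;
    if (!C.CharacterName.empty()) Prompt += "\nCharacter: " + C.CharacterName;
    if (!C.History.empty())       Prompt += "\nHistory: " + C.History;
    return Prompt;
}
```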
This method uses Dialogue Tags applied to a component on an actor to determine the best dialogue to play. Many large games use this approach when they need dialogue driven by the character's current traits, such as ragdolled, angry, happy, attacking, wet, or sad.
To use this, first set up algorithmic dialogue tags in the details panel of your character or actor. Assign an array of tags for each dialogue option, then create a dialogue context for that entry.
In this example, if the character has active traits of being angry and intrigued, it will likely choose this dialogue option, assuming no other dialogue entries are a better choice. You can apply the same tag to different entries as well. For example, you can have 10 different types of angry dialogue; some are intrigued, some are wet, some are dry, and some are ragdolled.
To manage character tags effectively, use the algorithmic dialogue nodes to assign, remove, and look up current tags. When you are ready to generate text, use the "Get Best Algorithmic Dialogue Option" node, then connect it to the Generate Text Dialogue Async node.
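One plausible way to model "Get Best Algorithmic Dialogue Option" is as a tag-overlap score: each entry is ranked by how many of its tags match the actor's active tags, and the highest-scoring entry wins. The plugin's actual selection algorithm is not documented here, so treat this as an assumption that captures the behavior described above (an angry and intrigued character likely picks the entry tagged with both).

```cpp
#include <cassert>
#include <cstddef>
#include <set>
#include <string>
#include <vector>

// Hypothetical dialogue entry: a tag array plus its context purpose.
struct AlgorithmicEntry {
    std::vector<std::string> Tags; // e.g. {"Angry", "Intrigued"}
    std::string Purpose;
};

// Returns the index of the entry with the most matching active tags,
// or -1 when no entry matches at all.
int BestDialogueOption(const std::vector<AlgorithmicEntry>& Entries,
                       const std::set<std::string>& ActiveTags)
{
    int Best = -1;
    std::size_t BestScore = 0;
    for (std::size_t i = 0; i < Entries.size(); ++i) {
        std::size_t Score = 0;
        for (const auto& Tag : Entries[i].Tags)
            if (ActiveTags.count(Tag)) ++Score;
        if (Score > BestScore) { BestScore = Score; Best = static_cast<int>(i); }
    }
    return Best;
}
```

Note that under this scheme several entries can share a tag (ten different "Angry" variants, say), and the actor's other active tags break the tie.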
This system allows you to have a single manager in your world that contains the names and traits of all characters. It can generate dialogue, track story history, and manage game dialogue from a centralized location.
Compile a list or data table of all character tags that your NPCs and players will have using the Add Active Dialogue Tag node (on the characters, not the manager).
Cast the character to a manager or use an interface, then pass it as a string array.
In the manager, use a Clear Active Dialogue Tags node, then loop over the tag array received from the character, adding each dialogue tag inside the loop. Finally, execute the Generate Text Dialogue Async node.
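The manager hand-off above (clear, loop, re-add, then generate) can be sketched as follows. The struct and method names mirror the Blueprint nodes but are illustrative; in the engine this logic lives in the manager's event graph, not in a C++ class.

```cpp
#include <cassert>
#include <string>
#include <vector>

// Hypothetical model of the centralized dialogue manager.
struct DialogueManager {
    std::vector<std::string> ActiveTags;

    void ClearActiveDialogueTags() { ActiveTags.clear(); }
    void AddActiveDialogueTag(const std::string& Tag) { ActiveTags.push_back(Tag); }

    // Equivalent of the for-loop in the manager's event graph: the character
    // passes its tags as a string array (via a cast or interface call).
    void SyncFromCharacter(const std::vector<std::string>& CharacterTags) {
        ClearActiveDialogueTags();
        for (const auto& Tag : CharacterTags) // loop over the array length
            AddActiveDialogueTag(Tag);
        // ...then execute the Generate Text Dialogue Async node here
    }
};
```

Clearing before the loop matters: without it, tags from the previously handled character would leak into the next character's dialogue selection.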
To start, in the Blueprint context menu, search for the "Generate Text Dialogue Async" node. It should be available in almost all event graphs.
Once you have the node, use one of the methods outlined above to generate a prompt from the dialogue context. In this screenshot, I am using the manual method.
Plug the return value from the "Generate Dialogue Prompt" or "Find Best Dialogue Option" node into the "Prompt" input, and insert the Fallback String into the "Fallback Dialogue" input.
Set up logic for the On Generation Success and On Generation Failure execution outputs.
Note: in case of a generation failure, the fallback option is returned. Therefore, if you are assigning the result to a widget or character, make sure you also assign the Generated Text output on failure, not just on success.
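The note above is the easiest thing to get wrong, so here is a minimal model of the async node's two callbacks. The function signature and the "service available" flag are stand-ins for the real network request; the point is that the failure path still delivers usable text (the fallback), so both callbacks should write to your widget or character.

```cpp
#include <cassert>
#include <functional>
#include <string>

// Hypothetical model of "Generate Text Dialogue Async". ServiceAvailable
// stands in for whether the real generation request succeeded.
void GenerateTextDialogueAsync(const std::string& Prompt,
                               const std::string& FallbackDialogue,
                               bool ServiceAvailable,
                               std::function<void(const std::string&)> OnSuccess,
                               std::function<void(const std::string&)> OnFailure)
{
    if (ServiceAvailable)
        OnSuccess("generated line for: " + Prompt); // placeholder output
    else
        OnFailure(FallbackDialogue); // fallback is delivered as Generated Text
}
```

Binding the same "assign to widget" logic to both callbacks guarantees the player always sees a line, even when generation fails.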
To use the voice acting feature, follow these steps:
When "On Generation Success" is executed, call the "Generate Voice Dialogue Async" node.
Connect the "Generated Text" output to the "prompt" input of the voice node.
Use either "Play Sound at Location" or "Play Sound 2D" for the returned voice line.
Optionally, you can use the "Store Dialogue History" node after generating the text so that the AI is aware of the conversation history.
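Conceptually, "Store Dialogue History" appends each generated line to the Past Dialogue History field that was left blank at edit time, so later prompts include the conversation so far. This sketch assumes a simple append-and-join model; how the plugin actually formats history inside the prompt is not documented here.

```cpp
#include <cassert>
#include <string>
#include <vector>

// Hypothetical model of "Store Dialogue History": keep past lines and splice
// them into the next prompt so the AI sees the conversation so far.
struct DialogueHistory {
    std::vector<std::string> Lines;

    void Store(const std::string& GeneratedText) { Lines.push_back(GeneratedText); }

    // Joined block to insert into the Past Dialogue History slot of the prompt.
    std::string AsPromptSection() const {
        std::string Out;
        for (const auto& Line : Lines) Out += Line + "\n";
        return Out;
    }
};
```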
An alternative method for generating dialogue is available if you need it in an event graph that does not support components.
Get the GPT Dialogue Subsystem (not the Editor one).
Drag off the subsystem and search for "Generate Dialogue" and "Generate Dialogue Prompt".
On the Generate Dialogue Prompt input, use the "Make Dialogue Context" node to make dialogue context on the fly.
Add a custom event off the callback to get the return value.
This is what it should look like. The same process can be used to generate voice dialogue on the fly.
Add the AC_GPTDialogue Component to your character, actor, manager, or any blueprint that you want to generate text or voice from.