Q&A Node

⚠️ This feature is currently in public beta. We welcome feedback and may make changes based on usage and input. While it can be used in production, please do so with awareness of potential updates.

This node requires the Knowledge AI tab to be set up before use. Learn more here.

The Q&A node lets you access the Indexes you created under the Knowledge AI tab, so you can build smooth, informative conversation flows with minimal build time!

Setting up the node requires you to first select one Index for the node to use as its Knowledge Base. Every Index you have previously created under the API key of the selected virtual assistant (VA) that is not already in use elsewhere will appear in the Index dropdown.

Although each Q&A node is restricted to one Index, you can use multiple Q&A nodes within a single VA to take advantage of different Sources of information at different points of your user journey.

Remember that each Index can be used within only one VA under an API key. So if you aren't able to select the Index you want for a specific Q&A node, it's worth checking whether another of your VAs is already using it within its flow.

If you need multiple virtual assistants (VAs) under the same API key to use the same Index, it's best to create duplicate Indexes so that you can reuse the same content across many VAs!

Next, select the parameter that contains your user query under the User Query dropdown. This input is processed by Knowledge AI (which combines RAG APIs with Knowledge Base semantic search and LLM response generation) on the backend.

The response generated by this process then needs to be saved to a parameter, which you can select under the Response dropdown. You can then use this parameter within a Speak/Send Message or Collect Input node to relay it to the end user.
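Conceptually, the path from the User Query parameter to the Response parameter works like a retrieve-then-generate step. The following is a purely illustrative sketch, not the actual Knowledge AI implementation: a toy word-overlap score stands in for real semantic search, and the Index contents, parameter names, and threshold are all hypothetical.

```python
# Hypothetical sketch of the Q&A node's backend step: retrieve the
# best-matching passage from the Index, then generate a response and
# store it in the selected Response parameter. Illustrative only.

def retrieve(index, query, threshold=0.2):
    """Toy relevance search: score passages by word overlap with the query."""
    q_words = {w.strip(".,?!") for w in query.lower().split()}
    best, best_score = None, 0.0
    for passage in index:
        p_words = {w.strip(".,?!") for w in passage.lower().split()}
        score = len(q_words & p_words) / max(len(q_words), 1)
        if score > best_score:
            best, best_score = passage, score
    return best if best_score >= threshold else None

def qa_node(index, params, query_param="user_query", response_param="qa_response"):
    """Reads the user query parameter, writes the generated response parameter."""
    passage = retrieve(index, params[query_param])
    if passage is None:
        return "dont_know"  # no Source in the Index answered the query
    params[response_param] = f"Based on our docs: {passage}"
    return "success"

index = ["Refunds are processed within 5 business days.",
         "Support is available 24/7 via chat."]
params = {"user_query": "How long do refunds take to be processed?"}
outcome = qa_node(index, params)
```

After a "success" outcome, the flow would relay `params["qa_response"]` to the end user through a Speak/Send Message or Collect Input node.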

Configurations

Waiting time

The waiting time controls how long the virtual assistant waits for the Knowledge AI request to respond before the node exits through the Failed output.

Answer Length

Depending on your use case, your Virtual Assistant can provide shorter or longer answers. This setting is optional.

For simple, direct questions, shorter responses help users get information quickly. For more complex queries, longer responses may be necessary to provide adequate context and clarity. Choose the response length that best fits your users' needs and the nature of the interaction. The minimum response length is 20 words, but there is no maximum; responses can be as detailed as needed.

Response Guidelines

Customize how your Virtual Assistant responds. Guide the tone, style, structure, and level of detail to match your brand voice or use case.

Some of the options include:

  • Tone: Specify whether the assistant should respond in a formal, friendly, casual, or concise tone.

  • Topic Restrictions: Set guidelines for topics that should be avoided during the conversation to ensure the assistant stays on track.

  • Custom Guardrails: Implement additional rules based on issues or gaps identified during testing to fine-tune the assistant's responses.

  • Company-Specific Guidelines: Ensure the assistant uses your company’s exact terminology, such as using "XYZ Corp" instead of "XYZ" or "we", to maintain brand consistency.

These options help ensure that the assistant's responses are consistent, high-quality, and aligned with your desired communication style.

Managing Outputs

There are three possible outputs from this node:

  • Success: A response to the user query was successfully generated from the selected Index. The node's response has also been saved to the response parameter selected during setup.

  • Don't know: The model could not generate an appropriate response from any of the Sources within the Index.

  • Failed: The Knowledge AI request returned an error or timed out.

It is important to account for all three of these outputs in order to maintain an optimal user experience. Learn more on how to build meaningful flows here.
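Handling all three paths can be sketched as a simple dispatch. The fallback messages and parameter name below are invented for illustration; in AI Studio each branch would instead connect to the appropriate follow-up node.

```python
# Illustrative sketch of routing each Q&A node output to a user-facing
# message; node wiring, messages, and parameter names are made up here.

def handle_qa_outcome(outcome, params):
    """Return the message the end user would receive on each output path."""
    if outcome == "success":
        # e.g. relay params["qa_response"] via a Speak/Send Message node
        return params["qa_response"]
    if outcome == "dont_know":
        # e.g. reroute to a fallback flow or a live-agent transfer
        return "I couldn't find that in our knowledge base. Let me connect you to an agent."
    # "failed": the Knowledge AI request errored or timed out
    return "Sorry, something went wrong. Please try again in a moment."
```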

Using the Q&A node in a conversational flow

Like every other Node within AI Studio, the Q&A node needs to be supported by other nodes in order to create a seamless conversational flow for your end users.

Your setup can be as simple as collecting input and then using the Q&A node to answer the question immediately.

You can also use the node as a fallback for an existing setup.

Accounting for user input beyond the Q&A node is also straightforward: relay the response, add a collection point to gather the end user's next input, and reroute back to the node.

Please keep in mind that the node does not retain memory of previously provided responses, so it's best to either add context to the user query or classify the input before it reaches the Q&A node.
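Because the node is stateless, one common way to add context, sketched here with hypothetical parameter names, is to prepend recent conversation turns to the query before it reaches the node:

```python
# Illustrative sketch: enrich the stateless user query with recent turns
# so the Q&A node has enough context. All names are hypothetical.

def add_context(history, user_query, max_turns=2):
    """Build an enriched query from the last few turns plus the new input."""
    recent = history[-max_turns:]
    context = " ".join(f"User previously asked: {t}" for t in recent)
    return f"{context} Current question: {user_query}".strip()

history = ["What plans do you offer?"]
enriched = add_context(history, "How much does the cheapest one cost?")
```

The enriched string would then be stored in the User Query parameter, so a follow-up like "the cheapest one" still resolves against the earlier turn.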

Want to switch context mid-conversation or handle instances where no appropriate response is found?

You can also take advantage of context digression using the Context Switch node and the Q&A node in conjunction.

What if the Index within one Q&A node is unable to answer a particular question? Simply connect the "Don't Know" path to another Q&A node with the right Index.
