Migrating from the GenAI node to Knowledge AI
We just released Knowledge AI!
A brand-new integration that seamlessly blends RAG (Retrieval Augmented Generation) APIs, semantic search, and LLM (Large Language Model) technology, working in conjunction with a user-uploaded Knowledge Base to create nuanced conversational flows with minimal effort.
Given this feature release, the Generative AI (GenAI) node will be gradually deprecated over the next few weeks. If you currently use this node, here is what you need to know:
Although the core use of both features may seem similar, they differ greatly in the control you have over your Virtual Assistant's (VA) knowledge base and its static and generative responses.
| Functionality | Q&A node + Knowledge AI |
| --- | --- |
| VA Knowledge Base Autonomy | ✅ You can upload your own sources (text-based documents and URLs), giving far more control over what serves as the VA's knowledge base. |
| VA Response Control | ✅ There is far more control with this feature, since the entire Knowledge Base is dictated by the user. |
| Conversation Looping | ✅ Users can build seamless flows with little effort using a combination of traditional Conversation nodes and the Q&A node. |
| LLM Accessible | Uses Google's proprietary LLMs to generate responses based on end-user input, referring to an uploaded Knowledge Base. |
Aside from the fact that the Generative AI node will soon be removed from the platform, there are quite a few benefits to moving to Knowledge AI:
- **Increased flexibility & control over responses:** The Knowledge AI functionality allows you to provide your VA with a reliable Knowledge Base, uploaded as text-based documents or live accessible URLs, enhancing control over the model's performance while giving you the flexibility to create customized flows.
- **Low-effort builds:** Improve cost efficiency by reducing the time and effort spent creating intents and entities and testing and optimizing classification: let Knowledge AI do the heavy lifting, and simply make minor refinements to your Index to ensure optimal performance.
- **Easy maintenance:** Maintenance and updates are as easy as re-uploading a source and adding it to your existing Index to deliver immediate live results.
- **Scalability:** Knowledge AI can manage and utilize vast amounts of information, making knowledge base management scalable for businesses with extensive, and often decentralised, knowledge bases and documentation.
- **Centralized knowledge management & response accuracy:** Knowledge AI draws from the latest information you upload, ensuring that the VA provides streamlined, up-to-date responses that address a broad spectrum of customer inquiries effectively.
Knowledge AI is based on internally developed RAG (Retrieval Augmented Generation) APIs, which combine a Knowledge Base, semantic search, and LLM response generation.
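To illustrate the general retrieve-then-generate pattern these APIs follow, here is a minimal, hedged sketch. All names and the naive token-overlap ranking are illustrative only; this is not AI Studio's actual API, which uses semantic (vector) search rather than keyword matching.

```python
# Illustrative sketch of the RAG pattern: retrieve relevant passages
# from an uploaded knowledge base, then build a grounded LLM prompt.
# Names and logic are assumptions, not AI Studio's implementation.

def tokenize(text):
    return set(text.lower().split())

def retrieve(query, sources, top_k=2):
    """Rank sources by naive token overlap with the query.
    (Production systems use semantic/vector search instead.)"""
    q = tokenize(query)
    return sorted(sources, key=lambda s: len(q & tokenize(s)), reverse=True)[:top_k]

def build_prompt(query, passages):
    """Assemble the grounded prompt that would be sent to the LLM."""
    context = "\n".join(f"- {p}" for p in passages)
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

# Toy knowledge base standing in for uploaded Sources.
sources = [
    "Refunds are processed within 5 business days.",
    "Our support line is open 9am to 5pm on weekdays.",
    "Shipping is free for orders over 50 dollars.",
]
query = "How long do refunds take?"
prompt = build_prompt(query, retrieve(query, sources))
```

The key design point is that the LLM only sees passages retrieved from your own sources, which is what gives you control over the VA's responses.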
However, as an AI Studio user, all you have to do is follow a few easy steps to get this feature up and running for your VAs!
If you currently use the Generative AI node and have Sources available, you can set up the Knowledge AI feature in four quick steps:
1. Upload or link your Sources.
2. Create Indexes from the Sources.
3. Test the Indexes for optimal performance.
4. Set up the Q&A node within your agent.
Everything you need to know about the details of this process can be found here.
Our team is happy to help answer any further questions you may have. You can raise a request with “Knowledge AI Query” as the Subject on our Support form and we will do our best to lead you along the right path. You can also reach out to your account manager to resolve any queries.
The Q&A node (along with the Knowledge AI tab) does not allow for third-party LLM plugins; however, AI Studio lets you integrate with various systems using Webhooks. Stay tuned to learn how you can connect your virtual assistant with industry-standard LLMs.
Please keep in mind, however, that like all API requests, this is subject to latency.
Additionally, the onus of adding guardrails and preventing hallucinations lies with the VA designer and the third-party system.
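As a rough sketch of what such a Webhook bridge might look like, the snippet below parses an incoming payload, applies a simple guardrail check, and forwards the text to a stubbed third-party LLM. The payload shape, field names, and guardrail rule are all hypothetical assumptions, not AI Studio's actual Webhook contract.

```python
# Hypothetical Webhook handler bridging a VA to a third-party LLM.
# Field names ("user_input", "reply") and the guardrail list are
# illustrative assumptions, not AI Studio's real contract.
import json

BLOCKED_TERMS = {"password", "credit card"}  # toy guardrail list

def call_llm(prompt):
    # Stand-in for a real third-party LLM call (subject to API latency).
    return f"Echo: {prompt}"

def handle_webhook(body: str) -> str:
    """Parse the VA's webhook payload, apply a guardrail, call the LLM."""
    payload = json.loads(body)
    user_text = payload.get("user_input", "")
    if any(term in user_text.lower() for term in BLOCKED_TERMS):
        # Guardrail: the designer, not the LLM, decides to refuse.
        reply = "I can't help with that request."
    else:
        reply = call_llm(user_text)
    return json.dumps({"reply": reply})
```

Note that the guardrail runs before the LLM call, which is the kind of responsibility the paragraph above places on the VA designer rather than the model.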
Nope, it's all part of the same Virtual Assistant bill! Learn more about our pricing here or reach out to your account manager. Please note that pricing for this feature depends on the text size of the Sources you upload and the total requests made using the feature.
AI Studio has you covered! The Knowledge AI feature is GDPR compliant (in keeping with the rest of AI Studio's features that fall under this compliance) and maintains data residency: VAs created in the US and EU regions have their respective platform and LLM servers located in those regions.
Additionally, given that the LLMs used for this feature are sourced from Google, data sent to these models is neither stored nor used for training purposes.