Assistants API (OpenAI): a simple real-world example - an online money transfer FAQ
The recent OpenAI Dev Day showcased a remarkable advancement in artificial intelligence, debuting a range of new AI services, including GPT-4 Turbo and the Assistants API.
This API gives developers sophisticated AI tooling, enabling them to craft bespoke AI assistants proficient in a variety of tasks.
This short tutorial tests a real-world scenario to see how the solution could help at a larger scale. It uses the Python client, but I plan to write a broader article on implementing it in a Next.js full-stack product soon.
You can also read this article on Medium.
The Assistants API facilitates the creation of AI assistants within our applications. It currently supports three main functionalities:
- Code Interpreter
- Information Retrieval
- Function Calling
You can explore the capabilities of the Assistants API either through the Assistants playground or by building a full integration, as illustrated in the example below. The overall integration process involves these steps:
- Initiate an Assistant in the API, specifying its tailored instructions and selecting a model. Optionally, activate features such as Code Interpreter, Retrieval, and Function Calling.
- Establish a Thread at the onset of a user conversation.
- Append Messages to the Thread in response to user inquiries.
- Execute the Assistant on the Thread to generate a response, which seamlessly invokes the necessary tools.
SETTING UP OPENAI CLIENT
I am using Jupyter Notebook for the purpose of this demo.
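A minimal setup sketch, assuming the openai Python package (v1.x) is installed and the OPENAI_API_KEY environment variable is set:

```python
# Minimal client setup; assumes OPENAI_API_KEY is set in the environment.
from openai import OpenAI

client = OpenAI()
```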
SETTING UP THE FILE UPLOAD FOR KNOWLEDGE RETRIEVAL
The Retrieval feature enhances the Assistant by integrating external knowledge sources, such as unique product details or user-supplied documents. When a file is uploaded and passed to the Assistant, OpenAI automatically chunks the document, indexes it, and stores the embeddings. It then uses vector search to find relevant passages that answer user questions.
For our demonstration, we’ll supply the assistant with a basic FAQ from Western Union’s online service (https://www.westernunion.com/pl/en/digitalbanking/frequently-asked-questions.html).
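The upload itself might look like the sketch below; the local file name western_union_faq.pdf is just a placeholder for the saved FAQ, and the purpose must be set to 'assistants':

```python
# Upload the FAQ so it can later be attached to the Assistant.
# "western_union_faq.pdf" is a placeholder for the locally saved FAQ file.
file = client.files.create(
    file=open("western_union_faq.pdf", "rb"),
    purpose="assistants",  # required for files used with the Assistants API
)
```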
CREATE ASSISTANT
Getting started with an Assistant only requires selecting a model, but there is room for further personalization. Use the instructions parameter to shape the Assistant's character and define its objectives.
In our scenario, we use the tools parameter to equip the Assistant with retrieval capabilities, allowing it to access our own data. Finally, the file_ids parameter grants retrieval access to specific files. These files must be uploaded through the File Upload endpoint with ‘assistants’ as their purpose to be compatible with this API.
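Putting that together, creating the Assistant could look roughly like this; the name, instructions, and model choice below are my own assumptions, while the retrieval tool and file_ids follow the beta API as announced at Dev Day:

```python
assistant = client.beta.assistants.create(
    name="Money Transfer FAQ Assistant",  # assumed name
    instructions=(
        "You are a customer support assistant for an online money transfer "
        "service. Answer questions using the attached FAQ document."
    ),
    model="gpt-4-1106-preview",     # GPT-4 Turbo preview model
    tools=[{"type": "retrieval"}],  # enable knowledge retrieval
    file_ids=[file.id],             # grant access to the uploaded FAQ
)
```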
CREATE A THREAD
It is advisable to create a separate Thread for each user at the start of their conversation. Threads have no size limit, so you can add as many Messages as needed. The Assistant ensures that requests to the model fit within the maximum context window, using optimization techniques such as truncation when necessary.
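Creating the Thread itself is a single call, one per user conversation:

```python
# One Thread per user conversation.
thread = client.beta.threads.create()
```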
CREATE MESSAGE AND ADD MESSAGE TO A THREAD
A Message contains text and, in our case, a file.
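A sketch of adding a user question to the Thread; the question text is only an example, and attaching the file to the Message via file_ids is optional since the Assistant already has access to it:

```python
message = client.beta.threads.messages.create(
    thread_id=thread.id,
    role="user",
    content="How long does an online money transfer take?",  # example question
    file_ids=[file.id],  # optional: also attach the FAQ to this Message
)
```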
RUN THE ASSISTANT
To enable the Assistant to reply to a user message, we initiate a Run. This process prompts the Assistant to review the Thread and determine whether to utilize tools (if activated) or rely solely on the model for the most appropriate response.
Throughout the run, the Assistant appends Messages to the Thread marked with role="assistant". Additionally, the Assistant autonomously selects which previous Messages to include in the model's context window.
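Starting the Run and waiting for it to finish could be sketched as follows; the Run is asynchronous, so we poll its status until it leaves the queued/in_progress states:

```python
import time

run = client.beta.threads.runs.create(
    thread_id=thread.id,
    assistant_id=assistant.id,
)

# Poll until the Run reaches a terminal state (completed, failed, ...).
while run.status in ("queued", "in_progress"):
    time.sleep(1)
    run = client.beta.threads.runs.retrieve(thread_id=thread.id, run_id=run.id)

print(run.status)
```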
VIEW THE RESPONSE
Once a Run completes, which can take a while depending on the amount of input data and current API load, we can examine the Messages the Assistant has appended to the Thread. These messages can then be presented to the user.
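Listing the Thread's Messages then returns the Assistant's reply; a simple way to print the conversation might be:

```python
messages = client.beta.threads.messages.list(thread_id=thread.id)

# Messages come back newest first, so reverse for chronological order.
for m in reversed(list(messages)):
    for part in m.content:
        if part.type == "text":
            print(f"{m.role}: {part.text.value}")
```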
In our case, the Assistant, having assimilated the Western Union FAQ, has furnished a reasonably accurate answer.
Happy coding :)
Piotr