Using AI To Provide Help For Users
Using ChatGPT to improve the help experience for your end users
This article is about using ChatGPT to provide contextual help to an end user of Finance and Operations. It is simply an example of how to do it, with some extra features to illustrate how AI can be used and also misused. I don't expect this specific use case to be delivered in the manner this article shows, and for a real solution you would have a much larger set of training data available for an AI large language model to ingest so it could provide even better help. With that out of the way, let's dig in.
New users of Finance and Operations can sometimes struggle to get the training or documentation they need to perform common tasks in F&O. Ideally this wouldn't be a problem for the technical realm to solve, but when it is, we can use AI in the form of ChatGPT to help.
What Is an AI?
An AI LLM, or Artificial Intelligence Large Language Model, is a type of advanced computer program designed to understand and generate human language. Think of it as a highly sophisticated robot that can read, comprehend, and respond to text in a way that's similar to how humans communicate. By analyzing vast amounts of text data, it learns patterns and nuances in language, enabling it to answer questions, write content, and even have conversations. It's like having a super-smart pen pal who knows a lot about a wide range of topics. In the past, we as humans had to use specialized languages to talk to computers, like code or something similar, to convey an instruction set. Now we can use plain, natural language.
Connecting To An AI Large Language Model (LLM)
Connecting to ChatGPT, or any LLM, is fairly easy, and most providers have easy-to-follow guides on how to do it, like this one: https://platform.openai.com/docs/api-reference/making-requests. Lots of programming languages have official or community-provided client libraries if you'd like an easy button. In our case, I coded the client and response handling from scratch, but that was also very easy to do. All of the code for the features we're going to review today is available at https://github.com/NathanClouseAX/FOChatGPTExamples. You can clone it into your dev environment and review it whenever you'd like. Like connecting to any API, you'll need to specify where to connect, how to connect, and some authentication info. For this example, we need to specify the API URL, the LLM model name, and the API key for authentication.
In System Parameters, we can specify all of those fields plus some additional values.
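To make that concrete, here is a minimal X++ sketch of what the call to the chat completions endpoint can look like. The class name, method name, and hard-coded values below are mine for illustration only; in the actual repository the URL, model, and API key are read from System Parameters rather than hard-coded, and the implementation details will differ.

```xpp
/// <summary>
/// Minimal illustration of calling the OpenAI chat completions API from X++.
/// This is a sketch; the real solution reads these values from System Parameters.
/// </summary>
internal final class AIHelpDemoClient
{
    public static str sendPrompt(str _systemPrompt, str _userPrompt)
    {
        // These three values correspond to the fields added to System Parameters.
        str apiUrl = 'https://api.openai.com/v1/chat/completions';
        str model  = 'gpt-4o';
        str apiKey = '<API key from System Parameters>';

        // Note: a production implementation should JSON-escape the prompt text.
        str body = strFmt('{"model": "%1", "messages": [{"role": "system", "content": "%2"}, {"role": "user", "content": "%3"}]}',
            model, _systemPrompt, _userPrompt);

        System.Net.HttpWebRequest request = System.Net.WebRequest::Create(apiUrl) as System.Net.HttpWebRequest;
        request.Method = 'POST';
        request.ContentType = 'application/json';

        // The API key is passed as a bearer token.
        System.Net.WebHeaderCollection headers = request.Headers;
        headers.Add('Authorization', 'Bearer ' + apiKey);

        // Write the JSON request body.
        using (System.IO.StreamWriter writer = new System.IO.StreamWriter(request.GetRequestStream()))
        {
            writer.Write(body);
        }

        // Read the raw JSON response; the answer text is in choices[0].message.content.
        System.Net.HttpWebResponse response = request.GetResponse() as System.Net.HttpWebResponse;
        using (System.IO.StreamReader reader = new System.IO.StreamReader(response.GetResponseStream()))
        {
            return reader.ReadToEnd();
        }
    }
}
```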
Include Context Information
One area that is critical when prompting an AI for assistance is providing as much context as you can about what you are looking for. When this is set to Yes, the prompt for help is sent along with the time zone of the user, the country of the legal entity, the time zone of the legal entity, the accounting currency of the legal entity, and the security roles of the user. This helps the AI answer the user's prompt by narrowing down the geographical and monetary constraints for a given question. The USA and the EU have very different rules for various areas of F&O, and that manifests in process differences between the two - how you do X in one geographical place can be entirely different from how you do it in another. Giving the AI more context will typically get you a more helpful response.
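Here is a rough X++ sketch of gathering that kind of context. It is not the repository's implementation; the class name is hypothetical, it only covers part of the context list above (country, accounting currency, user time zone, and security roles), and it assumes the standard CompanyInfo, SecurityUserRole, and SecurityRole tables.

```xpp
/// <summary>
/// Illustrative context builder; names and fields may differ from the actual code.
/// </summary>
internal final class AIHelpContextBuilder
{
    public static str buildContext()
    {
        CompanyInfo companyInfo = CompanyInfo::find();
        str context;

        // Geography and currency of the current legal entity.
        context  = strFmt('Legal entity country: %1. ', companyInfo.postalAddress().CountryRegionId);
        context += strFmt('Accounting currency: %1. ', Ledger::accountingCurrency(companyInfo.RecId));

        // The user's preferred time zone.
        context += strFmt('User time zone: %1. ', enum2Str(DateTimeUtil::getUserPreferredTimeZone()));

        // Security roles assigned to the current user.
        SecurityUserRole userRole;
        SecurityRole role;
        str roleNames;
        while select userRole
            where userRole.User == curUserId()
            join role
                where role.RecId == userRole.SecurityRole
        {
            roleNames += role.Name + '; ';
        }
        context += strFmt('User security roles: %1', roleNames);

        return context;
    }
}
```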
AI PrePrompt to assist with more accurate results
This is essential for any system that accepts requests that are sent to an AI model of any kind. It gives the AI some guardrails for what it can and cannot do. If you don't include instructions alongside the user prompt, a user can use your AI assistant however they'd like, including for things you didn't want it used for. This is just one example, but I'm sure there are several hundred: https://gizmodo.com/ai-chevy-dealership-chatgpt-bot-customer-service-fai.... Below is the default prompt that ships with the code. Feel free to modify it to your needs. You can see I give specific instructions for what the AI can and cannot do when given a user prompt.
This is a request for a user of Microsoft Dynamics 365 for Finance and operations that is already signed in. They should be asking for how to perform a task only within that context. If they ask for anything that seems outside the scope of Finance and Operations, please state that you are unable to answer. Coding questions are strictly forbidden. If a question may not be related to Finance and Operations, please tell the user that you are unable to assist until their question is more specific to Finance and Operations. Do your best to provide the most up to date instructions based on the most recent version of Finance and Operations. Do not state the name of the software in the reply. Do not include login steps as part of the response.
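Putting the pieces together, the pre-prompt is sent as the system message while the context and the user's question travel in the user message. The sketch below wires up the hypothetical classes from the earlier examples; the repository's actual wiring will look different.

```xpp
/// <summary>
/// Hypothetical end-to-end example; class names are from the sketches above.
/// </summary>
internal final class AIHelpDemoRunnable
{
    public static void main(Args _args)
    {
        // In the real solution the pre-prompt text is read from System Parameters.
        str prePrompt  = '<the default pre-prompt text shown above>';
        str context    = AIHelpContextBuilder::buildContext();
        str userPrompt = 'How do I post a free text invoice?';

        // The pre-prompt acts as the guardrail (system message); the context is
        // appended to the user's question so the model can tailor its answer
        // to this legal entity and this user.
        str reply = AIHelpDemoClient::sendPrompt(prePrompt, context + ' ' + userPrompt);
        info(reply);
    }
}
```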
Help
Now that we can connect to ChatGPT, let's give it a try:
We're able to open up the Help system, go to AI Assistant and ask for help. We can see it is attempting to help. Let's try to get it to do something outside the realm of F&O:
As you can see, we have a bare minimum of guardrails in place to keep our AI assistant in check and on task for most common prompts and workloads.
Conclusion
As you can see, the instructions given by a very generically trained AI model can help guide a user through a process they may know nothing about. The instructions the AI gives are impressively accurate considering the model has no specific training on F&O. That being said, if we had or created a more specialized model, like with https://zapier.com/blog/custom-chatgpt/, it would be possible to have an AI Assistant that knows about your organization's specific processes in your legal entities for your different operating locales. This scenario can only be improved upon with more data, process documentation, and training of the AI model.