Oracle Analytics Cloud and Server Idea Lab


Integrated LLM Chatbot within OAC

Delivered · 345 Views · 8 Comments
Toby Culler · Rank 2 - Community Beginner

Is Oracle planning to introduce an embedded OAC chatbot for querying ADW and EBS data?

While Oracle's partnership with Cohere focuses on fine-tuned chatbots for proprietary data, our needs are better met by a simple Retrieval Augmented Generation (RAG) chatbot solution.
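For illustration, the RAG flow I have in mind can be sketched in a few lines of Python. This is only a sketch: retrieval here is naive keyword overlap standing in for embedding search, and the document snippets are invented examples, not real ADW content.

```python
# Minimal RAG sketch: retrieve the snippets most relevant to a question,
# then build a grounded prompt for the LLM. Keyword-overlap scoring is a
# stand-in for embedding search; the documents below are invented examples.

def retrieve(question: str, documents: list, k: int = 2) -> list:
    q_words = set(question.lower().split())
    # Rank documents by how many question words they share.
    return sorted(documents,
                  key=lambda d: len(q_words & set(d.lower().split())),
                  reverse=True)[:k]

def build_prompt(question: str, documents: list) -> str:
    context = "\n".join(retrieve(question, documents))
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"

docs = [
    "August shipments: supplier Acme shipped 8 aluminum loads.",
    "EBS purchase orders are refreshed nightly into ADW.",
    "Holiday calendar for the Atlanta plant.",
]
prompt = build_prompt("How many aluminum loads did Acme ship in August?", docs)
# The prompt now carries the shipment snippet as grounding context.
```

The point is that the model never needs fine-tuning on our data; it only needs the retrieved context at query time.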


OAC's current data analysis feature via the homepage search bar (shown below) is inadequate and buggy. Implementing a robust LLM chatbot directly into OAC could fill this gap and add significant value for customers. It would also be an easier sell to businesses than going through an outside party like Cohere.


7 votes

Delivered · Last Updated

OA AI Assistant started rolling out to customers with the September 2024 update. The rollout will continue until it is enabled for all customers.

Comments

  • Branden Pavol · Rank 6 - Analytics Lead

    Love this idea!!

  • Please provide more information on bugs encountered using the home page search for NLQ.

    There are several ongoing LLM projects and explorations that we will discuss and present during Oracle CloudWorld 2023.

  • Toby Culler · Rank 2 - Community Beginner

    Hi Gabby,

    The NLQ search currently falls short in generating meaningful insights for complex queries. For example, when requesting the top 10 customers for shipments in August 2023, the output visuals are inaccurate and often irrelevant. The auto-insights feature within DV workbooks also suffers from similar issues.

    The NLQ search seems to demand highly curated datasets and very specific keywords, which isn't scalable for our needs. We require a more advanced solution, akin to a GPT-4-enabled Chatbot with a code interpreter, to efficiently handle ad-hoc data inquiries.

    Solving this problem with an effective OAC LLM/Code Interpreter solution would allow us to forego creating many 'one-off' reports and dashboards for our users.


  • Out of curiosity, do you have any specific types/examples of queries and responses that you think might be unique to your situation? As Gabby mentioned, there are many LLM projects that will be shown and discussed at OCW, but I'd like to hear about any specific usage patterns you might have. As you know, LLMs are only as good as the use cases they've been trained on.

    What utterances do you anticipate your users might have that might be unique to your company?

  • Toby Culler · Rank 2 - Community Beginner
    edited September 2023

    Hi Jacques,

    Our business mainly sees user requests for creating KPI metrics or adding data columns to existing extracts, which are often straightforward tasks requiring 1-16 hours max, but these tasks can shift our focus from larger projects. We address these ad-hoc requests by training OAC Super Users, but scalability is a challenge for our large business.

    When you consider that most user requests involve simple KPI Metrics or Production Line datasets, this database schema/format should be applicable to many manufacturing facilities that utilize SQL Databases.

    Given the effectiveness of pre-trained open-source LLMs like Llama2 and CodeLlama, we only need to pair an LLM with a SQL Database connector for our ADW/EBS tables to enable this "chat with your data" environment.

    At the upcoming Oracle CloudWorld event, I hope to see integrated chat options within OAC that allow users of all skill levels to easily interact with our curated databases. Competitors like Salesforce already offer such chatbot options with their BI tools through offerings like 'Einstein AI'.


    Note: Below is just one example of how simple the SQL queries might look for many of our users seeking 'Instant Insights'.

    SELECT SupplierName, Material, SUM(Shipments) AS Total_Shipments
    FROM RawMaterialSuppliers
    WHERE Material = 'Aluminum' AND Ordered_Date > CURRENT_DATE - 30
    GROUP BY SupplierName, Material;
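The "pair an LLM with a SQL Database connector" idea above could be sketched roughly as follows. Everything here is illustrative: the `generate_sql` text-to-SQL step is stubbed out (a real system would prompt a model with the schema and question), SQLite stands in for the ADW/EBS connection, and `chat_with_data` plus the sample rows are invented for the sketch.

```python
import sqlite3

# Hypothetical text-to-SQL step. In practice this would prompt a model
# (e.g. CodeLlama) with the schema and the user's question; stubbed here
# so the sketch is self-contained and deterministic.
def generate_sql(question: str, schema: str) -> str:
    return ("SELECT SupplierName, Material, SUM(Shipments) AS TotalShipments "
            "FROM RawMaterialSuppliers "
            "WHERE Material = 'Aluminum' "
            "GROUP BY SupplierName, Material")

def chat_with_data(conn, question: str) -> list:
    schema = "RawMaterialSuppliers(SupplierName, Ordered_Date, Material, Shipments)"
    sql = generate_sql(question, schema)
    # Guardrail: only allow read-only SELECT statements from the model.
    if not sql.lstrip().upper().startswith("SELECT"):
        raise ValueError("only SELECT statements are allowed")
    return conn.execute(sql).fetchall()

# SQLite in-memory database standing in for ADW/EBS.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE RawMaterialSuppliers "
             "(SupplierName TEXT, Ordered_Date TEXT, Material TEXT, Shipments INTEGER)")
conn.executemany("INSERT INTO RawMaterialSuppliers VALUES (?, ?, ?, ?)",
                 [("Acme", "2023-08-01", "Aluminum", 5),
                  ("Acme", "2023-08-15", "Aluminum", 3),
                  ("Bolt", "2023-08-10", "Steel", 7)])
rows = chat_with_data(conn, "Aluminum shipments by supplier, last 30 days")
```

The guardrail matters: since the model generates the SQL, the connector should restrict it to read-only queries against the curated schema.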



  • Thanks, that is helpful.

    You will definitely want to attend Gabby's OAC roadmap session, as I think you might get a glimpse of something you will like. For those not attending, the content will be posted after the presentation.

    Cheers,

    Jacques

  • This feature is planned and was presented by Gabby Rubin and T.K. Anand as part of the OCW23 Analytics Keynote. The recording will be posted soon on LinkedIn. Please see the video below of the planned feature using LLMs.