
How can LLMs and Generative AI help in real-life business scenarios?

In a manufacturing organization, envision the Chief Financial Officer (CFO) tasked with analyzing the factors that have shaped the organization's profitability trend over the past five years and projecting an outlook for the next three. This complex undertaking involves gathering and synthesizing a wide range of data points, which fall into several key areas.

Firstly, the CFO would delve into cost trends, encompassing manufacturing, finance, sales and marketing, research and development, and other functional overhead costs. Within manufacturing, specific elements such as the Cost of Goods, Labor, Transportation, Inventory Carrying Costs, Warehousing costs, and more would be scrutinized. Financial considerations would include the Cost of Interest, while Sales and Marketing would involve metrics like Selling Cost per Unit, Advertisements, and Marketing expenses. Furthermore, the analysis would extend to pricing trends, including selling prices and the competitive landscape, and other revenue streams like Service and Maintenance, Leasing, and Recurring subscriptions.

However, the intricacies of this analysis become apparent when considering the integration and analysis of data from multiple enterprise systems and market data sources. Moreover, the historical data over the past five years, or possibly longer for model training, adds another layer of complexity. The dynamic nature of product lines, the competitive landscape, and external factors such as macro-economic trends and geopolitical influences further contribute to the challenge of predicting future outlooks.

To simplify, let’s focus on one aspect: the cost of goods. Even this narrow scope requires comprehensive data, including information on every product or Stock Keeping Unit (SKU) in the lineup over the past five years, a detailed Bill of Materials (BOM) for each product, and the complete procurement history of every raw material used in manufacturing.
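To make this concrete, the cost-of-goods rollup over a BOM can be sketched as a recursive walk from a finished SKU down to priced raw materials. Every SKU, component, and price below is a hypothetical example, not real data:

```python
# Sketch: roll up the cost of goods for one SKU from its bill of materials.
# All SKUs, components, and prices here are illustrative placeholders.

# BOM: SKU -> list of (component, quantity). Components with no BOM entry
# are raw materials priced from the procurement history.
bom = {
    "PUMP-100": [("HOUSING", 1), ("IMPELLER", 2), ("SEAL-KIT", 1)],
    "HOUSING": [("ALUMINUM-KG", 3.5)],
}

# Latest procured unit cost per raw material (e.g. pulled from the ERP).
material_cost = {
    "ALUMINUM-KG": 2.40,
    "IMPELLER": 11.00,
    "SEAL-KIT": 4.25,
}

def cost_of_goods(sku: str) -> float:
    """Recursively sum component costs; leaves are priced raw materials."""
    if sku in material_cost:
        return material_cost[sku]
    return sum(qty * cost_of_goods(part) for part, qty in bom[sku])

print(round(cost_of_goods("PUMP-100"), 2))  # 34.65
```

In practice the BOM and procurement tables would be fetched from enterprise systems, and the dynamic material prices would be refreshed far more often than the BOM structure itself.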

Analyzing these data elements reveals distinct types of information, from relatively static data like BOMs to slowly changing internal proprietary data such as supply chain costs and distribution costs, to dynamic data points like the cost of materials with volatile pricing and other variable costs.
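One practical consequence of this distinction is that each data element can be tagged with its own refresh cadence when feeding an analytics or AI pipeline. A minimal sketch, with cadences that are illustrative assumptions rather than recommendations:

```python
# Sketch: tag each data element with a refresh cadence that reflects how
# quickly it changes. The element names and cadences are hypothetical.
REFRESH_CADENCE = {
    "bill_of_materials": "on_change",   # relatively static data
    "supply_chain_costs": "monthly",    # slowly changing internal data
    "distribution_costs": "monthly",
    "material_prices": "daily",         # volatile, market-driven data
}

def needs_daily_refresh(element: str) -> bool:
    """True for elements whose values move too fast for batch updates."""
    return REFRESH_CADENCE.get(element) == "daily"

print(needs_daily_refresh("material_prices"))  # True
```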

Now, in the context of conversational AI and Large Language Models (LLMs), the question arises: can LLMs assist in addressing these complexities? The answer lies in a structured approach:

  1. Gather User Question and Context:
    1. Understand the objective of the question.
    2. Determine if the CFO is evaluating profitability, product composition, or other aspects.
  2. Gather Data Elements:
    1. Identify the product or product attributes relevant to the question.
  3. Determine Data Sources and Fetch Relevant Data:
    1. For the specific product, identify relevant data sources.
    2. Formulate queries for extraction, organization, and analysis across diverse data streams.
  4. Collate Information and Generate Response:
    1. Combine structured and unstructured data analysis.
    2. Utilize appropriate AI/ML models for each step, including classification, extraction, enterprise search, and generative models.

Achieving this involves careful orchestration and selection of models for each sub-step.

  • Classification models determine the question’s context
  • Extraction models identify key-value pairs
  • Enterprise search locates relevant data sources

The integration of structured and unstructured data requires a thoughtful combination of various AI and ML models, along with the use of generative models for relevant responses.
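Putting the steps together, the orchestration can be sketched as a chain of stubbed-out models. Every function body below is a hypothetical placeholder standing in for a real classification, extraction, enterprise-search, or generative component:

```python
# Sketch of the orchestration described above. Each stub would be replaced
# by an actual AI/ML model or service in a real deployment.

def classify_intent(question: str) -> str:
    """Classification model: determine the question's context."""
    return "profitability_analysis" if "profit" in question.lower() else "other"

def extract_entities(question: str) -> dict:
    """Extraction model: pull key-value pairs (product, time range, ...)."""
    return {"product": "PUMP-100", "years": 5}  # illustrative output

def search_sources(intent: str, entities: dict) -> list:
    """Enterprise search: locate the data sources relevant to the question."""
    return ["erp.cost_of_goods", "crm.sales", "market.material_prices"]

def generate_answer(question: str, sources: list) -> str:
    """Generative model: compose a grounded response from retrieved data."""
    return f"Answer to {question!r} drawn from {len(sources)} sources."

def answer(question: str) -> str:
    intent = classify_intent(question)
    entities = extract_entities(question)
    sources = search_sources(intent, entities)
    return generate_answer(question, sources)

print(answer("What drove our profitability trend over the past 5 years?"))
```

The value of this shape is that each stage can be swapped independently: a better extraction model or a new data source slots in without disturbing the rest of the chain.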


While the task may seem daunting, the evolving landscape of AI tools, models, and techniques provides a pathway. Continuous development in this field makes it possible to weave a fabric of diverse models that generates meaningful, reliable responses to complex enterprise questions. The pace of progress is rapid, and it rewards active engagement rather than passive observation.