By: Naveen Kamat and Dennis Perpetua
A human wrote this article. A year ago, such a statement would have been unnecessary, even pedantic, but with the advent of AI, has it become necessary?
Expectations for using generative AI in business are bullish. Enterprise spending on generative AI solutions is forecasted to reach $143 billion by 2027.1 McKinsey research predicts generative AI will add the equivalent of $2.6 trillion to $4.4 trillion to the global economy annually.2 Companies eager to get started are left with a classic question: how is this going to generate real value for customers, employees and shareholders?
To unleash the full power of generative AI, we suggest that businesses address four pillars:
- Build a strong data foundation
- Leverage large language model operations (LLMOps)
- Manage for potential shadow AI
- Choose the right use cases
Build a strong data foundation
Whether you plan to fine-tune an existing large language model (LLM), like OpenAI’s GPT-3.5 Turbo or Google’s LaMDA, or want to create your own foundation model, a good data strategy is essential. Good data architecture is critical for accelerating outcomes in the age of AI. Your eventual generative AI business solution will need to access real-time data and curated document stores or vector databases, and it will need to work within established access controls and protocols to meet regulatory requirements.
Laying a data fabric across your entire digital environment and establishing good data governance are the first steps. Generative AI models are only as good as their data; an erroneous label or an instance of model drift can produce hallucinations, mistakes or simply bad outputs. From there, you’ll need to make architectural adjustments specific to generative AI for business, such as designing prompt templates and prompt sequences, implementing retrieval-augmented generation (RAG) or tuning model parameters to reach the most relevant and optimal outcomes.
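To make those adjustments concrete, the sketch below shows one way prompt templates and retrieval-augmented generation can fit together. It is a minimal illustration, not a reference implementation: the embedding function, the in-memory document list and the llm_complete call are placeholders for whichever model provider and vector database you adopt.

```python
# Minimal sketch: prompt templating plus retrieval-augmented generation (RAG).
# embed() and llm_complete() are placeholders for your chosen provider's APIs.
from typing import Callable

PROMPT_TEMPLATE = """You are a support assistant. Answer using only the context below.
If the context does not contain the answer, say so.

Context:
{context}

Question: {question}
Answer:"""

def retrieve(question: str, documents: list[str],
             embed: Callable[[str], list[float]], top_k: int = 3) -> list[str]:
    """Rank documents by cosine similarity to the question and return the top matches."""
    def cosine(a: list[float], b: list[float]) -> float:
        dot = sum(x * y for x, y in zip(a, b))
        norm = (sum(x * x for x in a) ** 0.5) * (sum(y * y for y in b) ** 0.5)
        return dot / norm if norm else 0.0

    query_vec = embed(question)
    ranked = sorted(documents, key=lambda doc: cosine(embed(doc), query_vec), reverse=True)
    return ranked[:top_k]

def answer(question: str, documents: list[str],
           embed: Callable[[str], list[float]],
           llm_complete: Callable[[str], str]) -> str:
    """Build a grounded prompt from the retrieved context and send it to the model."""
    context = "\n---\n".join(retrieve(question, documents, embed))
    prompt = PROMPT_TEMPLATE.format(context=context, question=question)
    return llm_complete(prompt)  # call to your chosen LLM API
```

In production, the in-memory ranking would be replaced by a query against a managed vector database, and the prompt template would be versioned and monitored as part of your LLMOps process.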
From a data governance standpoint, de-identifying or anonymizing personally identifiable information (PII) and other sensitive personal data before it is consumed by LLMs is a key consideration.
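As a simple illustration, masking can happen before any text reaches a model. The patterns below (email, phone, national ID) are hypothetical stand-ins; real deployments generally rely on dedicated PII-detection services and reversible tokenization governed by access controls.

```python
# Minimal sketch: de-identify text before it is sent to an LLM.
# The regex patterns are illustrative only, not production-grade PII detection.
import re

PII_PATTERNS = {
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
    "PHONE": re.compile(r"\b(?:\+?\d{1,3}[ -]?)?(?:\(?\d{3}\)?[ -]?)\d{3}[ -]?\d{4}\b"),
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def deidentify(text: str) -> str:
    """Replace detected PII with typed placeholders before the text reaches a model."""
    for label, pattern in PII_PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text

print(deidentify("Reach Jane at jane.doe@example.com or 555-010-4477."))
# -> "Reach Jane at [EMAIL] or [PHONE]."
```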
Data foundation hurdles aren’t insurmountable, but they do require a deliberate data strategy. Adding generative AI to your tech stack without data strategy changes is a recipe for cost overruns, compliance troubles and more.
Use Case: Data Foundation and SQL Lookups
Consider a customer service department where ad-hoc requests for reports from structured query language (SQL) databases are time-consuming: hundreds of reports must be completed each month, and individual reports can take hours. Additionally, many customer service requests require agents to pore through large user guides and documentation to find a solution.
A generative AI customer service assistant trained on a business’s SQL database and user guides can translate natural language questions into SQL queries, run the queries, and then translate the results back into natural language so they are easy to understand. The same process can be applied to an entire library of user guides, enabling quick lookups and rapid responses to customer issues.
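A minimal sketch of that flow appears below, using SQLite for illustration. The prompts, schema and ask_llm call are assumptions standing in for your own database and model endpoint; in practice, generated SQL would need validation and sandboxing before it runs against production data.

```python
# Minimal sketch: natural language -> SQL -> natural language.
# ask_llm() is a placeholder for whichever model endpoint you use.
import sqlite3
from typing import Callable

SQL_PROMPT = """Given this schema:
{schema}

Write one read-only SQL query that answers the question below. Return only the SQL.
Question: {question}"""

SUMMARY_PROMPT = """Question: {question}
Query results: {rows}

Answer the question in plain language for a customer service agent."""

def answer_report_request(
    question: str,
    conn: sqlite3.Connection,
    schema: str,
    ask_llm: Callable[[str], str],
) -> str:
    """Translate a natural-language request into SQL, run it, and summarize the results."""
    sql = ask_llm(SQL_PROMPT.format(schema=schema, question=question))
    rows = conn.execute(sql).fetchall()  # in practice: validate and sandbox generated SQL first
    return ask_llm(SUMMARY_PROMPT.format(question=question, rows=rows))
```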