Technology

Kinetica adds ChatGPT integration for ‘conversational querying’ of data


Arlington, Virginia-headquartered Kinetica, a company providing an analytic database for unlocking value from spatial and time-series data, today announced an integration with OpenAI's ChatGPT.

The move, the company explains, enables "conversational querying," where enterprise users can enter natural language prompts to query their data assets. Previously, the process involved writing complex Structured Query Language (SQL) queries, which was time-consuming and restricted analytics to a select group of users.

This is the latest effort from a vendor to loop generative AI into its product and make it more intuitive and accessible for end customers.


How exactly does ChatGPT querying work?

To perform a conversational query, an enterprise user goes to Kinetica's Workbench UI and asks a question about their proprietary data, whether plain and simple or complex and never asked before. Once the question is entered, the ChatGPT interface, powered by GPT-3.5 Turbo, converts it into SQL and runs the query to provide insights for decision-making.
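
In practice, this kind of translation layer amounts to sending the user's question, along with a description of the schema, to the model and executing the SQL it returns. The following is a minimal, hypothetical sketch using the OpenAI Python SDK and a generic database connection; it is not Kinetica's Workbench code, and the schema and function names are invented for illustration:

```python
# Minimal sketch of natural-language-to-SQL querying (hypothetical, not Kinetica's code).
# Assumes the OpenAI Python SDK and any DB-API 2.0 compatible database connection.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

SCHEMA_HINT = """
Tables:
  inventory(product_id INT, warehouse_id INT, stock_level INT, updated_at TIMESTAMP)
  deliveries(vehicle_id INT, product_id INT, destination TEXT, eta TIMESTAMP)
"""  # invented schema; in a real system this would come from database metadata


def question_to_sql(question: str) -> str:
    """Ask the model to translate a natural-language question into a single SQL query."""
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[
            {"role": "system",
             "content": "Translate the user's question into a single SQL query "
                        "for the schema below. Return only SQL.\n" + SCHEMA_HINT},
            {"role": "user", "content": question},
        ],
        temperature=0,
    )
    return response.choices[0].message.content.strip()


def run_question(conn, question: str):
    """Generate SQL from the question and execute it against the database."""
    sql = question_to_sql(question)
    cur = conn.cursor()
    cur.execute(sql)
    return sql, cur.fetchall()
```

Because the output is an ordinary SQL statement, it can be surfaced for inspection before it is run, which is essentially what Workbench does by placing the generated SQL into a notebook block.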

“Workbench is a SQL notebook system and at the top of every notebook is a little text prompt that lets you ask a question that will generate a SQL block in your notebook,” Nima Negahban, the CEO and cofounder of the company, told VentureBeat. 

For instance, to optimize inventory, a user could ask something like "What is the status of our inventory levels, and should we reroute active delivery vehicles to reduce the chances of products being out of stock?" The prompt, Kinetica says, will return action-oriented insights in seconds. The query can also be followed up with further questions, which may uncover unexpected correlations and relationships that would not be immediately apparent through traditional querying methods.

While most analytic databases require data engineering, indexing and tuning to ensure rapid querying, Kinetica delivers similar performance through native vectorization. As part of this, it stores data in fixed-size blocks called vectors and performs query operations on these vectors in parallel, rather than on individual data elements.
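Vectorized execution, in general terms, means evaluating query operators over blocks of column values at once rather than looping over individual rows. The toy comparison below illustrates only that general idea, using NumPy and an invented block size; it is not Kinetica's engine:

```python
# Illustration of the general idea behind vectorized execution (not Kinetica's engine):
# operate on fixed-size blocks of column values at once instead of row by row.
import numpy as np

BLOCK_SIZE = 4096  # hypothetical fixed block size


def filter_sum_rowwise(values, threshold):
    """Row-at-a-time evaluation: one comparison and one addition per element."""
    total = 0
    for v in values:
        if v > threshold:
            total += v
    return total


def filter_sum_vectorized(values, threshold):
    """Block-at-a-time evaluation: each block is filtered and summed as a unit."""
    total = 0.0
    for start in range(0, len(values), BLOCK_SIZE):
        block = np.asarray(values[start:start + BLOCK_SIZE])
        total += block[block > threshold].sum()
    return total
```

Processing whole blocks at a time is what allows the hardware to apply SIMD instructions and parallelism across many values, which is where this style of engine gets its speed.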

“Enterprise users will soon expect the same lightning-fast response times to random text-based questions of their data as they currently do for questions against data in the public domain with ChatGPT,” Negahban noted.

Data remains safe 

Since the ChatGPT-driven querying capability deals with company data, which can be sensitive, the CEO also emphasized that the solution is designed so that the model never captures any part of the data it queries.

“Kinetica automatically hydrates the GPT-3 context in an anonymized fashion with the necessary prompts and rules derived from the database metadata. This allows for the GPT model to generate the correct SQL query, given a user’s data model and question, without exposing the underlying detailed data to GPT,” he explained.
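
The pattern he describes, feeding the model schema metadata rather than row data, can be pictured with a small hypothetical helper. The function name and catalog structure below are invented for illustration and are not Kinetica's API; the point is that only table and column definitions reach the model, never the underlying values:

```python
# Sketch of metadata-only prompt "hydration" (hypothetical; illustrates the idea
# described above, not Kinetica's implementation). Only table and column names
# and types are sent to the model -- never the rows themselves.

def build_context(catalog: dict[str, dict[str, str]]) -> str:
    """catalog maps table name -> {column name: SQL type}, e.g. drawn from information_schema."""
    lines = ["You write SQL for the following schema. Return only SQL."]
    for table, columns in catalog.items():
        cols = ", ".join(f"{name} {dtype}" for name, dtype in columns.items())
        lines.append(f"TABLE {table} ({cols})")
    return "\n".join(lines)


# Example: the resulting prompt contains structure only, no data values.
catalog = {
    "inventory": {"product_id": "INT", "stock_level": "INT"},
    "deliveries": {"vehicle_id": "INT", "eta": "TIMESTAMP"},
}
system_prompt = build_context(catalog)
```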

Moving ahead, Kinetica plans to enhance the querying feature with GPT-4 integration and to make it more widely available to enterprises. For the latter, the company will launch a programmatic SQL API that opens the capability up to other developers for use in their own analytic applications. It is also exploring other areas where its database might benefit from large language models (LLMs).

“We have a number of ideas about how we are going to be leveraging LLMs in our roadmap. We are having dialogues with our clients, who are leaders in their respective industries such as healthcare, telecommunications, defense, banking, automotive and others, and they have some incredible use cases that we are exploring together,” Negahban said.

Other enterprise technology vendors have also started integrating LLMs into their products, including Salesforce, Microsoft, ThoughtSpot, Domo and Sisense.

