AI & ML News

DataStax Adds Vector Search to Astra DB on Google Cloud for Building Real-time Generative AI Applications

DataStax’s popular database-as-a-service (DBaaS) Astra DB – built on the open source Apache Cassandra® database – now supports vector search, a key capability that lets databases provide long-term memory for AI applications using large language models (LLMs) and other AI use cases.

Coming on the heels of vector search’s introduction into Cassandra, the availability of this new capability in the pay-as-you-go Astra DB service lets developers easily leverage the massively scalable Cassandra database for their LLM, AI assistant, and real-time generative AI projects. Goldman Sachs Research estimates that the generative AI software market could grow to $150 billion, compared with $685 billion for the global software industry.

DataStax is working closely with the Google Cloud AI/ML Center of Excellence, as part of the Built with Google AI program, so that the best of Google Cloud’s generative AI offerings can enhance the capabilities and experience of customers using DataStax.

Vector search enables developers to search a database by context or meaning rather than by keywords or literal values. This is done using “embeddings” – generated, for example, with Google Cloud’s text embedding API – which represent semantic concepts as vectors, making it possible to search unstructured datasets such as text and images.

Embeddings are a powerful tool: they enable natural-language search across a large corpus of data in different formats and the extraction of the most relevant pieces of data.
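
As a rough illustration of searching by meaning (the documents and vectors below are toy values, not output from a real embedding model), the following sketch ranks documents against a query by the cosine similarity of their embedding vectors:

```python
# Illustrative only: toy 4-dimensional "embeddings" stand in for the
# high-dimensional vectors a real text embedding model would return.
import numpy as np

documents = {
    "refund policy":        np.array([0.9, 0.1, 0.0, 0.2]),
    "shipping times":       np.array([0.1, 0.8, 0.3, 0.0]),
    "password reset steps": np.array([0.0, 0.2, 0.9, 0.4]),
}

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Higher values mean the vectors point in more similar directions."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# A query such as "how do I get my money back?" would be embedded the same
# way; here we hard-code a vector that happens to land near "refund policy".
query_vector = np.array([0.85, 0.15, 0.05, 0.1])

ranked = sorted(
    documents.items(),
    key=lambda item: cosine_similarity(query_vector, item[1]),
    reverse=True,
)
print(ranked[0][0])  # -> "refund policy", matched by meaning, not by keyword
```

In practice the vectors would come from an embedding model such as the Google Cloud text embedding API mentioned above, and the nearest-neighbour search would run inside the database rather than in application code.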

Vector stores are required to deliver extremely low-latency search across databases. Together, embeddings, vector stores, and generative AI models like Google PaLM 2 can create powerful capabilities that dynamically combine the right information for the right customer, in the language they expect. And now that Cassandra can search by meaning, it will play a key role in building generative AI applications.
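
What follows is a minimal, hypothetical sketch of that pattern from application code – assuming a demo.docs table with a vector<float, 768> column, placeholder credentials, and a recent DataStax Python driver with vector-type support; none of these details come from the announcement itself:

```python
# Hypothetical schema this sketch assumes (not from the announcement):
#   CREATE TABLE demo.docs (id uuid PRIMARY KEY,
#                           body text,
#                           embedding vector<float, 768>);
#   plus a storage-attached index on the `embedding` column.
from cassandra.cluster import Cluster
from cassandra.auth import PlainTextAuthProvider

# Placeholder bundle path and token – substitute real Astra DB credentials.
cluster = Cluster(
    cloud={"secure_connect_bundle": "/path/to/secure-connect-demo.zip"},
    auth_provider=PlainTextAuthProvider("token", "AstraCS:..."),
)
session = cluster.connect("demo")

def retrieve_context(query_embedding: list[float]) -> list[str]:
    """Approximate-nearest-neighbour search: return the rows whose stored
    embeddings are closest in meaning to the query embedding. Binding a
    Python list to a vector column assumes a driver version with vector
    support."""
    rows = session.execute(
        "SELECT body FROM docs ORDER BY embedding ANN OF %s LIMIT 3",
        (query_embedding,),
    )
    return [row.body for row in rows]
```

The passages returned by the ANN query would then be folded into the prompt sent to a generative model such as PaLM 2 – the "dynamically combine the right information" step described above.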

“Vector search is a key part of the new AI stack; every developer building for AI needs to make their data easily queryable by AI agents,” said Ed Anuff, CPO, DataStax. “Unlike many other vector databases, Astra DB is not only built for global scale and availability, but also supports the most stringent enterprise-level requirements for managing sensitive data, including HIPAA, PCI, and PII regulations. It’s therefore an ideal option for both startups and enterprises that manage sensitive user information and want to build impactful generative AI applications.”

“Our customers consistently ask us for ways to tightly integrate data and AI capabilities,” added Stephen Orban, VP of Migrations and GenAI Ecosystem at Google Cloud. “By integrating Google Cloud’s generative AI capabilities into Astra DB, DataStax is adding natural language capabilities to a suite of already powerful database capabilities, and giving customers a complete and unified data and AI solution.”

“Priceline has been at the forefront of using machine learning for many years,” said Martin Brodbeck, CTO, Priceline. “Vector search gives us the ability to semantically query the billions of real-time signals we receive as part of our checkout experience that flow back to Astra DB. We plan to use Google Cloud’s generative AI capabilities alongside Astra DB’s vector search to power our real-time data infrastructure and generative AI experiences.”
