Open source vector database vendor targets enterprise AI costs with cloud update


As generative AI usage has grown dramatically in the last several years, vector databases have evolved from cutting-edge technology to essential enterprise infrastructure. 

With vector databases becoming more crucial, enterprises are taking an ever-closer look at performance and cost. Zilliz, the company behind the open-source Milvus vector database, is announcing new features aimed at dramatically reducing costs and complexity for production deployments, addressing the demands of enterprise users who have moved beyond initial experiments to full-scale AI implementations.

The timing is particularly relevant given the explosive growth in vector database adoption since late 2022, when OpenAI’s ChatGPT catalyzed widespread interest in AI applications. The new features specifically target enterprises struggling with growing deployment sizes and the complexity of managing vector databases in production environments. In just two years, deployment scales have grown from millions to billions of vectors. Zilliz’s largest implementation now manages 100 billion vectors. The technology is now deployed across diverse use cases including multimodal applications, recommendation systems, autonomous driving, drug discovery, fraud detection and cybersecurity.

“In the past two years, we definitely saw that vector databases are moving from a cutting edge technology, to becoming a more mainstream technology,” Charles Xie, founder and CEO of Zilliz, told VentureBeat.

Enterprise AI vector database differentiation in a crowded market

In 2024, vector database technology has become table stakes for enterprise AI deployment. Nearly every database vendor has some form of vector implementation, including Oracle, Microsoft, Google, DataStax and MongoDB among others. 

Milvus, however, is different in that it is a purpose-built vector database. In that category, competition includes vendors such as Pinecone. While there are other open-source vector database technologies, Milvus is the only one hosted under the Linux Foundation's LF AI & Data Foundation.

That foundation home has enabled Milvus to receive contributions from a wide ecosystem of participating institutions and organizations. Xie noted that IBM, Nvidia, Apple, Salesforce and Intel are among the organizations that have contributed code to the Milvus open-source project.

According to Xie, the combination of an open-source foundation home, a native vector database focus and, most importantly, specialized features helps differentiate his company's technology in the crowded market. Xie argued that being solely focused on vector database technology allows Zilliz to deliver more comprehensive and optimized solutions than vendors that treat vectors as just another data type.

This specialization has enabled Zilliz to develop features specifically tailored to enterprise vector search needs, including compliance, security and high availability capabilities that production environments demand.

How Zilliz is improving its vector database for Enterprise AI production needs

The Zilliz Cloud offering is built on top of the open-source Milvus database. It provides a managed service for the database that makes it easier for organizations to consume and use.

As part of the latest Zilliz Cloud update, the company has added an automated indexing system that removes the need for manual parameter tuning. The new feature automatically picks the optimal indexing algorithm to provide the best performance, without the user having to configure indexes by hand.

“Out of the box, you get the best performance,” Xie said.

The auto-indexing feature is part of Zilliz Cloud’s effort to provide an “autonomous driving mode” for vector databases, using machine learning algorithms to optimize performance behind the scenes. This helps reduce total cost of ownership, as customers no longer need to spend time and resources on manual index tuning.
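In the open-source Milvus client, this hands-off approach is exposed through the AUTOINDEX index type, which delegates algorithm selection and tuning to the service. Below is a minimal pymilvus sketch; the endpoint, token, collection name and field name are illustrative placeholders, not details from Zilliz.

```python
from pymilvus import MilvusClient

# Endpoint, token, collection and field names are illustrative placeholders.
client = MilvusClient(uri="https://<your-cluster>.zillizcloud.com", token="<api-key>")

# AUTOINDEX delegates algorithm selection and parameter tuning to the service,
# rather than asking the user to pick and tune index settings by hand.
index_params = client.prepare_index_params()
index_params.add_index(
    field_name="embedding",
    index_type="AUTOINDEX",
    metric_type="COSINE",
)

client.create_index(collection_name="docs", index_params=index_params)
```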

Algorithm optimization helps to improve specific Enterprise AI use cases

Going a step further, Zilliz is now also integrating an algorithm optimizer. 

The optimization works with inverted-file (IVF) as well as graph-based vector retrieval algorithms. Memory allocation and compute performance are also optimized for fast execution, which the company claims provides up to a 3X speedup over unoptimized implementations.

The algorithm optimizer works across different use cases, whether the organization is running a document search system, a recommendation engine, fraud detection, or any other vector-based application.
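For context on the two algorithm families mentioned, here is a minimal pymilvus sketch of what choosing between them looks like when tuning by hand; the endpoint, collection and field names, and the parameter values are assumptions for illustration. Zilliz's optimizer is described as handling this layer automatically.

```python
from pymilvus import MilvusClient

client = MilvusClient(uri="http://localhost:19530")  # placeholder local endpoint

index_params = client.prepare_index_params()

# Inverted-file (IVF) family: vectors are partitioned into nlist clusters and
# only the closest clusters are probed at query time.
index_params.add_index(
    field_name="embedding",
    index_type="IVF_FLAT",
    metric_type="L2",
    params={"nlist": 1024},
)

# Graph-based alternative (HNSW): builds a navigable small-world graph where
# M and efConstruction trade index build cost for recall.
# index_params.add_index(
#     field_name="embedding",
#     index_type="HNSW",
#     metric_type="L2",
#     params={"M": 16, "efConstruction": 200},
# )

client.create_index(collection_name="docs", index_params=index_params)
```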

Hybrid search and storage innovation helps lower enterprise AI cost

The new release also introduces hybrid search functionality, combining vector similarity search with traditional keyword-based searching in a single system. 

The integration allows companies to consolidate their search infrastructure and reduce operational complexity. Xie explained that the keyword-based search component relies on the industry-standard BM25 algorithm along with a sparse index.
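In the open-source Milvus client, combining the two retrieval modes is exposed through a hybrid search call that fuses a dense (vector similarity) request with a sparse (keyword-weight) request and reranks the results. The sketch below uses pymilvus with placeholder vectors, field names and endpoint; it is an illustration of the general API, not Zilliz's internal implementation.

```python
from pymilvus import MilvusClient, AnnSearchRequest, RRFRanker

client = MilvusClient(uri="https://<your-cluster>.zillizcloud.com", token="<api-key>")

query_embedding = [0.1] * 768          # placeholder dense embedding from your model
query_sparse = {101: 0.8, 2042: 0.3}   # placeholder sparse term-id -> weight map

# Dense (semantic similarity) leg of the hybrid query.
dense_req = AnnSearchRequest(
    data=[query_embedding],
    anns_field="dense_vector",
    param={"metric_type": "COSINE"},
    limit=20,
)

# Sparse (keyword-style, BM25-weighted) leg of the hybrid query.
sparse_req = AnnSearchRequest(
    data=[query_sparse],
    anns_field="sparse_vector",
    param={"metric_type": "IP"},
    limit=20,
)

# Fuse both result lists with reciprocal rank fusion and return the top hits.
results = client.hybrid_search(
    collection_name="docs",
    reqs=[dense_req, sparse_req],
    ranker=RRFRanker(),
    limit=10,
    output_fields=["title"],
)
```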

To address growing storage costs, Zilliz has implemented a hierarchical storage system that makes its service more cost-effective than traditional in-memory vector databases. The multi-layer storage hierarchy allows most data to be stored on local disks and object storage, making it cheaper than a pure in-memory solution, according to Xie.

Xie claims that, through this new set of performance and storage innovations, Zilliz will be able to reduce vector database consumption costs for its users.

Looking ahead, Zilliz has ambitious plans for further cost optimization. 

“I’m going to make a very bold prediction here, that in the next five years, the cost, the total cost of vector database solution, should be reduced by another 100 times,” Xie stated.


