Docker, the software platform for building container-based applications, has partnered with Neo4j, LangChain, and Ollama to release the Gen AI Stack, a one-stop platform designed to let developers both create applications and incorporate generative AI into them.
The new Gen AI Stack combines Neo4j’s graph and vector search capabilities with LangChain orchestration.
Docker’s partnership with Neo4j is meant to make AI models faster and to ground large language models (LLMs) with Neo4j’s knowledge graph in order to generate more accurate results. LangChain orchestration, meanwhile, is expected to help developers connect the database to the vector index, and the LLM to the application.
Connecting the database to the vector index is expected to provide a framework for building context-aware reasoning applications powered by LLMs, the company said.
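The pattern the company describes is retrieval-augmented generation: relevant records are fetched from a vector index and placed into the LLM prompt so the model answers from stored knowledge rather than from memory alone. Below is a minimal, library-free sketch of that pattern; the documents, the bag-of-words "embedding," and every function name here are illustrative stand-ins, not the Neo4j or LangChain APIs the stack actually uses.

```python
# Toy sketch of the retrieval pattern the Gen AI Stack wires together:
# a vector index grounds the LLM prompt with relevant stored knowledge.
# All names and the embedding scheme below are hypothetical placeholders.
import math
from collections import Counter

def embed(text):
    """Bag-of-words term counts as a stand-in for a real embedding model."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse term-count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# A tiny in-memory "vector index" over two example documents.
documents = [
    "Neo4j stores knowledge graphs and supports vector search.",
    "Docker Desktop ships with a Learning Center.",
]
index = [(doc, embed(doc)) for doc in documents]

def retrieve(question, k=1):
    """Return the k documents most similar to the question."""
    q = embed(question)
    ranked = sorted(index, key=lambda pair: cosine(q, pair[1]), reverse=True)
    return [doc for doc, _ in ranked[:k]]

def build_prompt(question):
    """Assemble a grounded prompt: retrieved context plus the question."""
    context = "\n".join(retrieve(question))
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"

prompt = build_prompt("What does Neo4j provide for vector search?")
```

In the real stack, the embedding model, the Neo4j vector index, and the prompt assembly are managed by LangChain rather than hand-rolled as above.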
The company’s partnership with Ollama will help developers run open-source LLMs locally, it added.
The stack, which will come with preconfigured open-source LLMs such as Llama 2, Code Llama, and Mistral, includes a series of supporting tools, code templates, how-tos, and generative AI best practices.
The Gen AI Stack is currently available in the Learning Center in Docker Desktop and on GitHub.
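For a Docker-based stack of this kind, getting started typically means cloning the repository and bringing the preconfigured services up with Compose. The repository path below is an assumption based on the announcement, not a confirmed location, so treat this as a sketch rather than official instructions:

```shell
# Hypothetical quick start -- the repository URL is an assumption.
git clone https://github.com/docker/genai-stack.git
cd genai-stack
docker compose up
```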
In addition, the company has launched a new generative AI assistant, dubbed Docker AI, via an early access program.
In contrast to other code-generating assistants such as Copilot or Amazon CodeWhisperer, the Docker AI assistant helps developers define and troubleshoot all aspects of an application, according to Docker CEO Scott Johnston.
“Docker AI provides context-specific, automated guidance to developers when they are editing a Dockerfile or Docker Compose file, debugging their local ‘docker build,’ or running a test locally,” the company said in a statement, adding that the assistant is aimed at helping developers focus on application building rather than on the tools.
Copyright © 2023 IDG Communications, Inc.