ChatDatabricks
The Databricks Lakehouse Platform unifies data, analytics, and AI on a single platform.
This notebook provides a quick overview for getting started with Databricks chat models. For detailed documentation of all ChatDatabricks features and configurations, head to the API reference.
Overview
The ChatDatabricks class wraps a chat model endpoint hosted on Databricks Model Serving. This example notebook shows how to wrap your serving endpoint and use it as a chat model in your LangChain application.
Integration details
Class | Package | Local | Serializable |
---|---|---|---|
ChatDatabricks | databricks-langchain | ❌ | beta |
Model features
Tool calling | Structured output | JSON mode | Image input | Audio input | Video input | Token-level streaming | Native async | Token usage | Logprobs |
---|---|---|---|---|---|---|---|---|---|
✅ | ✅ | ✅ | ❌ | ❌ | ❌ | ✅ | ✅ | ✅ | ❌ |
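As a quick illustration of the tool-calling support listed above, here is a minimal sketch. The endpoint name below is an assumption; substitute any tool-calling-capable chat endpoint available in your workspace.

from pydantic import BaseModel, Field

from databricks_langchain import ChatDatabricks


class GetWeather(BaseModel):
    """Get the current weather for a given city."""

    city: str = Field(description="Name of the city")


# Assumed endpoint name -- replace with one available in your workspace.
llm = ChatDatabricks(endpoint="databricks-meta-llama-3-1-70b-instruct")
llm_with_tools = llm.bind_tools([GetWeather])

response = llm_with_tools.invoke("What's the weather like in Paris?")
print(response.tool_calls)  # tool invocation(s) proposed by the model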
Supported Methods
ChatDatabricks supports all methods of ChatModel, including async APIs.
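For example, a minimal sketch of the async API (the endpoint name is an assumption; use any chat endpoint available in your workspace):

import asyncio

from databricks_langchain import ChatDatabricks

llm = ChatDatabricks(endpoint="databricks-dbrx-instruct")  # assumed endpoint name


async def main():
    # ainvoke is the async counterpart of invoke; astream and abatch also exist.
    response = await llm.ainvoke("What is MLflow?")
    print(response.content)


# In a notebook cell you can simply write: await main()
asyncio.run(main())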
Endpoint Requirement
The serving endpoint that ChatDatabricks wraps must have an OpenAI-compatible chat input/output format (reference). As long as the input format is compatible, ChatDatabricks can be used for any endpoint type hosted on Databricks Model Serving (see the sketch after this list):
- Foundation Models - A curated list of state-of-the-art foundation models such as DBRX, Llama 3, and Mixtral-8x7B. These endpoints are ready to use in your Databricks workspace without any setup.
- Custom Models - You can also deploy custom models to a serving endpoint via MLflow with your choice of framework, such as LangChain, PyTorch, or Transformers.
- External Models - Databricks endpoints can serve as a proxy for models hosted outside Databricks, such as proprietary model services like OpenAI's GPT-4.
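The querying code is identical across all three endpoint types; only the endpoint name passed to ChatDatabricks changes. A minimal sketch (both endpoint names below are assumptions):

from databricks_langchain import ChatDatabricks

# A Foundation Models endpoint (name assumed; check your workspace).
foundation_llm = ChatDatabricks(endpoint="databricks-dbrx-instruct")

# A custom or external-model endpoint you deployed yourself (hypothetical name).
custom_llm = ChatDatabricks(endpoint="my-custom-chat-endpoint")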
Setup
To access Databricks models, you'll need to create a Databricks account, set up credentials (only if you are outside a Databricks workspace), and install the required packages.
Credentials (only if you are outside Databricks)
If you are running your LangChain app inside Databricks, you can skip this step.
Otherwise, you need to manually set the Databricks workspace hostname and a personal access token in the DATABRICKS_HOST and DATABRICKS_TOKEN environment variables, respectively. See the Authentication Documentation for how to get an access token.
import getpass
import os

os.environ["DATABRICKS_HOST"] = "https://your-workspace.cloud.databricks.com"
if "DATABRICKS_TOKEN" not in os.environ:
    os.environ["DATABRICKS_TOKEN"] = getpass.getpass(
        "Enter your Databricks access token: "
    )
Enter your Databricks access token: ········
Installation
The LangChain Databricks integration lives in the databricks-langchain package.
%pip install -qU databricks-langchain
We first demonstrate how to query the DBRX-instruct model hosted as a Foundation Models endpoint with ChatDatabricks.
For other types of endpoints, there are some differences in how to set up the endpoint itself; however, once the endpoint is ready, there is no difference in how to query it with ChatDatabricks. Please refer to the bottom of this notebook for examples with other endpoint types.
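A minimal sketch of that first query (the temperature and max_tokens values are arbitrary examples):

from databricks_langchain import ChatDatabricks

chat_model = ChatDatabricks(
    endpoint="databricks-dbrx-instruct",
    temperature=0.1,
    max_tokens=256,
)

# Invoke the endpoint with a simple prompt and print the model's reply.
print(chat_model.invoke("What is MLflow?").content)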