CerebriumAI

Cerebrium is an AWS SageMaker alternative. It also provides API access to several LLM models.

This notebook goes over how to use LangChain with CerebriumAI.

Install cerebrium

The cerebrium package is required to use the CerebriumAI API. Install cerebrium using pip3 install cerebrium.

# Install the package
!pip3 install cerebrium

Imports

import os

from langchain.chains import LLMChain
from langchain_community.llms import CerebriumAI
from langchain_core.prompts import PromptTemplate

Set the Environment API Key

Make sure to get your API key from CerebriumAI. You are given 1 hour of serverless GPU compute free to test different models.

os.environ["CEREBRIUMAI_API_KEY"] = "YOUR_KEY_HERE"
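
If you prefer not to hardcode the key in the notebook, here is a minimal sketch that reads it interactively instead, using the standard library getpass:

import os
from getpass import getpass

# Prompt for the key so it is not stored in the notebook source
os.environ["CEREBRIUMAI_API_KEY"] = getpass("CerebriumAI API key: ")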

Create the CerebriumAI instance

You can specify different parameters such as the model endpoint URL, max length, temperature, etc. An endpoint URL is required.

llm = CerebriumAI(endpoint_url="YOUR ENDPOINT URL HERE")
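
As a rough sketch, extra generation settings can be passed alongside the endpoint URL. The parameter names below (max_length, temperature) are assumptions; they are forwarded to the endpoint, so use whatever names your deployed model actually accepts.

llm = CerebriumAI(
    endpoint_url="YOUR ENDPOINT URL HERE",
    # Assumed generation settings -- forwarded to the endpoint as model
    # kwargs; adjust to match your deployed model.
    max_length=100,
    temperature=0.0,
)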

Create a Prompt Template

We will create a prompt template for Question and Answer.

template = """Question: {question}

Answer: Let's think step by step."""

prompt = PromptTemplate.from_template(template)
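
To see exactly what will be sent to the model, you can render the template with a sample question; format fills in the {question} placeholder:

# Render the template with a sample value to inspect the final prompt
print(prompt.format(question="What is serverless GPU compute?"))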

Initialize the LLMChain

llm_chain = LLMChain(prompt=prompt, llm=llm)

Run the LLMChain

Provide a question and run the LLMChain.

question = "What NFL team won the Super Bowl in the year Justin Bieber was born?"

llm_chain.run(question)
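
On newer LangChain releases, run is deprecated in favor of invoke, which takes a dict of inputs and returns a dict of outputs. A minimal sketch, assuming the chain's default "text" output key:

# invoke returns a dict containing the inputs plus the generated text
result = llm_chain.invoke({"question": question})
print(result["text"])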
