Lunary
This page covers how to use Lunary with LangChain.
What is Lunary?
Lunary is an open-source platform that provides observability (tracing, analytics, feedback tracking), prompt template management, and evaluation for AI apps.
Installation
Start by installing the Lunary package in your project:
- npm: npm install lunary
- Yarn: yarn add lunary
- pnpm: pnpm add lunary
Setup
Create an account on lunary.ai. Then, create an App and copy the associated tracking ID.
Once you have it, set it as an environment variable in your .env file:
LUNARY_APP_ID="..."
# Optional if you're self hosting:
# LUNARY_API_URL="..."
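If you load environment variables from a .env file at runtime, make sure they are loaded before the handler is created. A minimal sketch, assuming the dotenv package is installed in your project:
// Load variables from .env before the Lunary handler is constructed
// (assumption: the dotenv package is a dependency of your project)
import "dotenv/config";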
If you prefer not to use environment variables, you can set your app ID explicitly like this:
import { LunaryHandler } from "@langchain/community/callbacks/handlers/lunary";
const handler = new LunaryHandler({
appId: "app ID",
// verbose: true,
// apiUrl: 'custom self hosting url'
});
You can now use the callback handler with LLM calls, chains and agents.
Quick Start
import { LunaryHandler } from "@langchain/community/callbacks/handlers/lunary";
import { ChatOpenAI } from "@langchain/openai";
const model = new ChatOpenAI({
callbacks: [new LunaryHandler()],
});
API Reference:
- ChatOpenAI from @langchain/openai
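Invoking the model then sends traces to Lunary automatically. A minimal usage sketch (the prompt is illustrative):
// Any call made through this model is now reported to Lunary
const res = await model.invoke("Tell me a joke");
console.log(res.content);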
LangChain Agent Tracing
When tracing chains or agents, make sure to include the callback handler at the run level so that all nested LLM calls and chain runs are reported as well.
import { LunaryHandler } from "@langchain/community/callbacks/handlers/lunary";
import { initializeAgentExecutorWithOptions } from "langchain/agents";
import { ChatOpenAI } from "@langchain/openai";
import { Calculator } from "@langchain/community/tools/calculator";
const tools = [new Calculator()];
const chat = new ChatOpenAI({
model: "gpt-3.5-turbo",
temperature: 0,
callbacks: [new LunaryHandler()],
});
const executor = await initializeAgentExecutorWithOptions(tools, chat, {
agentType: "openai-functions",
});
const result = await executor.run(
"What is the approximate result of 78 to the power of 5?",
{
callbacks: [new LunaryHandler()],
metadata: { agentName: "SuperCalculator" },
}
);
API Reference:
- initializeAgentExecutorWithOptions from langchain/agents
- ChatOpenAI from @langchain/openai
- Calculator from @langchain/community/tools/calculator
Tracking users
You can track users by adding userId and userProps to the metadata of your calls:
import { LunaryHandler } from "@langchain/community/callbacks/handlers/lunary";
import { initializeAgentExecutorWithOptions } from "langchain/agents";
import { ChatOpenAI } from "@langchain/openai";
import { Calculator } from "@langchain/community/tools/calculator";
const tools = [new Calculator()];
const chat = new ChatOpenAI({
model: "gpt-3.5-turbo",
temperature: 0,
callbacks: [new LunaryHandler()],
});
const executor = await initializeAgentExecutorWithOptions(tools, chat, {
agentType: "openai-functions",
});
const result = await executor.run(
"What is the approximate result of 78 to the power of 5?",
{
callbacks: [new LunaryHandler()],
metadata: {
agentName: "SuperCalculator",
userId: "user123",
userProps: {
name: "John Doe",
email: "email@example.org",
},
},
}
);
API Reference:
- initializeAgentExecutorWithOptions from langchain/agents
- ChatOpenAI from @langchain/openai
- Calculator from @langchain/community/tools/calculator
Tagging calls
You can tag calls with tags:
import { LunaryHandler } from "@langchain/community/callbacks/handlers/lunary";
import { ChatOpenAI } from "@langchain/openai";
const chat = new ChatOpenAI({
model: "gpt-3.5-turbo",
temperature: 0,
callbacks: [new LunaryHandler()],
});
await chat.invoke("Hello", {
tags: ["greeting"],
});
API Reference:
- ChatOpenAI from @langchain/openai
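Tags can be combined with the other per-call options shown above, such as metadata. A minimal sketch that reuses the userId field from the user-tracking example:
// Tags and metadata can be passed together in the same call options
await chat.invoke("Hello", {
  tags: ["greeting"],
  metadata: { userId: "user123" },
});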
Usage with custom agents
You can use the callback handler combined with the lunary module to track custom agents that partially use LangChain:
import { LunaryHandler } from "@langchain/community/callbacks/handlers/lunary";
import { ChatOpenAI } from "@langchain/openai";
import { HumanMessage, SystemMessage } from "@langchain/core/messages";
import lunary from "lunary";
const chat = new ChatOpenAI({
model: "gpt-4",
callbacks: [new LunaryHandler()],
});
async function TranslatorAgent(query: string) {
const res = await chat.invoke([
new SystemMessage(
"You are a translator agent that hides jokes in each translation."
),
new HumanMessage(
`Translate this sentence from English to French: ${query}`
),
]);
return res.content;
}
// By wrapping the agent with wrapAgent, we automatically track all inputs, outputs and errors,
// and tools and logs will be tied to the correct agent
const translate = lunary.wrapAgent(TranslatorAgent);
// You can use .identify() on wrapped methods to track users
const res = await translate("Good morning").identify("user123");
console.log(res);
API Reference:
- ChatOpenAI from @langchain/openai
- HumanMessage from @langchain/core/messages
- SystemMessage from @langchain/core/messages
Full documentation
You can find the full documentation of the Lunary LangChain integration here.
Support
For any questions or issues with the integration, you can reach out to the Lunary team via email or live chat on the website.