Langfuse Quickstart

This quickstart helps you integrate your LLM application with Langfuse. To get started, it logs a single LLM call.

Official documentation: https://langfuse.com/docs/get-started

Create a new project in Langfuse

Log your first LLM call to Langfuse

Python

Install dependencies

pip install langfuse

.env

LANGFUSE_SECRET_KEY="sk-lf-..."
LANGFUSE_PUBLIC_KEY="pk-lf-..."
LANGFUSE_HOST="https://cloud.langfuse.com" # EU region
# LANGFUSE_HOST="https://us.cloud.langfuse.com" # US region
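The Langfuse SDK picks these keys up from environment variables. A minimal sketch for loading the .env file in Python, assuming the python-dotenv package (an extra dependency, not part of Langfuse itself):

# pip install python-dotenv
from dotenv import load_dotenv

load_dotenv()  # loads LANGFUSE_SECRET_KEY, LANGFUSE_PUBLIC_KEY, LANGFUSE_HOST from .env into os.environ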
Change the import:

- import openai
+ from langfuse.openai import openai

Use the OpenAI SDK as usual. Example:

completion = openai.chat.completions.create(
  model="gpt-3.5-turbo",
  messages=[
      {"role": "system", "content": "You are a very accurate calculator."},
      {"role": "user", "content": "1 + 1 = "}],
)
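The wrapped client also accepts Langfuse-specific arguments on the same call, for example a generation name and metadata. A sketch based on the Langfuse OpenAI integration docs; treat the extra parameters as assumptions, they are not part of the OpenAI SDK itself:

# assumes `from langfuse.openai import openai` as shown above
completion = openai.chat.completions.create(
  name="calculator-demo",              # Langfuse-specific: name shown for this generation in the UI
  model="gpt-3.5-turbo",
  messages=[{"role": "user", "content": "1 + 1 = "}],
  metadata={"source": "quickstart"},   # Langfuse-specific: arbitrary metadata attached to the trace
)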
JS/TS

Install dependencies

npm i langfuse
# or
yarn add langfuse

# Node.js < 18
npm i langfuse-node

# Deno
import { Langfuse } from "https://esm.sh/langfuse"

Usage example

import { Langfuse } from "langfuse";

const langfuse = new Langfuse();

// Create a trace for this interaction; the generation below is attached to it
// (the trace name here is illustrative)
const trace = langfuse.trace({ name: "chat-completion-demo" });

// Example generation creation
const generation = trace.generation({
  name: "chat-completion",
  model: "gpt-3.5-turbo",
  modelParameters: {
    temperature: 0.9,
    maxTokens: 2000,
  },
  input: messages,
});

// Application code
const chatCompletion = await llm.respond(prompt);

// End generation - sets endTime
generation.end({
  output: chatCompletion,
});
OpenAI SDK (Python)

This integration is a drop-in replacement for the OpenAI Python SDK. By changing the import, Langfuse captures all LLM calls and sends them to Langfuse asynchronously.

Install dependencies

pip install langfuse
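A minimal usage sketch, mirroring the quickstart above: only the import changes, the rest is the regular OpenAI SDK (an OPENAI_API_KEY in the environment is assumed):

from langfuse.openai import openai  # drop-in replacement for `import openai`

completion = openai.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hello, Langfuse!"}],
)
print(completion.choices[0].message.content)  # the call above is traced and sent to Langfuse asynchronously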
Langchain

This integration uses the Langchain callback system to automatically capture detailed traces of your Langchain executions.

Install dependencies

pip install langfuse

Usage example

# Initialize Langfuse handler
from langfuse.callback import CallbackHandler
langfuse_handler = CallbackHandler(
    secret_key="sk-lf-...",
    public_key="pk-lf-...",
    host="https://cloud.langfuse.com", # 🇪🇺 EU region
  # host="https://us.cloud.langfuse.com", # 🇺🇸 US region
)

# Your Langchain code

# Add Langfuse handler as callback (classic and LCEL)
chain.invoke({"input": "<user_input>"}, config={"callbacks": [langfuse_handler]})

This also works with the run and predict methods.

chain.run(input="<user_input>", callbacks=[langfuse_handler])
conversation.predict(input="<user_input>", callbacks=[langfuse_handler])
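For context, a minimal end-to-end sketch of a traced chain. It assumes the langchain-openai package and an OPENAI_API_KEY in the environment; exact import paths vary between Langchain versions:

from langchain_core.prompts import ChatPromptTemplate
from langchain_openai import ChatOpenAI
from langfuse.callback import CallbackHandler

langfuse_handler = CallbackHandler(
    secret_key="sk-lf-...",
    public_key="pk-lf-...",
    host="https://cloud.langfuse.com",
)

# Simple LCEL chain: prompt -> chat model
prompt = ChatPromptTemplate.from_template("Tell me a short joke about {topic}")
chain = prompt | ChatOpenAI(model="gpt-3.5-turbo")

# Each invocation with the handler attached shows up as one trace in Langfuse
result = chain.invoke({"topic": "observability"}, config={"callbacks": [langfuse_handler]})
print(result.content)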
Langchain (JS)

This integration uses the Langchain callback system to automatically capture detailed traces of your Langchain executions.

Install dependencies

npm i langfuse-langchain

Usage example

import { CallbackHandler } from "langfuse-langchain";
// Deno: import CallbackHandler from "https://esm.sh/langfuse-langchain";

const langfuseHandler = new CallbackHandler({
  secretKey: "sk-lf-...",
  publicKey: "pk-lf-...",
  baseUrl: "https://cloud.langfuse.com", // 🇪🇺 EU region
  // baseUrl: "https://us.cloud.langfuse.com", // 🇺🇸 US region
});

// Your Langchain code

// Add Langfuse handler as callback to `run` or `invoke`
await chain.invoke({ input: "<user_input>" }, { callbacks: [langfuseHandler] });
LlamaIndex

This integration uses the LlamaIndex callback system to automatically capture detailed traces of your LlamaIndex executions.

Install dependencies

pip install llama-index langfuse

Usage example

from llama_index.core import Settings
from llama_index.core.callbacks import CallbackManager
from langfuse.llama_index import LlamaIndexCallbackHandler

langfuse_callback_handler = LlamaIndexCallbackHandler(
    public_key="pk-lf-...",
    secret_key="sk-lf-...",
    host="https://cloud.langfuse.com"
)
Settings.callback_manager = CallbackManager([langfuse_callback_handler])

Traces and metrics from your LlamaIndex application are now automatically tracked in Langfuse. If you build a new index or query an LLM with your documents in context, the traces and metrics are immediately visible in the Langfuse UI.
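A minimal sketch of such a query, assuming an OPENAI_API_KEY in the environment (LlamaIndex defaults to OpenAI models for the LLM and embeddings):

from llama_index.core import Document, VectorStoreIndex

# With the callback manager configured above, both the index construction
# and the query below emit traces to Langfuse.
index = VectorStoreIndex.from_documents(
    [Document(text="Langfuse is an open-source LLM engineering platform.")]
)
query_engine = index.as_query_engine()
print(query_engine.query("What is Langfuse?"))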

For more details, see the LlamaIndex integration documentation.
