Observability with OpenLLMetry
OpenLLMetry is an open-source project built on OpenTelemetry for tracing and monitoring LLM applications. It connects to all major observability platforms (such as Datadog, Dynatrace, Honeycomb, and New Relic) and can be set up in minutes.
If you're opening this notebook on Colab, you will probably need to install LlamaIndex 🦙 and OpenLLMetry.
In [ ]:
!pip install llama-index
!pip install traceloop-sdk
Configure API Keys
Sign up for a Traceloop account at app.traceloop.com. Then go to the API keys page and create a new API key. Copy the key and paste it into the cell below.
If you prefer a different observability platform (such as Datadog, Dynatrace, or Honeycomb), see here for setup instructions.
In [ ]:
import os

os.environ["OPENAI_API_KEY"] = "sk-..."
os.environ["TRACELOOP_API_KEY"] = "..."
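Hardcoding keys is fine for a quick demo, but for anything shared it is safer to read them from the environment and fail fast when they are missing. A minimal sketch (the `require_env` helper is hypothetical, not part of traceloop-sdk or LlamaIndex):

```python
import os


def require_env(name: str) -> str:
    """Return the value of an environment variable, or raise if unset."""
    value = os.environ.get(name)
    if not value:
        raise RuntimeError(f"Set the {name} environment variable first")
    return value


# Example: resolve the keys before initializing any SDK
# openai_key = require_env("OPENAI_API_KEY")
# traceloop_key = require_env("TRACELOOP_API_KEY")
```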
Initialize OpenLLMetry
In [ ]:
from traceloop.sdk import Traceloop

Traceloop.init()
Traceloop syncing configuration and prompts
Traceloop exporting traces to https://api.traceloop.com authenticating with bearer token
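As the log above shows, `Traceloop.init()` exports traces to Traceloop's hosted backend by default. To route traces to a different OpenTelemetry-compatible backend instead, traceloop-sdk can be pointed elsewhere via environment variables before `init()` is called; a sketch, assuming the `TRACELOOP_BASE_URL` and `TRACELOOP_HEADERS` variables (the endpoint and token below are placeholders):

```shell
# Placeholder values — substitute your backend's OTLP endpoint and credentials
export TRACELOOP_BASE_URL="https://otlp.example.com:4318"
export TRACELOOP_HEADERS="Authorization=Bearer%20<your-token>"
```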
Download Data
In [ ]:
!mkdir -p 'data/paul_graham/'
!wget 'https://raw.githubusercontent.com/run-llama/llama_index/main/docs/docs/examples/data/paul_graham/paul_graham_essay.txt' -O 'data/paul_graham/paul_graham_essay.txt'
--2024-01-12 12:43:16--  https://raw.githubusercontent.com/run-llama/llama_index/main/docs/docs/examples/data/paul_graham/paul_graham_essay.txt
Resolving raw.githubusercontent.com (raw.githubusercontent.com)... 185.199.109.133, 185.199.108.133, 185.199.111.133, ...
Connecting to raw.githubusercontent.com (raw.githubusercontent.com)|185.199.109.133|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: 75042 (73K) [text/plain]
Saving to: ‘data/paul_graham/paul_graham_essay.txt’

data/paul_graham/pa 100%[===================>]  73.28K  --.-KB/s    in 0.02s

2024-01-12 12:43:17 (3.68 MB/s) - ‘data/paul_graham/paul_graham_essay.txt’ saved [75042/75042]
In [ ]:
from llama_index.core import SimpleDirectoryReader

docs = SimpleDirectoryReader("./data/paul_graham/").load_data()
Run a Query
In [ ]:
from llama_index.core import VectorStoreIndex

index = VectorStoreIndex.from_documents(docs)
query_engine = index.as_query_engine()
response = query_engine.query("What did the author do growing up?")
print(response)
The author wrote short stories and also worked on programming, specifically on an IBM 1401 computer in 9th grade. They used an early version of Fortran and typed programs on punch cards. They also mentioned getting a microcomputer, a TRS-80, in about 1980 and started programming on it.
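The query above is traced automatically because OpenLLMetry instruments LlamaIndex itself. traceloop-sdk also exposes decorators for grouping several calls under one named trace. A minimal sketch, assuming the `traceloop.sdk.decorators` module (the no-op fallback is an addition so the sketch runs even without the SDK installed; `qa_pipeline` and `answer` are illustrative names, not part of any API):

```python
try:
    # traceloop-sdk's decorator API (assumption: available in recent versions)
    from traceloop.sdk.decorators import workflow
except ImportError:
    # Fallback no-op decorator so this sketch stays runnable without the SDK
    def workflow(name=None):
        def wrap(fn):
            return fn
        return wrap


@workflow(name="qa_pipeline")  # everything inside appears under one trace
def answer(question: str) -> str:
    # In the notebook this would call: query_engine.query(question)
    return f"stub answer to: {question}"


print(answer("What did the author do growing up?"))
```

With the decorator applied, spans produced inside `answer` (retrieval, the LLM call) are nested under a single `qa_pipeline` trace in the dashboard.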