Retrieve Relevant Information through RAG
Install command
npx skills add https://github.com/run-llama/vibe-llama --skill 'Retrieve relevant information through RAG'
Skill documentation
Information Retrieval
Quick start
You can create an index on LlamaCloud using the following code. By default, new indexes use managed embeddings (OpenAI text-embedding-3-small, 1536 dimensions, 1 credit/page):
from llama_index.core import SimpleDirectoryReader
from llama_cloud_services import LlamaCloudIndex

# load documents from a local directory (the path here is an example)
documents = SimpleDirectoryReader("data").load_data()

# create a new index (uses managed embeddings by default)
index = LlamaCloudIndex.from_documents(
    documents,
    "my_first_index",
    project_name="default",
    api_key="llx-...",
    verbose=True,
)
# connect to an existing index
index = LlamaCloudIndex("my_first_index", project_name="default")
You can also configure a retriever for managed retrieval:
# from the existing index
index.as_retriever()
# from scratch
from llama_cloud_services import LlamaCloudRetriever
retriever = LlamaCloudRetriever("my_first_index", project_name="default")
# perform retrieval
result = retriever.retrieve("What is the capital of France?")
And of course, you can use other index shortcuts to make use of your new managed index:
# `llm` is any LlamaIndex-compatible LLM instance
query_engine = index.as_query_engine(llm=llm)
# perform retrieval and generation
result = query_engine.query("What is the capital of France?")
Retriever Settings
A full list of retriever settings/kwargs is below:
dense_similarity_top_k: Optional[int] — If greater than 0, retrieve k nodes using dense retrieval
sparse_similarity_top_k: Optional[int] — If greater than 0, retrieve k nodes using sparse retrieval
enable_reranking: Optional[bool] — Whether to enable reranking or not. Sacrifices some speed for accuracy
rerank_top_n: Optional[int] — The number of nodes to return after reranking initial retrieval results
alpha: Optional[float] — The weighting between dense and sparse retrieval. 1 = full dense retrieval, 0 = full sparse retrieval.
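These settings are passed as keyword arguments when building the retriever (for example, index.as_retriever(dense_similarity_top_k=5, alpha=0.5)). To make the alpha weighting concrete, here is a minimal sketch of how a blended score can combine a dense and a sparse relevance score for one node; this illustrates the parameter's meaning and is not LlamaCloud's server-side implementation:

```python
def hybrid_score(dense: float, sparse: float, alpha: float) -> float:
    """Blend a dense and a sparse relevance score for a single node.

    alpha=1.0 keeps only the dense score; alpha=0.0 keeps only the
    sparse score; values in between interpolate linearly.
    """
    return alpha * dense + (1.0 - alpha) * sparse

# alpha=0.5 weights both retrieval modes equally
score = hybrid_score(0.9, 0.3, 0.5)
print(score)
```

In practice you would tune alpha per corpus: keyword-heavy queries (IDs, error codes) tend to favor lower alpha (more sparse), while paraphrased natural-language queries favor higher alpha (more dense).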
Requirements
The llama_cloud_services and llama-index-core packages must be installed in your environment:
pip install llama-index-core llama_cloud_services
And the LLAMA_CLOUD_API_KEY must be available as an environment variable:
export LLAMA_CLOUD_API_KEY="..."
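On the Python side you can fail fast with a clear error if the variable is missing, rather than hitting an authentication failure later. A small convenience helper (the helper itself is ours; only the variable name comes from above):

```python
import os

def require_api_key(var: str = "LLAMA_CLOUD_API_KEY") -> str:
    """Return the API key from the environment, failing fast if unset."""
    key = os.environ.get(var)
    if not key:
        raise RuntimeError(f"{var} is not set; export it before creating an index")
    return key
```

Call require_api_key() once at startup and pass the result as api_key=... if you prefer explicit configuration over relying on the environment.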