2024-12-04 Arize @ GitHub
Python: docs.llamaindex.ai
TypeScript: ts.llamaindex.ai
1. Sign up
2. Get on the waitlist!
from llama_index.core import VectorStoreIndex, SimpleDirectoryReader

# Load everything in the local "data" folder and build an in-memory vector index
documents = SimpleDirectoryReader("data").load_data()
index = VectorStoreIndex.from_documents(documents)

# Ask a single question over the indexed documents
query_engine = index.as_query_engine()
response = query_engine.query("What did the author do growing up?")
print(response)
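To run this starter as written, llama-index needs to be installed (pip install llama-index) and an OpenAI API key configured, since the default LLM and embedding model are OpenAI-hosted.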
⚠️ Single-shot
⚠️ No query understanding/planning
⚠️ No tool use
⚠️ No reflection or error correction
⚠️ No memory (stateless)
✅ Multi-turn
✅ Query / task planning layer (see the sketch after this list)
✅ Tool interface for external environment
✅ Reflection
✅ Memory for personalization
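To make the planning and tool layers concrete, here is a minimal sketch using LlamaIndex's SubQuestionQueryEngine, which breaks a question into sub-questions and routes each one to a tool. It reuses the query_engine from the starter snippet above and assumes an OpenAI key is configured; the tool name and description are illustrative:

from llama_index.core.query_engine import SubQuestionQueryEngine
from llama_index.core.tools import QueryEngineTool

# Wrap the existing query engine as a tool the planner can call
essay_tool = QueryEngineTool.from_defaults(
    query_engine=query_engine,  # from the starter snippet above
    name="essays",  # illustrative name
    description="Answers questions about the documents in ./data",
)

# The planner decomposes a complex question into sub-questions, answers each
# one with the tool, then synthesizes a final response
planner = SubQuestionQueryEngine.from_defaults(query_engine_tools=[essay_tool])
response = planner.query(
    "Compare what the author did growing up with what they did later."
)
print(response)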
And then go further:
Agentic strategies
Full agent
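And a minimal sketch of a full agent: a ReActAgent that alternates reasoning steps with tool calls and keeps chat history across turns. This assumes a recent llama-index install with the default OpenAI settings and an API key set; the tool name and description are illustrative:

from llama_index.core import VectorStoreIndex, SimpleDirectoryReader
from llama_index.core.agent import ReActAgent
from llama_index.core.tools import QueryEngineTool

# Same index as before, now exposed to the agent as a tool
index = VectorStoreIndex.from_documents(SimpleDirectoryReader("data").load_data())
rag_tool = QueryEngineTool.from_defaults(
    query_engine=index.as_query_engine(),
    name="essay_search",  # illustrative name
    description="Answers questions about the documents in ./data",
)

# The ReAct loop interleaves reasoning with tool calls, and the agent keeps
# chat history, so follow-up questions can build on earlier turns
agent = ReActAgent.from_tools([rag_tool], verbose=True)

print(agent.chat("What did the author do growing up?"))
print(agent.chat("And what did they do after that?"))  # multi-turn: uses memory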
Follow me on Twitter/X:
@seldo
Please don't add me on LinkedIn.