How to Integrate LlamaIndex with Supabase for Production Pension-Fund AI
Combining LlamaIndex with Supabase gives you a clean production pattern for pension-fund AI: structured document retrieval, retrieval-augmented generation (RAG), and persistent app state in one stack. LlamaIndex turns policy docs, benefit rules, and member communications into queryable context; Supabase stores the vectors, metadata, user sessions, and audit trails.
Prerequisites
- Python 3.10+
- A Supabase project with:
  - `SUPABASE_URL`
  - `SUPABASE_SERVICE_ROLE_KEY` for backend services
- A PostgreSQL database with `pgvector` enabled in Supabase
- Access to your pension fund documents:
  - policy PDFs
  - contribution rules
  - retirement benefit guides
  - FAQ content
- Installed packages:
  - `llama-index`
  - `llama-index-vector-stores-supabase`
  - `supabase`
  - `python-dotenv`
- An LLM provider key configured for your LlamaIndex setup
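For local development, the two Supabase values typically live in a `.env` file next to your code. A sketch of the layout (the angle-bracket values are placeholders; the service-role key bypasses row-level security, so it must never reach a browser client or version control):

```bash
# .env: loaded by python-dotenv; add this file to .gitignore
SUPABASE_URL=https://<project-ref>.supabase.co
SUPABASE_SERVICE_ROLE_KEY=<service-role-key>
```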
Integration Steps
1. Install dependencies and load configuration

Start by installing the libraries your agent will use in production and wiring up environment variables.

```bash
pip install llama-index llama-index-vector-stores-supabase supabase python-dotenv
```

```python
import os

from dotenv import load_dotenv

load_dotenv()

SUPABASE_URL = os.environ["SUPABASE_URL"]
SUPABASE_SERVICE_ROLE_KEY = os.environ["SUPABASE_SERVICE_ROLE_KEY"]
```
2. Enable vector search in Supabase

LlamaIndex needs a place to persist embeddings and metadata. In Supabase, enable the `pgvector` extension so the database can store and search vectors:

```sql
create extension if not exists vector;
```

The LlamaIndex Supabase adapter is built on the `vecs` client and creates and manages its own collection table, so you do not need to define a table or a match function by hand. If you are using OpenAI embeddings, 1536 is the common dimension; match the dimension to the embedding model you actually deploy.
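Dimension mismatches are the most common silent failure at this step. A minimal, hypothetical guard you can run before writing vectors (the names here are illustrative, not part of LlamaIndex or Supabase):

```python
TABLE_DIM = 1536  # must match the dimension configured for the collection

def check_dimension(embedding: list[float], expected: int = TABLE_DIM) -> list[float]:
    """Raise early if an embedding will not fit the vector column."""
    if len(embedding) != expected:
        raise ValueError(
            f"embedding has {len(embedding)} dimensions, store expects {expected}"
        )
    return embedding

check_dimension([0.0] * 1536)  # passes through unchanged
```

Failing fast here turns a confusing "no results" symptom downstream into an obvious error at ingestion time.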
3. Connect LlamaIndex to Supabase as a vector store

This is the core integration point. LlamaIndex writes chunks and embeddings into Supabase through the vector store adapter. Note that `SupabaseVectorStore` talks straight to Postgres over a connection string (available in your Supabase project's database settings), not through the `supabase-py` client; keep that client for app state and audit tables.

```python
import os

from llama_index.core import Settings, StorageContext
from llama_index.core.node_parser import SentenceSplitter
from llama_index.vector_stores.supabase import SupabaseVectorStore

# Direct Postgres connection string, e.g.
# postgresql://postgres:<password>@db.<project-ref>.supabase.co:5432/postgres
DB_CONNECTION = os.environ["SUPABASE_DB_CONNECTION"]

vector_store = SupabaseVectorStore(
    postgres_connection_string=DB_CONNECTION,
    collection_name="pension_docs",
    dimension=1536,  # must match your embedding model
)

storage_context = StorageContext.from_defaults(vector_store=vector_store)

# Chunking strategy: ~800-token chunks with 120 tokens of overlap
Settings.node_parser = SentenceSplitter(chunk_size=800, chunk_overlap=120)
```
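To build intuition for what `chunk_size=800` and `chunk_overlap=120` do, here is a rough character-based sketch. `SentenceSplitter` itself works on tokens and respects sentence boundaries, so this is an approximation of the idea, not its actual algorithm:

```python
def chunk_text(text: str, chunk_size: int = 800, overlap: int = 120) -> list[str]:
    """Naive fixed-window chunking: each chunk repeats the last `overlap`
    characters of the previous one, so context is not cut mid-thought."""
    step = chunk_size - overlap
    return [text[i:i + chunk_size] for i in range(0, len(text), step)]

chunks = chunk_text("x" * 2000)
# 3 chunks; adjacent chunks share a 120-character overlap region
```

The overlap matters for pension documents in particular: eligibility rules often span several sentences, and overlap keeps a rule intact in at least one chunk.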
4. Ingest pension documents into the index

Read your source files, split them into nodes, and push them into Supabase through LlamaIndex:

```python
from llama_index.core import SimpleDirectoryReader, VectorStoreIndex

documents = SimpleDirectoryReader("./pension_policy_docs").load_data()

index = VectorStoreIndex.from_documents(
    documents,
    storage_context=storage_context,
    show_progress=True,
)
# No explicit persist() call is needed: vectors land in Supabase as they
# are embedded, so the index survives process restarts.
```
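For the audit and compliance angle, it pays to attach metadata to each document before ingestion, since it is stored alongside every chunk and comes back on retrieved nodes. A hypothetical helper (the field names are illustrative, not a LlamaIndex or Supabase convention):

```python
from datetime import date

def build_doc_metadata(source_file: str, version: str, review_status: str) -> dict:
    """Metadata dict to attach to a Document before indexing, so every
    retrieved chunk can be traced back to a reviewed source version."""
    return {
        "source_file": source_file,
        "version": version,
        "review_status": review_status,
        "ingested_on": date.today().isoformat(),
    }

meta = build_doc_metadata("retirement_benefit_guide.pdf", "2024-03", "approved")
```

You would pass a dict like this as the `metadata` argument when constructing each `Document`.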
5. Query the indexed data from your AI agent

Build a retriever or query engine that your agent can call when answering member or admin questions:

```python
query_engine = index.as_query_engine(similarity_top_k=3)

response = query_engine.query(
    "What is the retirement eligibility rule for members with 20 years of service?"
)
print(response)
```
Testing the Integration
Run a direct retrieval test against a real pension-policy question. You want to verify three things: ingestion worked, similarity search returns relevant chunks, and the answer is grounded in your source docs.
```python
# Reuse the index built during ingestion
query_engine = index.as_query_engine(similarity_top_k=2)

test_question = "How are early retirement benefits calculated?"
result = query_engine.query(test_question)

print("QUESTION:", test_question)
print("ANSWER:", result)
```
Expected output should look like this:

```text
QUESTION: How are early retirement benefits calculated?
ANSWER: Early retirement benefits are calculated based on accrued service years, final average salary, and an actuarial reduction applied before normal retirement age...
```
If you get an empty or generic answer, check these first:

- Your documents were actually loaded from disk.
- Embeddings were generated with the correct dimension.
- The collection name in the vector store config matches the one used during ingestion.
- Your query is running against the same Supabase project you ingested into.
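When retrieval comes back empty, it can help to reproduce the similarity math locally on two embeddings you pulled from the database. pgvector's cosine distance is `1 - cosine_similarity`; a small pure-Python version for sanity checks:

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Vectors pointing the same way score 1.0; orthogonal vectors score 0.0
print(cosine_similarity([1.0, 0.0], [2.0, 0.0]))  # 1.0
print(cosine_similarity([1.0, 0.0], [0.0, 1.0]))  # 0.0
```

If a query embedding and a chunk embedding you expect to match score near zero here, the problem is the embeddings themselves (wrong model or mixed models), not the retrieval configuration.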
Real-World Use Cases
- **Member self-service assistant:** answer questions about contributions, vesting, retirement age, and payout rules using only approved pension documents.
- **Operations copilot:** help internal staff find policy clauses fast during case handling, without manually searching PDFs or shared drives.
- **Compliance-aware RAG system:** store document metadata in Supabase so every answer can include source references, versioning info, and review status for auditability.
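To make the compliance-aware pattern concrete: every answer the system gives can be written back to Supabase as an audit row. A sketch of the record shape (the `qa_audit_log` table and its fields are hypothetical; adapt them to your schema):

```python
from datetime import datetime, timezone

def build_audit_row(question: str, answer: str, sources: list[str], user_id: str) -> dict:
    """Row for a hypothetical qa_audit_log table: who asked what, what the
    system answered, and which source documents grounded the answer."""
    return {
        "user_id": user_id,
        "question": question,
        "answer": answer,
        "sources": sources,  # e.g. filenames from retrieved node metadata
        "answered_at": datetime.now(timezone.utc).isoformat(),
    }

row = build_audit_row(
    "How are early retirement benefits calculated?",
    "Benefits are reduced actuarially before normal retirement age...",
    ["retirement_benefit_guide.pdf"],
    "member-1042",
)
# In production you would insert it with the supabase-py client:
# supabase.table("qa_audit_log").insert(row).execute()
```

Writing this on every query is what lets a regulator or internal reviewer reconstruct exactly which document versions backed a given answer.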
Keep learning

- The complete AI Agents Roadmap: my full 8-step breakdown
- Free: The AI Agent Starter Kit, a PDF checklist plus starter code
- Work with me: I build AI for banks and insurance companies
By Cyprian Aarons, AI Consultant at Topiax.
Want the complete 8-step roadmap?
Grab the free AI Agent Starter Kit — architecture templates, compliance checklists, and a 7-email deep-dive course.
Get the Starter Kit