RAG & Semantic Search
No Pinecone. No LangChain. No Chroma. Mark a field as semantic text, save an entity, and FLIN generates embeddings automatically. Search by meaning with one keyword. Build RAG pipelines in a single page.
How Semantic Search Works
One keyword turns any text field into an AI-searchable vector index. No configuration, no external database, no embedding API calls.
1. Mark the Field
Add semantic before any text field. On every save, FLIN generates vector embeddings and indexes them automatically.
2. Search by Meaning
search "query" in Entity finds records by semantic similarity — not keyword matching. "How do plants make food?" finds "Photosynthesis Explained".
3. Build RAG
Search finds relevant documents, injects them as context into a system prompt, and streams an AI answer. The full RAG pipeline in native FLIN code.
From Data to AI Search in 10 Lines
// 1. Define an entity with semantic text
entity Document {
  title: text @required
  content: semantic text  // ← Embeddings auto-generated on save
  category: text = "general"
}

// 2. Save documents — embeddings generated automatically
doc = Document { title: "Photosynthesis", content: "Plants convert sunlight..." }
save doc

// 3. Search by meaning — not keywords
results = search "how do plants make food from sunlight" in Document

// 4. Or use keyword search for exact matches
exact = keyword_search("photosynthesis", "Document", "content", 10)

// 5. Use results in templates
{for doc in results}
  <div>{doc.title} — {doc.category}</div>
{/for}
Semantic vs. Keyword Search
Keyword search finds exact words. Semantic search finds meaning. FLIN gives you both — side by side, in the same query language.
// Finds by MEANING
search "baking desserts at home" in Document

// Matches: "Chocolate Chip Cookie Recipe" even though
// "baking" and "desserts" don't appear in the content
//
// Uses vector embeddings
// Cosine similarity ranking
// Works across languages
// Finds by EXACT WORDS
keyword_search(
  "error E-4021",
  "Document",
  "content",
  10
)

// Matches: documents containing
// the exact string "E-4021"
//
// Uses BM25 ranking
// Multilingual stop words
// Best for codes & identifiers
Semantic Wins
"Tips for better rest" finds "Healthy Sleep Habits". "Ancient kingdoms in Africa" finds "History of West Africa". Meaning over words.
Keyword Wins
"Error E-4021" finds the exact error code. "PostgreSQL max_connections" finds the config parameter. Precision over meaning.
Use Both
FLIN automatically falls back to keyword search when the vector store is unavailable. Best of both worlds with zero configuration.
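The two query forms can also sit side by side in the same handler. A minimal sketch using only the constructs shown on this page (the query strings are illustrative):

// Broad, meaning-based retrieval
broad = search "connection limit tuning" in Document

// Precise match for an exact identifier
exact = keyword_search("max_connections", "Document", "content", 10)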
5 Embedding Providers. One Config Line.
Choose your embedding provider in flin.config. Switch between local and cloud without changing any application code.
Local (FastEmbed)
Free, offline, 384-dim. Model downloads automatically (~100 MB).
embeddings: "local"

OpenAI
text-embedding-3-small, 1536-dim. Best quality for English.
embeddings: "openai"

OpenAI Vector Store
Hosted vector DB. Auto-chunking, persistent, scalable to millions.
embeddings: "openai-vectorstore"

Voyage AI
voyage-3, 1024-dim. Optimized for retrieval quality.
embeddings: "voyage"

Cohere
embed-multilingual-v3.0, 1024-dim. Best for multilingual.
embeddings: "cohere"

# Local embeddings — free, offline, zero API keys
ai {
  embeddings: "local"
}

# OpenAI embeddings — best quality
ai {
  embeddings: "openai"
  embedding_model: "text-embedding-3-small"
}

# OpenAI Vector Store — hosted, persistent, scalable
ai {
  embeddings: "openai-vectorstore"
}

# Cohere — best for multilingual content
ai {
  embeddings: "cohere"
  embedding_model: "embed-multilingual-v3.0"
}
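Voyage is the one provider the config examples above don't show; assuming it follows the same pattern as the others (this block is an assumption, not taken from the docs):

# Voyage AI — retrieval-optimized embeddings (assumed pattern)
ai {
  embeddings: "voyage"
}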
RAG Without the Plumbing
In other frameworks, RAG means stitching together a vector database, an embedding API, a chunking strategy, and an LLM client. In FLIN, it is a page.
// 1. User asks a question
question = "What is the vacation policy?"

// 2. Find the 5 most relevant documents
docs = search question in Document by content limit 5

// 3. Build context from search results
fn buildContext() {
  context = ""
  for doc in docs {
    context = context + "--- " + doc.title + " ---\n" + doc.content + "\n\n"
  }
  return context
}

// 4. Ask the AI with document context
answer = ask_ai(question, [
  "provider": "claude",
  "system_prompt": "Answer based on these documents:\n" + buildContext()
])
Search
Semantic search finds the 5 most relevant documents from your knowledge base. Meaning-based, not keyword-based.
Context
Found documents are injected into the system prompt as context. The LLM answers based on your data, not hallucinations.
Answer
ask_ai() sends the question with context to any of 8 LLM providers. Stream the response or get it in full.
The Full Document Pipeline
Upload a PDF. FLIN extracts the text, generates embeddings, indexes them, and makes the content searchable — all on save.
// Upload and parse a document
route POST /upload {
  validate {
    document: file @required @document
    title: text @required
  }

  // Extract text from PDF, DOCX, CSV, etc.
  content = document_extract(body.document)

  // Save — embeddings generated automatically
  doc = Document { title: body.title, content: content }
  save doc

  response.created(doc)
}

// Search across all uploaded documents
results = search "quarterly revenue projections" in Document
Upload
Multipart file upload with validators: @max_size("50MB"), @extension(".pdf", ".docx"), @document
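The validators listed above can be stacked in the route's validate block; a hedged sketch extending the upload example (the exact ordering of stacked validators is an assumption):

validate {
  // restrict to documents under 50 MB with allowed extensions
  document: file @required @document @max_size("50MB") @extension(".pdf", ".docx")
  title: text @required
}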
Parse
document_extract() handles 9 formats: PDF, DOCX, HTML, CSV, XLSX, JSON, YAML, RTF, XML
Embed
Vector embeddings generated on save. 5 providers: local (free), OpenAI, Voyage, Cohere, OpenAI Vector Store
Search
Cosine similarity ranking with automatic BM25 keyword fallback. Results sorted by relevance score
Multilingual by Default
Keyword search uses 220+ stop words across 5 languages. Semantic search with Cohere or FastEmbed works across languages out of the box.
5 Languages
English, French, Spanish, Portuguese, and German stop words built into the keyword search engine. No configuration needed.
Cross-Lingual Search
With multilingual embedding models (FastEmbed, Cohere), search in English across French documents — or any combination.
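As a sketch of cross-lingual retrieval (the document content is illustrative, and it assumes a multilingual embedding provider such as Cohere or FastEmbed is configured):

// Save a French document
doc = Document { title: "Guide des congés", content: "La politique de congés payés de l'entreprise..." }
save doc

// Query in English — matched by meaning, not shared words
results = search "paid vacation policy" in Document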
Smart Filtering
Stop words are filtered before embedding and keyword indexing. "Le", "les", "des", "un" in French; "der", "die", "das" in German — all handled.
Build AI-Powered Search Today
Add semantic text to any field. Upload documents. Search by meaning. Build RAG pipelines. Ship AI features in minutes, not weeks.
curl -fsSL https://flin.sh | bash