# Search — Quickstart
Go from "I have a table" to "I have semantic search" in four moves. No queue, no embedding service, no pipeline — the lens config is the whole story.
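The steps below assume a lens config roughly like the following. The exact export shape and option names here are illustrative assumptions, not semilayer's documented API; only the `searchable` flag and the `{ weight: N }` form are described in this page.

```typescript
// semilayer.config.ts — illustrative shape; the exported structure and the
// "source" key are assumptions, not semilayer's documented API.
export default {
  lenses: {
    articles: {
      source: "postgres.articles", // hypothetical bridge/source identifier
      fields: {
        title: { searchable: { weight: 2 } }, // embedded, double ranking weight
        body: { searchable: true },           // embedded, default weight
        tags: {},                             // stored as metadata, not vectorized
      },
    },
  },
};
```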
Note the `searchable` flag. `title` and `body` opt in to embedding — `title`
with a weight of 2 contributes twice as strongly to the ranking vector.
`tags` is stored as metadata (you get it back in results) but isn't
vectorized. Forget `searchable` and the field is invisible to search.
## Four moves
### 1. Declare the lens
Add the block above to your `semilayer.config.ts`. `searchable: true` (or
`{ weight: N }`) opts a field into embedding; omitting the flag leaves the
field stored-but-not-vectorized. Only fields with `searchable` participate
in ranking.
### 2. Push
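The push is a single CLI call. The subcommand name below is an assumption inferred from this section's title; check `semilayer --help` for the exact form.

```shell
# Validate the config, provision the vector partition,
# and create the lens in a paused state (hypothetical subcommand).
semilayer push
```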
This validates the config, provisions the vector partition, and creates a paused lens. Nothing is embedded yet — that happens when you resume ingest.
### 3. Ingest
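Resuming ingest might look like this. The `ingest --resume` invocation is an assumption; only the `status` command appears verbatim in this page.

```shell
# Resume the paused lens so the worker starts embedding (hypothetical flags).
semilayer ingest --lens articles --resume

# Watch progress (this command is from the text).
semilayer status --lens articles
```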
The worker streams rows from your source via the configured bridge, embeds
the searchable fields, and writes vectors to the index. Watch progress with
`semilayer status --lens articles`.
### 4. Generate Beam and call it
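A plausible generate invocation (the subcommand and flags are assumptions; the output path matches the text below):

```shell
# Emit the typed Beam client to ./beam.ts (hypothetical subcommand).
semilayer beam generate --lens articles --out ./beam.ts
```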
This writes a typed client to `./beam.ts`. Then in your app:
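For example, a call could look like the sketch below. The `beam.search` name and its options are assumptions; the `SearchResult<ArticlesMetadata>` type is from the text.

```typescript
// Hypothetical shape of the generated client; adjust to what beam.ts exports.
import { beam, type ArticlesMetadata, type SearchResult } from "./beam";

const results: SearchResult<ArticlesMetadata>[] = await beam.search("articles", {
  query: "vector databases for small teams",
  limit: 10,
});

for (const r of results) {
  // Metadata is typed: tags come back in results even though
  // they were never vectorized.
  console.log(r.score, r.metadata.tags);
}
```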
Done. `results` is typed `SearchResult<ArticlesMetadata>[]` — fully inferred
from your config. Ranking is cosine similarity across all searchable fields,
weighted by the `weight` you declared.
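To make the ranking concrete, here is a toy sketch of weight-scaled cosine ranking. It is an illustration of the idea only, not semilayer's actual implementation.

```typescript
// Cosine similarity between two vectors of equal dimension.
function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

// Combine per-field embeddings into one ranking vector,
// scaling each field by its declared weight.
function rankingVector(fields: { vec: number[]; weight: number }[]): number[] {
  const dim = fields[0].vec.length;
  const out = new Array<number>(dim).fill(0);
  for (const { vec, weight } of fields) {
    for (let i = 0; i < dim; i++) out[i] += weight * vec[i];
  }
  return out;
}

// A title field (weight 2) pulls the combined vector twice as hard as body.
const doc = rankingVector([
  { vec: [1, 0], weight: 2 }, // title embedding
  { vec: [0, 1], weight: 1 }, // body embedding
]); // → [2, 1]
```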
## What you skipped
If you've built semantic search by hand, here's what the four moves above replaced:
- An embedding service + retry/backoff/idempotency
- A vector store migration + index tuning
- A changefeed listener (for incremental sync)
- A ranking endpoint with auth, rate limits, and pagination
- A typed client
Next up: Fields & weights — the `searchable` flag in depth.