
Generate Embeddings

Generate text embeddings using Edge Functions.


This guide will walk you through how to generate high-quality text embeddings in Edge Functions using the built-in AI inference API, so no external API is required.

Build the Edge Function

Let's build an Edge Function that accepts an input string and generates an embedding for it. Edge Functions are server-side TypeScript HTTP endpoints that run on demand, close to your users.

1

Set up Supabase locally

Make sure you have the latest version of the Supabase CLI installed.

Initialize Supabase in the root directory of your app and start your local stack.


supabase init
supabase start

2

Create Edge Function

Create an Edge Function that we will use to generate embeddings. We'll call this embed (you can name this anything you like).

This will create a new TypeScript file called index.ts under ./supabase/functions/embed.


supabase functions new embed

3

Set up inference session

Let's create a new inference session that lives for the lifetime of this function. Multiple requests can reuse the same inference session.

Currently, only the gte-small (https://huggingface.co/Supabase/gte-small) text embedding model is supported in Supabase's Edge Runtime.

./supabase/functions/embed/index.ts

const session = new Supabase.ai.Session('gte-small');

4

Implement request handler

Modify our request handler to accept an input string from the POST request JSON body.

Then generate the embedding by calling session.run(input).

./supabase/functions/embed/index.ts

Deno.serve(async (req) => {
  // Extract input string from JSON body
  const { input } = await req.json();

  // Generate the embedding from the user input
  const embedding = await session.run(input, {
    mean_pool: true,
    normalize: true,
  });

  // Return the embedding
  return new Response(
    JSON.stringify({ embedding }),
    { headers: { 'Content-Type': 'application/json' } }
  );
});

Note the two options we pass to session.run():

  • mean_pool: The first option sets pooling to mean. Pooling refers to how token-level embedding representations are compressed into a single sentence embedding that reflects the meaning of the entire sentence. Mean (average) pooling is the most common type of pooling for sentence embeddings.
  • normalize: The second option tells the inference session to normalize the embedding vector so that it can be used with distance measures like dot product. A normalized vector has a length (magnitude) of 1 - also referred to as a unit vector. A vector is normalized by dividing each element by its length (magnitude), which keeps its direction but changes its length to 1 (see the sketch after this list).
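To make these two options concrete, here is a minimal sketch of what mean pooling and normalization do conceptually. The meanPool and normalize helpers and the tokenEmbeddings input are hypothetical and for illustration only; the Edge Runtime performs both steps for you when you pass the options above.

// Sketch only: not the Edge Runtime's implementation.
// `tokenEmbeddings` is a hypothetical array of per-token vectors from the model.
function meanPool(tokenEmbeddings: number[][]): number[] {
  const dims = tokenEmbeddings[0].length;
  const pooled = new Array(dims).fill(0);
  for (const token of tokenEmbeddings) {
    for (let i = 0; i < dims; i++) pooled[i] += token[i];
  }
  // Average each dimension across all tokens to get one sentence embedding
  return pooled.map((sum) => sum / tokenEmbeddings.length);
}

function normalize(vector: number[]): number[] {
  // Divide each element by the vector's magnitude so the result has length 1
  const magnitude = Math.sqrt(vector.reduce((sum, v) => sum + v * v, 0));
  return vector.map((v) => v / magnitude);
}
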
5

Test it!

To test the Edge Function, first start a local functions server.


supabase functions serve

Then, in a new shell, send an HTTP request using cURL and pass your input in the JSON body.


curl --request POST 'http://localhost:54321/functions/v1/embed' \
  --header 'Authorization: Bearer ANON_KEY' \
  --header 'Content-Type: application/json' \
  --data '{ "input": "hello world" }'

Be sure to replace ANON_KEY with your project's anonymous key. You can get this key by running supabase status.
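
You can also invoke the function from application code using supabase-js. The snippet below is a minimal sketch assuming a locally running stack; substitute your own project URL and anon key.

import { createClient } from '@supabase/supabase-js';

// Placeholder values: use your project URL and anon key (see supabase status)
const supabase = createClient('http://localhost:54321', 'ANON_KEY');

const { data, error } = await supabase.functions.invoke('embed', {
  body: { input: 'hello world' },
});

if (error) throw error;
console.log(data.embedding.length); // gte-small produces 384-dimensional embeddings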

Next steps