Comments (6)
Hey @avalanche-tm, that's strange! Can you confirm how you are executing this function: is it through the Supabase JS client rpc() method (i.e., the REST API under the hood), or a direct SQL connection?
Edit: Apologies, I just realized this is the private.handle_storage_update() trigger function. To confirm, are you receiving this error after it is executed by the trigger, or are you executing it directly?
from chatgpt-your-files.
Hey @gregnr, after it's executed by the trigger. I haven't tried to execute it directly. Also, I have simplified the whole function for my use case: I generate an embedding for a short description instead of documents and store the embedding in the same table. Here is the code:
create table public.applications (
  id bigint generated by default as identity,
  name text not null,
  description text null,
  create_embeddings boolean not null default false,
  description_embeddings extensions.vector null
);
create or replace function private.handle_application_description_embeddings()
returns trigger
as $$
declare
  application_id bigint := NEW.id;  -- bigint to match the id column
  result int;
begin
  -- Generate embeddings on insert, when the description changes,
  -- or when create_embeddings is switched on and no embedding exists yet
  if (TG_OP = 'INSERT' and NEW.create_embeddings is true)
    or (TG_OP = 'UPDATE' and OLD.description is distinct from NEW.description and NEW.create_embeddings is true)
    or (TG_OP = 'UPDATE' and OLD.create_embeddings is false and NEW.create_embeddings is true and OLD.description_embeddings is null)
  then
    select
      net.http_post(
        url := supabase_url() || '/functions/v1/embedding-application-description',
        headers := jsonb_build_object(
          'Content-Type', 'application/json',
          'Authorization', current_setting('request.headers')::json->>'authorization'
        ),
        body := jsonb_build_object(
          'applicationId', application_id,
          'description', NEW.description
        )
      )
    into result;
  end if;
  return null;  -- return value is ignored for AFTER triggers
end;
$$ language plpgsql;
create trigger on_application_description_upsert
after insert or update on public.applications
for each row
execute procedure private.handle_application_description_embeddings();
So after I lost a couple of hours not getting anything out of current_setting('request.headers'), I decided to take a different route. I added the anon key as a Supabase secret and created a function, just like supabase_url(), to fetch my anon key. Now I have this and it works:
headers := jsonb_build_object(
'Content-Type', 'application/json',
'Authorization', 'Bearer ' || supabase_anon_key()
),
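For reference, a minimal sketch of what such a helper could look like, assuming the anon key was stored in Supabase Vault under the (hypothetical) secret name 'supabase_anon_key', mirroring the pattern used by the tutorial's supabase_url() function:

```sql
-- Hypothetical helper; assumes the anon key was saved to Supabase Vault
-- under the name 'supabase_anon_key'
create or replace function supabase_anon_key()
returns text
language plpgsql
security definer
as $$
declare
  secret_value text;
begin
  -- vault.decrypted_secrets exposes decrypted Vault secrets
  select decrypted_secret into secret_value
  from vault.decrypted_secrets
  where name = 'supabase_anon_key';
  return secret_value;
end;
$$;
```

security definer is needed so the function can read the vault schema even when invoked from a trigger running as a less-privileged role.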
I would like to hear your thoughts about this, is it a good idea? It makes complete sense that request.headers is present when you call the function with rpc(), but in your example it's also invoked by a trigger, so I'm a bit confused.
Hey @avalanche-tm, thanks for clarifying. Yes, adding the anon key as a DB secret is definitely a valid approach - the only downside is that you will lose information about which user is making the request, in case you have any RLS (authorization) logic downstream. This only matters, though, if your app has a concept of users and you are inserting/updating public.applications records through the Supabase REST API as that user (i.e., via the Supabase client library).
Even though the function is invoked via a trigger, it should still have access to the same current_setting('request.headers') from the original session that inserted/updated the records. The fact that you are not seeing data in current_setting('request.headers') leads me to think that you might not be inserting/updating data via the REST API (e.g. using a direct SQL connection instead). Is that correct?
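One quick way to check this is current_setting's two-argument form: the second argument (missing_ok) makes it return NULL instead of raising an error when the setting is absent, so it is safe to run from any session:

```sql
-- Returns the request headers when the session came through the REST API
-- (PostgREST); returns NULL on a direct SQL connection or from the dashboard
select current_setting('request.headers', true);
```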
Hey @gregnr, you are right, I was inserting rows from the Supabase UI and wondering why it was not working :) I feel silly now.
I have one more question. The pg_net documentation says "Intended to handle at most 200 requests per second. Increasing the rate can introduce instability", so I'm wondering if this whole approach of triggering functions is a good idea. I can easily imagine a case where multiple clients insert multiple documents at the same time, which would send lots of requests. Is this something I should worry about in production?
No worries, happens to the best of us!
The 200 rps limit is there to cap CPU usage (same as any webhook-style service). If you exceed 200 rps at a given moment, the remaining requests will just be queued until the next batch (not dropped). So if you had a burst of clients inserting documents in a single moment, some requests would just take a bit longer to go out.
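If you want to confirm that queued requests eventually complete, pg_net collects responses asynchronously; a sketch, assuming a recent pg_net version where responses land in the net._http_response table:

```sql
-- net.http_post returns a request id immediately; the call itself is async
select net.http_post(
  url := 'https://example.com',
  body := '{}'::jsonb
) as request_id;

-- Later, look up the collected response for that id
-- (replace 1 with the request_id returned above)
select status_code, content, error_msg
from net._http_response
where id = 1;
```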
Hey @avalanche-tm, did this help answer your question? I'll close this issue for now, but feel free to comment or create a new issue if you still have any further questions.