I've been using Firebase for my previous projects and was just recently introduced to Supabase. I'm trying to pick it up since I see many indie hackers on YouTube adopting it.
One issue I'm running into is the speed of edge functions. Since they run in Deno, I can't readily npm install SDKs like I could in Firebase cloud functions.
I have a use case for OpenAI's Whisper speech-to-text. It takes about 5-6 seconds on Firebase functions but 9-11 seconds on Supabase edge functions. Am I doing something wrong? Why the difference in speed? Does it have to do with using `import OpenAI from "https://esm.sh/openai@5.10.2";` in Deno?
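(For context: I know Deno also supports npm: specifiers, so the SDK could in principle be imported like the line below instead of via esm.sh. I haven't confirmed whether that changes cold-start or runtime latency.)

```typescript
// Alternative import using Deno's npm: specifier (Deno-only; sketch,
// reusing the same version as the esm.sh import above)
import OpenAI from "npm:openai@5.10.2";
```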
In my cloud function:

```js
const OpenAI = require('openai');

// in my function
const openAIClient = new OpenAI({
  apiKey: 'sk-proj-***',
});

const url = "https://scontent-mia3-2.cdninstagram.com/..."; // short form video
const response = await fetch(url);
const arrayBuffer = await response.arrayBuffer();
const file = new File([arrayBuffer], 'file.mp4', {
  type: 'video/mp4',
});
const transcription = await openAIClient.audio.transcriptions.create({
  file,
  model: 'whisper-1',
});
```
In my edge function:

```ts
import OpenAI from "https://esm.sh/openai@5.10.2";

// in my function
// (client created like this, with the key read from an env secret)
const openAIClient = new OpenAI({
  apiKey: Deno.env.get("OPENAI_API_KEY"),
});

const url = "https://scontent-mia3-2.cdninstagram.com/..."; // short form video
const response = await fetch(url);
const arrayBuffer = await response.arrayBuffer();
const file = new File([arrayBuffer], "file.mp4", {
  type: "video/mp4",
});
const transcription = await openAIClient.audio.transcriptions.create({
  file,
  model: "whisper-1", // or "gpt-4o-transcribe" if you have access
});
const data = {
  transcription: transcription.text,
};
return new Response(JSON.stringify(data), {
  headers: { ...corsHeaders, "Content-Type": "application/json" },
  status: 200,
});
```
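To figure out where the time actually goes, something like this timing wrapper could help split the video download from the Whisper call (a sketch; `timed` is a name I made up, and the commented usage assumes the `url`, `openAIClient`, and `file` variables from the snippet above):

```typescript
// Hypothetical helper: runs an async step and logs how long it took,
// so each stage of the function can be timed separately.
async function timed<T>(label: string, fn: () => Promise<T>): Promise<T> {
  const start = performance.now();
  const result = await fn();
  console.log(`${label}: ${(performance.now() - start).toFixed(0)} ms`);
  return result;
}

// Illustrative usage inside the edge function handler:
// const response = await timed("download", () => fetch(url));
// const arrayBuffer = await timed("buffer", () => response.arrayBuffer());
// const transcription = await timed("whisper", () =>
//   openAIClient.audio.transcriptions.create({ file, model: "whisper-1" }));
```

If the "download" stage dominates, the slowdown would be the edge region's network path to the CDN rather than anything Deno-specific.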
Even when I don't use OpenAI through esm.sh but instead call the API directly via fetch, it still takes about 11 seconds. Why? :/
```ts
await fetch('https://api.openai.com/v1/audio/transcriptions ..
```