Tesla Response Stream is not compatible with Tesla Multipart upload #648

@elijahkim

Description

I've been trying to stream an upload from GCP Cloud Storage to OpenAI using Tesla, and I'm running into issues that I think have to do with Tesla. The code looks similar to this:

def get_stream(bucket, object) do
  {:ok, token} = Goth.Token.for_scope("https://www.googleapis.com/auth/cloud-platform")

  # body_as: :stream asks the Mint adapter to return the response body
  # lazily instead of buffering it in memory
  Tesla.client(
    [{Tesla.Middleware.Headers, [{"authorization", "Bearer #{token.token}"}]}],
    {Tesla.Adapter.Mint, [body_as: :stream]}
  )
  |> GoogleApi.Storage.V1.Api.Objects.storage_objects_get(
    bucket,
    object,
    [alt: "media"]
  )
end

def upload(stream) do
  body =
    Multipart.new()
    |> Multipart.add_file_content(stream, "file.mp4")
    |> Multipart.add_field("model", "whisper-1")
    |> Multipart.add_field("response_format", "json")

  # post/2 comes from `use Tesla` in this module
  post("v1/audio/transcriptions", body)
end

{:ok, resp} = get_stream("foo", "bar")

upload(resp.body)

The GCP response body comes back as a 2-arity function (function/2). If I pass that directly to upload/1, I get an error from https://github.com/elixir-tesla/tesla/blob/master/lib/tesla/multipart.ex#L189. I also tried turning the function into a stream with Stream.map(env.body, &Function.identity/1), but that just seems to hang before the upload starts. The last thing I tried was adding an is_function clause to the guard that checks whether the part is a valid type, but that hangs as well.
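
For reference, here's the temp-file fallback I've been sketching while debugging. It sidesteps the streaming question by buffering the GCP body to disk and handing Tesla.Multipart a path via Multipart.add_file/3, which it does support. upload_via_tempfile is just an illustrative name, and this assumes the fun/2 body is enumerable (it should be, since Elixir implements Enumerable for 2-arity reducer functions):

def upload_via_tempfile(stream) do
  path = Path.join(System.tmp_dir!(), "file.mp4")

  # Materialize the response body to disk; File.stream!/1 is a Collectable
  stream
  |> Stream.into(File.stream!(path))
  |> Stream.run()

  body =
    Multipart.new()
    |> Multipart.add_file(path, name: "file", filename: "file.mp4")
    |> Multipart.add_field("model", "whisper-1")
    |> Multipart.add_field("response_format", "json")

  post("v1/audio/transcriptions", body)
end

That said, if Stream.map/2 over the same body hangs, I'd expect this to hang at Stream.run/1 too, which is why I suspect the adapter stream itself rather than just the multipart guard.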

I tried spelunking a bit deeper into Tesla but can't seem to figure out exactly how it works, or else I would've opened a PR. Any help would be appreciated.

Thanks!
