Conversation

@feloy (Contributor) commented May 14, 2025

What does this PR do?

Display runtime for running inference servers

Screenshot / video of UI

[Screenshot: display-inference-server-runtime]

What issues does this PR fix or reference?

Part of #2614

How to test this PR?

Start inference servers with different backends (llama.cpp / whisper.cpp) and check that the runtime is correctly displayed in the list.
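
Beyond the manual check above, the same behavior could be covered by a unit test. A minimal sketch using Vitest with @testing-library/svelte; the component file name ColumnRuntime.svelte, the stubbed server object, and the expected label text are all assumptions, not taken from this PR:

import { render, screen } from '@testing-library/svelte';
import { expect, test } from 'vitest';
import type { InferenceServer } from '@shared/models/IInference';
import ColumnRuntime from './ColumnRuntime.svelte';

test('runtime badge shows the server backend', () => {
  // Only the field the column reads is filled in; everything else is stubbed.
  const server = { type: 'llama-cpp' } as unknown as InferenceServer;
  render(ColumnRuntime, { props: { object: server } });
  // Assumes InferenceTypeLabel maps 'llama-cpp' to a label containing "llama".
  expect(screen.getByText(/llama/i)).toBeDefined();
});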

@feloy feloy requested review from benoitf, jeffmaury and a team as code owners May 14, 2025 13:07
@feloy feloy requested review from dgolovin and deboer-tim May 14, 2025 13:07
Review thread on the new column component's script:

import { InferenceTypeLabel, type InferenceServer } from '@shared/models/IInference';
import Badge from '../../Badge.svelte';

export let object: InferenceServer;
Contributor:

Can't we use Props here?

@feloy (Contributor, Author) replied May 14, 2025:

It is not used for any of the other columns; I would prefer to keep this one consistent with them.
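
For context, the question presumably refers to Svelte 5's $props() rune. A sketch of what the rune-based version of the reviewed hunk could look like, assuming Svelte 5 is available; only the imports and the object prop come from the diff above, the markup is hypothetical:

<script lang="ts">
import { InferenceTypeLabel, type InferenceServer } from '@shared/models/IInference';
import Badge from '../../Badge.svelte';

// Svelte 5 runes style: a typed, destructured prop instead of `export let`.
let { object }: { object: InferenceServer } = $props();
</script>

<!-- Hypothetical markup: assumes InferenceTypeLabel is a record keyed by the
     server type and that Badge accepts a `content` prop; neither is shown in
     this hunk. -->
<Badge content={InferenceTypeLabel[object.type]} />

Keeping export let, as the author chose to, is also defensible: mixing the two prop styles across columns of the same table would be the real inconsistency.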

@feloy feloy force-pushed the feat-2614/inference-server-backend-main branch from 6af6001 to b3e02a2 Compare May 14, 2025 14:19
@feloy feloy requested a review from gastoner May 14, 2025 14:19
@feloy feloy force-pushed the feat-2614/inference-server-backend-main branch from b3e02a2 to 97447fc Compare May 14, 2025 14:38
@gastoner (Contributor) left a comment


LGTM, tested on Linux

@feloy feloy merged commit 06aafd3 into containers:main May 14, 2025
7 checks passed