Add Codespace to run the Ollama samples #1


Merged
merged 6 commits into from
Nov 5, 2024
File renamed without changes.
52 changes: 52 additions & 0 deletions .devcontainer/Ollama/devcontainer.json
@@ -0,0 +1,52 @@
// For format details, see https://aka.ms/devcontainer.json. For config options, see the
// README at: https://github.com/devcontainers/templates/tree/main/src/dotnet
{
"name": "C# (.NET) - Ollama",
"build": {
"dockerfile": "dockerfile"
},
"customizations": {
"vscode": {
"extensions": [
"GitHub.copilot",
"ms-dotnettools.csdevkit"
]
},
"codespaces": {
"openFiles": [
"./src/videos/firetruck.mp4"
]
}
},

// Features to add to the dev container. More info: https://containers.dev/features.
// "features": {},
"features": {
"ghcr.io/devcontainers/features/docker-in-docker:2": {},
"ghcr.io/devcontainers/features/common-utils:2": {},
"ghcr.io/prulloac/devcontainer-features/ollama:latest": {},
"sshd": "latest"
},

// Use 'forwardPorts' to make a list of ports inside the container available locally.
"forwardPorts": [17057],
"portsAttributes": {
"17057": {
"protocol": "http"
}
},

// Use 'postCreateCommand' to run commands after the container is created.
"postCreateCommand": "sudo dotnet workload update && sudo dotnet workload install aspire && sudo dotnet workload list",
"postStartCommand": "",

// Uncomment to connect as root instead. More info: https://aka.ms/dev-containers-non-root.
"remoteUser": "vscode",
"hostRequirements": {
"memory": "16gb",
"cpus": 4
},
"mounts": [
"source=${localWorkspaceFolder},target=/workspace,type=bind,consistency=cached"
]
}
26 changes: 26 additions & 0 deletions .devcontainer/Ollama/dockerfile
@@ -0,0 +1,26 @@
# Use the official image as a parent image
# FROM mcr.microsoft.com/dotnet/sdk:8.0
FROM ghcr.io/shimat/opencvsharp/ubuntu22-dotnet6-opencv4.7.0:20230114

# Set the working directory
WORKDIR /workspace

# Copy the current directory contents into the container at /workspace
COPY . /workspace

# Install cURL and the .NET 8 SDK in a single layer
RUN apt-get update && \
    apt-get install -y curl dotnet-sdk-8.0

# Install the .NET workloads
RUN dotnet workload update
RUN dotnet workload install aspire

# Expose ports
EXPOSE 5000
EXPOSE 5001

# Define the entry point for the container
CMD ["dotnet", "run"]
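
Outside of a Codespace, the image above could also be built and run locally. A minimal sketch — the image tag `ollama-samples-dev` is an assumption, and the `docker` commands are echoed rather than executed so the sketch can be dry-run without Docker installed:

```shell
# Hypothetical image tag; adjust as needed.
IMAGE=ollama-samples-dev

# Build from the repo root against the Ollama dockerfile, then run it
# with the EXPOSEd sample ports published.
# Remove the leading 'echo' on each line to actually execute the commands.
echo docker build -f .devcontainer/Ollama/dockerfile -t "$IMAGE" .
echo docker run --rm -p 5000:5000 -p 5001:5001 "$IMAGE"
```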
10 changes: 7 additions & 3 deletions README.md
@@ -61,13 +61,14 @@ COMING SOON!
The sample Console projects are in the folder [./src/Labs/](./src/Labs/).
The full Aspire + Blazor solution is in the folder [./srcBlazor/](./srcBlazor/).

Currently there are labs for:
Currently there are samples for the following technologies:

- [OpenAI .NET SDK](https://devblogs.microsoft.com/dotnet/announcing-the-stable-release-of-the-official-open-ai-library-for-dotnet/)
- [Microsoft AI Extensions](https://devblogs.microsoft.com/dotnet/introducing-microsoft-extensions-ai-preview/)
- [GitHub Models](https://devblogs.microsoft.com/dotnet/using-github-models-and-dotnet-to-build-generative-ai-apps/)
- [Local Image analysis using Ollama](https://ollama.com/blog/vision-models)
- [OpenCV, using OpenCVSharp](https://github.com/shimat/opencvsharp)
- [Phi-3.5](https://aka.ms/Phi-3CookBook)

## Run sample projects

@@ -95,7 +96,6 @@ Currently there are labs for:

1. Once the Codespace is loaded, it should have all the necessary requirements to run the demo projects.


### Run GitHub Models samples

To run the GitHub Models sample, located in `./src/ConsoleMEAI-05-GitHubModels`:
@@ -176,7 +176,11 @@ dotnet run

### Run Ollama sample

**Instructions Coming soon!**
- Follow the instructions in the [Run Ollama Sample](./docs/runollamademo.md)

- You can expect an output similar to this one:

![Run Sample using Ollama local models](./images/20OpenAIRunSample.png)

## Guidance

5 changes: 5 additions & 0 deletions docs/blazordemo.md
@@ -0,0 +1,5 @@
# Aspire + Blazor Demo

## COMING SOON !!!

![Blazor Demo](../images/50BlazorDemo.gif)
89 changes: 89 additions & 0 deletions docs/runollamademo.md
@@ -0,0 +1,89 @@
# Run sample using Ollama and local models

## Create a Codespace

- Create a new Codespace using the option **New with options**:

![Create CodeSpace with Options](../images/40CreateCodespaceNewWithOptions.png)

- Select the Ollama template for the Codespace.

![Select the Ollama template for the Codespace](../images/42CodeSpaceOllama.png)

- Create the Codespace.

- The process could take a couple of minutes.

## Download Models from Ollama

Once the Codespace is created, open a terminal and download the following models from Ollama:

- [llava](https://ollama.com/library/llava)

- [llama3.2](https://ollama.com/library/llama3.2)

- [phi3.5](https://ollama.com/library/phi3.5)

To download the models, run the following commands:

```bash
ollama pull llava:7b
ollama pull llama3.2
ollama pull phi3.5
```
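
The three pulls can also be scripted in one loop. A small sketch — the actual `ollama pull` is left commented out so the loop can be dry-run without the Ollama daemon:

```shell
# Models used by the samples; each pull is echoed so this runs without Ollama.
for model in "llava:7b" "llama3.2" "phi3.5"; do
  echo "pulling $model"
  # ollama pull "$model"
done
```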

Check the downloaded models with the command:

```bash
ollama ls
```

The output should be similar to this one:
![List Ollama local models](../images/45ollamals.png)

## Run the Spectre Console demo

- Navigate to the sample project folder using the command:

```bash
cd ./src/SpectreConsole-MEAI-01-Ollama/
```

- Run the project:

```bash
dotnet run
```

- You can expect an output similar to this one:

![Run Sample using MEAI and Ollama with local models](../images/50CodeSpacesOllamaRunComplete.png)

## Update the demo to use Phi-3.5

- Edit the file `./src/SpectreConsole-MEAI-01-Ollama/Program.cs`

- Change the model name to `phi3.5`

- Increase the number of frames to be processed to get a more detailed analysis.

```csharp
//////////////////////////////////////////////////////
/// Microsoft.Extensions.AI using Ollama
//////////////////////////////////////////////////////
SpectreConsoleOutput.DisplayTitleH1("Video Analysis using Microsoft.Extensions.AI using Ollama");

IChatClient chatClientImageAnalyzer =
new OllamaChatClient(new Uri("http://localhost:11434/"), "llava:7b");
IChatClient chatClient =
new OllamaChatClient(new Uri("http://localhost:11434/"), "phi3.5");

// for this ollama sample we process only 10 frames
// change this value to get more frames for a more detailed analysis
var numberOfFrames = 10; //PromptsHelper.NumberOfFrames;

List<string> imageAnalysisResponses = new();
int step = (int)Math.Ceiling((double)frames.Count / numberOfFrames);
```
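
The `step` computed above is a ceiling division, so the sampled frames span the whole video rather than just its start. The same arithmetic in shell — the frame counts here are made-up example values:

```shell
frames=60
numberOfFrames=10
# Integer ceiling division: (a + b - 1) / b
step=$(( (frames + numberOfFrames - 1) / numberOfFrames ))
echo "step=$step"   # with 60 frames and 10 samples, step is 6
```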

- Run the project again.
Binary file added images/40CreateCodespaceNewWithOptions.png
Binary file added images/42CodeSpaceOllama.png
Binary file added images/45ollamals.png
Binary file added images/50BlazorDemo.gif
Binary file added images/50CodeSpacesOllamaRunComplete.png
11 changes: 8 additions & 3 deletions src/SpectreConsole-MEAI-01-Ollama/Program.cs
@@ -26,7 +26,9 @@
if (!video.Read(frame) || frame.Empty())
break;
// resize the frame to half of its size
Cv2.Resize(frame, frame, new OpenCvSharp.Size(frame.Width / 2, frame.Height / 2));
// for the ollama sample, we resize the frame to 1/3 of its size
// smaller images are faster to process
Cv2.Resize(frame, frame, new OpenCvSharp.Size(frame.Width / 3, frame.Height / 3));
frames.Add(frame);
}
video.Release();
@@ -43,12 +45,15 @@
IChatClient chatClient =
new OllamaChatClient(new Uri("http://localhost:11434/"), "llama3.2");

// for the ollama process we use only 5 frames
// change this value to get more frames for a more detailed analysis
var numberOfFrames = 5; //PromptsHelper.NumberOfFrames;

List<string> imageAnalysisResponses = new();
int step = (int)Math.Ceiling((double)frames.Count / PromptsHelper.NumberOfFrames);
int step = (int)Math.Ceiling((double)frames.Count / numberOfFrames);

// show the total number of frames and the step to get the desired number of frames using spectre console
SpectreConsoleOutput.DisplaySubtitle("Process", $"Get 1 frame every [{step}] to get the [{PromptsHelper.NumberOfFrames}] frames for analysis");
SpectreConsoleOutput.DisplaySubtitle("Process", $"Get 1 frame every [{step}] to get the [{numberOfFrames}] frames for analysis");

var tableImageAnalysis = new Table();
await AnsiConsole.Live(tableImageAnalysis)