
Commit bf7b1cf

Merge remote-tracking branch 'upstream/dev' into macae-v3-dev

2 parents: 0b7c9bc + 7deb672

File tree: 10 files changed (+116 / -14 lines)

docs/DeploymentGuide.md

Lines changed: 12 additions & 2 deletions
@@ -153,7 +153,8 @@ When you start the deployment, most parameters will have **default values**, but
    | **GPT Model Capacity** | Sets the GPT model capacity. | 150 |
    | **Image Tag** | Docker image tag used for container deployments. | latest |
    | **Enable Telemetry** | Enables telemetry for monitoring and diagnostics. | true |
-
+   | **Existing Log Analytics Workspace** | To reuse an existing Log Analytics Workspace ID instead of creating a new one. | *(none)* |
+   | **Existing Azure AI Foundry Project** | To reuse an existing Azure AI Foundry Project ID instead of creating a new one. | *(none)* |

    </details>

@@ -176,6 +177,14 @@ To adjust quota settings, follow these [steps](./AzureGPTQuotaSettings.md).

    </details>

+   <details>
+
+   <summary><b>Reusing an Existing Azure AI Foundry Project</b></summary>
+
+   Guide to get your [Existing Project ID](/docs/re-use-foundry-project.md)
+
+   </details>
+
    ### Deploying with AZD

    Once you've opened the project in [Codespaces](#github-codespaces), [Dev Containers](#vs-code-dev-containers), or [locally](#local-environment), you can deploy it to Azure by following these steps:
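Both new reuse parameters are driven by `azd` environment variables set before provisioning. A minimal sketch of that flow (the two `azd env set` commands come from the reuse guides in this commit; the quoted values are placeholders for your own resource IDs):

```bash
# Optional reuse settings; run before provisioning and replace the placeholders with real resource IDs.
azd env set AZURE_ENV_LOG_ANALYTICS_WORKSPACE_ID '<Existing Log Analytics Workspace Id>'
azd env set AZURE_ENV_FOUNDRY_PROJECT_ID '<Existing Foundry Project Resource ID>'
azd up
```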
@@ -311,11 +320,12 @@ The files for the dev container are located in `/.devcontainer/` folder.
    5. **Create a `.env` file:**

-      - Navigate to the `src` folder and create a `.env` file based on the provided `.env.sample` file.
+      - Navigate to the `src\backend` folder and create a `.env` file based on the provided `.env.sample` file.

    6. **Fill in the `.env` file:**

       - Use the output from the deployment or check the Azure Portal under "Deployments" in the resource group.
+      - Make sure to set APP_ENV to "**dev**" in `.env` file.

    7. **(Optional) Set up a virtual environment:**
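To make steps 5 and 6 concrete, here is a minimal sketch of a backend `.env`, under the assumption that its keys mirror the Bicep outputs added in this commit (check `.env.sample` for the authoritative list; every value below is a placeholder except `APP_ENV` and the API version):

```bash
# Illustrative sketch only; copy real values from the deployment outputs or the Azure Portal.
cat > src/backend/.env <<'EOF'
APP_ENV=dev
COSMOSDB_ENDPOINT=https://<cosmos-account>.documents.azure.com:443/
AZURE_OPENAI_ENDPOINT=https://<ai-services-account>.openai.azure.com/
AZURE_OPENAI_API_VERSION=2025-01-01-preview
AZURE_AI_AGENT_ENDPOINT=<foundry-project-endpoint>
EOF
```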

docs/LocalDeployment.md

Lines changed: 6 additions & 1 deletion
@@ -102,11 +102,16 @@ The files for the dev container are located in `/.devcontainer/` folder.
    6. **Create a `.env` file:**

-      - Navigate to the `src` folder and create a `.env` file based on the provided `.env.sample` file.
+      - Navigate to the `src\backend` folder and create a `.env` file based on the provided `.env.sample` file.
+      - Update the `.env` file with the required values from your Azure resource group in Azure Portal App Service environment variables.
+      - Alternatively, if resources were provisioned using `azd provision` or `azd up`, a `.env` file is automatically generated in the `.azure/<env-name>/.env` file. To get your `<env-name>`, run `azd env list` to see which env is default.

    7. **Fill in the `.env` file:**

       - Use the output from the deployment or check the Azure Portal under "Deployments" in the resource group.
+      - Make sure to set APP_ENV to "**dev**" in `.env` file.

    8. **(Optional) Set up a virtual environment:**
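A sketch of that alternative path (the environment name is a placeholder, and copying the generated file into `src/backend/.env` is a suggestion, not part of this commit):

```bash
# List azd environments; the default one is marked in the output.
azd env list

# Reuse the generated values as a starting point for the backend .env, then set APP_ENV=dev.
cp .azure/<env-name>/.env src/backend/.env
echo 'APP_ENV=dev' >> src/backend/.env
```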
docs/images/re_use_foundry_project/: 3 new image files added (331 KB, 94.9 KB, 196 KB); binary content not rendered.

docs/re-use-foundry-project.md

Lines changed: 44 additions & 0 deletions
@@ -0,0 +1,44 @@
+   [← Back to *DEPLOYMENT* guide](/docs/DeploymentGuide.md#deployment-steps)
+
+   # Reusing an Existing Azure AI Foundry Project
+   To configure your environment to use an existing Azure AI Foundry Project, follow these steps:
+   ---
+   ### 1. Go to Azure Portal
+   Go to https://portal.azure.com
+
+   ### 2. Search for Azure AI Foundry
+   In the search bar at the top, type "Azure AI Foundry" and click on it. Then select the Foundry service instance where your project exists.
+
+   ![alt text](../docs/images/re_use_foundry_project/azure_ai_foundry_list.png)
+
+   ### 3. Navigate to Projects under Resource Management
+   On the left sidebar of the Foundry service blade:
+
+   - Expand the Resource Management section
+   - Click on Projects (this refers to the active Foundry project tied to the service)
+
+   ### 4. Click on the Project
+   From the Projects view: Click on the project name to open its details
+
+   Note: You will see only one project listed here, as each Foundry service maps to a single project in this accelerator
+
+   ![alt text](../docs/images/re_use_foundry_project/navigate_to_projects.png)
+
+   ### 5. Copy Resource ID
+   In the left-hand menu of the project blade:
+
+   - Click on Properties under Resource Management
+   - Locate the Resource ID field
+   - Click on the copy icon next to the Resource ID value
+
+   ![alt text](../docs/images/re_use_foundry_project/project_resource_id.png)
+
+   ### 6. Set the Foundry Project Resource ID in Your Environment
+   Run the following command in your terminal
+   ```bash
+   azd env set AZURE_ENV_FOUNDRY_PROJECT_ID '<Existing Foundry Project Resource ID>'
+   ```
+   Replace `<Existing Foundry Project Resource ID>` with the value obtained from Step 5.
+
+   ### 7. Continue Deployment
+   Proceed with the next steps in the [deployment guide](/docs/DeploymentGuide.md#deployment-steps).
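For orientation, the Resource ID copied in Step 5 typically has the shape sketched below for a Foundry project (every segment is a placeholder; confirm against the exact value shown in the Portal):

```bash
# Illustrative resource ID shape only; use the exact value copied from the project's Properties blade.
azd env set AZURE_ENV_FOUNDRY_PROJECT_ID \
  '/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.CognitiveServices/accounts/<foundry-account>/projects/<project-name>'
```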

docs/re-use-log-analytics.md

Lines changed: 2 additions & 2 deletions
@@ -1,4 +1,4 @@
-   [← Back to *DEPLOYMENT* guide](/docs/DeploymentGuide.md#deployment-options--steps)
+   [← Back to *DEPLOYMENT* guide](/docs/DeploymentGuide.md#deployment-steps)

    # Reusing an Existing Log Analytics Workspace
    To configure your environment to use an existing Log Analytics Workspace, follow these steps:

@@ -28,4 +28,4 @@ azd env set AZURE_ENV_LOG_ANALYTICS_WORKSPACE_ID '<Existing Log Analytics Worksp
    Replace `<Existing Log Analytics Workspace Id>` with the value obtained from Step 3.

    ### 5. Continue Deployment
-   Proceed with the next steps in the [deployment guide](/docs/DeploymentGuide.md#deployment-options--steps).
+   Proceed with the next steps in the [deployment guide](/docs/DeploymentGuide.md#deployment-steps).

infra/main.bicep

Lines changed: 21 additions & 1 deletion
@@ -18,6 +18,8 @@ param enableTelemetry bool = true

    param existingLogAnalyticsWorkspaceId string = ''

+   param azureopenaiVersion string = '2025-01-01-preview'
+
    // Restricting deployment to only supported Azure OpenAI regions validated with GPT-4o model
    @metadata({
      azd : {

@@ -1000,7 +1002,7 @@ module containerApp 'br/public:avm/res/app/container-app:0.14.2' = if (container
    }
    {
      name: 'AZURE_OPENAI_API_VERSION'
-     value: '2025-01-01-preview' //TODO: set parameter/variable
+     value: azureopenaiVersion
    }
    {
      name: 'APPLICATIONINSIGHTS_INSTRUMENTATION_KEY'

@@ -1718,3 +1720,21 @@ type webSiteConfigurationType = {
    @description('Optional. The tag of the container image to be used by the Web Site.')
    containerImageTag: string?
    }
+
+
+   output COSMOSDB_ENDPOINT string = 'https://${cosmosDbResourceName}.documents.azure.com:443/'
+   output COSMOSDB_DATABASE string = cosmosDbDatabaseName
+   output COSMOSDB_CONTAINER string = cosmosDbDatabaseMemoryContainerName
+   output AZURE_OPENAI_ENDPOINT string = 'https://${aiFoundryAiServicesResourceName}.openai.azure.com/'
+   output AZURE_OPENAI_MODEL_NAME string = aiFoundryAiServicesModelDeployment.name
+   output AZURE_OPENAI_DEPLOYMENT_NAME string = aiFoundryAiServicesModelDeployment.name
+   output AZURE_OPENAI_API_VERSION string = azureopenaiVersion
+   // output APPLICATIONINSIGHTS_INSTRUMENTATION_KEY string = applicationInsights.outputs.instrumentationKey
+   // output AZURE_AI_PROJECT_ENDPOINT string = aiFoundryAiServices.outputs.aiProjectInfo.apiEndpoint
+   output AZURE_AI_SUBSCRIPTION_ID string = subscription().subscriptionId
+   output AZURE_AI_RESOURCE_GROUP string = resourceGroup().name
+   output AZURE_AI_PROJECT_NAME string = aiFoundryAiProjectName
+   output AZURE_AI_MODEL_DEPLOYMENT_NAME string = aiFoundryAiServicesModelDeployment.name
+   // output APPLICATIONINSIGHTS_CONNECTION_STRING string = applicationInsights.outputs.connectionString
+   output AZURE_AI_AGENT_MODEL_DEPLOYMENT_NAME string = aiFoundryAiServicesModelDeployment.name
+   output AZURE_AI_AGENT_ENDPOINT string = aiFoundryAiServices.outputs.aiProjectInfo.apiEndpoint
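These outputs surface as `azd` environment values after provisioning, which is one way to fill in the backend `.env`. A quick sketch, assuming the deployment was run through `azd`:

```bash
# After `azd provision` or `azd up`, Bicep outputs are stored in the azd environment.
# Print them and pick the values needed for src/backend/.env (names match the outputs above).
azd env get-values | grep -E 'COSMOSDB_|AZURE_OPENAI_|AZURE_AI_'
```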

src/backend/pyproject.toml

Lines changed: 1 addition & 0 deletions
@@ -8,6 +8,7 @@ dependencies = [
      "azure-ai-evaluation>=1.5.0",
      "azure-ai-inference>=1.0.0b9",
      "azure-ai-projects>=1.0.0b9",
+     "azure-ai-agents>=1.2.0b1",
      "azure-cosmos>=4.9.0",
      "azure-identity>=1.21.0",
      "azure-monitor-events-extension>=0.1.0",

src/backend/uv.lock

Lines changed: 30 additions & 8 deletions
Some generated files are not rendered by default.
