@@ -102,7 +102,7 @@ Sign into the App Platform web UI using the `platform-admin` account, or another
1. Click **Create Team**.

- 1. Provide a **Name** for the Team. Keep all other default values, and click **Submit**. This guide uses the Team name `demo`.
+ 1. Provide a **Name** for the Team. Keep all other default values, and click **Create Team**. This guide uses the Team name `demo`.
### Install the NVIDIA GPU Operator
@@ -170,11 +170,7 @@ A [Workload](https://apl-docs.net/docs/for-devs/console/workloads) is a self-ser
1. Continue with the rest of the default values, and click **Submit**.

- After the Workload is submitted, App Platform creates an Argo CD application to install the `kserve-crd` Helm chart. Wait for the **Status** of the Workload to become healthy as represented by a green check mark. This may take a few minutes.
-
- Click on the ArgoCD **Application** link once the Workload is ready. You should be brought to the Argo CD screen in a separate window:
+ After the Workload is submitted, App Platform creates an Argo CD application to install the `kserve-crd` Helm chart. Wait for the **Status** of the Workload to become ready, and click on the ArgoCD **Application** link. You should be brought to the Argo CD screen in a separate window:

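For readers who prefer the command line, the same sync can be confirmed with `kubectl`. This is a quick sketch only: it assumes you have `kubectl` access to the cluster and that App Platform's Argo CD runs in the `argocd` namespace; the exact Application name may vary.

```command
# List Argo CD Applications and look for the kserve-crd app reporting Synced/Healthy
kubectl get applications.argoproj.io -n argocd

# Confirm the CRDs installed by the kserve-crd chart are present
kubectl get crd | grep serving.kserve.io
```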
@@ -386,11 +382,9 @@ Wait for the Workload to be ready again, and proceed to the following steps for
1. Click **Create Service**.

- 1. In the **Name** dropdown list, select the `llama3-model-predictor` service.
+ 1. In the **Service Name** dropdown list, select the `llama3-model-predictor` service.

- 1. Under **Exposure (ingress)**, select **External**.
-
- 1. Click **Submit**.
+ 1. Click **Create Service**.
Once the Service is ready, copy the URL for the `llama3-model-predictor` service, and add it to your clipboard.
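To sanity-check the exposed predictor, you can send it a test request from your workstation. The example below is a sketch rather than part of the original guide: it assumes the InferenceService runs on KServe's OpenAI-compatible Hugging Face/vLLM runtime and uses placeholder values for the URL and model name. Substitute the URL you just copied, and adjust the path if your runtime exposes a different protocol.

```command
curl -s https://<llama3-model-predictor-URL>/openai/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
        "model": "llama3",
        "messages": [{"role": "user", "content": "Say hello in one sentence."}],
        "max_tokens": 50
      }'
```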
@@ -493,11 +487,9 @@ Follow the steps below to follow the second option and add the Kyverno security
1. Click **Create Service**.

- 1. In the **Name** dropdown menu, select the `llama3-ui` service.
-
- 1. Under **Exposure (ingress)**, select **External**.
+ 1. In the **Service Name** dropdown menu, select the `llama3-ui` service.
@@ -108,7 +109,7 @@ When working in the context of an admin-level Team, users can create and access
1. Click **Create Team**.

- 1. Provide a **Name** for the Team. Keep all other default values, and click **Submit**. This guide uses the Team name `demo`.
+ 1. Provide a **Name** for the Team. Keep all other default values, and click **Create Team**. This guide uses the Team name `demo`.
### Create a RabbitMQ Cluster with Workloads
@@ -136,27 +137,37 @@ This guide uses an example Python chat app to send messages to all connected cli
The example app in this guide is not meant for production workloads, and steps may vary depending on the app you are using.

+ ### Add the Code Repository for the Example App

1. Select **view** > **team** and **team** > **demo** in the top bar.
- 1. Select **Builds**, and click **Create Build**.
+ 1. Select **Code Repositories**, and click **Add Code Repository**.

- 1. Provide a name for the Build. This is the same name used for the image stored in the private Harbor registry of your Team. This guide uses the Build name `rmq-example-app` with the tag `latest`.
+ 1. Provide the name `apl-examples` for the Code Repository.

- 1. Select the **Mode** `Buildpacks`.
+ 1. Select *GitHub* as the **Git Service**.

- 1. To use the example Python messaging app, provide the following GitHub repository URL:
+ 1. Under **Repository URL**, add the following GitHub URL:

    ```command
    https://github.com/linode/apl-examples.git
    ```

- 1. Set the **Buildpacks** path to `rabbitmq-python`.
+ 1. Click **Add Code Repository**.

- 1. Click **Submit**. The build may take a few minutes to be ready.
+ ### Create a Container Image

- {{< note title="Make sure auto-scaling is enabled on your cluster">}}
- When a build is created, each task in the pipeline runs in a pod, which requires a certain amount of CPU and memory resources. To ensure the sufficient number of resources are available, it is recommended that auto-scaling for your LKE cluster is enabled prior to creating the build.
- {{< /note >}}
+ 1. Select **Container Images** from the menu.
+
+ 1. Select the *BuildPacks* build task.
+
+ 1. In the **Repository** dropdown list, select `apl-examples`.
+
+ 1. In the **Reference** dropdown list, select `main`.
+
+ 1. Set the **Path** field to `rabbitmq-python`.
+
+ 1. Click **Create Container Image**.
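The *BuildPacks* task performs the image build inside the cluster. If you want to reproduce it locally, for example to debug a failing build before pushing changes, the following rough equivalent can be used. It is a sketch that assumes the `pack` CLI is installed and the repository is cloned locally; the builder image App Platform actually uses may differ.

```command
git clone https://github.com/linode/apl-examples.git
cd apl-examples

# Build the rabbitmq-python directory into a local OCI image with Cloud Native Buildpacks
pack build rmq-example-app:latest \
  --path ./rabbitmq-python \
  --builder paketobuildpacks/builder-jammy-base
```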
### Check the Build Status
@@ -176,12 +187,10 @@ The backend status of the build can be checked from the **PipelineRuns** section
Once successfully built, copy the image repository link so that you can create a Workload for deploying the app in the next step.

- 1. Select **Builds** to view the status of your build.
+ 1. Select **Container Images** to view the status of your build.

1. When ready, use the "copy" button in the **Repository** column to copy the repository URL link to your clipboard.

-
-

## Deploy the App

1. Select **view** > **team** and **team** > **demo** in the top bar.
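Optionally, you can confirm the image is pullable before creating the Workload. The commands below are a sketch only: they assume Docker is installed locally and use placeholders for your App Platform domain. Sign in with your Team's Harbor credentials and use the repository URL copied in the previous section.

```command
docker login harbor.<your-app-platform-domain>
docker pull <repository-URL-copied-from-the-Repository-column>
```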
@@ -206,7 +215,7 @@ Once successfully built, copy the image repository link so that you can create a