`src/content/docs/aws/services/batch.mdx`
## Introduction
Batch is a cloud-based service provided by Amazon Web Services (AWS) that simplifies the process of running batch computing workloads on the AWS cloud infrastructure. Batch allows you to efficiently process large volumes of data and run batch jobs without the need to manage and provision underlying compute resources.
Under the hood, the local Docker engine is used to run the containers that simulate your Batch jobs.
LocalStack allows you to use the Batch APIs to automate and scale computational tasks in your local environment while handling batch workloads. Batch jobs are executed using the ECS runtime, allowing for support of managed compute environments and improved service compatibility.
The supported APIs are available on our [API Coverage section](#api-coverage), which provides information on the extent of Batch integration with LocalStack.
## Getting started
We will demonstrate how you create and run a Batch job by following these steps:
### Create a service role
You can create a role using the [`CreateRole`](https://docs.aws.amazon.com/cli/latest/reference/iam/create-role.html) API.
LocalStack requires the role to exist with a valid trust policy. When [enforcing IAM policies](/aws/capabilities/security-testing/iam-policy-enforcement), ensure that the policy is valid and the role is properly attached.
Run the following command to create a role for ECS task execution:
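The command itself was not captured in this changeset; a minimal sketch using `awslocal` (LocalStack's AWS CLI wrapper), assuming the role name `myrole` used later in this guide and a standard ECS-tasks trust policy:

```bash
# Trust policy allowing ECS tasks to assume the role
cat > ecs-trust-policy.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "Service": "ecs-tasks.amazonaws.com" },
      "Action": "sts:AssumeRole"
    }
  ]
}
EOF

# Create the role; LocalStack returns the ARN
# arn:aws:iam::000000000000:role/myrole
awslocal iam create-role \
  --role-name myrole \
  --assume-role-policy-document file://ecs-trust-policy.json
```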
### Create a compute environment

You can use the [`CreateComputeEnvironment`](https://docs.aws.amazon.com/cli/latest/reference/batch/create-compute-environment.html) API to create a compute environment.
Run the following command using the role ARN above (`arn:aws:iam::000000000000:role/myrole`) to create a managed compute environment with FARGATE:
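The original command block was lost in this capture; a sketch of what it might look like, where the environment name `myenv` and the subnet and security group IDs are illustrative placeholders:

```bash
# Payload for a managed FARGATE compute environment; the subnet and
# security group IDs are placeholders -- LocalStack only requires that
# the fields are present, not that the resources exist
cat > compute-env.json <<'EOF'
{
  "computeEnvironmentName": "myenv",
  "type": "MANAGED",
  "state": "ENABLED",
  "serviceRole": "arn:aws:iam::000000000000:role/myrole",
  "computeResources": {
    "type": "FARGATE",
    "maxvCpus": 4,
    "subnets": ["subnet-12345"],
    "securityGroupIds": ["sg-12345"]
  }
}
EOF

awslocal batch create-compute-environment \
  --cli-input-json file://compute-env.json
```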
:::note
While networking resources such as subnets and security groups are required as input, LocalStack does not create real cloud infrastructure. These values must still be present for the compute environment to be created.
:::
### Create a job queue
You can fetch the ARN using the [`DescribeComputeEnvironments`](https://docs.aws.amazon.com/cli/latest/reference/batch/describe-compute-environments.html) API.
Run the following command to fetch the ARN of the compute environment:
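The command was elided here; a sketch assuming the compute environment name `myenv` from the previous step:

```bash
# Look up the compute environment and extract its ARN
# with a JMESPath --query expression
CE_ARN=$(awslocal batch describe-compute-environments \
  --compute-environments myenv \
  --query "computeEnvironments[0].computeEnvironmentArn" \
  --output text)
echo "$CE_ARN"
```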
You can use the ARN to create the job queue using the [`CreateJobQueue`](https://docs.aws.amazon.com/cli/latest/reference/batch/create-job-queue.html) API.
Run the following command to create the job queue:
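The exact command is not preserved here; a self-contained sketch, where the queue name `myqueue` and the environment name `myenv` are assumed examples:

```bash
# Fetch the compute environment ARN (environment name "myenv" assumed)
CE_ARN=$(awslocal batch describe-compute-environments \
  --compute-environments myenv \
  --query "computeEnvironments[0].computeEnvironmentArn" \
  --output text)

# Queue payload; "myqueue" is an example name
cat > job-queue.json <<EOF
{
  "jobQueueName": "myqueue",
  "state": "ENABLED",
  "priority": 1,
  "computeEnvironmentOrder": [
    { "order": 1, "computeEnvironment": "$CE_ARN" }
  ]
}
EOF

awslocal batch create-job-queue --cli-input-json file://job-queue.json
```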
### Create a job definition

Now, you can define what occurs during a job run. In this example, the job runs the `busybox` container from DockerHub with the command `sleep 30`. Note that you can override this command when submitting the job.
Run the following command to create the job definition using the [`RegisterJobDefinition`](https://docs.aws.amazon.com/cli/latest/reference/batch/register-job-definition.html) API:
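A sketch of the elided command; the definition name `myjobdef`, the resource values, and the execution role wiring are assumptions for a Fargate-style definition:

```bash
# Job definition payload: run "sleep 30" in the busybox image.
# The resource values and execution role below are example assumptions.
cat > job-definition.json <<'EOF'
{
  "jobDefinitionName": "myjobdef",
  "type": "container",
  "platformCapabilities": ["FARGATE"],
  "containerProperties": {
    "image": "busybox",
    "command": ["sleep", "30"],
    "resourceRequirements": [
      { "type": "VCPU", "value": "0.25" },
      { "type": "MEMORY", "value": "512" }
    ],
    "executionRoleArn": "arn:aws:iam::000000000000:role/myrole"
  }
}
EOF

awslocal batch register-job-definition --cli-input-json file://job-definition.json
```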
If you want to pass arguments to the command as [parameters](https://docs.aws.amazon.com/batch/latest/userguide/job_definition_parameters.html#parameters), you can use the `Ref::` declaration to set placeholders for parameter substitution.
This allows the dynamic passing of values at runtime for specific job definitions.
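A sketch of the mechanism, where the definition name `sleep-param` and the parameter key `seconds` are illustrative:

```bash
# Job definition whose command uses a Ref:: placeholder ("seconds"),
# with a default value supplied under "parameters"
cat > param-jobdef.json <<'EOF'
{
  "jobDefinitionName": "sleep-param",
  "type": "container",
  "parameters": { "seconds": "30" },
  "containerProperties": {
    "image": "busybox",
    "command": ["sleep", "Ref::seconds"]
  }
}
EOF
awslocal batch register-job-definition --cli-input-json file://param-jobdef.json

# Override the default at submission time
awslocal batch submit-job \
  --job-name sleep-short \
  --job-queue myqueue \
  --job-definition sleep-param \
  --parameters seconds=5
```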
## Current Limitations

As mentioned in the example above, the creation of a compute environment does not entail the provisioning of EC2 or Fargate instances.
LocalStack simulates the execution of ECS-based AWS Batch jobs using the local ECS runtime. No real infrastructure is created or managed.
Array jobs are supported in sequential mode only.
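For instance, an array job submitted as below (job and queue names assumed from the earlier steps) runs its child jobs one after another rather than in parallel:

```bash
# Submit an array job with three child jobs; LocalStack runs the
# children sequentially (index 0, then 1, then 2)
cat > array-job.json <<'EOF'
{
  "jobName": "array-example",
  "jobQueue": "myqueue",
  "jobDefinition": "myjobdef",
  "arrayProperties": { "size": 3 }
}
EOF
awslocal batch submit-job --cli-input-json file://array-job.json
```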
A subset of environment variables is supported, including:
- `AWS_BATCH_CE_NAME`
- `AWS_BATCH_JOB_ARRAY_INDEX`
- `AWS_BATCH_JOB_ARRAY_SIZE`
- `AWS_BATCH_JOB_ATTEMPT`
- `AWS_BATCH_JOB_ID`
- `AWS_BATCH_JQ_NAME`
The configuration variable `ECS_DOCKER_FLAGS` can be used to pass additional Docker flags to the container runtime.
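For example, the variable can be set when starting LocalStack; the flag values below are illustrative:

```bash
# Pass an extra environment variable and a label (example values)
# to every job container started by LocalStack
export ECS_DOCKER_FLAGS="-e MY_VAR=1 --label batch=local"
localstack start -d
```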
Setting `ECS_TASK_EXECUTOR=kubernetes` is supported as an alternative backend, though Kubernetes execution is experimental and may not support all features.