## Getting started
In this guide, we will create a simple pipeline that fetches an object from an S3 bucket and uploads it to a different S3 bucket.
It is aimed at users who are new to CodePipeline and assumes a basic knowledge of the AWS CLI and the [`awslocal`](https://github.com/localstack/awscli-local) wrapper.
Start LocalStack using your preferred method.
### Create prerequisite buckets
Begin by creating the S3 buckets that will serve as the source and target.
{{< command >}}
$ awslocal s3 mb s3://source-bucket
$ awslocal s3 mb s3://artifact-store-bucket
$ awslocal s3 mb s3://target-bucket
{{< /command >}}

Depending on the specifics of the declaration, CodePipeline pipelines need access to other AWS services.
In this case we want our pipeline to retrieve and upload files to S3.
This requires a properly configured IAM role that our pipeline can assume.
Create the role and make note of the role ARN:
```json
# role.json
```
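A suitable trust policy uses the standard `codepipeline.amazonaws.com` service principal so that the CodePipeline service can assume the role:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "Service": "codepipeline.amazonaws.com"
      },
      "Action": "sts:AssumeRole"
    }
  ]
}
```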
{{< command >}}
$ awslocal iam create-role --role-name role --assume-role-policy-document file://role.json | jq .Role.Arn
<disable-copy>
"arn:aws:iam::000000000000:role/role"
</disable-copy>
{{< /command >}}
Now add a permissions policy to this role that permits read and write access to S3.
```json
# policy.json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "s3:*"
      ],
      "Resource": "*"
    }
  ]
}
```
The permissions in the above example policy are relatively broad.
You might want to use a more focused policy for better security on production systems.
{{< command >}}
$ awslocal iam put-role-policy --role-name role --policy-name policy --policy-document file://policy.json
{{< /command >}}
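To confirm that the policy is attached, you can read it back from the role:

{{< command >}}
$ awslocal iam get-role-policy --role-name role --policy-name policy
{{< /command >}}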
Now we can turn our attention to the pipeline declaration.
A pipeline declaration is used to define the structure of actions and stages to be performed.
The following pipeline defines two stages with one action each.
There is a source action which retrieves a file from an S3 bucket and marks it as the output.
The output is placed in the artifact store bucket until it is picked up by the action in the second stage.
This is a deploy action which uploads the file to the target bucket.

Pay special attention to `roleArn` and `artifactStore.location`, as well as `S3Bucket`, `S3ObjectKey`, and `BucketName`.
These correspond to the resources we created earlier.

```json {hl_lines=[6,9,26,27,52]}
# declaration.json
{
  "name": "pipeline",
  "executionMode": "SUPERSEDED",
  "pipelineType": "V1",
  "roleArn": "arn:aws:iam::000000000000:role/role",
  "artifactStore": {
    "type": "S3",
    "location": "artifact-store-bucket"
  },
  "version": 1,
  "stages": [
    # ...
        },
        "runOrder": 1,
        "configuration": {
          "S3Bucket": "source-bucket",
          "S3ObjectKey": "file",
          "PollForSourceChanges": "false"
        },
        "outputArtifacts": [
          {
            "name": "intermediate-file"
          }
        ],
        "inputArtifacts": []
    # ...
        },
        "runOrder": 1,
        "configuration": {
          "BucketName": "target-bucket",
          "Extract": "false",
          "ObjectKey": "output-file"
        },
        "inputArtifacts": [
          {
            "name": "intermediate-file"
          }
        ],
        "outputArtifacts": []
    # ...
```
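Assuming the declaration is saved as `declaration.json`, the pipeline can be created with:

{{< command >}}
$ awslocal codepipeline create-pipeline --pipeline file://declaration.json
{{< /command >}}

Since pipelines in LocalStack are executed when CreatePipeline is invoked, this command also starts the first execution.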
Note that the `trigger.triggerType` field specifies what initiated the pipeline execution.
Currently in LocalStack, only two triggers are implemented: `CreatePipeline` and `StartPipelineExecution`.

The above pipeline execution was successful.
This means that we can retrieve the `output-file` object from the `target-bucket` S3 bucket.
{{< command >}}
$ awslocal s3 cp s3://target-bucket/output-file .
<disable-copy>
download: s3://target-bucket/output-file to ./output-file
</disable-copy>
{{< /command >}}

To verify that it is the same file as the original input:
{{< command >}}
$ cat output-file
<disable-copy>
Hello LocalStack!
</disable-copy>
{{< /command >}}
### Examine action executions
Using the [ListActionExecutions](https://docs.aws.amazon.com/codepipeline/latest/APIReference/API_ListActionExecutions.html) operation, you can retrieve detailed information about each action execution, such as its inputs and outputs.
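For example, to list the action executions of the pipeline created in this guide:

{{< command >}}
$ awslocal codepipeline list-action-executions --pipeline-name pipeline
{{< /command >}}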
The operations [CreatePipeline](https://docs.aws.amazon.com/codepipeline/latest/APIReference/API_CreatePipeline.html), [GetPipeline](https://docs.aws.amazon.com/codepipeline/latest/APIReference/API_GetPipeline.html), [UpdatePipeline](https://docs.aws.amazon.com/codepipeline/latest/APIReference/API_UpdatePipeline.html), [ListPipelines](https://docs.aws.amazon.com/codepipeline/latest/APIReference/API_ListPipelines.html) and [DeletePipeline](https://docs.aws.amazon.com/codepipeline/latest/APIReference/API_DeletePipeline.html) are used to manage pipeline declarations.
Pipeline executions can be managed with [StartPipelineExecution](https://docs.aws.amazon.com/codepipeline/latest/APIReference/API_StartPipelineExecution.html), [GetPipelineExecution](https://docs.aws.amazon.com/codepipeline/latest/APIReference/API_GetPipelineExecution.html), [ListPipelineExecutions](https://docs.aws.amazon.com/codepipeline/latest/APIReference/API_ListPipelineExecutions.html) and [StopPipelineExecution](https://docs.aws.amazon.com/codepipeline/latest/APIReference/API_StopPipelineExecution.html).
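For example, a new execution of the pipeline from this guide can be started and the executions listed as follows:

{{< command >}}
$ awslocal codepipeline start-pipeline-execution --name pipeline
$ awslocal codepipeline list-pipeline-executions --pipeline-name pipeline
{{< /command >}}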
Action executions can be inspected using the [ListActionExecutions](https://docs.aws.amazon.com/codepipeline/latest/APIReference/API_ListActionExecutions.html) operation.
LocalStack supports emulation for V1 pipelines.
V2 pipelines are only created as mocks.
You can use `runOrder` to control the order in which actions are executed within a stage.
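For example, in a stage's `actions` array, actions that share a `runOrder` value run in parallel, while a higher value runs afterwards. The action names below are only illustrative, and the other required action fields are omitted:

```json
"actions": [
  { "name": "action-a", "runOrder": 1 },
  { "name": "action-b", "runOrder": 1 },
  { "name": "action-c", "runOrder": 2 }
]
```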

When stopping pipeline executions with StopPipelineExecution, the stop-and-abandon method is not supported.
Setting the `abandon` flag has no effect.
This is because LocalStack uses threads as the underlying mechanism to simulate pipelines, and threads cannot be cleanly preempted.
### Tagging pipelines
LocalStack also supports [resource tagging for pipelines](https://docs.aws.amazon.com/codepipeline/latest/userguide/pipelines-tag.html) using the [TagResource](https://docs.aws.amazon.com/codepipeline/latest/APIReference/API_TagResource.html), [UntagResource](https://docs.aws.amazon.com/codepipeline/latest/APIReference/API_UntagResource.html) and [ListTagsForResource](https://docs.aws.amazon.com/codepipeline/latest/APIReference/API_ListTagsForResource.html) operations.
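For example, assuming the default `us-east-1` region and account `000000000000`, the pipeline from this guide can be tagged and its tags listed as follows:

{{< command >}}
$ awslocal codepipeline tag-resource --resource-arn arn:aws:codepipeline:us-east-1:000000000000:pipeline --tags key=env,value=test
$ awslocal codepipeline list-tags-for-resource --resource-arn arn:aws:codepipeline:us-east-1:000000000000:pipeline
{{< /command >}}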
CodePipeline on LocalStack supports the following actions:
### CodeBuild Source and Test
The [CodeBuild Source and Test](https://docs.aws.amazon.com/codepipeline/latest/userguide/action-reference-CodeBuild.html) action can be used to start a CodeBuild container and run the given buildspec.
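A minimal buildspec of the kind this action runs might look as follows; the build command here is only a placeholder:

```yaml
# buildspec.yml
version: 0.2
phases:
  build:
    commands:
      - echo "running the build"
```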
### CodeConnections Source
The [CodeConnections Source](https://docs.aws.amazon.com/codepipeline/latest/userguide/action-reference-CodestarConnectionSource.html) action is used to specify a VCS repo as the input to the pipeline.
LocalStack supports integration only with [GitHub](https://github.com/) at this time.
Set the configuration option `CODEPIPELINE_GH_TOKEN` to a GitHub Personal Access Token to be able to fetch private repositories.
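For example, when starting LocalStack via its CLI, the token can be passed as an environment variable (`<your-token>` is a placeholder):

{{< command >}}
$ CODEPIPELINE_GH_TOKEN=<your-token> localstack start
{{< /command >}}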
### S3 Deploy
The [S3 Deploy](https://docs.aws.amazon.com/codepipeline/latest/userguide/action-reference-S3Deploy.html) action is used to upload artifacts to a given S3 bucket as the output of the pipeline.
### S3 Source
The [S3 Source](https://docs.aws.amazon.com/codepipeline/latest/userguide/action-reference-S3.html) action is used to specify an S3 bucket object as input to the pipeline.
## Limitations
- Emulation for [V2 pipeline types](https://docs.aws.amazon.com/codepipeline/latest/userguide/pipeline-types-planning.html) is not supported. They will be created as mocks only.
- [Rollbacks and stage retries](https://docs.aws.amazon.com/codepipeline/latest/userguide/pipelines-stages.html) are not available.
- [Triggers](https://docs.aws.amazon.com/codepipeline/latest/userguide/pipelines-triggers.html) are not implemented.
  Pipelines are executed only when [CreatePipeline](https://docs.aws.amazon.com/codepipeline/latest/APIReference/API_CreatePipeline.html) and [StartPipelineExecution](https://docs.aws.amazon.com/codepipeline/latest/APIReference/API_StartPipelineExecution.html) are invoked.
- [Execution mode behaviours](https://docs.aws.amazon.com/codepipeline/latest/userguide/concepts-how-it-works.html#concepts-how-it-works-executions) are not implemented. Therefore, parallel pipeline executions will not lead to stage locks and waits.
- [Stage transition controls](https://docs.aws.amazon.com/codepipeline/latest/userguide/transitions.html) are not implemented.
- The [manual approval action](https://docs.aws.amazon.com/codepipeline/latest/userguide/approvals-action-add.html) and the [PutApprovalResult](https://docs.aws.amazon.com/codepipeline/latest/APIReference/API_PutApprovalResult.html) operation are not available.