
Commit aedf0b2 (parent: bd73d42)

add resource for datastream lifecycle

11 files changed: +788 −29 lines

CHANGELOG.md

Lines changed: 1 addition & 0 deletions

```diff
@@ -1,5 +1,6 @@
 ## [Unreleased]
 
+- Add resource `elasticstack_elasticsearch_data_stream_lifecycle` ([#838](https://github.com/elastic/terraform-provider-elasticstack/issues/838))
 - Support updating `elasticstack_elasticsearch_security_api_key` when supported by the backing cluster ([#843](https://github.com/elastic/terraform-provider-elasticstack/pull/843))
 - Fix validation of `throttle` and `interval` attributes in `elasticstack_kibana_alerting_rule`, allowing all Elastic duration values ([#846](https://github.com/elastic/terraform-provider-elasticstack/pull/846))
 - Fix boolean setting parsing for `elasticstack_elasticsearch_indices` data source. ([#842](https://github.com/elastic/terraform-provider-elasticstack/pull/842))
```
Lines changed: 131 additions & 0 deletions

---
subcategory: "Index"
layout: ""
page_title: "Elasticstack: elasticstack_elasticsearch_data_stream_lifecycle Resource"
description: |-
  Manages Lifecycle for Elasticsearch Data Streams
---

# Resource: elasticstack_elasticsearch_data_stream_lifecycle

Configures the data stream lifecycle for the targeted data streams. See: https://www.elastic.co/guide/en/elasticsearch/reference/current/data-stream-apis.html

## Example Usage

```terraform
provider "elasticstack" {
  elasticsearch {}
}

// First we must have an index template created
resource "elasticstack_elasticsearch_index_template" "my_data_stream_template" {
  name = "my_data_stream"

  index_patterns = ["my-stream*"]

  data_stream {}
}

// Now we can create a data stream based on the index template
resource "elasticstack_elasticsearch_data_stream" "my_data_stream" {
  name = "my-stream"

  // make sure that the template is created before the data stream
  depends_on = [
    elasticstack_elasticsearch_index_template.my_data_stream_template
  ]
}

// Finally we can manage the lifecycle of the data stream
resource "elasticstack_elasticsearch_data_stream_lifecycle" "my_data_stream_lifecycle" {
  name           = "my-stream"
  data_retention = "3d"

  depends_on = [
    elasticstack_elasticsearch_data_stream.my_data_stream,
  ]
}

// Or you can use wildcards to manage multiple lifecycles at once
resource "elasticstack_elasticsearch_data_stream_lifecycle" "my_data_stream_lifecycle_multiple" {
  name           = "stream-*"
  data_retention = "3d"
}
```
<!-- schema generated by tfplugindocs -->

## Schema

### Required

- `name` (String) Name of the data stream. Supports wildcards (`*`). To target all data streams, use `*` or `_all`.

### Optional

- `data_retention` (String) If defined, every document added to this data stream will be stored for at least this time frame. After this duration the document may be deleted. When empty, every document in this data stream is stored indefinitely.
- `downsampling` (Block List) An optional list of downsampling configuration objects, each defining an `after` interval representing when the backing index is meant to be downsampled (the time frame is calculated from when the index was rolled over, i.e. generation time) and a `fixed_interval` representing the downsampling interval (the minimum `fixed_interval` value is `5m`). A maximum of 10 downsampling rounds can be configured. (see [below for nested schema](#nestedblock--downsampling))
- `elasticsearch_connection` (Block List, Max: 1, Deprecated) Elasticsearch connection configuration block. This property will be removed in a future provider version. Configure the Elasticsearch connection via the provider configuration instead. (see [below for nested schema](#nestedblock--elasticsearch_connection))
- `enabled` (Boolean) Turns the data stream lifecycle on or off for this data stream. A disabled lifecycle (`enabled: false`) has no effect on the data stream. Defaults to `true`.
- `expand_wildcards` (String) Type of data stream that wildcard patterns can match. Supports comma-separated values, such as `open,hidden`. Valid values are:
  - `all`, `hidden` - Match any data stream, including hidden ones.
  - `open`, `closed` - Matches any non-hidden data stream. Data streams cannot be closed.
  - `none` - Wildcard patterns are not accepted.

  Defaults to `open`.

### Read-Only

- `id` (String) Internal identifier of the resource
- `lifecycles` (List of Object) (see [below for nested schema](#nestedatt--lifecycles))

<a id="nestedblock--downsampling"></a>

### Nested Schema for `downsampling`

Required:

- `after` (String) Interval representing when the backing index is meant to be downsampled
- `fixed_interval` (String) The interval at which to aggregate the original time series index. For example, `60m` produces a document for each 60-minute (hourly) interval. This follows standard time formatting syntax as used elsewhere in Elasticsearch.
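
Putting the `after` and `fixed_interval` attributes together, a lifecycle with downsampling rounds might look like the following sketch (the intervals are illustrative; repeated `downsampling` blocks reflect the Block List schema above):

```terraform
resource "elasticstack_elasticsearch_data_stream_lifecycle" "with_downsampling" {
  name           = "my-stream"
  data_retention = "30d"

  // Each round downsamples rolled-over backing indices after the given
  // `after` offset; `fixed_interval` must be at least 5m, and at most
  // 10 rounds can be configured.
  downsampling {
    after          = "1d"
    fixed_interval = "1h"
  }
  downsampling {
    after          = "7d"
    fixed_interval = "1d"
  }
}
```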
<a id="nestedblock--elasticsearch_connection"></a>

### Nested Schema for `elasticsearch_connection`

Optional:

- `api_key` (String, Sensitive) API Key to use for authentication to Elasticsearch
- `bearer_token` (String, Sensitive) Bearer Token to use for authentication to Elasticsearch
- `ca_data` (String) PEM-encoded custom Certificate Authority certificate
- `ca_file` (String) Path to a custom Certificate Authority certificate
- `cert_data` (String) PEM-encoded certificate for client auth
- `cert_file` (String) Path to a file containing the PEM-encoded certificate for client auth
- `endpoints` (List of String, Sensitive) A list of endpoints where the Terraform provider will point to; this must include the http(s) schema and port number.
- `es_client_authentication` (String, Sensitive) ES Client Authentication field to be used with the bearer token
- `insecure` (Boolean) Disable TLS certificate validation
- `key_data` (String, Sensitive) PEM-encoded private key for client auth
- `key_file` (String) Path to a file containing the PEM-encoded private key for client auth
- `password` (String, Sensitive) Password to use for API authentication to Elasticsearch.
- `username` (String) Username to use for API authentication to Elasticsearch.

<a id="nestedatt--lifecycles"></a>

### Nested Schema for `lifecycles`

Read-Only:

- `data_retention` (String)
- `downsampling` (List of Object) (see [below for nested schema](#nestedobjatt--lifecycles--downsampling))
- `enabled` (Boolean)
- `name` (String)

<a id="nestedobjatt--lifecycles--downsampling"></a>

### Nested Schema for `lifecycles.downsampling`

Read-Only:

- `after` (String)
- `fixed_interval` (String)
Lines changed: 38 additions & 0 deletions

```terraform
provider "elasticstack" {
  elasticsearch {}
}

// First we must have an index template created
resource "elasticstack_elasticsearch_index_template" "my_data_stream_template" {
  name = "my_data_stream"

  index_patterns = ["my-stream*"]

  data_stream {}
}

// Now we can create a data stream based on the index template
resource "elasticstack_elasticsearch_data_stream" "my_data_stream" {
  name = "my-stream"

  // make sure that the template is created before the data stream
  depends_on = [
    elasticstack_elasticsearch_index_template.my_data_stream_template
  ]
}

// Finally we can manage the lifecycle of the data stream
resource "elasticstack_elasticsearch_data_stream_lifecycle" "my_data_stream_lifecycle" {
  name           = "my-stream"
  data_retention = "3d"

  depends_on = [
    elasticstack_elasticsearch_data_stream.my_data_stream,
  ]
}

// Or you can use wildcards to manage multiple lifecycles at once
resource "elasticstack_elasticsearch_data_stream_lifecycle" "my_data_stream_lifecycle_multiple" {
  name           = "stream-*"
  data_retention = "3d"
}
```

internal/clients/elasticsearch/index.go

Lines changed: 83 additions & 0 deletions

```diff
@@ -499,6 +499,89 @@ func DeleteDataStream(ctx context.Context, apiClient *clients.ApiClient, dataStr
 	return diags
 }
 
+func PutDataStreamLifecycle(ctx context.Context, apiClient *clients.ApiClient, dataStreamName string, expand_wildcards string, lifecycle models.LifecycleSettings) diag.Diagnostics {
+	var diags diag.Diagnostics
+
+	esClient, err := apiClient.GetESClient()
+	if err != nil {
+		return diag.FromErr(err)
+	}
+
+	lifecycleBytes, err := json.Marshal(lifecycle)
+	if err != nil {
+		return diag.FromErr(err)
+	}
+
+	opts := []func(*esapi.IndicesPutDataLifecycleRequest){
+		esClient.Indices.PutDataLifecycle.WithBody(bytes.NewReader(lifecycleBytes)),
+		esClient.Indices.PutDataLifecycle.WithContext(ctx),
+		esClient.Indices.PutDataLifecycle.WithExpandWildcards(expand_wildcards),
+	}
+	res, err := esClient.Indices.PutDataLifecycle([]string{dataStreamName}, opts...)
+	if err != nil {
+		return diag.FromErr(err)
+	}
+	defer res.Body.Close()
+	if diags := utils.CheckError(res, fmt.Sprintf("Unable to create DataStreamLifecycle: %s", dataStreamName)); diags.HasError() {
+		return diags
+	}
+
+	return diags
+}
+
+func GetDataStreamLifecycle(ctx context.Context, apiClient *clients.ApiClient, dataStreamName string, expand_wildcards string) (*[]models.DataStreamLifecycle, diag.Diagnostics) {
+	var diags diag.Diagnostics
+	esClient, err := apiClient.GetESClient()
+	if err != nil {
+		return nil, diag.FromErr(err)
+	}
+	opts := []func(*esapi.IndicesGetDataLifecycleRequest){
+		esClient.Indices.GetDataLifecycle.WithContext(ctx),
+		esClient.Indices.GetDataLifecycle.WithExpandWildcards(expand_wildcards),
+	}
+	res, err := esClient.Indices.GetDataLifecycle([]string{dataStreamName}, opts...)
+	if err != nil {
+		return nil, diag.FromErr(err)
+	}
+	defer res.Body.Close()
+	if res.StatusCode == http.StatusNotFound {
+		return nil, nil
+	}
+	if diags := utils.CheckError(res, fmt.Sprintf("Unable to get requested DataStreamLifecycle: %s", dataStreamName)); diags.HasError() {
+		return nil, diags
+	}
+
+	dStreams := make(map[string][]models.DataStreamLifecycle)
+	if err := json.NewDecoder(res.Body).Decode(&dStreams); err != nil {
+		return nil, diag.FromErr(err)
+	}
+	ds := dStreams["data_streams"]
+	return &ds, diags
+}
+
+func DeleteDataStreamLifecycle(ctx context.Context, apiClient *clients.ApiClient, dataStreamName string, expand_wildcards string) diag.Diagnostics {
+	var diags diag.Diagnostics
+
+	esClient, err := apiClient.GetESClient()
+	if err != nil {
+		return diag.FromErr(err)
+	}
+	opts := []func(*esapi.IndicesDeleteDataLifecycleRequest){
+		esClient.Indices.DeleteDataLifecycle.WithContext(ctx),
+		esClient.Indices.DeleteDataLifecycle.WithExpandWildcards(expand_wildcards),
+	}
+	res, err := esClient.Indices.DeleteDataLifecycle([]string{dataStreamName}, opts...)
+	if err != nil {
+		return diag.FromErr(err)
+	}
+	defer res.Body.Close()
+	if diags := utils.CheckError(res, fmt.Sprintf("Unable to delete DataStreamLifecycle: %s", dataStreamName)); diags.HasError() {
+		return diags
+	}
+
+	return diags
+}
+
 func PutIngestPipeline(ctx context.Context, apiClient *clients.ApiClient, pipeline *models.IngestPipeline) diag.Diagnostics {
 	var diags diag.Diagnostics
 	pipelineBytes, err := json.Marshal(pipeline)
```

internal/elasticsearch/cluster/slm.go

Lines changed: 5 additions & 28 deletions

```diff
@@ -10,7 +10,6 @@ import (
 	"github.com/elastic/terraform-provider-elasticstack/internal/clients/elasticsearch"
 	"github.com/elastic/terraform-provider-elasticstack/internal/models"
 	"github.com/elastic/terraform-provider-elasticstack/internal/utils"
-	"github.com/hashicorp/go-cty/cty"
 	"github.com/hashicorp/terraform-plugin-log/tflog"
 	"github.com/hashicorp/terraform-plugin-sdk/v2/diag"
 	"github.com/hashicorp/terraform-plugin-sdk/v2/helper/schema"
@@ -31,33 +30,11 @@ func ResourceSlm() *schema.Resource {
 			ForceNew: true,
 		},
 		"expand_wildcards": {
-			Description: "Determines how wildcard patterns in the `indices` parameter match data streams and indices. Supports comma-separated values, such as `closed,hidden`.",
-			Type:        schema.TypeString,
-			Optional:    true,
-			Default:     "open,hidden",
-			ValidateDiagFunc: func(value interface{}, path cty.Path) diag.Diagnostics {
-				validValues := []string{"all", "open", "closed", "hidden", "none"}
-
-				var diags diag.Diagnostics
-				for _, pv := range strings.Split(value.(string), ",") {
-					found := false
-					for _, vv := range validValues {
-						if vv == strings.TrimSpace(pv) {
-							found = true
-							break
-						}
-					}
-					if !found {
-						diags = append(diags, diag.Diagnostic{
-							Severity: diag.Error,
-							Summary:  "Invalid value was provided.",
-							Detail:   fmt.Sprintf(`"%s" is not valid value for this field.`, pv),
-						})
-						return diags
-					}
-				}
-				return diags
-			},
+			Description:      "Determines how wildcard patterns in the `indices` parameter match data streams and indices. Supports comma-separated values, such as `closed,hidden`.",
+			Type:             schema.TypeString,
+			Optional:         true,
+			Default:          "open,hidden",
+			ValidateDiagFunc: utils.AllowedExpandWildcards,
 		},
 		"ignore_unavailable": {
 			Description: "If `false`, the snapshot fails if any data stream or index in indices is missing or closed. If `true`, the snapshot ignores missing or closed data streams and indices.",
```
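
The removed inline `ValidateDiagFunc` checked each comma-separated token against the known `expand_wildcards` values; the commit centralizes that logic in `utils.AllowedExpandWildcards`. A standalone sketch of the same validation (returning a plain error rather than SDK diagnostics, since the helper's real signature lives in the provider's `utils` package):

```go
package main

import (
	"fmt"
	"strings"
)

// allowedExpandWildcards reports whether every comma-separated token in
// value is one of the valid expand_wildcards values, mirroring the
// validation removed from slm.go.
func allowedExpandWildcards(value string) error {
	validValues := []string{"all", "open", "closed", "hidden", "none"}
	for _, pv := range strings.Split(value, ",") {
		found := false
		for _, vv := range validValues {
			if vv == strings.TrimSpace(pv) {
				found = true
				break
			}
		}
		if !found {
			return fmt.Errorf("%q is not a valid value for this field", pv)
		}
	}
	return nil
}

func main() {
	fmt.Println(allowedExpandWildcards("open,hidden")) // prints "<nil>"
	fmt.Println(allowedExpandWildcards("open,bogus"))  // prints an error
}
```

Sharing one validator keeps the SLM resource and the new data stream lifecycle resource consistent about which `expand_wildcards` values they accept.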
