Merged
8 changes: 4 additions & 4 deletions .apigentools-info
@@ -4,13 +4,13 @@
"spec_versions": {
"v1": {
"apigentools_version": "1.6.6",
-"regenerated": "2024-11-25 20:00:23.179155",
-"spec_repo_commit": "3c840607"
+"regenerated": "2024-11-26 13:36:06.674459",
+"spec_repo_commit": "cf1aa5ea"
},
"v2": {
"apigentools_version": "1.6.6",
-"regenerated": "2024-11-25 20:00:23.205993",
-"spec_repo_commit": "3c840607"
+"regenerated": "2024-11-26 13:36:06.693826",
+"spec_repo_commit": "cf1aa5ea"
}
}
}
20 changes: 10 additions & 10 deletions features/v1/logs_indexes.feature
@@ -10,70 +10,70 @@ Feature: Logs Indexes
And a valid "appKeyAuth" key in the system
And an instance of "LogsIndexes" API

-@generated @skip @team:DataDog/logs-backend
+@generated @skip @team:DataDog/logs-backend @team:DataDog/logs-core
Scenario: Create an index returns "Invalid Parameter Error" response
Given new "CreateLogsIndex" request
And body with value {"daily_limit": 300000000, "daily_limit_reset": {"reset_time": "14:00", "reset_utc_offset": "+02:00"}, "daily_limit_warning_threshold_percentage": 70, "exclusion_filters": [{"filter": {"query": "*", "sample_rate": 1.0}, "name": "payment"}], "filter": {"query": "source:python"}, "name": "main", "num_flex_logs_retention_days": 360, "num_retention_days": 15}
When the request is sent
Then the response status is 400 Invalid Parameter Error

-@generated @skip @team:DataDog/logs-backend
+@generated @skip @team:DataDog/logs-backend @team:DataDog/logs-core
Scenario: Create an index returns "OK" response
Given new "CreateLogsIndex" request
And body with value {"daily_limit": 300000000, "daily_limit_reset": {"reset_time": "14:00", "reset_utc_offset": "+02:00"}, "daily_limit_warning_threshold_percentage": 70, "exclusion_filters": [{"filter": {"query": "*", "sample_rate": 1.0}, "name": "payment"}], "filter": {"query": "source:python"}, "name": "main", "num_flex_logs_retention_days": 360, "num_retention_days": 15}
When the request is sent
Then the response status is 200 OK

-@generated @skip @team:DataDog/logs-backend
+@generated @skip @team:DataDog/logs-backend @team:DataDog/logs-core
Scenario: Get all indexes returns "OK" response
Given new "ListLogIndexes" request
When the request is sent
Then the response status is 200 OK

-@generated @skip @team:DataDog/logs-backend
+@generated @skip @team:DataDog/logs-backend @team:DataDog/logs-core
Scenario: Get an index returns "Not Found" response
Given new "GetLogsIndex" request
And request contains "name" parameter from "REPLACE.ME"
When the request is sent
Then the response status is 404 Not Found

-@generated @skip @team:DataDog/logs-backend
+@generated @skip @team:DataDog/logs-backend @team:DataDog/logs-core
Scenario: Get an index returns "OK" response
Given new "GetLogsIndex" request
And request contains "name" parameter from "REPLACE.ME"
When the request is sent
Then the response status is 200 OK

-@generated @skip @team:DataDog/logs-backend
+@generated @skip @team:DataDog/logs-backend @team:DataDog/logs-core
Scenario: Get indexes order returns "OK" response
Given new "GetLogsIndexOrder" request
When the request is sent
Then the response status is 200 OK

-@generated @skip @team:DataDog/logs-backend
+@generated @skip @team:DataDog/logs-backend @team:DataDog/logs-core
Scenario: Update an index returns "Invalid Parameter Error" response
Given new "UpdateLogsIndex" request
And request contains "name" parameter from "REPLACE.ME"
And body with value {"daily_limit": 300000000, "daily_limit_reset": {"reset_time": "14:00", "reset_utc_offset": "+02:00"}, "daily_limit_warning_threshold_percentage": 70, "disable_daily_limit": false, "exclusion_filters": [{"filter": {"query": "*", "sample_rate": 1.0}, "name": "payment"}], "filter": {"query": "source:python"}, "num_flex_logs_retention_days": 360, "num_retention_days": 15}
When the request is sent
Then the response status is 400 Invalid Parameter Error

-@generated @skip @team:DataDog/logs-backend
+@generated @skip @team:DataDog/logs-backend @team:DataDog/logs-core
Scenario: Update an index returns "OK" response
Given new "UpdateLogsIndex" request
And request contains "name" parameter from "REPLACE.ME"
And body with value {"daily_limit": 300000000, "daily_limit_reset": {"reset_time": "14:00", "reset_utc_offset": "+02:00"}, "daily_limit_warning_threshold_percentage": 70, "disable_daily_limit": false, "exclusion_filters": [{"filter": {"query": "*", "sample_rate": 1.0}, "name": "payment"}], "filter": {"query": "source:python"}, "num_flex_logs_retention_days": 360, "num_retention_days": 15}
When the request is sent
Then the response status is 200 OK

-@generated @skip @team:DataDog/logs-backend
+@generated @skip @team:DataDog/logs-backend @team:DataDog/logs-core
Scenario: Update indexes order returns "Bad Request" response
Given new "UpdateLogsIndexOrder" request
And body with value {"index_names": ["main", "payments", "web"]}
When the request is sent
Then the response status is 400 Bad Request

-@generated @skip @team:DataDog/logs-backend
+@generated @skip @team:DataDog/logs-backend @team:DataDog/logs-core
Scenario: Update indexes order returns "OK" response
Given new "UpdateLogsIndexOrder" request
And body with value {"index_names": ["main", "payments", "web"]}
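The scenarios above are generated from the v1 Logs Indexes spec; the retagging does not change what they exercise. As a rough illustration of what the `CreateLogsIndex` scenario sends on the wire, here is a minimal standard-library sketch that builds (but does not send) the request from the scenario body. The endpoint path and `DD-API-KEY`/`DD-APPLICATION-KEY` headers follow Datadog's public v1 API documentation; the credential values are placeholders:

```python
import json
import urllib.request

# Placeholder credentials standing in for the scenario's
# "api" and "appKeyAuth" keys -- not real values.
HEADERS = {
    "DD-API-KEY": "<api_key>",
    "DD-APPLICATION-KEY": "<application_key>",
    "Content-Type": "application/json",
}

# Request body copied from the "Create an index" scenarios above.
PAYLOAD = {
    "daily_limit": 300000000,
    "daily_limit_reset": {"reset_time": "14:00", "reset_utc_offset": "+02:00"},
    "daily_limit_warning_threshold_percentage": 70,
    "exclusion_filters": [
        {"filter": {"query": "*", "sample_rate": 1.0}, "name": "payment"}
    ],
    "filter": {"query": "source:python"},
    "name": "main",
    "num_flex_logs_retention_days": 360,
    "num_retention_days": 15,
}

def build_create_index_request():
    """Build (but do not send) the CreateLogsIndex HTTP request."""
    return urllib.request.Request(
        "https://api.datadoghq.com/api/v1/logs/config/indexes",
        data=json.dumps(PAYLOAD).encode("utf-8"),
        headers=HEADERS,
        method="POST",
    )

req = build_create_index_request()
```

Sending this with valid keys corresponds to the 200 OK scenario; a malformed body maps to the 400 Invalid Parameter Error scenario.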
12 changes: 6 additions & 6 deletions features/v2/logs.feature
@@ -105,44 +105,44 @@ Feature: Logs
Then the response status is 200 OK
And the response has 3 items

-@integration-only @skip-terraform-config @skip-validation @team:DataDog/event-platform-intake @team:DataDog/logs-backend
+@integration-only @skip-terraform-config @skip-validation @team:DataDog/event-platform-intake @team:DataDog/logs-backend @team:DataDog/logs-ingestion
Scenario: Send deflate logs returns "Request accepted for processing (always 202 empty JSON)." response
Given new "SubmitLog" request
And body with value [{"ddsource": "nginx", "ddtags": "env:staging,version:5.1", "hostname": "i-012345678", "message": "2019-11-19T14:37:58,995 INFO [process.name][20081] Hello World", "service": "payment"}]
And request contains "Content-Encoding" parameter with value "deflate"
When the request is sent
Then the response status is 202 Response from server (always 202 empty JSON).

-@integration-only @skip-terraform-config @skip-validation @team:DataDog/event-platform-intake @team:DataDog/logs-backend
+@integration-only @skip-terraform-config @skip-validation @team:DataDog/event-platform-intake @team:DataDog/logs-backend @team:DataDog/logs-ingestion
Scenario: Send gzip logs returns "Request accepted for processing (always 202 empty JSON)." response
Given new "SubmitLog" request
And body with value [{"ddsource": "nginx", "ddtags": "env:staging,version:5.1", "hostname": "i-012345678", "message": "2019-11-19T14:37:58,995 INFO [process.name][20081] Hello World", "service": "payment"}]
And request contains "Content-Encoding" parameter with value "gzip"
When the request is sent
Then the response status is 202 Request accepted for processing (always 202 empty JSON).

-@generated @skip @team:DataDog/event-platform-intake @team:DataDog/logs-backend
+@generated @skip @team:DataDog/event-platform-intake @team:DataDog/logs-backend @team:DataDog/logs-ingestion
Scenario: Send logs returns "Bad Request" response
Given new "SubmitLog" request
And body with value [{"ddsource": "nginx", "ddtags": "env:staging,version:5.1", "hostname": "i-012345678", "message": "2019-11-19T14:37:58,995 INFO [process.name][20081] Hello World", "service": "payment"}]
When the request is sent
Then the response status is 400 Bad Request

-@generated @skip @team:DataDog/event-platform-intake @team:DataDog/logs-backend
+@generated @skip @team:DataDog/event-platform-intake @team:DataDog/logs-backend @team:DataDog/logs-ingestion
Scenario: Send logs returns "Payload Too Large" response
Given new "SubmitLog" request
And body with value [{"ddsource": "nginx", "ddtags": "env:staging,version:5.1", "hostname": "i-012345678", "message": "2019-11-19T14:37:58,995 INFO [process.name][20081] Hello World", "service": "payment"}]
When the request is sent
Then the response status is 413 Payload Too Large

-@generated @skip @team:DataDog/event-platform-intake @team:DataDog/logs-backend
+@generated @skip @team:DataDog/event-platform-intake @team:DataDog/logs-backend @team:DataDog/logs-ingestion
Scenario: Send logs returns "Request Timeout" response
Given new "SubmitLog" request
And body with value [{"ddsource": "nginx", "ddtags": "env:staging,version:5.1", "hostname": "i-012345678", "message": "2019-11-19T14:37:58,995 INFO [process.name][20081] Hello World", "service": "payment"}]
When the request is sent
Then the response status is 408 Request Timeout

-@team:DataDog/event-platform-intake @team:DataDog/logs-backend
+@team:DataDog/event-platform-intake @team:DataDog/logs-backend @team:DataDog/logs-ingestion
Scenario: Send logs returns "Request accepted for processing (always 202 empty JSON)." response
Given new "SubmitLog" request
And body with value [{"ddsource": "nginx", "ddtags": "env:staging,version:5.1", "hostname": "i-012345678", "message": "2019-11-19T14:37:58,995 INFO [process.name][20081] Hello World", "service": "payment", "status": "info"}]
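The `SubmitLog` scenarios above post a JSON batch to the logs intake, optionally compressed per the `Content-Encoding` parameter. As a hedged sketch of the gzip variant, the following standard-library snippet builds (but does not send) that request; the intake hostname and `DD-API-KEY` header follow Datadog's public v2 log-submission documentation, and the key value is a placeholder:

```python
import gzip
import json
import urllib.request

# One log event, copied from the SubmitLog scenarios above.
LOG_BATCH = [{
    "ddsource": "nginx",
    "ddtags": "env:staging,version:5.1",
    "hostname": "i-012345678",
    "message": "2019-11-19T14:37:58,995 INFO [process.name][20081] Hello World",
    "service": "payment",
    "status": "info",
}]

def build_submit_log_request(api_key="<api_key>"):
    """Build the gzip-compressed SubmitLog request (not sent here).
    On success the intake answers 202 with an empty JSON body."""
    body = gzip.compress(json.dumps(LOG_BATCH).encode("utf-8"))
    return urllib.request.Request(
        "https://http-intake.logs.datadoghq.com/api/v2/logs",
        data=body,
        headers={
            "DD-API-KEY": api_key,
            "Content-Type": "application/json",
            # Matches the gzip scenario's Content-Encoding parameter.
            "Content-Encoding": "gzip",
        },
        method="POST",
    )

req = build_submit_log_request()
```

Swapping `gzip.compress` for `zlib.compress` and the header value for `deflate` would correspond to the deflate scenario; the error scenarios (400, 408, 413) exercise the same endpoint with rejected payloads.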