
Introducing the MaxTimeSeriesInBatch config option #3157

Merged — 1 commit merged into master on Jul 4, 2023

Conversation

@olegbespalov (Contributor) commented Jun 30, 2023

What?

Moves cloud output v2 from the MaxMetricSamplesPerPackage config option to MaxTimeSeriesInBatch.

Why?

The new name reflects the actual purpose and usage of the option.

Checklist

  • I have performed a self-review of my code.
  • I have added tests for my changes.
  • I have run linter locally (make ci-like-lint) and all checks pass.
  • I have run tests locally (make tests) and all tests pass.
  • I have commented on my code, particularly in hard-to-understand areas.

Related PR(s)/Issue(s)

Closes #3156
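
For illustration only, the renamed option might appear in a cloud output configuration roughly like this. The key name is inferred from the PR title and the value is an arbitrary example; neither is confirmed by this thread:

```json
{
  "maxTimeSeriesInBatch": 10000
}
```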

@olegbespalov olegbespalov self-assigned this Jun 30, 2023
@olegbespalov olegbespalov force-pushed the feat/3156-max-timeseries-per-batch branch from a381a43 to b554eba Compare June 30, 2023 08:05
@olegbespalov olegbespalov added this to the v0.46.0 milestone Jun 30, 2023
@olegbespalov olegbespalov force-pushed the feat/3156-max-timeseries-per-batch branch from b554eba to ecbe215 Compare June 30, 2023 08:15
@codecov-commenter commented Jun 30, 2023

Codecov Report

Merging #3157 (e60c721) into master (3291b98) will decrease coverage by 0.06%.
The diff coverage is 62.50%.

❗ Current head e60c721 differs from pull request most recent head c37e284. Consider uploading reports for the commit c37e284 to get more accurate results

@@            Coverage Diff             @@
##           master    #3157      +/-   ##
==========================================
- Coverage   72.80%   72.75%   -0.06%     
==========================================
  Files         255      253       -2     
  Lines       19581    19582       +1     
==========================================
- Hits        14256    14246      -10     
- Misses       4429     4434       +5     
- Partials      896      902       +6     
Flag    | Coverage Δ
ubuntu  | 72.75% <62.50%> (+<0.01%) ⬆️
windows | ?

Flags with carried forward coverage won't be shown. Click here to find out more.

Impacted Files               | Coverage Δ
output/cloud/output.go       | 75.33% <0.00%> (-1.54%) ⬇️
cloudapi/config.go           | 90.65% <100.00%> (+0.26%) ⬆️
output/cloud/expv2/flush.go  | 86.95% <100.00%> (ø)
output/cloud/expv2/output.go | 66.01% <100.00%> (ø)

... and 8 files with indirect coverage changes

@codebien (Contributor) left a comment


I added a few comments. Thanks for taking care.

Can you please add a validation rule like we have for the old option?

k6/output/cloud/output.go

Lines 116 to 119 in 3291b98

if conf.MaxMetricSamplesPerPackage.Int64 < 1 {
return nil, fmt.Errorf("metric samples per package must be a positive number but is %d",
conf.MaxMetricSamplesPerPackage.Int64)
}
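
The requested validation for the new option could be sketched as follows, mirroring the old MaxMetricSamplesPerPackage guard shown above. The function name, signature, and error wording are illustrative assumptions, not the merged implementation:

```go
package main

import "fmt"

// validateMaxTimeSeriesInBatch rejects non-positive values, analogous to the
// existing MaxMetricSamplesPerPackage check quoted in the review comment.
// The name and plain int64 parameter are assumptions for illustration.
func validateMaxTimeSeriesInBatch(v int64) error {
	if v < 1 {
		return fmt.Errorf("max time series in batch must be a positive number but is %d", v)
	}
	return nil
}

func main() {
	fmt.Println(validateMaxTimeSeriesInBatch(0))   // prints the error message
	fmt.Println(validateMaxTimeSeriesInBatch(100)) // prints <nil>
}
```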

Two review comments on cloudapi/config.go (outdated; resolved)
Moving cloud output v2 from MaxMetricSamplesPerPackage to MaxTimeSeriesInBatch, for flexibility and to better reflect the actual usage of the option.
@olegbespalov olegbespalov force-pushed the feat/3156-max-timeseries-per-batch branch from ecbe215 to c37e284 Compare July 4, 2023 10:00
@olegbespalov olegbespalov merged commit 72b857a into master Jul 4, 2023
20 checks passed
@olegbespalov olegbespalov deleted the feat/3156-max-timeseries-per-batch branch July 4, 2023 10:55
Successfully merging this pull request may close these issues.

Use a dedicated option instead of MaxMetricSamplesPerPackage
4 participants