Support changing the default of "it will measure for at least 3 seconds" #527

Closed
RalfJung opened this issue Jul 4, 2022 · 8 comments · Fixed by #561


RalfJung commented Jul 4, 2022

It would be nice if one could change the default of "it will measure for at least 3 seconds"; e.g. I might be fine with 30s of measurement for tests that usually take around 2-5s. Being able to specify the time rather than having to use -m is nice because my level of patience to wait for results is mostly constant, so I am fine with more runs when the individual runs are faster.

Looks like this is where the time is currently hard-coded:

```rust
min_benchmarking_time: 3.0,
```
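
For context, this constant feeds a stopping rule along these lines. A minimal sketch, assuming hypothetical names (`benchmark_loop`, `run_once`, and `min_runs` are placeholders for illustration, not hyperfine's actual internals):

```rust
use std::time::{Duration, Instant};

/// Sketch of a time-based stopping rule: keep running the benchmarked
/// command until both a minimum number of runs (what `-m` controls) and a
/// minimum wall-clock budget have been reached. Illustrative only.
fn benchmark_loop(min_runs: usize, min_benchmarking_time: Duration) -> Vec<Duration> {
    let start = Instant::now();
    let mut timings = Vec::new();
    loop {
        timings.push(run_once());
        if timings.len() >= min_runs && start.elapsed() >= min_benchmarking_time {
            break;
        }
    }
    timings
}

/// Placeholder for one timed execution of the benchmarked command.
fn run_once() -> Duration {
    Duration::from_millis(100) // stand-in; a real version would spawn and time the command
}
```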


sharkdp commented Jul 11, 2022

Thank you for your feedback.

I certainly agree that a hardcoded default is not the best option here. However, I would like to avoid having too many (possibly conflicting) command-line options to control the number of runs, mostly because I think it would be difficult to explain to users.

So before we decide to integrate this: what would be possible alternatives to the introduction of a new --min-benchmarking-time <secs> option? For example, is the "min. benchmarking time" strategy reasonable at all? Maybe something like a confidence-based min. benchmarking time would be more suitable as a default? (#523)

Conversely, are there other use cases for the suggested option? Maybe someone is aware of background interference affecting their benchmarks on a timescale of, say, a few minutes, and would therefore like to set --min-benchmarking-time=10min.
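
To make the confidence-based idea concrete, one possible criterion would be to stop once the relative standard error of the mean falls below a tolerance. A rough sketch of that idea, not an actual hyperfine design (`is_confident` and its parameters are made up for illustration):

```rust
/// Sketch of a confidence-based stopping criterion: stop once the relative
/// standard error of the mean drops below `rel_tolerance` (e.g. 0.01 for 1%).
/// Illustrates the idea floated in #523; not hyperfine code.
fn is_confident(timings_secs: &[f64], rel_tolerance: f64) -> bool {
    let n = timings_secs.len();
    if n < 2 {
        return false;
    }
    let mean = timings_secs.iter().sum::<f64>() / n as f64;
    // Sample variance (Bessel-corrected), then standard error of the mean.
    let variance = timings_secs
        .iter()
        .map(|t| (t - mean).powi(2))
        .sum::<f64>()
        / (n as f64 - 1.0);
    let std_err = (variance / n as f64).sqrt();
    std_err / mean < rel_tolerance
}
```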


RalfJung commented Jul 11, 2022

> confidence-based min. benchmarking time

That also sounds pretty awesome, yes.

But when using benchmarks while working on a feature or optimization, the constant is usually the amount of time I am willing to wait for the benchmarks to complete in each edit-compile-bench cycle. I can do more extensive measurements at the end, but during development I just need reasonably good, quick feedback. So something time-based makes a lot of sense for that. I might call 5 runs "good enough", but if I can get 12 runs in 10s then I'll take those.

This came up for me because I am running a bunch of benchmarks to cover different workloads, and each takes a different amount of time. I don't want to tweak the run counts for each of them separately.

PS: By the way, thank you so much for this amazing tool. :)


sharkdp commented Aug 15, 2022

Given how easy this should be to implement, I'm inclined to just add this as a new option, but maybe mark it "hidden" for now... until we are sure that we like the "CLI design". This would hide the option from the --help text but still make it usable. An experimental feature, if you will.
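
For what it's worth, a hidden option in clap (which hyperfine uses for argument parsing) can look roughly like this, assuming clap 3's builder API; a sketch only, the actual wiring in hyperfine may differ:

```rust
use clap::{Arg, Command};

/// Sketch of a hidden, experimental CLI option. `hide(true)` keeps the
/// option out of the --help text while leaving it fully functional.
fn build_cli() -> Command<'static> {
    Command::new("hyperfine").arg(
        Arg::new("min-benchmarking-time")
            .long("min-benchmarking-time")
            .takes_value(true)
            .value_name("SECONDS")
            .hide(true),
    )
}
```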

sharkdp added the "good first issue" (Good for newcomers) label on Aug 15, 2022
udsamani commented

@sharkdp Are we looking to add this (hidden by default)? I'd like to take a stab at it if you don't mind!


sharkdp commented Aug 29, 2022

That would be great!

udsamani commented

@sharkdp Just confirming: are we planning to go with confidence-based benchmarking, or just introduce a parameter for the existing strategy and keep it hidden?


sharkdp commented Sep 2, 2022

> @sharkdp Just confirming: are we planning to go with confidence-based benchmarking, or just introduce a parameter for the existing strategy and keep it hidden?

Both 😄


sharkdp commented Sep 8, 2022

@udsamani Sorry for jumping in here, but I wanted to include this in the release I made yesterday. There are a lot of other issues on this tracker if you want to work on hyperfine!
