Behdad Esfahbod | 0623aa598b | [benchmark-set] Add benchmark for set copy | 2022-05-19 15:43:15 -06:00
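To make the commit concrete: a minimal, hypothetical sketch of what a set-copy microbenchmark of this shape could look like, assuming Google Benchmark and the public hb_set_t API (hb_set_create / hb_set_add / hb_set_copy); the name BM_SetCopy and the size range are illustrative, not taken from the harfbuzz sources.

```cpp
// Hypothetical sketch, not the actual harfbuzz benchmark code.
#include <benchmark/benchmark.h>
#include <hb.h>

static void BM_SetCopy (benchmark::State &state)
{
  unsigned n = state.range (0);

  // Build the source set once, outside the timed loop.
  hb_set_t *src = hb_set_create ();
  for (unsigned i = 0; i < n; i++)
    hb_set_add (src, i * 2);

  for (auto _ : state)
  {
    hb_set_t *copy = hb_set_copy (src);   // operation under test
    benchmark::DoNotOptimize (copy);
    hb_set_destroy (copy);                // destruction is included in the timing
  }

  hb_set_destroy (src);
}
BENCHMARK (BM_SetCopy)->Range (1 << 10, 1 << 20);

BENCHMARK_MAIN ();
```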
Behdad Esfahbod | 5f43ce825a | [benchmark-set] Split SetLookup into an ordered and random version | 2022-04-29 13:39:15 -06:00
Behdad Esfahbod | ae9c7b861b | [benchmark-set] At least increase needle by one in lookup benchmark | 2022-04-29 13:39:04 -06:00
Behdad Esfahbod | 68a9b83d15 | [benchmark-set] At least increase needle by one in lookup benchmark | 2022-04-29 13:28:07 -06:00
Behdad Esfahbod | dd005911b9 | [benchmark-set] Reduce lookup benchmark overhead. Turns out 90% was overhead... Now lookup is in the 4 ns ballpark. | 2022-04-29 12:23:53 -06:00
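The lookup commits above all describe the same benchmark shape: one lookup per timed iteration, a needle that moves by at least one each time so repeated calls cannot be folded away, and separate ordered and random variants. A minimal sketch under those assumptions (Google Benchmark plus the public hb_set_t API; names and sizes are illustrative, not the actual harfbuzz code):

```cpp
// Hypothetical sketch of the lookup benchmark shape described above.
#include <benchmark/benchmark.h>
#include <hb.h>
#include <algorithm>
#include <random>
#include <vector>

static void BM_SetLookup (benchmark::State &state, bool ordered)
{
  unsigned n = state.range (0);

  hb_set_t *set = hb_set_create ();
  std::vector<hb_codepoint_t> needles;
  for (unsigned i = 0; i < n; i++)
  {
    hb_set_add (set, i * 2);       // every other codepoint is present
    needles.push_back (i);
  }
  if (!ordered)
    std::shuffle (needles.begin (), needles.end (), std::mt19937 {42});

  unsigned i = 0;
  for (auto _ : state)
    // A single lookup per iteration keeps the measured cost close to the
    // operation itself rather than loop bookkeeping.
    benchmark::DoNotOptimize (hb_set_has (set, needles[i++ % n]));

  hb_set_destroy (set);
}
BENCHMARK_CAPTURE (BM_SetLookup, ordered, true)->Range (1 << 10, 1 << 20);
BENCHMARK_CAPTURE (BM_SetLookup, random, false)->Range (1 << 10, 1 << 20);

BENCHMARK_MAIN ();
```

Keeping the loop body down to a single lookup is what makes single-digit-nanosecond figures such as the "4 ns ballpark" above meaningful at all; with heavier per-iteration bookkeeping, the overhead rather than the lookup dominates the measurement.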
Garret Rieger | 2b03bcedef | [perf] Cleanup range specifiers in set benchmark. | 2022-04-21 11:16:12 -06:00
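"Range specifiers" here presumably refers to Google Benchmark's argument-range registration (Range / RangeMultiplier), which drives the problem size from the registration instead of constants inside the benchmark body. A hypothetical illustration, assuming the public hb_set_t API; the name BM_SetInsert and the sizes are illustrative:

```cpp
// Hypothetical illustration of Google Benchmark range specifiers.
#include <benchmark/benchmark.h>
#include <hb.h>

static void BM_SetInsert (benchmark::State &state)
{
  unsigned n = state.range (0);   // problem size supplied by the range specifier
  for (auto _ : state)
  {
    hb_set_t *set = hb_set_create ();
    for (unsigned i = 0; i < n; i++)
      hb_set_add (set, i * 2);
    hb_set_destroy (set);
  }
}
// One benchmark instance per size: 1<<10, 1<<13, 1<<16, 1<<19, 1<<22.
BENCHMARK (BM_SetInsert)->RangeMultiplier (8)->Range (1 << 10, 1 << 22);

BENCHMARK_MAIN ();
```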
Garret Rieger | 178c67003f | [perf] Rework set insert test to not use pause/resume timing. These have high overhead, which affects the result. Also change set iteration to time the individual iteration operation. | 2022-04-21 11:16:12 -06:00
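Google Benchmark's state.PauseTiming() / state.ResumeTiming() calls carry a per-call cost that can dwarf a single set insert, which is what the commit message is pointing at. A hypothetical before/after sketch of the kind of rework it describes, assuming the public hb_set_t API (names are illustrative, not the actual harfbuzz code):

```cpp
// Hypothetical before/after sketch of dropping PauseTiming/ResumeTiming.
#include <benchmark/benchmark.h>
#include <hb.h>

// Pause/resume pattern: each pause/resume pair costs far more than one
// hb_set_add(), so the reported time is dominated by timer manipulation.
static void BM_SetInsert_PauseResume (benchmark::State &state)
{
  hb_set_t *set = hb_set_create ();
  for (auto _ : state)
  {
    state.PauseTiming ();
    hb_set_del (set, 42);    // reset state, untimed
    state.ResumeTiming ();
    hb_set_add (set, 42);    // the operation under test
  }
  hb_set_destroy (set);
}
BENCHMARK (BM_SetInsert_PauseResume);

// Reworked pattern: only the measured operation runs inside the timed loop;
// inserting a fresh value each iteration removes the need to pause at all.
static void BM_SetInsert (benchmark::State &state)
{
  hb_set_t *set = hb_set_create ();
  hb_codepoint_t v = 0;
  for (auto _ : state)
    hb_set_add (set, v++);
  hb_set_destroy (set);
}
BENCHMARK (BM_SetInsert);

// In the same spirit, set iteration can time the individual step rather than
// a full pass: one hb_set_next() per benchmark iteration.
static void BM_SetIteration (benchmark::State &state)
{
  hb_set_t *set = hb_set_create ();
  for (unsigned i = 0; i < 100000u; i++)
    hb_set_add (set, i * 2);

  hb_codepoint_t cp = HB_SET_VALUE_INVALID;
  for (auto _ : state)
    if (!hb_set_next (set, &cp))      // advance one element; wrap at the end
      cp = HB_SET_VALUE_INVALID;

  hb_set_destroy (set);
}
BENCHMARK (BM_SetIteration);

BENCHMARK_MAIN ();
```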
Garret Rieger | fc2027bf07 | [perf] Add map benchmarks. | 2022-04-21 11:16:12 -06:00
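A hypothetical sketch of what a map benchmark in the same style could look like, assuming Google Benchmark and the public hb_map_t API (hb_map_create / hb_map_set / hb_map_get); the name BM_MapLookup and the sizes are illustrative:

```cpp
// Hypothetical sketch of a map lookup benchmark in the same style.
#include <benchmark/benchmark.h>
#include <hb.h>

static void BM_MapLookup (benchmark::State &state)
{
  unsigned n = state.range (0);

  hb_map_t *map = hb_map_create ();
  for (unsigned i = 0; i < n; i++)
    hb_map_set (map, i, i + 1);

  hb_codepoint_t key = 0;
  for (auto _ : state)
    // One lookup per iteration, with a moving key so calls are not folded.
    benchmark::DoNotOptimize (hb_map_get (map, key++ % n));

  hb_map_destroy (map);
}
BENCHMARK (BM_MapLookup)->Range (1 << 10, 1 << 20);

BENCHMARK_MAIN ();
```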
Garret Rieger | 057ec2c953 | [perf] Add set iteration and lookup benchmarks. | 2022-04-21 11:16:12 -06:00
Garret Rieger | cef64b947d | [perf] Add the start of a benchmark for set operations. | 2022-04-21 11:16:12 -06:00
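A hypothetical sketch of a set-operation benchmark of the kind this commit starts, assuming Google Benchmark and the public hb_set_t API (hb_set_union mutates its first argument, hence the copy inside the loop); names and sizes are illustrative:

```cpp
// Hypothetical sketch of a set-operation (union) benchmark.
#include <benchmark/benchmark.h>
#include <hb.h>

static void BM_SetUnion (benchmark::State &state)
{
  unsigned n = state.range (0);

  // Two partially overlapping operands, built once outside the timed loop.
  hb_set_t *a = hb_set_create ();
  hb_set_t *b = hb_set_create ();
  for (unsigned i = 0; i < n; i++)
  {
    hb_set_add (a, i * 2);       // even codepoints
    hb_set_add (b, i * 3);       // every third codepoint
  }

  for (auto _ : state)
  {
    // hb_set_union mutates its first argument, so union into a fresh copy.
    hb_set_t *result = hb_set_copy (a);
    hb_set_union (result, b);
    hb_set_destroy (result);     // copy and destroy are part of the timing
  }

  hb_set_destroy (a);
  hb_set_destroy (b);
}
BENCHMARK (BM_SetUnion)->Range (1 << 10, 1 << 18);

BENCHMARK_MAIN ();
```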