Heatmap binning reducing cardinality for log-scale test points #5
I recently ran a `LatencyHeatmap` with `input_lengths=[50, 100, 500, 1000]`, for which my actual generated/observed prompt lengths (`num_tokens_input`) ended up being `[76, 136, 656, 1282]` due to the usual tokenizer estimation error, prompt prefixing, etc.
The problem is that because the first two generated values are much closer to each other than the others, the `binning()` function ended up mapping everything to three bins, labelled `[226, 528, 1131]` on the plot.
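For illustration, here's a minimal sketch that reproduces the collapse, assuming linear equal-width binning (the function name and details here are hypothetical, not the project's actual `binning()` implementation):

```python
# Sketch: four (pseudo-)log-spaced points land in only three equal-width bins.

def linear_bin_centers(values, n_bins):
    """Assign each value to one of n_bins equal-width bins and return
    the (truncated) centers of the bins that actually received data."""
    lo, hi = min(values), max(values)
    width = (hi - lo) / n_bins
    used = set()
    for v in values:
        # Clamp so that max(values) falls in the last bin, not past it.
        used.add(min(int((v - lo) / width), n_bins - 1))
    return sorted(int(lo + (i + 0.5) * width) for i in used)

observed = [76, 136, 656, 1282]
print(linear_bin_centers(observed, 4))  # → [226, 528, 1131]: only 3 of 4 bins used
```

The first two points (76 and 136) are closer together than one bin width (~302 here), so they share a bin and one bin goes empty.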
I believe this would also affect similar tests that space their points (pseudo-)logarithmically. Suggested solution options:

- Since `LatencyHeatmap` only generates exactly `Nbins` prompts today, binning them seems needlessly indirect...
- We could update `binning()` to automatically detect whether the cardinality of the input `vector` already exactly matches the number of `bins`, and return the data as-is with no binning if so.
  - It might be nice to also indicate on the plot whether binning was actually done for each axis, e.g. `Input tokens (exact)`, `Output tokens (binned)`.
  - This would not solve the potential effects of the same issue on output tokens (which would still be binned linearly regardless of the user's choice of target values), but it's not such an obvious blocker there, because the cardinality of the data is still high, so it's less likely to end up with fewer than the target number of bins containing data.
- Maybe we could provide options for users to toggle whether the bins are linearly or log-spaced? But that would mean more to configure...
- Since the distribution of the data (both input & output token counts) is likely to be highly non-uniform, might it make more sense to use adaptive binning techniques to represent it? I know the process is more complex & therefore opaque, but the outputs might align more closely with whatever non-uniformly-spaced target values the user set for the test.
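A sketch of the cardinality-check option, with a flag the plot could use for the `(exact)`/`(binned)` axis annotation (names and the linear fallback are hypothetical, not the library's actual API):

```python
def bin_or_passthrough(values, n_bins):
    """If the data already has exactly n_bins distinct values, skip
    binning and return them directly, flagging that no binning happened
    (so the axis could be labelled e.g. 'Input tokens (exact)')."""
    distinct = sorted(set(values))
    if len(distinct) == n_bins:
        return distinct, False  # (labels, was_binned)
    # Otherwise fall back to linear equal-width binning (sketch only).
    lo, hi = min(values), max(values)
    width = (hi - lo) / n_bins
    used = {min(int((v - lo) / width), n_bins - 1) for v in values}
    return sorted(int(lo + (i + 0.5) * width) for i in used), True

labels, was_binned = bin_or_passthrough([76, 136, 656, 1282], 4)
print(labels, was_binned)  # → [76, 136, 656, 1282] False: exact values pass through
```

This keeps the existing behaviour for high-cardinality data (like output token counts) while making the `Nbins`-prompts case lossless.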