feat: add progress monitoring and logging #132
base: main
Conversation
Force-pushed from da76445 to 484ab9a
```python
self._bars[model_id] = (req_bar, None, None)
self._next_position += 1

async def update_openai_usage(
```
It's used by other providers too, so maybe don't call it openai_usage?
nice catch, corrected
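For readers following along, here is a minimal sketch of what a provider-agnostic usage hook could look like after the rename. The class shape, attribute names, and the update_usage signature are assumptions for illustration, not the merged code:

```python
import asyncio


class ProgressMonitor:
    """Per-model token usage tracker; a sketch, not the PR's implementation."""

    def __init__(self) -> None:
        self._lock = asyncio.Lock()
        self._usage: dict[str, dict[str, int]] = {}

    async def update_usage(self, model_id: str, prompt_tokens: int, completion_tokens: int) -> None:
        # Provider-agnostic name: the same hook can serve OpenAI, Anthropic,
        # Gemini, and HuggingFace responses alike.
        async with self._lock:
            usage = self._usage.setdefault(model_id, {"prompt": 0, "completion": 0})
            usage["prompt"] += prompt_tokens
            usage["completion"] += completion_tokens
```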
```python
self.model_wait_times.setdefault(response.model_id, []).append(response.duration - response.api_duration)

# Update progress monitor with usage info
if hasattr(self, "progress_monitor") and self.progress_monitor is not None:
```
what is the reason more of this can't go in the class and happen when you run self.progress_monitor.update_openai_usage in each model class?
Good catch. It's centralized now, and the progress update happens only in InferenceAPI.call.
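Roughly, the centralized flow could look like the sketch below (reusing the ProgressMonitor sketch above). Everything except InferenceAPI.call and progress_monitor, including the model registry, the response/usage attribute names, and update_usage, is an assumption for illustration:

```python
class InferenceAPI:
    """Sketch of the central call path; attribute names are illustrative."""

    def __init__(self, progress_monitor: ProgressMonitor | None = None) -> None:
        self.progress_monitor = progress_monitor
        # Hypothetical registry mapping model_id -> provider model callable.
        self.model_classes: dict[str, object] = {}

    async def call(self, model_id: str, prompt: str, **kwargs):
        model_class = self.model_classes[model_id]
        response = await model_class(model_id, prompt, **kwargs)

        # Single choke point: every provider's response passes through here,
        # so the progress monitor is updated exactly once per request rather
        # than inside each model class.
        if self.progress_monitor is not None:
            await self.progress_monitor.update_usage(
                model_id,
                prompt_tokens=response.usage.prompt_tokens,
                completion_tokens=response.usage.completion_tokens,
            )
        return response
```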
safetytooling/apis/inference/api.py (Outdated)

```python
if isinstance(model_class, AnthropicChatModel) or isinstance(model_class, HuggingFaceModel) or isinstance(model_class, GeminiModel) or isinstance(model_class, GeminiVertexAIModel):
    request_increment = num_candidates

await self.progress_monitor.update_openai_usage(
```
yeah I'm confused why this is called again when you call it in each model class
This is awesome work! Tysm for implementing this

Can you post a screenshot of what it looks like in the terminal?

@jplhughes bumping this

No description provided.