Calculation of inference time #9068
-
Hi. I want to calculate the inference time of my model, but I am not sure where to put the timing code. I thought it's better to do it inside the LightningModule.
And in the main function,
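For reference, a minimal sketch of such a measurement loop (this is not the original snippet; the callable, the warm-up count, and the use of `time.perf_counter` are my assumptions, and on GPU you would pass `torch.cuda.synchronize` so queued kernels finish before the clock is read):

```python
import statistics
import time

def time_inference(model_fn, inputs, warmup=10, synchronize=None):
    """Time model_fn per input in milliseconds, discarding warm-up runs.

    synchronize: optional callable (e.g. torch.cuda.synchronize) that blocks
    until asynchronously queued GPU work has finished.
    """
    timings = []
    for i, x in enumerate(inputs):
        if synchronize:
            synchronize()           # drain pending GPU work before starting
        start = time.perf_counter()
        model_fn(x)
        if synchronize:
            synchronize()           # ensure this call's kernels have finished
        elapsed_ms = (time.perf_counter() - start) * 1000.0
        if i >= warmup:             # drop warm-up iterations (GPU init, caching)
            timings.append(elapsed_ms)
    return statistics.mean(timings), statistics.stdev(timings)
```

For example, calling `time_inference(model, samples, warmup=10)` on 210 inputs averages the last 200 measurements, matching the protocol described above.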
After removing the initial measurements (to account for GPU warm-up) and taking the mean over 200 samples, I get one value. If I do the measurement outside the LightningModule, I get a different value. This is how I measured:
Using the above code, I get a different value.
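One plausible explanation for the gap (a toy illustration with `time.sleep` stand-ins, not the actual model or data pipeline) is that the outer measurement also pays for batch fetching and the host-to-device copy, while the in-module measurement times only the forward pass:

```python
import time

def timed(fn, *args):
    """Return (result, elapsed seconds) for a single call."""
    start = time.perf_counter()
    result = fn(*args)
    return result, time.perf_counter() - start

def fetch_batch():           # stand-in for DataLoader iteration
    time.sleep(0.002)
    return [1.0] * 8

def to_device(batch):        # stand-in for the host-to-device copy (.to("cuda"))
    time.sleep(0.001)
    return batch

def forward(batch):          # stand-in for the model's forward pass
    time.sleep(0.003)
    return [x * 2 for x in batch]

# Timing inside the model covers only the forward pass ...
batch = to_device(fetch_batch())
_, inner = timed(forward, batch)

# ... while an outer loop also pays for fetching and transfer.
def end_to_end():
    return forward(to_device(fetch_batch()))

_, outer = timed(end_to_end)
assert outer > inner  # the outer number is systematically larger
```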
Replies: 2 comments 1 reply
-
Hi, the first snippet measures raw prediction time, whereas the second one doesn't seem to be on GPU at all (e.g. your data likely isn't on GPU), or, if it is, it may also include data preparation and host-to-device transfers.
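A related pitfall when timing GPU code (not necessarily the issue in this thread) is that CUDA launches are asynchronous, so a naive wall-clock reading can return before the work has actually finished; `torch.cuda.synchronize()` must be called before reading the clock. A toy model of that effect, with a background thread standing in for the GPU queue:

```python
import threading
import time

class FakeAsyncDevice:
    """Stand-in for a GPU command queue: launch() returns immediately,
    synchronize() blocks until all queued work has finished."""

    def __init__(self):
        self._threads = []

    def launch(self, seconds):
        t = threading.Thread(target=time.sleep, args=(seconds,))
        t.start()
        self._threads.append(t)

    def synchronize(self):
        for t in self._threads:
            t.join()
        self._threads.clear()

dev = FakeAsyncDevice()

# Naive timing: the launch returns immediately, so the queued work is missed.
start = time.perf_counter()
dev.launch(0.05)
naive = time.perf_counter() - start
dev.synchronize()  # clean up the first launch before the second measurement

# Correct timing: synchronize before reading the clock.
start = time.perf_counter()
dev.launch(0.05)
dev.synchronize()
synced = time.perf_counter() - start

assert naive < synced  # the naive reading drastically undercounts
```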
-
Dear @karthi0804,

You can get the profiling automatically:

```python
trainer = Trainer(profiler="simple")
trainer.predict(model=pl_model, datamodule=pl_data)
```

From Lightning's prediction loop:

```python
with self.trainer.profiler.profile("predict_batch_to_device"):
    batch = self.trainer.accelerator.batch_to_device(batch, dataloader_idx=dataloader_idx)
self.batch_progress.increment_ready()
with self.trainer.profiler.profile("predict_step"):
    self._predict_step(batch, batch_idx, dataloader_idx)
```