
Enable dumping raw prof files in AdvancedProfiler #19703

Status: Open. Wants to merge 1 commit into base: master from feat/advanced_profiler_dump_stats.
Conversation

@clumsy (Contributor) commented Mar 26, 2024

What does this PR do?

This PR adds a new dump_stats flag to AdvancedProfiler to persist the .prof files collected during profiling. These files can then be visualized with SnakeViz, e.g.: snakeviz fit-run-[Strategy]DDPStrategy.on_train_end.prof

[Screenshot (2024-03-26): SnakeViz visualization of a dumped .prof file]

A .prof file is created for each profiled action, e.g.:

...
fit-run-[Strategy]DDPStrategy.on_train_end.prof
fit-run-[Callback]TQDMProgressBar.on_fit_end.prof
fit-run-[Callback]ModelSummary.on_fit_end.prof
fit-run-[Callback]ModelCheckpoint{'monitor': None, 'mode': 'min', 'every_n_train_steps': 0, 'every_n_epochs': 1, 'train_time_interval': None}.on_fit_end.prof
fit-run-[Callback]LearningRateMonitor.on_fit_end.prof
...
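Under the hood these files are standard cProfile dumps, one per profiled action. A minimal stdlib-only sketch of the dump-per-action idea (the `profile_action` helper and the action name are illustrative, not the PR's actual API):

```python
import cProfile
import os
import pstats
import tempfile

def profile_action(action_name: str, func, dirpath: str) -> str:
    """Profile one callable and dump raw stats to <dirpath>/<action_name>.prof."""
    pr = cProfile.Profile()
    pr.enable()
    func()
    pr.disable()
    dst = os.path.join(dirpath, f"{action_name}.prof")
    pr.dump_stats(dst)  # same raw binary format that snakeviz visualizes
    return dst

with tempfile.TemporaryDirectory() as tmp:
    path = profile_action("on_train_end", lambda: sum(range(1000)), tmp)
    pstats.Stats(path).sort_stats("cumulative")  # loads back, so the file is a valid .prof dump
```

A file produced this way can be opened directly with `snakeviz <file>.prof`.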

Fixes #19698

No breaking changes: this adds a new, backward-compatible flag, and the new functionality is disabled by default.

Before submitting
  • Was this discussed/agreed via a GitHub issue? (not for typos and docs)
  • Did you read the contributor guideline, Pull Request section?
  • Did you make sure your PR does only one thing, instead of bundling different changes together?
  • Did you make sure to update the documentation with your changes? (if necessary)
  • Did you write any new necessary tests? (not for typos and docs)
  • Did you verify new and existing tests pass locally with your changes?
  • Did you list all the breaking changes introduced by this pull request?
  • Did you update the CHANGELOG? (not for typos, docs, test updates, or minor internal changes/refactors)

PR review

Anyone in the community is welcome to review the PR.
Before you start reviewing, make sure you have read the review guidelines. In short, see the following bullet-list:

Reviewer checklist
  • Is this pull request ready for review? (if not, please submit in draft mode)
  • Check that all items from Before submitting are resolved
  • Make sure the title is self-explanatory and the description concisely explains the PR
  • Add labels and milestones (and optionally projects) to the PR so it can be classified

📚 Documentation preview 📚: https://pytorch-lightning--19703.org.readthedocs.build/en/19703/

github-actions bot added the pl label (Generic label for PyTorch Lightning package) on Mar 26, 2024
@clumsy (Contributor, Author) commented Mar 26, 2024

For your consideration, @awaelchli

@clumsy force-pushed the feat/advanced_profiler_dump_stats branch from cf1947c to 0c21547 on March 26, 2024, 14:31
@clumsy (Contributor, Author) commented Apr 8, 2024

Are there any concerns about this change, @awaelchli? Thanks!

@clumsy (Contributor, Author) commented Jun 7, 2024

Just bumping this one in case it got lost in notifications, @awaelchli.

@awaelchli (Member) left a comment:

@clumsy Looks great to me. Just a few minor comments.

@override
def summary(self) -> str:
    recorded_stats = {}
    for action_name, pr in self.profiled_actions.items():
        self._maybe_dump_stats(action_name, pr)
@awaelchli (Member) commented:
Nitpick: I would do the if self.dump_stats check at the call site and remove the maybe_ prefix from the name (for consistency across the code base).
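The suggested restructuring could look roughly like the following self-contained sketch (the class name is illustrative and this is not the PR's actual diff; it only demonstrates hoisting the flag check out of the helper):

```python
import cProfile
from typing import Dict, Optional

class AdvancedProfilerSketch:
    """Minimal sketch of the reviewer's suggestion, not the real profiler."""

    def __init__(self, dump_stats: bool = False, dirpath: Optional[str] = None) -> None:
        self.dump_stats = dump_stats
        self.dirpath = dirpath
        self.profiled_actions: Dict[str, cProfile.Profile] = {}

    def summary(self) -> str:
        for action_name, profile in self.profiled_actions.items():
            if self.dump_stats:  # the check lives at the call site ...
                self._dump_stats(action_name, profile)  # ... so no "maybe_" prefix
        return ""

    def _dump_stats(self, action_name: str, profile: cProfile.Profile) -> None:
        assert self.dirpath
        profile.dump_stats(f"{self.dirpath}/{action_name}.prof")
```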

def _maybe_dump_stats(self, action_name: str, pr: cProfile.Profile) -> None:
    if not self.dump_stats:
        return
    assert self.dirpath  # redundant, but needed for mypy
@awaelchli (Member) commented:
Yep, we have this in several places in the code base. I suggest removing the comment, for consistency.

Suggested change:
- assert self.dirpath  # redundant, but needed for mypy
+ assert self.dirpath

@@ -75,10 +83,28 @@ def stop(self, action_name: str) -> None:
            raise ValueError(f"Attempting to stop recording an action ({action_name}) which was never started.")
        pr.disable()
def _maybe_dump_stats(self, action_name: str, pr: cProfile.Profile) -> None:
@awaelchli (Member) commented:
Suggested change:
- def _maybe_dump_stats(self, action_name: str, pr: cProfile.Profile) -> None:
+ def _maybe_dump_stats(self, action_name: str, profile: cProfile.Profile) -> None:

maybe we can spell out the name of the argument for clarity here :)

dst_fs = get_filesystem(dst_filepath)
dst_fs.mkdirs(self.dirpath, exist_ok=True)
# temporarily save to local since pstats can only dump into a local file
with tempfile.TemporaryDirectory(prefix="test", suffix="test", dir=os.getcwd()) as tmp_dir, dst_fs.open(
@awaelchli (Member) commented:
I think we'd need to include the rank in the prefix. Otherwise we could run into rare race conditions, right?
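One way to address this would be to fold the rank into the temporary-directory prefix, so concurrent DDP ranks sharing a working directory cannot collide. A stdlib-only sketch under the assumption that the rank is available via the usual RANK/LOCAL_RANK environment variables (the `rank_prefix` helper is hypothetical, not part of the PR):

```python
import os
import tempfile

def rank_prefix(base: str = "prof") -> str:
    """Hypothetical helper: embed the distributed rank in the temp-dir prefix
    so two ranks writing into the same cwd never pick the same directory name."""
    rank = os.environ.get("LOCAL_RANK", os.environ.get("RANK", "0"))
    return f"{base}-rank{rank}-"

# Each rank gets a uniquely prefixed scratch directory to dump into.
with tempfile.TemporaryDirectory(prefix=rank_prefix(), dir=os.getcwd()) as tmp_dir:
    scratch = os.path.join(tmp_dir, "stats.prof")  # local dump target
```

Note that `TemporaryDirectory` already randomizes the suffix, so the rank prefix is belt-and-braces against filesystems where that randomization races.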

    Raises:
        ValueError:
            If you attempt to stop recording an action which was never started.
    """
    super().__init__(dirpath=dirpath, filename=filename)
    self.profiled_actions: Dict[str, cProfile.Profile] = {}
    self.line_count_restriction = line_count_restriction
    self.dump_stats = dump_stats
    assert not self.dump_stats or self.dirpath is not None, "dirpath must be provided for dump_stats to work"
@awaelchli (Member) commented:
Could we make this a ValueError rather than an assertion error? That's because the error would be caused by user input.

@awaelchli (Member) commented Jun 9, 2024:

According to the docs:

"If ``dirpath`` is ``None`` but ``filename`` is present, the ``trainer.log_dir`` (from :class:`~lightning.pytorch.loggers.tensorboard.TensorBoardLogger`) will be used."

So a None check alone is not enough. We'd have to check that either dirpath or filename is set.
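Combining both review points (raise ValueError for bad user input, and accept filename as an alternative to dirpath), the validation could be sketched as follows. The function name is hypothetical, only illustrating the suggested check:

```python
from typing import Optional

def validate_dump_stats(dump_stats: bool, dirpath: Optional[str], filename: Optional[str]) -> None:
    """Hypothetical check per the review: dirpath may be None as long as
    filename is set (trainer.log_dir is resolved in that case), so only the
    combination of dump_stats=True with neither set is invalid."""
    if dump_stats and dirpath is None and filename is None:
        raise ValueError(
            "`dump_stats=True` requires either `dirpath` or `filename` to be set."
        )

validate_dump_stats(True, "/tmp/prof", None)   # ok: dirpath given
validate_dump_stats(True, None, "profile")     # ok: filename given, log_dir is used
```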

@awaelchli awaelchli changed the title feat: add dump_stats to advanced profiler (#19698) Enable dumping raw prof files in AdvancedProfiler Jun 9, 2024
Labels: has conflicts, pl (Generic label for PyTorch Lightning package)
Projects: None yet
Development

Successfully merging this pull request may close these issues.

Dump prof files from AdvancedProfiler
3 participants