Notes from Recent Core Maintainers Meetings (Q4 2022 and Q1 2023)

See below for the notes from the two most recent Core Maintainers meetings.

Core Maintainers Meeting March 24th, 2023

Considered Module Maintainer Nominations

  • [specific examples redacted]
  • Summary: Considered 4 new module maintainers.
    • Accepted two. Notification on these will be through a post on dev-discuss within the next few days.
    • Did not accept two. Feedback to these nominees and nominators will be provided privately.
  • Other discussion related to module maintainers:
    • Reviewing the module areas, it was decided that several maintainers were no longer active, and they were moved to emeritus status.
    • There was some discussion about the granularity of the various module areas around compiler and various hardware back-ends.
    • The taxonomy seems out of date; we should update it given the new PT2 compiler architecture.
    • Proposal:

Old module taxonomy: Compilers, CPU Performance, Intel/MKLDNN

New module taxonomy:

  • Compiler Front End (includes Dynamo, FX)
  • CPU Performance (includes Inductor, MKLDNN, LLVM)
  • GPU Performance (includes Inductor, OpenAI Triton)

Considered one new Core Maintainer Nomination

  • [specific example redacted]
  • Summary: Accepted the new core maintainer.
    • Notification will come separately.

Is there an essential competitive threat in the popularity of Transformers and how should PyTorch react? [Agenda item added day-of-meeting]

  • What if transformers are the end of model architecture innovation?
    • Does something narrow end up dominating?
  • Unlikely, but what if…
    • Generality would become a less important differentiator for PyTorch among frameworks and other runtimes.
    • Would need to focus more on out-of-the-box large-scale training performance and scalability.
  • Possible goal: Can PyTorch deliver SOTA Training perf and scalability out of the box?
    • Can we do this in a multi-vendor way?
  • This is about both the transformer architecture and seamless scaling to cutting-edge model sizes.
    • PyTorch Distributed, FSDP, 3D parallelism
  • Some vendors are pushing vertically integrated solutions, from hardware up to user-level APIs.
  • We would need to paint the picture of a platform-independent E2E system and get folks excited.
  • We have many of the pieces, but out-of-the-box composability across all the components in the PyTorch ecosystem is lacking; that will be critical.
  • Decision: Greg, Dima, and Soumith to work on this as a small group.

Should we add safetensors support to PyTorch? [Agenda item added day-of-meeting]

  • safetensors drops views and doesn’t support complex nested types.
  • UX challenges … it would mean we have two formats.
  • Can we improve PT native serialization instead?
  • Having a lightweight C++ API in core (not tied to JIT) is a good idea
  • Nested types are rare, but views are pretty core (see the sketch below).
  • Decision: Rejected based on coverage, complexity, and composability.
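
Not part of the meeting notes: a minimal sketch of the view-aliasing concern above, using only public torch APIs. It illustrates that PyTorch’s native serialization preserves storage sharing between a tensor and its views, which a flat per-tensor format without view support would have to give up (each view stored as an independent copy).

```python
# Minimal sketch (illustration only, not from the meeting): why views matter
# for a serialization format. torch.save keeps a tensor and its view backed
# by the same storage; a format that drops views would materialize each view
# as an independent copy and lose this aliasing.
import io
import torch

base = torch.arange(6, dtype=torch.float32)
view = base.view(2, 3)                 # shares storage with `base`

buf = io.BytesIO()
torch.save({"base": base, "view": view}, buf)
buf.seek(0)
loaded = torch.load(buf)

# With the native format, the aliasing survives the round trip:
loaded["base"][0] = 42.0
print(loaded["view"][0, 0])            # tensor(42.) -- still the same storage
```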

Core Maintainers Meeting October 2022

Distinction between PR review policies and maintainers

  • Do reviewers have a special role?
    • Are maintainers responsible for more than reviewing?
    • Nominate people with no voting power, only the power to review?
  • Might be useful for areas with review shortage, e.g. CPU/Backend
  • Decision: If a person has a track record of doing things properly, they should also be trusted to be a module maintainer.
    • Aim to expand maintainership in understaffed areas

Decision: Reject the idea of a separate reviewer-only role.

Discuss nomination of an additional maintainer

  • [Specific example redacted]
  • When we want to add someone as a maintainer of a module, they need to
    • have shown that they review things/PRs and take the module forward
    • triage issues correctly and maintain a high quality bar
    • be responsible and qualified for that module
      • “Promotion should be trailing”
    • commit going forward to actively maintain the module
  • How should we handle nomination submissions?
    • The person needs to know enough about what’s going on to make the decision as is
    • Need comms from folks who are not in this meeting
    • When we get a nomination, we can discuss the evidence in a private Google Doc or over email

Decision: Nomination not approved.

“What are the qualification criteria for a contributor to become a maintainer?”

  • Should we make the doc less vague or not? We have a doc in the governance section of the website.
  • It’s quite difficult to hit the contributor/maintainer bar if you don’t have access; e.g., you can join composability meetings only if you’re part of Meta.
    • Give people access and opportunity
    • Information flow is an issue (invite only). Break out of the cycle of internal-only FB discussions.
    • Should be a federated responsibility to plug in and involve other people. E.g. TorchVision people constantly involve people not at Meta.
  • Escalation for special situations is also possible
  • Existing maintainers should already be incentivized to create more maintainers, since that helps create features and parallelizes review load.
  • Responsibilities?
    • Do they need to host monthly public roadmap meetings? Maintain mailing lists?
    • Maintainers need to meet quarterly for roadmap discussion to align with the release cycle, and join once per release for a retrospective
    • Provide requirements, and discuss the state of the module

Decision: No action was taken. This is an ongoing discussion among the core maintainers.

What to include in or redact from the minutes of this meeting

Decision: Maintainers to review notes and decide what to share with OSS community. Primarily redact personal information about nominations.

Discuss proposal for code review tiers

  • Core reviewers sounds good
  • Should we add more core reviewers?
    • Yes
    • Have a core reviewers section, and a broader definition
      • We only want it to be broader during the transition period
    • Move maintainers to the correct subprojects
    • Create a list of people who keep the lights on
    • Decision is based on time working on PyTorch
    • Stamp commits
  • As is, people will not be able to merge things today; [AI] open up merge rules?

Decision: Assigned some follow up actions.

Open action items:

  • Remove Metamates’ special privileges
  • Make sure merge_rules.json stays in sync with the module structure

Requests from vendors to add types (into PyTorch core), e.g. FP8 (NVIDIA)

  • [AI] Ask the AO people to review/develop FP8. Have a 1-hour meeting on what they will send as a PR.
  • Gregory Chanan to be looped in for these high-bandwidth discussions (add Edward Yang).
  • There are 2 FP8 formats; they would be separate dtypes (see the sketch below).

Decision: Use the normal RFC / Design review process.
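
Not part of the meeting notes: a minimal sketch of the “2 FP8s” point, assuming the two formats are E4M3 and E5M2 (the pair proposed in the joint NVIDIA/Arm/Intel FP8 spec) and assuming a PyTorch build new enough (roughly 2.1+) to expose them as separate dtypes.

```python
# Minimal sketch (assumes a PyTorch build that ships the float8 dtypes).
# The two FP8 formats trade precision for range differently, which is why
# they end up as separate dtypes rather than one.
import torch

x = torch.randn(4, dtype=torch.float32)

e4m3 = x.to(torch.float8_e4m3fn)   # 1 sign, 4 exponent, 3 mantissa bits: more precision
e5m2 = x.to(torch.float8_e5m2)     # 1 sign, 5 exponent, 2 mantissa bits: more range

# Few ops are implemented for FP8, so round-trip through float32 to compare
# the rounding error each format introduces.
print((x - e4m3.to(torch.float32)).abs().max())
print((x - e5m2.to(torch.float32)).abs().max())
```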


The announcement and agenda for the core maintainers meeting were posted in Announcement: PyTorch Quarterly Maintainers Meeting - March 24th, 2023 - #3 by gottbrath