Conversation

@ashuaibi7
Summary:

  • refactor the `EmbOptimType` to `OptimizerClass` mapping and generate functions to access the forward/backward mapping
  • fix optimizer storage calculation by falling back to the sharder's `fused_params["optimizer"]` (`EmbOptimType`) when the tensor attribute is unavailable
  • add `_emb_opt_type_to_optimizer_class` helper to convert an `EmbOptimType` back to an optimizer class for the storage calculation

Differential Revision: D88532615

@meta-cla bot added the CLA Signed label on Dec 15, 2025.
@meta-codesync bot (Contributor) commented Dec 15, 2025:

@ashuaibi7 has exported this pull request. If you are a Meta employee, you can view the originating Diff in D88532615.

Labels: CLA Signed, fb-exported, meta-exported