From 51412f53dbf3a8b4f40739748a2c2ce76dff635d Mon Sep 17 00:00:00 2001
From: Wenwen Gao <94138584+snowmanwwg@users.noreply.github.com>
Date: Fri, 30 Jan 2026 21:47:15 -0800
Subject: [PATCH] Revise README for NeMo Megatron-Bridge usage

Updated the README to reflect the use of NVIDIA NeMo Megatron-Bridge
instead of NVIDIA DFM.
---
 .../nemo-pretraining-gke/8node-BF16-GBS64/recipe/README.md | 6 +++---
 1 file changed, 3 insertions(+), 3 deletions(-)

diff --git a/training/a4x/wan2-1-14b/nemo-pretraining-gke/8node-BF16-GBS64/recipe/README.md b/training/a4x/wan2-1-14b/nemo-pretraining-gke/8node-BF16-GBS64/recipe/README.md
index 33147c1..339dd95 100644
--- a/training/a4x/wan2-1-14b/nemo-pretraining-gke/8node-BF16-GBS64/recipe/README.md
+++ b/training/a4x/wan2-1-14b/nemo-pretraining-gke/8node-BF16-GBS64/recipe/README.md
@@ -1,7 +1,7 @@
-# Pretrain wan2-1-14b-fp8cs-gbs64-gpus32 workloads on a4x GKE Node pools with NVIDIA DFM & Megatron-Bridge
+# Pretrain wan2-1-14b-fp8cs-gbs64-gpus32 workloads on a4x GKE Node pools with NVIDIA NeMo Megatron-Bridge
 
-This recipe outlines the steps for running a wan2-1-14b-fp8cs-gbs64-gpus32 pretraining workload on a4x GKE Node pools by using the NVIDIA DFM (Digital Fingerprint Model) Framework and Megatron-Bridge.
+This recipe outlines the steps for running a wan2-1-14b-fp8cs-gbs64-gpus32 pretraining workload on a4x GKE Node pools by using the NVIDIA NeMo Megatron-Bridge.
 
 ## Orchestration and deployment tools
 
@@ -144,4 +144,4 @@ uninstall Helm, run the following command from your client:
 
 ```bash
 helm uninstall $USER-a4x-wan2-1-14b-8node
-```
\ No newline at end of file
+```