diff --git a/training/a4x/wan2-1-14b/nemo-pretraining-gke/8node-BF16-GBS64/recipe/README.md b/training/a4x/wan2-1-14b/nemo-pretraining-gke/8node-BF16-GBS64/recipe/README.md
index 33147c1..339dd95 100644
--- a/training/a4x/wan2-1-14b/nemo-pretraining-gke/8node-BF16-GBS64/recipe/README.md
+++ b/training/a4x/wan2-1-14b/nemo-pretraining-gke/8node-BF16-GBS64/recipe/README.md
@@ -1,7 +1,7 @@
-# Pretrain wan2-1-14b-fp8cs-gbs64-gpus32 workloads on a4x GKE Node pools with NVIDIA DFM & Megatron-Bridge
+# Pretrain wan2-1-14b-fp8cs-gbs64-gpus32 workloads on a4x GKE Node pools with NVIDIA NeMo Megatron-Bridge
 
-This recipe outlines the steps for running a wan2-1-14b-fp8cs-gbs64-gpus32 pretraining workload on a4x GKE Node pools by using the NVIDIA DFM (Digital Fingerprint Model) Framework and Megatron-Bridge.
+This recipe outlines the steps for running a wan2-1-14b-fp8cs-gbs64-gpus32 pretraining workload on a4x GKE Node pools by using the NVIDIA NeMo Megatron-Bridge.
 
 ## Orchestration and deployment tools
 
@@ -144,4 +144,4 @@ uninstall Helm, run the following command from your client:
 
 ```bash
 helm uninstall $USER-a4x-wan2-1-14b-8node
-```
\ No newline at end of file
+```