Actions: microsoft/DeepSpeed

Formatting

5,335 workflow runs
Formatting
Formatting #15929: Merge group checks requested
January 4, 2025 05:58 1m 17s
Formatting
Formatting #15927: Scheduled
January 4, 2025 00:20 1m 21s master
Use ds-specific module id to avoid conflicts
Formatting #15926: Pull request #6847 synchronize by loadams
January 3, 2025 22:04 1m 18s olruwase/pr_6772
Add the missing view operations from sequence parallel(async).
Formatting #15925: Pull request #6750 synchronize by loadams
January 3, 2025 19:32 1m 23s inkcherry:ds_overlap_fix
Add fp8_gemm fallback for non-triton systems
Formatting #15923: Pull request #6916 synchronize by loadams
January 3, 2025 16:54 1m 41s oelayan7:fp8_gemm_no_triton
[BUG FIX]:fix get torch.version.cuda error when cuda is None in rocm
Formatting #15922: Pull request #6909 synchronize by loadams
January 3, 2025 16:28 1m 26s hj-wei:dev_hjwei
Fix checkpointable_layers Logic
Formatting #15921: Pull request #6881 synchronize by loadams
January 3, 2025 16:28 1m 22s Quentin-Anthony:qanthony/fix-act-recomp
Formatting
Formatting #15920: Merge group checks requested
January 3, 2025 15:38 1m 25s
Support pure meta model lm_head tp
Formatting #15919: Pull request #6812 synchronize by delock
January 3, 2025 02:56 Action required Yejing-Lai:lyj/lm_head_replace
Formatting
Formatting #15918: Scheduled
January 3, 2025 00:20 1m 16s master
Cleanup ops/transformer/inference tests
Formatting #15916: Pull request #6830 synchronize by loadams
January 2, 2025 18:47 1m 28s loadams/transformers-inference
Autotp training
Formatting #15914: Pull request #6922 synchronize by inkcherry
January 2, 2025 03:54 1m 21s inkcherry:autotp_training
Formatting
Formatting #15913: Scheduled
January 2, 2025 00:20 1m 21s master
Formatting
Formatting #15912: Scheduled
January 1, 2025 00:22 1m 15s master
Add fp8_gemm fallback for non-triton systems
Formatting #15911: Pull request #6916 synchronize by oelayan7
December 31, 2024 12:01 1m 20s oelayan7:fp8_gemm_no_triton
[inf] Add config var to enable keeping module on host
Formatting #15910: Pull request #6846 synchronize by oelayan7
December 31, 2024 07:32 1m 23s oelayan7:keep_module_on_host
Formatting
Formatting #15909: Scheduled
December 31, 2024 00:20 1m 17s master
Use ds-specific module id to avoid conflicts
Formatting #15908: Pull request #6847 synchronize by loadams
December 30, 2024 21:04 1m 23s olruwase/pr_6772
[BUG FIX]:fix get torch.version.cuda error when cuda is None in rocm
Formatting #15907: Pull request #6909 synchronize by loadams
December 30, 2024 21:02 1m 19s hj-wei:dev_hjwei
Fix checkpointable_layers Logic
Formatting #15905: Pull request #6881 synchronize by loadams
December 30, 2024 18:53 1m 22s Quentin-Anthony:qanthony/fix-act-recomp