
gemma 2 convert_checkpoint takes more GPU RAM than needed #2647

Open · 2 of 4 tasks

Alireza3242 opened this issue Jan 2, 2025 · 1 comment
Labels

bug (Something isn't working) · Investigating · LLM API/Workflow · triaged (Issue has been triaged by maintainers)

Comments

@Alireza3242
System Info

GPU: A100

Who can help?

@kaiyux
@byshiue

Information

  • The official example scripts
  • My own modified scripts

Tasks

  • An officially supported task in the examples folder (such as GLUE/SQuAD, ...)
  • My own task or dataset (give details below)

Reproduction

python3 /app/src/model_files/gemma2_27b/convert/convert_checkpoint.py --model-dir /app/data/gemma2_27b/model --output-model-dir /app/data/tllm_checkpoint --dtype bfloat16 --ckpt-type hf --world-size 2

Expected behavior

The model should be loaded into GPU memory once.

Actual behavior

The model is loaded world-size times, once per rank.

Additional notes

The problem is in lines 254 to 267 of https://github.com/NVIDIA/TensorRT-LLM/blob/v0.16.0/examples/gemma/convert_checkpoint.py: `load_gemma_weights` is called inside the per-rank loop, so the full Hugging Face checkpoint is reloaded on every iteration:

    for config in trt_llm_config.for_each_rank():
        hf_weights = load_gemma_weights(
            parameters_or_model_dir=args.model_dir,
            trt_llm_config=config,
            ckpt_parser=ckpt_parser,
            load_model_on_cpu=args.load_model_on_cpu)
        ranked_weights = non_modelopt_quantize_if_needed(
            hf_weights,
            model_dir=args.model_dir,
            quantize_modifiers=QuantizeModifiers.from_args(args),
            trt_llm_config=config)
        save_checkpoint(output_dir=args.output_model_dir,
                        weights=ranked_weights,
                        rank=config.mapping.rank)
@Alireza3242 Alireza3242 added the bug Something isn't working label Jan 2, 2025
@nv-guomingz (Collaborator)

@byshiue would you please take a look at it?

@github-actions github-actions bot added triaged Issue has been triaged by maintainers Investigating labels Jan 6, 2025