Run Phantom 14B with single A100/H100 GPU #41

@buiduchanh

Description

Several issues have been reported about running the Phantom-Wan-14B model on a single A100/H100 GPU, so here is a solution that worked for me.

Step 1: Download the model Phantom-Wan-14B_fp16.safetensors (FP16) from
https://huggingface.co/Kijai/WanVideo_comfy/tree/main

Step 2: Modify the file phantom_wan/subject2video.py (line 114) to include loading the .safetensors model, as shown in the screenshot.
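The screenshot with the exact line-114 change is not reproduced here, so the sketch below only illustrates the kind of modification involved: dispatching on the file extension so a `.safetensors` checkpoint is loaded with the `safetensors` library instead of `torch.load`. The helper names (`pick_loader`, `load_phantom_state_dict`) are hypothetical, not from the repository.

```python
from pathlib import Path

def pick_loader(ckpt_path: str) -> str:
    """Choose a loading backend from the checkpoint's file extension."""
    return "safetensors" if Path(ckpt_path).suffix == ".safetensors" else "torch"

def load_phantom_state_dict(ckpt_path: str):
    """Return a state dict from either a .safetensors file or a torch checkpoint.

    A sketch of the change needed around line 114 of phantom_wan/subject2video.py,
    assuming the original code called torch.load() unconditionally.
    """
    if pick_loader(ckpt_path) == "safetensors":
        from safetensors.torch import load_file  # pip install safetensors
        return load_file(ckpt_path, device="cpu")
    import torch
    return torch.load(ckpt_path, map_location="cpu")
```

With this in place, passing `./Phantom-Wan-Models/Phantom-Wan-14B_fp16.safetensors` as `--phantom_ckpt` routes through the safetensors path, while older `.pth`/`.pt` checkpoints still load as before.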

Step 3: Run with the following command:
python generate.py --task s2v-14B --size 832*480 --frame_num 241 --sample_fps 24
--ckpt_dir ./Wan2.1-T2V-1.3B --phantom_ckpt ./Phantom-Wan-Models/Phantom-Wan-14B_fp16.safetensors
...

It consumes approximately 77GB of VRAM.
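A quick sanity check on that figure (assuming FP16 weights at 2 bytes per parameter) shows the model weights alone account for roughly a third of it; the rest is presumably activations, attention buffers, and auxiliary components (text encoder, VAE) for a 241-frame run:

```python
# Back-of-the-envelope weight-memory estimate for a 14B-parameter FP16 model.
params = 14e9           # 14 billion parameters
bytes_per_param = 2     # FP16 = 2 bytes per parameter
weights_gib = params * bytes_per_param / 2**30
print(f"FP16 weights alone: ~{weights_gib:.0f} GiB")  # ~26 GiB of the ~77 GB observed
```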

[Screenshot: modified checkpoint-loading code in phantom_wan/subject2video.py]
