Character Animation with Wan Animate

Wan Animate is a unified framework for character animation and replacement. Generate high-fidelity character videos by precisely replicating expressions and movements from reference videos, or replace characters while maintaining environmental integration.

Unified Character Animation Framework

Built upon the Wan model, Wan Animate provides precise control over character expressions and movements while maintaining environmental consistency across both animation and replacement tasks.

🎭 Character Animation

Animate any character by precisely replicating expressions and movements from reference videos

🔄 Character Replacement

Replace characters in videos while preserving expressions and environmental integration

✨ Holistic Replication

Maintain character appearance while applying appropriate lighting and color tone

Advanced AI Architecture

Wan Model Foundation

Wan Animate builds upon the powerful Wan model architecture, employing a modified input paradigm to differentiate between reference conditions and generation regions. This unified approach enables multiple animation tasks through a common symbolic representation, ensuring consistent and high-quality character animation results.

Spatial Skeleton Alignment

The system uses spatially-aligned skeleton signals to replicate body motion with exceptional precision. This approach captures the subtle nuances of human movement, from facial expressions to full-body gestures, enabling the generation of character videos with high controllability and expressiveness.

Relighting LoRA Module

To enhance environmental integration during character replacement, Wan Animate includes an auxiliary Relighting LoRA module. This component preserves character appearance consistency while applying appropriate environmental lighting and color tone, ensuring natural integration into target scenes.

Demo Videos

Watch Wan Animate in action! Below are sample results demonstrating character animation, replacement, and try-on capabilities.

Sample clips: Character Animation · Character Replacement · Try-On Example

Try Wan Animate Live

Experience the power of Wan Animate with our interactive demo. Upload your character image and reference video to see the magic happen in real-time.

The demo above showcases Wan Animate's capabilities for character animation and replacement. Upload your own images and videos to see how the system works with your content.

Run Wan2.2 & Installation Guide

Follow these steps to set up Wan2.2 and start generating videos, animations, and more with state-of-the-art AI models.

Installation

1. Clone the Repository

git clone https://github.com/Wan-Video/Wan2.2.git
cd Wan2.2

2. Install Dependencies

# Ensure torch >= 2.4.0
# If the installation of flash_attn fails, try installing the other packages first and install flash_attn last
pip install -r requirements.txt
# If you want to use CosyVoice for Speech-to-Video Generation:
pip install -r requirements_s2v.txt
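
If flash_attn still fails to build, a common workaround (consistent with the note above) is to install the other requirements first and then retry flash_attn on its own. A minimal sketch, assuming the package is built from PyPI as flash-attn; check requirements.txt for the exact pinned version:

# Confirm the PyTorch version before building flash_attn (must be >= 2.4.0)
python -c "import torch; print(torch.__version__)"
# Retry flash_attn by itself once the other packages are in place
# (--no-build-isolation lets the build see the already-installed torch)
pip install flash-attn --no-build-isolation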

3. Download Model Weights

Download models from Hugging Face or ModelScope. See table below for links and descriptions.
Model        | Download Links   | Description
T2V-A14B     | 🤗 Hugging Face  | Text-to-Video MoE model, supports 480P & 720P
I2V-A14B     | 🤗 Hugging Face  | Image-to-Video MoE model, supports 480P & 720P
TI2V-5B      | 🤗 Hugging Face  | High-compression VAE, T2V + I2V, supports 720P
S2V-14B      | 🤗 Hugging Face  | Speech-to-Video model, supports 480P & 720P
Animate-14B  | 🤗 Hugging Face  | Character animation and replacement

💡 Note: The TI2V-5B model supports 720P video generation at 24 FPS.

# Download models using huggingface-cli
pip install "huggingface_hub[cli]"
huggingface-cli download Wan-AI/Wan2.2-T2V-A14B --local-dir ./Wan2.2-T2V-A14B
# Download models using modelscope-cli
pip install modelscope
modelscope download Wan-AI/Wan2.2-T2V-A14B --local_dir ./Wan2.2-T2V-A14B
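
The animation and replacement workflows on this page also need the Animate-14B weights. Below is a sketch of the same download pattern, assuming the repository id Wan-AI/Wan2.2-Animate-14B follows the naming used above; confirm the exact id via the links in the model table:

# Download the Wan-Animate checkpoint into the directory used by the commands below
huggingface-cli download Wan-AI/Wan2.2-Animate-14B --local-dir ./Wan2.2-Animate-14B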

4. Run the Demo

python demo.py --character_image path/to/character.jpg \
  --reference_video path/to/reference.mp4

System Requirements

GPU with 8GB+ VRAM recommended
Python 3.8 or higher
CUDA 11.8 or compatible
16GB+ RAM recommended
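
Before launching a long generation job, it is worth confirming that the machine actually meets these requirements. The checks below assume an NVIDIA GPU with the CUDA toolkit installed; nvidia-smi and nvcc are standard NVIDIA utilities, not part of Wan2.2:

# Check the Python version (3.8 or higher)
python --version
# Report the GPU model and total VRAM
nvidia-smi --query-gpu=name,memory.total --format=csv
# Report the installed CUDA toolkit version
nvcc --version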

Quick Reference: Run Wan2.2

  • Text-to-Video (T2V-A14B):
    python generate.py --task t2v-A14B --size 1280*720 --ckpt_dir ./Wan2.2-T2V-A14B --offload_model True --convert_model_dtype --prompt "Two anthropomorphic cats in comfy boxing gear and bright gloves fight intensely on a spotlighted stage."
  • Image-to-Video (I2V-A14B):
    python generate.py --task i2v-A14B --size 1280*720 --ckpt_dir ./Wan2.2-I2V-A14B --offload_model True --convert_model_dtype --image examples/i2v_input.JPG --prompt "Summer beach vacation style, a white cat wearing sunglasses sits on a surfboard. ..."
  • Text-Image-to-Video (TI2V-5B):
    python generate.py --task ti2v-5B --size 1280*704 --ckpt_dir ./Wan2.2-TI2V-5B --offload_model True --convert_model_dtype --t5_cpu --prompt "Two anthropomorphic cats in comfy boxing gear and bright gloves fight intensely on a spotlighted stage"
  • Speech-to-Video (S2V-14B):
    python generate.py --task s2v-14B --size 1024*704 --ckpt_dir ./Wan2.2-S2V-14B/ --offload_model True --convert_model_dtype --prompt "Summer beach vacation style, a white cat wearing sunglasses sits on a surfboard." --image "examples/i2v_input.JPG" --audio "examples/talk.wav"
  • Run Wan-Animate (Character Animation/Replacement):
    # Preprocessing (animation mode)
    python ./wan/modules/animate/preprocess/preprocess_data.py \
      --ckpt_path ./Wan2.2-Animate-14B/process_checkpoint \
      --video_path ./examples/wan_animate/animate/video.mp4 \
      --refer_path ./examples/wan_animate/animate/image.jpeg \
      --save_path ./examples/wan_animate/animate/process_results \
      --resolution_area 1280 720 \
      --retarget_flag \
      --use_flux

    # Run in animation mode
    python generate.py --task animate-14B --ckpt_dir ./Wan2.2-Animate-14B/ --src_root_path ./examples/wan_animate/animate/process_results/ --refert_num 1

    # Preprocessing (replacement mode)
    python ./wan/modules/animate/preprocess/preprocess_data.py \
      --ckpt_path ./Wan2.2-Animate-14B/process_checkpoint \
      --video_path ./examples/wan_animate/replace/video.mp4 \
      --refer_path ./examples/wan_animate/replace/image.jpeg \
      --save_path ./examples/wan_animate/replace/process_results \
      --resolution_area 1280 720 \
      --iterations 3 \
      --k 7 \
      --w_len 1 \
      --h_len 1 \
      --replace_flag

    # Run in replacement mode
    python generate.py --task animate-14B --ckpt_dir ./Wan2.2-Animate-14B/ --src_root_path ./examples/wan_animate/replace/process_results/ --refert_num 1 --replace_flag --use_relighting_lora
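
For larger jobs, the upstream Wan2.2 README also describes multi-GPU inference via torchrun with FSDP and Ulysses sequence parallelism. The sketch below assumes those flags (--dit_fsdp, --t5_fsdp, --ulysses_size) are present in your checkout; confirm them with python generate.py --help before relying on them:

    # Assumed multi-GPU text-to-video run on 8 GPUs (verify the flags against your checkout)
    torchrun --nproc_per_node=8 generate.py --task t2v-A14B --size 1280*720 \
      --ckpt_dir ./Wan2.2-T2V-A14B --dit_fsdp --t5_fsdp --ulysses_size 8 \
      --prompt "Two anthropomorphic cats in comfy boxing gear and bright gloves fight intensely on a spotlighted stage."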

Applications and Use Cases

🎬 Film and Entertainment

Create realistic character animations for movies, TV shows, and digital content. Wan Animate enables filmmakers to bring characters to life with precise control over expressions and movements, reducing production costs and time.

🎮 Gaming Industry

Generate dynamic character animations for video games, from cutscenes to in-game character interactions. The technology allows for rapid prototyping and iteration of character behaviors and expressions.

📚 Educational Content

Create engaging educational videos with animated characters that can explain complex concepts. Teachers and content creators can use Wan Animate to make learning materials more interactive and appealing.

🎨 Digital Art and Animation

Artists and animators can use Wan Animate to quickly prototype character animations and explore different movement styles. The technology democratizes high-quality character animation for independent creators.

💼 Corporate Training

Develop training materials with animated characters that can demonstrate procedures and concepts. This approach makes corporate training more engaging and memorable for employees.

🔬 Research and Development

Researchers can use Wan Animate to study human movement patterns, facial expressions, and behavioral modeling. The technology provides a platform for advancing our understanding of human communication and expression.

Frequently Asked Questions

Find answers to common questions about Wan Animate

What is Wan Animate?

Wan Animate is a unified framework for character animation and replacement. It can animate any character based on a performer's video, precisely replicating the performer's facial expressions and movements to generate highly realistic character videos.

How does character replacement work?

Wan Animate can replace characters in a video with animated characters, preserving their expressions and movements while also replicating the original lighting and color tone for seamless environmental integration.

What makes Wan Animate different from other animation tools?

Wan Animate uses spatially-aligned skeleton signals to replicate body motion and implicit facial features extracted from source images to reenact expressions, enabling generation of character videos with high controllability and expressiveness.

Is Wan Animate open source?

Yes, the team is committed to open-sourcing the model weights and source code, making this technology accessible to researchers and developers worldwide.

What are the system requirements?

Wan Animate requires a modern GPU with sufficient VRAM for video processing. The exact requirements depend on the video resolution and length, but a GPU with at least 8GB VRAM is recommended for optimal performance.

Can I use Wan Animate for commercial purposes?

Please refer to the license agreement for specific terms regarding commercial use. The models are typically licensed under Apache 2.0, but you should verify the current licensing terms before commercial deployment.