Motion Control
Transfer movements, gestures, and expressions from a reference video to a character image with stable and realistic timing.
Reference video
Reference video defining the motion to transfer. The character's actions, gestures, and expressions in the generated video will follow this reference video.
Accepted formats: MP4, MOV, Matroska • Max size: 100MB • Duration: 3-30 seconds • Must clearly show head, shoulders, and torso
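For illustration only, a minimal pre-upload check of the constraints above could look like the sketch below. The helper name is hypothetical, and `ffprobe` (part of FFmpeg) is assumed to be installed for reading the video duration; this is not an official SDK.

```python
import os
import subprocess

# Constraints stated above (illustrative values, not an official SDK).
VIDEO_EXTENSIONS = {".mp4", ".mov", ".mkv"}   # MP4, MOV, Matroska
MAX_VIDEO_BYTES = 100 * 1024 * 1024           # 100MB
MIN_SECONDS, MAX_SECONDS = 3, 30              # 3-30 seconds

def check_reference_video(path: str) -> list[str]:
    """Return a list of constraint violations for a reference video."""
    problems = []
    if os.path.splitext(path)[1].lower() not in VIDEO_EXTENSIONS:
        problems.append("unsupported format (use MP4, MOV, or Matroska)")
    if os.path.getsize(path) > MAX_VIDEO_BYTES:
        problems.append("file larger than 100MB")
    # Read the container duration with ffprobe (assumes FFmpeg is installed).
    out = subprocess.run(
        ["ffprobe", "-v", "error", "-show_entries", "format=duration",
         "-of", "default=noprint_wrappers=1:nokey=1", path],
        capture_output=True, text=True, check=True,
    )
    duration = float(out.stdout.strip())
    if not MIN_SECONDS <= duration <= MAX_SECONDS:
        problems.append(f"duration {duration:.1f}s outside the 3-30s range")
    return problems
```

Note that whether the video clearly shows head, shoulders, and torso still has to be verified visually; the sketch only covers the mechanical file constraints.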
Reference image
Reference image showing the character. The appearance, backgrounds, and visual elements of the generated video are based on this image. The character must be clearly visible (head, shoulders, and torso).
Accepted formats: JPEG, PNG, JPG • Max size: 10MB • Dimensions: greater than 300px • Aspect ratio: 2:5 to 5:2
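A similar hedged sketch for the image constraints, assuming the Pillow library is installed; it interprets "greater than 300px" as applying to both sides, which is an assumption, and the helper name is again hypothetical.

```python
from pathlib import Path
from PIL import Image  # Pillow, assumed installed: pip install Pillow

IMAGE_EXTENSIONS = {".jpeg", ".jpg", ".png"}
MAX_IMAGE_BYTES = 10 * 1024 * 1024        # 10MB
MIN_SIDE_PX = 300                         # dimensions greater than 300px (assumed: both sides)
MIN_RATIO, MAX_RATIO = 2 / 5, 5 / 2       # aspect ratio between 2:5 and 5:2

def check_reference_image(path: str) -> list[str]:
    """Return a list of constraint violations for a reference image."""
    problems = []
    p = Path(path)
    if p.suffix.lower() not in IMAGE_EXTENSIONS:
        problems.append("unsupported format (use JPEG, PNG, or JPG)")
    if p.stat().st_size > MAX_IMAGE_BYTES:
        problems.append("file larger than 10MB")
    width, height = Image.open(path).size
    if min(width, height) <= MIN_SIDE_PX:
        problems.append("both dimensions should be greater than 300px")
    if not MIN_RATIO <= width / height <= MAX_RATIO:
        problems.append(f"aspect ratio {width}:{height} outside the 2:5 to 5:2 range")
    return problems
```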
Character orientation
Defines the character orientation in the generated video:
- Video: consistent with the reference video (max 30s)
- Image: same orientation as the reference image (max 10s)
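As a purely illustrative way to keep these two options and their duration caps straight in client-side code (the names and fields below are assumptions, not an official schema):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class OrientationMode:
    """One of the two character-orientation options described above."""
    name: str
    description: str
    max_output_seconds: int

# Illustrative representation of the two documented options.
VIDEO_MODE = OrientationMode(
    name="video",
    description="Consistent with the reference video",
    max_output_seconds=30,
)
IMAGE_MODE = OrientationMode(
    name="image",
    description="Same orientation as the reference image",
    max_output_seconds=10,
)
```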
Prompt (optional)
Describe the desired movement or action to improve the generation.
- Maximum 2,500 characters
- Can be left empty
Motion library
Key Features of Motion Control
Discover the advanced capabilities of Motion Control to transfer movements, gestures, and expressions with precision
Perfectly synchronized full-body movements
Motion Control enables precise transfer of full-body movements from a reference video to a character image, keeping posture, movement rhythm, and body coordination tightly synchronized. By reusing the original performance, the system produces natural, stable actions even during large movements or dynamic sequences.
Prompt-controlled scene details
While movements and performance are inherited from the reference video, Motion Control allows scene details to be controlled via prompts. You can define backgrounds, environments, and contextual elements independently, enabling the same performance to be reused across different visual settings and scenarios.
Precise hand performances
Motion Control accurately preserves fine-grained hand and finger movements from the reference video. Subtle gestures such as pointing, grasping, or expressive hand motions are transferred with high fidelity, making this capability especially valuable for presentations, demonstrations, and dialogue-focused content.
How to use Motion Control?
Create professional videos in just 3 simple steps
Reference image
Upload a clear, high-quality image of your character. Make sure the framing matches that of your reference video.
Reference video
Select a video with the movements you want to transfer. Choose clear and natural actions for best results.
Generation
Launch the generation and get your personalized video with movements transferred precisely and naturally.
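A minimal sketch of these three steps in code, reusing the validation helpers sketched earlier; the field names and the final submission step are illustrative assumptions, not the product's actual API.

```python
# A hedged sketch of the three-step workflow; field names are
# illustrative assumptions, not the product's actual API schema.

def build_generation_request(image_path: str, video_path: str,
                             orientation: str = "video",
                             prompt: str = "") -> dict:
    """Validate inputs and assemble a hypothetical generation request."""
    problems = check_reference_image(image_path) + check_reference_video(video_path)
    if len(prompt) > 2500:                      # prompt is optional, max 2,500 characters
        problems.append("prompt longer than 2,500 characters")
    if problems:
        raise ValueError("; ".join(problems))
    return {
        "reference_image": image_path,          # step 1: character image
        "reference_video": video_path,          # step 2: motion to transfer
        "character_orientation": orientation,   # "video" (max 30s) or "image" (max 10s)
        "prompt": prompt,                       # optional scene description
    }

# Step 3: submit the request through whichever client or UI you use,
# then download the generated video once it is ready.
# Placeholder file names; replace with your own assets.
request = build_generation_request("character.png", "performance.mp4")
```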
How to get better results
To achieve stable and high-quality results, it's important to carefully align your reference image and reference video.
Framing match
Keep the character framing consistent between the reference image and reference video. A half-body image should be paired with a half-body video, and a full-body image with a full-body video.
Clear and natural movements
Choose reference videos with clear human actions performed at moderate speed. Avoid overly fast movements, excessive displacement, and abrupt changes of motion.
Sufficient space for large movements
For reference videos including large gestures or full-body actions, the reference image must provide sufficient visual space for the character to move freely.
Real-world use cases of Motion Control
Discover how Motion Control integrates into different professional workflows
Marketing videos and brand spokespersons
Reuse a single human performance across multiple brand characters or spokespersons. Transfer the same movements and expressions to different visual identities.
Product demonstrations and explanations
Gestures, hand movements, and presenter rhythm are preserved while the character's appearance and background can be customized.
AI influencers and virtual creators
Create realistic motion-based content for AI influencers. Real human performances can be mapped onto virtual characters.