{"id":5131,"date":"2026-02-12T09:36:18","date_gmt":"2026-02-12T01:36:18","guid":{"rendered":"https:\/\/crepal.ai\/blog\/?p=5131"},"modified":"2026-02-12T09:36:20","modified_gmt":"2026-02-12T01:36:20","slug":"blog-seedance-2-0-character-consistency","status":"publish","type":"post","link":"https:\/\/crepal.ai\/blog\/aivideo\/blog-seedance-2-0-character-consistency\/","title":{"rendered":"Seedance 2.0 Character Consistency: How to Stop Identity Drift Across Scenes"},"content":{"rendered":"\n<p>I&#8217;m Dora. I was testing <strong><a href=\"https:\/\/dreamina.capcut.com\/resource\/how-to-use-seedance-2-0\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">Seedance 2.0 <\/a><\/strong>on a short character-driven promo: a barista named Mara who wears a green beanie, has a small nose ring, and gestures a lot with her left hand. At frame 12 she looked like Mara. By frame 67 she still had the beanie, but the nose ring vanished and her left hand had become a right hand. I laughed, then rewound, then frowned. 
I like shiny new features, but nothing kills a short animation&#8217;s suspension of disbelief faster than a character who quietly reinvents themselves mid-scene.<\/p>\n\n\n\n<p>Over the next three weeks I ran a small experiment (notes below) to understand why <strong>Seedance 2.0<\/strong>, an otherwise capable generative animation tool, lets characters &#8220;wander.&#8221; This article is my field report: the drift patterns I found, practical reference hygiene tips, prompt anchors that work, a repeatable multi-shot workflow, and a short recovery plan when drift already happened.<\/p>\n\n\n\n<figure class=\"wp-block-gallery has-nested-images columns-default is-cropped wp-block-gallery-1 is-layout-flex wp-block-gallery-is-layout-flex\">\n<figure class=\"wp-block-image size-large\"><img decoding=\"async\" width=\"1024\" height=\"504\" data-id=\"5133\" data-src=\"https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/02\/image-40-1024x504.png\" alt=\"\" class=\"wp-image-5133 lazyload\" data-srcset=\"https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/02\/image-40-1024x504.png 1024w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/02\/image-40-300x148.png 300w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/02\/image-40-768x378.png 768w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/02\/image-40-1536x756.png 1536w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/02\/image-40-18x9.png 18w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/02\/image-40.png 1913w\" data-sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" src=\"data:image\/svg+xml;base64,PHN2ZyB3aWR0aD0iMSIgaGVpZ2h0PSIxIiB4bWxucz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciPjwvc3ZnPg==\" style=\"--smush-placeholder-width: 1024px; --smush-placeholder-aspect-ratio: 1024\/504;\" \/><\/figure>\n<\/figure>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"what-identity-drift-looks-like-in-seedance-2-0\">What \u201cidentity drift\u201d looks like in Seedance 2.0<\/h2>\n\n\n\n<p>When I say 
&#8220;identity drift,&#8221; I mean the small, creeping changes that make a character feel like a different person across frames or shots. It&#8217;s not always dramatic (a missing earring here, a changed hair part there), but cumulatively it breaks the connection.<\/p>\n\n\n\n<p>I logged 42 short sequences while testing. Here are the four drift patterns I spotted most often and how they felt while watching them play out.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\" id=\"4-drift-patterns-you-can-spot-fast\">4 drift patterns you can spot fast<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Feature erosion: Small accessories or distinct marks (scars, piercings, tattoos) disappear first. In my barista test, the nose ring was the first casualty around frame 40.<\/li>\n\n\n\n<li>Pose flip: Hands, dominant side, or gaze flip from left to right between shots. It usually happens when Seedance reinterprets a neutral pose prompt and suddenly mirrors the character.<\/li>\n\n\n\n<li>Stylization shift: Color saturation, line weight, or facial proportions change. One clip started photo-realistic and ended slightly cartoonish by shot 3.<\/li>\n\n\n\n<li>Identity blend: When two reference images or prompts are used, the model sometimes averages features, creating a hybrid that&#8217;s neither reference A nor B. This was most evident when I used two headshots with different lighting.<\/li>\n<\/ul>\n\n\n\n<p>These patterns informed the rest of my approach: prevention first, repair second. I&#8217;ll walk you through how I reduced drift by design, not by luck.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"reference-hygiene-pick-the-right-images-and-fewer-of-them\">Reference hygiene: pick the right images (and fewer of them)<\/h2>\n\n\n\n<p>One thing I underestimated at first: more reference images equals more possible variation. 
Less is often more.<\/p>\n\n\n\n<p>When I tightened my references from 6 images down to 2 consistent images, I reduced noticeable drift in follow-up shots by roughly 60%.<\/p>\n\n\n\n<p>Here&#8217;s how I decide which images to use.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\" id=\"the-3-reference-rules-angle-lighting-wardrobe\">The 3 reference rules: angle, lighting, wardrobe<\/h3>\n\n\n\n<figure class=\"wp-block-gallery has-nested-images columns-default is-cropped wp-block-gallery-2 is-layout-flex wp-block-gallery-is-layout-flex\">\n<figure class=\"wp-block-image size-large\"><img decoding=\"async\" width=\"1024\" height=\"557\" data-id=\"5134\" data-src=\"https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/02\/image-41-1024x557.png\" alt=\"\" class=\"wp-image-5134 lazyload\" data-srcset=\"https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/02\/image-41-1024x557.png 1024w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/02\/image-41-300x163.png 300w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/02\/image-41-768x418.png 768w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/02\/image-41-18x10.png 18w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/02\/image-41.png 1070w\" data-sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" src=\"data:image\/svg+xml;base64,PHN2ZyB3aWR0aD0iMSIgaGVpZ2h0PSIxIiB4bWxucz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciPjwvc3ZnPg==\" style=\"--smush-placeholder-width: 1024px; --smush-placeholder-aspect-ratio: 1024\/557;\" \/><\/figure>\n<\/figure>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Angle: Use two images max and keep angles similar. If you&#8217;re creating a three-quarter head pose, don&#8217;t mix a full-frontal photo with a profile. Seedance tries to reconcile the differences and often invents features.<\/li>\n\n\n\n<li>Lighting: Match the lighting temperature and strength. 
I learned this the annoying way: mixing warm indoor and cool outdoor photos made the tool alternate eye color and skin specular highlights across frames.<\/li>\n\n\n\n<li>Wardrobe: Keep distinctive clothing consistent when it helps identity (the green beanie in my tests). If wardrobe changes are part of the scene, introduce them deliberately in the script, not in the reference set.<\/li>\n<\/ul>\n\n\n\n<p>Practical tip: annotate your reference images. I label mine with angle, lighting, and timestamp (e.g., &#8220;ref1_3q_warm_2026-02-03&#8221;). That makes reruns reproducible and forces me to be selective.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"prompt-anchors-that-reduce-drift\">Prompt anchors that reduce drift<\/h2>\n\n\n\n<p>Prompts are where you steer the model. I treat prompts like a captain&#8217;s log: concise, repeatable, and focused on features that actually matter.<\/p>\n\n\n\n<p>I split prompts into two tiers: immutable anchors (must not change) and flexible details (can vary by scene). Anchors are the glue.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\" id=\"what-to-describe-vs-what-to-omit\">What to describe vs what to omit<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Describe (anchor): gender presentation, dominant accessories (e.g., &#8220;small left nostril ring&#8221;), hair length and color, distinctive scars or tattoos, dominant hand, and a single personality trait that informs expression (e.g., &#8220;wry smile&#8221;). Keep these anchors in every shot prompt.<\/li>\n\n\n\n<li>Omit (or make flexible): micro-gestures, subtle clothing variation, background props, unless they matter to continuity. Omitting reduces the model&#8217;s temptation to re-balance the output.<\/li>\n<\/ul>\n\n\n\n<p>Example anchor line I used repeatedly: &#8220;Character: Mara, female-presenting barista, olive skin, short dark hair with green beanie, small left nostril ring, left-handed, wry smile. 
Keep these features unchanged.&#8221;<\/p>\n\n\n\n<p>A small but effective trick: include a negative anchor for common drift offenders. For instance: &#8220;No mirrored features, no missing piercings.&#8221; Seedance respects explicit negatives surprisingly well in my runs.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"multi-shot-consistency-workflow\">Multi-shot consistency workflow<\/h2>\n\n\n\n<p>If you need a sequence of shots, a repeatable workflow saves time and sanity. I built a four-step pipeline that I use every time I create a multi-shot scene.<\/p>\n\n\n\n<ol start=\"1\" class=\"wp-block-list\">\n<li>Fixed character block: create a single, compact block of text that contains all immutable anchors and a thumbnail reference index (e.g., ref1\/ref2 with short notes). Save it as your &#8220;Character Block.&#8221;<\/li>\n\n\n\n<li>Variable scene block: write scene-specific directions, camera angle, action, lighting tweaks, and emotional beats. Keep it short and pair it with one of the references.<\/li>\n\n\n\n<li>Shot-by-shot ledger: a tiny spreadsheet (I use Google Sheets) with columns: shot ID, scene block, character block (reference to the saved block), seed value, and notes. I record timestamps when I generate each shot.<\/li>\n\n\n\n<li>Consistency pass: regenerate only when you change the character block. If a shot needs a tweak, copy it, change the variable scene block, and keep the character block identical.<\/li>\n<\/ol>\n\n\n\n<p>Why this works: it forces repeatability. 
When I needed to re-render a corrected eye reflection, I didn&#8217;t risk introducing new drift because the character block didn&#8217;t change.<\/p>\n\n\n\n<p>At this stage, the hardest part wasn\u2019t generating new shots \u2014 it was keeping track of what I\u2019d already learned.<\/p>\n\n\n\n<p>At <strong>Crepal<\/strong>, we\u2019re focused on helping creators manage generation tasks, prompts, and results in one place, so each iteration doesn\u2019t start from scratch.<\/p>\n\n\n\n<p>If you\u2019re running multi-shot experiments and feeling that mental overhead, you can see what we\u2019re building at <strong><a href=\"https:\/\/crepal.ai\/?utm_source=chatgpt.com\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">https:\/\/crepal.ai\/<\/a><\/strong>.<\/p>\n\n\n\n<figure class=\"wp-block-gallery has-nested-images columns-default is-cropped wp-block-gallery-3 is-layout-flex wp-block-gallery-is-layout-flex\">\n<figure class=\"wp-block-image size-large\"><img decoding=\"async\" width=\"1024\" height=\"558\" data-id=\"5135\" data-src=\"https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/02\/image-42-1024x558.png\" alt=\"\" class=\"wp-image-5135 lazyload\" data-srcset=\"https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/02\/image-42-1024x558.png 1024w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/02\/image-42-300x164.png 300w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/02\/image-42-768x419.png 768w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/02\/image-42-18x10.png 18w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/02\/image-42.png 1484w\" data-sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" src=\"data:image\/svg+xml;base64,PHN2ZyB3aWR0aD0iMSIgaGVpZ2h0PSIxIiB4bWxucz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciPjwvc3ZnPg==\" style=\"--smush-placeholder-width: 1024px; --smush-placeholder-aspect-ratio: 1024\/558;\" \/><\/figure>\n<\/figure>\n\n\n\n<h2 class=\"wp-block-heading\" 
id=\"recovery-plan-when-drift-already-happened\">Recovery plan when drift already happened<\/h2>\n\n\n\n<p>Sometimes drift sneaks through and you notice it during edit. I try quick, surgical fixes before committing to full re-renders.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\" id=\"recut-mask-or-regenerate\">Recut, mask, or regenerate?<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Recut: If the drift happens in a single shot and the surrounding footage survives, trim or rearrange shots. Often you can hide a minor mismatch with a cutaway.<\/li>\n\n\n\n<li>Mask and composite: For small differences like a missing earring, I export frames and composite the accessory back in (Photoshop or After Effects). This is my go-to when only a tiny visual element broke continuity.<\/li>\n\n\n\n<li>Regenerate with strict anchors: If drift is systemic (affecting face shape, hand dominance, major features), re-render with the character block and stricter seed control. I set a fixed seed for the character block in Seedance to lock the base identity, then allow scene variations.<\/li>\n<\/ul>\n\n\n\n<p>A useful rule of thumb from my practice: if the fix takes less time than re-rendering the whole scene at the same quality, composite. 
Otherwise, regenerate with stricter anchors and a locked seed.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"qa-checklist-before-final-export\">QA checklist before final export<\/h2>\n\n\n\n<figure class=\"wp-block-gallery has-nested-images columns-default is-cropped wp-block-gallery-4 is-layout-flex wp-block-gallery-is-layout-flex\">\n<figure class=\"wp-block-image size-large\"><img decoding=\"async\" width=\"1024\" height=\"550\" data-id=\"5136\" data-src=\"https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/02\/image-43-1024x550.png\" alt=\"\" class=\"wp-image-5136 lazyload\" data-srcset=\"https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/02\/image-43-1024x550.png 1024w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/02\/image-43-300x161.png 300w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/02\/image-43-768x413.png 768w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/02\/image-43-18x10.png 18w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/02\/image-43.png 1172w\" data-sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" src=\"data:image\/svg+xml;base64,PHN2ZyB3aWR0aD0iMSIgaGVpZ2h0PSIxIiB4bWxucz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciPjwvc3ZnPg==\" style=\"--smush-placeholder-width: 1024px; --smush-placeholder-aspect-ratio: 1024\/550;\" \/><\/figure>\n<\/figure>\n\n\n\n<p>Here&#8217;s the quick checklist I run through before I call a sequence done. It takes me 3\u20137 minutes and catches most annoying drifts.<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Anchor Audit: Are all immutable anchors present in each shot prompt? 
(beanie, nostril ring, left-handed marker)<\/li>\n\n\n\n<li>Feature Scan: Scrub quickly for lost accessories, flipped poses, or lighting jumps across cuts.<\/li>\n\n\n\n<li>Seed Consistency: If you used fixed seeds for the character block, confirm they weren&#8217;t overwritten during re-renders.<\/li>\n\n\n\n<li>Cross-shot Lighting: Play a four-shot loop and watch for sudden specular or color shifts.<\/li>\n\n\n\n<li>Micro-expression continuity: Do smiles and gaze shifts feel like the same person reacting across shots?<\/li>\n<\/ul>\n\n\n\n<p>If something fails the checklist, decide: quick composite or regenerate with stricter anchors. I usually annotate the exact frame and issue in my shot ledger so I, or an editor, can act fast.<\/p>\n\n\n\n<p>Final thought: <strong><a href=\"https:\/\/dreamina.capcut.com\/resource\/how-to-use-seedance-2-0\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">Seedance 2.0 is powerful<\/a><\/strong>, but characters are fragile in subtle ways. With tight references, anchor-first prompts, and a small repeatable workflow, you can keep your cast from quietly reinventing themselves. I&#8217;ll be refining these notes as the model updates: if you want my exact character block or the spreadsheet template I used, say the word and I&#8217;ll share a copy. 
No fluff, just the field kit that saved my sanity.<\/p>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\" \/>\n\n\n\n<p><strong>Previous posts:<\/strong><\/p>\n\n\n\n<figure class=\"wp-block-embed is-type-wp-embed is-provider-crepal-content-center wp-block-embed-crepal-content-center\"><div class=\"wp-block-embed__wrapper\">\n<blockquote class=\"wp-embedded-content\" data-secret=\"NIhJZKm3xs\"><a href=\"https:\/\/crepal.ai\/blog\/aivideo\/blog-seedance-2-0-multi-shot-marketing-video\/\">How to Build Multi Shot Marketing Videos With Seedance 2.0 (Without Losing Consistency)<\/a><\/blockquote><iframe class=\"wp-embedded-content lazyload\" sandbox=\"allow-scripts\" security=\"restricted\" style=\"position: absolute; visibility: hidden;\" title=\"\u300a How to Build Multi Shot Marketing Videos With Seedance 2.0 (Without Losing Consistency) \u300b\u2014CrePal Content Center\" data-src=\"https:\/\/crepal.ai\/blog\/aivideo\/blog-seedance-2-0-multi-shot-marketing-video\/embed\/#?secret=L9cF6mjYhB#?secret=NIhJZKm3xs\" data-secret=\"NIhJZKm3xs\" width=\"600\" height=\"338\" frameborder=\"0\" marginwidth=\"0\" marginheight=\"0\" scrolling=\"no\" src=\"data:image\/svg+xml;base64,PHN2ZyB3aWR0aD0iMSIgaGVpZ2h0PSIxIiB4bWxucz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciPjwvc3ZnPg==\" data-load-mode=\"1\"><\/iframe>\n<\/div><\/figure>\n\n\n\n<figure class=\"wp-block-embed is-type-wp-embed is-provider-crepal-content-center wp-block-embed-crepal-content-center\"><div class=\"wp-block-embed__wrapper\">\n<blockquote class=\"wp-embedded-content\" data-secret=\"6jez0b5Lo0\"><a href=\"https:\/\/crepal.ai\/blog\/aivideo\/blog-seedance-2-0-for-beginners\/\">Seedance 2.0 for Beginners: What to Generate (and What to Leave for Editing)<\/a><\/blockquote><iframe class=\"wp-embedded-content lazyload\" sandbox=\"allow-scripts\" security=\"restricted\" style=\"position: absolute; visibility: hidden;\" title=\"\u300a Seedance 2.0 for Beginners: What to Generate (and What to Leave for 
Editing) \u300b\u2014CrePal Content Center\" data-src=\"https:\/\/crepal.ai\/blog\/aivideo\/blog-seedance-2-0-for-beginners\/embed\/#?secret=psAGb7aA6k#?secret=6jez0b5Lo0\" data-secret=\"6jez0b5Lo0\" width=\"600\" height=\"338\" frameborder=\"0\" marginwidth=\"0\" marginheight=\"0\" scrolling=\"no\" src=\"data:image\/svg+xml;base64,PHN2ZyB3aWR0aD0iMSIgaGVpZ2h0PSIxIiB4bWxucz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciPjwvc3ZnPg==\" data-load-mode=\"1\"><\/iframe>\n<\/div><\/figure>\n\n\n\n<figure class=\"wp-block-embed is-type-wp-embed is-provider-crepal-content-center wp-block-embed-crepal-content-center\"><div class=\"wp-block-embed__wrapper\">\n<blockquote class=\"wp-embedded-content\" data-secret=\"X0VVOWdeLn\"><a href=\"https:\/\/crepal.ai\/blog\/aivideo\/blog-seedance-2-0-demo-video-workflow\/\">How to Turn Seedance 2.0 Clips Into a Product Demo Video (End-to-End Workflow)<\/a><\/blockquote><iframe class=\"wp-embedded-content lazyload\" sandbox=\"allow-scripts\" security=\"restricted\" style=\"position: absolute; visibility: hidden;\" title=\"\u300a How to Turn Seedance 2.0 Clips Into a Product Demo Video (End-to-End Workflow) \u300b\u2014CrePal Content Center\" data-src=\"https:\/\/crepal.ai\/blog\/aivideo\/blog-seedance-2-0-demo-video-workflow\/embed\/#?secret=DY1lvnTGlp#?secret=X0VVOWdeLn\" data-secret=\"X0VVOWdeLn\" width=\"600\" height=\"338\" frameborder=\"0\" marginwidth=\"0\" marginheight=\"0\" scrolling=\"no\" src=\"data:image\/svg+xml;base64,PHN2ZyB3aWR0aD0iMSIgaGVpZ2h0PSIxIiB4bWxucz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciPjwvc3ZnPg==\" data-load-mode=\"1\"><\/iframe>\n<\/div><\/figure>\n","protected":false},"excerpt":{"rendered":"<p>I&#8217;m Dora. I was testing Seedance 2.0 on a short character-driven promo: a barista named Mara who wears a green beanie, has a small nose ring, and gestures a lot with her left hand. At frame 12 she looked like Mara. 
By frame 67 she still had the beanie, but the nose ring vanished and [&hellip;]<\/p>\n","protected":false},"author":5,"featured_media":5132,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_gspb_post_css":"","_uag_custom_page_level_css":"","footnotes":""},"categories":[8],"tags":[],"class_list":["post-5131","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-aivideo"],"blocksy_meta":[],"uagb_featured_image_src":{"full":["https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/02\/image-39.png",1280,720,false],"thumbnail":["https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/02\/image-39-150x150.png",150,150,true],"medium":["https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/02\/image-39-300x169.png",300,169,true],"medium_large":["https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/02\/image-39-768x432.png",768,432,true],"large":["https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/02\/image-39-1024x576.png",1024,576,true],"1536x1536":["https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/02\/image-39.png",1280,720,false],"2048x2048":["https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/02\/image-39.png",1280,720,false],"trp-custom-language-flag":["https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/02\/image-39-18x10.png",18,10,true]},"uagb_author_info":{"display_name":"Dora","author_link":"https:\/\/crepal.ai\/blog\/author\/dora\/"},"uagb_comment_info":8,"uagb_excerpt":"I&#8217;m Dora. I was testing Seedance 2.0 on a short character-driven promo: a barista named Mara who wears a green beanie, has a small nose ring, and gestures a lot with her left hand. At frame 12 she looked like Mara. 
By frame 67 she still had the beanie, but the nose ring vanished and&hellip;","_links":{"self":[{"href":"https:\/\/crepal.ai\/blog\/wp-json\/wp\/v2\/posts\/5131","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/crepal.ai\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/crepal.ai\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/crepal.ai\/blog\/wp-json\/wp\/v2\/users\/5"}],"replies":[{"embeddable":true,"href":"https:\/\/crepal.ai\/blog\/wp-json\/wp\/v2\/comments?post=5131"}],"version-history":[{"count":1,"href":"https:\/\/crepal.ai\/blog\/wp-json\/wp\/v2\/posts\/5131\/revisions"}],"predecessor-version":[{"id":5138,"href":"https:\/\/crepal.ai\/blog\/wp-json\/wp\/v2\/posts\/5131\/revisions\/5138"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/crepal.ai\/blog\/wp-json\/wp\/v2\/media\/5132"}],"wp:attachment":[{"href":"https:\/\/crepal.ai\/blog\/wp-json\/wp\/v2\/media?parent=5131"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/crepal.ai\/blog\/wp-json\/wp\/v2\/categories?post=5131"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/crepal.ai\/blog\/wp-json\/wp\/v2\/tags?post=5131"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}