{"id":4941,"date":"2026-01-19T13:42:22","date_gmt":"2026-01-19T05:42:22","guid":{"rendered":"https:\/\/crepal.ai\/blog\/?p=4941"},"modified":"2026-01-19T13:42:25","modified_gmt":"2026-01-19T05:42:25","slug":"blog-ltx-2-comfyui-workflows-t2v-i2v-v2v","status":"publish","type":"post","link":"https:\/\/crepal.ai\/blog\/aivideo\/blog-ltx-2-comfyui-workflows-t2v-i2v-v2v\/","title":{"rendered":"LTX-2 Workflows in ComfyUI Explained (T2V vs I2V vs V2V)"},"content":{"rendered":"\n<p>Hey, guys, I&#8217;m Dora. I finally gave in. I&#8217;d seen <strong><a href=\"https:\/\/github.com\/Lightricks\/LTX-2\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">LTX-2<\/a><\/strong> demos all over my feed, glossy product pans, moody fantasy shots, that kind of thing. Curiosity won. I blocked off an hour after lunch, opened a fresh project, and told myself: &#8220;Either this becomes part of my weekly workflow\u2026 or it goes into the graveyard of cool-but-forgettable AI toys.&#8221; Not sponsored, just honest results from my tests over Jan 7\u201312, 2026.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"workflow-map-what-each-pipeline-does\">Workflow map: what each pipeline does<\/h2>\n\n\n\n<figure class=\"wp-block-gallery has-nested-images columns-default is-cropped wp-block-gallery-1 is-layout-flex wp-block-gallery-is-layout-flex\">\n<figure class=\"wp-block-image size-large\"><img decoding=\"async\" width=\"1024\" height=\"576\" data-id=\"4943\" data-src=\"https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-47-1024x576.png\" alt=\"\" class=\"wp-image-4943 lazyload\" data-srcset=\"https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-47-1024x576.png 1024w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-47-300x169.png 300w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-47-768x432.png 768w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-47-18x10.png 18w, 
https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-47.png 1280w\" data-sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" src=\"data:image\/svg+xml;base64,PHN2ZyB3aWR0aD0iMSIgaGVpZ2h0PSIxIiB4bWxucz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciPjwvc3ZnPg==\" style=\"--smush-placeholder-width: 1024px; --smush-placeholder-aspect-ratio: 1024\/576;\" \/><\/figure>\n<\/figure>\n\n\n\n<p>LTX-2 has three main pipelines that basically decide how you even approach your clips before you touch a single knob. Think of them like ingredients: T2V is flour, I2V is butter, V2V is icing\u2014you combine them right, and voila.<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>T2V (text-to-video): <\/strong>Start from a prompt, get a video. It&#8217;s the fastest way to brainstorm and explore styles. If you need a vibe board or a quick &#8220;what could this look like?&#8221; pass, this is home base.<\/li>\n\n\n\n<li><strong>I2V (image-to-video):<\/strong> Feed it a key image and guide motion on top. When you need brand consistency (logos, packaging, character faces), I2V holds the shape better than T2V.<\/li>\n\n\n\n<li><strong>V2V (video-to-video):<\/strong> Take an existing clip and transform it. Think style transfer, cleanup, or changing mood without re-blocking the whole scene.<\/li>\n<\/ul>\n\n\n\n<p>In practice, I bounce between them: sketch in T2V, lock visuals in I2V, then refine timing and style in V2V. It feels like moving from a pencil sketch to inks to color. 
For anyone wanting to explore beyond the UI, <strong><a href=\"https:\/\/github.com\/Lightricks\/ComfyUI-LTXVideo\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">ComfyUI LTXVideo<\/a><\/strong> is a handy resource for building custom pipelines.<\/p>\n\n\n\n<figure class=\"wp-block-gallery has-nested-images columns-default is-cropped wp-block-gallery-2 is-layout-flex wp-block-gallery-is-layout-flex\">\n<figure class=\"wp-block-image size-full\"><img decoding=\"async\" width=\"682\" height=\"522\" data-id=\"4944\" data-src=\"https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-48.png\" alt=\"\" class=\"wp-image-4944 lazyload\" data-srcset=\"https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-48.png 682w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-48-300x230.png 300w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-48-16x12.png 16w\" data-sizes=\"auto, (max-width: 682px) 100vw, 682px\" src=\"data:image\/svg+xml;base64,PHN2ZyB3aWR0aD0iMSIgaGVpZ2h0PSIxIiB4bWxucz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciPjwvc3ZnPg==\" style=\"--smush-placeholder-width: 682px; --smush-placeholder-aspect-ratio: 682\/522;\" \/><\/figure>\n<\/figure>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"t2v-best-for-ideation-stylized-scenes\">T2V: best for ideation &amp; stylized scenes<\/h2>\n\n\n\n<p>My first run on Jan 7 was a 6-second T2V: <em>&#8220;golden hour slow dolly through a tiny plant shop, cinematic bloom, soft dust motes.&#8221;<\/em> About 45 seconds to render at 720p\u2014not bad. The result? Surprisingly cohesive light, believable camera drift. Sure, the plants were generic\u2026 <strong>but honestly, I didn\u2019t care. 
Mood = nailed.<\/strong><\/p>\n\n\n\n<h3 class=\"wp-block-heading\" id=\"where-t2v-shines\">Where T2V shines<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Fast ideation: I generated five variations in under 10 minutes and picked two to push forward.<\/li>\n\n\n\n<li>Style-first scenes: dreamy, painterly, anime, cyberpunk; T2V leans into look and lighting.<\/li>\n\n\n\n<li>Camera language: Prompts like &#8220;handheld micro-jitter,&#8221; &#8220;locked-off tripod,&#8221; or &#8220;slow crane up&#8221; worked more often than I expected.<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\" id=\"where-it-stumbles\">Where it stumbles<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Consistent subjects: Characters morph, labels wobble. For product or brand work, it&#8217;s rarely final.<\/li>\n\n\n\n<li>Specific text: Any on-screen typography is hit-or-miss. Sometimes clever, often mushy.<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\" id=\"tiny-prompt-tip\">Tiny prompt tip<\/h3>\n\n\n\n<p>I got better motion control by adding a simple beat: &#8220;slow push in for 6 seconds, start wide, end medium.&#8221; Also, naming lenses (35mm vs 85mm) shaped depth of field in a way that felt real.<\/p>\n\n\n\n<p>When I clicked &#8220;seed lock&#8221; and re-ran with minor prompt changes, I kept the composition while exploring lighting. That saved me minutes per iteration.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"i2v-best-for-consistency-product-shots\">I2V: best for consistency &amp; product shots<\/h2>\n\n\n\n<p>On Jan 9, I tried I2V with a clean PNG of a matte-black bottle\u2026 And guess what? The bottle stayed on-model across three renders, label intact. I almost high-fived my laptop.
That alone made I2V my go-to for product shots\u2014brand consistency feels <em>so<\/em> good.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\" id=\"where-i2v-shines\">Where I2V shines<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Consistency: Faces, logos, and packaging show less drift than in T2V.<\/li>\n\n\n\n<li>Controlled hero shots: Add a tiny rotation or parallax and keep everything else steady.<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\" id=\"where-it-stumbles\">Where it stumbles<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Overly complex motion: If you push big arcs or whip pans, the base image can smear.<\/li>\n\n\n\n<li>Background synthesis: Without a clean plate or prompt, the backdrop can feel AI-generic.<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\" id=\"practical-setup\">Practical setup<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Start with a high-res, well-lit image. Garbage in, garbage out is very real here.<\/li>\n\n\n\n<li>Use short durations first (3\u20135s). Lock the look, then extend.<\/li>\n<\/ul>\n\n\n\n<p>In my tests, 1080p I2V runs took ~60\u201390s each. Worth it, because I didn&#8217;t have to fight brand drift later.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"v2v-best-for-style-transfer-edits\">V2V: best for style transfer &amp; edits<\/h2>\n\n\n\n<p>On Jan 10, I fed a 7-second iPhone clip of a latte pour (4K, 30fps) into V2V and asked for &#8220;soft film stock, grain, gentle halation, 1990s caf\u00e9 tone.&#8221; The result kept the timing, the foam swirls, and the hand motion, and just wrapped it all in a nostalgic look.
It felt like color grading plus light re-interpretation.<\/p>\n\n\n\n<figure class=\"wp-block-video\"><video height=\"1080\" style=\"aspect-ratio: 1920 \/ 1080;\" width=\"1920\" controls src=\"https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/3fb6c97bf8881d8a0139bbd5b29106b3869b667b6c7dfae3de59c16cb555e734.mp4\"><\/video><\/figure>\n\n\n\n<h3 class=\"wp-block-heading\" id=\"where-v2v-shines\">Where V2V shines<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Style transfer without losing timing\/blocking.<\/li>\n\n\n\n<li>Cleanup for social: remove harsh digital edges, add filmic motion blur, unify color.<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\" id=\"where-it-stumbles\">Where it stumbles<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Heavy scene changes: Asking for &#8220;nighttime neon rain&#8221; from a sunny clip can break realism.<\/li>\n\n\n\n<li>Fast text overlays: Titles or captions can get softened unless you mask or composite later.<\/li>\n<\/ul>\n\n\n\n<p>Best use I found: Take a so-so clip and make it shippable in a few passes. It&#8217;s the closest to &#8220;AI-as-finisher.&#8221;<\/p>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"which-workflow-to-pick-by-scenario\">Which workflow to pick by scenario<\/h2>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>Moodboards or story seeds:<\/strong> T2V. Prompt broad, pick a favorite, then refine. If you&#8217;re pitching, it&#8217;s magic for quick looks.<\/li>\n\n\n\n<li><strong>Brand\/product spots:<\/strong> I2V. Start with approved art or keyframe renders. Keep the hero on-model and layer gentle motion.<\/li>\n\n\n\n<li><strong>Social polish or style match:<\/strong> V2V. 
Bring your own timing and let LTX-2 restyle it.<\/li>\n\n\n\n<li><strong>Character continuity across scenes: <\/strong>I2V first (lock face), then V2V to match shots into a sequence.<\/li>\n\n\n\n<li><strong>Rapid experimentation with camera moves:<\/strong> T2V with seed lock, then swap lenses\/lighting terms.<\/li>\n<\/ul>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"parameter-differences-across-workflows\">Parameter differences across workflows<\/h2>\n\n\n\n<p>Every pipeline exposes a similar spine of controls, but they behave a bit differently.<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>Guidance\/CFG:<\/strong> In T2V, higher means stronger adherence to the prompt but more artifacts. I liked 5\u20137 for stylized looks, 3\u20134 for natural scenes. In I2V, higher can warp the subject; keep it conservative (3\u20135). In V2V, think of it as style force: too high and you&#8217;ll erase the original charm.<\/li>\n\n\n\n<li><strong>Motion strength: <\/strong>T2V uses it to decide how bold the camera\/scene movement should be. In I2V, it&#8217;s literally &#8220;don&#8217;t break my subject&#8221;; I stayed in the 0.2\u20130.4 range for product spins. In V2V, 0.4\u20130.6 added life without jitter.<\/li>\n\n\n\n<li><strong>Duration &amp; fps:<\/strong> Short clips (3\u20136s) kept coherence best across all three. 24fps looked most natural. 30fps is fine for phone footage in V2V.<\/li>\n\n\n\n<li><strong>Seed: <\/strong>Lock it when you like framing. Change it when you&#8217;re stuck in a weird local minimum (you&#8217;ll know it when you see the same odd face three times in a row).<\/li>\n\n\n\n<li><strong>Resolution: <\/strong>720p drafts are fast. 1080p is the sweet spot for previews. If there&#8217;s a 4K upscale option, use it at the end rather than generating at 4K from scratch.<\/li>\n<\/ul>\n\n\n\n<p>I timed a mini workflow on Jan 12: T2V ideation (5 clips, ~40s each), I2V hero (3 clips, ~75s each), V2V polish (2 clips, ~60s each).
About 12 minutes total to get a 10\u201315s sequence ready for a rough cut.<\/p>\n\n\n\n<figure class=\"wp-block-gallery has-nested-images columns-default is-cropped wp-block-gallery-3 is-layout-flex wp-block-gallery-is-layout-flex\">\n<figure class=\"wp-block-image size-full\"><img decoding=\"async\" width=\"869\" height=\"419\" data-id=\"4945\" data-src=\"https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-49.png\" alt=\"\" class=\"wp-image-4945 lazyload\" data-srcset=\"https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-49.png 869w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-49-300x145.png 300w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-49-768x370.png 768w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-49-18x9.png 18w\" data-sizes=\"auto, (max-width: 869px) 100vw, 869px\" src=\"data:image\/svg+xml;base64,PHN2ZyB3aWR0aD0iMSIgaGVpZ2h0PSIxIiB4bWxucz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciPjwvc3ZnPg==\" style=\"--smush-placeholder-width: 869px; --smush-placeholder-aspect-ratio: 869\/419;\" \/><\/figure>\n<\/figure>\n\n\n\n<p>Want to turn these workflow experiments into actual clips without juggling multiple nodes or setups? I personally use <strong>CrePal<\/strong>\u2014our own video tool with audio support\u2014so I can generate, refine, and export content fast \u2192 <a href=\"https:\/\/crepal.ai\/?utm_source=chatgpt.com\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">click here.<\/a><\/p>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"common-pitfalls-per-workflow\">Common pitfalls per workflow<\/h2>\n\n\n\n<p><strong>T2V<\/strong><\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Over-stuffed prompts. Long shopping lists confuse the model. Write like a DP: mood, lens, move, light.<\/li>\n\n\n\n<li>Asking for legible text. 
If you need clean typography, composite it later.<\/li>\n<\/ul>\n\n\n\n<p><strong>I2V<\/strong><\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Low-quality source images. Noise in, noise out. Spend an extra 5 minutes cleaning your key image.<\/li>\n\n\n\n<li>Pushing big camera moves. Keep them subtle or the subject melts.<\/li>\n<\/ul>\n\n\n\n<p><strong>V2V<\/strong><\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Expecting physical impossibilities. If the source clip has flat lighting, don&#8217;t demand dramatic spotlights without re-shooting.<\/li>\n\n\n\n<li>Crushing details with heavy style force. Dial guidance back until skin, fabric, or foliage textures return.<\/li>\n<\/ul>\n\n\n\n<figure class=\"wp-block-gallery has-nested-images columns-default is-cropped wp-block-gallery-4 is-layout-flex wp-block-gallery-is-layout-flex\">\n<figure class=\"wp-block-image size-large\"><img decoding=\"async\" width=\"1024\" height=\"559\" data-id=\"4946\" data-src=\"https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-50-1024x559.png\" alt=\"\" class=\"wp-image-4946 lazyload\" data-srcset=\"https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-50-1024x559.png 1024w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-50-300x164.png 300w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-50-768x419.png 768w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-50-18x10.png 18w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-50.png 1408w\" data-sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" src=\"data:image\/svg+xml;base64,PHN2ZyB3aWR0aD0iMSIgaGVpZ2h0PSIxIiB4bWxucz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciPjwvc3ZnPg==\" style=\"--smush-placeholder-width: 1024px; --smush-placeholder-aspect-ratio: 1024\/559;\" \/><\/figure>\n<\/figure>\n\n\n\n<p>One more general tip: Save versions. Jan 11, I lost a killer bottle highlight because I overwrote a seed. 
<strong>Rookie mistake, don\u2019t be me.<\/strong><\/p>\n\n\n\n<p>If you&#8217;re curious whether LTX-2 can be a real partner in your workflow: yes, with the right lane for the right job. T2V to spark, I2V to lock, V2V to finish. For deeper dives, check the official <strong><a href=\"https:\/\/www.lightricks.com\/ltxv-documentation\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">LTX Video documentation<\/a><\/strong> for tips on workflow, nodes, and settings.<\/p>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\" \/>\n\n\n\n<p><strong>Previous posts:<\/strong><\/p>\n\n\n\n<figure class=\"wp-block-embed is-type-wp-embed is-provider-crepal-content-center wp-block-embed-crepal-content-center\"><div class=\"wp-block-embed__wrapper\">\n<blockquote class=\"wp-embedded-content\" data-secret=\"PzCAYwIKU1\"><a href=\"https:\/\/crepal.ai\/blog\/aivideo\/blog-ltx-2-free-online-2026\/\">LTX-2 Free Online: Generate AI Videos Without Local Setup<\/a><\/blockquote><iframe class=\"wp-embedded-content lazyload\" sandbox=\"allow-scripts\" security=\"restricted\" style=\"position: absolute; visibility: hidden;\" title=\"\u300a LTX-2 Free Online: Generate AI Videos Without Local Setup \u300b\u2014CrePal Content Center\" data-src=\"https:\/\/crepal.ai\/blog\/aivideo\/blog-ltx-2-free-online-2026\/embed\/#?secret=HIZQAxUW1m#?secret=PzCAYwIKU1\" data-secret=\"PzCAYwIKU1\" width=\"600\" height=\"338\" frameborder=\"0\" marginwidth=\"0\" marginheight=\"0\" scrolling=\"no\" src=\"data:image\/svg+xml;base64,PHN2ZyB3aWR0aD0iMSIgaGVpZ2h0PSIxIiB4bWxucz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciPjwvc3ZnPg==\" data-load-mode=\"1\"><\/iframe>\n<\/div><\/figure>\n\n\n\n<figure class=\"wp-block-embed is-type-wp-embed is-provider-crepal-content-center wp-block-embed-crepal-content-center\"><div class=\"wp-block-embed__wrapper\">\n<blockquote class=\"wp-embedded-content\" data-secret=\"UHWcY0L5Wp\"><a href=\"https:\/\/crepal.ai\/blog\/aivideo\/blog-ltx-2-vram-requirements\/\">LTX-2 
VRAM Requirements: Can It Run on 8GB \/ 12GB \/ 24GB GPUs?<\/a><\/blockquote><iframe class=\"wp-embedded-content lazyload\" sandbox=\"allow-scripts\" security=\"restricted\" style=\"position: absolute; visibility: hidden;\" title=\"\u300a LTX-2 VRAM Requirements: Can It Run on 8GB \/ 12GB \/ 24GB GPUs? \u300b\u2014CrePal Content Center\" data-src=\"https:\/\/crepal.ai\/blog\/aivideo\/blog-ltx-2-vram-requirements\/embed\/#?secret=GJpu7RCcHA#?secret=UHWcY0L5Wp\" data-secret=\"UHWcY0L5Wp\" width=\"600\" height=\"338\" frameborder=\"0\" marginwidth=\"0\" marginheight=\"0\" scrolling=\"no\" src=\"data:image\/svg+xml;base64,PHN2ZyB3aWR0aD0iMSIgaGVpZ2h0PSIxIiB4bWxucz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciPjwvc3ZnPg==\" data-load-mode=\"1\"><\/iframe>\n<\/div><\/figure>\n\n\n\n<figure class=\"wp-block-embed is-type-wp-embed is-provider-crepal-content-center wp-block-embed-crepal-content-center\"><div class=\"wp-block-embed__wrapper\">\n<blockquote class=\"wp-embedded-content\" data-secret=\"qGULQKHrCW\"><a href=\"https:\/\/crepal.ai\/blog\/aivideo\/blog-ltx-2-audio-generation-sync-sound\/\">LTX-2 Audio Generation: How to Create Video with Synchronized Sound<\/a><\/blockquote><iframe class=\"wp-embedded-content lazyload\" sandbox=\"allow-scripts\" security=\"restricted\" style=\"position: absolute; visibility: hidden;\" title=\"\u300a LTX-2 Audio Generation: How to Create Video with Synchronized Sound \u300b\u2014CrePal Content Center\" data-src=\"https:\/\/crepal.ai\/blog\/aivideo\/blog-ltx-2-audio-generation-sync-sound\/embed\/#?secret=wL23gTIJkq#?secret=qGULQKHrCW\" data-secret=\"qGULQKHrCW\" width=\"600\" height=\"338\" frameborder=\"0\" marginwidth=\"0\" marginheight=\"0\" scrolling=\"no\" src=\"data:image\/svg+xml;base64,PHN2ZyB3aWR0aD0iMSIgaGVpZ2h0PSIxIiB4bWxucz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciPjwvc3ZnPg==\" data-load-mode=\"1\"><\/iframe>\n<\/div><\/figure>\n","protected":false},"excerpt":{"rendered":"<p>Hey, guys, I&#8217;m Dora. I finally gave in. 
I&#8217;d seen LTX-2 demos all over my feed, glossy product pans, moody fantasy shots, that kind of thing. Curiosity won. I blocked off an hour after lunch, opened a fresh project, and told myself: &#8220;Either this becomes part of my weekly workflow\u2026 or it goes into the [&hellip;]<\/p>\n","protected":false},"author":5,"featured_media":4942,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_gspb_post_css":"","_uag_custom_page_level_css":"","footnotes":""},"categories":[8],"tags":[],"class_list":["post-4941","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-aivideo"],"blocksy_meta":[],"uagb_featured_image_src":{"full":["https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-46.png",1376,768,false],"thumbnail":["https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-46-150x150.png",150,150,true],"medium":["https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-46-300x167.png",300,167,true],"medium_large":["https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-46-768x429.png",768,429,true],"large":["https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-46-1024x572.png",1024,572,true],"1536x1536":["https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-46.png",1376,768,false],"2048x2048":["https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-46.png",1376,768,false],"trp-custom-language-flag":["https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-46-18x10.png",18,10,true]},"uagb_author_info":{"display_name":"Dora","author_link":"https:\/\/crepal.ai\/blog\/author\/dora\/"},"uagb_comment_info":9,"uagb_excerpt":"Hey, guys, I&#8217;m Dora. I finally gave in. I&#8217;d seen LTX-2 demos all over my feed, glossy product pans, moody fantasy shots, that kind of thing. Curiosity won. 
I blocked off an hour after lunch, opened a fresh project, and told myself: &#8220;Either this becomes part of my weekly workflow\u2026 or it goes into the&hellip;","_links":{"self":[{"href":"https:\/\/crepal.ai\/blog\/wp-json\/wp\/v2\/posts\/4941","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/crepal.ai\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/crepal.ai\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/crepal.ai\/blog\/wp-json\/wp\/v2\/users\/5"}],"replies":[{"embeddable":true,"href":"https:\/\/crepal.ai\/blog\/wp-json\/wp\/v2\/comments?post=4941"}],"version-history":[{"count":1,"href":"https:\/\/crepal.ai\/blog\/wp-json\/wp\/v2\/posts\/4941\/revisions"}],"predecessor-version":[{"id":4951,"href":"https:\/\/crepal.ai\/blog\/wp-json\/wp\/v2\/posts\/4941\/revisions\/4951"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/crepal.ai\/blog\/wp-json\/wp\/v2\/media\/4942"}],"wp:attachment":[{"href":"https:\/\/crepal.ai\/blog\/wp-json\/wp\/v2\/media?parent=4941"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/crepal.ai\/blog\/wp-json\/wp\/v2\/categories?post=4941"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/crepal.ai\/blog\/wp-json\/wp\/v2\/tags?post=4941"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}