{"id":4961,"date":"2026-01-21T12:17:39","date_gmt":"2026-01-21T04:17:39","guid":{"rendered":"https:\/\/crepal.ai\/blog\/?p=4961"},"modified":"2026-01-21T12:17:41","modified_gmt":"2026-01-21T04:17:41","slug":"blog-ltx-2-prompting-guide","status":"publish","type":"post","link":"https:\/\/crepal.ai\/blog\/aivideo\/blog-ltx-2-prompting-guide\/","title":{"rendered":"LTX-2 Prompting Guide: Motion, Camera Moves, and Cinematic Results"},"content":{"rendered":"\n<p>Hi, I&#8217;m Dora. On January 12, 2026, at 9:40 p.m., I opened <a href=\"https:\/\/github.com\/Lightricks\/LTX-2\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">LTX-2<\/a> &#8220;just for five minutes&#8221; and, honestly, ended up tinkering until midnight. My first clip, a coffee cup on a desk, looked fine until the handle teleported from frame to frame. That tiny glitch sent me down a rabbit hole. This LTX-2 Prompting Guide is basically my field notes from the past week of poking, breaking, and fixing prompts.<\/p>\n\n\n\n<p>If you&#8217;re juggling content, research, or marketing, you don&#8217;t need fluff. 
You need prompts that behave, fast ways to fix jitter and mush, and cues that make LTX-2 feel like a teammate, not another tab you forget.<\/p>\n\n\n\n<figure class=\"wp-block-gallery has-nested-images columns-default is-cropped wp-block-gallery-1 is-layout-flex wp-block-gallery-is-layout-flex\">\n<figure class=\"wp-block-image size-large\"><img decoding=\"async\" width=\"1024\" height=\"549\" data-id=\"4964\" data-src=\"https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-57-1024x549.png\" alt=\"\" class=\"wp-image-4964 lazyload\" data-srcset=\"https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-57-1024x549.png 1024w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-57-300x161.png 300w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-57-768x412.png 768w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-57-1536x823.png 1536w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-57-18x10.png 18w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-57.png 1772w\" data-sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" src=\"data:image\/svg+xml;base64,PHN2ZyB3aWR0aD0iMSIgaGVpZ2h0PSIxIiB4bWxucz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciPjwvc3ZnPg==\" style=\"--smush-placeholder-width: 1024px; --smush-placeholder-aspect-ratio: 1024\/549;\" \/><\/figure>\n<\/figure>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"how-ltx-2-reads-prompts\">How LTX-2 &#8220;reads&#8221; prompts<\/h2>\n\n\n\n<p>Here&#8217;s what clicked for me after generating 27 test clips between Jan 12\u201318, 2026:<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>LTX-2 prioritizes structure first, then style.<\/strong> If you give it a clear subject, action, and constraints, it locks in composition better. 
Style words (cinematic, dreamy) decorate what&#8217;s already defined.<\/li>\n\n\n\n<li><strong>It favors explicit motion verbs.<\/strong> &#8220;Rotate,&#8221; &#8220;pan,&#8221; &#8220;dolly,&#8221; &#8220;track,&#8221; &#8220;zoom,&#8221; &#8220;tilt.&#8221; When I swapped vague language (&#8220;make it dynamic&#8221;) for a concrete move (&#8220;slow dolly-in&#8221;), stability improved.<\/li>\n\n\n\n<li><strong>One sentence per idea helps.<\/strong> Long-winded prompts produced drifting and odd transitions. Short clauses = cleaner motion planning.<\/li>\n\n\n\n<li><strong>Order matters a bit.<\/strong> I got more consistent results with this sequence: Subject \u2192 Action \u2192 Camera \u2192 Lighting \u2192 Lens \u2192 Constraints \u2192 Negatives.<\/li>\n<\/ul>\n\n\n\n<p>Think of the model like a careful DP: it can do a lot, but it loves a shot list.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"motion-verbs-constraints\">Motion verbs &amp; constraints<\/h2>\n\n\n\n<p>On Jan 14, I tested identical scenes with only the motion verb changed. 
Clear verbs reduced wobble by ~30% (rough estimate from side\u2011by\u2011side exports at 24 fps).<\/p>\n\n\n\n<p><strong>Motion verbs that behaved best for me:<\/strong><\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>Dolly-in \/ dolly-out<\/strong> (smooth depth changes)<\/li>\n\n\n\n<li><strong>Slow pan left\/right<\/strong> (gentle lateral movement)<\/li>\n\n\n\n<li><strong>Track forward\/back<\/strong> (follow movement, less jitter than handheld)<\/li>\n\n\n\n<li><strong>Subtle tilt up\/down<\/strong> (pairs well with reveals)<\/li>\n\n\n\n<li><strong>Static lock-off<\/strong> (when you want crisp detail)<\/li>\n<\/ul>\n\n\n\n<p><strong>Helpful constraints:<\/strong><\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>Speed:<\/strong> &#8220;slow,&#8221; &#8220;gentle,&#8221; or &#8220;0.5x speed&#8221;<\/li>\n\n\n\n<li><strong>Duration hints:<\/strong> &#8220;5-second shot,&#8221; &#8220;8-second loop&#8221;<\/li>\n\n\n\n<li><strong>Composition locks:<\/strong> &#8220;center-framed,&#8221; &#8220;profile view,&#8221; &#8220;wide establishing shot,&#8221; &#8220;macro close-up&#8221;<\/li>\n<\/ul>\n\n\n\n<p><strong>Example that worked:<\/strong> &#8220;Center-framed subject, slow dolly-in, 5-second shot, gentle camera, keep background stable.&#8221;<\/p>\n\n\n\n<p>When I got greedy (fast pan + fast zoom + handheld), artifacts spiked. 
One clean move beats three messy ones.<\/p>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\" \/>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"camera-prompts-pan-zoom-dolly-handheld\">Camera prompts (pan\/zoom\/dolly\/handheld)<\/h2>\n\n\n\n<figure class=\"wp-block-gallery has-nested-images columns-default is-cropped wp-block-gallery-2 is-layout-flex wp-block-gallery-is-layout-flex\">\n<figure class=\"wp-block-image size-large\"><img decoding=\"async\" width=\"1024\" height=\"576\" data-id=\"4965\" data-src=\"https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-58-1024x576.png\" alt=\"\" class=\"wp-image-4965 lazyload\" data-srcset=\"https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-58-1024x576.png 1024w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-58-300x169.png 300w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-58-768x432.png 768w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-58-18x10.png 18w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-58.png 1280w\" data-sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" src=\"data:image\/svg+xml;base64,PHN2ZyB3aWR0aD0iMSIgaGVpZ2h0PSIxIiB4bWxucz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciPjwvc3ZnPg==\" style=\"--smush-placeholder-width: 1024px; --smush-placeholder-aspect-ratio: 1024\/576;\" \/><\/figure>\n<\/figure>\n\n\n\n<p>I didn&#8217;t expect camera words to matter this much, but they do.<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>Pan:<\/strong> &#8220;slow pan right across the skyline&#8221; gave me straighter horizons than &#8220;move across the skyline.&#8221;<\/li>\n\n\n\n<li><strong>Zoom:<\/strong> Digital zoom cues can hunt focus. I got better results with &#8220;dolly-in&#8221; when I wanted that zoom feel, unless I explicitly needed a zoom aesthetic.<\/li>\n\n\n\n<li><strong>Dolly\/Track:<\/strong> Best choice for cinematic reveals. 
According to <a href=\"https:\/\/www.studiobinder.com\/blog\/dolly-shot-camera-movements\/\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">StudioBinder&#8217;s dolly shot guide<\/a>, a slow dolly-in builds emotional intimacy; the prompt &#8220;Slow dolly-in toward the subject&#8221; also kept geometry stable in my tests.<\/li>\n\n\n\n<li><strong>Handheld:<\/strong> Use sparingly. &#8220;Subtle handheld, micro\u2011shakes only&#8221; can sell realism, but anything stronger turned into jelly edges in my Jan 16 tests.<\/li>\n<\/ul>\n\n\n\n<p><strong>Two reliable patterns:<\/strong><\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>&#8220;Static camera, subject motion only&#8221; for tutorials or product shots.<\/li>\n\n\n\n<li>&#8220;Slow dolly-in, no camera roll&#8221; for interviews or hero shots.<\/li>\n<\/ul>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"lighting-lens-cues-for-realism\">Lighting &amp; lens cues for realism<\/h2>\n\n\n\n<p>Lighting and lens language stopped my clips from looking\u2026 AI-ish.<\/p>\n\n\n\n<p><strong>Lighting cues that behaved:<\/strong><\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>&#8220;Soft north\u2011window light, gentle falloff&#8221; for natural interiors<\/li>\n\n\n\n<li>&#8220;Golden hour rim light, long shadows&#8221; for outdoor warmth<\/li>\n\n\n\n<li>&#8220;Overcast, diffuse light, low contrast&#8221; to kill harsh edges<\/li>\n\n\n\n<li>&#8220;Practical lamp as key, warm tungsten, slight spill&#8221; for cozy scenes<\/li>\n<\/ul>\n\n\n\n<p><strong>Lens &amp; exposure cues:<\/strong><\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>&#8220;35mm lens, f\/2.8, shallow depth of field&#8221; for people\/products<\/li>\n\n\n\n<li>&#8220;85mm portrait lens, bokeh, crisp eyes&#8221; for talking heads<\/li>\n\n\n\n<li>&#8220;16mm wide, deep focus, minimal distortion&#8221; for rooms\/landscapes<\/li>\n\n\n\n<li>&#8220;Neutral color grade, soft contrast, no extreme saturation&#8221; when I needed clean skin tones<\/li>\n<\/ul>\n\n\n\n<p><strong>Analogy that 
helped me:<\/strong> the prompt is your shot list; lens and light are your mood board.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"negative-prompts-that-reduce-artifacts\">Negative prompts that reduce artifacts<\/h2>\n\n\n\n<p>If you only adopt one habit, make it this: add a light set of honest negatives. On Jan 17, I cut visible glitches in half by specifying what I didn&#8217;t want.<\/p>\n\n\n\n<p><strong>Negatives I reuse:<\/strong><\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>&#8220;no extra limbs, no face warp, no object duplication&#8221;<\/li>\n\n\n\n<li>&#8220;no text artifacts, no floating logos, no watermark&#8221;<\/li>\n\n\n\n<li>&#8220;no extreme motion blur, no rolling shutter wobble&#8221;<\/li>\n\n\n\n<li>&#8220;no flicker, no frame-to-frame texture shift&#8221;<\/li>\n\n\n\n<li>&#8220;no Dutch angle, no rapid handheld, keep horizon level&#8221;<\/li>\n<\/ul>\n\n\n\n<p>Keep negatives short. When I stacked 20 of them, results got\u2026 weird. Five to eight targeted lines worked best.<\/p>\n\n\n\n<p>When I iterated on prompt shapes and motion cues, I used our tool <strong>Crepal<\/strong> to manage and preview multiple outputs in a unified interface \u2014 it makes running batches and comparing result variants easier as part of a creative workflow we use ourselves.<\/p>\n\n\n\n<p> \u27a1\ufe0f <strong><a href=\"https:\/\/crepal.ai\/?utm_source=chatgpt.com\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">Try Crepal here<\/a><\/strong><\/p>\n\n\n\n<figure class=\"wp-block-gallery has-nested-images columns-default is-cropped wp-block-gallery-3 is-layout-flex wp-block-gallery-is-layout-flex\">\n<figure class=\"wp-block-image size-large\"><img decoding=\"async\" width=\"1024\" height=\"679\" data-id=\"4966\" data-src=\"https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-59-1024x679.png\" alt=\"\" class=\"wp-image-4966 lazyload\" data-srcset=\"https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-59-1024x679.png 1024w, 
https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-59-300x199.png 300w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-59-768x509.png 768w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-59-18x12.png 18w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-59.png 1270w\" data-sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" src=\"data:image\/svg+xml;base64,PHN2ZyB3aWR0aD0iMSIgaGVpZ2h0PSIxIiB4bWxucz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciPjwvc3ZnPg==\" style=\"--smush-placeholder-width: 1024px; --smush-placeholder-aspect-ratio: 1024\/679;\" \/><\/figure>\n<\/figure>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\" \/>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"30-copy-paste-prompt-templates-by-category\">30 copy-paste prompt templates (by category)<\/h2>\n\n\n\n<p>Not sponsored, no affiliates. These are the exact shapes I use. Replace the [brackets]. I ran each on Jan 15\u201318, 2026.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\" id=\"product-demos-5\">Product demos (5)<\/h3>\n\n\n\n<ol start=\"1\" class=\"wp-block-list\">\n<li>&#8220;[Product] on matte table, static camera, soft north\u2011window light, 35mm f\/2.8, slow parallax from subject motion only, no text artifacts.&#8221;<\/li>\n\n\n\n<li>&#8220;Slow dolly\u2011in to [product] unboxing, hands enter frame from right, crisp edges, neutral grade, no flicker.&#8221;<\/li>\n\n\n\n<li>&#8220;[Product] rotating 360\u00b0 turntable, black velvet backdrop, edge light, minimal reflections, no object duplication.&#8221;<\/li>\n\n\n\n<li>&#8220;Overhead top\u2011down of [product] with label placeholders, static lock\u2011off, overcast lighting look, no watermark.&#8221;<\/li>\n\n\n\n<li>&#8220;Lifestyle: [person] uses [product] by window, gentle handheld micro\u2011shake, golden hour rim, keep horizon level, no face warp.&#8221;<\/li>\n<\/ol>\n\n\n\n<h3 class=\"wp-block-heading\" id=\"people-interviews-5\">People &amp; interviews 
(5)<\/h3>\n\n\n\n<ol start=\"1\" class=\"wp-block-list\">\n<li>&#8220;Center\u2011framed [person] talking, slow dolly\u2011in, 85mm portrait, f\/2, clean skin tones, no extreme saturation, no Dutch angle.&#8221;<\/li>\n\n\n\n<li>&#8220;Two\u2011shot conversation, alternating over\u2011the\u2011shoulder, 35mm, soft key + fill, subtle background bokeh, no jitter.&#8221;<\/li>\n\n\n\n<li>&#8220;[person] walks toward camera, track backward at slow pace, stable background, no rolling shutter wobble.&#8221;<\/li>\n\n\n\n<li>&#8220;Profile interview, static camera, neutral grade, eyes tack sharp, no face morphing.&#8221;<\/li>\n\n\n\n<li>&#8220;Close\u2011up hands typing on [device], macro look, shallow DOF, controlled speculars, no text artifacts.&#8221;<\/li>\n<\/ol>\n\n\n\n<h3 class=\"wp-block-heading\" id=\"b-roll-transitions-5\">B\u2011roll &amp; transitions (5)<\/h3>\n\n\n\n<ol start=\"1\" class=\"wp-block-list\">\n<li>&#8220;City skyline, slow pan right, golden hour, long shadows, no flicker, no horizon roll.&#8221;<\/li>\n\n\n\n<li>&#8220;Office desk details: pens, notebook, coffee steam, static lock\u2011off, overcast light, subtle steam motion only.&#8221;<\/li>\n\n\n\n<li>&#8220;Rack focus from foreground plant to background laptop, 35mm, smooth focus pull, no focus hunting.&#8221;<\/li>\n\n\n\n<li>&#8220;Drone\u2011style reveal, slow tilt up from ground to skyline, stable motion, no rapid handheld.&#8221;<\/li>\n\n\n\n<li>&#8220;Match\u2011cut style: hand closes notebook, gentle dolly\u2011in, neutral color, no motion blur.&#8221;<\/li>\n<\/ol>\n\n\n\n<h3 class=\"wp-block-heading\" id=\"education-explainers-5\">Education &amp; explainers (5)<\/h3>\n\n\n\n<ol start=\"1\" class=\"wp-block-list\">\n<li>&#8220;White backdrop, floating parts of [concept] assemble, static camera, soft shadow, no text, no watermark.&#8221;<\/li>\n\n\n\n<li>&#8220;Chalkboard vibe: hand draws diagram, time\u2011lapse feel, lock\u2011off, high contrast edges, no 
jitter.&#8221;<\/li>\n\n\n\n<li>&#8220;Timeline of [topic], card\u2011style elements slide in, minimal parallax, no flicker, no overlapping geometry.&#8221;<\/li>\n\n\n\n<li>&#8220;Screen\u2011style demo of [workflow], over\u2011the\u2011shoulder, 35mm, even lighting, no UI gibberish.&#8221;<\/li>\n\n\n\n<li>&#8220;Metaphor shot: domino chain triggers [outcome], static camera, deep focus, no duplication.&#8221;<\/li>\n<\/ol>\n\n\n\n<h3 class=\"wp-block-heading\" id=\"ads-socials-5\">Ads &amp; socials (5)<\/h3>\n\n\n\n<ol start=\"1\" class=\"wp-block-list\">\n<li>&#8220;Punchy 5\u2011second hook for [offer], fast cut feel but single shot, slow push\u2011in, high contrast, no over\u2011saturation.&#8221;<\/li>\n\n\n\n<li>&#8220;UGC style: selfie\u2011like framing, subtle handheld, window light, honest vibe, no beauty filter artifacts.&#8221;<\/li>\n\n\n\n<li>&#8220;Stop\u2011scroll macro of [texture], 50mm macro, raking light, crisp detail, no mushy surfaces.&#8221;<\/li>\n\n\n\n<li>&#8220;Before\/after of [result], split frame, static camera, consistent lighting, no cross\u2011fade artifacts.&#8221;<\/li>\n\n\n\n<li>&#8220;Loopable 3\u2011second cinemagraph of [scene], one element moving (steam\/water), no extra motion.&#8221;<\/li>\n<\/ol>\n\n\n\n<h3 class=\"wp-block-heading\" id=\"nature-travel-5\">Nature &amp; travel (5)<\/h3>\n\n\n\n<ol start=\"1\" class=\"wp-block-list\">\n<li>&#8220;Forest path, slow track forward, overcast diffuse light, deep focus, no flicker.&#8221;<\/li>\n\n\n\n<li>&#8220;Ocean waves at sunrise, static lock\u2011off, long exposure feel, gentle motion blur only, no jitter.&#8221;<\/li>\n\n\n\n<li>&#8220;Mountain vista, slow pan left, 16mm wide, low contrast grade, horizon locked.&#8221;<\/li>\n\n\n\n<li>&#8220;Close\u2011up flower in breeze, macro, shallow DOF, natural sway, no duplication of petals.&#8221;<\/li>\n\n\n\n<li>&#8220;Night city rain, neon reflections, static camera, controlled highlights, no 
banding.&#8221;<\/li>\n<\/ol>\n\n\n\n<p><strong>Tip:<\/strong> keep a personal library. I tag mine by motion type (dolly, pan, lock\u2011off) so I can swap subjects fast.<\/p>\n\n\n\n<p>For more details on LTX-2&#8217;s technical capabilities, check out the <a href=\"https:\/\/ltx-2.ai\/\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">official LTX-2 documentation<\/a> or explore the <a href=\"https:\/\/ltx.video\/blog\/how-to-prompt-for-ltx-2\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">LTX-2 prompting guide on Lightricks<\/a>.<\/p>\n\n\n\n<figure class=\"wp-block-image\"><img decoding=\"async\" src=\"https:\/\/pcnztpnyu4sh.feishu.cn\/space\/api\/box\/stream\/download\/asynccode\/?code=OGNjYjA0ZTZhMzNmNmE2MGE2MTFmMWRlZTEwOGI4ZWVfaVRXdFEyY3M0SVV6ZzI3M0ZJVGdra1ZTMEhhY29CUWxfVG9rZW46Q3ozcGJNRUJib3hIYWF4dzRxb2NRVDQwbmFmXzE3Njg5Njg4Mzk6MTc2ODk3MjQzOV9WNA\" alt=\"\" \/><\/figure>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\" \/>\n\n\n\n<p>And now, confession time: how many of you opened LTX-2 \u201cjust for five minutes\u201d and blinked to find it\u2019s 2 a.m.? 
Drop your personal record (and your favorite glitch-to-gold prompt fix) in the comments\u2014I need to know I\u2019m not the only one living in this rabbit hole.<\/p>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\" \/>\n\n\n\n<p>Previous posts:<\/p>\n\n\n\n<figure class=\"wp-block-embed is-type-wp-embed is-provider-crepal-content-center wp-block-embed-crepal-content-center\"><div class=\"wp-block-embed__wrapper\">\n<blockquote class=\"wp-embedded-content\" data-secret=\"zaeCKQufi6\"><a href=\"https:\/\/crepal.ai\/blog\/aivideo\/blog-ltx-2-best-settings-comfyui-2026\/\">LTX-2 Best Settings in ComfyUI: Quality vs Speed Presets (2026)<\/a><\/blockquote><iframe class=\"wp-embedded-content lazyload\" sandbox=\"allow-scripts\" security=\"restricted\" style=\"position: absolute; visibility: hidden;\" title=\"\u300a LTX-2 Best Settings in ComfyUI: Quality vs Speed Presets (2026) \u300b\u2014CrePal Content Center\" data-src=\"https:\/\/crepal.ai\/blog\/aivideo\/blog-ltx-2-best-settings-comfyui-2026\/embed\/#?secret=CfuuTwm1dB#?secret=zaeCKQufi6\" data-secret=\"zaeCKQufi6\" width=\"600\" height=\"338\" frameborder=\"0\" marginwidth=\"0\" marginheight=\"0\" scrolling=\"no\" src=\"data:image\/svg+xml;base64,PHN2ZyB3aWR0aD0iMSIgaGVpZ2h0PSIxIiB4bWxucz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciPjwvc3ZnPg==\" data-load-mode=\"1\"><\/iframe>\n<\/div><\/figure>\n\n\n\n<figure class=\"wp-block-embed is-type-wp-embed is-provider-crepal-content-center wp-block-embed-crepal-content-center\"><div class=\"wp-block-embed__wrapper\">\n<blockquote class=\"wp-embedded-content\" data-secret=\"FUSSULw2Ol\"><a href=\"https:\/\/crepal.ai\/blog\/aivideo\/blog-ltx-2-comfyui-workflows-t2v-i2v-v2v\/\">LTX-2 Workflows in ComfyUI Explained (T2V vs I2V vs V2V)<\/a><\/blockquote><iframe class=\"wp-embedded-content lazyload\" sandbox=\"allow-scripts\" security=\"restricted\" style=\"position: absolute; visibility: hidden;\" title=\"\u300a LTX-2 Workflows in ComfyUI Explained (T2V vs I2V vs 
V2V) \u300b\u2014CrePal Content Center\" data-src=\"https:\/\/crepal.ai\/blog\/aivideo\/blog-ltx-2-comfyui-workflows-t2v-i2v-v2v\/embed\/#?secret=up440Hgl2z#?secret=FUSSULw2Ol\" data-secret=\"FUSSULw2Ol\" width=\"600\" height=\"338\" frameborder=\"0\" marginwidth=\"0\" marginheight=\"0\" scrolling=\"no\" src=\"data:image\/svg+xml;base64,PHN2ZyB3aWR0aD0iMSIgaGVpZ2h0PSIxIiB4bWxucz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciPjwvc3ZnPg==\" data-load-mode=\"1\"><\/iframe>\n<\/div><\/figure>\n\n\n\n<figure class=\"wp-block-embed is-type-wp-embed is-provider-crepal-content-center wp-block-embed-crepal-content-center\"><div class=\"wp-block-embed__wrapper\">\n<blockquote class=\"wp-embedded-content\" data-secret=\"YWHYM8G9Hn\"><a href=\"https:\/\/crepal.ai\/blog\/aivideo\/blog-ltx-2-free-online-2026\/\">LTX-2 Free Online: Generate AI Videos Without Local Setup<\/a><\/blockquote><iframe class=\"wp-embedded-content lazyload\" sandbox=\"allow-scripts\" security=\"restricted\" style=\"position: absolute; visibility: hidden;\" title=\"\u300a LTX-2 Free Online: Generate AI Videos Without Local Setup \u300b\u2014CrePal Content Center\" data-src=\"https:\/\/crepal.ai\/blog\/aivideo\/blog-ltx-2-free-online-2026\/embed\/#?secret=170tzwLp3Z#?secret=YWHYM8G9Hn\" data-secret=\"YWHYM8G9Hn\" width=\"600\" height=\"338\" frameborder=\"0\" marginwidth=\"0\" marginheight=\"0\" scrolling=\"no\" src=\"data:image\/svg+xml;base64,PHN2ZyB3aWR0aD0iMSIgaGVpZ2h0PSIxIiB4bWxucz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciPjwvc3ZnPg==\" data-load-mode=\"1\"><\/iframe>\n<\/div><\/figure>\n","protected":false},"excerpt":{"rendered":"<p>Hi, I&#8217;m Dora. On January 12, 2026, at 9:40 p.m., Honestly, I opened LTX-2 &#8220;just for five minutes&#8221; and ended up tinkering until midnight. My first clip, a coffee cup on a desk, looked fine until the handle teleported frame to frame. That tiny glitch sent me down a rabbit hole. 
This LTX-2 Prompting Guide [&hellip;]<\/p>\n","protected":false},"author":5,"featured_media":4963,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_gspb_post_css":"","_uag_custom_page_level_css":"","footnotes":""},"categories":[8],"tags":[],"class_list":["post-4961","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-aivideo"],"blocksy_meta":[],"uagb_featured_image_src":{"full":["https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-56.png",1376,768,false],"thumbnail":["https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-56-150x150.png",150,150,true],"medium":["https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-56-300x167.png",300,167,true],"medium_large":["https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-56-768x429.png",768,429,true],"large":["https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-56-1024x572.png",1024,572,true],"1536x1536":["https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-56.png",1376,768,false],"2048x2048":["https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-56.png",1376,768,false],"trp-custom-language-flag":["https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-56-18x10.png",18,10,true]},"uagb_author_info":{"display_name":"Dora","author_link":"https:\/\/crepal.ai\/blog\/author\/dora\/"},"uagb_comment_info":4,"uagb_excerpt":"Hi, I&#8217;m Dora. On January 12, 2026, at 9:40 p.m., I opened LTX-2 &#8220;just for five minutes&#8221; and, honestly, ended up tinkering until midnight. My first clip, a coffee cup on a desk, looked fine until the handle teleported frame to frame. That tiny glitch sent me down a rabbit hole. 
This LTX-2 Prompting Guide&hellip;","_links":{"self":[{"href":"https:\/\/crepal.ai\/blog\/wp-json\/wp\/v2\/posts\/4961","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/crepal.ai\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/crepal.ai\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/crepal.ai\/blog\/wp-json\/wp\/v2\/users\/5"}],"replies":[{"embeddable":true,"href":"https:\/\/crepal.ai\/blog\/wp-json\/wp\/v2\/comments?post=4961"}],"version-history":[{"count":2,"href":"https:\/\/crepal.ai\/blog\/wp-json\/wp\/v2\/posts\/4961\/revisions"}],"predecessor-version":[{"id":4968,"href":"https:\/\/crepal.ai\/blog\/wp-json\/wp\/v2\/posts\/4961\/revisions\/4968"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/crepal.ai\/blog\/wp-json\/wp\/v2\/media\/4963"}],"wp:attachment":[{"href":"https:\/\/crepal.ai\/blog\/wp-json\/wp\/v2\/media?parent=4961"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/crepal.ai\/blog\/wp-json\/wp\/v2\/categories?post=4961"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/crepal.ai\/blog\/wp-json\/wp\/v2\/tags?post=4961"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}