{"id":4851,"date":"2026-01-13T11:56:03","date_gmt":"2026-01-13T03:56:03","guid":{"rendered":"https:\/\/crepal.ai\/blog\/?p=4851"},"modified":"2026-01-15T11:06:23","modified_gmt":"2026-01-15T03:06:23","slug":"blog-ltx-2-comfyui-quick-start","status":"publish","type":"post","link":"https:\/\/crepal.ai\/blog\/aivideo\/blog-ltx-2-comfyui-quick-start\/","title":{"rendered":"LTX-2 Quick Start: Generate Your First Video in 10 Minutes"},"content":{"rendered":"\n<p>Remember me? I&#8217;m your old friend, Dora. Two nights ago, I caught myself doomscrolling sample clips from<strong> LTX\u20112 <\/strong>at 1:13 a.m. The water splashes looked too good. I shut the tab\u2026 then opened it again and said, fine, I&#8217;ll try it. This is my field log from that first session and the next morning&#8217;s cleanup pass. If you want a fast<strong> LTX\u20112<\/strong> quick start without guessing settings, this is the path I&#8217;d hand a friend.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"choose-your-first-goal-t2v-vs-i2v\">Choose your first goal (T2V vs I2V)<\/h2>\n\n\n\n<p>Pick one. Don&#8217;t try both on your very first run, LTX\u20112 is happiest when you&#8217;re decisive.<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>Text\u2011to\u2011Video (T2V)<\/strong>: You write a prompt, it generates full motion from scratch. Use this if you&#8217;re exploring style, mood, or scenes you don&#8217;t have footage for. My first success was &#8220;a slow, gliding shot of morning light across a wooden table, dust motes floating.&#8221; See the official <strong><a href=\"https:\/\/docs.ltx.video\/open-source-model\/usage-guides\/text-to-video?utm_source=chatgpt.com\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">Text\u2011to\u2011Video workflow guide<\/a><\/strong> for detailed steps.<\/li>\n<\/ul>\n\n\n\n<figure class=\"wp-block-gallery has-nested-images columns-default is-cropped wp-block-gallery-1 is-layout-flex wp-block-gallery-is-layout-flex\">\n<figure class=\"wp-block-image size-full\"><img decoding=\"async\" width=\"844\" height=\"473\" data-id=\"4853\" data-src=\"https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-26.png\" alt=\"LTX-2 quick start 2026: Text-to-Video workflow for generating synchronized video and audio from text prompts\" class=\"wp-image-4853 lazyload\" data-srcset=\"https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-26.png 844w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-26-300x168.png 300w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-26-768x430.png 768w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-26-18x10.png 18w\" data-sizes=\"auto, (max-width: 844px) 100vw, 844px\" src=\"data:image\/svg+xml;base64,PHN2ZyB3aWR0aD0iMSIgaGVpZ2h0PSIxIiB4bWxucz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciPjwvc3ZnPg==\" style=\"--smush-placeholder-width: 844px; --smush-placeholder-aspect-ratio: 844\/473;\" \/><\/figure>\n<\/figure>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>Image\u2011to\u2011Video (I2V)<\/strong>: You start with a still frame and ask LTX\u20112 to animate it. This is great for product shots, logos, and thumbnails. I got more stable motion here on my first try, likely because the model had a strong anchor.<\/li>\n<\/ul>\n\n\n\n<p>On January 11, I ran both. T2V felt more creative, but also more chaotic in motion continuity. I2V gave me a usable, on\u2011brand clip faster. 
If you need a win today, start with I2V.<\/p>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\" \/>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"minimal-setup-checklist-what-must-be-ready\">Minimal setup checklist (what must be ready)<\/h2>\n\n\n\n<p>I tested on Windows 11 with an RTX 4090 (24 GB VRAM) and 64 GB RAM. I also tried a 4070 Ti (12 GB VRAM); it worked, but I had to drop the resolution. At this point I just wanted something to render.<\/p>\n\n\n\n<p>Have these ready before you hit generate:<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>GPU with at least 12 GB VRAM. 16\u201324 GB makes life easier.<\/li>\n\n\n\n<li>Latest NVIDIA driver (I used 546.xx) and CUDA\/cuDNN via your PyTorch install. If you&#8217;re on a <strong>Mac<\/strong>, run via the official cloud\/runtime instead of a local GPU.<\/li>\n\n\n\n<li>Python 3.10+ and a fresh virtual environment, or use ComfyUI if you prefer node\u2011based workflows.<\/li>\n\n\n\n<li>Storage headroom: a 6\u20138 second, 720p clip at 24 fps can eat 300\u2013600 MB during temp caching.<\/li>\n\n\n\n<li>The official <a href=\"https:\/\/huggingface.co\/Lightricks\/LTX-2\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">LTX\u20112 weights and workflow<\/a> (I pulled from the repo on Jan 11, 2026). Check release notes for the model hash to avoid mismatches.<\/li>\n<\/ul>\n\n\n\n<figure class=\"wp-block-gallery has-nested-images columns-default is-cropped wp-block-gallery-2 is-layout-flex wp-block-gallery-is-layout-flex\">\n<figure class=\"wp-block-image size-large\"><img decoding=\"async\" width=\"1024\" height=\"576\" data-id=\"4854\" data-src=\"https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-27-1024x576.png\" alt=\"\" class=\"wp-image-4854 lazyload\" data-srcset=\"https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-27-1024x576.png 1024w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-27-300x169.png 300w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-27-768x432.png 768w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-27-18x10.png 18w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-27.png 1280w\" data-sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" src=\"data:image\/svg+xml;base64,PHN2ZyB3aWR0aD0iMSIgaGVpZ2h0PSIxIiB4bWxucz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciPjwvc3ZnPg==\" style=\"--smush-placeholder-width: 1024px; --smush-placeholder-aspect-ratio: 1024\/576;\" \/><\/figure>\n<\/figure>\n\n\n\n<p>A small tip: if you&#8217;re new to this, ComfyUI is the least painful path. Node names match screenshots from the docs, and you can swap encoders\/decoders without re\u2011wiring everything.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"import-an-official-workflow\">Import an official workflow<\/h2>\n\n\n\n<p>I learned the hard way: don&#8217;t build from scratch at 1 a.m. 
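<\/p>\n\n\n\n<p>Before you even open a graph, a 30\u2011second sanity check saves a lot of late\u2011night guessing. This is plain PyTorch, nothing LTX\u20112\u2011specific, and the 16 GB cut\u2011off is just my own observation from the runs above:<\/p>\n\n\n\n<pre class=\"wp-block-code\"><code># Quick check that PyTorch sees the GPU and how much VRAM it reports\nimport torch\n\nif not torch.cuda.is_available():\n    raise SystemExit('No CUDA device visible - check the driver and your PyTorch build')\n\nprops = torch.cuda.get_device_properties(0)\nvram_gb = props.total_memory \/ (1024 ** 3)\nprint(f'GPU: {props.name}, VRAM: {vram_gb:.1f} GB')\n\nif vram_gb >= 16:\n    print('Headroom for 768x432 or 832x468 drafts')\nelse:\n    print('Tight: start around 640x360 and enable VRAM-saving VAE decode')<\/code><\/pre>\n\n\n\n<p>If the number is lower than you expected, close whatever else is holding VRAM (browsers are the usual culprit) before blaming the model.<\/p>\n\n\n\n<p>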
Grab the <a href=\"https:\/\/github.com\/Lightricks\/ComfyUI-LTXVideo\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">official workflow<\/a> first; it bakes in the sampler, scheduler, and the recommended pre\/post\u2011processing.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\" id=\"what-i-did-jan-11\"><strong>What I did (Jan 11):<\/strong><\/h3>\n\n\n\n<ol start=\"1\" class=\"wp-block-list\">\n<li>Downloaded the &#8220;LTX\u20112 Text2Video&#8221; and &#8220;Image2Video&#8221; ComfyUI JSONs from the <a href=\"https:\/\/docs.ltx.video\/open-source-model\/integration-tools\/comfy-ui\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">official repo<\/a>. Saved them into <code>ComfyUI\/custom_nodes\/workflows<\/code>.<\/li>\n\n\n\n<li>Restarted ComfyUI. Both graphs appeared under the Workflow menu.<\/li>\n\n\n\n<li>In the T2V graph, I only touched three nodes at first: Prompt, Video Length, and Resolution. I left the sampler at the recommended defaults.<\/li>\n<\/ol>\n\n\n\n<p>If you&#8217;re not using ComfyUI, the CLI templates in the docs are fine. Just copy their exact arguments for scheduler, steps, and latent size. Tiny deviations can cause weird motion or memory spikes.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"set-3-key-parameters-duration-resolution-steps\">Set 3 key parameters (duration, resolution, steps)<\/h2>\n\n\n\n<figure class=\"wp-block-gallery has-nested-images columns-default is-cropped wp-block-gallery-3 is-layout-flex wp-block-gallery-is-layout-flex\">\n<figure class=\"wp-block-image size-large\"><img decoding=\"async\" width=\"1024\" height=\"559\" data-id=\"4855\" data-src=\"https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-28-1024x559.png\" alt=\"LTX-2 quick start tutorial: Key parameters sliders for duration, steps, and resolution control\" class=\"wp-image-4855 lazyload\" data-srcset=\"https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-28-1024x559.png 1024w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-28-300x164.png 300w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-28-768x419.png 768w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-28-18x10.png 18w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-28.png 1408w\" data-sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" src=\"data:image\/svg+xml;base64,PHN2ZyB3aWR0aD0iMSIgaGVpZ2h0PSIxIiB4bWxucz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciPjwvc3ZnPg==\" style=\"--smush-placeholder-width: 1024px; --smush-placeholder-aspect-ratio: 1024\/559;\" \/><\/figure>\n<\/figure>\n\n\n\n<p>These three settings decide whether your first run feels great or glitchy.<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>Duration<\/strong>: Start at 4\u20136 seconds. I use 96\u2013144 frames at 24 fps. Longer is tempting, but errors compound over time (drift, warping). Nail a short clip first.<\/li>\n\n\n\n<li><strong>Resolution<\/strong>: 768\u00d7432 or 832\u00d7468 on a 4090; 640\u00d7360 on 12 GB VRAM. You can upscale later with a video upscaler. Going 1080p on your very first run is how you meet OOM.<\/li>\n\n\n\n<li><strong>Steps<\/strong>: 30\u201340 for a draft, 50\u201360 when you like the composition. Past ~70 I saw diminishing returns and more chance of flicker.<\/li>\n<\/ul>\n\n\n\n<p>My test notes (Jan 11, 10:22 a.m.): 24 fps, 128 frames, 768\u00d7432, 48 steps on the 4090 took ~2 min 41 sec and produced the dust\u2011mote table shot I mentioned. 
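<\/p>\n\n\n\n<p>If you want to sanity\u2011check those numbers before queueing a run, this is the scratchpad math I use. It&#8217;s plain arithmetic with my own preset labels, not an LTX\u20112 API call:<\/p>\n\n\n\n<pre class=\"wp-block-code\"><code># The only three numbers I touch on a first run\nfps = 24\nduration_s = 5             # 4-6 s is the sweet spot before drift creeps in\nframes = fps * duration_s  # 120 frames, inside the 96-144 range above\nsteps = 36                 # 30-40 for a draft, 50-60 once the composition works\n\n# Resolution presets matching the VRAM notes above (my labels, nothing official)\npresets = {\n    '12gb_draft': (640, 360),\n    '24gb_draft': (768, 432),\n    '24gb_keeper': (832, 468),\n}\nwidth, height = presets['24gb_draft']\nprint(f'{width}x{height}, {frames} frames at {fps} fps, {steps} steps')<\/code><\/pre>\n\n\n\n<p>I only reach for the 832\u00d7468 preset once a lower\u2011res draft already looks right.<\/p>\n\n\n\n<p>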
Bumping to 60 steps made lighting nicer but also added a tiny shimmer on edges.<\/p>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\" \/>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"use-5-safe-starter-prompts-copy-paste\">Use 5 &#8220;safe&#8221; starter prompts (copy\/paste)<\/h2>\n\n\n\n<p>If you just want a first clean win, these gave me stable motion with minimal wobble. Paste them as is, then swap nouns and styles.<\/p>\n\n\n\n<ol start=\"1\" class=\"wp-block-list\">\n<li>&#8220;A slow, cinematic dolly shot across a sunlit wooden table, dust particles floating, shallow depth of field, natural morning light, 24mm lens feel.&#8221;<\/li>\n\n\n\n<li>&#8220;A neon sign flickering on in a rainy alley at night, puddles reflecting the colors, subtle camera push\u2011in, moody, high dynamic range.&#8221;<\/li>\n\n\n\n<li>&#8220;Close\u2011up of a steaming cup of coffee on a desk, gentle steam motion, soft window light, macro feel, calm atmosphere.&#8221;<\/li>\n\n\n\n<li>&#8220;A minimalist clay stop\u2011motion style of a small plant sprouting, smooth loopable growth, soft studio lighting.&#8221;<\/li>\n\n\n\n<li><strong>(I2V)<\/strong> &#8220;Animate this still photo with a subtle parallax and natural breathing motion; keep the original colors and composition; avoid warping faces.&#8221;<\/li>\n<\/ol>\n\n\n\n<p><strong>Why these work:<\/strong> they limit chaotic elements (no crowds, no fast cuts), specify camera motion, and aim for lighting that the model handles well. When I tried &#8220;crowded street with rapid handheld camera,&#8221; the motion was jitter city.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"export-settings-fps-format\">Export settings (fps, format)<\/h2>\n\n\n\n<p>Keep exports boring and compatible on day one.<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>FPS<\/strong>: 24 fps looks cinematic and is lighter to compute. If you need smoother, go 30. Interpolation plugins can add frames later.<\/li>\n\n\n\n<li><strong>Format<\/strong>: MP4 (H.264) for quick sharing. If you plan color work, export ProRes 422 or a PNG sequence for lossless frames. Sequences are great for fixing a few bad frames in Photoshop.<\/li>\n\n\n\n<li><strong>Bitrate<\/strong>: For 768\u00d7432 at 24 fps, 10\u201316 Mbps is plenty for previews. Go higher if gradients band.<\/li>\n\n\n\n<li><strong>Color<\/strong>: Stick to sRGB\/gamma 2.2 unless your pipeline is color\u2011managed. I mistakenly toggled a wide\u2011gamut profile and got washed highlights on upload.<\/li>\n<\/ul>\n\n\n\n<p>On Jan 11, my 6\u2011second MP4 at 24 fps (H.264, 12 Mbps) landed at 9.8 MB. 
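<\/p>\n\n\n\n<p>If you rendered to a PNG sequence instead, here&#8217;s roughly how I wrap it into a share\u2011ready MP4 with the same 24 fps, 12 Mbps settings. A minimal sketch: it assumes ffmpeg is on your PATH and that frames are numbered <code>frame_0001.png<\/code> onward, so adjust the pattern to whatever your workflow writes:<\/p>\n\n\n\n<pre class=\"wp-block-code\"><code># Wrap a PNG sequence into an H.264 preview\nimport subprocess\n\ncmd = [\n    'ffmpeg', '-y',\n    '-framerate', '24',      # input frame rate\n    '-i', 'frame_%04d.png',\n    '-c:v', 'libx264',\n    '-b:v', '12M',           # 10-16 Mbps is plenty at 768x432\n    '-pix_fmt', 'yuv420p',   # widest player compatibility\n    'preview.mp4',\n]\nsubprocess.run(cmd, check=True)<\/code><\/pre>\n\n\n\n<p>For a ProRes 422 master, the usual swap is <code>-c:v prores_ks -profile:v 2<\/code> (with a <code>.mov<\/code> output name) in place of the H.264 and bitrate flags.<\/p>\n\n\n\n<p>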
Looked crisp on mobile and desktop.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"first-run-issues-slow-oom-weird-motion\">First-run issues (slow, OOM, weird motion)<\/h2>\n\n\n\n<figure class=\"wp-block-gallery has-nested-images columns-default is-cropped wp-block-gallery-4 is-layout-flex wp-block-gallery-is-layout-flex\">\n<figure class=\"wp-block-image size-full\"><img decoding=\"async\" width=\"835\" height=\"459\" data-id=\"4856\" data-src=\"https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-29.png\" alt=\"\" class=\"wp-image-4856 lazyload\" data-srcset=\"https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-29.png 835w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-29-300x165.png 300w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-29-768x422.png 768w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-29-18x10.png 18w\" data-sizes=\"auto, (max-width: 835px) 100vw, 835px\" src=\"data:image\/svg+xml;base64,PHN2ZyB3aWR0aD0iMSIgaGVpZ2h0PSIxIiB4bWxucz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciPjwvc3ZnPg==\" style=\"--smush-placeholder-width: 835px; --smush-placeholder-aspect-ratio: 835\/459;\" \/><\/figure>\n<\/figure>\n\n\n\n<p>Here&#8217;s what bit me, and how I fixed it.<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>It&#8217;s slow<\/strong>: If your GPU sits below 60% utilization, you&#8217;re bottlenecked on CPU or disk. Close background apps, keep assets on an SSD, and drop resolution one notch. On my 4070 Ti box, going from 832\u00d7468 to 640\u00d7360 shaved ~35% off render time.<\/li>\n\n\n\n<li><strong>OOM<\/strong><strong> (<\/strong><strong>out of memory<\/strong><strong>)<\/strong>: Cut resolution first, then steps, then duration. Also make sure &#8220;save VRAM&#8221; or tiled decoding is enabled if your workflow supports it. In ComfyUI, I enabled VRAM\u2011saving for VAE decode and it stopped crashing at 48 steps.<\/li>\n\n\n\n<li><strong>Weird motion\/wobble<\/strong>: Add a touch of motion consistency. In T2V, constrain camera motion in the prompt (&#8220;gentle dolly,&#8221; &#8220;tripod shot&#8221;). In I2V, pick a base image with strong edges and avoid thin patterns. I had a logo with hairline text that pulsed: thickening the font solved it.<\/li>\n\n\n\n<li><strong>Flicker in lighting<\/strong>: Lower steps slightly or try a different scheduler (the official one is a good default). Temporal denoise in post helps: even a light pass in DaVinci Resolve cleaned my neon sign clip.<\/li>\n\n\n\n<li><strong>Faces melting<\/strong>: Keep faces larger and well\u2011lit. If you need portraits, I2V with a high\u2011quality still worked better than T2V for me.<\/li>\n<\/ul>\n\n\n\n<p>If nothing helps, pull the exact versions from the official docs and match their sample command line. Version drift is sneaky.<\/p>\n\n\n\n<p><strong>Resources I found useful:<\/strong><a href=\"https:\/\/docs.ltx.video\/open-source-model\" target=\"_blank\" rel=\"noreferrer noopener nofollow\"> official LTX\u20112 docs<\/a> and workflow JSONs (checked Jan 11, 2026), and their release notes for recommended schedulers.<\/p>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\" \/>\n\n\n\n<p>Last thought: LTX\u20112 rewards restraint. Short clip, modest res, clear camera direction, then iterate. When it clicks, you&#8217;ll feel that little spark. 
I did at 1:13 a.m., and yeah, I gotta say, it was worth the lost sleep.<\/p>\n\n\n\n<figure class=\"wp-block-gallery has-nested-images columns-default is-cropped wp-block-gallery-5 is-layout-flex wp-block-gallery-is-layout-flex\">\n<figure class=\"wp-block-image size-large\"><img decoding=\"async\" width=\"1024\" height=\"493\" data-id=\"4857\" data-src=\"https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-30-1024x493.png\" alt=\"\" class=\"wp-image-4857 lazyload\" data-srcset=\"https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-30-1024x493.png 1024w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-30-300x144.png 300w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-30-768x370.png 768w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-30-18x9.png 18w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-30.png 1359w\" data-sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" src=\"data:image\/svg+xml;base64,PHN2ZyB3aWR0aD0iMSIgaGVpZ2h0PSIxIiB4bWxucz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciPjwvc3ZnPg==\" style=\"--smush-placeholder-width: 1024px; --smush-placeholder-aspect-ratio: 1024\/493;\" \/><\/figure>\n<\/figure>\n\n\n\n<p>We built <strong>Crepal<\/strong> so I could quickly spin up visual drafts before committing to a full LTX\u20112 render\u2014seeing rough results early usually tells me which path is worth the time. <a href=\"https:\/\/crepal.ai\/?utm_source=chatgpt.com\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">Check it out here<\/a>.<\/p>\n\n\n\n<p>Alright, your turn\u2014don\u2019t leave me hanging. What was your very first LTX\u20112 clip about? Drop your victory screenshot in the comments.<\/p>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\" \/>\n\n\n\n<p><strong>Previous posts:<\/strong><\/p>\n\n\n\n<figure class=\"wp-block-embed is-type-wp-embed is-provider-crepal-content-center wp-block-embed-crepal-content-center\"><div class=\"wp-block-embed__wrapper\">\n<blockquote class=\"wp-embedded-content\" data-secret=\"pAjIKNyz2E\"><a href=\"https:\/\/crepal.ai\/blog\/aivideo\/blog-ltx-2-vs-wan-2-6\/\">LTX-2 vs Wan 2.6: Open-Source Video Models Compared (Quality, Speed, Audio)<\/a><\/blockquote><iframe class=\"wp-embedded-content lazyload\" sandbox=\"allow-scripts\" security=\"restricted\" style=\"position: absolute; visibility: hidden;\" title=\"\u300a LTX-2 vs Wan 2.6: Open-Source Video Models Compared (Quality, Speed, Audio) \u300b\u2014CrePal Content Center\" data-src=\"https:\/\/crepal.ai\/blog\/aivideo\/blog-ltx-2-vs-wan-2-6\/embed\/#?secret=fKOzWppPen#?secret=pAjIKNyz2E\" data-secret=\"pAjIKNyz2E\" width=\"600\" height=\"338\" frameborder=\"0\" marginwidth=\"0\" marginheight=\"0\" scrolling=\"no\" src=\"data:image\/svg+xml;base64,PHN2ZyB3aWR0aD0iMSIgaGVpZ2h0PSIxIiB4bWxucz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciPjwvc3ZnPg==\" data-load-mode=\"1\"><\/iframe>\n<\/div><\/figure>\n\n\n\n<figure class=\"wp-block-embed is-type-wp-embed is-provider-crepal-content-center wp-block-embed-crepal-content-center\"><div class=\"wp-block-embed__wrapper\">\n<blockquote class=\"wp-embedded-content\" data-secret=\"YjAlcllu2r\"><a href=\"https:\/\/crepal.ai\/blog\/aivideo\/blog-how-to-install-ltx-2-in-comfyui\/\">How to Install LTX-2 in ComfyUI (Step-by-Step, No Custom Nodes)<\/a><\/blockquote><iframe class=\"wp-embedded-content lazyload\" sandbox=\"allow-scripts\" security=\"restricted\" style=\"position: absolute; visibility: hidden;\" title=\"\u300a How to Install LTX-2 in ComfyUI 
(Step-by-Step, No Custom Nodes) \u300b\u2014CrePal Content Center\" data-src=\"https:\/\/crepal.ai\/blog\/aivideo\/blog-how-to-install-ltx-2-in-comfyui\/embed\/#?secret=IsO4s570CV#?secret=YjAlcllu2r\" data-secret=\"YjAlcllu2r\" width=\"600\" height=\"338\" frameborder=\"0\" marginwidth=\"0\" marginheight=\"0\" scrolling=\"no\" src=\"data:image\/svg+xml;base64,PHN2ZyB3aWR0aD0iMSIgaGVpZ2h0PSIxIiB4bWxucz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciPjwvc3ZnPg==\" data-load-mode=\"1\"><\/iframe>\n<\/div><\/figure>\n\n\n\n<figure class=\"wp-block-embed is-type-wp-embed is-provider-crepal-content-center wp-block-embed-crepal-content-center\"><div class=\"wp-block-embed__wrapper\">\n<blockquote class=\"wp-embedded-content\" data-secret=\"khrhLSwarY\"><a href=\"https:\/\/crepal.ai\/blog\/aivideo\/blog-ltx-2-comfyui-day0-native-support\/\">LTX-2 ComfyUI: Day-0 Native Support Explained (What You Get Out of the Box)<\/a><\/blockquote><iframe class=\"wp-embedded-content lazyload\" sandbox=\"allow-scripts\" security=\"restricted\" style=\"position: absolute; visibility: hidden;\" title=\"\u300a LTX-2 ComfyUI: Day-0 Native Support Explained (What You Get Out of the Box) \u300b\u2014CrePal Content Center\" data-src=\"https:\/\/crepal.ai\/blog\/aivideo\/blog-ltx-2-comfyui-day0-native-support\/embed\/#?secret=dmFEDbJ8Mp#?secret=khrhLSwarY\" data-secret=\"khrhLSwarY\" width=\"600\" height=\"338\" frameborder=\"0\" marginwidth=\"0\" marginheight=\"0\" scrolling=\"no\" src=\"data:image\/svg+xml;base64,PHN2ZyB3aWR0aD0iMSIgaGVpZ2h0PSIxIiB4bWxucz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciPjwvc3ZnPg==\" data-load-mode=\"1\"><\/iframe>\n<\/div><\/figure>\n","protected":false},"excerpt":{"rendered":"<p>Remember me? I&#8217;m your old friend, Dora. Two nights ago, I caught myself doomscrolling sample clips from LTX\u20112 at 1:13 a.m. The water splashes looked too good. I shut the tab\u2026 then opened it again and said, fine, I&#8217;ll try it. This is my field log from that first session and the next morning&#8217;s cleanup [&hellip;]<\/p>\n","protected":false},"author":5,"featured_media":4852,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_gspb_post_css":"","_uag_custom_page_level_css":"","footnotes":""},"categories":[8],"tags":[],"class_list":["post-4851","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-aivideo"],"blocksy_meta":[],"uagb_featured_image_src":{"full":["https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-25.png",1376,768,false],"thumbnail":["https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-25-150x150.png",150,150,true],"medium":["https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-25-300x167.png",300,167,true],"medium_large":["https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-25-768x429.png",768,429,true],"large":["https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-25-1024x572.png",1024,572,true],"1536x1536":["https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-25.png",1376,768,false],"2048x2048":["https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-25.png",1376,768,false],"trp-custom-language-flag":["https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-25-18x10.png",18,10,true]},"uagb_author_info":{"display_name":"Dora","author_link":"https:\/\/crepal.ai\/blog\/author\/dora\/"},"uagb_comment_info":4,"uagb_excerpt":"Remember me? I&#8217;m your old friend, Dora. 
Two nights ago, I caught myself doomscrolling sample clips from LTX\u20112 at 1:13 a.m. The water splashes looked too good. I shut the tab\u2026 then opened it again and said, fine, I&#8217;ll try it. This is my field log from that first session and the next morning&#8217;s cleanup&hellip;","_links":{"self":[{"href":"https:\/\/crepal.ai\/blog\/wp-json\/wp\/v2\/posts\/4851","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/crepal.ai\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/crepal.ai\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/crepal.ai\/blog\/wp-json\/wp\/v2\/users\/5"}],"replies":[{"embeddable":true,"href":"https:\/\/crepal.ai\/blog\/wp-json\/wp\/v2\/comments?post=4851"}],"version-history":[{"count":2,"href":"https:\/\/crepal.ai\/blog\/wp-json\/wp\/v2\/posts\/4851\/revisions"}],"predecessor-version":[{"id":4876,"href":"https:\/\/crepal.ai\/blog\/wp-json\/wp\/v2\/posts\/4851\/revisions\/4876"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/crepal.ai\/blog\/wp-json\/wp\/v2\/media\/4852"}],"wp:attachment":[{"href":"https:\/\/crepal.ai\/blog\/wp-json\/wp\/v2\/media?parent=4851"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/crepal.ai\/blog\/wp-json\/wp\/v2\/categories?post=4851"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/crepal.ai\/blog\/wp-json\/wp\/v2\/tags?post=4851"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}