{"id":4819,"date":"2026-01-07T18:31:36","date_gmt":"2026-01-07T10:31:36","guid":{"rendered":"https:\/\/crepal.ai\/blog\/?p=4819"},"modified":"2026-01-07T18:31:40","modified_gmt":"2026-01-07T10:31:40","slug":"blog-ltx-2-comfyui-day0-native-support","status":"publish","type":"post","link":"https:\/\/crepal.ai\/blog\/aivideo\/blog-ltx-2-comfyui-day0-native-support\/","title":{"rendered":"LTX-2 ComfyUI: Day-0 Native Support Explained (What You Get Out of the Box)"},"content":{"rendered":"\n<p>I held my cold tea, staring at the waves on the screen and wondering: Am I surfing, or just waiting for the render to finish? I tried to coax a 15\u2011second surf clip out of <strong><a href=\"https:\/\/ltx.io\/model\/ltx-2?utm_source=google&amp;utm_medium=cpc&amp;utm_campaign=22897522929&amp;utm_adset_id=187850528590&amp;utm_ad_id=791337039577&amp;utm_term=ltx-2&amp;gad_source=1&amp;gad_campaignid=22897522929&amp;gbraid=0AAAABBHMIvZ4qfiKbaFDvabD4Z-3XWMRY&amp;gclid=Cj0KCQiApfjKBhC0ARIsAMiR_Is-0FgzyvWPksHYz8P2siEoCyD4jqCN23fMSUPYm7Yz2AT3Nnf9m7UaAsNSEALw_wcB\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">LTX\u20112 <\/a><\/strong>in <strong>ComfyUI<\/strong>. I&#8217;d seen &#8220;4K at 50fps with audio sync&#8221; all over my feed and, honestly, I wanted to see if it was real or just another pretty demo. 
Not sponsored, just me, my RTX 4090 (24GB), and a stubborn streak.<\/p>\n\n\n\n<p>Here&#8217;s what I found after a weekend of testing, plus what &#8220;day\u20110 native support&#8221; in ComfyUI actually means when you&#8217;re the one hitting Run.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"what-ltx-2-is-4k-50fps-20s-audio-sync\">What LTX-2 is (4K@50fps, 20s, audio sync)<\/h2>\n\n\n\n<figure class=\"wp-block-gallery has-nested-images columns-default is-cropped wp-block-gallery-1 is-layout-flex wp-block-gallery-is-layout-flex\">\n<figure class=\"wp-block-image size-large\"><img decoding=\"async\" width=\"1024\" height=\"692\" data-id=\"4821\" data-src=\"https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-9-1024x692.png\" alt=\"\" class=\"wp-image-4821 lazyload\" data-srcset=\"https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-9-1024x692.png 1024w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-9-300x203.png 300w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-9-768x519.png 768w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-9-18x12.png 18w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-9.png 1299w\" data-sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" src=\"data:image\/svg+xml;base64,PHN2ZyB3aWR0aD0iMSIgaGVpZ2h0PSIxIiB4bWxucz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciPjwvc3ZnPg==\" style=\"--smush-placeholder-width: 1024px; --smush-placeholder-aspect-ratio: 1024\/692;\" \/><\/figure>\n<\/figure>\n\n\n\n<p><a href=\"https:\/\/github.com\/Lightricks\/LTX-2\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">LTX\u20112 is an open text\u2011to\u2011video model <\/a>designed to generate up to 20\u2011second clips, with a headline promise of 4K output at 50fps and optional audio\u2011conditioned lip\/beat sync. 
The big idea: you feed text (and optionally an audio track), and it renders motion that feels less jittery and more coherent than earlier open models.<\/p>\n\n\n\n<p>On my rig, I could preview at 576p\u2013720p quickly, then push to 1080p reliably. 4K worked with tiling and some patience, but it&#8217;s not one\u2011click magic unless you&#8217;ve got serious VRAM and you&#8217;re okay with longer renders. The audio sync part is real, but you need the right weights (more on that below).<\/p>\n\n\n\n<h3 class=\"wp-block-heading\" id=\"key-specs-vs-other-open-source-models\">Key specs vs other open-source models<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Clip length: Up to ~20s out of the box. Many open models top out around 6\u201312s unless you hack context windows.<\/li>\n\n\n\n<li>Resolution and fps: LTX\u20112 advertises 4K@50fps. Practically, most creators will iterate at 720p\u20131080p, 12\u201324fps, then upscale\/interpolate. It&#8217;s still an edge over models that crumble past 512p.<\/li>\n\n\n\n<li>Audio conditioning: Native path for audio alignment (lip\/beat) when you supply an audio track. A lot of models still treat audio as an afterthought.<\/li>\n\n\n\n<li>Latency: Comparable to the better open models at similar resolutions. My 12\u2011second 1080p test with moderate motion took ~7.5 minutes per frame batch on a 24GB GPU with tiling enabled. That&#8217;s not fast, but the consistency was better than I expected.<\/li>\n<\/ul>\n\n\n\n<p>If you&#8217;ve used models like SD\u2011Video forks or the earlier text\u2011to\u2011video baselines, LTX\u20112 feels like a generation bump in motion stability. 
Not a silver bullet, but a meaningful step.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"what-day-0-native-support-means-in-comfyui\">What &#8220;Day-0 native support&#8221; means in ComfyUI<\/h2>\n\n\n\n<figure class=\"wp-block-gallery has-nested-images columns-default is-cropped wp-block-gallery-2 is-layout-flex wp-block-gallery-is-layout-flex\">\n<figure class=\"wp-block-image size-large\"><img decoding=\"async\" width=\"1024\" height=\"604\" data-id=\"4822\" data-src=\"https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-10-1024x604.png\" alt=\"\" class=\"wp-image-4822 lazyload\" data-srcset=\"https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-10-1024x604.png 1024w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-10-300x177.png 300w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-10-768x453.png 768w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-10-1536x906.png 1536w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-10-18x12.png 18w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-10.png 1634w\" data-sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" src=\"data:image\/svg+xml;base64,PHN2ZyB3aWR0aD0iMSIgaGVpZ2h0PSIxIiB4bWxucz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciPjwvc3ZnPg==\" style=\"--smush-placeholder-width: 1024px; --smush-placeholder-aspect-ratio: 1024\/604;\" \/><\/figure>\n<\/figure>\n\n\n\n<p>ComfyUI pushed LTX\u20112 nodes and example workflows the same day the model dropped. 
That &#8220;day\u20110&#8221; bit matters because you don&#8217;t need a third\u2011party custom node pack just to get started.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\" id=\"what-works-out-of-the-box\">What works out of the box<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li>New LTX\u20112 loader\/inference nodes appear after you update ComfyUI to the latest main branch (as of Jan 6\u20137, 2026).<\/li>\n\n\n\n<li>Example workflows include text\u2011only generation and text+audio conditioning. I loaded the sample JSON, swapped my prompt, and it ran without wiring drama.<\/li>\n\n\n\n<li>Basic schedulers, tiling, and VRAM\u2011friendly settings are already exposed in node inputs. Nice touch.<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\" id=\"what-still-requires-downloads-weights-models\">What still requires downloads (weights, models)<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Core LTX\u20112 weights: You still have to download them. <strong>ComfyUI <\/strong>will point you to the <a href=\"https:\/\/github.com\/Lightricks\/LTX-2\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">official repo\/checkpoints <\/a>and cache them under models\/ (paths vary by setup).<\/li>\n\n\n\n<li>Audio encoder\/alignment weights: For audio sync, there are extra files (audio conditioning encoder + tokenizer\/config). ComfyUI will throw a clear missing\u2011weights message the first time. Grab them from the official LTX\u20112 docs.<\/li>\n\n\n\n<li>Optional: VAE\/upsamplers\/interpolators. If you want crisp 4K@50fps without waiting ages, you&#8217;ll likely chain an external upscaler (e.g., video\u2011aware ESRGAN variant) and a frame interpolator (RIFE\/IFRNet). Those aren&#8217;t bundled.<\/li>\n<\/ul>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"audio-sync-what-s-included-vs-what-needs-setup\">Audio sync: what&#8217;s included vs what needs setup<\/h2>\n\n\n\n<p>Included in the ComfyUI graph: a clear path to feed an audio waveform alongside your prompt. 
The model uses audio features to guide lip movement or rhythmic motion.<\/p>\n\n\n\n<p>What you still set up:<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Audio weights: Download and place them where ComfyUI expects (the error message includes the path). After I added the audio encoder files, the next run picked them up automatically.<\/li>\n\n\n\n<li>Input format: 16\u2011bit WAV at 44.1k or 48k worked best for me. I kept clips under 20s to match the model window.<\/li>\n\n\n\n<li>Alignment expectations: It&#8217;s &#8220;good enough&#8221; for social clips; think vlogger\u2011style talking or beat\u2011matched gestures. It&#8217;s not frame\u2011perfect dubbing. Short plosive sounds (b\/p) can drift if the prompt pushes extreme motion.<\/li>\n<\/ul>\n\n\n\n<p>Quick note from testing on Jan 6, 2026: a 12\u2011second talking\u2011head sample aligned surprisingly well; upper\u2011lip dynamics matched syllables, and jaw motion lagged by a frame or two. For music, head bobs and camera moves synced to kicks better than I expected.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"quick-checklist-update-comfyui-verify-version\">Quick checklist: update ComfyUI + verify version<\/h2>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Pull the latest ComfyUI main branch (Jan 6\u20137, 2026): git pull origin master or main, depending on your clone.<\/li>\n\n\n\n<li>Install\/update dependencies when prompted. If you use a venv: pip install -r requirements.txt.<\/li>\n\n\n\n<li>Launch and check the right\u2011click Add Node menu for LTX\u20112 nodes. 
If they&#8217;re missing, restart ComfyUI or clear the node cache.<\/li>\n\n\n\n<li>Open an official LTX\u20112 example workflow JSON to confirm nodes and connections load without red errors.<\/li>\n\n\n\n<li>First dry run at 512\u2013720p to validate weights and audio encoder paths before you waste time on 4K.<\/li>\n<\/ul>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"how-to-load-ltx-2-in-comfyui-high-level-steps\">How to load LTX-2 in ComfyUI (high-level steps)<\/h2>\n\n\n\n<p>Here&#8217;s the flow I actually used on Jan 5\u20136, 2026:<\/p>\n\n\n\n<figure class=\"wp-block-gallery has-nested-images columns-default is-cropped wp-block-gallery-3 is-layout-flex wp-block-gallery-is-layout-flex\">\n<figure class=\"wp-block-image size-full\"><img decoding=\"async\" width=\"683\" height=\"341\" data-id=\"4823\" data-src=\"https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-11.png\" alt=\"\" class=\"wp-image-4823 lazyload\" data-srcset=\"https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-11.png 683w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-11-300x150.png 300w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-11-18x9.png 18w\" data-sizes=\"auto, (max-width: 683px) 100vw, 683px\" src=\"data:image\/svg+xml;base64,PHN2ZyB3aWR0aD0iMSIgaGVpZ2h0PSIxIiB4bWxucz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciPjwvc3ZnPg==\" style=\"--smush-placeholder-width: 683px; --smush-placeholder-aspect-ratio: 683\/341;\" \/><\/figure>\n<\/figure>\n\n\n\n<ol start=\"1\" class=\"wp-block-list\">\n<li><strong><a href=\"https:\/\/docs.comfy.org\/installation\/update_comfyui\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">Update ComfyUI<\/a><\/strong>, then open the LTX\u20112 example workflow.<\/li>\n\n\n\n<li>In the LTX\u20112 Loader node, select the base checkpoint path (download from the official model page). Same for the audio encoder if you plan to sync.<\/li>\n\n\n\n<li>Set prompt and negative prompt. 
I kept it concrete: &#8220;sunlit surfer carving a glassy shoulder, telephoto, gentle camera pan.&#8221;<\/li>\n\n\n\n<li>Choose a sane preview size (576p\/720p). Switch on tiling if VRAM is tight.<\/li>\n\n\n\n<li>For audio: import a 10\u201315s WAV of your voice or a beat. Enable audio conditioning in the node.<\/li>\n\n\n\n<li>Render a short low\u2011res pass. If it looks stable, bump to 1080p. For 4K, either enable model\u2011native high\u2011res (slow) or upscale after render. For 50fps, I often render at 25fps and run frame interpolation.<\/li>\n\n\n\n<li>Save seeds and settings in the node comments. ComfyUI makes it easy to retrace your steps later.<\/li>\n<\/ol>\n\n\n\n<p>Practical tip: If motion looks &#8220;floaty,&#8221; lower CFG slightly and add a camera\u2011anchor hint in the prompt. It reduced micro\u2011jitter in my beach tests.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"common-misconceptions\">Common misconceptions<\/h2>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Day\u20110 native support \u2260 bundled weights. You still download checkpoints. The win is that the nodes are maintained in the main repo.<\/li>\n\n\n\n<li>4K@50fps isn&#8217;t effortless. It&#8217;s possible, but you&#8217;ll pay with VRAM, time, or both. I got there faster with a 1080p render plus upscaling and interpolation than with pure 4K generation.<\/li>\n\n\n\n<li>Audio sync doesn&#8217;t create a voice track. It aligns motion to audio you provide; it won&#8217;t invent vocals or lyrics.<\/li>\n\n\n\n<li>&#8220;20 seconds&#8221; isn&#8217;t a hard limit. You can try longer, but quality and coherence drop. I found 8\u201315s to be the sweet spot.<\/li>\n\n\n\n<li>More steps aren&#8217;t always better. Past a point, I saw diminishing returns; try smarter prompts and steadier camera cues instead.<\/li>\n<\/ul>\n\n\n\n<p>If you want a single\u2011sentence read: LTX\u20112 in ComfyUI is usable today; just respect the hardware math. 
And if you hit a wall, the official docs and GitHub issues are worth a skim before you assume it&#8217;s broken.<\/p>\n\n\n\n<p>To prototype ideas faster before committing to full renders, I sometimes use <strong><a href=\"https:\/\/crepal.ai\/\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">Crepal.ai<\/a><\/strong>. It lets you mock up visuals from a prompt in minutes, so you can test composition, timing, and framing before running LTX\u20112 locally.<\/p>\n\n\n\n<figure class=\"wp-block-gallery has-nested-images columns-default is-cropped wp-block-gallery-4 is-layout-flex wp-block-gallery-is-layout-flex\">\n<figure class=\"wp-block-image size-large\"><img decoding=\"async\" width=\"1024\" height=\"620\" data-id=\"4824\" data-src=\"https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-12-1024x620.png\" alt=\"\" class=\"wp-image-4824 lazyload\" data-srcset=\"https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-12-1024x620.png 1024w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-12-300x182.png 300w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-12-768x465.png 768w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-12-18x12.png 18w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-12.png 1436w\" data-sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" src=\"data:image\/svg+xml;base64,PHN2ZyB3aWR0aD0iMSIgaGVpZ2h0PSIxIiB4bWxucz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciPjwvc3ZnPg==\" style=\"--smush-placeholder-width: 1024px; --smush-placeholder-aspect-ratio: 1024\/620;\" \/><\/figure>\n<\/figure>\n\n\n\n<p>Useful links:<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong><a href=\"https:\/\/github.com\/comfyanonymous\/ComfyUI\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">ComfyUI GitHub <\/a><\/strong>(latest nodes)<\/li>\n\n\n\n<li>LTX\u20112 model\/docs: check the official repo linked by the project maintainers.<\/li>\n<\/ul>\n\n\n\n<blockquote class=\"wp-block-quote 
is-layout-flow wp-block-quote-is-layout-flow\">\n<p>If you&#8217;ve tried <strong>LTX-2<\/strong>, leave a comment telling me your wildest 10-second wave experience\u2014I want to see whose \u201csurfing\u201d is the craziest!<\/p>\n<\/blockquote>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\" \/>\n\n\n\n<p><strong>Previous posts:<\/strong><\/p>\n\n\n\n<figure class=\"wp-block-embed is-type-wp-embed is-provider-crepal-content-center wp-block-embed-crepal-content-center\"><div class=\"wp-block-embed__wrapper\">\n<blockquote class=\"wp-embedded-content\" data-secret=\"CQZoV6S66j\"><a href=\"https:\/\/crepal.ai\/blog\/aivideo\/wan-2-6-comfyui-image-to-video\/\">Wan 2.6 ComfyUI Image to Video Workflow: Step-by-Step<\/a><\/blockquote><iframe class=\"wp-embedded-content lazyload\" sandbox=\"allow-scripts\" security=\"restricted\" style=\"position: absolute; visibility: hidden;\" title=\"\u300a Wan 2.6 ComfyUI Image to Video Workflow: Step-by-Step \u300b\u2014CrePal Content Center\" data-src=\"https:\/\/crepal.ai\/blog\/aivideo\/wan-2-6-comfyui-image-to-video\/embed\/#?secret=KrkNCn0pju#?secret=CQZoV6S66j\" data-secret=\"CQZoV6S66j\" width=\"600\" height=\"338\" frameborder=\"0\" marginwidth=\"0\" marginheight=\"0\" scrolling=\"no\" src=\"data:image\/svg+xml;base64,PHN2ZyB3aWR0aD0iMSIgaGVpZ2h0PSIxIiB4bWxucz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciPjwvc3ZnPg==\" data-load-mode=\"1\"><\/iframe>\n<\/div><\/figure>\n\n\n\n<figure class=\"wp-block-embed is-type-wp-embed is-provider-crepal-content-center wp-block-embed-crepal-content-center\"><div class=\"wp-block-embed__wrapper\">\n<blockquote class=\"wp-embedded-content\" data-secret=\"nCinj4GjgU\"><a href=\"https:\/\/crepal.ai\/blog\/aivideo\/wan-2-6-image-to-video-lip-sync\/\">Wan 2.6 Image to Video Lip Sync: How to Make It Work<\/a><\/blockquote><iframe class=\"wp-embedded-content lazyload\" sandbox=\"allow-scripts\" security=\"restricted\" style=\"position: absolute; visibility: hidden;\" title=\"\u300a 
Wan 2.6 Image to Video Lip Sync: How to Make It Work \u300b\u2014CrePal Content Center\" data-src=\"https:\/\/crepal.ai\/blog\/aivideo\/wan-2-6-image-to-video-lip-sync\/embed\/#?secret=lJvIsdlg7E#?secret=nCinj4GjgU\" data-secret=\"nCinj4GjgU\" width=\"600\" height=\"338\" frameborder=\"0\" marginwidth=\"0\" marginheight=\"0\" scrolling=\"no\" src=\"data:image\/svg+xml;base64,PHN2ZyB3aWR0aD0iMSIgaGVpZ2h0PSIxIiB4bWxucz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciPjwvc3ZnPg==\" data-load-mode=\"1\"><\/iframe>\n<\/div><\/figure>\n\n\n\n<figure class=\"wp-block-embed is-type-wp-embed is-provider-crepal-content-center wp-block-embed-crepal-content-center\"><div class=\"wp-block-embed__wrapper\">\n<blockquote class=\"wp-embedded-content\" data-secret=\"iyhrgyMeME\"><a href=\"https:\/\/crepal.ai\/blog\/aivideo\/wan-2-6-image-to-video-prompts\/\">Wan 2.6 Image to Video Prompts: Best Examples That Work<\/a><\/blockquote><iframe class=\"wp-embedded-content lazyload\" sandbox=\"allow-scripts\" security=\"restricted\" style=\"position: absolute; visibility: hidden;\" title=\"\u300a Wan 2.6 Image to Video Prompts: Best Examples That Work \u300b\u2014CrePal Content Center\" data-src=\"https:\/\/crepal.ai\/blog\/aivideo\/wan-2-6-image-to-video-prompts\/embed\/#?secret=iIaKgfhn1W#?secret=iyhrgyMeME\" data-secret=\"iyhrgyMeME\" width=\"600\" height=\"338\" frameborder=\"0\" marginwidth=\"0\" marginheight=\"0\" scrolling=\"no\" src=\"data:image\/svg+xml;base64,PHN2ZyB3aWR0aD0iMSIgaGVpZ2h0PSIxIiB4bWxucz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciPjwvc3ZnPg==\" data-load-mode=\"1\"><\/iframe>\n<\/div><\/figure>\n","protected":false},"excerpt":{"rendered":"<p>I held my cold tea, staring at the waves on the screen and wondering: Am I surfing, or just waiting for the render to finish? I tried to coax a 15\u2011second surf clip out of LTX\u20112 in ComfyUI. 
I&#8217;d seen &#8220;4K at 50fps with audio sync&#8221; all over my feed and, honestly, I wanted to [&hellip;]<\/p>\n","protected":false},"author":5,"featured_media":4820,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_gspb_post_css":"","_uag_custom_page_level_css":"","footnotes":""},"categories":[8],"tags":[],"class_list":["post-4819","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-aivideo"],"blocksy_meta":[],"uagb_featured_image_src":{"full":["https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-8.png",1376,768,false],"thumbnail":["https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-8-150x150.png",150,150,true],"medium":["https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-8-300x167.png",300,167,true],"medium_large":["https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-8-768x429.png",768,429,true],"large":["https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-8-1024x572.png",1024,572,true],"1536x1536":["https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-8.png",1376,768,false],"2048x2048":["https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-8.png",1376,768,false],"trp-custom-language-flag":["https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/01\/image-8-18x10.png",18,10,true]},"uagb_author_info":{"display_name":"Dora","author_link":"https:\/\/crepal.ai\/blog\/author\/dora\/"},"uagb_comment_info":5,"uagb_excerpt":"I held my cold tea, staring at the waves on the screen and wondering: Am I surfing, or just waiting for the render to finish? I tried to coax a 15\u2011second surf clip out of LTX\u20112 in ComfyUI. 
I&#8217;d seen &#8220;4K at 50fps with audio sync&#8221; all over my feed and, honestly, I wanted to&hellip;","_links":{"self":[{"href":"https:\/\/crepal.ai\/blog\/wp-json\/wp\/v2\/posts\/4819","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/crepal.ai\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/crepal.ai\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/crepal.ai\/blog\/wp-json\/wp\/v2\/users\/5"}],"replies":[{"embeddable":true,"href":"https:\/\/crepal.ai\/blog\/wp-json\/wp\/v2\/comments?post=4819"}],"version-history":[{"count":1,"href":"https:\/\/crepal.ai\/blog\/wp-json\/wp\/v2\/posts\/4819\/revisions"}],"predecessor-version":[{"id":4828,"href":"https:\/\/crepal.ai\/blog\/wp-json\/wp\/v2\/posts\/4819\/revisions\/4828"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/crepal.ai\/blog\/wp-json\/wp\/v2\/media\/4820"}],"wp:attachment":[{"href":"https:\/\/crepal.ai\/blog\/wp-json\/wp\/v2\/media?parent=4819"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/crepal.ai\/blog\/wp-json\/wp\/v2\/categories?post=4819"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/crepal.ai\/blog\/wp-json\/wp\/v2\/tags?post=4819"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}