{"id":5774,"date":"2026-03-24T19:46:27","date_gmt":"2026-03-24T11:46:27","guid":{"rendered":"https:\/\/crepal.ai\/blog\/?p=5774"},"modified":"2026-03-24T19:46:29","modified_gmt":"2026-03-24T11:46:29","slug":"how-to-install-ltx-2-3-comfyui","status":"publish","type":"post","link":"https:\/\/crepal.ai\/blog\/aivideo\/how-to-install-ltx-2-3-comfyui\/","title":{"rendered":"How to Install LTX 2.3 in ComfyUI: Step-by-Step Guide"},"content":{"rendered":"\n<p>Hello fellows! This is Dora \u2014 and recently I finally cracked the clean install sequence for LTX 2.3. I&#8217;d spent two hours on a broken node graph. Right package version, wrong folder. Right folder, missing VAE. Right VAE, wrong ComfyUI build. When it finally rendered a clean 9-frame clip, I immediately wrote down every exact step so I&#8217;d never have to do that again.<\/p>\n\n\n\n<p>The guide below skips the vague tips: it walks through every command, every folder path, and every error I hit, along with the exact fix. Let&#8217;s get into it.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"prerequisites-comfyui-version-gpu-vram-disk-space\">Prerequisites (ComfyUI version, GPU VRAM, disk space)<\/h2>\n\n\n\n<p>Before downloading anything, map your system against this table. 
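<\/p>\n\n\n\n<p>One row worth scripting is the Python check, since a wrong interpreter version tends to fail in confusing ways much later. A minimal stdlib sketch; the version bounds come from the table below, and the helper name is my own:<\/p>

```python
import sys

# Bounds taken from the prerequisites table: 3.10 minimum, 3.11
# recommended, 3.12 untested and assumed broken. Helper name is mine.
def python_version_ok(version=None):
    major, minor = (version or sys.version_info)[:2]
    return (3, 10) <= (major, minor) <= (3, 11)

print('Python', sys.version.split()[0], '->',
      'OK' if python_version_ok() else 'outside tested range')
```

<p>If it prints outside tested range, fix the interpreter before touching anything else in this guide.<\/p>\n\n\n\n<p>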
I tested across three machines; this reflects what actually works.<\/p>\n\n\n\n<figure class=\"wp-block-table\"><table class=\"has-fixed-layout\"><tbody><tr><td class=\"has-text-align-center\" data-align=\"center\">Component<\/td><td class=\"has-text-align-center\" data-align=\"center\">Minimum<\/td><td class=\"has-text-align-center\" data-align=\"center\">Recommended<\/td><td class=\"has-text-align-center\" data-align=\"center\">Notes<\/td><\/tr><tr><td class=\"has-text-align-center\" data-align=\"center\">ComfyUI build<\/td><td class=\"has-text-align-center\" data-align=\"center\">Nightly 20260101+<\/td><td class=\"has-text-align-center\" data-align=\"center\">Nightly 20260310+<\/td><td class=\"has-text-align-center\" data-align=\"center\">Older builds missing LTXVModelLoader node<\/td><\/tr><tr><td class=\"has-text-align-center\" data-align=\"center\">GPU VRAM<\/td><td class=\"has-text-align-center\" data-align=\"center\">12 GB<\/td><td class=\"has-text-align-center\" data-align=\"center\">24 GB<\/td><td class=\"has-text-align-center\" data-align=\"center\">12 GB requires distilled + fp8 offload<\/td><\/tr><tr><td class=\"has-text-align-center\" data-align=\"center\">Disk space<\/td><td class=\"has-text-align-center\" data-align=\"center\">16 GB (distilled only)<\/td><td class=\"has-text-align-center\" data-align=\"center\">35 GB (both models)<\/td><td class=\"has-text-align-center\" data-align=\"center\">Include VAE + T5-XXL<\/td><\/tr><tr><td class=\"has-text-align-center\" data-align=\"center\">Python<\/td><td class=\"has-text-align-center\" data-align=\"center\">3.10<\/td><td class=\"has-text-align-center\" data-align=\"center\">3.11<\/td><td class=\"has-text-align-center\" data-align=\"center\">3.12 untested, likely broken<\/td><\/tr><tr><td class=\"has-text-align-center\" data-align=\"center\">CUDA<\/td><td class=\"has-text-align-center\" data-align=\"center\">11.8<\/td><td class=\"has-text-align-center\" data-align=\"center\">12.3<\/td><td 
class=\"has-text-align-center\" data-align=\"center\">ROCm: experimental, Linux only<\/td><\/tr><tr><td class=\"has-text-align-center\" data-align=\"center\">OS<\/td><td class=\"has-text-align-center\" data-align=\"center\">Windows 10, Ubuntu 22.04<\/td><td class=\"has-text-align-center\" data-align=\"center\">Same<\/td><td class=\"has-text-align-center\" data-align=\"center\">macOS: no CUDA, not viable<\/td><\/tr><\/tbody><\/table><\/figure>\n\n\n\n<p><strong>Check your ComfyUI build version<\/strong> before proceeding. Open ComfyUI&#8217;s terminal and run:<\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>cd ComfyUI\ngit log --oneline -1<\/code><\/pre>\n\n\n\n<p>If the commit date is before January 2026, update first:<\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>git pull origin master\npip install -r requirements.txt<\/code><\/pre>\n\n\n\n<p><strong>Step 1 \u2014 Install Required <a href=\"https:\/\/packaging.python.org\/en\/latest\/tutorials\/installing-packages\/\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">Python Packages<\/a><\/strong><\/p>\n\n\n\n<p>This must happen before you download any models. 
The packages tell ComfyUI how to interpret LTX 2.3&#8217;s architecture.<\/p>\n\n\n\n<p><strong>Identify which Python your ComfyUI uses.<\/strong> This trips people up constantly because they install packages into the wrong environment.<\/p>\n\n\n\n<figure class=\"wp-block-gallery has-nested-images columns-default is-cropped wp-block-gallery-1 is-layout-flex wp-block-gallery-is-layout-flex\">\n<figure class=\"wp-block-image size-full\"><img decoding=\"async\" width=\"894\" height=\"499\" data-id=\"5776\" data-src=\"https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/03\/image-129.png\" alt=\"\" class=\"wp-image-5776 lazyload\" data-srcset=\"https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/03\/image-129.png 894w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/03\/image-129-300x167.png 300w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/03\/image-129-768x429.png 768w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/03\/image-129-18x10.png 18w\" data-sizes=\"auto, (max-width: 894px) 100vw, 894px\" src=\"data:image\/svg+xml;base64,PHN2ZyB3aWR0aD0iMSIgaGVpZ2h0PSIxIiB4bWxucz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciPjwvc3ZnPg==\" style=\"--smush-placeholder-width: 894px; --smush-placeholder-aspect-ratio: 894\/499;\" \/><\/figure>\n<\/figure>\n\n\n\n<p><em>If you use the ComfyUI portable package (Windows):<\/em><\/p>\n\n\n\n<pre class=\"wp-block-code\"><code># Navigate to your ComfyUI portable folder first\ncd C:\\ComfyUI_windows_portable\npython_embeded\\python.exe -m pip install ltx-core ltx-pipelines<\/code><\/pre>\n\n\n\n<p><em>If you use a venv or <\/em><em>conda<\/em><em> environment:<\/em><\/p>\n\n\n\n<pre class=\"wp-block-code\"><code># Activate your environment first\nconda activate comfyui   # or: source venv\/bin\/activate\npip install ltx-core ltx-pipelines<\/code><\/pre>\n\n\n\n<p><strong>Verify the <\/strong><strong>install<\/strong><strong> worked:<\/strong><\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>python -c \"import ltx_core; 
print(ltx_core.__version__)\"\n# Should print: 0.9.2 or higher\npython -c \"import ltx_pipelines; print('OK')\"<\/code><\/pre>\n\n\n\n<p>If you get <code>ModuleNotFoundError<\/code> on the verification step, you installed the wrong Python. Re-read the step above and confirm which <code>python<\/code> executable is actually running ComfyUI.<\/p>\n\n\n\n<p><strong>Step 2 \u2014 Download LTX 2.3 Weights<\/strong><\/p>\n\n\n\n<h3 class=\"wp-block-heading\" id=\"dev-bf16-vs-distilled-model-which-to-get\">Dev (bf16) vs Distilled model \u2014 which to get<\/h3>\n\n\n\n<p>I&#8217;ve run both through hundreds of generations. Here&#8217;s the real tradeoff:<\/p>\n\n\n\n<figure class=\"wp-block-table\"><table class=\"has-fixed-layout\"><tbody><tr><td class=\"has-text-align-center\" data-align=\"center\"><\/td><td class=\"has-text-align-center\" data-align=\"center\">Dev (bf16)<\/td><td class=\"has-text-align-center\" data-align=\"center\">Distilled<\/td><\/tr><tr><td class=\"has-text-align-center\" data-align=\"center\">File size<\/td><td class=\"has-text-align-center\" data-align=\"center\">~27 GB<\/td><td class=\"has-text-align-center\" data-align=\"center\">~14 GB<\/td><\/tr><tr><td class=\"has-text-align-center\" data-align=\"center\">Inference steps<\/td><td class=\"has-text-align-center\" data-align=\"center\">40\u201350<\/td><td class=\"has-text-align-center\" data-align=\"center\">6\u20138<\/td><\/tr><tr><td class=\"has-text-align-center\" data-align=\"center\">Generation time (4090)<\/td><td class=\"has-text-align-center\" data-align=\"center\">~4 min\/clip<\/td><td class=\"has-text-align-center\" data-align=\"center\">~45 sec\/clip<\/td><\/tr><tr><td class=\"has-text-align-center\" data-align=\"center\">Motion quality<\/td><td class=\"has-text-align-center\" data-align=\"center\">High<\/td><td class=\"has-text-align-center\" data-align=\"center\">Moderate<\/td><\/tr><tr><td class=\"has-text-align-center\" data-align=\"center\">Complex scenes<\/td><td 
class=\"has-text-align-center\" data-align=\"center\">Excellent<\/td><td class=\"has-text-align-center\" data-align=\"center\">Struggles<\/td><\/tr><tr><td class=\"has-text-align-center\" data-align=\"center\">Prompt iteration<\/td><td class=\"has-text-align-center\" data-align=\"center\">Slow<\/td><td class=\"has-text-align-center\" data-align=\"center\">Fast<\/td><\/tr><\/tbody><\/table><\/figure>\n\n\n\n<p><strong>My actual <\/strong><strong>workflow<\/strong>: I use distilled for prompt exploration (testing 10\u201315 prompt variations quickly), then switch to dev for the final output I&#8217;ll publish. If you have a 12 GB GPU, start with distilled only.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\" id=\"where-to-download-huggingface-github-monorepo\">Where to download (HuggingFace, GitHub monorepo)<\/h3>\n\n\n\n<p>All official weights are hosted on <strong>HuggingFace<\/strong> by Lightricks:<\/p>\n\n\n\n<figure class=\"wp-block-gallery has-nested-images columns-default is-cropped wp-block-gallery-2 is-layout-flex wp-block-gallery-is-layout-flex\">\n<figure class=\"wp-block-image size-full\"><img decoding=\"async\" width=\"793\" height=\"439\" data-id=\"5777\" data-src=\"https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/03\/image-130.png\" alt=\"\" class=\"wp-image-5777 lazyload\" data-srcset=\"https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/03\/image-130.png 793w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/03\/image-130-300x166.png 300w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/03\/image-130-768x425.png 768w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/03\/image-130-18x10.png 18w\" data-sizes=\"auto, (max-width: 793px) 100vw, 793px\" src=\"data:image\/svg+xml;base64,PHN2ZyB3aWR0aD0iMSIgaGVpZ2h0PSIxIiB4bWxucz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciPjwvc3ZnPg==\" style=\"--smush-placeholder-width: 793px; --smush-placeholder-aspect-ratio: 793\/439;\" \/><\/figure>\n<\/figure>\n\n\n\n<ul 
class=\"wp-block-list\">\n<li><strong><a href=\"https:\/\/huggingface.co\/Lightricks\/LTX-Video\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">Dev model<\/a><\/strong><\/li>\n\n\n\n<li><strong><a href=\"https:\/\/huggingface.co\/spaces\/Lightricks\/ltx-video-distilled\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">Distilled model<\/a><\/strong><\/li>\n\n\n\n<li><strong><a href=\"https:\/\/huggingface.co\/mcmonkey\/google_t5-v1_1-xxl_encoderonly\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">T5-XXL text encoder<\/a><\/strong><\/li>\n<\/ul>\n\n\n\n<p><strong>Download via terminal (faster than browser for large files):<\/strong><\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>pip install huggingface_hub\n\n# Dev model\nhuggingface-cli download Lightricks\/LTX-Video \\\n  ltx-video-2b-v0.9.7-dev-bf16.safetensors \\\n  --local-dir .\/downloads\n\n# VAE\nhuggingface-cli download Lightricks\/LTX-Video \\\n  ltxvideo_vae_bf16.safetensors \\\n  --local-dir .\/downloads<\/code><\/pre>\n\n\n\n<p><strong>Verify file integrity after download<\/strong> (sizes as of March 2026):<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Dev bf16: <code>27.1 GB<\/code> \u2014 if your download is significantly different, re-download<\/li>\n\n\n\n<li>Distilled: <code>13.8 GB<\/code><\/li>\n\n\n\n<li>T5-XXL: <code>9.8 GB<\/code><\/li>\n<\/ul>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"install-the-comfyui-custom-node-pack\">Install the ComfyUI Custom Node Pack<\/h2>\n\n\n\n<p>The model files alone aren&#8217;t enough. 
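<\/p>\n\n\n\n<p>Before wiring anything up, it&#8217;s worth confirming the downloads from the previous step are actually complete. A small stdlib sketch; the expected sizes mirror the March 2026 figures listed above, and the downloads folder is an example path:<\/p>

```python
from pathlib import Path

# Expected sizes in GB, per the March 2026 figures listed above.
EXPECTED_GB = {
    'ltx-video-2b-v0.9.7-dev-bf16.safetensors': 27.1,
    'ltx-video-2b-v0.9.7-distilled-bf16.safetensors': 13.8,
    't5xxl_fp16.safetensors': 9.8,
}

# True if the file exists and is within 5 percent of the expected size;
# a partial download shows up as a large size mismatch.
def size_ok(path, expected_gb, tolerance=0.05):
    p = Path(path)
    if not p.is_file():
        return False
    actual_gb = p.stat().st_size / 1e9
    return abs(actual_gb - expected_gb) / expected_gb <= tolerance

for name, gb in EXPECTED_GB.items():
    status = 'OK' if size_ok(Path('downloads') / name, gb) else 'missing or wrong size'
    print(name + ': ' + status)
```

<p>Any mismatch line is worth a re-download before you continue.<\/p>\n\n\n\n<p>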
You need the LTX-specific <a href=\"https:\/\/www.comfy.org\/zh-cn\/\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">ComfyUI<\/a> nodes.<\/p>\n\n\n\n<p><strong>Option A \u2014 Via ComfyUI Manager (easiest):<\/strong><\/p>\n\n\n\n<ol start=\"1\" class=\"wp-block-list\">\n<li>Open ComfyUI in your browser<\/li>\n\n\n\n<li>Click <strong>Manager<\/strong> \u2192 <strong>Install<\/strong><strong> Custom Nodes<\/strong><\/li>\n\n\n\n<li>Search: <code>ComfyUI-LTX-Video<\/code><\/li>\n\n\n\n<li>Install \u2192 Restart ComfyUI<\/li>\n<\/ol>\n\n\n\n<p><strong>Option B \u2014 Manual <\/strong><strong>install<\/strong><strong>:<\/strong><\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>cd ComfyUI\/custom_nodes\ngit clone https:\/\/github.com\/Lightricks\/ComfyUI-LTXVideo\ncd ComfyUI-LTXVideo\npip install -r requirements.txt<\/code><\/pre>\n\n\n\n<p>After installing, fully restart ComfyUI (kill the process \u2014 don&#8217;t just refresh the browser).<\/p>\n\n\n\n<figure class=\"wp-block-gallery has-nested-images columns-default is-cropped wp-block-gallery-3 is-layout-flex wp-block-gallery-is-layout-flex\">\n<figure class=\"wp-block-image size-full\"><img decoding=\"async\" width=\"1024\" height=\"515\" data-id=\"5778\" data-src=\"https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/03\/image-131.png\" alt=\"\" class=\"wp-image-5778 lazyload\" data-srcset=\"https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/03\/image-131.png 1024w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/03\/image-131-300x151.png 300w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/03\/image-131-768x386.png 768w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/03\/image-131-18x9.png 18w\" data-sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" src=\"data:image\/svg+xml;base64,PHN2ZyB3aWR0aD0iMSIgaGVpZ2h0PSIxIiB4bWxucz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciPjwvc3ZnPg==\" style=\"--smush-placeholder-width: 1024px; --smush-placeholder-aspect-ratio: 1024\/515;\" 
\/><\/figure>\n<\/figure>\n\n\n\n<p><strong>Verify nodes loaded correctly:<\/strong> In the ComfyUI node search (double-click canvas), type <code>LTX<\/code>. You should see <code>LTXVModelLoader<\/code>, <code>LTXVSampler<\/code>, and <code>LTXVScheduler<\/code>. If these don&#8217;t appear, the custom node install failed.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"folder-paths-and-file-placement\">Folder paths and file placement<\/h2>\n\n\n\n<p>This is where most failed installs happen. LTX 2.3 weights are <strong>not<\/strong> checkpoint files \u2014 they use a different loader and must live in a different folder.<\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>ComfyUI\/\n\u2514\u2500\u2500 models\/\n    \u251c\u2500\u2500 checkpoints\/\n    \u2502   \u2514\u2500\u2500 (your SD\/SDXL models \u2014 NOT LTX weights)\n    \u2502\n    \u251c\u2500\u2500 diffusion_models\/          \u2190 LTX model weights go HERE\n    \u2502   \u251c\u2500\u2500 ltx-video-2b-v0.9.7-dev-bf16.safetensors\n    \u2502   \u2514\u2500\u2500 ltx-video-2b-v0.9.7-distilled-bf16.safetensors\n    \u2502\n    \u251c\u2500\u2500 vae\/                       \u2190 VAE goes here\n    \u2502   \u2514\u2500\u2500 ltxvideo_vae_bf16.safetensors\n    \u2502\n    \u2514\u2500\u2500 clip\/                      \u2190 T5-XXL text encoder goes here\n        \u2514\u2500\u2500 t5xxl_fp16.safetensors<\/code><\/pre>\n\n\n\n<p><strong>If <\/strong><strong><code>diffusion_models\/<\/code><\/strong><strong> doesn&#8217;t exist yet:<\/strong><\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>mkdir ComfyUI\/models\/diffusion_models<\/code><\/pre>\n\n\n\n<p>After moving files, <strong>do not rename them<\/strong>. The node loader uses partial name matching and renaming can cause <code>KeyError<\/code> on load.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"load-the-official-comfyui-workflow\">Load the official ComfyUI workflow<\/h2>\n\n\n\n<p>Lightricks ships tested starter workflows. 
Use these instead of building from scratch \u2014 you&#8217;ll avoid node version mismatches.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\" id=\"t2v-starter-workflow\">T2V starter workflow<\/h3>\n\n\n\n<ol start=\"1\" class=\"wp-block-list\">\n<li><a href=\"https:\/\/github.com\/Lightricks\/ComfyUI-LTXVideo\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">Download from the official repo<\/a>:<\/li>\n\n\n\n<li>In ComfyUI: drag-and-drop the JSON onto the canvas, or use <strong>Load<\/strong> from the menu<\/li>\n\n\n\n<li>In the <code>LTXVModelLoader<\/code> node, select your model from the dropdown<\/li>\n\n\n\n<li>In the VAE loader node, select <code>ltxvideo_vae_bf16<\/code><\/li>\n\n\n\n<li>In the CLIP loader node, select your T5-XXL file<\/li>\n<\/ol>\n\n\n\n<p><strong>Writing prompts that work<\/strong>: LTX 2.3 responds strongly to motion language. In my tests, adding camera and movement description consistently improved output quality:<\/p>\n\n\n\n<p><em>Weak prompt<\/em>: &#8220;A woman walking through a market at golden hour&#8221; <em>Strong prompt<\/em>: &#8220;Camera slowly tracking left, a woman walking through a sunlit outdoor market, golden hour lighting, fabric stalls in background, cinematic depth of field&#8221;<\/p>\n\n\n\n<p>The second prompt type produced noticeably more stable, intentional motion in my testing.<\/p>\n\n\n\n<figure class=\"wp-block-gallery has-nested-images columns-default is-cropped wp-block-gallery-4 is-layout-flex wp-block-gallery-is-layout-flex\">\n<figure class=\"wp-block-image size-large\"><img decoding=\"async\" width=\"1024\" height=\"466\" data-id=\"5779\" data-src=\"https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/03\/image-132-1024x466.png\" alt=\"\" class=\"wp-image-5779 lazyload\" data-srcset=\"https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/03\/image-132-1024x466.png 1024w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/03\/image-132-300x137.png 300w, 
https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/03\/image-132-768x350.png 768w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/03\/image-132-18x8.png 18w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/03\/image-132.png 1100w\" data-sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" src=\"data:image\/svg+xml;base64,PHN2ZyB3aWR0aD0iMSIgaGVpZ2h0PSIxIiB4bWxucz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciPjwvc3ZnPg==\" style=\"--smush-placeholder-width: 1024px; --smush-placeholder-aspect-ratio: 1024\/466;\" \/><\/figure>\n<\/figure>\n\n\n\n<h3 class=\"wp-block-heading\" id=\"i2v-starter-workflow\">I2V starter workflow<\/h3>\n\n\n\n<ol start=\"1\" class=\"wp-block-list\">\n<li>Download the I2V workflow JSON from the same official repo<\/li>\n\n\n\n<li>Load the same way as T2V<\/li>\n\n\n\n<li>In the image input node, load your source image<\/li>\n<\/ol>\n\n\n\n<p><strong>Critical<\/strong>: Your source image must match your target generation resolution. Mismatch causes warped, distorted output. I generate at 768\u00d7432 (16:9) and resize source images to match before loading them in.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"verify-your-install-with-sanity-check\">Verify Your Install (Sanity Check)<\/h2>\n\n\n\n<p>Don&#8217;t run a full 81-frame generation as your first test. 
Do this instead:<\/p>\n\n\n\n<ol start=\"1\" class=\"wp-block-list\">\n<li>In the workflow, set <code>num_frames<\/code> to <strong>9<\/strong><\/li>\n\n\n\n<li>Use this simple test prompt: <code>\"a red ball rolling slowly on a white table, smooth camera\"<\/code><\/li>\n\n\n\n<li>Set steps to <strong>20<\/strong> (dev model) or <strong>6<\/strong> (distilled)<\/li>\n\n\n\n<li>Queue the generation<\/li>\n<\/ol>\n\n\n\n<p><strong>Watch the terminal during generation:<\/strong><\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li><code>Loading model...<\/code> \u2192 model path is correct \u2705<\/li>\n\n\n\n<li><code>Step 1\/20<\/code> appearing in progress \u2192 sampler initialized correctly \u2705<\/li>\n\n\n\n<li><code>Saving output...<\/code> \u2192 generation complete \u2705<\/li>\n<\/ul>\n\n\n\n<p>A successful 9-frame test should complete in under 30 seconds on the distilled model with a 4090. If you see the progress bar start, your install is working. Run a full-length clip next.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"common-install-errors-and-exact-fixes\">Common Install Errors and Exact Fixes<\/h2>\n\n\n\n<p><strong>Error: <\/strong><strong><code>ModuleNotFoundError: No module named 'ltx_core'<\/code><\/strong><\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>Cause: Packages installed in wrong Python environment\nFix: Identify which python runs ComfyUI, re-run pip install with that exact executable\nVerify: python -c \"import ltx_core\" must return no error<\/code><\/pre>\n\n\n\n<p><strong>Error: <\/strong><strong><code>RuntimeError: size mismatch for transformer.blocks...<\/code><\/strong><\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>Cause: Model file placed in wrong folder (checkpoints\/ instead of diffusion_models\/)\nFix: Move .safetensors file to ComfyUI\/models\/diffusion_models\/<\/code><\/pre>\n\n\n\n<p><strong>Error: <\/strong><strong><code>KeyError: 'LTXVModelLoader'<\/code><\/strong><strong> (red nodes in 
<\/strong><strong>workflow<\/strong><strong>)<\/strong><\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>Cause: Custom node pack not installed or failed to load\nFix: Check ComfyUI terminal for import errors after restart\n     Re-run: pip install -r ComfyUI\/custom_nodes\/ComfyUI-LTXVideo\/requirements.txt<\/code><\/pre>\n\n\n\n<p><strong>Error: Black video output, no error messages<\/strong><\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>Cause: VAE node pointing to wrong file (often a generic SD VAE)\nFix: Open VAE loader node, explicitly select ltxvideo_vae_bf16.safetensors<\/code><\/pre>\n\n\n\n<p><strong>Error: <\/strong><strong><code>CUDA out of memory<\/code><\/strong><strong> on step 1<\/strong><\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>Cause: VRAM insufficient for chosen model\/resolution\nFix: In LTXVModelLoader node, enable \"enable_sequential_cpu_offload\"\n     Or: switch to distilled model and reduce resolution to 512x320<\/code><\/pre>\n\n\n\n<p><strong>Error: Corrupted or incomplete output video<\/strong><\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>Cause: T5-XXL text encoder missing or wrong version\nFix: Confirm T5-XXL file is in ComfyUI\/models\/clip\/ and is fp16 version (~9.8 GB)<\/code><\/pre>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"note-if-you-have-old-ltx-2-do-not-reuse-weights\">Note: If You Have Old LTX-2 \u2014 Do Not Reuse Weights<\/h2>\n\n\n\n<p>If you have weights from any previous LTX version (2.0, 2.1, 2.2), <strong>do not use them with the 2.3 workflows<\/strong>. The transformer block architecture changed between versions. Loading old weights through the new node loader either crashes immediately or produces visually corrupted output with no error message.<\/p>\n\n\n\n<p>Keep old weights in a separate folder if you need them for archived workflows. 
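<\/p>\n\n\n\n<p>One way to sweep them aside without deleting anything is a small move script. A stdlib sketch; the version tags and folder names here are my own examples, so adjust them to how your old files are actually named:<\/p>

```python
import shutil
from pathlib import Path

# Filename fragments that mark pre-2.3 weights. These tags are my own
# examples; adjust them to match your actual old filenames.
OLD_TAGS = ('ltx-2.0', 'ltx-2.1', 'ltx-2.2')

# Move matching .safetensors files out of models_dir into archive_dir
# and return the names that were moved. Nothing is deleted.
def archive_old_weights(models_dir, archive_dir):
    src, dst = Path(models_dir), Path(archive_dir)
    dst.mkdir(parents=True, exist_ok=True)
    moved = []
    for f in src.glob('*.safetensors'):
        if any(tag in f.name.lower() for tag in OLD_TAGS):
            shutil.move(str(f), str(dst / f.name))
            moved.append(f.name)
    return sorted(moved)
```

<p>Run it once per models folder and point archive_dir anywhere outside ComfyUI&#8217;s model paths.<\/p>\n\n\n\n<p>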
Treat 2.3 as a completely fresh install.<\/p>\n\n\n\n<p><strong>AMD GPU (ROCm) \u2014 What Actually Works in 2026<\/strong><\/p>\n\n\n\n<p>Since I see this question constantly: I tested on an RX 7900 XTX (24 GB) running ROCm 6.1 on Ubuntu 22.04. Here&#8217;s the real status:<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>Installation<\/strong>: Works, but requires <code>HSA_OVERRIDE_GFX_VERSION=11.0.0<\/code> environment variable set before launching ComfyUI<\/li>\n\n\n\n<li><strong>Generation<\/strong>: Functional but ~2.3\u00d7 slower than equivalent NVIDIA hardware<\/li>\n\n\n\n<li><strong>Stability<\/strong>: Occasional silent failures mid-generation (no error, just stops)<\/li>\n\n\n\n<li><strong>Windows AMD<\/strong>: Not viable \u2014 ROCm on Windows is still too immature<\/li>\n<\/ul>\n\n\n\n<p>If you&#8217;re on AMD\/Linux and want to try: the LTX GitHub issues page has a pinned ROCm thread with community-tested launch flags.<\/p>\n\n\n\n<figure class=\"wp-block-gallery has-nested-images columns-default is-cropped wp-block-gallery-5 is-layout-flex wp-block-gallery-is-layout-flex\">\n<figure class=\"wp-block-image size-full\"><img decoding=\"async\" width=\"942\" height=\"325\" data-id=\"5780\" data-src=\"https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/03\/image-133.png\" alt=\"\" class=\"wp-image-5780 lazyload\" data-srcset=\"https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/03\/image-133.png 942w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/03\/image-133-300x104.png 300w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/03\/image-133-768x265.png 768w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/03\/image-133-18x6.png 18w\" data-sizes=\"auto, (max-width: 942px) 100vw, 942px\" 
src=\"data:image\/svg+xml;base64,PHN2ZyB3aWR0aD0iMSIgaGVpZ2h0PSIxIiB4bWxucz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciPjwvc3ZnPg==\" style=\"--smush-placeholder-width: 942px; --smush-placeholder-aspect-ratio: 942\/325;\" \/><\/figure>\n<\/figure>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"faq\">FAQ<\/h2>\n\n\n\n<p><strong>Q: Why does the workflow show red nodes after loading?<\/strong> A: Red nodes mean the custom node pack isn&#8217;t installed or fails to load on startup. Check your ComfyUI terminal for Python import errors after restart. Re-run <code>pip install -r requirements.txt<\/code> inside the <code>ComfyUI-LTXVideo<\/code> folder.<\/p>\n\n\n\n<p><strong>Q: Can I run LTX 2.3 with 8 GB VRAM?<\/strong> A: In testing, 8 GB is not reliable even with the distilled model. At minimum resolution (512\u00d7320, 9 frames) it sometimes completes, but larger generations fail. 12 GB with CPU offloading enabled is the practical floor.<\/p>\n\n\n\n<p><strong>Q: What&#8217;s the maximum video length?<\/strong> A: The model supports up to 257 frames. At 24fps that&#8217;s approximately 10.7 seconds. In practice, I see quality degradation after ~161 frames (6.7 seconds) at 24fps \u2014 motion consistency weakens in later frames. For longer content, I generate in segments and join in post.<\/p>\n\n\n\n<p><strong>Q: Do I need to download T5-XXL separately, or is it included?<\/strong> A: Separately. The HuggingFace repo at <code>Lightricks\/LTX-Video<\/code> contains the diffusion model and VAE only. T5-XXL comes from <code>mcmonkey\/google_t5-v1_1-xxl_encoderonly<\/code>. 
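<\/p>\n\n\n\n<p>A quick preflight sketch that checks all three pieces sit where the folder layout earlier expects them; the filenames are the ones used in this guide, so treat them as examples if yours differ:<\/p>

```python
from pathlib import Path

# Locations per the folder layout shown earlier in this guide.
REQUIRED = {
    'diffusion model': 'models/diffusion_models/ltx-video-2b-v0.9.7-dev-bf16.safetensors',
    'VAE': 'models/vae/ltxvideo_vae_bf16.safetensors',
    'T5-XXL encoder': 'models/clip/t5xxl_fp16.safetensors',
}

# Map each required component to whether its file is present.
def preflight(comfy_root='ComfyUI'):
    root = Path(comfy_root)
    return {label: (root / rel).is_file() for label, rel in REQUIRED.items()}

for label, present in preflight().items():
    print(label + ': ' + ('found' if present else 'MISSING'))
```

<p>All three lines should read found before you queue anything.<\/p>\n\n\n\n<p>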
Both are required \u2014 without T5-XXL, the CLIP loader node will error and generation won&#8217;t start.<\/p>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\" \/>\n\n\n\n<p>Previous Posts:<\/p>\n\n\n\n<figure class=\"wp-block-embed is-type-wp-embed is-provider-crepal-content-center wp-block-embed-crepal-content-center\"><div class=\"wp-block-embed__wrapper\">\n<blockquote class=\"wp-embedded-content\" data-secret=\"FS7bN7cFdV\"><a href=\"https:\/\/crepal.ai\/blog\/aivideo\/blog-seedance-2-0-prompt-engineering-guide\/\">Seedance 2.0 Prompt Engineering: The Exact Structure That Gets Consistent Results<\/a><\/blockquote><iframe class=\"wp-embedded-content lazyload\" sandbox=\"allow-scripts\" security=\"restricted\" style=\"position: absolute; visibility: hidden;\" title=\"\u300a Seedance 2.0 Prompt Engineering: The Exact Structure That Gets Consistent Results \u300b\u2014CrePal Content Center\" data-src=\"https:\/\/crepal.ai\/blog\/aivideo\/blog-seedance-2-0-prompt-engineering-guide\/embed\/#?secret=bE8XnTpx9z#?secret=FS7bN7cFdV\" data-secret=\"FS7bN7cFdV\" width=\"600\" height=\"338\" frameborder=\"0\" marginwidth=\"0\" marginheight=\"0\" scrolling=\"no\" src=\"data:image\/svg+xml;base64,PHN2ZyB3aWR0aD0iMSIgaGVpZ2h0PSIxIiB4bWxucz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciPjwvc3ZnPg==\" data-load-mode=\"1\"><\/iframe>\n<\/div><\/figure>\n\n\n\n<figure class=\"wp-block-embed is-type-wp-embed is-provider-crepal-content-center wp-block-embed-crepal-content-center\"><div class=\"wp-block-embed__wrapper\">\n<blockquote class=\"wp-embedded-content\" data-secret=\"1ApRvoefEe\"><a href=\"https:\/\/crepal.ai\/blog\/aivideo\/blog-seedance-2-0-export-settings-tiktok-reels-shorts\/\">Seedance 2.0 Export Settings: Best Specs for TikTok, Reels, and Shorts (No Upload Surprises)<\/a><\/blockquote><iframe class=\"wp-embedded-content lazyload\" sandbox=\"allow-scripts\" security=\"restricted\" style=\"position: absolute; visibility: hidden;\" title=\"\u300a Seedance 2.0 
Export Settings: Best Specs for TikTok, Reels, and Shorts (No Upload Surprises) \u300b\u2014CrePal Content Center\" data-src=\"https:\/\/crepal.ai\/blog\/aivideo\/blog-seedance-2-0-export-settings-tiktok-reels-shorts\/embed\/#?secret=czQ5IhDzuK#?secret=1ApRvoefEe\" data-secret=\"1ApRvoefEe\" width=\"600\" height=\"338\" frameborder=\"0\" marginwidth=\"0\" marginheight=\"0\" scrolling=\"no\" src=\"data:image\/svg+xml;base64,PHN2ZyB3aWR0aD0iMSIgaGVpZ2h0PSIxIiB4bWxucz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciPjwvc3ZnPg==\" data-load-mode=\"1\"><\/iframe>\n<\/div><\/figure>\n\n\n\n<figure class=\"wp-block-embed is-type-wp-embed is-provider-crepal-content-center wp-block-embed-crepal-content-center\"><div class=\"wp-block-embed__wrapper\">\n<blockquote class=\"wp-embedded-content\" data-secret=\"TMG9YWqNqb\"><a href=\"https:\/\/crepal.ai\/blog\/aivideo\/blog-seedance-2-0-multi-shot-marketing-video\/\">How to Build Multi Shot Marketing Videos With Seedance 2.0 (Without Losing Consistency)<\/a><\/blockquote><iframe class=\"wp-embedded-content lazyload\" sandbox=\"allow-scripts\" security=\"restricted\" style=\"position: absolute; visibility: hidden;\" title=\"\u300a How to Build Multi Shot Marketing Videos With Seedance 2.0 (Without Losing Consistency) \u300b\u2014CrePal Content Center\" data-src=\"https:\/\/crepal.ai\/blog\/aivideo\/blog-seedance-2-0-multi-shot-marketing-video\/embed\/#?secret=MjCp2lQpfU#?secret=TMG9YWqNqb\" data-secret=\"TMG9YWqNqb\" width=\"600\" height=\"338\" frameborder=\"0\" marginwidth=\"0\" marginheight=\"0\" scrolling=\"no\" src=\"data:image\/svg+xml;base64,PHN2ZyB3aWR0aD0iMSIgaGVpZ2h0PSIxIiB4bWxucz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciPjwvc3ZnPg==\" data-load-mode=\"1\"><\/iframe>\n<\/div><\/figure>\n\n\n\n<figure class=\"wp-block-embed is-type-wp-embed is-provider-crepal-content-center wp-block-embed-crepal-content-center\"><div class=\"wp-block-embed__wrapper\">\n<blockquote class=\"wp-embedded-content\" data-secret=\"UwyL9KHrkr\"><a 
href=\"https:\/\/crepal.ai\/blog\/aivideo\/wan-2-6-comfyui-image-to-video\/\">How to Create Image-to-Video with Wan 2.6 in ComfyUI (Easy 2026 Guide)<\/a><\/blockquote><iframe class=\"wp-embedded-content lazyload\" sandbox=\"allow-scripts\" security=\"restricted\" style=\"position: absolute; visibility: hidden;\" title=\"\u300a How to Create Image-to-Video with Wan 2.6 in ComfyUI (Easy 2026 Guide) \u300b\u2014CrePal Content Center\" data-src=\"https:\/\/crepal.ai\/blog\/aivideo\/wan-2-6-comfyui-image-to-video\/embed\/#?secret=cktONpF4fN#?secret=UwyL9KHrkr\" data-secret=\"UwyL9KHrkr\" width=\"600\" height=\"338\" frameborder=\"0\" marginwidth=\"0\" marginheight=\"0\" scrolling=\"no\" src=\"data:image\/svg+xml;base64,PHN2ZyB3aWR0aD0iMSIgaGVpZ2h0PSIxIiB4bWxucz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciPjwvc3ZnPg==\" data-load-mode=\"1\"><\/iframe>\n<\/div><\/figure>\n\n\n\n<figure class=\"wp-block-embed is-type-wp-embed is-provider-crepal-content-center wp-block-embed-crepal-content-center\"><div class=\"wp-block-embed__wrapper\">\n<blockquote class=\"wp-embedded-content\" data-secret=\"Qik92iCEEm\"><a href=\"https:\/\/crepal.ai\/blog\/aivideo\/free-ai-video-tools\/\">Best Free AI Video Tools (2026) \u2014 Compare Features &amp; Outputs<\/a><\/blockquote><iframe class=\"wp-embedded-content lazyload\" sandbox=\"allow-scripts\" security=\"restricted\" style=\"position: absolute; visibility: hidden;\" title=\"\u300a Best Free AI Video Tools (2026) \u2014 Compare Features &amp; Outputs \u300b\u2014CrePal Content Center\" data-src=\"https:\/\/crepal.ai\/blog\/aivideo\/free-ai-video-tools\/embed\/#?secret=eFXNhYfXEc#?secret=Qik92iCEEm\" data-secret=\"Qik92iCEEm\" width=\"600\" height=\"338\" frameborder=\"0\" marginwidth=\"0\" marginheight=\"0\" scrolling=\"no\" src=\"data:image\/svg+xml;base64,PHN2ZyB3aWR0aD0iMSIgaGVpZ2h0PSIxIiB4bWxucz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciPjwvc3ZnPg==\" 
data-load-mode=\"1\"><\/iframe>\n<\/div><\/figure>\n","protected":false},"excerpt":{"rendered":"<p>Hello fellows! This is Dora \u2014 and recently I finally cracked the clean install sequence for LTX 2.3. I&#8217;d spent two hours on a broken node graph. Right package version, wrong folder. Right folder, missing VAE. Right VAE, wrong ComfyUI build. When it finally rendered a clean 9-frame clip, I immediately wrote down every exact [&hellip;]<\/p>\n","protected":false},"author":5,"featured_media":5775,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_gspb_post_css":"","_uag_custom_page_level_css":"","footnotes":""},"categories":[8],"tags":[],"class_list":["post-5774","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-aivideo"],"blocksy_meta":[],"uagb_featured_image_src":{"full":["https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/03\/image-7.jpeg",1376,768,false],"thumbnail":["https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/03\/image-7-150x150.jpeg",150,150,true],"medium":["https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/03\/image-7-300x167.jpeg",300,167,true],"medium_large":["https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/03\/image-7-768x429.jpeg",768,429,true],"large":["https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/03\/image-7-1024x572.jpeg",1024,572,true],"1536x1536":["https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/03\/image-7.jpeg",1376,768,false],"2048x2048":["https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/03\/image-7.jpeg",1376,768,false],"trp-custom-language-flag":["https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2026\/03\/image-7-18x10.jpeg",18,10,true]},"uagb_author_info":{"display_name":"Dora","author_link":"https:\/\/crepal.ai\/blog\/author\/dora\/"},"uagb_comment_info":9,"uagb_excerpt":"Hello fellows! This is Dora \u2014 and recently I finally cracked the clean install sequence for LTX 2.3. 
I&#8217;d spent two hours on a broken node graph. Right package version, wrong folder. Right folder, missing VAE. Right VAE, wrong ComfyUI build. When it finally rendered a clean 9-frame clip, I immediately wrote down every exact&hellip;","_links":{"self":[{"href":"https:\/\/crepal.ai\/blog\/wp-json\/wp\/v2\/posts\/5774","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/crepal.ai\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/crepal.ai\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/crepal.ai\/blog\/wp-json\/wp\/v2\/users\/5"}],"replies":[{"embeddable":true,"href":"https:\/\/crepal.ai\/blog\/wp-json\/wp\/v2\/comments?post=5774"}],"version-history":[{"count":1,"href":"https:\/\/crepal.ai\/blog\/wp-json\/wp\/v2\/posts\/5774\/revisions"}],"predecessor-version":[{"id":5781,"href":"https:\/\/crepal.ai\/blog\/wp-json\/wp\/v2\/posts\/5774\/revisions\/5781"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/crepal.ai\/blog\/wp-json\/wp\/v2\/media\/5775"}],"wp:attachment":[{"href":"https:\/\/crepal.ai\/blog\/wp-json\/wp\/v2\/media?parent=5774"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/crepal.ai\/blog\/wp-json\/wp\/v2\/categories?post=5774"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/crepal.ai\/blog\/wp-json\/wp\/v2\/tags?post=5774"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}