{"id":4338,"date":"2025-12-11T14:34:17","date_gmt":"2025-12-11T06:34:17","guid":{"rendered":"https:\/\/crepal.ai\/blog\/?p=4338"},"modified":"2025-12-11T14:34:19","modified_gmt":"2025-12-11T06:34:19","slug":"blog-fiction-to-video-fiction-animation","status":"publish","type":"post","link":"https:\/\/crepal.ai\/blog\/aivideo\/blog-fiction-to-video-fiction-animation\/","title":{"rendered":"From Fiction to Animation: Novel-to-Video AI Explained"},"content":{"rendered":"\n<p>Hey, I&#8217;m Dora. On October 22, 2025, I was up way too late, staring at a half-finished short story and a blinking cursor. I wondered: could I turn this into a little film without pulling an all-nighter in After Effects? That curiosity pulled me into the rabbit hole of &#8220;fiction to video.&#8221; Not sponsored, just me, coffee, and a stack of tabs.<\/p>\n\n\n\n<p>I tested a few setups across late October and early November (Runway Gen-3, <a href=\"https:\/\/lumalabs.ai\/\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">Luma Dream Machine<\/a>, Pika 1.0). Some parts felt like magic. Others\u2026are not so much. 
Here&#8217;s what actually helped me turn words into watchable videos, without the hype.<\/p>\n\n\n\n<figure class=\"wp-block-gallery has-nested-images columns-default is-cropped wp-block-gallery-1 is-layout-flex wp-block-gallery-is-layout-flex\">\n<figure class=\"wp-block-image size-large\"><img decoding=\"async\" width=\"1024\" height=\"543\" data-id=\"4340\" data-src=\"https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2025\/12\/image-72-1024x543.png\" alt=\"\" class=\"wp-image-4340 lazyload\" data-srcset=\"https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2025\/12\/image-72-1024x543.png 1024w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2025\/12\/image-72-300x159.png 300w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2025\/12\/image-72-768x407.png 768w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2025\/12\/image-72-18x10.png 18w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2025\/12\/image-72.png 1442w\" data-sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" src=\"data:image\/svg+xml;base64,PHN2ZyB3aWR0aD0iMSIgaGVpZ2h0PSIxIiB4bWxucz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciPjwvc3ZnPg==\" style=\"--smush-placeholder-width: 1024px; --smush-placeholder-aspect-ratio: 1024\/543;\" \/><\/figure>\n<\/figure>\n\n\n\n<h2 class=\"wp-block-heading\">Why Fiction Works Well for Video Adaptation<\/h2>\n\n\n\n<h3 class=\"wp-block-heading\">Understanding the Benefits of Converting Stories from Fiction to Video<\/h3>\n\n\n\n<p>I used to think turning fiction into video would flatten the imagination. Strange twist: it can do the opposite when you guide it.<\/p>\n\n\n\n<p>Here&#8217;s why fiction works well for video:<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Built-in structure: Stories already have beats: setup, rising tension, payoff. That gives AI a spine to follow. When I broke my microfiction into 7 beats (Oct 24 test, 1,180 words \u2192 90 seconds), the scene cuts felt intentional instead of random.<\/li>\n\n\n\n<li>Mood-first visuals: Fiction leans on tone. 
AI models respond to adjectives surprisingly well. &#8220;Sodium-lit alley, anxious camera, brittle footsteps&#8221; gave me better shots than &#8220;alley at night.&#8221;<\/li>\n\n\n\n<li>Efficient asset generation: I don&#8217;t have to model everything. For my test, I generated keyframes (6 images via Stable Diffusion XL) and let Luma Dream Machine sweep in camera movement. It cut production time by ~60% vs. keyframing a motion comic.<\/li>\n\n\n\n<li>Accessibility: My friend who doesn&#8217;t read much fiction watched the 90-second cut and got the entire arc. That&#8217;s a win if your audience scrolls more than they read.<\/li>\n<\/ul>\n\n\n\n<p>Caveats? Character consistency is the biggest issue. If your protagonist has red hair in scene 1, they might morph into auburn by scene 3. More on fixes below.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">How AI Understands Story Structure for Fiction to Video<\/h2>\n\n\n\n<figure class=\"wp-block-gallery has-nested-images columns-default is-cropped wp-block-gallery-2 is-layout-flex wp-block-gallery-is-layout-flex\">\n<figure class=\"wp-block-image size-full\"><img decoding=\"async\" width=\"945\" height=\"531\" data-id=\"4341\" data-src=\"https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2025\/12\/image-73.png\" alt=\"\" class=\"wp-image-4341 lazyload\" data-srcset=\"https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2025\/12\/image-73.png 945w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2025\/12\/image-73-300x169.png 300w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2025\/12\/image-73-768x432.png 768w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2025\/12\/image-73-18x10.png 18w\" data-sizes=\"auto, (max-width: 945px) 100vw, 945px\" src=\"data:image\/svg+xml;base64,PHN2ZyB3aWR0aD0iMSIgaGVpZ2h0PSIxIiB4bWxucz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciPjwvc3ZnPg==\" style=\"--smush-placeholder-width: 945px; --smush-placeholder-aspect-ratio: 945\/531;\" \/><\/figure>\n<\/figure>\n\n\n\n<h3 class=\"wp-block-heading\">Key 
Techniques AI Uses to Interpret Narrative Flow<\/h3>\n\n\n\n<p>I wanted to know what&#8217;s actually happening under the hood, not just &#8220;the model is smart.&#8221; Here&#8217;s the practical breakdown of how tools interpret narrative flow, plus what mattered in my tests (Oct 22\u2013Nov 3, 2025):<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Beat extraction with an LLM: I ran my story through an outline prompt to chunk it into beats (setup, inciting incident, midpoint shift, climax, denouement). LLMs are good at this; it&#8217;s almost like an automated table of contents for your plot. Fewer beats = stronger cuts.<\/li>\n\n\n\n<li>Scene cards and shot lists: I converted each beat into a shot list: shot size (WS\/MS\/CU), lens vibe (35mm gritty, 85mm intimate), and motion (push-in, slow pan). Adding camera language improved outputs from <a href=\"https:\/\/docs.runwayml.com\/\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">Runway Gen-3<\/a> and <a href=\"https:\/\/pika.art\/\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">Pika 1.0<\/a> by a mile.<\/li>\n\n\n\n<li>Entity and style anchors: I defined the protagonist with 3 anchors: age, signature object, color motif. Example: &#8220;Mara, 29, chipped teal lighter.&#8221; Repeating those anchors in every prompt boosted character continuity across clips.<\/li>\n\n\n\n<li>Sentiment-to-pace mapping: I tagged beats with &#8220;tension +2&#8221; or &#8220;calm -1.&#8221; Faster camera moves and tighter cuts on +2 beats felt right. Simple, but it gave a rhythm.<\/li>\n\n\n\n<li>Keyframe strategy: Instead of pure text-to-video, I used image-to-video for important shots (close-ups, reveal moments). 
<a href=\"https:\/\/docs.lumalabs.ai\/docs\/welcome\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">Luma Dream Machine <\/a>respected composition better when I fed it a designed keyframe.<\/li>\n<\/ul>\n\n\n\n<p>What didn&#8217;t work great:<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Long monologues. Lip sync is still meh. Voiceover + B-roll cutaways worked better than trying to match a speaking mouth.<\/li>\n\n\n\n<li>Overly abstract directions. &#8220;Make it dreamlike&#8221; gave me mush. Concrete cues like &#8220;shallow DOF, dust motes, cyan spill&#8221; created a dream vibe without the fog machine effect.<\/li>\n<\/ul>\n\n\n\n<h2 class=\"wp-block-heading\">Animation Techniques for Fiction to Video Projects<\/h2>\n\n\n\n<h3 class=\"wp-block-heading\">Creating Engaging Visuals from Story Content<\/h3>\n\n\n\n<p>Here are the techniques that actually made my cuts feel alive, without me fighting the timeline for hours.<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Motion comic with 2.5D parallax: I split foreground, midground, background in Photoshop, then used subtle parallax. It looks fancier than it is. Great for memory scenes or letters.<\/li>\n\n\n\n<li>Ken Burns, but precise: Micro-zooms on nouns that matter, the lighter flick, rain on a windshield, a receipt with a time stamp. Specificity beats constant movement.<\/li>\n\n\n\n<li>Style locking with reference frames: I generated a style board (five stills) and fed one into each video prompt. Kept color and grain consistent across tools.<\/li>\n\n\n\n<li>Loopable atmospherics: Ten-second loops of neon reflections, city steam, and static shots covered transitions. Cheap way to hide jumpy cuts.<\/li>\n\n\n\n<li>Image-to-video for hero shots: Text-to-video for connective tissue: image-to-video for the important frames. 
In my Nov 1 test, a single keyframe turned into a 4-second push-in that sold the mood.<\/li>\n<\/ul>\n\n\n\n<p>Settings that helped me:<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Aspect ratio: 16:9 for YouTube; 9:16 vertical for social teasers. Don&#8217;t fight the crop later.<\/li>\n\n\n\n<li>Duration: 3\u20136 seconds per shot kept momentum. Anything over 8 seconds needed internal action.<\/li>\n\n\n\n<li>Negative prompts: &#8220;No extra characters, no logo, no deformed hands.&#8221; It reduced weird surprise cameos by ~30% in Runway Gen-3.<\/li>\n<\/ul>\n\n\n\n<p>If you&#8217;re doing voiceover, I liked ElevenLabs for narration and Descript for timing the beats. Lip sync inside the generative video still isn&#8217;t there yet.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">Examples of Successful Fiction to Video Adaptations<\/h2>\n\n\n\n<p>On Oct 28, 2025, I adapted my 1,180-word microfiction into a 92-second piece:<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Tools: Luma Dream Machine v1.6 (image-to-video for 7 key shots), Runway Gen-3 Alpha (connective shots), CapCut for assembly.<\/li>\n\n\n\n<li>Render time: ~28 minutes total on a Studio GPU; cost: ~$6 across credits.<\/li>\n\n\n\n<li>What worked: tone coherence, cinematic close-ups, a reveal that actually landed.<\/li>\n\n\n\n<li>What didn&#8217;t: the protagonist&#8217;s jacket changed texture twice; one shot added a stray background figure. 
I patched both with a re-render and a crop.<\/li>\n<\/ul>\n\n\n\n<p>Public examples worth studying (not sponsored):<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li><a href=\"https:\/\/www.kaiber.ai\/superstudio\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">Kaiber community<\/a> shorts adapting public-domain Poe scenes show solid text-to-animatic workflows.<\/li>\n<\/ul>\n\n\n\n<figure class=\"wp-block-gallery has-nested-images columns-default is-cropped wp-block-gallery-3 is-layout-flex wp-block-gallery-is-layout-flex\">\n<figure class=\"wp-block-image size-large\"><img decoding=\"async\" width=\"1024\" height=\"610\" data-id=\"4343\" data-src=\"https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2025\/12\/image-74-1024x610.png\" alt=\"\" class=\"wp-image-4343 lazyload\" data-srcset=\"https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2025\/12\/image-74-1024x610.png 1024w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2025\/12\/image-74-300x179.png 300w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2025\/12\/image-74-768x458.png 768w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2025\/12\/image-74-18x12.png 18w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2025\/12\/image-74.png 1139w\" data-sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" src=\"data:image\/svg+xml;base64,PHN2ZyB3aWR0aD0iMSIgaGVpZ2h0PSIxIiB4bWxucz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciPjwvc3ZnPg==\" style=\"--smush-placeholder-width: 1024px; --smush-placeholder-aspect-ratio: 1024\/610;\" \/><\/figure>\n<\/figure>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Runway&#8217;s Gen-3 showcase highlights consistent motion and camera language on story beats (see their official gallery\/docs).<\/li>\n\n\n\n<li>Note: OpenAI&#8217;s Sora (announced Feb 2024) still isn&#8217;t broadly available as of this writing; results look stunning, but you can&#8217;t rely on it for production yet.<\/li>\n<\/ul>\n\n\n\n<p>If you try one thing, try this: write a 400\u2013600 word scene, extract 5 beats, and render 
one image-to-video hero shot per beat. You&#8217;ll learn more in an hour than in a week of scrolling examples.<\/p>\n\n\n\n<p>Lately, I&#8217;ve been experimenting with <a href=\"https:\/\/crepal.ai\/\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">CrePal<\/a>\u2014it handles the whole process from script to finished video in one place, which saves a ton of switching between tools. Worth checking out if you want to jump straight in and test a story idea.<\/p>\n\n\n\n<figure class=\"wp-block-gallery has-nested-images columns-default is-cropped wp-block-gallery-4 is-layout-flex wp-block-gallery-is-layout-flex\">\n<figure class=\"wp-block-image size-full\"><img decoding=\"async\" width=\"946\" height=\"544\" data-id=\"4344\" data-src=\"https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2025\/12\/image-75.png\" alt=\"\" class=\"wp-image-4344 lazyload\" data-srcset=\"https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2025\/12\/image-75.png 946w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2025\/12\/image-75-300x173.png 300w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2025\/12\/image-75-768x442.png 768w, https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2025\/12\/image-75-18x10.png 18w\" data-sizes=\"auto, (max-width: 946px) 100vw, 946px\" src=\"data:image\/svg+xml;base64,PHN2ZyB3aWR0aD0iMSIgaGVpZ2h0PSIxIiB4bWxucz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciPjwvc3ZnPg==\" style=\"--smush-placeholder-width: 946px; --smush-placeholder-aspect-ratio: 946\/544;\" \/><\/figure>\n<\/figure>\n\n\n\n<p>For keyframes, their <a href=\"https:\/\/crepal.ai\/blog\/animagine-xl-4-0-free-image-generate-online\/\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">free online access to Animagine XL 4.0<\/a> has been a game-changer for quick, detailed anime-style stills that feed right into Luma or Runway without extra hassle. 
<\/p>\n\n\n\n<p>Last small note: I keep screenshots and timestamps for each run in a Notion page. When something looks great, I can actually reproduce it later, which is half the battle with generative video.<\/p>\n\n\n\n<p>If you want my beat-sheet prompt or shot-list template, ping me. I&#8217;ll share the file. And if a tool burns through credits without results, I&#8217;ll say so, gently, but I will.<\/p>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\" \/>\n\n\n\n<p>Previous posts:<\/p>\n\n\n\n<figure class=\"wp-block-embed is-type-wp-embed is-provider-crepal-content-center wp-block-embed-crepal-content-center\"><div class=\"wp-block-embed__wrapper\">\n<blockquote class=\"wp-embedded-content\" data-secret=\"qYbRFom787\"><a href=\"https:\/\/crepal.ai\/blog\/aivideo\/blog-novel-to-video-how-to-convert-novel\/\">How to Turn a Novel To Video Automatically (Step-by-Step)<\/a><\/blockquote><iframe class=\"wp-embedded-content lazyload\" sandbox=\"allow-scripts\" security=\"restricted\" style=\"position: absolute; visibility: hidden;\" title=\"&#8220;How to Turn a Novel To Video Automatically (Step-by-Step)&#8221; &#8212; CrePal Content Center\" data-src=\"https:\/\/crepal.ai\/blog\/aivideo\/blog-novel-to-video-how-to-convert-novel\/embed\/#?secret=GbCijzKM8Y#?secret=qYbRFom787\" data-secret=\"qYbRFom787\" width=\"600\" height=\"338\" frameborder=\"0\" marginwidth=\"0\" marginheight=\"0\" scrolling=\"no\" src=\"data:image\/svg+xml;base64,PHN2ZyB3aWR0aD0iMSIgaGVpZ2h0PSIxIiB4bWxucz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciPjwvc3ZnPg==\" data-load-mode=\"1\"><\/iframe>\n<\/div><\/figure>\n\n\n\n<figure class=\"wp-block-embed is-type-wp-embed is-provider-crepal-content-center wp-block-embed-crepal-content-center\"><div class=\"wp-block-embed__wrapper\">\n<blockquote class=\"wp-embedded-content\" data-secret=\"8jq8XZ8vdJ\"><a 
href=\"https:\/\/crepal.ai\/blog\/aivideo\/blog-script-to-video-blog-to-video\/\">Blog to Video Generation: SEO Script-to-Video Guide<\/a><\/blockquote><iframe class=\"wp-embedded-content lazyload\" sandbox=\"allow-scripts\" security=\"restricted\" style=\"position: absolute; visibility: hidden;\" title=\"&#8220;Blog to Video Generation: SEO Script-to-Video Guide&#8221; &#8212; CrePal Content Center\" data-src=\"https:\/\/crepal.ai\/blog\/aivideo\/blog-script-to-video-blog-to-video\/embed\/#?secret=alflO6WZbd#?secret=8jq8XZ8vdJ\" data-secret=\"8jq8XZ8vdJ\" width=\"600\" height=\"338\" frameborder=\"0\" marginwidth=\"0\" marginheight=\"0\" scrolling=\"no\" src=\"data:image\/svg+xml;base64,PHN2ZyB3aWR0aD0iMSIgaGVpZ2h0PSIxIiB4bWxucz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciPjwvc3ZnPg==\" data-load-mode=\"1\"><\/iframe>\n<\/div><\/figure>\n\n\n\n<figure class=\"wp-block-embed is-type-wp-embed is-provider-crepal-content-center wp-block-embed-crepal-content-center\"><div class=\"wp-block-embed__wrapper\">\n<blockquote class=\"wp-embedded-content\" data-secret=\"BYAsf0BelQ\"><a href=\"https:\/\/crepal.ai\/blog\/aivideo\/blog-script-to-video-agency-workflow\/\">How Agencies Use Script-to-Video to Reduce Editing Costs by 90%<\/a><\/blockquote><iframe class=\"wp-embedded-content lazyload\" sandbox=\"allow-scripts\" security=\"restricted\" style=\"position: absolute; visibility: hidden;\" title=\"&#8220;How Agencies Use Script-to-Video to Reduce Editing Costs by 90%&#8221; &#8212; CrePal Content Center\" data-src=\"https:\/\/crepal.ai\/blog\/aivideo\/blog-script-to-video-agency-workflow\/embed\/#?secret=ZjFGu33mXS#?secret=BYAsf0BelQ\" data-secret=\"BYAsf0BelQ\" width=\"600\" height=\"338\" frameborder=\"0\" marginwidth=\"0\" marginheight=\"0\" scrolling=\"no\" src=\"data:image\/svg+xml;base64,PHN2ZyB3aWR0aD0iMSIgaGVpZ2h0PSIxIiB4bWxucz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciPjwvc3ZnPg==\" 
data-load-mode=\"1\"><\/iframe>\n<\/div><\/figure>\n","protected":false},"excerpt":{"rendered":"<p>Hey, I&#8217;m Dora. On October 22, 2025, I was up way too late, staring at a half-finished short story and a blinking cursor. I wondered: could I turn this into a little film without pulling an all-nighter in After Effects? That curiosity pulled me into the rabbit hole of &#8220;fiction to video.&#8221; Not sponsored, just [&hellip;]<\/p>\n","protected":false},"author":5,"featured_media":4339,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_gspb_post_css":"","_uag_custom_page_level_css":"","footnotes":""},"categories":[8],"tags":[],"class_list":["post-4338","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-aivideo"],"blocksy_meta":[],"uagb_featured_image_src":{"full":["https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2025\/12\/image-71.png",1376,768,false],"thumbnail":["https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2025\/12\/image-71-150x150.png",150,150,true],"medium":["https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2025\/12\/image-71-300x167.png",300,167,true],"medium_large":["https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2025\/12\/image-71-768x429.png",768,429,true],"large":["https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2025\/12\/image-71-1024x572.png",1024,572,true],"1536x1536":["https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2025\/12\/image-71.png",1376,768,false],"2048x2048":["https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2025\/12\/image-71.png",1376,768,false],"trp-custom-language-flag":["https:\/\/crepal.ai\/blog\/wp-content\/uploads\/2025\/12\/image-71-18x10.png",18,10,true]},"uagb_author_info":{"display_name":"Dora","author_link":"https:\/\/crepal.ai\/blog\/author\/dora\/"},"uagb_comment_info":9,"uagb_excerpt":"Hey, I&#8217;m Dora. On October 22, 2025, I was up way too late, staring at a half-finished short story and a blinking cursor. 
I wondered: could I turn this into a little film without pulling an all-nighter in After Effects? That curiosity pulled me into the rabbit hole of &#8220;fiction to video.&#8221; Not sponsored, just&hellip;","_links":{"self":[{"href":"https:\/\/crepal.ai\/blog\/wp-json\/wp\/v2\/posts\/4338","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/crepal.ai\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/crepal.ai\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/crepal.ai\/blog\/wp-json\/wp\/v2\/users\/5"}],"replies":[{"embeddable":true,"href":"https:\/\/crepal.ai\/blog\/wp-json\/wp\/v2\/comments?post=4338"}],"version-history":[{"count":1,"href":"https:\/\/crepal.ai\/blog\/wp-json\/wp\/v2\/posts\/4338\/revisions"}],"predecessor-version":[{"id":4345,"href":"https:\/\/crepal.ai\/blog\/wp-json\/wp\/v2\/posts\/4338\/revisions\/4345"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/crepal.ai\/blog\/wp-json\/wp\/v2\/media\/4339"}],"wp:attachment":[{"href":"https:\/\/crepal.ai\/blog\/wp-json\/wp\/v2\/media?parent=4338"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/crepal.ai\/blog\/wp-json\/wp\/v2\/categories?post=4338"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/crepal.ai\/blog\/wp-json\/wp\/v2\/tags?post=4338"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}