<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:itunes="http://www.itunes.com/dtds/podcast-1.0.dtd" xmlns:googleplay="http://www.google.com/schemas/play-podcasts/1.0"><channel><title><![CDATA[contexteng.ai]]></title><description><![CDATA[Context is for kings.]]></description><link>https://contexteng.ai</link><image><url>https://substackcdn.com/image/fetch/$s_!zaWm!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F79719440-6599-4589-8c5e-b2cd38805635_1024x1024.png</url><title>contexteng.ai</title><link>https://contexteng.ai</link></image><generator>Substack</generator><lastBuildDate>Wed, 06 May 2026 09:26:41 GMT</lastBuildDate><atom:link href="https://contexteng.ai/feed" rel="self" type="application/rss+xml"/><copyright><![CDATA[Content Foundry, inc]]></copyright><language><![CDATA[en]]></language><webMaster><![CDATA[contexteng@substack.com]]></webMaster><itunes:owner><itunes:email><![CDATA[contexteng@substack.com]]></itunes:email><itunes:name><![CDATA[Randall Bennett]]></itunes:name></itunes:owner><itunes:author><![CDATA[Randall Bennett]]></itunes:author><googleplay:owner><![CDATA[contexteng@substack.com]]></googleplay:owner><googleplay:email><![CDATA[contexteng@substack.com]]></googleplay:email><googleplay:author><![CDATA[Randall Bennett]]></googleplay:author><itunes:block><![CDATA[Yes]]></itunes:block><item><title><![CDATA[Multiple Claude Code instance isolation without multiple file copies or git worktrees]]></title><description><![CDATA[If you use a Mac, APFS lets you do `cp -c`, which creates a "copy-on-write" file, meaning if you don't write to it, you don't copy it.]]></description><link>https://contexteng.ai/p/multiple-claude-code-instance-isolation</link><guid
isPermaLink="false">https://contexteng.ai/p/multiple-claude-code-instance-isolation</guid><dc:creator><![CDATA[Randall Bennett]]></dc:creator><pubDate>Fri, 25 Jul 2025 00:56:23 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!zaWm!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F79719440-6599-4589-8c5e-b2cd38805635_1024x1024.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!I0wv!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F58219efd-1b5c-4a94-aeaf-e1aafcfed3a4_1048x122.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!I0wv!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F58219efd-1b5c-4a94-aeaf-e1aafcfed3a4_1048x122.png 424w, https://substackcdn.com/image/fetch/$s_!I0wv!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F58219efd-1b5c-4a94-aeaf-e1aafcfed3a4_1048x122.png 848w, https://substackcdn.com/image/fetch/$s_!I0wv!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F58219efd-1b5c-4a94-aeaf-e1aafcfed3a4_1048x122.png 1272w, https://substackcdn.com/image/fetch/$s_!I0wv!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F58219efd-1b5c-4a94-aeaf-e1aafcfed3a4_1048x122.png 1456w" sizes="100vw"><img 
src="https://substackcdn.com/image/fetch/$s_!I0wv!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F58219efd-1b5c-4a94-aeaf-e1aafcfed3a4_1048x122.png" width="1048" height="122" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/58219efd-1b5c-4a94-aeaf-e1aafcfed3a4_1048x122.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:122,&quot;width&quot;:1048,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:81343,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://contexteng.ai/i/169188264?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F58219efd-1b5c-4a94-aeaf-e1aafcfed3a4_1048x122.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!I0wv!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F58219efd-1b5c-4a94-aeaf-e1aafcfed3a4_1048x122.png 424w, https://substackcdn.com/image/fetch/$s_!I0wv!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F58219efd-1b5c-4a94-aeaf-e1aafcfed3a4_1048x122.png 848w, https://substackcdn.com/image/fetch/$s_!I0wv!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F58219efd-1b5c-4a94-aeaf-e1aafcfed3a4_1048x122.png 1272w, https://substackcdn.com/image/fetch/$s_!I0wv!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F58219efd-1b5c-4a94-aeaf-e1aafcfed3a4_1048x122.png 1456w" sizes="100vw" 
fetchpriority="high"></picture><div></div></div></a></figure></div><p>I&#8217;m building out a full container isolation system, but before it&#8217;s done I found a far better approach to preventing Claude Code instances from clobbering each other&#8230; it turns out classic ol&#8217; `cp` has a -c flag (on APFS-formatted Mac disks, i.e. macOS 10.13 or later) which does &#8220;copy on write.&#8221;</p><p>That means it won&#8217;t actually copy a file until it&#8217;s written to, which saves the time (and storage) of duplicating data you never touch.</p><p>So seriously, you can do</p><pre><code>cp -c original.txt clone.txt</code></pre><p>and if you never write to either file, they&#8217;ll share the same storage. If you write to either, they&#8217;ll split into their own files.</p><p>Networking and other isolation concerns still exist, but this solves basic file collisions. Containers will handle the rest; more on that soon.</p><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://contexteng.ai/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">contexteng.ai is a reader-supported publication.
To receive new posts and support our work, consider becoming a free or paid subscriber.</p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div>]]></content:encoded></item><item><title><![CDATA[Claude Code workshop]]></title><description><![CDATA[If context is for kings, Claude Code is the crown.]]></description><link>https://contexteng.ai/p/claude-code-workshop</link><guid isPermaLink="false">https://contexteng.ai/p/claude-code-workshop</guid><dc:creator><![CDATA[Randall Bennett]]></dc:creator><pubDate>Wed, 16 Jul 2025 17:16:23 GMT</pubDate><enclosure url="https://api.substack.com/feed/podcast/168300564/810b4b83e00aae5ec50f93fa62981eb6.mp3" length="0" type="audio/mpeg"/><content:encoded><![CDATA[<p>We ran a prototype of our intro to Claude Code workshop. 
More to come, but thanks for rolling through!</p><div class="install-substack-app-embed install-substack-app-embed-web" data-component-name="InstallSubstackAppToDOM"><img class="install-substack-app-embed-img" src="https://substackcdn.com/image/fetch/$s_!zaWm!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F79719440-6599-4589-8c5e-b2cd38805635_1024x1024.png"><div class="install-substack-app-embed-text"><div class="install-substack-app-header">Get more from Randall Bennett in the Substack app</div><div class="install-substack-app-text">Available for iOS and Android</div></div><a href="https://substack.com/app/app-store-redirect?utm_campaign=app-marketing&amp;utm_content=author-post-insert&amp;utm_source=contexteng" target="_blank" class="install-substack-app-embed-link"><button class="install-substack-app-embed-btn button primary">Get the app</button></a></div>]]></content:encoded></item><item><title><![CDATA[Claude Code workshop / Context Eng podcast]]></title><description><![CDATA[A recording from Randall Bennett's live video]]></description><link>https://contexteng.ai/p/claude-code-workshop-context-eng</link><guid isPermaLink="false">https://contexteng.ai/p/claude-code-workshop-context-eng</guid><dc:creator><![CDATA[Randall Bennett]]></dc:creator><pubDate>Mon, 14 Jul 2025 15:14:59 GMT</pubDate><enclosure url="https://api.substack.com/feed/podcast/168075041/bc83bc7569edc3236f11b375667f3f0e.mp3" length="0" type="audio/mpeg"/><content:encoded><![CDATA[<p>This week we&#8217;re talking Claude Code and giving a general intro.
I swear our show notes will get better.</p><div class="install-substack-app-embed install-substack-app-embed-web" data-component-name="InstallSubstackAppToDOM"><img class="install-substack-app-embed-img" src="https://substackcdn.com/image/fetch/$s_!zaWm!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F79719440-6599-4589-8c5e-b2cd38805635_1024x1024.png"><div class="install-substack-app-embed-text"><div class="install-substack-app-header">Get more from Randall Bennett in the Substack app</div><div class="install-substack-app-text">Available for iOS and Android</div></div><a href="https://substack.com/app/app-store-redirect?utm_campaign=app-marketing&amp;utm_content=author-post-insert&amp;utm_source=contexteng" target="_blank" class="install-substack-app-embed-link"><button class="install-substack-app-embed-btn button primary">Get the app</button></a></div>]]></content:encoded></item><item><title><![CDATA[Empathy driven development: The context engineer's best route to progress]]></title><description><![CDATA[Don't try to cover every case. Imagine you're giving a smart, reasonable person a task they need to complete.
Empathy, not raw engineering, is the key to LLM quality.]]></description><link>https://contexteng.ai/p/empathy-driven-development-the-context</link><guid isPermaLink="false">https://contexteng.ai/p/empathy-driven-development-the-context</guid><dc:creator><![CDATA[Randall Bennett]]></dc:creator><pubDate>Tue, 08 Jul 2025 14:48:29 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!kKhS!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3451ad80-9329-4c36-be38-8282585788da_1536x1024.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!kKhS!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3451ad80-9329-4c36-be38-8282585788da_1536x1024.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!kKhS!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3451ad80-9329-4c36-be38-8282585788da_1536x1024.png 424w, https://substackcdn.com/image/fetch/$s_!kKhS!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3451ad80-9329-4c36-be38-8282585788da_1536x1024.png 848w, https://substackcdn.com/image/fetch/$s_!kKhS!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3451ad80-9329-4c36-be38-8282585788da_1536x1024.png 1272w, https://substackcdn.com/image/fetch/$s_!kKhS!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3451ad80-9329-4c36-be38-8282585788da_1536x1024.png 1456w" sizes="100vw"><img 
src="https://substackcdn.com/image/fetch/$s_!kKhS!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3451ad80-9329-4c36-be38-8282585788da_1536x1024.png" width="1456" height="971" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/3451ad80-9329-4c36-be38-8282585788da_1536x1024.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:971,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:2166115,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://contexteng.ai/i/167808142?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3451ad80-9329-4c36-be38-8282585788da_1536x1024.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!kKhS!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3451ad80-9329-4c36-be38-8282585788da_1536x1024.png 424w, https://substackcdn.com/image/fetch/$s_!kKhS!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3451ad80-9329-4c36-be38-8282585788da_1536x1024.png 848w, https://substackcdn.com/image/fetch/$s_!kKhS!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3451ad80-9329-4c36-be38-8282585788da_1536x1024.png 1272w, https://substackcdn.com/image/fetch/$s_!kKhS!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3451ad80-9329-4c36-be38-8282585788da_1536x1024.png 1456w" 
sizes="100vw" fetchpriority="high"></picture><div></div></div></a></figure></div><p>I've found the best way to understand how an AI works is to pretend you're working with a college student who is extremely capable, but inexperienced. Imagine you're giving an intern directions about how to complete a task. For them to do the baseline, you might want to start by giving them explicit instructions and examples. But in order to train them to do the right job, you need to help them understand the right mindset.</p><p>At Facebook, I worked with a lot of new grads and helped push them toward their potential.
The thing I found always worked was asking them questions, not pushing them toward specific answers. I imagined, "how did I discover this topic?" and "how can I help Max discover this topic?"</p><p>The key was never giving a ton of context; it was giving the right context at the right time. It didn't matter how capable Max was at engineering; they wouldn't ever be able to do what they needed to do if I just overwhelmed them with everything possible.</p><p>So imagine you're talking to a friend, a new developer, or really anyone, and then try to give them the right amount of context to do the job. I've found <a href="https://contexteng.ai/p/context-engineering-101-the-hourglass">the hourglass</a> to be a great framework for building context for LLMs (and people, to be frank); try to get your LLM friend to do things one at a time.</p><p>Rather than create the biggest prompt, focus on making one task at a time work. Often that's classification tasks, then moving on to more complex tasks. You'll generally find a pipeline is the best tool for the job, so getting familiar with job queues and distributed non-linear systems is also a great place to start.</p><p>But the first step, truly, is empathetic reasoning.</p><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://contexteng.ai/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">Thanks for reading contexteng.ai!
Subscribe for free to receive new posts and support my work.</p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div>]]></content:encoded></item><item><title><![CDATA[Context Engineering Is Only Half the Battle]]></title><description><![CDATA[Why Scaffolds Matter]]></description><link>https://contexteng.ai/p/context-engineering-is-only-half</link><guid isPermaLink="false">https://contexteng.ai/p/context-engineering-is-only-half</guid><dc:creator><![CDATA[Andrew Denta]]></dc:creator><pubDate>Mon, 07 Jul 2025 14:54:25 GMT</pubDate><enclosure url="https://substackcdn.com/image/youtube/w_728,c_limit/8IkufN4_Tr0" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>The next decade of software engineering will revolve around Context Engineering. We no longer write the thing, we write the thing <em>that writes the thing</em>.</p><p>But I think this is only half the equation.</p><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://contexteng.ai/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">Thanks for reading contexteng.ai! 
Subscribe for free to receive new posts and support my work.</p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div><p>Let&#8217;s watch this cool video of a laser-etched dog tag:</p><div id="youtube2-8IkufN4_Tr0" class="youtube-wrap" data-attrs="{&quot;videoId&quot;:&quot;8IkufN4_Tr0&quot;,&quot;startTime&quot;:null,&quot;endTime&quot;:null}" data-component-name="Youtube2ToDOM"><div class="youtube-inner"><iframe src="https://www.youtube-nocookie.com/embed/8IkufN4_Tr0?rel=0&amp;autoplay=0&amp;showinfo=0&amp;enablejsapi=0" frameborder="0" loading="lazy" gesture="media" allow="autoplay; fullscreen" allowautoplay="true" allowfullscreen="true" width="728" height="409"></iframe></div></div><p>The laser engraver represents context engineering itself: a process of precision and focus. The engraver must follow an exact path to etch a design. This is similar to how you want to construct a prompt to guide the model&#8217;s attention.</p><p>The dog tag represents a different, more primitive technology. Millions of identical blank tags were pumped out of a factory, using the exact same process at thousands a minute. If context engineering is like a laser engraver, let&#8217;s call the dog tags a <strong>scaffold</strong>.</p><h2>What Is a Scaffold?</h2><p>A scaffold is a computer program that writes boilerplate code. It&#8217;s code that generates code, but <strong>without</strong> an LLM. The key benefit here is that it&#8217;s deterministic: it works the same way every time. Scaffolds help AI write better code because the AI can focus on the unique parts of an application instead of the boilerplate.</p><h2>Rails + React</h2><p>I spend a lot of time in Ruby on Rails, and I prefer using React on the front end.
This isn&#8217;t exactly a default combo, and there are a dozen ways to wire the two technologies together.</p><p>Here&#8217;s the problem: even with carefully crafted prompts and &#8220;rules,&#8221; language models just don&#8217;t set it up the same way every time. I wind up with three methods to send data to the front end and like <em>five</em> to send it to the back end. The LLM is just taking its best guess each step along the way.</p><p>So I built a <a href="https://gist.github.com/adenta/35d2443957f11fc75b2f0df81005c043#file-react_generator-rb">scaffold</a><a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-1" href="#footnote-1" target="_self">1</a>.</p><p>Now, when I spin up a page, it follows the layout I <em>actually</em> use and not some boilerplate hallucination. I still use LLMs, but only for the bespoke stuff: business logic that will change with every application. The LLMs can dance on top of the generated code from the scaffold. It&#8217;s faster, cleaner, and way less frustrating.</p><p>Scaffolds aren&#8217;t glamorous or demo-worthy, but they&#8217;re the invisible infrastructure we&#8217;ll increasingly rely on to make LLM-powered workflows truly reliable. <strong>The challenge isn&#8217;t generating output, it&#8217;s guiding it. </strong>The future won&#8217;t be written by LLMs alone; it&#8217;ll be shaped by the scaffolds that guide them.</p><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-1" href="#footnote-anchor-1" class="footnote-number" contenteditable="false" target="_self">1</a><div class="footnote-content"><p>In Ruby on Rails these are called Templates and Generators. https://guides.rubyonrails.org/generators.html. Python has an unrelated thing called a generator, so I went with scaffold for this article.</p></div></div>]]></content:encoded></item><item><title><![CDATA[Ask questions one at a time]]></title><description><![CDATA[Model attention is a scarce resource. 
Guard it ruthlessly.]]></description><link>https://contexteng.ai/p/ask-questions-one-at-a-time</link><guid isPermaLink="false">https://contexteng.ai/p/ask-questions-one-at-a-time</guid><dc:creator><![CDATA[Dan Sisco]]></dc:creator><pubDate>Wed, 02 Jul 2025 17:56:24 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!zaWm!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F79719440-6599-4589-8c5e-b2cd38805635_1024x1024.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>Flooding your LLM with a massive initial prompt isn&#8217;t helping; it&#8217;s making things worse.</p><p>How many times have you dived straight into building with an LLM only to get 80% of the way there and hit a wall? That maddening loop where Claude confidently says, &#8220;I see the issue!&#8221; only to fix nothing and start again.</p><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://contexteng.ai/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">Thanks for reading contexteng.ai! Subscribe for free to receive new posts and support my work.</p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div><p><a href="https://boltfoundry.com/blog/2025-06-26-context-engineering">Context Engineering</a> has come to prominence in the last few weeks because actually directing a model&#8217;s attention is a real challenge that requires effort and skill.
The science of giving an LLM just enough context to accomplish the task at hand is a critical aspect of reliability.</p><p>It&#8217;s tempting to cram everything into your first prompt and hope for a perfect one-shot response. And while you may one-shot it once in a while, generally your model is going to fail. </p><p>The better approach is to tell your LLM, &#8220;Here&#8217;s my objective, ask me questions about it one at a time until you have enough information to move forward.&#8221;</p><p>At Bolt Foundry we use Claude Code for development, which allows us to create <a href="https://docs.anthropic.com/en/docs/claude-code/slash-commands">custom commands</a>. In our environment we created a custom command called <strong>/questions-one-at-a-time</strong> that prompts Claude to, well, ask questions one at a time before generating code or completing a task.</p><p><strong>At Bolt Foundry, our cardinal rule is: Discovery before developing.</strong></p><p>This is the human version of &#8220;questions-one-at-a-time&#8221;. It forces us to pause and really understand the problem before writing a line of code (or asking Claude to do it).</p><p>Breaking a problem down into discrete questions that follow a logical flow is better for me as the user and for Claude. It helps us both focus our attention to understand the problem and create better plans.</p><p>Reliability comes from structure, not luck.</p><p>We&#8217;ve found the <a href="https://contexteng.substack.com/p/context-engineering-101-the-hourglass">Hourglass method</a> to be the best way to provide context in a structured way that doesn&#8217;t overwhelm the model.</p><p>Model attention is a scarce resource. 
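</p><p>Concretely, a Claude Code custom command is just a Markdown file under <code>.claude/commands/</code>, and the filename becomes the slash command. A sketch of what a <code>questions-one-at-a-time.md</code> could contain (illustrative wording, not our exact file):</p><pre><code>Before writing any code or making any changes, interview me about
the objective below. Ask exactly one clarifying question, wait for
my answer, then ask the next. Once you have enough information,
summarize a plan and wait for my approval before proceeding.

Objective: $ARGUMENTS</code></pre><p>Claude Code substitutes <code>$ARGUMENTS</code> with whatever you type after the command, e.g. <code>/questions-one-at-a-time add rate limiting to the API</code>.</p><p>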
Guard it ruthlessly and do everything in your power to focus it only on what matters most&#8230; or <a href="https://www.youtube.com/watch?v=7aUGBT1DZDI&amp;t=34s">you&#8217;re gonna have a bad time</a>.</p><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://contexteng.ai/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">Thanks for reading contexteng.ai! Subscribe for free to receive new posts and support my work.</p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div>]]></content:encoded></item><item><title><![CDATA[Context engineering 101: The hourglass]]></title><description><![CDATA[We hesitate to say "use this one simple trick to improve your prompts by 100%" but honestly, it might be true.]]></description><link>https://contexteng.ai/p/context-engineering-101-the-hourglass</link><guid isPermaLink="false">https://contexteng.ai/p/context-engineering-101-the-hourglass</guid><dc:creator><![CDATA[Randall Bennett]]></dc:creator><pubDate>Tue, 01 Jul 2025 12:05:23 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!ilQT!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F923dd311-df82-4473-802a-57716ac6f463_1000x869.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank"
href="https://substackcdn.com/image/fetch/$s_!ilQT!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F923dd311-df82-4473-802a-57716ac6f463_1000x869.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!ilQT!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F923dd311-df82-4473-802a-57716ac6f463_1000x869.png 424w, https://substackcdn.com/image/fetch/$s_!ilQT!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F923dd311-df82-4473-802a-57716ac6f463_1000x869.png 848w, https://substackcdn.com/image/fetch/$s_!ilQT!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F923dd311-df82-4473-802a-57716ac6f463_1000x869.png 1272w, https://substackcdn.com/image/fetch/$s_!ilQT!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F923dd311-df82-4473-802a-57716ac6f463_1000x869.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!ilQT!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F923dd311-df82-4473-802a-57716ac6f463_1000x869.png" width="1000" height="869" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/923dd311-df82-4473-802a-57716ac6f463_1000x869.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:869,&quot;width&quot;:1000,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:61185,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://contexteng.ai/i/167184505?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F923dd311-df82-4473-802a-57716ac6f463_1000x869.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!ilQT!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F923dd311-df82-4473-802a-57716ac6f463_1000x869.png 424w, https://substackcdn.com/image/fetch/$s_!ilQT!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F923dd311-df82-4473-802a-57716ac6f463_1000x869.png 848w, https://substackcdn.com/image/fetch/$s_!ilQT!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F923dd311-df82-4473-802a-57716ac6f463_1000x869.png 1272w, https://substackcdn.com/image/fetch/$s_!ilQT!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F923dd311-df82-4473-802a-57716ac6f463_1000x869.png 1456w" sizes="100vw" fetchpriority="high"></picture></div></a></figure></div><p>It turns out if you talk to LLMs like you talk to people, they actually perform better. I studied communication in college, and learned there are a few techniques you can use to help people understand a topic better.</p><p>The simplest one? <a href="https://en.wikipedia.org/wiki/Inverted_pyramid_(journalism)">Inverted Pyramid</a>. In a nutshell, if you start with the most crucial information at the top, and work your way down to background info, the flow helps people get the most important information as quickly as possible.
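</p><p>Applied to a prompt, the same ordering might look like this (a hypothetical sketch; the wording and the &#8220;Acme&#8221; scenario are illustrative, not from a real template):</p>

```
[who / why]   You triage support email for Acme, a billing-software company.
[what]        Label each message urgent, normal, or low.
[background]  "urgent" means a customer is blocked from paying; when in
              doubt, choose the higher label.
```

<p>The crucial instruction leads; detail trails, so a reader (or model) that stops early still gets the point.</p><p>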
If they need to know more, they keep reading.</p><h1>Inverted pyramid system prompts</h1><p>At <a href="http://boltfoundry.com">Bolt Foundry</a>, we&#8217;re <a href="https://github.com/bolt-foundry/bolt-foundry/tree/main/apps/aibff">building out a system for this</a>, but the concepts are pretty universal. Essentially, you want to describe to the assistant the &#8220;Who / Why&#8221; before the &#8220;What&#8221;, and only then the &#8220;How&#8221;.</p><p>We&#8217;ll post more about this, but guiding an LLM into its role, rather than handing it a list of do&#8217;s and don&#8217;ts, lets it perform better. <a href="https://en.wikipedia.org/wiki/Waluigi_effect">The Waluigi effect</a> is less likely to happen if you &#8220;soften&#8221; the parameters and make the prompt less like a recipe and more like a dossier.</p><h1>NEVER INCLUDE VARIABLES IN SYSTEM PROMPTS</h1><p>It&#8217;s in caps for a specific reason: System prompts are the most valuable tool in guiding an LLM.
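</p><p>Concretely: keep the system prompt a fixed constant and let everything variable ride in later turns. A minimal Python sketch of an OpenAI-compatible <code>messages</code> array (every string here is an illustrative placeholder, not a real Bolt Foundry prompt):</p>

```python
# Sketch: a fixed system prompt plus synthetic turns that carry everything
# variable. The assistant turns are written by us, not generated by a model;
# all strings are illustrative placeholders.

SYSTEM_PROMPT = (
    "You are a support assistant for a billing-software company. "
    "You resolve account issues politely and concisely."
)  # a constant -- never interpolate per-request data into it

def build_messages(user_info: str, context: str, output_format: str) -> list[dict]:
    """Build an OpenAI-compatible `messages` array. Synthetic turns run from
    least to most important: user-supplied info, then extra context, then
    the desired output format last."""
    return [
        {"role": "system", "content": SYSTEM_PROMPT},
        {"role": "assistant", "content": "Who am I helping today?"},
        {"role": "user", "content": user_info},
        {"role": "assistant", "content": "What context should I consider?"},
        {"role": "user", "content": context},
        {"role": "assistant", "content": "How should I format my answer?"},
        {"role": "user", "content": output_format},
    ]

messages = build_messages(
    user_info="Jane, Pro plan: 'Why was I charged twice this month?'",
    context="Billing log shows a duplicate charge on the last invoice.",
    output_format="Reply in two short sentences, no jargon.",
)
```

<p>Because the system turn is byte-identical on every call, system prompts built this way also stay cache-friendly.</p><p>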
They&#8217;re the first information the assistant sees, and LLM developers are building <a href="https://model-spec.openai.com/2025-04-11.html#chain_of_command">chain of command</a> into everything they do. System prompts should contain developer-specific instructions that cannot be overridden.</p><div class="pullquote"><p>DO NOT INCLUDE INFORMATION IN YOUR SYSTEM PROMPT WHICH CAN OVERRIDE YOUR INTENTION.</p></div><p>Additionally, prompt caching means providers can start inference from a cached prefix instead of recomputing it from scratch, and if you change your system prompt all the time you lose that opportunity.</p><p>(There&#8217;s a bunch more I&#8217;ll explain later, but trust me: DO NOT MAKE SYSTEM PROMPTS DYNAMIC.)</p><h1>Add context to user turns</h1><p>Okay, so if you&#8217;re not allowed to put variables in the system prompt (SERIOUSLY DON&#8217;T), then how do you include them?</p><p>User turns, of course.</p><p>But then what if you don&#8217;t want to make a bunch of calls over and over just to get the assistant up to speed?</p><p>Then don&#8217;t. Just make synthetic user turns.</p><h1>Context is the bottom of the hourglass</h1><p>Did you know you can just send the OpenAI-compatible completion endpoint an array of messages? They don&#8217;t have to have been generated by the LLM before&#8230; you can just pretend the LLM said them, and it will continue the conversation as if it had.</p><p>So create synthetic user turns, and put all of your context in them. Importantly, order the user turns from least important to most important. My recommendation is &#8220;user info&#8221;, i.e.
user-supplied content, then any context you want to provide, and finally the output format you want.</p><p>Provide assistant turns just as you would in a real conversation, but make each one ask for the precise information you supply next.</p><h1>OK, but seriously how do I know this works?</h1><p>I&#8217;ll have data for you soon, but aibff (the Bolt Foundry tool for doing evals / decks / cards / contexts / samples) will let you build this stuff using just markdown.</p><p>It&#8217;s really the earliest days for us, so the best way to find out what&#8217;s going on is this newsletter, or <a href="http://boltfoundry.com">boltfoundry.com</a> if you care. Also, reach out to me: <a href="http://x.com/randallb">x.com/randallb</a>.</p>]]></content:encoded></item><item><title><![CDATA[Context is for kings]]></title><description><![CDATA[Engineering is moving from law-driven to context-driven.
Buckle up.]]></description><link>https://contexteng.ai/p/context-is-for-kings</link><guid isPermaLink="false">https://contexteng.ai/p/context-is-for-kings</guid><dc:creator><![CDATA[Randall Bennett]]></dc:creator><pubDate>Mon, 30 Jun 2025 13:04:17 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!zaWm!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F79719440-6599-4589-8c5e-b2cd38805635_1024x1024.png" length="0" type="image/png"/><content:encoded><![CDATA[<blockquote><p>Universal laws are for lackeys.</p><p>Context is for kings.</p><p>&#8212; <a href="https://en.wikipedia.org/wiki/Context_Is_for_Kings">Gabriel Lorca</a></p></blockquote><p>If / then. 0s and 1s. True or false?</p><p>For as long as I can remember, I&#8217;ve always heard about how computers were driven by binary, and how everything could be distilled into two distinct states: On or off.</p><p>Then, I used GPT-3 for the first time. It was very un-binary.</p><p>Depending on the text I wrote before I ran the completion, I could get wildly different results.
And that&#8217;s cool, and it can make for great new experiences, but it&#8217;s hard to build a business on wildly different results.</p><h1>LLMs are more like construction contractors than software.</h1><p>Have you ever worked with a handyman to do construction around your home? Ask them when they&#8217;re going to come fix that hole in the window. Or when they&#8217;re going to finish painting that room.</p><p>&#8220;Tomorrow&#8221; will be a liberal term for &#8220;in the future, when all things line up the way I expect and I remember to be there.&#8221;</p><p>For a homeowner, waiting to get something done is fine: paying less for something that eventually gets delivered at an acceptable quality, regardless of when that happens, is a tradeoff most people can make.</p><p>But not for businesses.</p><p>Imagine your employee shows up every day exactly when you ask, but 20&#8211;80% of the time they can&#8217;t complete the task without extreme oversight. Maybe they&#8217;re having a bad day, maybe they&#8217;re drunk, but either way, it&#8217;s going to be hard to make it work.</p><p>At best you&#8217;d build a system of redundancy; at worst, you&#8217;ll destroy your business&#8217;s credibility, or spend so much time managing this employee that you won&#8217;t end up better off than you started. Either way, quality control and performance reviews are a must.</p><h1>Good employees do what they&#8217;re told. Great employees do the work without being asked.</h1><p>So this is the trick, right? How do you make sure your employee shows up reliably every time without micromanaging them?</p><p>Context, it turns out, isn&#8217;t just for LLMs; it&#8217;s for people too.</p><p>I&#8217;ve only had two managers (Blaine Mucklow and Randolph Faust) who ever understood how I work, and how to put me in positions to succeed.</p><p>The difference between them and every other manager I&#8217;ve ever had?
They gathered organizational context on my behalf, and then helped me focus exclusively on what needed to be done next.</p><p>For me, they were context engineers: people who skillfully understood the problem space and gave me the right information at exactly the right time.</p><p>I got the highest performance rating available at Facebook&#8230; a feat that is extremely rare.</p><h1>Empower and communicate, don&#8217;t command and control.</h1><p>Bad companies and managers treat their employees the way we treat LLMs. They give vague ideas, in whatever order seems pertinent, and hope things work.</p><p>Great companies treat their employees like partners. They give them freedom and autonomy to make choices, and concentrate more on outputs than process.</p><p>So how do we treat our LLMs more like partners?</p><p>Empathy, it turns out, is the key in both cases. Figuring out what your coworker needs to understand, without burdening them with detail, is the difference between a high performer and a person or LLM you want to fire.</p><h1>Communication is the new code</h1><p>We&#8217;ll be talking at length about how communication now matters more than code. 10x engineers have always needed to be effective communicators, but now everyone needs to be.</p><p>I&#8217;m the product of computer science and communication. I can&#8217;t wait to show you how communication changes everything.</p>]]></content:encoded></item></channel></rss>