Tools
-
Just: The Command Runner
If you’ve ever used `make` to run commands, it works, but you’re using a build system as a command runner. Just is a dedicated command runner that does exactly what you want and doesn’t come with all the baggage.

Created by Casey Rodarmor about a decade ago (it turns 10 this June), Just is written in Rust and laser-focused on being a really good command runner. That’s it; just a clean way to define and run commands.
Why I Use It
On a lot of my projects, I use Just in conjunction with Mise. Mise does a fine job as a task runner, but when I want something more expressive for command orchestration, Just is what I reach for.
For example, I’ve got a lot of really long Python commands these days. I’m injecting environment variables at runtime, calling Poetry, then running Python with specific arguments. There’s a ton of boilerplate you have to type out every time. I don’t want to remember all of that, and I definitely don’t want to go digging through my shell history or copy-pasting from a README.
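As an illustration (the recipe, variable, and module names here are made up, not from a real project), that kind of boilerplate collapses into a single recipe:

```just
# start the API server with env vars and the Poetry wrapping handled for you
serve:
    DATABASE_URL="postgres://localhost/dev" LOG_LEVEL=debug \
        poetry run python -m myapp.server --port 8000
```

Now the whole incantation is just `just serve`.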
Just simplifies that. Define it once in a `justfile`, and now your common commands are just `just <recipe>`.

Modules Changed the Game
Something I started using recently is Just’s module system, which lets you create separate `.just` files and have your main `justfile` import them as submodules. This lets you group related commands naturally.

So instead of flat recipes like:
```shell
just test-cover
just test-unit
just test-integration
```

You can organize them into modules:
```shell
just test cover
just test unit
just test integration
```

It’s a small change, but it makes your command surface feel a lot more intuitive, especially as your project grows and you accumulate dozens of commands.
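A minimal sketch of what that layout might look like (file and recipe names are hypothetical; check the Just manual for the exact `mod` semantics):

```just
# justfile
mod test  # loads recipes from test.just as the `test` module
```

```just
# test.just
cover:
    poetry run pytest --cov

unit:
    poetry run pytest tests/unit
```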
Getting Started
If you want to dig deeper, Casey wrote a great walkthrough called Tour de Just that covers the highlights. The official documentation is great too.
Just is commercially friendly, widely packaged, and easy to install on pretty much any platform. If you find it useful, you should consider sponsoring Casey on GitHub. Open source maintainers deserve support for tools we rely on daily.
/ Tools / Development / Just / Command-runner
-
I'm Just Going to Use Zed
I’ve been thinking about my editor setup and I wanted to work through the costs out loud. This post is mainly for me, but maybe it helps you too.
Here’s where I landed: I’m canceling (well, canceled) JetBrains and Cursor, and just using Zed.
I know that might sound crazy. JetBrains All Products has been my go-to for a while now, and Cursor was interesting for a bit, but turns out all I needed was speed and cheap.
The Cost Breakdown
| Editor | Monthly Cost | Yearly Cost |
| --- | --- | --- |
| Zed Pro | $10.00 | $120.00 |
| Cursor Pro | $20.00 | $240.00 ($192 if billed annually) |
| JetBrains All Products | ~$14.92 | $179.00 |

Zed Pro at $10/month is the cheapest option by a decent margin. That’s $120 a year versus $179 for JetBrains or at least $192 for Cursor. Not a huge difference in money, but it adds up.
Zed Is Enough
My primary way of working these days is Zed plus CLI-based tools. I use Claude Code in the terminal (mostly Ghostty) alongside the IDE, and that combination handles everything. The editor itself is fast, and the AI integration is solid.
I already canceled Cursor. It’s a good product, but not at double what Zed costs. Maybe I’ll go back and try it again at some point, but right now there’s no reason to do that.
Sometimes the right move is just picking what works and not overthinking it.
/ Tools / Development / Zed
-
Voice-to-Text in 2026: The Tools and Models Worth Knowing About
As natural language becomes a bigger part of how we build software, it’s worth looking at the state of transcription models. What’s the best way to get voice to text right now?
For a lot of people, talking to your computer is faster than typing. You can stream-of-thought your way through an idea, prompt your tools, and get things moving without your fingers being the bottleneck. If you haven’t tried it yet, it will change how you work with your machine. I’m not exaggerating.
The Tools
Here’s what people are actually using for desktop voice-to-text:
- Willow Voice — Popular choice, lots of people swear by it
- SuperWhisper — My current pick
- Wispr Flow — Another well-regarded option
- Voice Ink — Worth a look?
- Aiko — From open source dev Sindre Sorhus
- MacWhisper — Solid Mac-native option
I’ve tried several of these, and the biggest pain point for most people will be that many require monthly subscriptions. I’ve been happy with SuperWhisper, and it’s worth mentioning they still offer a one-time (lifetime) purchase option, so you don’t get locked into monthly payments forever. That said, Willow Voice and Wispr Flow both have strong followings.
The Models Behind the Magic
Most of these tools started with OpenAI’s Whisper, the voice model released and open-sourced back in 2022. With Whisper, you could run solid transcription locally on your own hardware.
But we’re a few years past that now, and there are more models to choose from. Here’s a summary of the current state of transcription models.
| Model | Company | Released | Local Run? | Used in Desktop Tools? | Best For |
| --- | --- | --- | --- | --- | --- |
| Whisper Large-v3 | OpenAI | Nov 2023 | Yes | Yes (the standard) | Multilingual accuracy (99+ langs) |
| Whisper v3 Turbo | OpenAI | Oct 2024 | Yes | Yes (fast settings) | Best speed-to-accuracy ratio for local use |
| Nova-3 | Deepgram | Apr 2025 | Self-host | Limited (API-based) | Real-time agents; handling messy background noise |
| Parakeet TDT 1.1B | NVIDIA | May 2025 | Yes | Developer-focused / CLI | Ultra-low latency; significantly faster than Whisper |
| SenseVoice-Small | Alibaba | July 2024 | Yes | Emerging (fringe) | High-precision Mandarin/English and emotion detection |
| Canary-1B | NVIDIA | Oct 2025 | Yes | Developer-focused | Beating Whisper on technical jargon & punctuation |
| Voxtral Mini V2 | Mistral | Feb 2026 | Yes | Yes (privacy apps) | High-speed local transcription on low-VRAM devices |
| Granite Speech 3.3 | IBM | Jan 2026 | Yes | No (enterprise focus) | Reliable technical ASR with an Apache 2.0 license |
| Scribe v2 | ElevenLabs | Jan 2026 | No | Via API | Extremely lifelike punctuation and speaker labels |

We’re at an interesting inflection point. You can articulate your thoughts faster by speaking than typing, and it’s becoming a real productivity gain. It’s not just an accessibility aid anymore. People who can type perfectly well are using these tools daily.
That’s all for now!
/ Productivity / AI / Tools / Voice
-
I got tired of plaintext .env files, so I built LSM.
`lsm exec` will inject secrets at runtime so they never touch the filesystem. Doppler’s idea, minus the monthly bill.

How are you managing local secrets?
/ Programming / Tools / security
-
I switched to mise for version management a month ago. No regrets. No more `brew upgrade` breaking Python. The built-in task runner replaced some projects that were using Makefiles.

Still juggling nvm + pyenv + rbenv?
/ DevOps / Programming / Tools
-
I wrote about why you should stop using pip. Poetry or uv. Pick one. pip has no lockfile, no dependency resolution worth trusting, and no isolation by default.
Have you moved to uv yet? Still happy with poetry? How’s it going?
/ Programming / Tools / Python
-
I compared npm, Yarn, pnpm, and Bun. TLDR version: pnpm wins for most teams, Bun wins if you’re already on the runtime.
Has anyone switched their whole team to Bun yet? How’d that go?
/ Programming / Tools / javascript
-
Wrote a guide on writing a good CLAUDE.md. Takeaway: keep it under 200 lines. Every line loads into context every session, so bloat costs real tokens.
How are you handling multiple AI context files across tools?
/ Programming / Tools / Claude-code
-
LangChain: Observe, Evaluate, and Deploy Reliable AI Agents
LangChain provides the engineering platform and open source frameworks developers use to build, test, and deploy reliable AI agents.
/ Tools / links / agent / open source / software engineering
-
Local Secrets Manager - Dotenv Encrypter
I built a thing to solve a problem. It has helped me, maybe it will help you?
It all starts with a question.
Why isn’t there a good local secrets manager that encrypts your secrets at rest? I imagine a lot of people, like me, have a number of local applications. I don’t want to pay per-seat pricing just to keep my sensitive data from sitting in plaintext on my machine.
I built an app called LSM (Local Secrets Manager) to solve that problem. The core idea is simple: encrypt your `.env` files locally and only decrypt when you need them (sometimes at runtime).

The Problem
If you’ve got a bunch of projects on your machine, each has its own `.env` or `.env.local` file full of API keys you’re definitely not rotating every 90 days. Those files just sit there in plaintext. Any process on your system can read them. And with AI agents becoming part of our dev workflows, the attack surface for leaking secrets is only getting bigger.

ThE CLAW EnteRed ChaT
I started looking at Doppler specifically for OpenCLAW. Their main selling feature is injecting secrets into your runtime so they never touch the filesystem. I was like, cool. I also like that Doppler stores everything remotely. The only problem was the cost, which didn’t make sense for me right now. I don’t want to pay $10-20 a month for this set of features.
So what else is there?
Well GCP Secret Manager has its own set of issues.
You can’t have duplicate names per project, so something as common as `NODE_ENV` across multiple apps becomes more work than you want to deal with. Some wrapper script that injects prefixes? No thanks. I imagine there are a thousand and one homegrown solutions to this problem. Again, no thanks.

So what else is there?
You Find A Solution
AWS Secrets Manager
A Problem for Your Solution
AWS IAM
🫣
I have a lot more to say here on this subject but will save this for another post. Subscribe if you want to see the next post.
The Solution
The workflow is straightforward:
- `lsm init` — Run this once from anywhere. It generates your encryption key file.
- `lsm link <app-name>` — Run this inside your project directory. It creates a config entry in `~/.lsm/config.yaml` for that application.
- `lsm import` — Takes your existing `.env` or `.env.local` and creates an encrypted version.
- `lsm clean` — Removes the plaintext `.env` files so they’re not just sitting around.
- `lsm dump` — Recreates the `.env` files if you need them back.
But wait, there’s more.
Runtime Injection with `lsm exec`

Remember that cool thing I just told you about? Instead of dumping secrets back to disk, you run:

```shell
lsm exec -- pnpm dev
```

I feel like a family man from Jersey who don’t mess around. Aye, you got runtime injection? I got that.

Well, that’s `lsm` anyways. It can decrypt your secrets and inject them directly into the runtime environment of whatever command follows the `--`. Your secrets exist in memory for the duration of that process and nowhere else. No plaintext files hanging around for other processes to sniff.

Credit to Doppler for the idea. The difference is that with LSM your encrypted files stay local.
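The core trick (decrypt in memory, hand the values to a child process’s environment, never touch disk) is easy to sketch. This is an illustrative toy, not LSM’s actual code; base64 stands in for real encryption purely to keep the sketch dependency-free:

```python
import base64
import os
import subprocess
import sys

def decrypt(blob: bytes) -> str:
    # Toy stand-in for real decryption; LSM itself uses actual encryption.
    return base64.b64decode(blob).decode()

def exec_with_secrets(encrypted: dict, cmd: list) -> int:
    """Run cmd with decrypted secrets injected into its environment only."""
    env = dict(os.environ)
    env.update({name: decrypt(blob) for name, blob in encrypted.items()})
    # Plaintext lives in this process and the child; nothing hits disk.
    return subprocess.run(cmd, env=env).returncode

if __name__ == "__main__":
    store = {"API_KEY": base64.b64encode(b"s3cret")}
    exec_with_secrets(store, [sys.executable, "-c",
                              "import os; print(os.environ['API_KEY'])"])
```

When the child process exits, the plaintext goes with it.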
What’s Next
I’ve got some ideas for improvements to try building.
- Separate encrypt/decrypt keys — You create secrets with one key, deploy the encrypted file to a server, and use a read-only key to decrypt at runtime. The server never has write access to your secrets.
- Time-based derivative keys — Imagine keys that expire or rotate automatically.
- Secure sharing — Right now you’d have to decrypt and drop the file into a password manager to share it. There’s room to make that smoother.
I’m not sure how to do all of that yet, but we’re making progress.
Why Not Just Use Doppler?
There are genuinely compelling reasons to use Doppler or similar services. I mean, besides the remote storage, there are access controls and auditable logs. There’s a lot to love.
For local development across a bunch of personal projects? I don’t think you should need a SaaS subscription to keep your secrets encrypted.
LSM is still early, but the core workflow is there and it works.
Give it a try if you’re tired of plaintext `.env` files scattered across your machine.
/ DevOps / Programming / Tools / security
-
Why Ghostty Is My Terminal for Agentic Work
I love Ghostty for agentic work (mostly Claude Code). It doesn’t try to bake in its own agentic coding environment. It’s completely unopinionated about how you use it. It is exactly what I want from a terminal.
It’s open source, primarily made by one person, Mitchell Hashimoto, who doesn’t ask you for any money. No outside investment, no employees. Just a really solid (I think the best?!) terminal emulator.
Sometimes I do wish it had slightly better navigation, and that system notifications were easier to figure out, but this is minor stuff and not a blocker for me being productive or enjoying the work.
The Warp’ed De-Tour
I used to use Warp before Ghostty. I’ll still open it occasionally to see what they’re working on. Warp has some interesting ideas; they’re trying to replace your IDE and be your entire agentic development environment. The problem is they seem to have too many features now for general use. I think this approach will turn off both the IDE crowd and the Neovim crowd simultaneously. So, I keep going back to Ghostty.
We now have a new contender.
Enter Cmux
Cmux is a newer option that actually solves those two minor problems I had with Ghostty. It has better navigation with side tabs, and notifications work out of the box. It’s open source and free to use, and it’s built on Ghostty under the hood, so the core terminal experience is solid.
There’s a small AI company behind it. It looks like their Y Combinator batch was in 2024, and they’re trying to build some kind of product on top of Cmux, possibly memory-related. Though with Claude Code getting better at memory and plenty of free memory frameworks already out there, I’m not sure where that’s headed. Could Cmux be the start of a pivot?
The repo is kind of a mess; they have their website mixed in with the application code. And they offer something called a “Founder’s Edition” for $30/month… and I don’t know how that makes any sense when Warp is $20/month, Zed is $10/month, and Cursor is $20/month.
However, it’s optional, and the free version of Cmux is really good right now, though I’m doubtful it exists in five or ten years. My guess is their exit strategy is to get acquired by a model provider, given that they’ve taken investment.
I am having fun with cmux, so check it out if you haven’t yet!
/ Tools / Development / Terminal
-
Claudine — A kanban board for Claude Code
Manage all your Claude Code conversations with a visual kanban board. Auto-status detection, full-text search, drag-and-drop, and more.
/ Tools / Claude / links / digital organization
-
Software Licensing and Distribution API
Easily add license key validation, entitlements, and device activation to your software products using Keygen’s licensing API.
-
peon-ping — Stop babysitting your terminal
Sound notifications for any AI agent — hooks for Claude Code, Cursor, Codex & more, plus an MCP server so the agent can choose its own sounds.
/ Tools / links / agent / Terminal / notifications
-
Polar — Monetize your software with ease | Polar
Monetize your software with ease
/ Tools / Development / links / platform / creators
-
JavaScript Package Managers: NPM, Yarn, PNPM, and Bun Compared
If you’ve been writing JavaScript for any length of time, you’ve probably had opinions about package managers. Everyone has used npm because it’s the default. Maybe you switched to Yarn back in 2016 and haven’t looked back. These days, there are better options.
That may seem like a bold statement, but bear with me. This article is a mix of opinions and facts. The package manager landscape has changed quite a bit in the last decade, and it’s worth exploring. Let’s break it down.
NPM: The Default Everyone Knows
npm is the package manager that ships with Node. It works. Everyone knows it. Node modules are straightforward to reason about, and security has been improving over the years.
But npm has historically struggled with performance. That’s partly a design problem: it was so widely adopted that making fundamental speed improvements meant risking breakage for the massive ecosystem already depending on it. When you’re supporting millions of packages, you have to be careful about backward-compatibility breaks, which makes optimization a lot harder.
This performance gap is exactly what opened the door for alternatives.
Yarn: The Pioneer That Over-Optimized
Yarn showed up in 2016, created by Facebook, and it genuinely pushed the ecosystem forward. It parallelized downloads, introduced offline caching, and most notably, introduced lock files to JavaScript. npm eventually adopted lock files too, so Yarn’s influence on the broader ecosystem is undeniable.
Lock files did exist for other languages before 2016, such as Ruby and PHP, but Yarn was the first JavaScript package manager to include them.
The problem came with Yarn 2. It’s a classic case of over-optimization.
Yarn 2 introduced Plug’n’Play mode, which replaces your `node_modules` folder with zip files. We’re on Yarn 4 now, and while you can swap between modes, if you’re in the zip mode it becomes genuinely painful to inspect the actual JavaScript code you’re installing. You have to unzip things, dig through archives, and it just adds friction where there shouldn’t be any.

If you enjoy making JavaScript development harder than it needs to be, Yarn’s PnP mode has you covered. It’s your toxic coworker’s favorite tool.
PNPM: The Clear Upgrade
If you look at the benchmarks, pnpm wins in almost every common scenario. Running install with a warm cache, lock file, and existing node modules? Faster than npm. A clean install with nothing cached? 7 seconds versus 30 seconds. That’s not a marginal improvement.
Speed isn’t even the best part. pnpm uses hard links from a centralized store instead of copying packages into every project’s `node_modules`. Depending on the size of your projects, you can save up to 70% of your disk space compared to npm. If you’re working on multiple JavaScript projects (and who isn’t?), that adds up fast.

pnpm also handles the `node_modules` structure in a way that’s strict by default, which means your code can’t accidentally import packages you haven’t explicitly declared as dependencies. It catches bugs that npm would let slide.
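To make the hard-link idea concrete, here’s a small stdlib-only sketch (file and directory names are made up) of one stored file being linked into two projects; both see it, but the bytes exist once on disk:

```python
import os
import tempfile

def link_from_store(store_file: str, project_dir: str) -> str:
    """Hard-link a file from the shared store into a project directory."""
    os.makedirs(project_dir, exist_ok=True)
    dest = os.path.join(project_dir, os.path.basename(store_file))
    os.link(store_file, dest)  # new name, same inode: no data copied
    return dest

if __name__ == "__main__":
    with tempfile.TemporaryDirectory() as root:
        store = os.path.join(root, "store")
        os.makedirs(store)
        pkg = os.path.join(store, "index.js")
        with open(pkg, "w") as f:
            f.write("module.exports = {};\n")
        a = link_from_store(pkg, os.path.join(root, "app-a", "node_modules"))
        b = link_from_store(pkg, os.path.join(root, "app-b", "node_modules"))
        print(os.stat(a).st_ino == os.stat(pkg).st_ino)  # same inode
        print(os.stat(pkg).st_nlink)  # three names pointing at one copy
```

pnpm’s real store is content-addressed by hash, but the disk-space math is exactly this.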
So pnpm is the clear winner, right? Well, there’s one more contender we haven’t talked about yet.
Bun: The Speed Demon
Bun was released in 2023, and it’s noticeably faster than pnpm.
The reason comes down to architecture. pnpm is written in TypeScript and runs on Node, which means every time you run `pnpm install`, your computer has to start the V8 engine, load all the JavaScript, compile it, and then ask the operating system to do the actual work. That’s a lot of overhead.

Bun is a compiled binary written in Zig. It talks directly to your kernel, no middleman, no V8 engine slowing down every tiny decision. On top of that, Bun is hyper-focused on optimizing system calls. Instead of doing file operations one at a time (open file A, write file A, close file A, repeat a thousand times), it aggressively batches them together. The result is speed improvements not just in disk operations but in everything it does.
Earlier versions of Bun had an annoying quirk similar to Yarn: a binary lock file that was difficult to manually audit. That’s been fixed. Bun now uses a readable lock file, which removes the biggest objection people had.
Which raises the question…
So Why Isn’t Everyone Using Bun?
The short answer: it’s complicated. Bun isn’t just a package manager, it also replaces Node as your runtime. If you’re using Bun as your runtime, using it as your package manager makes total sense. Everything fits together.
But most teams are still on Node. And when you’re on Node, pnpm is the clearer choice for everyone involved. A new developer joining your team sees pnpm and immediately knows, “Oh, this is a JavaScript project, I know how this works.” Bun as a package manager on top of Node adds a layer of “wait, why are we using this?” that you have to explain.
Maybe that changes in the future as Bun’s runtime adoption grows. I’m sure the Bun team is working hard to make that transition as smooth as possible. But the reality right now is that most JavaScript projects are running on Node.
My Recommendation
If you’re starting a new project or looking to switch:
- Using Bun as your runtime? Use Bun for package management too. It’s the fastest option and everything integrates cleanly.
- On Node (most of us)? Use pnpm. It’s faster than npm, saves disk space, and is strict in ways that catch real bugs. Your team will thank you.
- Still on npm? You’re not doing anything wrong, but you’re leaving performance and disk space on the table for no real benefit.
- On Yarn PnP? I have questions, but I respect your commitment.
The JavaScript ecosystem moves fast, and if you haven’t revisited your package manager choice in a while, it might be worth running a quick benchmark on your own project. The numbers might surprise you.
/ Tools / Development / javascript
-
Switching to mise for Local Dev Tool Management
I’ve been making some changes to how I configure my local development environment, and I wanted to share what I’ve decided on.
Let me introduce to you, mise (pronounced “meez”), a tool for managing your programming language versions.
Why Not Just Use Homebrew?
Homebrew is great for installing most things, but I don’t like using it for programming language version management. It’s too brittle. How many times has `brew upgrade` decided to switch your Python or Node version on you, breaking projects in the process? Too many, in my experience.

mise solves this elegantly. It doesn’t replace Homebrew entirely; you’ll still use that for general stuff, but for managing your system’s programming language versions, mise is the perfect tool.
mise the Great, mise the Mighty
mise has all the features you’d expect from a version manager, plus some nice extras:
Shims support: If you want shims in your bash or zsh, mise has you covered. You’ll need to update your RC file to get them working, but once you do, you’re off to the races.
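For reference, the RC-file change is a one-liner. This is the zsh form as I understand it; verify the exact flags against the current mise docs for your shell:

```shell
# ~/.zshrc
eval "$(mise activate zsh --shims)"
```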
Per-project configuration: mise can work at the application directory level. You set up a `mise.toml` file that defines its behavior for that specific project.

Environment management: You can set up environment variables directly in the toml file, auto-configure your package manager, and even have it auto-create a virtual environment.
It can also load environment variables from a separate file if you’d rather not put them in the toml (which you probably want if you’re checking the file in).
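For instance, mise’s `_.file` directive can pull variables from a file you keep out of version control (the file name here is just an example; double-check the directive against the current mise docs):

```toml
[env]
APP_ENV = "development"   # safe to commit
_.file = ".env.local"     # secrets stay out of the repo
```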
It’s not a package manager: This is important. You still need poetry or uv for Python package management. As a reminder: don’t ever use pip. Just don’t.
A Quick Example
Here’s what a `.mise.toml` file looks like for a Python project:

```toml
[tools]
python = "3.12.1"
"aqua:astral-sh/uv" = "latest"

[env]
# uv respects this for venv location
UV_PROJECT_ENVIRONMENT = ".venv"
_.python.venv = { path = ".venv", create = true }
```

Pretty clean, right? This tells mise to use Python 3.12.1, install the latest version of uv, and automatically create a virtual environment in `.venv`.

Note on Poetry Support
To get Poetry working, I had to have mise build Python from source; there’s some problem with the precompiled binaries it uses by default. You’ll want to leave that compile-from-source setting turned on.
You can install global Python packages, like poetry, with the following command:

```shell
mise use --global poetry@latest
```

Yes, It’s Written in Rust
The programming veterans among you may have noticed the toml configuration format and thought, “Ah, must be a Rust project.” And you’d be right. mise is written in Rust, which means it’s fast! The project is stable, has a ton of GitHub stars, and is actively maintained.
Task Runner Built-In
One feature I wasn’t expecting: mise has a built-in task runner. You can define tasks right in your `mise.toml`:

```toml
[tasks."venv:info"]
description = "Show Poetry virtualenv info"
run = "poetry env info"

[tasks.test]
description = "Run tests"
run = "poetry run pytest"
```

Then run them with `mise run test` or `mise r venv:info`.

If you’ve been putting off setting up Make for a project, this is a compelling alternative. The syntax is cleaner and you get descriptions for free.
I’ll probably keep using Just for more complex build and release workflows, but for simple project tasks, mise handles it nicely. One less tool to install.
My Experience So Far
I literally just switched everything over today, and it was a smooth process. No major issues so far. I’ll report back if anything breaks, but the migration from my previous setup was straightforward.
Now I need to get the other languages I use, like Go, Rust, and PHP, set up and moved to mise. Having everything consolidated into one tool is going to be so nice.
If you’re tired of Homebrew breaking your language versions or juggling multiple version managers for different languages, give mise a try.
The documentation is solid, and the learning curve is minimal.
/ DevOps / Tools / Development / Python
-
Typst: The new foundation for documents
Typst is the new foundation for documents. Sign up now and experience limitless power to write, create, and automate anything that you can fit on a page.
-
Hiveword is the novel organizer for the serious plotter.
/ Tools / links / task management / notes / digital organization
-
2026: The Year We Stop Blaming the Tools
Here’s a hard truth we’re going to have to face in 2026: sometimes the bottleneck isn’t the technology, it’s us.
I’ve been thinking about how we use tools, how we find the merit in their use. We have access to increasingly powerful tools, but their value depends entirely on our understanding of them.
A hammer is useless if you don’t know which end to hold. The same goes for AI assistants, automation frameworks, and the growing ecosystem of agentic systems.
The rapid adoption of tools like OpenClaw’s agentic assistant tells me something important: people and companies are starting to see the real potential in building autonomous systems. Not just as toys or experiments, but as genuine productivity multipliers. That’s a shift from where we were even a year ago.
I think 2026 will be the year we see more widespread adoption of genuinely useful tools. The Gartner hype cycle, and how it does or doesn’t apply to AI adoption, is really interesting, but I won’t cover it here. I’d like to write more about that in future articles.
The companies that build genuinely useful tools will be the ones that survive. They’ll be the ones that understand the value of tools and how to use them effectively. They’ll be the ones that embrace the future of work, where humans and machines work together to achieve more.
It’s not about replacing humans. It’s about humans getting better at wielding the tools we’ve built. That’s always been how technology works. This time is no different.
/ AI / Tools / automation / 2026
-
The Rise of Spec-Driven Development: A Guide to Building with AI
Spec-driven development isn’t new. It has its own Wikipedia page and has been around longer than you might realize.
With the explosion of AI coding assistants, this approach has found new life and we now have a growing ecosystem of tools to support it.
The core idea is simple: instead of telling an AI “hey, build me a thing that does the boops and the beeps” then hoping it reads your mind, you front-load the thinking.
It’s kinda obvious, with it being in the name, but in case you are wondering, here is how it works.
The Spec-Driven Workflow
Here’s how it typically works:
- Specify: Start with requirements. What do you want? How should it behave? What are the constraints?
- Plan: Map out the technical approach. What’s the architecture? What “stack” will you use?
- Task: Break the plan into atomic, actionable pieces. Create a dependency tree—this must happen before that. Define the order of operations. This is often done by the tool.
- Implement: You work with whatever tool to build the software from your task list. The human is (or should be) responsible for deciding when a task is completed.
You are still a part of the process. It’s up to you to make the decisions at the beginning. It’s up to you to define the approach. And it’s up to you to decide you’re done.
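To make those steps concrete, here’s a toy spec-and-task file. The feature, endpoint, and task names are invented, and the format isn’t tied to any particular tool:

```markdown
# Spec: CSV export for reports

## Requirements
- Users can download any report as CSV
- Exports must stream; reports can exceed 100k rows

## Plan
- Add a `GET /reports/:id/export` endpoint on the existing report service

## Tasks
1. [ ] Streaming CSV serializer          (no dependencies)
2. [ ] Export endpoint                   (depends on 1)
3. [ ] Download button in the report UI  (depends on 2)
```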
So how do you get started?
The Tool Landscape
The problem we have now is that there’s no unified standard. The tool makers are too busy building moats to take the time to agree.
Standalone Frameworks:
- Spec-Kit - GitHub’s own toolkit that makes “specifications executable.” It supports multiple AI agents through slash commands and emphasizes intent-driven development.
- BMAD Method - Positions AI agents as “expert collaborators” rather than autonomous workers. Includes 21+ specialized agents for different roles like product management and architecture.
- GSD (Get Shit Done) - A lightweight system that solves “context rot” by giving each task a fresh context window. Designed for Claude Code and similar tools.
- OpenSpec - Adds a spec layer where humans and AI agree on requirements before coding. Each feature gets its own folder with proposals, specs, designs, and task lists.
- Autospec - A CLI tool that outputs YAML instead of markdown, enabling programmatic validation between stages. Claims up to 80% reduction in API costs through session isolation.
Built Into Your IDE:
The major AI coding tools have adopted this pattern too:
- Kiro - Amazon’s new IDE with native spec support
- Cursor - Has a dedicated plan mode
- Claude Code - Plan mode for safe code analysis
- VSCode Copilot - Chat planning features
- OpenCode - Multiple modes including planning
- JetBrains Junie - JetBrains' AI assistant
- Google Antigravity - Implementation planning docs
- Gemini Conductor - Orchestration for Gemini CLI
Memory Tools
- Beads - Use it to manage your tasks. Works very well with your Agents in Claude Code.
Why This Matters
When first getting started building with AI, you might dive right in and be like “go build thing.” Then you keep iterating on a task until it falls apart once you try to do anything substantial.
You end up playing a game of whack-a-mole, where you fix one thing and break another. This probably sounds familiar to a lot of you from the olden times of two years ago, when we puny humans did all the work. The point being, even the robots make mistakes.
Another thing you come to realize is that it’s not a mind reader. It’s a prediction engine. So be predictable.
What did we learn? With spec-driven development, you’re in charge. You are the architect. You decide. The AI handles the details and the execution, but it needs structure, and these methods are how we provide it.
/ AI / Programming / Tools / Development
-
The Foundation for your Design System
A set of beautifully designed components that you can customize, extend, and build on. Start here then make it your own. Open Source. Open Code.
-
Everyone’s crashing out over the OpenCode situation. Why not just use Claude Code (2.1+)? Or, you know, there’s AMP. AMP exists too, and it looks equally interesting to me.
/ AI / Tools / Development
-
Claude Code has been working great for me. OpenCode looks interesting, but uh, Opus 4.5 access is necessary for real work. I’m not doing any sketchy workarounds to get it running, and API pricing isn’t appealing either. So for now, OpenCode stays firmly in the “interesting” category.