I used Xcode 26.3 to build an iOS app with my voice in just two days – and it was exhilarating


Xcode sewing app

David Gewirtz / Elyse Betters Picaro / ZDNET



ZDNET’s key takeaways

  • Xcode 26.1 AI was unusable. Xcode 26.3 AI is a big leap.
  • AI-driven migration delivered massive change in under two days.
  • One rule fixed it: no background agents, frequent status updates.

I am sure that the time will come when AI coding doesn’t seem like some amazing new magic ripped from the future. But not today.

The project I’ve been working on for the last two days counts as the third major product I’ve done using AI coding. Actually, it’s more like the third major project set, because I added four premium add-ons to my WordPress security plugin, I built iPhone, Mac, and Apple Watch apps for my 3D printer filament manager, and I’ve just made major inroads into a surprisingly powerful sewing pattern manager on iOS.

Also: Xcode 26.3 finally brings agentic coding to Apple’s developer tools

I’ve found that when I work in a different AI environment, I like to give it its own codebase. That way, I don’t have different AIs getting confused over the same code. So when Apple released its Xcode 26.3 candidate, supposedly with substantially enhanced AI integration, I needed a new project to work on with it.

Using a tightly integrated AI assistant and voice dictation (more on that later) to build a powerful app with built-in machine learning capabilities, I really do feel like I’m living in the future.

The project

My wife suggested I do a sewing pattern manager, something to help her (and possibly other sewists) manage their large stash of paper and digital sewing patterns.


Screens from the app that Xcode 26.3 built

Screenshot by David Gewirtz/ZDNET

It’s a project similar to the filament manager, in that it uses photography and NFC tags to track the locations of physical items. But it’s also a wildly different product in that sewing patterns have many more attributes than rolls of 3D printer filament.

This is not a small market. There may be a few hundred thousand 3D printing geeks in the US, a million at the very most. That pales in comparison to the sewing market, where crafty sewists number around 30 million in the US and Canada, according to the Crafting Industry Alliance.

So, yeah, I chose this project mostly because my wife liked the idea. But the total available market is huge enough that if I decided to take it all the way to an App Store product, there’s some there there.

I tried Xcode before

I tried using Apple’s Coding Intelligence back in November. I was able to use Xcode 26.1 to vibe code a simple “Hello, world” app, which I documented in my how-to article. But as soon as I tried to get it to do more, it got stupid. Then it hung.

Also: How to create your first iPhone app with AI – no coding experience needed

At that time, I used it with both OpenAI’s Codex and Claude Code. That way, it didn’t have to rely on Apple’s own AI. But I couldn’t paste in any screenshots for it to interpret. I couldn’t get it to run for more than one or two steps. It couldn’t create any of the configuration files needed to build an app for any of Apple’s platforms. And it got stuck. A lot.

It was, pretty much, unusable, at least for vibe coding.

Instead, back in November, I used Claude Code in the terminal to create powerful iPhone, Mac, and Apple Watch apps. These used NFC tags and the internal photo capabilities of the iPhone to help me organize my 3D printing filament. The challenge when you have a lot of 3D printers, each of which can support four different spools of filament, is keeping track of what’s on each printer and what’s available in your filament stash.

Also: Claude Code made an astonishing $1B in 6 months – and my own AI-coded iPhone app shows why

It took me a few hours a day, over 11 days, to build the first version of the iPhone app. It took a few more days each to build the Mac and Watch apps. I use that app every day because it solves a personal productivity challenge. For more details, you can read this article.

Claude Code, running out of the humble terminal, was great. It could do all that Xcode 26.1 couldn’t, including configuring those files.

Now there’s Xcode 26.3

Xcode 26.3 isn’t Xcode 26.1. When it came to the integrated AI assistant, Xcode 26.1 was just plain bad news. Xcode 26.3’s developer preview, with some tolerable exceptions, is awesome.

Let me tell you about what I built in less than two days using Xcode 26.3. From now on in this article, when I say Xcode or “the new Xcode,” I mean the developer preview of Xcode 26.3.

Also: Want local vibe coding? This AI stack replaces Claude Code and Codex – and it’s free

I wrote about the new Xcode a few days ago in this article. The big change is that the new Xcode supports agentic operations and can let those agents tap into almost everything that needs doing. The agents also have access to Apple’s coding documentation, which proved to be very helpful.


Screenshot by David Gewirtz/ZDNET

To get started, I made a copy of my filament spool project, renamed it for sewing patterns, and set to work with Xcode and Claude Agent. I didn’t even really get to work on it for two full days. I had a lot of other work to do for ZDNET yesterday. I also had to stop early because it ran out of token allocation for a while. Today, I had to stop early enough to write this article.

Even so, Claude, Xcode, and I inserted 52,947 new lines of code and deleted 10,626 lines of code over 689 files. The current codebase consists of 116 code files totaling 32,381 lines of code. This includes some super cool machine learning additions I made using Apple’s latest AI/ML libraries.

Also: I tried a Claude Code alternative that’s local, open source, and completely free – how it works

I don’t code full time. Not even close. I steal a few hours here and there from all my other responsibilities. But if I were able to devote full time to coding, I estimate it would have taken me an absolute rock-bottom minimum of 4 to 6 months to do the same work I did for part of yesterday and part of this morning.

For me, as an independent lone developer, the force multiplier of AI coding is nothing short of breathtaking.

The part that broke

There have been two main phases to the project so far. The first was migration. The second is adding and removing features.

Migration is a lot harder than you would think. As I said, I copied the original project folder and renamed it. But everything inside was oriented toward spools of filament. Everything in this project has to be oriented around sewing patterns.

It’s not possible to just do a search and replace. Lots of strings of characters had to change, but so did all the data structures, as well as all the app configuration data. This is perfect work for an AI. It’s very technical, very precise, and very, very tedious.

This is also where I almost gave up.

Also: Anthropic says its new Claude Opus 4.6 can nail your work deliverables on the first try

The experience of vibe coding in (at least the preview build of) Xcode 26.3 alternates between exhilarating and amazing, and “what the hell just happened to me?” and “did I just destroy everything?” It’s not a good feeling.

Imagine you’re on the Starship Enterprise. You go to warp, and all the colors are streaming past you at warp speed. It’s exhilarating. It’s an incredible feeling.

And then, all of a sudden, everything goes dark. You see a tiny glimmer of light, and you realize you’re in some cave somewhere. You have no idea what happened, where you are, or how you got there.

That’s what it feels like to vibe code in Xcode with multiple agents running. In this new build of Xcode, Claude and Xcode love to run multiple parallel agents.

You’re cooking along just fine, and then suddenly nothing. The system hangs. You have no idea where in the set of changes you are. There’s nothing you can do about it except quit Xcode and start back up. The big hope is that nothing got completely destroyed. This is not a good feeling.

This deep, dark moment of despair is not caused by just one thing. It’s not caused by just the AI agent running out of context and needing more tokens. And it’s not caused by just a background agent getting stuck because it doesn’t have permissions and isn’t reporting back. And it’s not caused by just having multiple background agents making changes on top of each other, causing them all to stop running.

Also: OpenAI’s Frontier wants to manage your AI agents – it could upend enterprise software, too

It’s caused by all of these situations. All of a sudden, you’re cooking along nicely, and then the next thing you know, you’re stuck in concrete for three hours. Do you restart Xcode? Do you wait? Do you wish once again you’d listened to your mom and gone to law school instead of engineering school?

In my case, I launched another instance of Claude Code in the terminal and asked it to investigate. That was helpful because it was able to identify stuck processes. But running Claude Code in the terminal defeated the whole purpose of doing things in the Xcode IDE.

There doesn’t appear to be a way to stop running background agent tasks, so they just keep running. As far as I can tell, there’s no way to check their status from inside the Xcode IDE, and the only way to kill them is to force-quit the application entirely.

It got worse. All of a sudden, the terminal Claude Code instance told me I had used up 91% of my usage cap and was about to hit 100%. That resulted in a 3-hour and 19-minute work stoppage. The stalled Xcode background agents that ran for more than 3 hours had consumed most of my token allocation. Apparently, even idle or stuck sessions can eat into the session budget because of the context they hold.

When Claude Code does a compression in the terminal, it stops running for quite a while. But at least there’s a message there telling you what’s happening. With Xcode and its penchant for launching background agents, there’s nothing. Apple, if you take no other advice from me today, take this: you need better management and visibility for background agents. Stat.

That was yesterday. Overnight, I thought about it. This morning, I inserted a new instruction into the general CLAUDE.MD instructions for this project. I told it, “Do NOT use background agents or background tasks. Do NOT split into multiple agents. Update me regularly on each step. Do NOT run steps that take more than a minute or two without having an update heartbeat.”

Also: Is your AI agent up to the task? 3 ways to determine when to delegate

Today was very productive. Very productive. Nothing ran in the background. No agent tried racing against the other agents. Claude and Xcode just did their job. It took about 20 minutes for the AI to clean up the migration mess from the previous day.

And then it was time to add new stuff. This is where things got really fun.

The cool new hotness

So here’s the thing. Sewists love them their patterns. A lot. For decades, patterns came in little paper envelopes. You’ve seen them. Maybe you’ve used them or seen your mom buy them. Paper envelope patterns still exist, but digital PDF-based patterns are also big. So are patterns collected in books.

Also: I didn’t need this, but I used AI to 3D print a tiny figurine of myself – here’s how

My wife wanted to manage all three types of patterns in the new app. As you can see from the picture, she has boxes and boxes of paper patterns. We also have a server share dedicated to her digital patterns. And there’s an entire bookcase filled with pattern books.


On the left, a typical sewing pattern envelope. On the right, part of my wife’s pattern collection.

David Gewirtz/ZDNET

Although I’m including support for digital patterns and books of patterns, my attention was mostly focused on the paper envelope-style patterns. That’s because they could be managed in a way similar to my filament spools. They could be tracked with NFC tags and moved from location to location.
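If you’re curious what the NFC side involves, reading a tag takes only a little Core NFC code. Here’s a minimal sketch, assuming an NDEF text payload on the tag; the class name, callback, and payload format are illustrative, not lifted from either app.

```swift
import Foundation
import CoreNFC

// Minimal sketch of reading an ID/location payload off a pattern envelope's NFC tag.
// The reader type, callback, and payload format are illustrative assumptions, not the
// app's actual implementation. Requires the Near Field Communication Tag Reading capability.
final class PatternTagReader: NSObject, NFCNDEFReaderSessionDelegate {
    private var session: NFCNDEFReaderSession?
    var onTagRead: ((String) -> Void)?

    func beginScanning() {
        guard NFCNDEFReaderSession.readingAvailable else { return }
        session = NFCNDEFReaderSession(delegate: self, queue: nil, invalidateAfterFirstRead: true)
        session?.alertMessage = "Hold your iPhone near the pattern envelope's tag."
        session?.begin()
    }

    func readerSession(_ session: NFCNDEFReaderSession, didDetectNDEFs messages: [NFCNDEFMessage]) {
        for record in messages.flatMap({ $0.records }) {
            // Well-known text records decode to a plain string (here, a pattern ID or location).
            if let text = record.wellKnownTypeTextPayload().0 {
                DispatchQueue.main.async { self.onTagRead?(text) }
            }
        }
    }

    func readerSession(_ session: NFCNDEFReaderSession, didInvalidateWithError error: Error) {
        // Called when the scan sheet is dismissed or the read fails; nothing to clean up here.
    }
}
```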

But there is also a big difference. The front and back covers of pattern envelopes are like gold to the sewists who collect them. They show what’s being built, provide inspiration, and also often provide useful and necessary purchase information for all the supplies needed to make whatever the pattern describes.

Also: How I turned an old laptop into a home document station – and cut down on paperwork chaos

My app had to capture very high-quality images of those front and back covers. Essentially, I had to build a scanner app into my pattern database. Not only does the app have to capture an image, it has to decide what part of the image is the pattern envelope, then straighten it, reshape it so corners are right angles, and crop it.
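I won’t reproduce what Claude generated, but to give you a flavor, here’s a minimal sketch of how that kind of envelope detection and straightening can be done with Apple’s Vision and Core Image frameworks. The function name, thresholds, and structure are illustrative, not the app’s actual pipeline.

```swift
import Vision
import CoreImage
import CoreImage.CIFilterBuiltins

// Sketch: find the dominant rectangle (the envelope) and perspective-correct it
// so the corners become right angles. Illustrative only, not the app's code.
func cropEnvelope(from ciImage: CIImage) throws -> CIImage? {
    let request = VNDetectRectanglesRequest()
    request.maximumObservations = 1      // we only care about the envelope itself
    request.minimumConfidence = 0.8
    request.minimumAspectRatio = 0.3

    let handler = VNImageRequestHandler(ciImage: ciImage, options: [:])
    try handler.perform([request])

    guard let rect = request.results?.first else { return nil }

    // Vision returns normalized (0-1) corner points; convert them to pixel coordinates.
    let w = Int(ciImage.extent.width), h = Int(ciImage.extent.height)
    let filter = CIFilter.perspectiveCorrection()
    filter.inputImage = ciImage
    filter.topLeft = VNImagePointForNormalizedPoint(rect.topLeft, w, h)
    filter.topRight = VNImagePointForNormalizedPoint(rect.topRight, w, h)
    filter.bottomLeft = VNImagePointForNormalizedPoint(rect.bottomLeft, w, h)
    filter.bottomRight = VNImagePointForNormalizedPoint(rect.bottomRight, w, h)

    // The output is the straightened, cropped envelope image.
    return filter.outputImage
}
```

Apple’s VisionKit also ships a ready-made document camera (VNDocumentCameraViewController) that handles edge detection, perspective correction, and cropping on its own, if you’d rather not roll your own.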

And then I wanted to do one more thing. Paper patterns are mostly indexed by vendor name (Simplicity, McCalls, etc.) and pattern number. I wanted my app to extract those two pieces of data from the images, and then let users choose them for naming and indexing the patterns.

To do this, I needed to use Apple’s machine learning APIs, most of which are quite new. It turns out that Claude Code, running in the terminal, wasn’t familiar with most of them. But Claude Agent, running in Xcode, was able to easily query Apple’s documentation and create those features.
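I won’t paste Claude’s output here, but Vision’s text recognition is the kind of API involved. Here’s a hedged sketch of extracting candidate pattern numbers from a cover image; the function names and the digit heuristic are illustrative stand-ins, not the app’s actual logic.

```swift
import Vision
import CoreGraphics

// Sketch: OCR a cover image, then pull out strings that look like pattern numbers.
// Function names and the regex heuristic are illustrative assumptions.
func recognizedStrings(in cover: CGImage) throws -> [String] {
    let request = VNRecognizeTextRequest()
    request.recognitionLevel = .accurate
    request.usesLanguageCorrection = false   // pattern numbers aren't dictionary words

    let handler = VNImageRequestHandler(cgImage: cover, options: [:])
    try handler.perform([request])

    return (request.results ?? []).compactMap { $0.topCandidates(1).first?.string }
}

// Pattern numbers are typically short digit runs; this simple filter is a stand-in
// for whatever heuristics the app actually uses.
func candidatePatternNumbers(in strings: [String]) -> [String] {
    strings.flatMap { line in
        line.matches(of: /\b\d{3,5}\b/).map { String($0.output) }
    }
}
```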

Also: 7 apps I use to lock down, encrypt, and store my private files – and most are free

I ran into one other snag. The AI in my app was having a hard time distinguishing the numbers that are part of bar codes from the pattern numbers. So I had Claude and Xcode train my app to identify bar codes and eliminate them from consideration in the pattern number selection process.
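To give a sense of how that filtering can work, Vision’s barcode detector returns the decoded payload for each bar code it spots, which is enough to strike those digits from the candidate list. Another illustrative sketch, with my own function names rather than the app’s actual code:

```swift
import Vision
import CoreGraphics

// Sketch: decode any barcodes on the cover, then drop their digits from the
// OCR-derived pattern number candidates. Function names are illustrative.
func barcodePayloads(in cover: CGImage) throws -> Set<String> {
    let request = VNDetectBarcodesRequest()
    let handler = VNImageRequestHandler(cgImage: cover, options: [:])
    try handler.perform([request])
    return Set((request.results ?? []).compactMap { $0.payloadStringValue })
}

func filteredPatternNumbers(candidates: [String], barcodes: Set<String>) -> [String] {
    // Drop any candidate that matches a barcode payload or appears inside one,
    // since OCR often picks up UPC/EAN digits in fragments.
    candidates.filter { candidate in
        !barcodes.contains(candidate) && !barcodes.contains { $0.contains(candidate) }
    }
}
```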

As a bonus, since my app had to OCR the images in order to find the vendor name and number, I added an OCR data field to each pattern, so users can search on anything on the front or back cover. I also decided to save the bar code number and the actual bar code with each pattern, just in case the user wants it for something.
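In data-model terms, that amounts to a few extra stored properties per pattern. Here’s a hypothetical SwiftData sketch, not the app’s actual schema, just to show the shape of it:

```swift
import Foundation
import SwiftData

// Hypothetical model, not the app's actual schema. The extra fields make every
// word OCR'd off the covers searchable and keep the barcode around "just in case."
@Model
final class SewingPattern {
    var vendor: String                 // e.g., Simplicity, McCalls
    var patternNumber: String
    var ocrText: String = ""           // full recognized text from front and back covers
    var barcodeNumber: String?         // decoded barcode payload
    @Attribute(.externalStorage) var frontCover: Data?
    @Attribute(.externalStorage) var backCover: Data?
    var nfcTagID: String?              // ties the record to a physical envelope
    var storageLocation: String?

    init(vendor: String, patternNumber: String) {
        self.vendor = vendor
        self.patternNumber = patternNumber
    }
}
```

With fields like those, searching "anything on the front or back cover" becomes a simple predicate over the stored OCR text.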

Claude, Xcode, and I did a lot more in our short development cycle. My big goal was to get the AI to build machine learning into the app and make it work by the time this article was published. We met that goal.

My wife is pretty stoked about the app. I’m having a blast working on it. So I’ll probably keep tinkering with it over the weeks until it meets all her needs. I’m not sure if I’ll put it up on the App Store. I do have plans to do an article called something like, “How to vibe code your way onto the App Store,” so you might see more of it or the filament app at some point in the future.

Other tidbits

There is one usability feature in the Xcode implementation I like a lot more than Claude Code’s terminal implementation. You can Command-V paste an image into Xcode’s AI assistant. In Claude Code, you have to use Control-V. It’s not a big thing, except that I have decades of muscle memory doing Command-V to paste on a Mac, and Control-V takes me out of flow. So I like how Xcode does it.

Also: A MacOS 26 bug bricked my $3,700 Mac Studio – here’s how I miraculously got it back

The AI assistant can also do builds, which means that it can see if the code works. If there are errors, it can fix them on its own. Sometimes, it runs for an hour or more all on its own. Today, that let me take a walk in the park (literally), while the AI team of Xcode and Claude were busy figuring out how to properly orient and crop a pattern cover.

Speaking of flow, I’ve been using Wispr Flow to dictate to Xcode. I’d say about 75% of the prompting I did to the AI was done using Wispr Flow. The product has a vibe coding mode that made it work really well during coding. I used to use the Mac’s own dictation, but it’s notoriously unreliable. I’ve found Wispr Flow far more reliable.

Why do I dictate to my development environment, you ask? I have an 8-pound Yorkipoo that likes to curl up on my left shoulder and be held by Daddy while he sleeps. So, of course, I do a ton of my at-computer work using only one hand. Dictation makes this possible, and Wispr Flow handles it remarkably well.

Also: 9 essential Mac apps everyone should be using in 2026 – and why I vouch for them

As for this being a developer candidate for Xcode 26.3, Apple says the full release will be up on the Mac App Store within the month. So stay tuned. I’m hoping it does something about managing those runaway, rogue background tasks.

But even with those troublesome beasties harshing my coding buzz, I have to admit that I’m having a ton of fun vibe coding my way into my wife’s iPhone’s heart.

Have you tried agentic or vibe coding workflows yet? If so, what worked well and what broke down? Do you think tighter IDE integration beats using AI tools from the terminal, or do you prefer keeping them separate? How concerned are you about background agents, runaway tasks, and token usage when AI is deeply embedded in development tools? And finally, do you see yourself trusting AI to handle large-scale refactors or migrations, or does that still feel like a step too far? Let us know in the comments below.


You can follow my day-to-day project updates on social media. Be sure to subscribe to my weekly update newsletter, and follow me on Twitter/X at @DavidGewirtz, on Facebook at Facebook.com/DavidGewirtz, on Instagram at Instagram.com/DavidGewirtz, on Bluesky at @DavidGewirtz.com, and on YouTube at YouTube.com/DavidGewirtzTV.




