Six Colors


By Jason Snell

I’ll have my AI email your AI

There’s a joke in one of my favorite movies, “Real Genius,” which feels directly applicable to a lot of AI discussions we’re having today. (It’s an ’80s movie, so it’s not a scene—it’s a montage, set to “I’m Falling” by The Comsat Angels.)

In it, our protagonist Mitch attends a normal math lecture, but over the course of the montage most of the class is replaced by tape recorders of various sizes.1 In the final shot, Mitch enters the lecture hall to discover that a large reel-to-reel tape player has replaced the professor himself. It’s just one tape recording being played into all the other tape recorders.

One of the announced features for Apple Intelligence, Smart Reply, will offer quick ways to respond to direct queries in email, asking you simple questions (“Do you like me? Check yes or no.”) and drafting a reply for you.

Apple is hardly the first company to suggest that in the future, your phone will write your emails for you. Gmail’s Smart Compose has been doing it for several years, and Apple’s been offering its own version of multi-word autocomplete for almost a year.

But with this latest round of AI announcements, once again, I’ve heard a lot of people making jokes about how, pretty soon, your AI will email my AI, and humans will never need to be involved anymore! It’s usually meant as absurdity, but I think there might be more to it than that.

Suppose our AIs end up emailing each other endlessly, striking up meaningless conversations and having their own inner lives. In that case, that might make for an interesting science fiction story, but I’m not sure it would really matter to us as humans. Think of it this way: email is just a communication pathway. It was built for humans to talk to each other, but for years now, we’ve received automated emails, newsletters, spam, and the rest.

If you know much about tech, you’ve heard of APIs, or Application Programming Interfaces. APIs are, at their most abstract level, an agreed-upon method for software to use or communicate with other software. APIs are in the cloud, on the web, on our devices, everywhere. So why not in our email messages, too?

I realize that it’s absurd to consider that a free-form email message would ever be better than a programmed API, but email has a flexibility that other APIs don’t. Emails can be about literally anything. And a lot of times, APIs are just not well used because the people who would use them are lazy, busy, uninterested, or don’t know they exist.

Let’s say you need to find a common meeting time for you and four other people. Are there internet calendar APIs for this? Yes! Are there calendar apps with built-in support for this sort of scheduling? Yes! Are there literally web apps that will do this work for you? Yes! (I use StrawPoll, myself.) And yet, I’d bet that most people just… send an email to everyone asking them if they can make a certain time and try again until they get it right. It’s not efficient, but it is convenient.

Now imagine that same scenario, but everyone is using an AI system that’s reading email and has access to each user’s calendar. The end result might be the same as using an existing API or web app, but instead, email messages among AIs sort it out. Maybe some AIs know exactly when their person is available; others might need to ask. But instead of the onus being on the users to interface with other systems and bring it all together, the AIs handle most of it and the user just chimes in when it’s necessary.
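The core of what those scheduling AIs would be negotiating is simple enough to sketch. Here’s a minimal, hypothetical version in Swift (all names invented): given each person’s busy intervals for a working day, find the first slot that everyone has free.

```swift
// A half-open busy interval, expressed in minutes from midnight.
struct Interval {
    var start: Int
    var end: Int
}

// Flatten everyone's busy time into one sorted list, then sweep the
// day looking for the first gap of at least `duration` minutes.
func firstCommonSlot(busyCalendars: [[Interval]],
                     dayStart: Int = 9 * 60,   //  9:00
                     dayEnd: Int = 17 * 60,    // 17:00
                     duration: Int = 30) -> Interval? {
    let allBusy = busyCalendars.flatMap { $0 }.sorted { $0.start < $1.start }

    var cursor = dayStart
    for busy in allBusy {
        if busy.start - cursor >= duration {
            return Interval(start: cursor, end: cursor + duration)
        }
        cursor = max(cursor, busy.end)
    }
    return dayEnd - cursor >= duration
        ? Interval(start: cursor, end: cursor + duration)
        : nil
}
```

In the email version of this, each AI would contribute its person’s busy list (or ask for one), and any of them could run the sweep; the point is that the computation is trivial once the availability data has been gathered, which is exactly the part the back-and-forth emails do today.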

I don’t think that’s an absurd scenario. (And yeah, if the AIs are particularly intelligent, maybe they’ll use an existing calendar service to solve the problem up front.) It’s the equivalent of each of those people having their own human assistant setting up the meeting—except none of them likely have the budget to hire a personal assistant.

In fact, where AI assistants really run into trouble is not when they’re talking to other AIs, but when they’re talking to human beings. Remember when Google showed off its service that pretended to be a human and called real people to verify Google Maps data or make reservations? That’s what I really dread: being battered by emails and texts and phone calls from AIs operating for people and organizations who want my attention but aren’t willing to give me any of their own.

As long as I, a human, don’t have to read a pile of AI-to-AI email communications, I don’t mind if they have them. The protocol doesn’t really matter—use iMessage or RCS, for all I care—so long as the job gets done and I’m not left to clean up the mess. Keep me out of it, other than answering questions or making my own requests.

Email and text messages may be a stupid way to build an interconnected web of AI software systems, but history has frequently shown us that sometimes the easiest solution is the one that’s available, not the one that’s the most elegant.2


  1. The scene is meant to satirize the apparent mid-80s proclivity of college students to tape their lectures, or to skip their lectures and have a friend tape them? I dunno. Three years after “Real Genius” came out, I went to college and discovered that there was an official student organization that would sell you the complete lecture notes of any major class. 
  2. My university’s Lecture Notes service was eventually replaced by—you guessed it—AI. 

By John Moltz

This Week in Apple: More like bored meetings, amirite?

John Moltz and his conspiracy board. Art by Shafer Brown.

Phil Schiller has a new role, we get a glimpse into Apple’s fall releases, and why don’t Epic and Apple just kiss already?

Ben Stein: “Schiller? Schiller?”

Congrats to Apple’s fine fresh Fellow Phil Schiller for landing a cushy gig on the board of OpenAI. Even better for Schiller, he’s just auditing this class.

“Apple Poised to Get OpenAI Board Observer Role as Part of AI Pact”

As an “observer” all he has to do is show up to some Zoom meetings! He doesn’t even have to read the board books! He probably has to put a shirt on, but no one’s gonna know if he’s not wearing pants. Sweet gig.

It’s also sweet for Apple. As Dare Obasanjo notes, Microsoft had to invest $13 billion in OpenAI for the same privilege. Apple paid nothing, it just happens to have the platform OpenAI really wants to be on.

While Schiller is only supposed to be an observer, maybe he can ask them about this:

“OpenAI’s ChatGPT Mac app was storing conversations in plain text”

Oops.…

This is a post limited to Six Colors members.


Our app routines, how we feel about smart rings, the smartest other tech gadgets in our homes, and how we stay informed without spiraling into doom.



By Jason Snell for Macworld

Hey Siri? No, I didn’t mean you.

We’re about to enter the Apple Intelligence era, and it promises to dramatically change how we use our Apple devices. Most importantly, adding Apple Intelligence to Siri promises to resolve many frustrating problems with Apple’s “intelligent” assistant. A smarter, more conversational Siri is probably worth the price of admission all on its own.

But there’s a problem.

The new, intelligent Siri will only work (at least for a while) on a select number of Apple devices: iPhone 15 Pro and later, Apple silicon Macs, and M1 or better iPads. Your older devices will not be able to provide you with a smarter Siri. Some of Apple’s products that rely on Siri the most—the Apple TV, HomePods, and Apple Watch—are unlikely to have the hardware to support Apple Intelligence for a long, long time. They’ll all be stuck using the older, dumber Siri.

This means that we’re about to enter an age of Siri fragmentation, where saying that magic activation word may yield dramatically different results depending on what device answers the call.

Fortunately, there are some ways that Apple might mitigate things so that it’s not so bad.

Continue reading on Macworld ↦


by Jason Snell

MacStories calls for restraints on AI training

Federico Viticci and John Voorhees of MacStories have published an open letter to EU and U.S. officials calling for stricter regulation of how AI models are trained that honors the intellectual property rights of creators:

The danger to the Internet as a cultural institution is real and evolving as rapidly as AI technology itself. However, while the threat to the web is new and novel, what these AI companies are doing is not. Quite simply, it’s theft, which is something as old as AI is new. The thieves may be well-funded, and their misdeeds wrapped in a cloak of clever technology, but it’s still theft and must be stopped.

It’s a good read and a solid argument. I hope someone, somewhere is paying attention.

—Linked by Jason Snell

After some feedback about the future of the Vision Pro, we discuss Apple’s adventures in the EU, the inevitable fragmentation of Siri, and a curious new AirPods rumor.


By John Moltz

This Week in Apple: Beta times ahead

John Moltz and his conspiracy board. Art by Shafer Brown.

Apple and the EU continue to butt heads, betas are for everyone these days, and Meta gets the cold shoulder.

EU Island

Clearly the only way to solve Apple’s problems with the EU is to rent a mansion somewhere and have the two of them live together for however long it takes to film a 24-episode season of reality television. Hey, it’s gotta work better than whatever it is they’re doing now.

“EU Accuses Apple’s App Store Steering Rules of Violating DMA and Opens Investigation into Developer Fees”

In addition to not thinking much of Apple’s steering rules, the EU said other policies, including the Core Technology Fee, “fall short of ensuring effective compliance with Apple’s obligations under the DMA.” If the EU is suggesting that Apple can’t make money off apps that are distributed in other ways than the App Store, we could be entering a whole new ballcan of wormgames.…

This is a post limited to Six Colors members.


By Dan Moren

The Back Page: EU, gross

Dan writes the Back Page. Art by Shafer Brown.

Hi team,

You might have heard recently about some challenges we’ve been having when it comes to regulation. It’s been alleged that Apple is anticompetitive, that we use our power and position in the market in order to dictate terms. That we are “gatekeepers” preventing a free flow of commerce and innovation. But nothing could be further from the truth: Apple is and always has been committed to building the best products that we can and competing on the merits. Keep gates? We’re firmly anti-gate! People just keep foisting them on us.

Frankly, it’s insulting to suggest that we, the company behind the competitive and democratic engine that is the App Store, would do anything that discourages a free and open market. Apple fully complies with all laws and regulations in local jurisdictions—just look at China, for example. Do you think we want to be in business with a repressive regime that directly contradicts so many of the values we claim to espouse?…

This is a post limited to Six Colors members.



By Jason Snell

Apple’s Vision platform needs to do more than get cheaper

Meta Quest 2 controllers

The Vision Pro isn’t a product many people should buy today, and that’s not really surprising. It’s an example of Apple playing a long game, trying to build a wearable computing platform over many years. You have to start somewhere.

Right now, it’s a development kit for developers who are willing to gamble or experiment with a platform that’s not going to be broadly adopted for a while, if ever. It’s a pretty intriguing niche entertainment product, but it’s desperately in need of more content. And it’s a productivity product for people with very specific use cases and work methods. Still, most people should not consider buying one—especially not at $3500—and most people are definitely not!

But at some point, long game or not, Apple needs to start progressing and growing the visionOS platform. Mark Gurman’s report that Apple is working on a $1500 model for late next year is a start. $1500 isn’t cheap, but it’s less than half the price of the current model, and therefore more likely to snag curious people and improve the viability of buying one just to watch immersive events or 3-D videos or whatever.

So, not cheap, but… cheaper. And that’s a good start. Right now, the high price of the product is the top gating factor in growing the platform. Even if you’re impressed by the demo, it’s hard to get over that price tag.

But price isn’t the platform’s only challenge. The lack of software and content is also huge. If there’s a cheaper Vision product coming in late 2025, that means Apple has a year and a half to beef up what’s available on visionOS so that it can put itself in the best position to grow the platform when the lower-cost model is released.

The Vision Pro is the result of a years-long development process, which means that the current product as shipped is the outcome of Apple’s initial thinking about the device. Presumably, the people working on Vision Pro have learned a lot, both during the final years of the project and in its first few months out in the real world.

That’s good, because it’s time to reconsider some of the early decisions about the product and the platform. Obviously, this is already being done, because there’s no way that Apple can make a $1500 headset without pulling out some “must-have” features. (The obvious one is the lenticular outward-facing display, but I’m sure there are other features that seemed incredibly important that, in hindsight, are wastes of money.)

On the entertainment front, Apple’s made some strides in at least announcing partnerships with makers of hardware that can shoot in 3D and Immersive formats. But it needs to invest more in getting developers to build their apps on visionOS, and since the size of the near-term market opportunity sure won’t do it, some other inducement—like maybe even money?—might be a good idea.

And if Apple wants to get serious about expanding and growing the Vision product line, it needs to get over one particular choice it made in launching it. The company was clearly so proud of its advanced hand-tracking interface that it shipped the Vision Pro with no additional input devices. And I get it! “If you see hand controllers, they blew it” could have been one of the catchphrases of the Vision Pro development process. A headset shouldn’t require add-on controllers to be usable.

But just as the Mac eventually got arrow keys (despite omitting them from the first Mac keyboard to encourage using the mouse) and the iPad got an Apple Pencil (despite being a touch-first interface), it’s time for Apple to get over itself, and either build precision hand controllers for visionOS or build an API and make a partnership with a third-party accessory developer.

The fact is, lots of games and game-adjacent apps require a level of precision that Apple’s (excellent) hand tracking just can’t muster. Every Vision Pro game I’ve played that featured hand tracking has been a sloppy mess. I get that Apple wanted to show off its hand tracking and lean into “spatial computing” to send the message that the Vision Pro is not a game console but a serious device, but in doing so, it turned its back on the most popular category of entertainment software in the entire VR headset category.

One way for Apple to entice people to the visionOS platform—especially if a much cheaper model is on the way—is to load up on entertainment content. 3-D movies and immersive video are great, and if Apple’s not trying very hard to cut deals and encourage more content that shines on Vision Pro, it’s going to have wasted all of its effort. But if the platform can play games, if developers can port their games to visionOS from other VR platforms, it increases the viability of the product.

I’ve got a Vision Pro and a Meta Quest 3. And yet the Quest 3, which costs about one-seventh of the price of the Vision Pro, is a vastly superior platform when it comes to playing certain kinds of games. Games require precision positioning (through detailed movement tracking) and input (via on-controller buttons) that waving your hands and tapping fingers together in Vision Pro just can’t match.

So, does Apple want visionOS to succeed or not? If it does, it needs to build or support hand controllers by the time a cheaper visionOS device ships. It needs to fill the platform with fun, fast-twitch games, exercise apps, and other stuff that’s proven successful elsewhere. No, the Vision Pro is not a games console. But if it stands defiantly against that kind of use case out of some sort of dogmatic opposition, Apple will have made it that much harder for an already hard-to-sell platform to succeed.


TV critic Tim Goodman guests to discuss the WBD and Paramount messes and give an update about what he’s been up to over at his Substack. [Downstream+ subscribers get to hear us talk about a very weird New York Times article about media moguls on a yacht.]


Reliable features of voice-based virtual assistants, our hypothetical U.S. internet legislation, the impact of Apple’s new Passwords app on our password management, and our comfort level with sharing intimate thoughts with an LLM.



By John Moltz

Review: Moaan InkPalm Plus is weird, cheap, small, and my kind of e-reader

The Moaan InkPalm Plus and the Kobo Clara HD
The Moaan InkPalm Plus and the Kobo Clara HD.

It is probably not surprising that I, John Moltz, the world’s leading iPhone mini superfan, would also want to use a small e-reader. That’s just science.

After spending years reading ebooks on my iPhones and iPads like an animal, I finally got a Kobo Clara HD three years ago. And I really like it. It’s reasonably small, reasonably priced, has a nice screen, and it helped me reduce my crippling dependency on Amazon.

So, why did I think I needed another e-reader? Because they started making even smaller ones.

So buttons

Last fall Jason reviewed the Boox Palma, a phone-sized e-reader that looked right up my alley. Not only would it be easy to hold with one hand, it also had physical navigation buttons, something my Kobo, like most of the smaller and less expensive readers, lacked. The problem is that it costs $280. I said up my alley, not up my gated community. It’s not an unreasonable amount, it was just more than I wanted to spend since I already had an e-reader. Nothing to do but wait for prices to come down, I guess.

Or maybe I didn’t have to wait. A post on Mastodon got boosted into my feed that touted the Xiaomi Moaan InkPalm 5 which sells for about $95. Now you’re talking my kind of cheap. Looking into the Moaan lineup, I then found the InkPalm Plus which features a slightly larger screen, more storage and a more up-to-date version of Android, all for as low as $124 on AliExpress.

Sold.

Continue reading “Review: Moaan InkPalm Plus is weird, cheap, small, and my kind of e-reader”…


By Dan Moren for Macworld

Is an Apple Vision SE the key to spatial computing’s success?

Though Apple Intelligence may have taken the spotlight at this month’s Worldwide Developers Conference keynote, Apple’s big announcement of last year has not been completely forgotten. The company also debuted visionOS 2, with some important features that were lacking from the first release, and announced the spatial computer’s imminent availability in several international markets.

But both of those may have been temporarily overshadowed by a recent report that the company is currently focusing its efforts on a less expensive version of the product—note that I didn’t say “cheap”, as the rumored price tag is still in the $1500 range, making it more economical only by comparison to the $3500 Vision Pro.

Prioritizing such a device over the Vision Pro 2 makes a lot of sense: the Vision Pro, by all accounts, was cutting-edge technology that was as good a product as Apple could make. As it is, it should continue to be very capable for several years—and I’m sure few of the early adopters would be happy to see their very expensive spatial computer superseded in short order. But such a strategy also raises questions about the future of the Vision line and what exactly Apple is planning for it.

Continue reading on Macworld ↦


by Jason Snell

Rounding up Bartender alternatives

I’ve been meaning to write up a comparison of menu bar utilities in the wake of Bartender being sold, but Niléane beat me to it:

If your trust in Bartender has wavered as a result of this series of events, you may be looking for alternatives. I have been, too. So, I’ve rounded up some of my favorite menu bar management utilities available right now and even a couple of macOS tips to help manage the menu bar without having to install any third-party apps at all.

I’ve been using Hidden Bar on my laptop without issue, but admit to being intrigued by Ice as well. Also don’t miss her article’s tips to compact your menu bar… because you might not need a menu-bar manager at all!

—Linked by Jason Snell

By Jason Snell

Preparing for the era of orchestrated apps

App Intents
Apple’s App Intents slide from WWDC 2024.

It’s going to be years before we can really see the impact of Apple embracing systemwide AI features via Apple Intelligence. Many of the features announced at WWDC 2024 won’t even ship until next year, and the keynote’s Siri segment alone was so full of future-tense descriptions and metaphors about the beginning of a journey that it’s quite clear this is going to take some time.

But let’s try to look out into the future. Let’s consider what the iPhone, in particular, might look like once Siri gets smart and Apple Intelligence takes hold. It’s a future that may dramatically change what we think of as apps—and that holds some serious threats (as well as opportunities) for app developers.

Tuning the orchestra

Over the next few years you’re going to be hearing a lot more about a concept that Apple started to discuss at WWDC this year: orchestration. Broadly, the idea is that the machine-learning models on your Apple devices are going to be able to understand what you want to do, based on your commands and current context, and make it happen by using the combined resources of your device’s system software and third-party apps.

When everything is orchestrated properly, all the capabilities of all your apps are put into a big soup, and the AI system at the heart of your device can choose the right capabilities to do what you need it to do—without you having to specify all the steps it needs to take to get there.

This is, in many ways, the ultimate promise of user automation. For years I’ve been a fan of tools that let users create scripts or automations or workflows that connect up different aspects of their computing lives in order to save time and end busywork. Computers have eliminated countless sorts of drudgery, but if you use a computer every day, you probably still frequently find yourself doing some 21st-century drudgery, pasting this thing over here, clicking that thing over there, often in a mindless, repetitive sequence.

I can automate you out of that with some combination of AppleScript or Shortcuts or Keyboard Maestro or shell script or some other macro language… and I have done so for myself, friends, and family. But the truth is, most people are never going to build even a simple Shortcut for themselves.

But… what if they don’t have to? In a world of properly orchestrated apps, they wouldn’t. They’d just say what they wanted, and their device would do all the work. If they needed to do the same task repeatedly, they could just tell Siri that, and at that point, you’ve basically built an automation workflow in zero steps.

That’s the holy grail of user automation, honestly. Tell your device what to do, and it does it—you don’t need to be involved at all. The drudgery evaporates. How civilized.

Intents and purposes

Okay, so the automation utopia may be upon us soon. But it’s easier said than done, and that’s because the functions in our apps on all our devices aren’t all magically known to Siri and Apple Intelligence. App developers have to specifically mark out the key functionality of their apps and bundle it up in a specific way so that it’s accessible to the broader system.

This is how AppleScript worked back in the day, and in today’s Shortcuts era, it’s enabled by something called App Intents. App Intents aren’t new—as I said, they’re what powers Shortcuts—but as of 2024, they’re much more meaningful than they used to be, because they’re how apps integrate with Apple Intelligence.

What Apple’s asking app developers to do is put in extra work in order to allow their apps to offer up their unique functionality to the system in an organized way. The result will be that the system will know those capabilities exist and will be able to use them as needed, based on whatever the user wants to do. If I’m looking at a photo and say I want to share that with Myke and Stephen in Slack, Apple Intelligence needs to understand what I’m looking at, export that photo in a format that’s reasonable for sharing in Slack, and then use Slack to choose the right venue for me to share with Stephen and Myke. (Oh, and based on context it also needs to intuit that I mean Stephen Hackett and Myke Hurley—two people who are frequently connected—and not people I know separately like Steven Schapansky and Mike Gordon.)
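For a developer, “marking out the key functionality” means declaring it with Apple’s App Intents framework. Here’s a hypothetical sketch of what exposing a photo-sharing capability might look like; the intent name and parameters are invented, and a real app would supply its own sharing logic inside `perform()`:

```swift
import AppIntents

// Hypothetical intent: declares a "share this photo" capability so the
// system (Shortcuts, Siri, Apple Intelligence) can discover and invoke it.
struct SharePhotoIntent: AppIntent {
    static var title: LocalizedStringResource = "Share Photo"

    @Parameter(title: "Photo")
    var photo: IntentFile

    @Parameter(title: "Recipients")
    var recipients: [String]

    func perform() async throws -> some IntentResult {
        // The app's own code would export the photo in a shareable
        // format and hand it to the chosen conversation.
        return .result()
    }
}
```

The declaration is the whole point: the struct describes what the capability is and what it needs, and the system does the rest, resolving the photo and the recipients from whatever the user said, then calling `perform()`.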

It’s all tricky, but the potential is enormous. Apps are mostly islands unto themselves, and it can be a real effort to get them to work together the way you want them to. I once built a wild system that basically connected my email client to a database1—the apps didn’t know about each other, and they didn’t need to—but by connecting them, I got a huge productivity boost. With Apple Intelligence and App Intents (so many AIs!), the potential is there for your device to connect your apps with one another in all sorts of ways… without you even breaking a sweat.

The potential here is huge. Now, the big question: Will app developers buy in?

App self-esteem

On a device operated by Apple Intelligence and full of apps all tricked out with App Intents, what does “using an app” mean, anyway? I doubt we’ll ever stop wanting to scroll through lists and tap things and perform other tactile acts on our phones, even if we can drive a lot of work with a voice assistant. But if you’re an app developer, there’s a real risk of feeling like your app is no longer a destination for users but a box of parts that will occasionally be rummaged through by the system while it’s passing through to a different destination. That’s scary.

I do think that if Apple’s idea of an orchestrated future comes into being, the importance of any individual app might be reduced. But there’s also huge potential here for different apps to work together, for them to amplify each other so that they’re far more important for individual users than they could possibly be now.

For some apps, though, the future might be more about supplying great actions and data sources to the big Apple Intelligence soup—presumably for a subscription price. It seems a little bit weird, but the future of iOS apps might be services that just tie into Apple Intelligence, with little to no interface of their own. I don’t know if you could even call them apps.

That’s all years away, but I think it’s already time for app developers to consider what makes their apps unique and useful in a world where a smart machine-learning model is taking user commands and then getting results. If competitors offer the same functionality, developers should presumably be motivated to offer App Intents so that the system will use their apps, and those apps will become crucial, irreplaceable portions of a user’s workflow.

For some apps, that might mean becoming less of a bright, shiny interface in the face of users, and more of a behind-the-scenes workhorse that just makes life better. Developers who are used to having the spotlight may be disquieted by that notion, but it doesn’t mean that their software doesn’t have value—and won’t be able to command an appropriate subscription price.

Existential threats

Apps and the App Store have been very, very good for Apple. I’m sure the company wants that to continue for as close to forever as possible.

But if the future of the devices that keep Apple in business is about to be transformed by AI models that orchestrate our software to do our bidding, there’s a serious risk that it could disrupt Apple’s standing in all of those device categories. That’s why the rise of AI is clearly an existential threat for Apple and why the company spent so much time talking about AI features at WWDC 2024.

It’s worth keeping that fundamental existential threat in mind. While it’s easy to say that apps and the App Store helped make Apple what it is, and therefore, the company will always be inclined to maintain the status quo… the fact is that if Apple thinks the best way for it to survive and flourish is to atomize app functionality into App Intents and drive it all with a user-driven AI assistant, it’ll do that. And it won’t think twice about it, no matter the consequences for app developers.


  1. Because it was the 90s, it was an AppleScript that connected Eudora and FileMaker Pro. 

