The Man in the High Castle

Count on good old Tom Warren to come up with the take that the MacBook Neo is actually good news for Microsoft in an article titled, and I shit you not, The MacBook Neo is the best thing to happen to Windows in years. In the article, Warren argues that Microsoft has always responded well to competition, particularly competition from Apple.

If there’s one thing I know about Microsoft after covering the company for more than 20 years, it’s that it will always respond to a competitive threat. Apple’s MacBook Air convinced Microsoft and Intel to launch thin and light laptops with the Ultrabook initiative, the iPad pushed Microsoft to create its own tablet hardware, and the threat of Chromebooks saw Microsoft try to match the security and simplicity of ChromeOS with S mode versions of Windows.

Ultrabooks are indeed a good example of the PC industry responding to a product released by Apple, but as far as I am aware, the initiative to flood the market with off-brand MacBook Airs didn’t come from Microsoft. It came from Intel. I can’t speak to Windows S mode, which I honestly confused with the now-defunct Windows 10 S operating system, but I would like to know how many buyers it actually swayed away from Chromebooks. My best-case guess is that it maybe kept a few schools on PCs.

This brings me to Microsoft’s tablet hardware, something that Warren doubled down on in a later paragraph.

It’s no coincidence that Microsoft announced these Windows changes around the same time as the MacBook Neo. Just like how former CEO Steve Ballmer held up an HP tablet PC days before Apple’s original iPad announcement in 2010, Microsoft has always closely followed Apple, be it with the Zune or making Windows Mobile a touch-friendly OS.

Tablet PCs running Windows existed throughout the aughts, where they struggled to find a market, mostly because they sucked. The HP-branded tablet PC that Ballmer showed at CES just before the iPad was announced was part of a half-baked “slate PC” initiative, a line of tablets from different manufacturers all running Windows 7. Slate PCs turned out to be such a nothingburger that they are basically a footnote in the Tablet PC Wikipedia entry.

His other two examples are just as laughable. Zunes might have been fine MP3 players, but the first ones were released five years after the iPod and less than three months before the iPhone was announced. Speaking of which, Microsoft’s response to the iPhone was a textbook example of what not to do. Ballmer first dismissed it out of hand, and his company followed by releasing a few lackluster updates to a struggling Windows Mobile OS before launching an entirely new Windows Phone platform three years later. I actually found Windows Phone compelling at the time, but it was too little, too late, and it was ultimately canned.

Oh and let’s not forget the unbridled success that’s been Microsoft’s transition to ARM, which as Warren points out, started before Apple decided to finally get its act together and follow suit.

I’m still surprised that I use an Arm-powered Windows laptop daily, especially as Microsoft’s original Windows on Arm launch with the Surface RT was such a mess that it led to the company pivoting to focus on Intel-based Surface Pro devices for many years instead. It wasn’t until 2019 that Microsoft got serious about Windows on Arm again. Microsoft’s Windows chief, Pavan Davuluri, was a key engineer in the effort with Qualcomm to make Windows on Arm a reality. Davuluri worked on the custom Surface processors with AMD and Qualcomm, and helped launch the impressive Surface Pro X model ahead of Apple’s transition to its own silicon.

I’ve already written a whole piece on The Verge’s years-long and absurdly positive coverage of Windows on ARM. Setting that aside, Warren is basically arguing that Microsoft responded well to Apple Silicon and also that it didn’t need to respond to Apple Silicon because it was already responding to Apple Silicon before Apple Silicon existed to demand a response. Fourth-dimensional check and mate.

There’s at least an argument that Microsoft did eventually, after almost a decade, get Windows on ARM right, even if it isn’t exactly a shining example of how well the company responds to threats. On the other hand, arguing with a completely straight face that Tablet PCs, the Zune, and Windows Mobile are examples of how Microsoft has successfully responded to competition from Apple is inane. His perspective makes me truly wonder if Tom Warren is writing to us from an alternate universe, one where Microsoft has been a bastion of competition replete with a slew of attractive and popular products people love to use.

It’s the only thing that makes sense.

50 Years Without Apple

In light of Apple’s recent 50th anniversary, I have been thinking about what the world would look like had the company never existed. Specifically, I’ve been thinking about Ridley Scott’s famous 1984 commercial. While absurdly melodramatic, the ad points to a future antithetical to the one both Steves at Apple were trying to create, one where everyone has their own personal computer that empowers them to do whatever they want. While the book 1984 specifically portrays a dystopian communist future, the 1984 ad doesn’t have anything to do with communism. It’s about IBM and the future that IBM represented. That’s not speculation. It’s how Steve Jobs framed the ad when presenting it before introducing the Macintosh. His framing concludes with:

IBM wants it all, and is aiming its guns at its last obstacle to industry control, Apple. Will Big Blue dominate the entire computer industry? The entire information age? Was George Orwell right?

Despite IBM’s machine being called the “IBM Personal Computer”, Jobs saw an IBM-controlled future as one with computers that weren’t actually personal. That sounds ridiculous, but it doesn’t take much imagination, even here in our Apple universe, considering how the PC market actually developed during its formative years in the 1980s. In our universe, IBM never did come to dominate the personal computer industry because Microsoft did, and it did so almost exactly how IBM would have.

Most people used DOS in the 80s not because they preferred it, but because the companies that employed them did. At first, institutions chose PCs running DOS because they were the computers made by IBM, a serious company with “business machines” in its name and decades-old relationships with Fortune 500 companies. Not long after, however, most companies eschewed IBM for cheaper off-brand PC clones. Corporate purchasers didn’t buy COMPAQs, HPs, or Tandys because they thought they were better than IBMs, but because they were cheaper and ran the same operating system. Within a span of about five years, Microsoft commoditized the PC market, which drove down prices and made DOS and Windows the de facto operating systems of increasingly off-brand computers that only a fraction of their users actually chose to begin with. When most people did willingly choose PCs in the 90s, it was because they were cheaper and because that’s what they or someone they knew had at work.

This isn’t to say that I think Microsoft represented the kind of threat Jobs warned about with IBM, but rather to illustrate that the early PC market was defined by big businesses and other institutions that represented the vast majority of the market and quickly consolidated on a single platform. On the contrary, Microsoft has always been very pro-personal computer. Also, and most notably, no company screwed IBM better than Microsoft. That said, I think it’s fair to question whether Microsoft would have been able to pull one over on IBM in a universe without Apple, as it did in ours.

Given that Commodore and Tandy also existed and released personal computers in the late 70s, let’s assume personal computers were inevitable and that they would sell enough to garner the attention of IBM. The question then becomes whether any of the other personal computer makers would pose the same threat to IBM that Apple did. If not, it’s easy to imagine a world where IBM doesn’t embrace off-the-shelf components or get hoodwinked by Microsoft for the sake of speed to market. Eventually Big Blue would still make something like the PC, which would still run something like DOS. The difference is that it would remain IBM-controlled and sold at a price only institutions could justify. Eventually everyone would still use one of these computers, but they would be owned by the employer and only used for work or research, just like mainframes. Laptops would still happen because institutions would embrace their employees being able to take their microcomputers home and on the road for work. There would also without a doubt still be microprocessor-based consumer devices, but they would be more appliance-like. Think the entertainment systems shipped with cars, e-readers, smart TVs, and video game consoles.

I’m also confident the internet was inevitable, but it would function entirely differently. Without personal computers, the internet would be defined by the needs of the various institutions that bought and controlled microcomputers and by the makers of consumer appliances, sort of like how Amazon Kindles have free cellular connectivity, but only for buying e-books from Amazon. Speaking of cellular, mobile phones would also certainly happen, but there would be little reason for something like an iPhone or Android to exist in a world where computers are by definition work machines. We actually caught a brief glimpse of this. RIM dominated the mobile market because its BlackBerry became the device of big business. While a few people did buy personal BlackBerrys, most bought whatever extremely limited consumer device their carrier promoted at a given moment. Speaking of carriers, without the context of personal computing or a company like Apple, would anything have broken the carriers’ grip on the mobile phone market?

Could this darkest of timelines have actually happened had Apple not existed? I really think so, but I also think a more likely and less dark Apple-less outcome would still result in personal computers, just crappier and less tasteful ones. And while it’s certainly possible that a timeline without Apple wouldn’t be that much different from our own, I’m sure glad to live in one where Apple does exist and has been a driving force for keeping computers personal for 50 years.

Now let’s see if they can keep it up for 50 more.

visionOS’s Killer App

Gruber, on whether Apple’s upcoming March “experience” might have something to do with F1:

A reader pointed out that the 2026 Formula 1 season starts in Australia on March 8. You will recall from October that Apple TV is now the exclusive broadcast partner for F1 in the U.S. Apple is already dabbling with live immersive sports broadcasting for VisionOS with a limited slate of Lakers games this season. If they have something planned for streaming F1 races live on Vision Pro, with some level of immersion, March 4 would be a pretty good date to demo that experience to the media.

I love the idea that Apple’s March “experience” might include an F1 Vision Pro tie-in. I could see live sports ultimately being that platform’s killer app, in that watching a television broadcast would immediately feel second-rate the moment someone watches their favorite sport in visionOS, the same way WAP browsers immediately felt second-rate once people used Safari on the iPhone. That’s not to say live sports would make the Vision Pro sell like hotcakes and become as ubiquitous as the iPhone. I still think VR is a relatively niche medium. Rather, I think being the undisputed best way to watch sports would give Apple Vision a very clear consumer reason for existing. I’d wager some people already spend more than $3500 to watch live sports at home. That may sound silly, but it’s also extremely relatable. People get it when other people seek out the best version of something they enjoy. Someone spending $3500 on a Vision Pro to get the best version of pointless tech capabilities, by comparison, just sounds silly.

The Greatest Trick The Devil Ever Pulled

John Gruber made an excellent and compelling argument for avoiding certain terms when describing the hateful, nationalist, far-right party currently running the United States, and instead suggests we all do what we can to ensure the names they call themselves end up just as stigmatized.

Our goal should not be to make fascist or Nazi apply to Trump’s movement, no matter how well those rhetorical gloves fit his short-fingered disgustingly bruised hands. Don’t call Trump “Hitler”. Instead, work until “Trump” becomes a new end state of Godwin’s Law.

The job won’t be done, this era of madness will not end, until we make the names they call themselves universally acknowledged slurs.

“MAGA” and “Trumpist”, for sure. “Republican”, perhaps. Make those names shameful, deservedly, now, and there will be no need to apply the shameful names of hateful anti-democratic illiberal failed nationalist movements from a century ago. We need to assert this rhetoric with urgency, make their names shameful, lest the slur become our name — “American”.

I mostly agree.

We should use caution when calling Trump Hitler or liken his heinous coalition of racists and billionaires to the Nazi party. While inarguably very similar, these assholes are so obviously not the same as those assholes that anyone saying they are risks immediately losing all credibility with most audiences.

Where I disagree is that I think we should still call them fascists. Not only is that who they are, I believe that avoiding the term for over a half century has played some part, however small, in how we got here. Disallowing the term created a world where, rhetorically, modern fascism couldn’t exist, which in turn dulled our ability to perceive it as a threat to the extent that fascists could basically hide in plain sight. I wrote about this a couple of times on social media. Here’s a version I wrote in response to Casey Liss on Mastodon:

I’ve been thinking that American culture doesn’t have antibodies for fascism because of our internalized narrative (fascists are the jerks we beat up in WW II), and also because those jerks were so openly evil and so soundly defeated that we’ve largely retired the label “fascist” when describing autocratic governments. We let the ruling party in China self identify as communist and vaguely call Putin a dictator even though both seem more fascist than not to my layman eyes.

Instead we’ve been given a healthy dose of anti-communism vaccines. This isn’t entirely nefarious because communist autocracy was truly the much bigger ongoing threat to our freedoms coming out of WWII. That said, how many times have you seen a good idea (*cough* universal healthcare) get soundly rejected because “that’s socialism” and compare that to the mainstream’s non-reaction to actual fascists currently attempting to do terrible fascists things today.

That’s not to say I think we should only or even primarily call them fascists. We absolutely should do what we can to help Trump and his MAGA cronies own-goal themselves until their names also become synonymous with evil and cartoon villainy, but we should also keep calling them what they are. They are fascists. We should get comfortable using that label, not just for the fascists we’re dealing with today, but as practice so we can label tomorrow’s fascists before they rise to power and infamy.

The Mysterious Case of Columbo’s Missing Clue

My wife and I are big fans of the series Columbo, starring Peter Falk. The show had two separate runs. The first and much better original series aired on NBC in the 1970s, while the latter, still-sort-of-good revival aired on ABC about a decade later. Like every other program made for television before the age of widescreen (16:9) TVs, both series were made for fullscreen (4:3). Now that we live in the age of widescreen television, the people and companies responsible for distributing these older shows have a choice. They can either release them in their original 4:3 aspect ratio, with black bars on the left and right (known as a pillarbox), or they can crop a bit off the top and bottom to make their aspect ratio match that of modern 16:9 TVs.

The folks distributing Columbo did both.

The Blu-rays of the original 1970s series are presented in 4:3 with a pillar box while those of the revival are cropped for 16:9. So which is better?

Like I said, my wife and I are big fans of the series, and so we want to see all of it as originally intended. So should you, and not just with Columbo, but with any older program made for fullscreen television. That may sound dangerously close to some niche opinion of a film nerd, akin to insisting on a subtly different cut that was only released in Japan, but I assure you watching popular American television as it was made is neither niche nor nerdy.

Alternative cuts and the like are largely extracurricular. They are added material for those who want more than what was originally released for a general audience. Cropping removes from the original work, often in ways that ruin whole scenes. Take this scene from Columbo Goes to College, an episode from the later revival. Our titular detective gestures toward a potential clue, which is completely cropped out in the 16:9 Blu-ray. (You can compare the cropped shot to the original by clicking the toggle below.)

The scene then progresses to further emphasize the importance of this one item, which tragically remains cropped out on the Blu-ray.

Those of us who lived in the era of fullscreen televisions were accustomed to watching movies cropped from their intended theatrical aspect ratios, but that was different because there really was no good alternative. By my recollection, most TVs in the 80s and 90s had somewhere around a 20-inch diagonal. You could watch the un-cropped theatrical version of movies, but doing so shrank your already small picture by almost a third, down to the equivalent of a measly 14-inch TV. Watching theatrical releases at home really was for nerds, who either accepted the tiny viewing experience or could afford absurdly expensive (and often absurdly heavy) large TVs. Additionally, studios truly invested in making their cropped home video releases as good as possible. The very existence and prominent use of pan and scan, the derided technique that used virtual panning to show critical parts of a given scene that would otherwise be cropped, is proof of just how much effort went into making cropped films watchable on fullscreen TVs.

The conditions that made pan and scan cropped movies the least worst option last century don’t exist today. The same investment isn’t being made by companies cropping fullscreen content to match widescreen televisions. The folks behind that cropped Columbo scene could have done a tilt and scan, the vertical counterpart to pan and scan, but they didn’t because the cost and effort likely wouldn’t be worth it given their budget. That seems reasonable to me, but it raises the question: so long as you’re avoiding the cost involved with making a crop less bad, why not avoid the cost and the bad entirely by not cropping at all? Investing anything to crop old media in the era of ubiquitous giant flatscreen TVs makes no sense. On a TV with a 55-inch diagonal, the viewable portion of an un-cropped fullscreen show still has a roughly 45-inch diagonal. Never mind fullscreen classic television, have you heard anyone complain when modern shows with wider aspect ratios are streamed with a letterbox, where the black bars are on the top and bottom? I haven’t. Even many commercials have black bars of some form or another these days, and no one cares because their TVs are big enough that they still see everything without issue.
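For the curious, the geometry here is easy to check. This is just a back-of-the-envelope sketch, assuming a standard 16:9 panel and 4:3 content that fills the display’s full height:

```python
import math

def pillarboxed_diagonal(tv_diagonal, tv_ratio=(16, 9), content_ratio=(4, 3)):
    """Diagonal of content shown pillarboxed (full height, bars on the
    sides) on a display with a wider aspect ratio than the content."""
    tw, th = tv_ratio
    cw, ch = content_ratio
    # The display's height follows from its diagonal and aspect ratio.
    height = tv_diagonal * th / math.hypot(tw, th)
    # Pillarboxed content uses the full height; its width follows from
    # its own aspect ratio.
    width = height * cw / ch
    return math.hypot(width, height)

# A 4:3 show pillarboxed on a 55-inch 16:9 TV still fills a ~45-inch diagonal.
print(round(pillarboxed_diagonal(55), 1))
```

In other words, the pillarboxed image loses about a quarter of the panel’s width but keeps its entire height, so the picture you actually see is still far larger than any fullscreen TV these shows originally aired on.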

Outside of a few exceptions, no one should be cropping anything today. Not for streaming, and certainly not for Blu-ray releases that only nerds buy. According to some very light research, it sounds like the company responsible for the Blu-rays of my cropped Columbo, Kino Lorber, couldn’t really do anything because the cropping happened when the original film was rescanned by Universal. I worry this was (and perhaps still is) a common practice to make old television shows look more modern, where the only high-resolution versions have already been cropped at the source, and so cropped is free while making un-cropped versions requires investment. I hope that’s not the case, and if it is, that studios will pony up the de minimis amount of dough to redo the scans in their original aspect ratio. In the meantime, my wife and I will keep watching the DVD versions of these episodes of Columbo. They may be lower resolution, but at least they show the whole thing.

Kid Gloves: Linux Edition

Nathan Edwards, in a Verge article titled “I replaced Windows with Linux and everything’s going great”1, writes:

First challenge: My mouse buttons don’t work. I can move the cursor, but can’t click on anything. I try plugging in a mouse (without unplugging the first one), same deal. Not a major issue; I can get around fine with just the keyboard.

“Not a major issue.”

As a computer nerd, I get the appeal of trying an alternative OS, and I’m sure Linux in particular has gotten better over the years, but framing replacing Windows with Linux as basically frictionless does a disservice to readers when, under most circumstances and for most people, it’s anything but. This is a perfect example of the kind of “kid gloves” coverage I’ve criticized The Verge for previously. The site goes easy on some project or product out of a bias toward novelty or a fear of punching down. I’m not arguing Nathan should be particularly harsh toward CachyOS, especially since this wasn’t a review. He absolutely should write about the fun and benefits of trying an alternative OS, but he should also address the downsides in a way that doesn’t sugarcoat them.


  1. I’ve considered the possibility that the title is sarcastic and that the whole post is satire, but I really don’t think it is. Edwards’s other posts don’t read to me as satire and the conclusion to this post seems quite earnest: “I’m sure I’ll run into plenty of fun problems soon enough. But the first few days have been great.” 

Fine! Have Your Windows in iPadOS, but Remember This House Was Built on Apps!

Anil Dash recently made an observation on Mastodon that included a screenshot of an iPadOS 26 setup screen with a prompt showing three options: Full Screen Apps, Windowed Apps, and Stage Manager. While each option had a description, Stage Manager was curiously the only one that ended with “more…” As someone who has written quite a bit about Stage Manager’s lacking lexicon, that “more” was like catnip. I had to see what exactly the “more” linked to. After all, it had been over three years and several iPadOS versions since I was pissing and moaning about the lack of coherent terminology, and maybe this “more” would bring me somewhere full of nouns that aptly describe the pieces that make up a Stage Manager. I didn’t have an iPad anymore, but I surmised and later confirmed that “more” linked to the “Organize windows with Stage Manager on iPad” article in the iPad User Guide.

Having read that article, I can now confirm that the situation with Stage Manager’s terminology remains a complete mess, but the “Organize apps into groups in Stage Manager” section in particular highlights a more fundamental lexical problem I hadn’t fully recognized in my earlier writing: a lack of distinction between apps and windows.

Organize apps into groups in Stage Manager

You can create groups of windows for specific tasks or projects by dragging app windows from the recent apps list into the center of the screen. You can also drag app icons from the Dock into the center of the screen to add them to a group.

You can return a window to the recent apps list by dragging it to the left. If you minimize a window in an app group, the app window is removed from the group and returned to the recent apps list.

Right from the get-go, we have a header that begins with “organize apps in groups” followed immediately by a sentence that reads “you can create groups of windows.” That’s bad, but it doesn’t hold a candle to that section’s pièce de résistance of incoherence that is its last sentence.

If you minimize a window in an app group, the app window is removed from the group and returned to the recent apps list.

While the writing could be improved, the greatest technical writers in the world could only do so much when Apple itself seemingly can’t agree on whether the things Stage Manager manages are “apps” or “windows” (or comedy option three: “app windows”). The problem extends beyond Stage Manager into the new Windowed Apps mode in iPadOS 26, which has a single list of instructions to “Resize a window”, “Move a window”, and “Manage an app window”. Why can’t they all just use “window”?

My guess as to why this section incoherently conflates apps and windows is related to Apple’s historical reluctance to embrace anything deemed too computer-y in iPadOS prior to Stage Manager. The reasoning goes like this: people love iPads because they are not like computers, and computers have windows, ergo the iPad can’t have windows. The iOS 9 press release in which Apple first introduced iPad multitasking with Split View and Slide Over does not contain the word “window” nor any derivation of it. This “windows are bad” mentality wasn’t just coming from Apple. It was also largely echoed by iPad enthusiasts of the time. Here’s an excerpt from Federico Viticci’s iOS 9 review of the then-new mechanisms:

Split View isn’t like window management on a desktop: Apple wanted to eschew the complexities of traditional PC multitasking and windowing systems, and by rethinking the entire concept around size classes and direct touch manipulation, they’ve largely achieved this goal. They have created new complexities specific to iOS and touch, but it’s undeniable that Slide Over and Split View are far from the annoyances inherent to window management on OS X. The iPad is fighting to be a computer and Split View epitomizes this desire, but it doesn’t want to inherit the worst parts of multitasking from desktop computers.

By the time multitasking was revamped for iPadOS 15, the word “windows” could not be avoided. It was, however, mentioned only once.

Users now have quick access to the Home Screen when using Split View, and using the new shelf, they can also multitask with apps that have multiple windows like Safari and Pages.

This is a great example of why it’s so foolish to avoid the word “windows”. Not only is “apps that have multiple windows” easy and accurate, it also doesn’t require any additional explanation because anyone interested in multitasking on iPadOS already knows what a window is. I’m not arguing Apple should always use “windows” in place of “apps”. No one would ever need to distinguish between apps and hypothetical full-screen windows in iOS because they’d be functionally one and the same. The most recent iPadOS 18 documentation for Split View actually does a good job delineating when to use one versus the other.

On iPad, you can work with multiple apps at the same time. Open two different apps, or two windows from the same app, by splitting the screen into resizable views.

Using “apps” is fine when apps are one-to-one with windows. I’d argue the same is colloquially true even in macOS. “Go to Safari” is a clear instruction when there is only one Safari window open. That said, “windows” becomes unavoidable the instant an app has more than one.

This brings me back to that paragraph from Stage Manager’s documentation. Unlike Split View, Stage Manager is very clearly a window manager. Every app is represented by one or more windows. Here’s the same documentation with a few changes1 that embrace this reality.

Organize windows into groups in Stage Manager

You can create groups of windows from multiple apps for specific tasks or projects by dragging them from the recents list into the center of the screen. You can also add windows to a group by dragging app icons from the Dock into the center of the screen.

You can return a window to the recents list by dragging it to the left. If you minimize a window in any group, the window is removed from that group and returned to the recents list.

While far from perfect (that second paragraph in particular), describing how Stage Manager groups windows is far more coherent and accurate because that’s what Stage Manager actually does. Stage Manager still needs a better lexicon across the board, but a good start would be to finally let windows be windows.


  1. The edits I made fall into one of three buckets: I replaced “apps” with “windows”, replaced “recent apps list” with “recents list” (it’s still not a great term, but it’ll do for the sake of argument), and reiterated that windows belong to apps. 

A Gastric Band Approach to Desktop Clutter

Matt Birchler recently did a nice YouTube video praising the open-source tile manager AeroSpace. There is a lot to love about AeroSpace right from the get-go. While I definitely wouldn’t call it Mac-assed, since AeroSpace is for very advanced users and developers who are comfortable with text configuration files, my sense is that AeroSpace is made by people who care deeply about the Mac. Part of this is how deeply Nikita Bobko and others supporting the project have considered how AeroSpace works. Unlike Stage Manager, AeroSpace has a clearly defined lexicon that is deployed across copious documentation. Just as Stage Manager tries to ease window management with more automatic windowing, AeroSpace tries to ease tile management with more automatic tiling. Users don’t need to manually arrange windows into a grid with AeroSpace. It just happens. From there, users can configure and tweak how windows are automatically tiled to their heart’s content.

The idea of composing sets of apps into custom workspaces is particularly appealing to me. I find super apps (apps that are effectively multiple apps in one) mildly off-putting. Most IDEs are super apps for everything related to software development. They contain version control managers, terminals, text editors, and so on. While many IDEs do all of these things reasonably well, their super app paradigm is effectively a usability barrier to using other apps that might otherwise be better. Instead of using Visual Studio Code, for example, I could imagine a world where I have a coding workspace consisting of BBEdit, Terminal, and Git Tower. The added benefit of this sort of multi-app workspace is that I could still use the individual apps a la carte or mix in other apps as needed.

While I’m sure many people started using tile managers to build custom workspaces, I suspect many more turned to them as a way to address desktop clutter. I’ve written a couple of times already about the modern-day problem of desktop clutter. Thanks to an abundance of resources (memory, processor speed, etc.), people can open more and more stuff (apps, windows, tabs) without slowing their computer. Because their computer never slows, there is no intrinsic mechanism that forces users to ever close anything, and so more and more stuff stays open until they get overwhelmed. Tile managers reduce desktop clutter by discouraging or preventing overlapping windows, which physically constrains the number of windows that can be visibly open at a given moment.

Maximally constrained user interfaces are impossible to clutter. Lots of people were drawn to the original iPad specifically because it really was just a big iPhone and they loved their iPhone in part because it was too simple to mess up. I get it. I prefer small hotel rooms when traveling solo because larger ones just give me more space to lose track of my stuff, but small hotel rooms are not without trade-offs. The tiny room at the boutique hotel I stay at when visiting my employer’s New York office isn’t really suitable for anything beyond sleeping and hygiene. Even working on a laptop for any extended period of time would be a challenge. A hotel room that is too small to do work is great until you want to do work. An iPad that doesn’t feel like a computer is also great until you want to do computer-y things with it.

Desktop tile managers are definitely not maximally constrained like the original iPad, nor are they anywhere near as constrained as previous versions of iPadOS with split view and slide over1, but they are, by their very nature, constrained. Beyond physically constraining the number of windows visible at a given time, tile managers also constrain the shapes of those windows. I wrote2 about this when reviewing Stage Manager on macOS Ventura:

Personally, I’ve found gridded UIs to be lacking ever since I first saw Windows 1.0. When using grids, apps’ sizes, and more importantly their aspect ratios, are dictated by the other windows that are on screen. Say you want to work across a spreadsheet, word processor, and a browser. Not only do all of these apps benefit from a large amount of screen real estate, both the word processor and browser need a minimum size just to be entirely visible. In a gridded UI, some or all apps would have to compromise on space and you would have to scroll constantly within each app to see the entirety of what’s being worked on.

People who use tile managers undoubtedly have strategies for mitigating this inelegance. Tiles don’t have to be even subdivisions of a given display so you can, for example, adjust the width of a word processing tile to that of its document. AeroSpace in particular seems to offer lots of tools for technical users to hone their tiled workspaces. That said, the very nature of tiling according to the size of the display limits what adjustments are even possible.

Part of me feels bad that I used AeroSpace as the jumping-off point to argue against tile managers. Its makers have clearly put a praiseworthy amount of thought and care into how it works, but it was seeing such a well-considered tile manager that solidified my thinking. AeroSpace is the most appealing tile manager I’ve seen on the Mac, and while I’m certain there are plenty of workflows where AeroSpace shines, being physically constrained by an always-on tile manager that dictates the number and shape of open windows would feel like a gastric band to me. Rather than wholly automatic systems like AeroSpace or Stage Manager, the best solution to desktop clutter for me remains the only slightly automatic one: regularly closing the stuff I open.


  1. Some have lamented that iPadOS 26’s new windows-based multitasking is too computer-y, and while maybe Apple could have somehow continued to support the old-style split view and slide over alongside it, I don’t see how anyone could make iPadOS meaningfully less constrained using only split view and slide over. 
  2. I used the term “gridded UIs” in my Stage Manager review to encompass not just tile managers, but also iPadOS style split screens. In hindsight, “tile manager” is a better term that would have worked just as well. 
Introducing MacMoji Stickers

Disguised Face MacMoji

In olden days computers had just two emotions. They either happily worked as expected or were too sad to boot. Computers today have a range of emotions, but tragically have no way to express them. That’s why our scientists developed MacMoji using the latest in sticker technology, so your favorite computers can finally convey exactly how they feel.

As a parent working a full-time job, I regularly seek out creative outlets that I can manage in my limited spare time. MacMoji started out as one such outlet. The idea of combining more modern emoji with the classic startup icon was too fun to resist. I could gradually illustrate one or two, share them on the Relay Member Discord, and iterate as needed based on feedback. At some point, a Relay member suggested I turn these illustrations into an Apple Messages sticker pack. The idea was such a no-brainer that I did just that…eventually. You can now buy the sticker pack for just $0.99 over at the App Store. You’ll find the F.A.Q. over here, which addresses why these stickers aren’t available in the EU. My thanks to the Relay member community for their feedback and encouragement in creating these stickers.

Thank Fucking God Steve Jobs Took Over the Macintosh Project

There are two arguments some use to try and diminish Steve Jobs’s contribution to the Macintosh, and by extension all of desktop computing. The first, and by far the most common, is to say that Jobs merely copied what he saw at Xerox PARC. While there is absolutely no doubt that both the Macintosh and NeXT grew out of what Jobs saw (he even said as much), the system at PARC was akin to an automobile before the Model T: unrefined, complicated, and not user friendly. This is why Microsoft copied mostly from the Macintosh (and later NeXTSTEP) rather than anything from Xerox to make Windows.

The second and more obscure argument is that Jobs merely took over the Macintosh project from Jef Raskin, the suggestion being that Raskin invented the computer and that Jobs swooped in to take credit at the last second. What that argument omits is that Raskin’s vision for the Macintosh was very different from what shipped. How different? Raskin didn’t want a mouse. Here’s Andy Hertzfeld over at the venerable Folklore.org:

He was dead set against the mouse as well, preferring dedicated meta-keys to do the pointing. He became increasingly alienated from the team, eventually leaving entirely in the summer of 1981, when we were still just getting started, and the final product [utilized] very few of the ideas in the Book of Macintosh.

We know this is true not just because of Hertzfeld’s own account, but also because Raskin did eventually get to release his computer in 1987: the Canon Cat. Sure enough, it indeed didn’t use a mouse and instead relied on what were called “leap keys”. Cameron Kaiser recently went into detail about how the Cat worked.

Getting around with the Cat requires knowing which keys do what, though once you’ve learned that, they never change. To enter text, just type. There are no cursor keys and no mouse; all motion is by leaping—that is, holding down either LEAP key and typing something to search for. Single taps of either LEAP key “creep” you forward or back by a single character.

Special control sequences are executed by holding down USE FRONT and pressing one of the keys marked with a blue function (like we did for the setup menu). The most important of these is USE FRONT-HELP (the N key), which explains errors when the Cat “beeps” (here, flashes its screen), or if you release the N key but keep USE FRONT down, you can press another key to find out what it does.

Needless to say, the Cat wasn’t the huge success Raskin hoped it would be.