PSA: Apple Silicon Users: Update ffmpeg

TLDR: If you run a Mac using Apple Silicon, update ffmpeg to dramatically speed up your encodes.

Late last year I traded in my beloved iMac Pro for an "iMac Pro Portable": a 14" MacBook Pro. I cannot overstate how much I love this machine, and when paired with the LG UltraFine 5K, it is actually a really phenomenal setup. I have nearly all the benefits of the iMac Pro, but I can pick it up and move it without a ridiculous carrying case.

When I got the machine, one of the first things I tried, for speed-testing purposes, was an ffmpeg encode. As I've mentioned before, I use ffmpeg constantly, either directly, or via Don Melton’s amazing other-transcode tool.

Given this was my first Apple Silicon Mac, and I sprung for the M1 Max, I was super excited to see how fast transcodes were going to be on this new hotness.

I was sorely disappointed. It seemed that encodes were capped at a mere 2× — about 60fps.

As it turns out, I wasn’t the only one giving this some serious 🤨. I was pointed to an issue filed against the aforementioned other-transcode repository. Many other people thought this looked really weird.

This was first reported in early November, and then about two months ago, the also-excellent HandBrake found a fix, which turned out to be really simple — a very special boolean needed to be set.

Thankfully, about a month ago, ffmpeg was patched as well. The fix was eventually integrated into ffmpeg version 5.0, which was released on 14 January.

However, I install most things using Homebrew, and the Homebrew formula wasn’t updated. Using a neat trick that Homebrew supports, I was able to grab and build the latest (read: HEAD) version of ffmpeg and get fast encodes. However, if you’re not inclined to deal with stuff that fiddly, as of yesterday, the ffmpeg formula has been updated.
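For the curious, the trick in question is Homebrew's `--HEAD` flag, which builds a formula from the tip of its source repository rather than from a released bottle. A sketch of what that looks like (check `brew info ffmpeg` to confirm the formula still offers a head build):

```shell
# Build ffmpeg from source at HEAD (slow, but picks up unreleased fixes)
brew install --HEAD ffmpeg

# Once the formula catches up, return to released builds:
brew uninstall ffmpeg
brew install ffmpeg
```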

So, if you do any transcoding using ffmpeg on your Apple Silicon Mac, now is the time to do a brew upgrade.

Before the new ffmpeg goodies, I topped out encodes at about 2×. Now, using the latest-and-greatest released version of ffmpeg, I am getting quite a bit more than that. On a test mpeg2video file that I recorded using Channels, I was able to get a full 10×. 🎉
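For reference, a test encode along these lines looks something like the following. The file names are placeholders, and `h264_videotoolbox` selects the Apple hardware encoder that benefits from the fix; this is my sketch, not the exact command I used:

```shell
# Hardware H.264 encode via VideoToolbox; input and output names are hypothetical
ffmpeg -i recording.mpg -c:v h264_videotoolbox -b:v 8000k -c:a copy output.mp4
```

Watch the `speed=` figure ffmpeg prints while encoding; that is where the jump from roughly 2× to 10× shows up.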


 

This week I joined my pals Ben Chapman and “Doctor Don” Schaffner on their fascinating podcast Food Safety Talk. I know; I am surprised as well.

Nevertheless, on this episode, our conversation is wide-ranging and quite entertaining. We begin with Ben playing 20 questions, flailing about, as he tries to figure out who the special guest is. 😆 After that, we discuss my tastes in food, and how close I am to, well, accidentally poisoning myself.

The conversation is kind of all over the place, and frankly, those are some of the most fun times I have as a guest. Even if you’re not interested in Food Safety Talk, Ben and Don also host Risky or Not, which is a short podcast evaluating the really poor choices of their audience. It’s both quite a fun listen and also mildly horrifying.


 

From the this-may-only-be-useful-to-me department, I recently did the stereotypical programmer thing: I put off what I should have been doing by instead automating something that bothered me.

One of the many perks of SwiftUI is how easy it is to preview your designs/layouts. In fact, you can even do so across multiple devices:

import SwiftUI

struct SomeView: View {
    var body: some View {
        Text("Hello, world")
    }
}

struct SomeViewPreviews: PreviewProvider {
    static var previews: some View {
        Group {
            SomeView()
                .previewDevice("iPhone 13 Pro")
            SomeView()
                .previewDevice("iPhone SE (2nd generation)")
        }
    }
}

The above code would present you with two renders of SomeView: one shown on an iPhone 13 Pro, and one on an iPhone SE.

The problem with this, however, is you need to know the exact right incantation of device name in order to please Xcode/SwiftUI. For some devices, like iPhone 13 Pro, that’s pretty straightforward. For others, like iPhone SE (2nd generation), it’s less so.

The good news is, you can get a list of installed simulators on your machine using this command:

xcrun simctl list devices available

It occurred to me, if I can easily query Xcode for the list of installed simulators, surely I can then convert that list into a Swift enum or equivalent that I can use from my code? Hell, I can even auto-generate this enum every time I build, in order to make sure I always have the latest-and-greatest list for my particular machine available.

Enter installed-simulators. It’s a small Swift command-line app that does exactly that. When run, without any parameters, it spits out a file called Simulators.swift. That file looks like this:

import SwiftUI

enum Simulator {
    static let iPhone8 = PreviewDevice(rawValue: "iPhone 8")
    static let iPhone8Plus = PreviewDevice(rawValue: "iPhone 8 Plus")
    /* ...and so on, and so on... */
}
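To regenerate the file on every build, one approach (my sketch, not the project's documented setup) is an Xcode Run Script build phase that invokes the tool from wherever you want the output to land:

```shell
# Xcode Run Script build phase; assumes installed-simulators is on your PATH,
# writes Simulators.swift to the current directory, and that Sources/Generated
# is where you keep generated code (a hypothetical path)
cd "${SRCROOT}/Sources/Generated"
installed-simulators
```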

That makes it super easy to test your SwiftUI views by device, without having to worry about the precisely correct name of the device you’re thinking of:

struct SomeViewPreviews: PreviewProvider {
    static var previews: some View {
        Group {
            SomeView()
                .previewDevice(Simulator.iPhone13Pro)
            SomeView()
                .previewDevice(Simulator.iPhoneSE2ndgeneration)
        }
    }
}

Naturally, I prefer this over the alternative.

Since I’m so used to wielding a hammer, I wrote this as a Swift command-line app rather than a Perl script. Sorry, John. Also, I know effectively nothing about releasing apps of any sort for macOS, so goodness knows if this will work on anyone else’s desk but mine.

Nevertheless, I’ve open-sourced it, and you can find it — as well as some more robust instructions — over at GitHub.


 

In developing for Apple platforms — particularly iOS — there are many arguments that are disputed with the same fervor as religion or politics. Storyboards are evil, or they’re the only way to write user interfaces. AutoLayout is evil, or it’s the only reasonable way to write modern UI code. SwiftUI is ready for production, or it’s merely a new/shiny distraction from real code. All Swift files should be linted, or only overbearing technical leads bother with linting.

Today, I’d like to dip my toe into the pool by discussing linting. Linters are tools that inspect your source code, catching obvious errors and enforcing a style guide. As a silly example, both of these pieces of Swift code are valid:

struct Person {
    var id: Int? = nil
}

struct Person {
    var id: Int?
}

A linter would have an opinion about the above. It may encourage you to use the bottom version — var id: Int? — because the explicit initialization to nil is redundant: an Optional is implicitly nil by default.

SwiftLint

In my experience, the first time I really ran into a linter was when I started doing Swift development full-time in 2018. The team I was on dabbled lightly in using SwiftLint, the de facto standard linter for Swift projects. The tough thing about SwiftLint is that it has a lot of rules available — over 200 as I write this. Many of those rules are… particular. It’s very easy to end up with a very opinionated set of rules that are trying to change your code into something unfamiliar.

Trust me when I say some of these rules are quite a lot to swallow. One of my absolute “favorite” rules is trailing_whitespace, which enforces absolutely no whitespace at the end of a line of code. 🙄

Even if you want to embrace SwiftLint in your project, you then need to parse through 200+ rules in order to figure out what they are, whether or not they’re useful, and how many times your own existing code violates each one. No thank you.

swiftlint-autodetect

Enter the new project swiftlint-autodetect by professional grump (but actually good guy) Jonathan Wight. This project — as with all clever ideas — is brilliant in its simplicity. When run against an existing codebase, it runs SwiftLint with every rule enabled, and then figures out which rules are not violated at all. The rules your code already passes are then output as a ready-to-use SwiftLint configuration file.

swiftlint-autodetect generate /path/to/project/directory

The generated file will have all currently known SwiftLint rules included, but the ones where violations would occur are commented out, so they are ignored by SwiftLint. Using this file, you can integrate SwiftLint into your build process, painlessly, without having to change your code to meet some weird-ass esoteric linting requirement. 😗 👌🏻
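For illustration, the generated file looks something along these lines. This is a hypothetical excerpt: the rule names are real SwiftLint rules, but the exact layout comes from swiftlint-autodetect itself.

```yaml
# .swiftlint.yml (hypothetical excerpt)
only_rules:
  - colon
  - comma
  - empty_count
  # - trailing_whitespace  # this codebase has violations, so it stays disabled
```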

Increasing Coverage

I’m very nearly ready to release a new project, and I’m doing some cleanup and refactoring to get ready for its release. I decided to add SwiftLint support using swiftlint-autodetect, and then I wanted to investigate which SwiftLint rules I was violating, but perhaps shouldn’t be.

Conveniently, swiftlint-autodetect has another trick up its sleeve: it can also output a count of the number of violations for each rule. Additionally, it will mark with an * which rules you can instruct SwiftLint to fix automatically using swiftlint --fix. That makes it easy to start at the bottom of the resulting list, where the counts are low, and use that as a guide to slowly layer on more and more SwiftLint rules, as appropriate.

swiftlint-autodetect count /path/to/project/directory

This is exactly what I’ve done: I started with the automatically generated file, and then went up the list that count generated to turn on rules that seemed to be low-hanging fruit. Some I decided to leave disabled; some I decided to enable and bring my code into compliance.

y tho

Thanks to the combination of these two subcommands on swiftlint-autodetect, I am now linting my source code before every build. I’ve fixed some inconsistencies that I know would bother me over time. I’ve also found a couple spots where taking a slightly different approach can help improve performance/consistency.

Because I’m an individual developer — not despite it — I find it’s important to use the tools available to help keep my code clean, correct, and working. Though I don’t deploy every tool under the sun, I do think having some combination of CI, unit testing, and linting is a great way to use computers as a bit of the parachute that your peer developers would normally provide.


In the middle of 2017 — roughly four and a half years ago — I went on a search for a monitor to pair with my MacBook Pro while I was at work. I wanted something that was “retina” quality — which means roughly 220 PPI.

While not terribly scientific, the rules of thumb I landed on were:

  • No more than 24" at 4K
  • No more than 27" at 5K

Back in 2017 — one thousand six hundred and sixty five days ago, as I write this — I compiled a list of options. At the time there were five: two Dells, one run-of-the-mill LG, and the two LG UltraFine monitors.

The Lineup

1665 days later, let me revise my findings:

  • Budget Option: LG 24UD58-B 24" 4K Monitor — ~$300
    This is what I used, eventually two-up, at work. In 2017. The panel is unremarkable, but for developers, it’s more than serviceable. Honestly, I liked this setup. Even two-up, it’s cheaper than the next available option.

  • Moderate Option: LG UltraFine 4K — ~$700
    A fancier version of the above, which includes the option of daisy-chaining a second 4K display. It also has a small USB-C hub internal to it, offering more connectivity options.

  • Deluxe Option: LG UltraFine 5K — ~$1300
    The bigger sibling of the LG UltraFine 4K, but without the option of daisy-chaining a second display. It, too, has a small USB-C hub. I recently bought one secondhand, and the rumblings are true: the stand is straight-up trash, and the monitor itself is unreliable on the best of days. When it does work, though, it’s great!

  • Ridiculous Option: Apple Pro Display XDR — ~$5000 without a stand
    Apple’s too-fancy-for-its-own-good option. It costs $5,000 without a stand. To add their official stand is another $1,000. Oh, and if you want the fancy nano-texture coating, that’s another $1,000. So, all-in, the Pro Display XDR is $7,000. Which is, charitably, absurd.

The above is the entire lineup. That’s it. Four options. Three of which existed 1665 days ago.

In [effectively] 2022, there are four options for retina-quality monitors to attach to your Mac.

If there are others, please let me know, as I’d love to share them. I know that others have existed at some time in the past — like the Dells I featured in the first version of this post — but they’ve been discontinued and/or are not readily available here in the States.

The Future

Last month I bought a 14" MacBook Pro equipped with an M1 Max. This machine is as fast as my iMac Pro, but considerably more portable. The battery life is by no means infinite, but it’s enough to work without power for several hours without stressing. MagSafe is back — finally — and the keyboard is both reliable and excellent. I have an HDMI port for when I travel, and an SD card reader. The M1 Pro and Max MacBook Pros are possibly the best machines Apple has released since I’ve been observing the company, for about fifteen years.

Furthermore, the display on this machine is phenomenal. My buddy Jason Snell in particular has been banging this drum for a while: on any other machine, the displays alone would be the star of the show. They’re “true” pixel-doubled retina, they have wide color gamut, they’re backlit by mini-LED, and they sport a fast refresh rate of 120 Hz. They’re nearly perfect.

Why can’t we have this in an external monitor?

Granted, refreshing roughly 15 million pixels 120 times per second requires an immense amount of data/bandwidth, so maybe that isn’t possible. However, everything else about these panels should be possible in an external monitor. Even if we have to suffer through a pedestrian 60 Hz. Why can’t we have an Apple-produced 5K screen that has mini-LED and wide color?
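To put rough numbers on that claim, here is a back-of-the-envelope calculation (my arithmetic, not any published spec), assuming the panel's 5120×2880 resolution and 10 bits per color channel:

```shell
# Rough uncompressed bandwidth for a 5120x2880 panel at 120 Hz with
# 10 bits per channel (30 bits per pixel). My arithmetic, not an official spec.
bits_per_second=$(( 5120 * 2880 * 30 * 120 ))
echo "$bits_per_second"   # 53084160000, i.e. about 53 Gbit/s
```

For comparison, that comfortably exceeds the roughly 32 Gbit/s of a DisplayPort 1.4 link, unless you reach for compression.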

Why can’t we have an option between the unreliable $1300 LG 5K and the $5000+ XDR?

Over the last year or two, Apple has been doing a phenomenal job of filling the holes in their product line. For my money, the completely embarrassing monitor situation is the lowest-hanging fruit. By a mile.

Take my money, Apple. Give me a monitor made for professionals that don’t do video editing for a living. Please.

The non-UltraFine 4K and the XDR items linked above are affiliate links.


 

Judge if you must, but one of my favorite places to vacation — money be damned — is Walt Disney World. I’ve said many times it’s much like a geographical manifestation of Christmas: it’s possible to be in a bad mood while you’re there, but it takes some work. The last time I was there was for Declan’s fifth birthday, back in October 2019, or approximately 14 years ago.

Naturally, a lot has changed at Disney World since then. It shut down for a few months due to the pandemic, and has been reopening slowly since. Like many corporations, and many places, Disney is using this as an opportunity to press the proverbial reset button. New policies and techniques abound!

In this episode of Starport75, I spent some time with my friends Chris and Glenn discussing all the changes Disney has put in place since I was last there, in the before-times. In a very Siracusian fashion, Glenn had compiled a plethora of notes, but we were only able to get through the highlights.

Nonetheless, I enjoy going on Starport75 tremendously, in no small part because I feel like I have such great chemistry with both hosts. I think you’ll enjoy the episode — especially if you’re also a Disney fan that hasn’t been to Disney World in a long time.


 

Only on Clockwise can you discuss stereos, monitors, NFTs, and robot vacuums… all in the span of 30 minutes. Today, that’s exactly what I did with Shelly Brisbin, Dan Moren, and Mikah Sargent.

In this episode, you can hear moments such as me telling Mikah to get off my lawn, and witness the birth of a gift exchange between the four of us. Interestingly, 3/4 of us will be buying each other the same gift.

Clockwise is always fun and fast. There’s never a bad time to start listening.


 

Don’t take my complete forgetfulness to write this post as an indication of a lack of enthusiasm. I’m trying desperately to get a new app I’m working on across the finish line, and as such, I’ve been pretty distracted. 🤪

Nearly two weeks ago, I had the utmost pleasure of returning to visit with my Canadian pals Angelo and Brian on their podcast Double Density. Despite their completely incorrect opinion on bagels, Brian and Angelo are good guys, and I enjoyed chatting with them again.

On this episode, we discussed how wrong they are about bagels, my thoughts on ordering my new MacBook Pro, audiophiles, and FUD about COVID. I surely made somebody mad when recording this, but at least the three of us had a lot of fun in the process. 😇


 

This past Saturday, I joined my pals Kathy Campbell, Jean MacDonald, James Thomson, and Jason Snell to discuss the final episode of season two of Ted Lasso.

Without spoilers, this season of Ted Lasso hit me differently than the last. I really enjoyed talking with this fine panel of people about this final episode. The discussion helped me interpret the episode differently, and gave me a different perspective of the season at large. The focus of this episode of Football is Life is just the last episode — not the whole season — but we naturally had some broader conversations as well.

Ted Lasso remains one of my favorite shows of all time, and doing these wrap-up shows is immensely fun. If you’re a Ted Lasso super-fan like me, you’d surely enjoy Football is Life.


 

Football is, as they say, life. And though this season of Ted Lasso has been somewhat divisive, I’m overjoyed to have appeared on another episode of The Incomparable’s rewatch podcast, Football is Life.

On this episode, I join host Jason Snell, and fellow panelists Kelly Guimont and James Thomson to discuss Headspace. It’s a varied and long conversation about heel turns, parental issues, and running an enjoyable but meaningful comedy-drama.

I’ve praised Ted Lasso until I’m blue in the face. It’s a phenomenal show, and I’m extremely stoked to see where the bottom half of season two takes us.