Some Thoughts After Moving from macOS to Linux

It has been two weeks and change since I moved from my MacBook Pro to a ThinkPad P52s running Linux. Now that I have some real use time under my belt I thought it would be appropriate to give some of my thoughts.

The first thing I’d like to note is that I have no regrets moving to Linux. My normal routine is to use my laptop at work and whenever I’m away from home and use another computer at home (because I’m too lazy to pull my laptop out of my laptop bag every night). The computer I was using at home was a 2010 Mac Mini. I replaced it with my old MacBook Pro when I got my ThinkPad. I realized the other day that I haven’t once booted up my MacBook Pro since I got my ThinkPad. Instead I have been pulling my ThinkPad out of its bag and using it when I get home. At no point have I felt that I need macOS to get something done. That’s the best testament to the transition that I can give. That’s not to say Linux can do everything that macOS can. I’m merely fortunate in that the tools I need are either available on Linux or have a viable alternative.

I’m still impressed with the ThinkPad’s keyboard. One of my biggest gripes about the new MacBooks is their ultra-slim keyboards. I am admittedly a bit of a barbarian when it comes to typing. I don’t so much type as bombard my keyboard from orbit. Because of this I like keys with a decent amount of resistance and depth. The keyboard on my 2012 MacBook Pro was good but I’m finding the keyboard on this ThinkPad to be a step up. The keys offer enough resistance that I’m not accidentally pressing them (a problem I have with keyboards offering little resistance) and enough depth to feel comfortable.

With that said, the trackpad is still garbage when compared to the trackpad on any MacBook. My external trackball has enough buttons that I can replicate the gestures I actually used on the MacBook, though, and I still like the TrackPoint enough to use it when I don’t have an external mouse connected.

Linux has proven to be a solid choice on this ThinkPad as well. I bought it with Linux in mind, which means I didn’t get features that weren’t supported in Linux such as the fingerprint reader or the infrared camera for facial recognition (which is technically supported in Linux but tends to show up as the first camera so apps default to it rather than the 720p webcam). My only gripe is the Nvidia graphics card. The P52s includes both an integrated Intel graphics card and an Nvidia Quadro P500 discrete graphics card, which isn’t supported by the open source Nouveau driver. In order to make it work properly, you need to install Nvidia’s proprietary driver. Once that’s installed, everything works… except secure boot. In order to make the P52s boot after installing the Nvidia driver, you need to go into the BIOS and disable secure boot. I really wish there were a laptop on the market with a discrete AMD graphics card that fit my needs.

One thing I’ve learned from my move from macOS to Linux is just how well macOS handled external monitors. My P52s has a 4k display but all of the external monitors I work with are 1080p. Having different resolution screens was never a problem with macOS. On Linux it can lead to some rather funky scaling issues. If I leave the built-in monitor’s resolution at 4k, any app that opens on that display looks friggin’ huge when moved to an external 1080p display. This is because Linux scales up apps on 4k displays by a factor of two by default. Unfortunately, scaling isn’t done per monitor by default so when the app is moved to the 1080p display, it’s still scaled by two. Fortunately, a 4k display is exactly twice the resolution of a 1080p display in each dimension so changing the built-in monitor’s resolution to 1080p when using an external display is an easy fix that doesn’t make everything on the built-in display look blurry.
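
For anybody who wants to script that workaround, here’s a rough sketch of the idea. It assumes an X11 session where xrandr is available, and the output names eDP-1 and HDMI-1 are stand-ins for the built-in panel and the external monitor (check xrandr --query for the actual names on your machine):

```python
#!/usr/bin/env python3
"""Rough sketch: run the built-in 4k panel at 1080p whenever an external
1080p monitor is plugged in, so apps look the same size on both screens.
Assumes an X11 session with xrandr; the output names below are guesses."""

import subprocess

INTERNAL = "eDP-1"   # assumed name of the built-in 4k panel
EXTERNAL = "HDMI-1"  # assumed name of the external 1080p monitor

def connected_outputs():
    """Return the set of outputs xrandr reports as connected."""
    query = subprocess.run(["xrandr", "--query"], capture_output=True, text=True)
    return {line.split()[0] for line in query.stdout.splitlines() if " connected" in line}

if EXTERNAL in connected_outputs():
    # External display attached: drop the internal panel to 1080p so the
    # 2x scaling mismatch described above never comes into play.
    subprocess.run(["xrandr", "--output", INTERNAL, "--mode", "1920x1080"], check=True)
else:
    # No external display: go back to the native 4k mode.
    subprocess.run(["xrandr", "--output", INTERNAL, "--mode", "3840x2160"], check=True)
```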

I’ve been using Gnome for my graphical environment. KDE seems to be the generally accepted “best” desktop environment amongst much of the Linux community these days. While I do like KDE in general, I find that KDE application interfaces are inconsistent whereas Gnome applications tend to have fairly consistent interfaces. I like consistency. I also like that Gnome applications tend to avoid burying features in menus. The choice of desktop environment is entirely subjective but so far my experience using Gnome has been positive (although knowing that I have a ship to which I can jump if that changes is reassuring).

As far as applications go, I used Firefox and Visual Studio Code on macOS and they’re both available on Linux so I didn’t have to make a change in either case. I was using Mail.app on macOS so I had to find a replacement e-mail client. I settled on Geary. My experience with Geary has been mostly positive although I really hate that there is no way, at least that I’ve found, to quickly mark all e-mails as read. I used iCal on macOS for calendaring and Gnome’s Calendar application has been a viable replacement for it. My search for a replacement for my macOS task manager, 2Do, on Linux hasn’t been as successful. I’m primarily using Gnome’s ToDo application but it lacks a feature that is very important to me: repeating tasks. I use my task manager to remind me to pay bills. When I mark a bill as paid, I want my task manager to automatically create a task for next month. 2Do does this beautifully. I haven’t found a Linux task manager that can do this though (and in all fairness, Apple’s Reminders app doesn’t do this well either). I was using Reeder on macOS to read my RSS feeds. On Linux I’m using FeedReader. Both work with Feedbin and both crash at about the same rate. I probably shouldn’t qualify that as a win but at least it isn’t a loss.

The biggest change for me has probably been moving from VMware Fusion to Virtual Machine Manager, which utilizes libvirt (and thus KVM and QEMU). Virtualizing Linux with libvirt is straightforward. Virtualizing Windows 10 wasn’t straightforward until I found the SPICE Windows guest tools. Once I installed that guest tool package, the niceties I came to love about VMware Fusion, such as a shared pasteboard and the guest’s resolution automatically changing when the virtual machine window is resized, worked. libvirt also makes it dead simple to set a virtual machine to automatically start when the system boots.
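
In case it’s useful to anybody else, that autostart flag is also exposed through libvirt’s Python bindings, so it can be scripted instead of clicked through virt-manager. Here’s a minimal sketch; the qemu:///system URI and the guest name win10 are assumptions, so substitute whatever your virtual machine is actually called:

```python
#!/usr/bin/env python3
"""Minimal sketch: flip a guest's autostart flag with the libvirt Python
bindings (the same setting virt-manager and `virsh autostart` expose).
The connection URI and the domain name "win10" are assumptions."""

import libvirt  # provided by the libvirt-python package

conn = libvirt.open("qemu:///system")  # connect to the local system-level hypervisor
dom = conn.lookupByName("win10")       # assumed guest name

dom.setAutostart(1)  # start this guest whenever libvirtd comes up at boot
print(f"{dom.name()} autostart is now {bool(dom.autostart())}")

conn.close()
```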

One major win for Linux over macOS is software installation. Installing software from the Mac App Store is dead simple but installing software from other sources isn’t as nice of an experience. Applications installed from other sources have to provide their own update mechanism. Most have taken the road of including their own embedded update capabilities. While these work, they can usually only run when the application is running so if you haven’t used the application for some time, the first thing you end up having to do is update it. Lots of packages still don’t include automatic update capabilities so you have to manually check for new releases. Oftentimes these applications are available via MacPorts or Homebrew. On the Linux side of things almost every software package is available via a distro’s package manager, which means installation and updates are handled in one place. I prefer this over the hodgepodge of update mechanisms available on macOS.
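
As a rough illustration of what that looks like on Fedora (the distribution I ended up on), here’s a minimal sketch built around dnf’s documented exit codes. It assumes dnf and sudo are available; other distros would swap in their own package manager:

```python
#!/usr/bin/env python3
"""Rough sketch: check for and apply updates for everything installed from
the repos in one place. Assumes Fedora with dnf and sudo available."""

import subprocess

# Per the dnf man page, `check-update` exits with 100 when updates are
# available, 0 when everything is current, and 1 on error.
check = subprocess.run(["dnf", "check-update", "--quiet"])

if check.returncode == 100:
    # One command updates every package from the repos -- browser, editor,
    # libraries, kernel -- no per-application updaters to chase down.
    subprocess.run(["sudo", "dnf", "upgrade", "--refresh", "-y"], check=True)
elif check.returncode == 0:
    print("Everything is already up to date.")
else:
    print(f"dnf reported a problem (exit code {check.returncode}).")
```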

So in closing I’m happy with this switch, especially since I didn’t have to drop over $3,000 on a laptop to get what I wanted.

Lockdown

I’ve always treated mobile devices differently than desktops and laptops. Part of this is because mobile devices tend to be restrictive. Most mobile devices are closed platforms that don’t allow you to load a different operating system. And while you can load custom firmware on a few mobile devices, it often requires some hackery. It appears as though I jumped ship at the proper time though because Apple is bringing the restrictive nature of iOS to its desktops and laptops:

Apple’s MacBook Pro laptops have become increasingly unfriendly with Linux in recent years while their Mac Mini computers have generally continued working out okay with most Linux distributions due to not having to worry about multiple GPUs, keyboards/touchpads, and other Apple hardware that often proves problematic with the Linux kernel. But now with the latest Mac Mini systems employing Apple’s T2 security chip, they too are likely to crush any Linux dreams.

[…]

Update 2: It looks like even if disabling the Secure Boot functionality, the T2 chip is reportedly still blocking operating systems aside from macOS and Windows 10.

I know a lot of people have expressed the feeling that buying an Apple computer and installing Linux on it is rather foolish. After all, you can buy a computer for far less that is fully supported by Linux (Linux support on Apple computers has always been a bit hit or miss). I mostly agree with that attitude. However, there comes a time in every Mac’s life where Apple drops support for it in macOS. While it’s possible to coax macOS onto a lot of unsupported Macs, there are also quite a few older Macs where installing a modern version of macOS is impossible. In such cases Linux offers an option to continue using the hardware with an operating system that has current security updates.

I prefer to repurpose old computers rather than throw them away. Being able to install Linux on older Macs has always been desirable to me. For me, losing that ability severely limits the functional lifetime of a Mac. Moreover, I worry that the limitations put into place by the T2 chip will make installing future versions of macOS on these machines impossible when they fall out of support.

Secure Boot functionality is a good security measure. However, Secure Boot on a vast majority of PCs can be disabled (in fact Microsoft requires that Secure Boot can be disabled for logo certification). Even if you don’t disable it, many Linux distributions have signed bootloaders that work with Secure Boot (unfortunately, even these signed bootloaders don’t work on Apple computers with a T2 chip). So it is possible to provide boot-time security while supporting third-party operating systems. Apple is simply choosing not to do so.

Deafening the Bug

I know a lot of people who put a piece of tape over their computer’s webcam. While this is a sane countermeasure, I’m honestly less worried about my webcam than the microphone built into my laptop. Most laptops, unfortunately, lack a hardware disconnect for the microphone and placing a piece of tape over the microphone input often isn’t enough to prevent it from picking up sound in whatever room it’s in. Fortunately, Apple has been stepping up its security game and now offers a solution to the microphone problem:

Little was known about the chip until today. According to its newest published security guide, the chip comes with a hardware microphone disconnect feature that physically cuts the device’s microphone from the rest of the hardware whenever the lid is closed.

“This disconnect is implemented in hardware alone, and therefore prevents any software, even with root or kernel privileges in macOS, and even the software on the T2 chip, from engaging the microphone when the lid is closed,” said the support guide.

The camera isn’t disconnected, however, because its “field of view is completely obstructed with the lid closed.”

While I have misgivings about Apple’s recent design and business decisions, I still give the company credit for pushing hardware security forward.

Implementing a hardware cutoff for the microphone doesn’t require something like Apple’s T2 chip. Any vendor could put a hardware disconnect switch on their computer that would accomplish the same thing. Almost none of them do though, even if they include hardware cutoffs for other peripherals (my ThinkPad, for example, has a built-in cover for the webcam, which is quite nice). I hope Apple’s example encourages more vendors to implement some kind of microphone cutoff switch because being able to listen to conversations generally allows gathering more incriminating evidence than merely being able to look at whatever is in front of a laptop.

Jumping Ship

I’ve been running Apple computers for more than a decade now. While I really like macOS, anybody who knows me knows that I’ve been less than enthusiastic with the direction Apple has taken on the hardware front. My biggest gripe with Apple hardware is that it can no longer be serviced. My 2012 MacBook Pro is probably one of the easiest laptops that I’ve ever worked on. The entire back pops off and all of the frequently replaced parts are readily accessible. Part of the reason that I have been able to run that computer since 2012 is because I’ve been able to repair or upgrade components when necessary.

I usually run my laptops for four or five years. I’ve been running that MacBook Pro for six years. I was ready to upgrade last year but Apple had no laptops that appealed to me so I decided to wait a year to see if the situation would improve. When Apple announced its 2018 MacBook Pro line, it had everything I hated. All of the components, including the RAM and SSD, are soldered to the main board. Since the MacBook Pro line can no longer be upgraded, I’d have to order the hardware that I’d want to use for the next four or five years up front, which would cost about $3,200. Worse yet, when something broke (all components fail eventually), I’d have to pay Apple an exorbitant fee to fix it. And if that weren’t bad enough, the 2018 MacBook Pro still has that god awful slim keyboard. While Apple has attempted to improve the reliability of that keyboard by including a rubber membrane under the keys, typing on it is, at least in my opinion, a subpar experience.

I also have some concerns about Apple’s future plans. One of my biggest worries is the rumor that Apple will transition its Macs to ARM processors. ARM processors are nice but I rely on virtualized x86 environments in my day to day work. If Apple transitioned to ARM processors, I wouldn’t be able to utilize my x86 virtual environments (virtualization turns into emulation when the guest and host architectures differ and emulation always involves a performance hit and usually a lot of glitches), which means I wouldn’t be able to do my work. I’m also a bit nervous about the rumors that Apple is planning to make app notarization mandatory in a future macOS release. Much of the software I rely on isn’t signed and probably never will be. Additionally, building and testing iOS software is a pain in the ass because even test builds need to be signed before they’ll work on an iOS device (anybody who has run into code signing problems with Xcode will tell you that resolving those problems is often a huge pain in the ass) and I don’t want to bring that “experience” to my other development work. While I would never jump ship over rumors, when there are already reasons I want to jump ship, rumors act as additional low-level incentives.

Since Apple didn’t have an upgrade that appealed to me and I’m not entirely comfortable with the rumors about the direction the company may be going, I decided to look elsewhere. I’ve been running Linux in some capacity for longer than I’ve been running Apple computers. Part of my motivation for adopting macOS in the first place was that I wanted a UNIX system on my laptop (Linux on laptops back then was a dumpster fire). So when I decided to jump ship, Linux became the obvious choice, which meant I was looking at laptops with solid Linux support. I also wanted a laptop that was serviceable. I found several solid options and narrowed it down to a Lenovo ThinkPad P52s because it was certified by both Red Hat and Ubuntu, sanely priced, and serviceable (in fact Lenovo publishes material that explains how to service it).

Every platform involves trade-offs. With the exception of Apple’s trackpad, every trackpad that I’ve used has been disappointing. The ThinkPad trackpad is no different in this regard. However, the ThinkPad line includes a TrackPoint, which I’ve always preferred as a mobile mouse solution to trackpads (I still miss Apple’s trackpad gestures though). There also isn’t a decent to-do application on Linux (I use 2Do on both iOS and macOS and nothing on Linux is comparable) and setting up Linux isn’t anywhere near as streamlined as setting up a Mac (which involves almost no setup). With that said, I usually use an external trackball so the quality of the trackpad isn’t a big deal. My to-do information syncs with my Nextcloud server so I can use its web interface when on my laptop (and continue to use 2Do on my iPhone). And since I chose a certified laptop, setting up Linux wasn’t too difficult (the hardest part was setting up Nvidia’s craptastic Linux driver).

The upside to the transition, besides gaining serviceability, is first and foremost the cost. The ThinkPad P52s is a pretty cost effective laptop and I found a 20 percent off coupon code, which knocked the already reasonable price down further. Since neither the RAM nor the SSD in the P52s are soldered to the main board, I was able to save money by buying both separately and installing them when the computer arrived (which is exactly what I did with all of my Macs). In addition to the hardware being cheaper, I was also able to save money on virtualization software. I use virtualization software every day and on macOS the only decent solution for me was VMware Fusion (Parallels has better Windows support than Fusion but no serious Linux support, which I also require). Fedora, the Linux distribution I settled on (I run CentOS on my servers so I opted for the closest thing that includes more cutting-edge software), comes with libvirt installed. After spending a short while familiarizing myself with the differences between VMware and libvirt, I can say that I’m satisfied with libvirt. It’s better in some regards, worse in others, and pretty much the same otherwise (in terms of user experience, at least; underneath it’s far different).

I also gained a few things on the hardware side. The P52s has two USB-C and two USB-A (all USB 3) ports. My MacBook Pro only had two USB-A ports and the new MacBook Pros only have USB-C ports. All of my USB devices use USB-A so I’d need a bunch of dongles if I didn’t have USB-A ports (not a deal breaker but annoying nonetheless). In addition to being a very good mobile keyboard, the P52s keyboard also has a 10-digit keypad, which no Mac laptop currently has. Like USB-A ports, the lack of a 10-digit keypad isn’t a deal breaker in my world but its inclusion is always welcome. If that weren’t enough, the keyboard also includes honest-to-god function keys instead of a Touch Bar (as somebody who uses Vim a lot, the lack of a physical escape key is annoying).

My transition was relatively painless because I keep all of my data on my own servers. I didn’t have to spend hours trying to figure out how to pull data off of iCloud so I could use it on Linux. All I had to do was log into my Nextcloud instance and all of my calendar, contact, and to-do information was synced to the laptop. The same was true of my e-mail. In anticipation of my move I also changed password managers from 1Password to a self-hosted instance of Bitwarden (1Password is overall a better experience but it lacks a native Linux app so I’d have been stuck with moving to a subscription plan to utilize a browser plugin that would deliver the same experience as Bitwarden). Keeping your data off of proprietary platforms makes moving between platforms easier. Likewise, keeping your data in open standards makes moving easier. I primarily rely on text files instead of word processor files (I use Markdown or LaTeX for most formatting) and most of my other data is stored in standardized formats (PNG or JPEG for images, ePub or PDF for documents, etc.).

Although I won’t give a final verdict until I’ve used this setup for a few months, my initial impressions of moving from macOS to Linux are positive. The transition has been relatively painless and I’ve remained just as productive as I was on macOS.

Good News from the Arms Race

Security is a constant arms race. When people celebrate good security news, I caution them from getting too excited because bad news is almost certainly soon to follow. Likewise, when people are demoralized by bad security news, I tell them not to lose hope because good news is almost certainly soon to follow.

Earlier this year news about a new smartphone cracking device called GrayKey broke. The device was advertised as being able to bypass the full-disk encryption utilized by iOS. But now it appears that iOS 12 renders GrayKey mostly useless again:

Now, though, Apple has put up what may be an insurmountable wall. Multiple sources familiar with the GrayKey tech tell Forbes the device can no longer break the passcodes of any iPhone running iOS 12 or above. On those devices, GrayKey can only do what’s called a “partial extraction,” sources from the forensic community said. That means police using the tool can only draw out unencrypted files and some metadata, such as file sizes and folder structures.

Within a few months I expect the manufacturer of the GrayKey device to announce an update that gets around iOS’s new protections and within a few months of that announcement I expect Apple to announce an update to iOS that renders GrayKey mostly useless again. But for the time being it appears that law enforcers’ resources for acquiring data from a properly secured iOS device are limited.

Installing macOS Mojave on Unsupported Macs

I’m back, I’m married, and I’m behind the news cycle. Although being behind the news cycle should be treated as a state of bliss, it’s not a great place to be when you use news articles for blog material. It’s going to take me a day or two to catch up.

One project I did tackle over my extended vacation is getting macOS Mojave installed on my computers. Mojave dropped official support for several Macs but just because Apple doesn’t officially support a platform doesn’t mean it can’t be used. I see no reason to throw away perfectly functional hardware, and I enjoy receiving security updates. Because of that, I ended up playing with dosdude1’s Mojave Patcher.

The patcher originally didn’t work for me because all of my computers have FileVault enabled and the version I first downloaded had a bug where it couldn’t mount FileVault containers. That was before I left for my wedding. Fortunately, by the time I got back a new version that fixed that bug was released.

I used the patcher to install Mojave on my 2010 Mac mini 4,1 and my 2010 MacBook Pro 5,4. Installation on my Mac mini was smooth. I haven’t had any major problems with it. Installation on my MacBook Pro was another matter. I should note beforehand that the MacBook Pro in question has a bad memory controller. One of the two memory banks has a 50/50 chance of working when I power the system on. If it doesn’t work, I only have access to half of my memory. That may be why I have to reset the NVRAM every time I power the system on in order to get it to boot (if I don’t reset the NVRAM, I get the dreaded no symbol when I start the computer).

If you’ve been happily running an older Mac and found out that Mojave won’t install, try dosdude1’s Mojave Patcher. It doesn’t work on every old Mac (a list of supported Macs can be found at the link) but it does work for most of the 64-bit Intel Macs.

Upgrading Your Unsupported Mac to Mojave

macOS Mojave was released last night. As is often the case with major macOS updates, Mojave dropped support for a slew of older platforms. But just because Apple doesn’t support installing Mojave on older computers doesn’t mean that it can’t be installed. dosdude1 has a utility that allows you to install Mojave on a lot of officially unsupported Macs.

I’ve used his patch utility to get High Sierra on my unsupported 2010 MacBook Pro and haven’t had any issues. I attempted to upgrade my 2010 Mac Mini to Mojave last night but discovered that the utility currently has a problem decrypting encrypted APFS containers. dosdude1 is aware of this problem and will hopefully be able to figure out what is going on so it can be fixed. However, if your older Mac isn’t utilizing APFS or FileVault 2 (which it really should be utilizing), you should be good to go.

You Are Responsible for Your Own Security

One of the advertised advantages of Apple’s iOS platform is that all software loaded onto iOS devices has to be verified by Apple. This so-called walled garden is meant to keep the bad guys out. However, anybody who studies military history quickly learns that sitting behind a wall is usually a death sentence. Eventually the enemy breaches the wall. Enemies have breached Apple’s walls before and they continue to do so:

In a blog post entitled “Location Monetization in iOS Apps,” the Guardian team detailed 24 applications from the Apple iOS App Store that pushed data to 12 different “location-data monetization firms”—companies that collect precise location data from application users for profit. The 24 identified applications were found in a random sampling of the App Store’s top free applications, so there are likely many more apps for iOS surreptitiously selling user location data. Additionally, the Guardian team confirmed that one data-mining service was connected with apps from over 100 local broadcasters owned by companies such as Sinclair, Tribune Broadcasting, Fox, and Nexstar Media.

iOS has a good permission system and users can prevent apps from accessing location information but far too many people are willing to grant access to their location information to any application that asks. If a walled garden were perfectly secure, users wouldn’t have to worry about granting unnecessary permissions because the wall guards wouldn’t allow anything malicious inside. Unfortunately, the wall guards aren’t perfect and malicious stuff does get through, which brings me to my second point.

What happens when a malicious app manages to breach Apple’s walled garden? Ideally it should be immediately removed but the universe isn’t ideal:

Adware Doctor is a top app in Apple’s Mac App Store, sitting at number five in the list of top paid apps and leading the list of top utilities apps, as of writing. It says it’s meant to prevent “malware and malicious files from infecting your Mac” and claims to be one of the best apps to do so, but unbeknownst to its users, it’s also stealing their browser history and downloading it to servers in China.

In fairness to Apple, the company did eventually remove Adware Doctor from its app store. Eventually is the key word though. How many other malicious apps have breached Apple’s walled garden? How long do they manage to hide inside of the garden until they are discovered and how quickly do the guards remove them once they are discovered? Apparently Apple’s guards can be a bit slow to react.

Even in a walled garden you are responsible for your own security. You need to know how to defend yourself in case a bad guy manages to get inside of the defensive walls.

Designed by Apple in California

Designed by Apple in California is a tagline the company uses to add a little prestige to its Chinese-manufactured electronics. In addition to designing electronics the company also designs its own stores. However, when people in California design stores, they often overlook environmental issues that are rare there but common elsewhere, such as ice and snow:

Apple’s new flagship retail store in Chicago, the one with a MacBook-shaped rooftop, is nothing short of an architectural marvel. At least, that’s how some news reports put it when the store opened back in October. Beyond standing out among the less inspired buildings of the downtown Chicago area, the new Apple Store also happens to be very poorly thought through considering its thin roof now has dangerous icicles hanging perilously over public walkways.

Designed by Apple in a state that doesn’t have to deal with arctic bullshit. As a Minnesotan I can’t help but laugh at this.

Apple isn’t the first company to run into this problem and it won’t be the last. It’s too easy to take architecture for granted. An architect in California can easily overlook the effects harsh winters will have on their building. An architect in Minnesota can easily overlook the effects earthquakes will have on their building. If you’re tasked with designing a building that will be built in another region, it might be a good idea to contact some architects in that area and ask them about environmental issues they have to design around.

Physical Access Isn’t Necessarily Game Over

I swear Apple fanboys are some of the dumbest people on the planet. Quite a few of them have been saying, “If an attacker has physical access, it’s game over anyways,” as if that statement makes the root user exploit recently discovered in High Sierra a nonissue.

At one time that statement was true. However, today physical access is not necessarily game over. Look at all of the trouble the Federal Bureau of Investigation (FBI) has been having with accessing iOS devices. The security model of iOS actually takes physical access into account as part of its threat modeling and has mechanisms to preserve the integrity of the data contained on the device. iOS requires all code to be signed before it will install or run it, which makes it difficult, although far from impossible, to insert malicious software onto iOS devices. But more importantly iOS encrypts all of the data stored in flash memory by default. Full-disk encryption protects against physical access both by preventing an attacker from getting any usable data from a disk and by preventing them from altering the data on the disk (such as writing malware directly to the disk).

macOS has a boot mode called single user mode, which boots the computer to a root command prompt. However, if a firmware password is set, single user mode cannot be started without entering the firmware password. The firmware password can be reset on machines with removable RAM (resetting the password requires changing the amount of RAM connected to the mainboard) but most of Apple’s modern computers, some iMacs being the exception, have RAM modules that are soldered to the mainboard.

Physical access is especially dangerous because it allows an attacker to insert malicious hardware, such as a key logger, that would allow them to record everything you type, including your passwords. However, that kind of attack requires some amount of sophistication and time (at least if you want the malicious hardware to be difficult to detect), which is where the real problem with High Sierra’s root exploit comes in. The root exploit required no sophistication whatsoever. Gaining root access only required physical access (or remote access if certain services were enabled) to an unlocked Mac for a few seconds. So long as an attacker had enough time to open System Preferences, click one of the lock icons, and type in “root” for the user name a few times, they had complete access to the machine (from there they could turn on remote access capabilities to maintain their access).

Attempting to write off this exploit as a nonissue because it requires physical access requires willful ignorance of both modern security features that defend against attackers with physical access and the concept of severity (an attack that requires no sophistication can be far more severe than a time-consuming sophisticated attack under certain threat models).