A Geek With Guns

Chronicling the depravities of the State.

Archive for the ‘Apple’ tag

Linux on a 2010 Mac Mini Part Two

without comments

Last week I mentioned my adventure of installing Linux on a 2010 Mac Mini. Although Ubuntu 18.10 did install and worked for a few days, an update left the system unusable. After an update toward the end of last week, the system would only boot to a black screen. From what I gathered online, I wasn’t the only person who ran into this problem. Anyway, I ended up digging into the matter further.

I once again tried installing Fedora. When I tried to install Fedora 29, I was unable to stop it from booting to a black screen so I decided to try Fedora 28. Using basic graphics mode I was able to get Fedora 28 to boot to the live environment and from there install Fedora on the Mac Mini. After installation I was able to get my Fedora installation to boot. However, when I tried to install the Nvidia driver from RPM Fusion, the system would only boot to a black screen afterwards. I tried installing the Nvidia driver via the negativo17 repository but didn’t expect it to work since the driver distributed from that repository is based on version 418 and the last driver to support the Mac Mini’s GeForce 320M was version 340. Things went as expected. I then tried installing the Nvidia driver manually using a patched version of the 340 driver from here. Unfortunately, that driver doesn’t work with the 4.20 kernel so that was a no go as well.

The reason I hadn’t tried to install the Nvidia driver manually before was because I didn’t want to deal with supporting the setup in the future. As I was trying to install it using the previously linked instructions I felt justified because the guide isn’t nearly as straightforward as installing the driver from a repository. It became a moot point since manual installation didn’t work but it did make me think about the fact that any solution I settled upon would need to be maintained, which led me to the idea of using Ubuntu 18.04 LTS. The LTS versions of Ubuntu are supported by Canonical for five years so if I could get 18.04 installed, the setup would have a decent chance of working for five years.

After passing the “nouveau.modeset=0” argument to the kernel, just as I had to do with 18.10, I was able to boot into a live environment and install 18.04 to the hard drive. Likewise, I had to use the “nouveau.modeset=0” argument to boot into the installation. Once booted into the installation, I was able to use “sudo apt install nvidia-340” to install the 340 version of the Nvidia driver. After rebooting, everything worked properly. I’m hoping that future updates will be less likely to break this setup since the LTS releases of Ubuntu tend to be more stable than non-LTS versions.
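For anyone replicating this setup, the workaround can be made permanent by baking the parameter into GRUB’s default kernel command line before installing the driver. This is a sketch, not gospel; it assumes Ubuntu 18.04’s stock /etc/default/grub (with the default “quiet splash” line) and the nvidia-340 package from the standard repositories:

```shell
# Append nouveau.modeset=0 to the default kernel command line.
# (Edit /etc/default/grub by hand instead if your
# GRUB_CMDLINE_LINUX_DEFAULT line differs from Ubuntu's stock value.)
sudo sed -i 's/^GRUB_CMDLINE_LINUX_DEFAULT="quiet splash"/GRUB_CMDLINE_LINUX_DEFAULT="quiet splash nouveau.modeset=0"/' /etc/default/grub
sudo update-grub

# Install the legacy 340-series driver, the last series to support
# the Mac Mini's GeForce 320M, then reboot.
sudo apt install nvidia-340
sudo reboot
```

Once nvidia-340 is installed it blacklists nouveau on its own, but keeping the parameter in GRUB means the machine still boots to a usable screen if the driver ever gets removed by an update.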

So, yeah, if you want to get a currently supported Linux distro running on a 2010 Mac Mini, take a look at Ubuntu 18.04. It might be your best bet (if it continues to run properly for the next month or so, I’ll say it is your best bet).

Written by Christopher Burg

March 4th, 2019 at 10:00 am

Posted in Technology


Linux on a 2010 Mac Mini

without comments

I prefer repurposing old computers to throwing them away. A while ago I acquired a 2010 Mac Mini for $100. It has worked well. I even managed to install macOS Mojave on it using this patcher. However, I wanted to try installing Linux on it.

I first tried installing my go-to distro, Fedora (version 29 to be specific). Unfortunately, I immediately ran into problems. The Mac Mini has an Nvidia card that doesn’t play nicely with the nouveau driver in the kernel so I couldn’t bring up a graphical environment (I just got a black screen with a blinking cursor in the upper left corner). I tried booting the Fedora live distro with the “nouveau.modeset=0” parameter but to no avail.

So I decided to try Ubuntu (18.10). Ubuntu also initially failed to boot but it at least gave me an error message (related to the nouveau driver). When I booted it with the “nouveau.modeset=0” parameter, I was able to get to the graphical interface and install Ubuntu. After installation I once again booted with the “nouveau.modeset=0” parameter and installed Nvidia’s proprietary driver. After that, the system booted into Ubuntu without any trouble (installing the Nvidia driver also enabled audio output through HDMI).

If you’re having trouble installing Linux on a 2010 Mac Mini, try Ubuntu and pass the “nouveau.modeset=0” parameter when booting; you may have better luck.
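If you’ve never passed a kernel parameter from the installer before, here’s roughly how it works on an Ubuntu live USB. This is a sketch; the exact menu text varies between Ubuntu releases:

```shell
# Passing a kernel parameter from the Ubuntu live USB's boot menu:
#
#   1. Highlight "Try Ubuntu without installing" and press `e` to edit
#      the boot entry.
#   2. Find the line containing "quiet splash" and append:
#        nouveau.modeset=0
#   3. Press F10 (or Ctrl+X) to boot with the modified command line.
#
# The change is temporary and only applies to that one boot; after
# installing, you have to repeat it until the Nvidia driver is in place.
```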

Written by Christopher Burg

February 27th, 2019 at 10:00 am

Posted in Technology


If You’re Good at Something, Never Do It for Free

without comments

A minor controversy has developed in the macOS world. Linus Henze, a security researcher, has discovered a vulnerability in Keychain for macOS that allows an attacker to access stored passwords. However, Henze isn’t providing the details to Apple because Apple’s bug bounty program, for some stupid reason, doesn’t cover macOS vulnerabilities:

Security researcher Linus Henze has shared a video demonstration of what is claimed to be a macOS Mojave exploit to access passwords stored in the Keychain. However, he has said he is not sharing his findings with Apple out of protest.

Henze has publicly shared legitimate iOS vulnerabilities in the past, so he has a track record of credibility.

However, Henze is frustrated that Apple’s bug bounty program only applies to iOS, not macOS, and has decided not to release more information about his latest Keychain invasion.

Some people aren’t happy with Henze’s decision because his refusal to provide the exploit to Apple will make it harder for the company to fix the vulnerability. What these people are forgetting is that Henze isn’t refusing to provide the exploit to Apple, he’s refusing to provide it for free. In other words, he wants to be paid for his work. I don’t know many people who would willingly work for free. I certainly wouldn’t. Unless you would, you really should put the blame for this on Apple for refusing to pay for macOS exploits.

Written by Christopher Burg

February 7th, 2019 at 10:00 am

Posted in Technology


Disable FaceTime

without comments

If for some inexplicable reason you own an Apple device and haven’t already disabled FaceTime, you should do so now:

Users have discovered a bug in Apple’s FaceTime video-calling application that allows you to hear audio from a person you’re calling before they accept the call—a critical bug that could potentially be used as a tool by malicious users to invade the privacy of others.

You don’t want a caller to hear you bitching them out for being inconsiderate by calling you instead of having the decency to send a text message.

Written by Christopher Burg

January 29th, 2019 at 10:30 am

Posted in Technology


Corporate Euphemisms

without comments

Apple’s quest to make its products thinner at any cost is once again making some customers unhappy. There have been reports of iPad Pros arriving bent out of the box. I would be unhappy even if a $100 tablet arrived bent out of the box so it shouldn’t be surprising that I’d be unhappy if an $800+ tablet arrived bent out of the box. But now that Apple is positioning itself as a luxury products company, it’s striving to provide the same level of customer satisfaction as, say, Patek Philippe, right? After all, if you purchased a new Patek Philippe watch and it had any defect whatsoever, the company would likely bend over backwards to remedy the situation since it knows that, as a luxury products company, it lives and dies by its reputation for customer satisfaction. If you believed that, you would be incorrect.

Instead of addressing the issue of bent iPad Pros, Apple has taken the route of using corporate euphemisms to explain why bent iPad Pros are something with which customers will just have to live:

These precision manufacturing techniques and a rigorous inspection process ensure that these new iPad Pro models meet an even tighter specification for flatness than previous generations. This flatness specification allows for no more than 400 microns of deviation across the length of any side — less than the thickness of four sheets of paper. The new straight edges and the presence of the antenna splits may make subtle deviations in flatness more visible only from certain viewing angles that are imperceptible during normal use. These small variances do not affect the strength of the enclosure or the function of the product and will not change over time through normal use.

That’s a lot of words to say your brand new $800+ iPad Pro may arrive at your doorstep bent.

This issue reminds me a lot of the issue with the iPhone 4 where holding it in your left hand could cause cellular signal degradation (and thus drop your call). Instead of addressing the issue right away, Steve Jobs tried to argue that the solution was to hold the phone “correctly.” Eventually Apple opted for the half-assed solution of providing a free case, which was at least better than publishing an official page that used a lot of words to try to hand wave the problem away.

Between this and the high failure rate of the MacBook butterfly switch keyboards, Apple is having a rough start to its transition from a consumer electronics company into a luxury products company.

Written by Christopher Burg

January 8th, 2019 at 10:00 am

You’re Unboxing It Wrong

without comments

Apple has spent the last couple of years transitioning itself from a consumer electronics company to a luxury products company. For the most part it has been doing a good job of this. The company’s attention to detail on its products is easy to see. However, when you’re a luxury products company, expectations go up. Somebody who buys a Seiko 5 isn’t likely to throw a fit because the second hand doesn’t sweep smoothly. Somebody who spends the big bucks on a Rolex is probably going to be unhappy if their second hand isn’t gliding smoothly over the watch face. Likewise, somebody who buys an Amazon Fire tablet is probably willing to tolerate a number of limitations and defects. Somebody who spends no less than $799 on an iPad Pro is probably going to be unhappy if their brand new tablet is bent out of the box:

Apple has confirmed to The Verge that some of its 2018 iPad Pros are shipping with a very slight bend in the aluminum chassis. But according to the company, this is a side effect of the device’s manufacturing process and shouldn’t worsen over time or negatively affect the flagship iPad’s performance in any practical way. Apple does not consider it to be a defect.

The thing about being a luxury products company is that you need to make your customers feel special. Telling them that they have to live with a defect on a brand new product isn’t going to fly, especially when your cheaper competitors are apt to replace new products that have any kind of defect whatsoever (if you received a slightly bent Fire tablet, Amazon would probably get a replacement heading your way immediately).

Apple’s response on this matter is reminiscent of Steve Jobs’s response to people complaining about the iPhone 4 dropping calls when they held it in their left hand (for those who don’t know, he told them that they were holding it wrong). That might have flown when the iPhone was a reasonably priced option on the market but I have my doubts that such a cavalier attitude is going to fly now that Apple’s products are priced as high as they are.

Written by Christopher Burg

December 20th, 2018 at 11:00 am

Apple’s Diminishing Quality

without comments

Yesterday I was asked to recommend an Apple laptop (the laptop was going to somebody with a learning disability so the hurdle of transitioning them to a non-Apple platform was great and not a realistic option). As I was making my recommendation it really struck me just how far Apple’s laptops have fallen in the last few years.

In the past when somebody asked me if they should get AppleCare, I usually recommended against doing so. Apple’s laptops were pretty reliable and when they did fail, they could usually be repaired.

Apple’s current lineup has a significant problem. The new slim butterfly keyboards are notoriously fragile. A mere piece of debris getting under a keycap is enough to disable that key. This wouldn’t be a problem with a normal laptop keyboard because there is enough clearance to easily remove most debris that gets caught under a keycap. Moreover, even if the debris cannot be easily removed, the keycap usually can be, which allows you to remove the offending debris. Getting a keycap off of a butterfly keyboard without wrecking the fragile butterfly mechanism isn’t easy. And if you do damage the mechanism, you’re stuck replacing the entire keyboard, which requires breaking a bunch of rivets that hold the keyboard to the top of the case. This is why Apple replaces the entire top case when the keyboard needs to be replaced.

So you have a keyboard that cannot be serviced and has a high probability of failing. Strike one.

Strike two is the solid state drive (SSD). Apple no longer utilizes modular SSDs. Instead, its SSDs are soldered to the mainboard. With SSDs, failure is a matter of when, not if, because flash memory cells can only handle so many erase operations. SSD manufacturers attempt to prolong the life of their products with wear leveling but that only extends the time between failures; it doesn’t eliminate them. This isn’t a big deal with modular SSDs. If a modular SSD croaks, you replace the dead SSD with a new one. When an SSD that is soldered to the mainboard croaks, you end up having to replace the entire mainboard. Since the mainboard also has the processor and graphics card soldered to it, you necessarily end up replacing those pricey components as well. What used to be a relatively cheap unavoidable repair has become an extremely expensive one.

Recommending an Apple laptop has become an exercise in presenting the least bad option. An expensive repair is a matter of when, not if. The keyboard is likely to suffer a premature death because of its design and lack of repairability. If the keyboard survives, the SSD will eventually die, necessitating replacement of the entire mainboard (and thus the processor and graphics card). Instead of recommending a computer that I know will likely leave the buyer happy for years to come, recommending an Apple laptop involves tacking on a great number of caveats and warnings so that when the buyer is looking at an absurd repair bill, they aren’t doing so unexpectedly.

Written by Christopher Burg

December 18th, 2018 at 10:00 am

Posted in Technology


Some Thoughts After Moving from macOS to Linux

with one comment

It has been two weeks and change since I moved from my MacBook Pro to a ThinkPad P52s running Linux. Now that I have some real use time under my belt I thought it would be appropriate to give some of my thoughts.

The first thing I’d like to note is that I have no regrets moving to Linux. My normal routine is to use my laptop at work and whenever I’m away from home, and use another computer at home (because I’m too lazy to pull my laptop out of my laptop bag every night). The computer I was using at home was a 2010 Mac Mini. I replaced it with my old MacBook Pro when I got my ThinkPad. I realized the other day that I haven’t once booted up my MacBook Pro since I got my ThinkPad. Instead I have been pulling my ThinkPad out of its bag and using it when I get home. At no point have I felt that I need macOS to get something done. That’s the best testament to the transition that I can give. That’s not to say Linux can do everything that macOS can. I’m merely fortunate in that the tools I need are either available on Linux or have a viable alternative.

I’m still impressed with the ThinkPad’s keyboard. One of my biggest gripes about the new MacBooks is the ultra slim keyboards. I am admittedly a bit of a barbarian when it comes to typing. I don’t so much type as bombard my keyboard from orbit. Because of this I like keys with a decent amount of resistance and depth. The keyboard on my 2012 MacBook Pro was good but I’m finding the keyboard on this ThinkPad to be a step up. The keys offer enough resistance that I’m not accidentally pressing them (a problem I have with keyboards offering little resistance) and enough depth to feel comfortable.

With that said, the trackpad is still garbage when compared to the trackpad on any MacBook. My external trackball has enough buttons that I can replicate the gestures I actually used on the MacBook, though, and I still like the TrackPoint enough to use it when I don’t have an external mouse connected.

Linux has proven to be a solid choice on this ThinkPad as well. I bought it with Linux in mind, which means I didn’t get features that aren’t supported in Linux such as the fingerprint reader or the infrared camera for facial recognition (which is technically supported in Linux but tends to show up as the first camera so apps default to it rather than the 720p webcam). My only gripe is the Nvidia graphics card. The P52s includes both an integrated Intel graphics card and a discrete Nvidia Quadro P500, which isn’t supported by the open source nouveau driver. In order to make it work properly, you need to install Nvidia’s proprietary driver. Once that’s installed, everything works… except Secure Boot. In order to make the P52s boot after installing the Nvidia driver, you need to go into the BIOS and disable Secure Boot. I really wish there were a laptop with a discrete AMD graphics card on the market that fit my needs.

One thing I’ve learned from my move from macOS to Linux is just how well macOS handled external monitors. My P52s has a 4k display but all of the external monitors I work with are 1080p. Having screens of different resolutions was never a problem on macOS. On Linux it can lead to some rather funky scaling issues. If I leave the built-in monitor’s resolution at 4k, any app that opens on that display looks friggin’ huge when moved to an external 1080p display. This is because Linux scales up apps on 4k displays by a factor of two by default. Unfortunately, scaling isn’t done per monitor by default, so when the app is moved to the 1080p display, it’s still scaled by two. Fortunately, a 4k display is exactly twice the resolution of a 1080p display, so changing the built-in monitor’s resolution to 1080p when using an external display is an easy fix that doesn’t make everything on the built-in display look blurry.
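On X11, that resolution change can be scripted with xrandr. This is a sketch; the output names below (eDP-1 for the built-in panel, HDMI-1 for the external display) are assumptions that vary by machine, so check the output of a bare `xrandr` first:

```shell
# List connected outputs and their supported modes.
xrandr

# Drop the built-in 4k panel to 1080p so apps that are scaled 2x for
# the 4k panel appear the same size on both screens.
xrandr --output eDP-1 --mode 1920x1080

# Place the external 1080p display to the right of the built-in panel.
xrandr --output HDMI-1 --mode 1920x1080 --right-of eDP-1
```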

I’ve been using Gnome for my graphical environment. KDE seems to be the generally accepted “best” desktop environment amongst much of the Linux community these days. While I do like KDE in general, I find that application interfaces are inconsistent whereas Gnome applications tend to have fairly consistent interfaces. I like consistency. I also like that Gnome applications tend to avoid burying features in menus. The choice of desktop environment is entirely subjective but so far my experience using Gnome has been positive (although knowing that I have a ship to which I can jump if that changes is reassuring).

As far as applications go, I used Firefox and Visual Studio Code on macOS and they’re both available on Linux so I didn’t have to make a change in either case. I was using Mail.app on macOS so I had to find a replacement e-mail client. I settled on Geary. My experience with Geary has been mostly positive although I really hate that there is no way, at least that I’ve found, to quickly mark all e-mails as read. I used iCal on macOS for calendaring and Gnome’s Calendar application has been a viable replacement for it. My luck at finding a Linux replacement for my macOS task manager, 2Do, hasn’t been as good. I’m primarily using Gnome’s ToDo application but it lacks a feature that is very important to me: repeating tasks. I use my task manager to remind me to pay bills. When I mark a bill as paid, I want my task manager to automatically create a task for next month. 2Do does this beautifully. I haven’t found a Linux task manager that can do this though (and in all fairness, Apple’s Reminders.app doesn’t do this well either). I was using Reeder on macOS to read my RSS feeds. On Linux I’m using FeedReader. Both work with Feedbin and both crash at about the same rate. I probably shouldn’t qualify that as a win but at least it isn’t a loss.

The biggest change for me has probably been moving from VMWare Fusion to Virtual Machine Manager, which utilizes libvirt (and thus KVM and QEMU). Virtualizing Linux with libvirt is straightforward. Virtualizing Windows 10 wasn’t straightforward until I found the SPICE Windows guest tools. Once I installed that guest tool package, the niceties that I came to love about VMWare Fusion, such as shared pasteboards and automatically changing the resolution of the guest machine when the virtual machine window is resized, worked. libvirt also makes it dead simple to set a virtual machine to automatically start when the system boots.
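For example, marking a guest to start with the host is a single virsh command. This is a sketch; the guest name “win10” is an assumption and should be whatever `virsh list --all` shows for your virtual machine:

```shell
# Start the guest automatically when the host (and libvirtd) boots.
virsh autostart win10

# Confirm the setting; the output includes an "Autostart:" line.
virsh dominfo win10

# Disable it again if needed.
virsh autostart --disable win10
```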

One major win for Linux over macOS is software installation. Installing software from the Mac App Store is dead simple but installing software from other sources isn’t as nice of an experience. Applications installed from other sources have to include their own update mechanism. Most have taken the road of including their own embedded update capabilities. While these work, they can usually only run when the application is running, so if you haven’t used the application for some time, the first thing you end up having to do is update it. Lots of packages still don’t include automatic update capabilities so you have to manually check for new releases. Oftentimes these applications are available via MacPorts or Homebrew. On the Linux side of things almost every software package is available via a distro’s package manager, which means installation and updates are handled automatically. I prefer this over the hodgepodge of update mechanisms available on macOS.

So in closing I’m happy with this switch, especially since I didn’t have to drop over $3,000 on a laptop to get what I wanted.

Written by Christopher Burg

November 21st, 2018 at 11:00 am

Posted in Side Notes



without comments

I’ve always treated mobile devices differently than desktops and laptops. Part of this is because mobile devices tend to be restrictive. Most mobile devices are closed platforms that don’t allow you to load a different operating system. And while you can load custom firmware on a few mobile devices, it often requires some hackery. It appears as though I jumped ship at the proper time though because Apple is bringing the restrictive nature of iOS to its desktops and laptops:

Apple’s MacBook Pro laptops have become increasingly unfriendly with Linux in recent years while their Mac Mini computers have generally continued working out okay with most Linux distributions due to not having to worry about multiple GPUs, keyboards/touchpads, and other Apple hardware that often proves problematic with the Linux kernel. But now with the latest Mac Mini systems employing Apple’s T2 security chip, that too is likely to crush any Linux dreams.


Update 2: It looks like even after disabling the Secure Boot functionality, the T2 chip is reportedly still blocking operating systems aside from macOS and Windows 10.

I know a lot of people have expressed the feeling that buying an Apple computer and installing Linux on it is rather foolish. After all, you can buy a computer for far less that is fully supported by Linux (Linux support on Apple computers has always been a bit hit or miss). I mostly agree with that attitude. However, there comes a time in every Mac’s life where Apple drops support for it in macOS. While it’s possible to coax macOS onto a lot of unsupported Macs, there are also quite a few older Macs where installing a modern version of macOS is impossible. In such cases Linux offers an option to continue using the hardware with an operating system that has current security updates.

I prefer to repurpose old computers rather than throw them away. Having the option to install Linux on older Macs has always been desirable to me. For me, losing that ability severely limits the functional lifetime of a Mac. Moreover, I worry that the limitations put into place by the T2 chip will make installing future versions of macOS on these machines impossible when they fall out of support.

Secure Boot functionality is a good security measure. However, Secure Boot on the vast majority of PCs can be disabled (in fact, Microsoft requires that Secure Boot can be disabled for logo certification). Even if you don’t disable it, many Linux distributions have signed bootloaders that work with Secure Boot (unfortunately, even these signed bootloaders don’t work on Apple computers with a T2 chip). So it is possible to provide boot-time security while supporting third-party operating systems. Apple is simply choosing not to do so.
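As an aside, on a standard UEFI PC you can check whether Secure Boot is currently enforced from a running Linux system. This is a sketch using the mokutil utility, which most distributions ship in a package of the same name; it requires an EFI-booted system:

```shell
# Query the current Secure Boot state from the firmware's EFI variables.
mokutil --sb-state
# Prints "SecureBoot enabled" or "SecureBoot disabled".
```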

Written by Christopher Burg

November 7th, 2018 at 10:30 am

Posted in Technology


Deafening the Bug

with 2 comments

I know a lot of people who put a piece of tape over their computer’s webcam. While this is a sane countermeasure, I’m honestly less worried about my webcam than the microphone built into my laptop. Most laptops, unfortunately, lack a hardware disconnect for the microphone, and placing a piece of tape over the microphone input often isn’t enough to prevent it from picking up sound in whatever room it’s located. Fortunately, Apple has been stepping up its security game and now offers a solution to the microphone problem:

Little was known about the chip until today. According to its newest published security guide, the chip comes with a hardware microphone disconnect feature that physically cuts the device’s microphone from the rest of the hardware whenever the lid is closed.

“This disconnect is implemented in hardware alone, and therefore prevents any software, even with root or kernel privileges in macOS, and even the software on the T2 chip, from engaging the microphone when the lid is closed,” said the support guide.

The camera isn’t disconnected, however, because its “field of view is completely obstructed with the lid closed.”

While I have misgivings with Apple’s recent design and business decisions, I still give the company credit for pushing hardware security forward.

Implementing a hardware cutoff for the microphone doesn’t require something like Apple’s T2 chip. Any vendor could put a hardware disconnect switch on their computer that would accomplish the same thing. Almost none of them do though, even if they include hardware cutoffs for other peripherals (my ThinkPad, for example, has a built-in cover for the webcam, which is quite nice). I hope Apple’s example encourages more vendors to implement some kind of microphone cutoff switch because being able to listen to conversations generally allows gathering more incriminating evidence than merely being able to look at whatever is in front of a laptop.

Written by Christopher Burg

November 1st, 2018 at 11:00 am