The Users and the Used

I’m happy that computer technology (for the purpose of this post, I mean any device with a computer in it, not just traditional desktops and laptops) has become ubiquitous. An individual who wants a computer no longer has to buy a kit and solder it together. Instead they can go to the store and pick up a device that is fully functional out of the box. This has led to a revolution in individual capabilities. Those of us who utilize computers can access a global communication network from almost anywhere using a device that fits in our pockets. We can crank out printed documents faster than at any other time in human history. We can collect data from any number of sources and use it to perform analysis that was impractical before ubiquitous access to computers. In summary, life is good.

However, the universe is an imperfect place and few things are without their downsides. The downside to the computer revolution is that there are, broadly speaking, different classes of users. They are often divided into technical and non-technical users, but I prefer to refer to them as users and used. My categorization isn’t so much based on technical ability (although there is a strong correlation) as on whether one is using their technology or being used by it.

Before I continue, I want to note that this categorization, like all attempts to categorize unique individuals, isn’t black and white. Most people will fall into the gray area in between the categories. The main question is whether they fall more towards the user category or the used category.

It’s probably easiest to explain the used category first. The computing technology market is overflowing with cheap devices and free services. You can get a smartphone for little or even nothing from some carriers, an Internet connected doorbell for a pittance, and an e-mail account with practically unlimited storage for free. On the surface these look like amazing deals, but they come with a hidden cost. The manufacturers of those devices and providers of those services, being predominantly for-profit companies, are making their money in most cases by collecting your personal information and selling it to advertisers and government agencies (both of which are annoying, but the latter can be deadly). While you may think you’re using the technology, you’re actually being used through it by the manufacturers and providers.

A user is the opposite. Instead of using technology that uses them, they use technology that they dominate. For example, Windows 10 was a free upgrade for users of previous versions of Windows. Not surprisingly, Windows 10 also collects a lot of personal information. Instead of using Windows 10, users of that operating system are being used by it. The opposite side of the spectrum is something like Linux from Scratch, where a user creates their own Linux distro from the ground up so they know every component that makes up their operating system. As I stated earlier most people fall into the gray area between the extremes. I predominantly run Fedora Linux on my systems. As far as I’m aware there is no included spyware and the developers aren’t otherwise making money by exploiting my use of the operating system. So it’s my system, I’m using it, not being used through it.

Another example that illustrates the user versus the used categories is online services. I sometimes think everybody on the planet has a Gmail account. Its popularity doesn’t surprise me. Gmail is a very good e-mail service. However, Gmail is primarily a mechanism for Google to collect information to sell to advertisers. People who use Gmail are really being used through it by Google. The opposite side of the spectrum (which is where I fall in this case) is self-hosting an e-mail server. I have a physical server in my house that runs an e-mail server that I set up and continue to maintain. I am using it rather than being used by it.

I noted earlier in this article that there is a strong correlation between technical people and users as well as between non-technical people and those being used. It isn’t a one-to-one correlation though. I know people with little technical savvy who utilize products and services that aren’t using them. Oftentimes they have a technical friend who assists them (I’m often that friend), but not always. I would actually argue that the stronger correlation is between users and those who are curious about technology, and between those being used and those who aren’t. I know quite a few people with little technical savvy who are curious about technology. Their curiosity leads them to learn and they oftentimes become technically savvy in time. But even before they do, they often make use of technology rather than be used by it. They may buy a laptop to put Linux on it without having the slightest clue at first how to do it. They may set up a personal web server poorly, watch it get exploited, and then try again using what they learned from their mistakes. They may decide to use Signal instead of WhatsApp not because they understand the technical differences between the two but because they are curious about the “secure communications app” that their technical friends are always discussing.

Neither category is objectively better. Both involve trade-offs. I generally encourage people to move themselves more towards the user category though, because it offers individuals more power over the tools they use and I’m a strong advocate for individual power. If you follow an even slightly radical philosophy, I strongly suggest that you move towards the user category. The information being collected from those being used often finds its way into the hands of government agents, and they are more than happy to make use of it to suppress dissidents.

Upgrading My Network

The network at my previous dwelling evolved over several years, which made it a hodgepodge of different gear. Before I moved out, its final form was a Ubiquiti EdgeMax router, a Ubiquiti EdgeSwitch, and an Apple AirPort Extreme (I got a good deal on it, but it was never something I recommended to people). When I bought my new house, I decided to upgrade my network to Ubiquiti UniFi gear. For those who are unaware, UniFi gear fits into that niche between consumer and enterprise networking gear (it’s often touted as enterprise gear, but I have my doubts that it would work as well on a massive network spanning multiple locations as more traditional enterprise gear) often referred to as prosumer or SOHO (Small Office/Home Office).

Because I live out in the boonies, my Internet connection is pretty lackluster, so I opted for a Security Gateway 3P as my router (it’s generally agreed that the hardware is too slow to keep up with the demands of many modern Internet connections, but I don’t have to worry about that). If I had built a new house, I’d have put Ethernet drops in every room, but I bought a preexisting house with no Ethernet drops, which meant Wi-Fi was going to be my primary form of network connectivity. I still needed Ethernet connections for my servers though, so I opted for a 24-port switch as my backbone and AP-AC-M access points for Wi-Fi. The AP-AC-M access points provide mesh networking, which is nice in a house without Ethernet drops because you can extend your Wi-Fi network by connecting new access points to already installed access points. Moreover, they’re rated for outdoor use, so I can use them to extend my Wi-Fi network across my property.

A UniFi network is really a software defined network, which means that there is a central controller that you enter your configuration information into and it pushes the required settings out to the appropriate devices. Ubiquiti provides the Cloud Key as a hardware controller, but I already have virtual machine hosts aplenty, so I decided to set up a UniFi Controller in a virtual machine.
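For anybody wanting to do the same, here is a minimal sketch of standing up the controller on a Debian/Ubuntu virtual machine. The repository line, signing key ID, and package name are assumptions drawn from Ubiquiti’s published install instructions as I remember them, so verify them against Ubiquiti’s current documentation before running anything.

```shell
# Sketch: install the UniFi Controller on a Debian/Ubuntu VM.
# The repository URL, key ID, and package name below are assumptions;
# confirm them against Ubiquiti's install guide first.

# Add Ubiquiti's APT repository and its signing key.
echo 'deb https://www.ui.com/downloads/unifi/debian stable ubiquiti' | \
    sudo tee /etc/apt/sources.list.d/ubiquiti-unifi.list
sudo apt-key adv --keyserver keyserver.ubuntu.com --recv 06E85760C0A52C50

# The controller is a Java application; the package pulls in its
# dependencies (including the MongoDB backend).
sudo apt update
sudo apt install unifi
```

Once the service is running, the controller’s web interface is reachable on port 8443 of the virtual machine, and adopted devices pull their configuration from it.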

Previously I was resistant to the idea of having to have a dedicated controller for my network. However, after experiencing software defined networking, I don’t think I could ever go back. Making a single change in one location and having that change propagated out to my entire network is a huge time saver. For example, I decided that I wanted to set up a guest Wi-Fi network. Without a central controller this would have required me to log into the web interface of each access point and enter the new guest network configuration. With a software defined network I merely add the new guest network configuration into my UniFi Controller and it pushes that configuration to each of my access points. If I want to change the Wi-Fi Protected Access (WPA) password for one of my wireless networks, I can change it in the UniFi Controller and each access point will receive the update.

The UniFi Controller also provides a lot of valuable information. I initially set up my wireless network with two access points, but the statistics in the UniFi Controller indicated that my wireless coverage wasn’t great in the bedroom, was barely available on my three season porch, and was entirely unavailable out by my fire pit. I purchased a third access point, rearranged the other two, and now have excellent Wi-Fi coverage everywhere I want it. While I could have gathered the same information on a network without a centralized controller by logging into each access point individually, it would have been a pain in the ass. The UniFi Controller also allows you to upload the floor plan of your home and it will show you the expected Wi-Fi coverage based on where you place your access points. I haven’t used that feature yet (I need to create the floor plan in a format that the controller can use), but I plan on playing with it in the future.

Overall the investment into more expensive UniFi gear has been worth it to me. However, most people probably don’t need to spend so much money on their home network. I know many people are able to do everything they want using nothing more than the all in one modem/switch/Wi-Fi access point provided by their Internet Service Provider (admittedly I don’t trust such devices and always place them outside of my network’s firewall). But if you need to setup a network that is more complex than the average home network, UniFi gear is something to consider.

The Importance of Open Platforms

Late last week I pre-ordered the UBports Community Edition PinePhone. It’s not ready for prime time yet: neither of the cameras works, the battery life from what I’ve read is around four to five hours, and there are few applications available at the moment. So why did I pre-order it? Because UBports has been improving rapidly, my iPhone is the last closed platform I run regularly (I keep one macOS machine running mostly so I can back up my iPhone to it), and open platforms may soon be our only option for secure communications:

Signal is warning that an anti-encryption bill circulating in Congress could force the private messaging app to pull out of the US market.

Since the start of the coronavirus pandemic, the free app, which offers end-to-end encryption, has seen a surge in traffic. But on Wednesday, the nonprofit behind the app published a blog post, raising the alarm around the EARN IT Act. “At a time when more people than ever are benefiting from these (encryption) protections, the EARN IT bill proposed by the Senate Judiciary Committee threatens to put them at risk,” Signal developer Joshua Lund wrote in the post.

I used Signal as an example for this post, but in the future when (it’s not a matter of if, it’s a matter of when) the government legally mandates cryptographic back doors in consumer products (you know the law will have an exception for products sold to the government), every secure communication application and platform will either have to stop being made available in the United States or have to insert a back door that gives government agents, and anybody else who can crack the back door, complete access to our data.

On an open platform such as Linux this isn’t the end of the world. I can source both my operating system and my applications from anywhere. If secure communication applications are made illegal in the United States, I have the option of downloading and using an application made in a freer area or, better yet, one developed anonymously (it’s much harder to enforce these laws if the government can’t identify and locate the developers). Closed platforms such as iOS and Android (although Android to a lesser extent, since it still allows side loading of applications and you can download an image built off of the Android Open Source Project) require you to download software from their walled garden app stores. If Signal is no longer legally available in the United States, people running iOS and Android will no longer be able to use it because the app will no longer be available in the respective United States app stores.

As the governments of the world continue to take our so-called civil rights behind a shed and unceremoniously put a bullet in their heads closed platforms will continue to become more of a liability. Open platforms on the other hand can be developed by anybody anywhere. They can even be developed anonymously (Bitcoin is probably the most successful example of a project whose initial developer remains anonymous), which makes it difficult for governments to put pressure on the developers to comply with laws.

If you want to ensure your ability to communicate securely in the future and you haven’t already transitioned to open platforms, you should either begin your transition or at least begin to plan it. Not all of the pieces are ready yet. Smartphones remain one area where open platforms are lagging behind, but there is a roadmap available so you can at least begin planning a move towards an open smartphone (and at $150 the PinePhone is a pretty low risk platform to try).

Don’t Use Zoom

With most of the country under stay-at-home orders that have turned homes into prisons, people are turning to video conferencing software to socialize. With all of the available options out there, somehow the worst possible option has become the most popular (which seems like the overarching theme to our current crises). Zoom appears to have become the most popular video conferencing software for people imprisoned in their homes.

Don’t use Zoom.

Why? First, the company uses misleading marketing. If you’ve seen some of the company’s marketing, you might be under the mistaken impression that Zoom video conferences are end-to-end encrypted. They’re not. But that’s just the tip of the iceberg. A while back Zoom pulled a rather sneaky maneuver and installed a secret web server on Macs, which was supposedly meant to make using the software easier for Safari users (the claim was bullshit). Apple wasn’t amused and removed the software via an update. Zoom did remove that functionality, but the software still had surprises in store for Mac users. It turns out that it contained a security vulnerability that allowed a remote attacker to access the computer’s webcam and microphone… oh, and provided them with root access. Don’t worry, Windows users, Zoom didn’t forget about you. The Windows version of Zoom contained a vulnerability that allowed attackers to steal system passwords. And so everybody could suffer equally, Zoom made it easy for randos to join supposedly private video conferences.

I’m not even done yet. Zoom also leaked users’ e-mail addresses and photos to randos and, until it was caught, was also selling personal data to Facebook.

So I reiterate, don’t use Zoom.

Alternate Social Media Project Part 1: Riot.im

When I announced that I was cutting back on blogging, I explained that it was so I could focus my energy on other projects. One of those projects, which I’ve dubbed the Alternate Social Media Project (ASMP), has been replacing the social media functionality provided by Facebook. Why? Because Facebook has become not only a total invasion of privacy (which most people apparently don’t give two shits about) but also an increasingly useless platform for anybody with beliefs that aren’t state approved (which people seem to care about when they find themselves being censored by Facebook’s administrators). Rather than demand that the government step in and force Facebook to run its operations in the manner I approve, I decided it would be easier to just move somewhere freer.

This project is occurring in steps. The first step was to find something to fulfill the primary use of social media: communication. My requirements were modest. The solution upon which I settled had to be decentralized, fully usable on mobile platforms, and offer the option of secure communications. I settled on Riot.im since it was one of the few decent options that met those requirements.

Riot.im is the reference client for the Matrix protocol. The Matrix protocol is, basically, an evolution of Internet Relay Chat (IRC). Unlike other attempts to improve on IRC, Matrix is federated, which means anybody can run a server and those servers can communicate with one another. Facebook demonstrates the importance of federation. If you express wrongthink on Facebook, you risk being exiled. If you express wrongthink on a Matrix server, you risk being exiled from that specific server, but you can migrate to another server, possibly your own (where you can express all the wrongthink your heart desires). So long as the new server you’re on is federated with the servers your friends are on, you can continue your conversations.

Unlike IRC and many other older communication protocols (XMPP comes to mind), Riot.im works well on mobile devices. Android and iOS like to kill apps in the background and when those apps are killed, all of their active network connections die with them. With IRC this means you have no idea what is going on in the room until you open the app and reconnect. Riot.im, on the other hand, will work like other modern communication tools when your app isn’t running. When activity happens in one of the rooms of which you’re a member, you will receive notifications (unless you disable those notifications). If something piques your interest, you can open the app and jump into the conversation. My previous attempts to migrate friends to other platforms were thwarted because none of them were willing to use something that didn’t play well with mobile. I’m happy to say that Riot.im doesn’t suffer from that shortcoming.

Riot.im fulfills the third criterion by offering the option of end-to-end encryption. Matrix has no concept of direct messages as far as I can tell. When you want to communicate privately with somebody, you’re placed in a private room with them. If you want your communications to be private, you can turn encryption on in the room; once enabled, encryption cannot be disabled. This setup, although potentially confusing to some people, has two nice features. The first is that any room can be encrypted. You and your friends can set up an encrypted room where you can express wrongthink without the server administrators being able to see it (unless you invite them into your room). The second is that you don’t have to worry about somebody secretly turning encryption off at a future point (and thus exposing your wrongthink to outsiders).

Riot.im obviously isn’t a replacement for Facebook. At most it’s a replacement for Facebook Messenger. Since everything on Riot.im occurs in a chatroom, it’s not as easy to have a conversation about a linked article and there is no way to accrue imaginary Internet points like you can with Facebook’s reactions. However, I’m not actually a fan of services that try to do everything. It’s too difficult to replace individual parts when something better rolls around or an update to the current tool makes it unusable.

If you’re interested in migrating off of Facebook or other restrictive social media platforms, you could do worse than starting with Riot.im.

Linux on a 2010 Mac Mini Part Two

Last week I mentioned my adventure of installing Linux on a 2010 Mac Mini. Although Ubuntu 18.10 did install and worked for a few days, an update towards the end of last week left the system only able to boot to a black screen. From what I gathered online, I wasn’t the only person who ran into this problem. Anyways, I ended up digging into the matter further.

I once again tried installing Fedora. When I tried to install Fedora 29, I was unable to stop it from booting to a black screen so I decided to try Fedora 28. Using basic graphics mode I was able to get Fedora 28 to boot to the live environment and from there install Fedora on the Mac Mini. After installation I was able to get my Fedora installation to boot. However, when I tried to install the Nvidia driver from RPM Fusion, the system would only boot to a black screen afterwards. I tried installing the Nvidia driver via the negativo17 repository but didn’t expect it to work since the driver distributed from that repository is based on version 418 and the last driver to support the Mac Mini’s GeForce 320M was version 340. Things went as expected. I then tried installing the Nvidia driver manually using a patched version of the 340 driver from here. Unfortunately, that driver doesn’t work with the 4.20 kernel so that was a no go as well.

The reason I hadn’t tried to install the Nvidia driver manually before was that I didn’t want to deal with supporting the setup in the future. As I was trying to install it using the previously linked instructions, I felt justified in that concern because the guide isn’t nearly as straightforward as installing the driver from a repository. It became a moot point since manual installation didn’t work, but it did make me think about the fact that any solution I settled upon would need to be maintained, which led me to the idea of using Ubuntu 18.04 LTS. The LTS versions of Ubuntu are supported by Canonical for five years, so if I could get 18.04 installed, the setup would have a decent chance of working for five years.

After passing the kernel the “nouveau.modeset=0” argument, just as I had to do with 18.10, I was able to boot into a live environment and install 18.04 to the hard drive. Likewise, I had to use the “nouveau.modeset=0” argument to boot into the installation. Once I was booted into the installation I was able to use “sudo apt install nvidia-340” to install the 340 version of the Nvidia driver. After rebooting everything worked properly. I’m hoping that future updates will be less likely to break this setup since the LTS releases of Ubuntu tend to be more stable than non-LTS versions.
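Condensed into commands, the recipe that finally stuck looks roughly like this. The GRUB defaults file path and the nvidia-340 package name match a stock Ubuntu 18.04 install, but double-check both on your own system before running anything.

```shell
# Sketch: make the nouveau.modeset=0 parameter permanent and install
# the legacy Nvidia driver on Ubuntu 18.04 (2010 Mac Mini).

# Append nouveau.modeset=0 to the default kernel command line so it
# no longer has to be typed at the GRUB menu on every boot.
sudo sed -i \
  's/^GRUB_CMDLINE_LINUX_DEFAULT="\(.*\)"/GRUB_CMDLINE_LINUX_DEFAULT="\1 nouveau.modeset=0"/' \
  /etc/default/grub
sudo update-grub

# Install the 340-series driver, the last to support the GeForce 320M.
sudo apt install nvidia-340

sudo reboot
```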

So, yeah, if you want to get a currently supported Linux distro running on a 2010 Mac Mini, take a look at Ubuntu 18.04. It might be your best bet (if it continues to run properly for the next month or so, I’ll say it is your best bet).

Linux on a 2010 Mac Mini

I prefer repurposing old computers to throwing them away. A while ago I acquired a 2010 Mac Mini for $100. It has worked well. I even managed to install macOS Mojave on it using this patcher. However, I wanted to try installing Linux on it.

I first tried installing my go-to distro, Fedora (version 29 to be specific). Unfortunately, I immediately ran into problems. The Mac Mini has an Nvidia card that doesn’t play nicely with the nouveau driver in the kernel so I couldn’t bring up a graphical environment (I just got a black screen with a blinking cursor in the upper left corner). I tried booting the Fedora live distro with the “nouveau.modeset=0” parameter but to no avail.

So I decided to try Ubuntu (18.10). Ubuntu also initially failed to boot, but it at least gave me an error message (related to the nouveau driver). When I booted it with the “nouveau.modeset=0” parameter, I was able to get to the graphical interface and install Ubuntu. After installation I once again booted with the “nouveau.modeset=0” parameter and installed Nvidia’s proprietary driver. After that, the system booted into Ubuntu without any trouble (installing the Nvidia driver also enabled audio output through HDMI).
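If you attempt the same setup, one quick way to confirm that the proprietary driver, rather than nouveau, ended up bound to the card after the reboot is with the standard lspci and lsmod tools (the exact device string will differ on other hardware):

```shell
# Show the VGA device and the kernel driver bound to it; the
# "Kernel driver in use:" line should read nvidia, not nouveau.
lspci -k | grep -A 3 -i vga

# List any loaded modules matching either driver name.
lsmod | grep -e nvidia -e nouveau
```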

If you’re having trouble installing Linux on a 2010 Mac Mini, try Ubuntu, pass the “nouveau.modeset=0” parameter when booting, and you may have better luck.

Self-Inflicted Dystopia

Nike has released its self-lacing shoes and the result is funnier than anything I predicted:

One user writes, “The first software update for the shoe threw an error while updating, bricking the right shoe.” Another says, “App will only sync with left shoe and then fails every time. Also, app says left shoe is already connected to another device whenever I try to reinstall and start over.”

“My left shoe won’t even reboot,” writes another. One user offers a possible solution, saying, “You need to do a manual reset of both shoes per the instructions.”

People like to argue over whether Orwell or Huxley more accurately predicted our dystopian future but I think Mike Judge’s prediction is proving most accurate.

Products like the Nike Adapt BB provide the opportunity for a self-inflicted dystopia. If your life is too free from anxiety, you can buy some. Running a little late for work? Now you can worry about whether or not your shoes have enough charge in them to lace themselves or whether or not your smartphone app will connect to them to activate the self-lacing operation. Will the lithium-ion batteries in your shoes explode? Who knows! Will wearing them outside in -20 weather cause the batteries to discharge to such a point that you won’t be able to unlace them? Perhaps!

On the upside, the entertainment derived from watching people struggle with their “smart” shoes is free.

If You’re Good at Something, Never Do It for Free

A minor controversy has developed in the macOS world. Linus Henze, a security researcher, has discovered a vulnerability in Keychain for macOS that allows an attacker to access stored passwords. However, Henze isn’t providing the details to Apple because Apple’s bug bounty program, for some stupid reason, doesn’t cover macOS vulnerabilities:

Security researcher Linus Henze has shared a video demonstration of what is claimed to be a macOS Mojave exploit to access passwords stored in the Keychain. However, he has said he is not sharing his findings with Apple out of protest.

Henze has publicly shared legitimate iOS vulnerabilities in the past, so he has a track record of credibility.

However, Henze is frustrated that Apple’s bug bounty program only applies to iOS, not macOS, and has decided not to release more information about his latest Keychain invasion.

Some people aren’t happy with Henze’s decision because his refusal to provide the exploit to Apple will make it harder for the company to fix the vulnerability. What these people are forgetting is that Henze isn’t refusing to provide the exploit to Apple; he’s refusing to provide it for free. In other words, he wants to be paid for his work. I don’t know many people who would willingly work for free. I certainly wouldn’t. Unless you would, you really should put the blame for this on Apple for refusing to pay for macOS exploits.

Disable FaceTime

If for some inexplicable reason you own an Apple device and haven’t already disabled FaceTime, you should do so now:

Users have discovered a bug in Apple’s FaceTime video-calling application that allows you to hear audio from a person you’re calling before they accept the call—a critical bug that could potentially be used as a tool by malicious users to invade the privacy of others.

You don’t want a caller to hear you bitching them out for being inconsiderate by calling you instead of having the decency to send a text message.