A Geek With Guns

Chronicling the depravities of the State.

Archive for the ‘Technology’ Category

Fleeing Facebook

with one comment

Another election is on the horizon, which can only mean Facebook is clamping down on wrongthink in the futile hope that doing so will appease Congress enough that it won’t say mean things about the company that might hurt its stock price. This week’s clampdown appears to be more severe than previous ones. I have several friends who received temporary bans for making posts or comments that expressed apparently incorrect, albeit quite innocent, opinions. A lot of them also reported that some of their friends received permanent bans for posting similar content.

In the old days of the Internet when websites were dispersed you usually had friends from forums, game servers, and various instant messenger clients added on other services. Because of that, getting banned for any single account wasn’t usually a big deal. However, with the centralization that Facebook has brought, losing your Facebook account can mean losing access to a large number of your contacts.

If you are at risk of losing your Facebook account (and if you hold political views even slightly right of Karl Marx, you are), you need to start establishing your contacts on other services now. If you’re like me and have friends that predominantly lean more libertarian or anarchist, you’ve probably seen a number of services being recommended such as MeWe, Parler, and Gab. The problem with these services is that they, like Facebook, are centralized. That means one of two outcomes is likely. If they’re successful, they will likely decide to capitalize by going public. Once that happens, they will slowly devolve into what Facebook has become today because their shareholders will demand it in order to maximize share prices. If they’re not successful, they’ll likely disappear in the coming years, forcing you to reestablish all of your contacts on yet another service.

I’m going to recommend two services that will allow you to nip this problem in the bud permanently. The first is a chat service called Element (which was formerly known as Riot). The second is a Twitter-esque service called Mastodon. The reason I’m recommending these two services is because they share features that are critical if you want to actually socialize freely.

The most important feature is that both services can be self-hosted. This means that in the worst case scenario, if no existing servers will accept you and your friends, you can set up your own server. If you’re running your own server, the only people you have to answer to are yourselves. However, you may want to socialize with people outside of your existing friend groups. That’s where another feature called federation comes in. Federation is a feature that allows services on one server to connect with services on another server. This allows the users on one Element or Mastodon instance to socialize with users on another instance. Federation means not having to put all of your eggs in one basket. If you and your friends sign up on different servers, no one admin can ban you all. Moreover, you can set up backup accounts that your friends can add so that if you are banned on one server, your friends already have your alternate account in their contact lists.
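To give you a sense of what self-hosting actually involves: Element is a client for the Matrix protocol, and Matrix’s reference server, Synapse, can be run in a container on hardware you control. The following is a minimal sketch only; the domain is a placeholder and the exact options are assumptions on my part, so consult Synapse’s own documentation before deploying anything real.

```yaml
version: "3"
services:
  synapse:
    image: matrixdotorg/synapse:latest
    restart: unless-stopped
    environment:
      # Placeholder domain; use a domain you actually control.
      - SYNAPSE_SERVER_NAME=chat.example.com
      - SYNAPSE_REPORT_STATS=no
    volumes:
      # Synapse keeps its configuration and database here.
      - ./synapse-data:/data
    ports:
      - "8008:8008" # client and federation traffic (put TLS in front)
```

You’d still need to generate an initial configuration and put a TLS-terminating reverse proxy in front of it, but the point stands: the whole service fits on a box in your house.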

The reason I’m recommending two services is because Element and Mastodon offer different features that are geared towards different use cases. Element offers a similar experience to Internet Relay Chat (IRC) and various instant messenger protocols (such as Facebook Messenger). It works well if you and your friends want to have private conversations (you can create public chat rooms as well, if you want anybody to be able to join in the conversation). It also offers end-to-end encrypted chat rooms. End-to-end encrypted rooms cannot be surveilled by outside parties, meaning even the server administrators can’t spy on your conversation. It’s much harder for a server administrator to ban you and your friends if they’re entirely ignorant of your conversations.

Mastodon offers an experience similar to Twitter (although with more privacy oriented features). You can create public posts that can be viewed by anybody with a web browser and with which anybody with a Mastodon account can interact. This works great if you have a project that requires a public face. For example, you and your friends may work on an open source project about which you provide periodic public updates. Mastodon enables that. Users can also comment on posts, which allows your posts to act as a public forum. Since Mastodon can be self-hosted, you can also set up a private instance that isn’t federated. Thus you could create a private space for you and your friends.

It’s critical to establish your existing contacts on another service now so you don’t find yourself suddenly unable to communicate with them because you expressed the wrong opinion. Even if you don’t choose Element and/or Mastodon, pick a service that you and your friends can tolerate and at least sign up for accounts and add each other to your contact lists. That way if you disappear down Zuckerberg’s memory hole, you can still keep in contact with your friends.

Written by Christopher Burg

September 5th, 2020 at 9:17 am

Posted in Technology


Error Indicators of Limited Value

without comments

When I moved into this house, I decided to use UniFi gear for my entire network because I wanted to centrally manage it (I, like most people who work in the technology field, am lazy by nature). This house doesn’t have Ethernet running through the walls so I (again, being lazy) opted to rely on a mesh network for most of my networking needs. My mesh network consists of three UAP-AC-M access points.

Like most other people working in the technology field, I’ve been working from home since COVID-19 started making headlines. This means my in-person meetings have mostly been replaced by remote video conferences. My setup ran smoothly until a few weeks ago when I started experiencing a strange issue where I’d periodically lose my video conference feeds for 10 to 30 seconds. Since I first set up my mesh network, my UniFi Controller has reported a large number (as in several hundred per 24-hour period) of DHCP Timeout errors along with a handful of WPA Authentication Timeout errors. It also reported long access point association times for my two mesh nodes (the other node is wired to my switch). Searching Ubiquiti’s online support forum returned a lot of results for individuals experiencing these errors without any resolution. In fact, several comments made by Ubiquiti employees stated that the DHCP Timeout errors can be ignored so long as the network is performing well. I ended up ignoring the errors because at the time my network was performing well and nobody seemed to have a resolution to the errors.

I began looking into the problem again when the video conferencing problems I mentioned started to manifest. To make a long story short, I finally figured out my problem. UAP-AC-M access points use the 5 GHz spectrum for mesh communications, so they all operate on the same 5 GHz channel, but they’re expected to utilize different 2.4 GHz channels. My mesh nodes were set up to automatically select their 2.4 GHz and 5 GHz channels during boot up. I assumed this was safe because I boot them up in stages, one after the other. That should have caused them to see each other when they booted up and select different 2.4 GHz channels. According to my UniFi controller, all three non-overlapping 2.4 GHz channels (one, six, and 11 are the only channels that don’t overlap with other channels) were being utilized, so I assumed the access points were operating as I expected. After trying a few different settings, I decided to manually select the 2.4 GHz channels for my access points. I put one access point on channel one, one on channel six, and one on channel 11.
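The channel math behind that fix is simple enough to sketch in a few lines of Python. The 20 MHz signal width below is the usual rough approximation (real 802.11 spectral masks are a bit wider), so treat this as an illustration rather than a spectrum analysis.

```python
# Why 1, 6, and 11 are the standard non-overlapping 2.4 GHz channels:
# channel centers are only 5 MHz apart, but a Wi-Fi signal occupies
# roughly 20 MHz, so nearby channels bleed into each other.
SPACING_MHZ = 5
WIDTH_MHZ = 20  # rough approximation of an 802.11 channel width

def center_mhz(channel: int) -> int:
    """Center frequency of 2.4 GHz channels 1 through 13."""
    return 2407 + SPACING_MHZ * channel

def overlaps(a: int, b: int) -> bool:
    """True if the two channels' signals intersect."""
    return abs(center_mhz(a) - center_mhz(b)) < WIDTH_MHZ

print(overlaps(1, 2))  # adjacent channels collide: True
print(any(overlaps(a, b)
          for a in (1, 6, 11) for b in (1, 6, 11) if a != b))  # False
```

Channels one, six, and 11 sit 25 MHz apart, which is why assigning one of each to the three access points stopped them from stepping on each other.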

Since doing that I haven’t experienced any video conferencing problems. Moreover, my DHCP Timeout errors have dropped to almost nothing (I now experience between two and four per 24-hour period), the WPA Authentication Timeout errors have remained at one or two per 24-hour period, and I no longer see any errors about access points taking longer than expected to associate.

If you’re one of the many people experiencing a massive number of DHCP Timeout errors with UniFi access points and you haven’t already manually selected non-overlapping 2.4 GHz channels for your access points, give it a try. I will note that since I live in the country and there are no other visible Wi-Fi networks anywhere on my property, your experience may differ if you’re in an environment with a lot of competing Wi-Fi networks.

Written by Christopher Burg

August 3rd, 2020 at 6:00 am

Posted in Technology


The Way It Should Always Have Been

without comments

I received my PinePhone last week. The model I ordered was the UBports Community Edition. My initial thoughts on the phone are that the build quality is actually very solid, but otherwise it behaves like a $150 phone. The performance isn’t great, but acceptable; the battery life, which is a known issue, is pretty terrible; and the software is in a pretty rough state (easily beta quality, maybe even late alpha quality). All of these were what was promised and what I expected so none of this should be considered criticism. I’m actually impressed by what the manufacturers and software creators managed to pull off so far.

However, after playing with UBports I wanted to try some other operating systems. This is where the PinePhone shines since it doesn’t lock you into any specific operating system. The next release of the Community Edition of the PinePhone will come with postmarketOS so I loaded postmarketOS onto a microSD card (you can also flash it to the internal eMMC chip) and booted it on the phone. postmarketOS has a utility that builds an image for you. That utility also allows you to customize a number of things including using full-disk encryption (which I haven’t played with yet since it’s experimental) and choosing your user interface. I chose Phosh for the user interface because I wanted to see what the Librem team has been working on. My experience with postmarketOS was similar to UBports. Performance was sluggish, but acceptable, and the software is still in a rough state. However, postmarketOS makes it easy to install regular Linux desktop and command line applications so I installed and tried a few applications that I use regularly on the desktop. Unfortunately, most of the available graphical software doesn’t yet support screen scaling so applications are too big for the PinePhone’s screen. With that said, progress is being made in that direction and once more applications support screen scaling there should be a decent number of apps available.

Being able to boot up a different operating system on my phone is the way it should always have been. On my desktop and laptop computers I have always been able to choose what operating system to run, but my mobile devices have always been locked down. Some Android devices do allow you to unlock the boot loader and install a different Android image, but doing so often isn’t officially supported by the manufacturer (so it’s often a pain in the ass). It’s nice to finally see a mobile phone that is designed for tinkerers and people who want to actually own their hardware.

Written by Christopher Burg

June 30th, 2020 at 6:30 am

Posted in Technology


Mullvad VPN

without comments

Periodically I’m asked to recommend a good Virtual Private Network (VPN) provider. I admit that I don’t spend a ton of time researching VPN providers because my primary use case for VPNs is to access my local network and secure my communications when traveling so most of the time I use my own VPN server. When I want to guard my network traffic against my Internet Service Provider (ISP), I use Tor. With that said, I do try to keep at least one known decent VPN provider in my back pocket to recommend to friends.

In the past I have usually recommended Private Internet Access because it’s ubiquitous, affordable, and its claim that it doesn’t keep logs has been proven in court. However, Private Internet Access is based in the United States, which means it can be subject to National Security Letters (NSL). Moreover, Private Internet Access was recently acquired by Kape Technologies. Kape Technologies has a troubling past and you can never guarantee that a company will maintain the same policies after it has been purchased so I’ve been looking at some alternative recommendations.

Of the handful with which I experimented, I ended up liking Mullvad VPN the most. In fact I ended up really liking it (for me finding a decent VPN provider is usually an exercise in finding the least terrible option).

Mullvad is headquartered in Sweden, which means it’s not subject to NSLs or other draconian United States laws (it’s subject to Swedish laws, but I’m outside of that jurisdiction). But even if it’s subjected to some kind of surveillance law, Mullvad goes to great lengths to enable you to be anonymous, which greatly hinders its ability to surveil you. To start with, your account is just a pseudorandomly generated number. You don’t need to provide any identifiable information, not even an e-mail address. When you want to log in or pay for your account, you simply enter your number. The nice thing about this is that the number is also easily disposed of. Since you can generate a new account by simply clicking on a link, you can throw away your account whenever you want. You can even generate accounts via its onion service (this link will only work if you’re using the Tor Browser).
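The idea of an account that is nothing but a random number is easy to picture in code. To be clear, this is a toy sketch: the 16-digit format and the use of Python’s secrets module are my own assumptions for illustration, not a description of how Mullvad actually generates accounts server-side.

```python
import secrets

# Toy sketch of a numbers-only, throwaway account identity. The 16-digit
# length is an assumption for illustration; Mullvad's real implementation
# is server-side and unrelated to this code.
def generate_account_number(digits: int = 16) -> str:
    return "".join(secrets.choice("0123456789") for _ in range(digits))

token = generate_account_number()
print(len(token), token.isdigit())  # 16 True
```

Because there is nothing in the number tying it to you, discarding it and generating a fresh one costs you nothing but the remaining paid time on the old account.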

Mullvad’s pricing is €5 (roughly $5.50 when I last paid) per month. Paying per month allows you to change accounts every month if you want. Payments can be made using more traditional services such as credit cards and PayPal, but you can also use more anonymous payment options such as Bitcoin and Bitcoin Cash (I would like to see the option of using Monero since it has anonymity built-in).

The thing that initially motivated me to test Mullvad was the fact that it uses WireGuard. WireGuard is our new VPN overlord. If you’re new to WireGuard or less technically inclined, you can download and use Mullvad’s app. If you’re familiar with WireGuard or willing to learn about it, you can use Mullvad’s configuration file generator to generate WireGuard configuration files for your system (this is how I used it). Mullvad also supports OpenVPN, but I didn’t test it because it’s 2020 and WireGuard is our new VPN overlord.
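For reference, a WireGuard client configuration is just a short INI-style file, which is part of why the protocol is so pleasant to work with. Everything below is a placeholder (the keys, addresses, and endpoint are invented for illustration); Mullvad’s generator fills in the real values for you.

```ini
[Interface]
# Placeholder values throughout; a real config comes from the generator.
PrivateKey = <your-private-key>
Address = 10.64.0.2/32
DNS = 10.64.0.1

[Peer]
PublicKey = <server-public-key>
AllowedIPs = 0.0.0.0/0, ::/0   # route all traffic through the tunnel
Endpoint = vpn.example.net:51820
```

On Linux you’d typically save it as /etc/wireguard/wg0.conf and bring the tunnel up with `wg-quick up wg0`.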

Like most decent VPN providers, Mullvad also has a page to check if your Mullvad connection is set up correctly. It performs the usual tasks of reporting if you’re connecting through a Mullvad server and if your Domain Name System (DNS) requests are leaking. It also attempts to check if your browser is leaking information through WebRTC. You can also test your torrent client in case you want to download Linux distros (because that’s the only thing anybody downloads via BitTorrent) more securely.

I didn’t come across anything egregious with Mullvad, but don’t take my recommendation too seriously (this is the caveat I give to everybody who asks me to recommend a VPN provider). My VPN use case isn’t centered around maintaining anonymity and I didn’t perform thorough testing in that regard. Instead I tested it based on my use case, which is mostly protecting my connection from local actors when traveling. As with anything, you should test the service yourself.

Written by Christopher Burg

April 15th, 2020 at 6:00 am

The Users and the Used

without comments

I’m happy that computer technology (for the purpose of this post, I mean any device with a computer in it, not a traditional desktop or laptop) has become ubiquitous. An individual who wants a computer no longer has to buy a kit and solder it together. Instead they can go to the store and pick up a device that will be fully functional out of the box. This has led to a revolution in individual capabilities. Those of us who utilize computers can access a global communication network from almost anywhere using a device that fits in our pocket. We can crank out printed documents faster than at any other time in human history. We can collect data from any number of sources and use it to perform analysis that was impractical before ubiquitous access to computers. In summary, life is good.

However, the universe is an imperfect place and few things are without their downsides. The downside to the computer revolution is that there are, broadly speaking, different classes of users. They are often divided into technical and non-technical users, but I prefer to refer to them as users and used. My categorization isn’t so much based on technical ability (although there is a strong correlation) as on whether one is using their technology or being used by it.

Before I continue, I want to note that this categorization, like all attempts to categorize unique individuals, isn’t black and white. Most people will fall into the gray area in between the categories. The main question is whether they fall more towards the user category or the used.

It’s probably easiest to explain the used category first. The computing technology market is overflowing with cheap devices and free services. You can get a smartphone for little or even nothing from some carriers, an Internet connected doorbell for a pittance, and an e-mail account with practically unlimited storage for free. On the surface these look like amazing deals, but they come with a hidden cost. The manufacturers of those devices and providers of those services, being predominantly for-profit companies, are making their money in most cases by collecting your personal information and selling it to advertisers and government agencies (both of which are annoying, but the latter can be deadly). While you may think you’re using the technology you’re actually being used through it by the manufacturers and providers.

A user is the opposite. Instead of using technology that uses them, they use technology that they dominate. For example, Windows 10 was a free upgrade for users of previous versions of Windows. Not surprisingly, Windows 10 also collects a lot of personal information. Instead of using Windows 10, users of that operating system are being used by it. The opposite side of the spectrum is something like Linux from Scratch, where a user creates their own Linux distro from the ground up so they know every component that makes up their operating system. As I stated earlier most people fall into the gray area between the extremes. I predominantly run Fedora Linux on my systems. As far as I’m aware there is no included spyware and the developers aren’t otherwise making money by exploiting my use of the operating system. So it’s my system, I’m using it, not being used through it.

Another example that illustrates the user versus the used categories is online services. I sometimes think everybody on the planet has a Gmail account. Its popularity doesn’t surprise me. Gmail is a very good e-mail service. However, Gmail is primarily a mechanism for Google to collect information to sell to advertisers. People who use Gmail are really being used through it by Google. The opposite side of the spectrum (which is where I fall in this case) is self-hosting an e-mail server. I have a physical server in my house that runs an e-mail server that I set up and continue to maintain. I am using it rather than being used by it.

I noted earlier in this article that there is a strong correlation between technical people and users as well as non-technical people and those being used. It isn’t a one-to-one correlation though. I know people with little technical savvy who utilize products and services that aren’t using them. Oftentimes they have a technical friend who assists them (I’m often that friend), but not always. I would actually argue that the bigger correlation to users and those being used is those who are curious about technology versus those who aren’t. I know quite a few people with little technical savvy who are curious about technology. Their curiosity leads them to learn and they oftentimes become technically savvy in time. But before they do they often make use of technology rather than be used by it. They may buy a laptop to put Linux on it without having the slightest clue at first how to do it. They may set up a personal web server poorly, watch it get exploited, and then try again using what they learned from their mistakes. They may decide to use Signal instead of WhatsApp not because they understand the technical differences between the two but because they are curious about the “secure communications app” that their technical friends are always discussing.

Neither category is objectively better. Both involve trade-offs. I generally encourage people to move themselves more towards the user category though because it offers individuals more power over the tools they use and I’m a strong advocate for individual power. If you follow an even slightly radical philosophy though, I strongly suggest that you move towards the user category. The information being collected by those being used often finds its way into the hands of government agents and they are more than happy to make use of it to suppress dissidents.

Written by Christopher Burg

April 14th, 2020 at 6:00 am

Upgrading My Network

without comments

The network at my previous dwelling evolved over several years, which made it a hodgepodge of different gear. Before I moved out the final form of it was a Ubiquiti EdgeMax router, a Ubiquiti Edge Switch, and an Apple Airport Extreme (I got a good deal on it, but it was never something I recommended to people). When I bought my new house I decided to upgrade my network to Ubiquiti UniFi gear. For those who are unaware, UniFi gear fits into that niche between consumer and enterprise networking gear (it’s often touted as enterprise gear, but I have my doubts that it would work as well on a massive network spanning multiple locations as more traditional enterprise gear) often referred to as prosumer or SOHO (Small Office/Home Office).

Because I live out in the boonies, my Internet connection is pretty lackluster so I opted for a Security Gateway 3P for my router (it’s generally agreed that the hardware is too slow to keep up with the demands of many modern Internet connections, but I don’t have to worry about that). If I had built a new house, I’d have put Ethernet drops in every room, but I bought a preexisting house with no Ethernet drops, which meant Wi-Fi was going to be my primary form of network connectivity. I still needed Ethernet connections for my servers though so I opted for a 24-port switch as my backbone and UAP-AC-M access points for Wi-Fi. The UAP-AC-M access points provide mesh networking, which is nice in a house without Ethernet drops because you can extend your Wi-Fi network by connecting new access points to already installed access points. Moreover, they’re rated for outdoor use so I can use them to extend my Wi-Fi network across my property.

A UniFi network is really a software defined network, which means that there is a central controller that you enter your configuration information into and it pushes the required settings out to the appropriate devices. Ubiquiti provides the Cloud Key as a hardware controller, but I already have virtual machine hosts aplenty so I decided to set up a UniFi Controller in a virtual machine.
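If you want to go the same route without dedicating a whole virtual machine to it, the controller also runs fine in a container. The sketch below uses a community-maintained image, and the image name and port list are assumptions on my part; check the image’s own documentation (and Ubiquiti’s list of required ports) before relying on it.

```yaml
version: "3"
services:
  unifi-controller:
    image: linuxserver/unifi-controller
    restart: unless-stopped
    volumes:
      - ./unifi-config:/config   # controller database and settings
    ports:
      - "8443:8443"       # controller web interface
      - "8080:8080"       # device-to-controller communication
      - "3478:3478/udp"   # STUN
```

Either way, the important part is that the controller lives somewhere always-on so your devices can keep reporting statistics to it.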

Previously I was resistant to the idea of needing a dedicated controller for my network. However, after experiencing software defined networking, I don’t think I could ever go back. Making a single change in one location and having that change propagated out to my entire network is a huge time saver. For example, I decided that I wanted to set up a guest Wi-Fi network. Without a central controller this would have required me to log into the web interface of each access point and enter the new guest network configuration. With a software defined network I merely add the new guest network configuration into my UniFi Controller and it pushes that configuration to each of my access points. If I want to change the Wi-Fi Protected Access (WPA) password for one of my wireless networks, I can change it in the UniFi Controller and each access point will receive the update.

The UniFi Controller also provides a lot of valuable information. I initially set up my wireless network with two access points, but the statistics in the UniFi Controller indicated that my wireless coverage wasn’t great in the bedroom, was barely available on my three season porch, and was entirely unavailable out by my fire pit. I purchased a third access point and rearranged the other two and now have excellent Wi-Fi coverage everywhere I want it. While I could have gathered the same information on a network without a centralized controller by logging into each access point individually, it would have been a pain in the ass. The UniFi Controller also allows you to upload the floor plan of your home and it will show you the expected Wi-Fi coverage based on where you place your access points. I haven’t used that feature yet (I need to create the floor plan in a format that the controller can use), but I plan on playing with it in the future.

Overall the investment into more expensive UniFi gear has been worth it to me. However, most people probably don’t need to spend so much money on their home network. I know many people are able to do everything they want using nothing more than the all-in-one modem/switch/Wi-Fi access point provided by their Internet Service Provider (admittedly I don’t trust such devices and always place them outside of my network’s firewall). But if you need to set up a network that is more complex than the average home network, UniFi gear is something to consider.

Written by Christopher Burg

April 13th, 2020 at 9:41 pm

Posted in Technology


The Importance of Open Platforms

without comments

Late last week I pre-ordered the UBports Community Edition PinePhone. It’s not ready for prime time yet. Neither of the cameras works, the battery life from what I’ve read is around four to five hours, and there are few applications available at the moment. So why did I pre-order it? Because UBports has been improving rapidly, my iPhone is the last closed platform I run regularly (I keep one macOS machine running mostly so I can backup my iPhone to it), and open platforms may soon be our only option for secure communications:

Signal is warning that an anti-encryption bill circulating in Congress could force the private messaging app to pull out of the US market.

Since the start of the coronavirus pandemic, the free app, which offers end-to-end encryption, has seen a surge in traffic. But on Wednesday, the nonprofit behind the app published a blog post, raising the alarm around the EARN IT Act. “At a time when more people than ever are benefiting from these (encryption) protections, the EARN IT bill proposed by the Senate Judiciary Committee threatens to put them at risk,” Signal developer Joshua Lund wrote in the post.

I used Signal as an example for this post, but when (it’s not a matter of if, it’s a matter of when) the government legally mandates cryptographic back doors in consumer products (you know the law will have an exception for products sold to the government), every secure communication application and platform will either have to stop being made available in the United States or have to insert a back door that gives government agents, and anybody else who can crack it, complete access to our data.

On an open platform such as Linux this isn’t the end of the world. I can source both my operating system and my applications from anywhere. If secure communication applications are made illegal in the United States, I have the option of downloading and using an application made in a freer area or, better yet, one developed anonymously (it’s much harder to enforce these laws if the government can’t identify and locate the developers). Closed platforms such as iOS and Android (although Android to a lesser extent since it still allows side loading of applications and you can download an image built off of the Android Open Source Project) require you to download software from their walled garden app stores. If Signal is no longer legally available in the United States, people running iOS and Android will no longer be able to use Signal because those apps will no longer be available in the respective United States app stores.

As the governments of the world continue to take our so-called civil rights behind a shed and unceremoniously put a bullet in their heads, closed platforms will continue to become more of a liability. Open platforms on the other hand can be developed by anybody anywhere. They can even be developed anonymously (Bitcoin is probably the most successful example of a project whose initial developer remains anonymous), which makes it difficult for governments to put pressure on the developers to comply with laws.

If you want to ensure your ability to communicate securely in the future and you haven’t already transitioned to open platforms, you should either begin your transition or at least begin to plan your transition. Not all of the pieces are ready yet. Smartphones remain one area where open platforms are lagging behind, but there is a roadmap available so you can at least begin planning a move towards an open smartphone (and at $150 the PinePhone is a pretty low risk platform to try).

Written by Christopher Burg

April 13th, 2020 at 6:00 am

Don’t Use Zoom

with 3 comments

With most of the country turned into a prison by stay-at-home orders, people are turning to video conferencing software to socialize. With all of the available options out there, somehow the worst possible option has become the most popular (which seems like the overarching theme to our current crises). Zoom appears to have become the most popular video conferencing software for people imprisoned in their homes.

Don’t use Zoom.

Why? First, the company uses misleading marketing. If you’ve seen some of the company’s marketing, you might be under the mistaken impression Zoom video conferences are end-to-end encrypted. They’re not. But that’s the tip of the iceberg. A while back Zoom pulled a rather sneaky maneuver and installed a secret web server on Macs, which was supposedly meant to make using the software easier for Safari users (the claim was bullshit). Apple wasn’t amused and removed the software via an update. Zoom did remove that functionality, but the software still had surprises in store for Mac users. It turns out that it contained a security vulnerability that allowed a remote attacker to access the computer’s webcam and microphone… oh and provided them with root access. Don’t worry, Windows users, Zoom didn’t forget about you. The Windows version of Zoom contained a vulnerability that allowed attackers to steal system passwords. And so everybody could suffer equally, Zoom made it easy for randos to join supposedly private video conferences.

I’m not even done yet. Zoom also leaked users’ e-mail addresses and photos to randos and, until it was caught, was also selling personal data to Facebook.

So I reiterate, don’t use Zoom.

Written by Christopher Burg

April 2nd, 2020 at 6:00 am

Posted in Technology


Alternate Social Media Project Part 1: Riot.im

without comments

When I announced that I was cutting back on blogging, I explained that it was so I could focus my energy on other projects. One of those projects, which I’ve dubbed the Alternate Social Media Project (ASMP), has been replacing the social media functionality provided by Facebook. Why? Because Facebook has become not only a total invasion of privacy (which most people apparently don’t give two shits about) but also an increasingly useless platform for anybody with beliefs that aren’t state approved (which people seem to care about when they find themselves being censored by Facebook’s administrators). Rather than demand that the government step in and force Facebook to run its operations in the manner I approve, I decided it would be easier to just move somewhere freer.

This project is occurring in steps. The first step was to find something to fulfill the primary use of social media: communication. My requirements were modest. The solution upon which I settled had to be decentralized, fully usable on mobile platforms, and offer the option of secure communications. I settled on Riot.im since it was one of the few decent options that met those requirements.

Riot.im is the reference client for the Matrix protocol. The Matrix protocol is, basically, an evolution of Internet Relay Chat (IRC). Unlike other attempts to improve on IRC, Matrix is also federated, which means anybody can run a server and those servers can communicate with one another. Facebook demonstrates the importance of federation. If you express wrongthink on Facebook, you risk being exiled. If you express wrongthink on a Matrix server, you risk being exiled from that specific server, but you can migrate over to another server, possibly your own server (where you can express all the wrongthink your heart desires). So long as the new server you’re on is federated with the servers your friends are on, you can continue your conversations.
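If you do stand up your own server, you can verify that it federates correctly. As a hedged sketch, the public matrix.org federation tester exposes a simple report endpoint (the server name below is a placeholder, not a real homeserver):

```shell
#!/bin/sh
# Build the URL for the matrix.org federation tester. The server name
# here is an illustrative placeholder; substitute your own homeserver.
server="example.com"
url="https://federationtester.matrix.org/api/report?server_name=${server}"
echo "Would query: ${url}"
# Uncomment to actually run the check (requires curl and network access):
# curl -s "$url"
```

A passing report means other homeservers, and therefore your friends on them, can reach rooms on yours.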

Unlike IRC and many other older communication protocols (XMPP comes to mind), Riot.im works well on mobile devices. Android and iOS like to kill apps in the background and when those apps are killed, all of their active network connections die with them. With IRC this means you have no idea what is going on in the room until you open the app and reconnect. Riot.im, on the other hand, will work like other modern communication tools when your app isn’t running. When activity happens in one of the rooms of which you’re a member, you will receive notifications (unless you disable those notifications). If something piques your interest, you can open the app and jump into the conversation. My previous attempts to migrate friends to other platforms were thwarted because none of them were willing to use something that didn’t play well with mobile. I’m happy to say that Riot.im doesn’t suffer from that shortcoming.

Riot.im fulfills the third criterion by offering the option of end-to-end encryption. Matrix has no concept of direct messages as far as I can tell. When you want to communicate privately with somebody, you’re placed in a private room with them. If you want your communications to be private, you can turn encryption on in the room. Another nice feature is that once encryption is enabled in a room, it cannot be disabled. This setup, although potentially confusing to some people, has two nice features. The first is that it enables any room to be encrypted. You and your friends can set up an encrypted room where you can express wrongthink without the server administrators being able to see it (unless you invite them into your room). The second is that you don’t have to worry about somebody secretly turning encryption off at a future point (and thus exposing your wrongthink to outsiders).

Riot.im obviously isn’t a replacement for Facebook. At most it’s a replacement for Facebook Messenger. Since everything on Riot.im occurs in a chatroom, it’s not as easy to have a conversation about a linked article and there is no way to accrue imaginary Internet points like you can with Facebook’s reactions. However, I’m not actually a fan of services that try to do everything. It’s too difficult to replace individual parts when something better rolls around or an update to the current tool makes it unusable.

If you’re interested in migrating off of Facebook or other restrictive social media platforms, you could do worse than starting with Riot.im.

Written by Christopher Burg

March 19th, 2019 at 10:00 am

Linux on a 2010 Mac Mini Part Two

without comments

Last week I mentioned my adventure of installing Linux on a 2010 Mac Mini. Although Ubuntu 18.10 did install and was working for a few days, an update left the system unusable. After an update towards the end of last week the system would only boot to a black screen. From what I gathered online, I wasn’t the only person who ran into this problem. Anyway, I ended up digging into the matter further.

I once again tried installing Fedora. When I tried to install Fedora 29, I was unable to stop it from booting to a black screen, so I decided to try Fedora 28. Using basic graphics mode, I was able to get Fedora 28 to boot to the live environment and from there install Fedora on the Mac Mini. After installation I was able to get my Fedora installation to boot. However, when I tried to install the Nvidia driver from RPM Fusion, the system would only boot to a black screen afterwards. I tried installing the Nvidia driver via the negativo17 repository but didn’t expect it to work since the driver distributed from that repository is based on version 418 and the last driver to support the Mac Mini’s GeForce 320M was version 340. Things went as expected. I then tried installing the Nvidia driver manually using a patched version of the 340 driver from here. Unfortunately, that driver doesn’t work with the 4.20 kernel, so that was a no-go as well.

The reason I hadn’t tried to install the Nvidia driver manually before was because I didn’t want to deal with supporting the setup in the future. As I was trying to install it using the previously linked instructions, I felt justified because the guide isn’t nearly as straightforward as installing the driver from a repository. It became a moot point since manual installation didn’t work, but it did make me think about the fact that any solution I settled upon would need to be maintained, which led me to the idea of using Ubuntu 18.04 LTS. The LTS versions of Ubuntu are supported by Canonical for five years, so if I could get 18.04 installed, the setup would have a decent chance of working for five years.

After passing the kernel the “nouveau.modeset=0” argument, just as I had to do with 18.10, I was able to boot into a live environment and install 18.04 to the hard drive. Likewise, I had to use the “nouveau.modeset=0” argument to boot into the installation. Once I was booted into the installation I was able to use “sudo apt install nvidia-340” to install the 340 version of the Nvidia driver. After rebooting everything worked properly. I’m hoping that future updates will be less likely to break this setup since the LTS releases of Ubuntu tend to be more stable than non-LTS versions.
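To make the “nouveau.modeset=0” argument survive reboots, it needs to end up on the kernel command line in GRUB’s configuration. The sketch below operates on a temporary copy of a grub defaults file so it is safe to run as-is; on the real system you would edit /etc/default/grub and then regenerate the config:

```shell
#!/bin/sh
# Sketch: append nouveau.modeset=0 to GRUB_CMDLINE_LINUX_DEFAULT.
# Works on a temp copy here; on a real system, edit /etc/default/grub.
work=$(mktemp)
cat > "$work" <<'EOF'
GRUB_CMDLINE_LINUX_DEFAULT="quiet splash"
EOF
# Append the argument inside the existing quotes.
sed -i 's/^GRUB_CMDLINE_LINUX_DEFAULT="\(.*\)"/GRUB_CMDLINE_LINUX_DEFAULT="\1 nouveau.modeset=0"/' "$work"
cat "$work"
# On the actual Mac Mini you would then run:
# sudo update-grub
# sudo apt install nvidia-340
```

Once the Nvidia 340 driver is installed and loading, the nouveau blacklisting largely takes care of itself, but keeping the modeset argument in place avoids the black screen during early boot.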

So, yeah, if you want to get a currently supported Linux distro running on a 2010 Mac Mini, take a look at Ubuntu 18.04. It might be your best bet (if it continues to run properly for the next month or so, I’ll say it is your best bet).

Written by Christopher Burg

March 4th, 2019 at 10:00 am

Posted in Technology
