A Geek With Guns

Chronicling the depravities of the State.

Archive for the ‘Apple’ tag

You Are Responsible for Your Own Security


One of the advertised advantages of Apple’s iOS platform is that all software loaded onto iOS devices has to be verified by Apple. This so-called walled garden is meant to keep the bad guys out. However, anybody who studies military history quickly learns that sitting behind a wall is usually a death sentence. Eventually the enemy breaches the wall. Enemies have breached Apple’s walls before and they continue to do so:

In a blog post entitled “Location Monetization in iOS Apps,” the Guardian team detailed 24 applications from the Apple iOS App Store that pushed data to 12 different “location-data monetization firms”—companies that collect precise location data from application users for profit. The 24 identified applications were found in a random sampling of the App Store’s top free applications, so there are likely many more apps for iOS surreptitiously selling user location data. Additionally, the Guardian team confirmed that one data-mining service was connected with apps from over 100 local broadcasters owned by companies such as Sinclair, Tribune Broadcasting, Fox, and Nexstar Media.
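For context, those apps can only collect coordinates after the user taps “Allow” on the standard location prompt. Here is a minimal Swift sketch of the request an app makes (the class name is my own illustration; CLLocationManager and its delegate callback are the real Core Location API):

    import CoreLocation

    // Hypothetical example showing the standard location permission flow.
    final class LocationPermissionExample: NSObject, CLLocationManagerDelegate {
        private let manager = CLLocationManager()

        func requestAccess() {
            manager.delegate = self
            // Triggers the system prompt; the app must also declare
            // NSLocationWhenInUseUsageDescription in its Info.plist.
            manager.requestWhenInUseAuthorization()
        }

        func locationManager(_ manager: CLLocationManager,
                             didChangeAuthorization status: CLAuthorizationStatus) {
            switch status {
            case .authorizedWhenInUse, .authorizedAlways:
                manager.startUpdatingLocation()
            default:
                // Denied or restricted: the app never receives coordinates.
                break
            }
        }
    }

Everything that happens after that tap is up to the app, which is exactly the problem the Guardian team documented.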

iOS has a good permission system and users can prevent apps from accessing location information, but far too many people are willing to grant location access to any application that asks. If a walled garden were perfectly secure, users wouldn’t have to worry about granting unnecessary permissions because the wall guards wouldn’t allow anything malicious inside. Unfortunately, the wall guards aren’t perfect and malicious stuff does get through, which brings me to my second point.

What happens when a malicious app manages to breach Apple’s walled garden? Ideally it should be immediately removed but the universe isn’t ideal:

Adware Doctor is a top app in Apple’s Mac App Store, sitting at number five in the list of top paid apps and leading the list of top utilities apps, as of writing. It says it’s meant to prevent “malware and malicious files from infecting your Mac” and claims to be one of the best apps to do so, but unbeknownst to its users, it’s also stealing their browser history and downloading it to servers in China.

In fairness to Apple, the company did eventually remove Adware Doctor from its app store. Eventually is the keyword though. How many other malicious apps have breached Apple’s walled garden? How long do they hide inside the garden before being discovered, and how quickly do the guards remove them once they are? Apparently Apple’s guards can be a bit slow to react.

Even in a walled garden you are responsible for your own security. You need to know how to defend yourself in case a bad guy manages to get inside of the defensive walls.

Written by Christopher Burg

September 11th, 2018 at 10:30 am

Posted in Technology


Designed by Apple in California


Designed by Apple in California is a tagline the company uses to add a little prestige to its Chinese-manufactured electronics. In addition to designing electronics, the company also designs its own stores. However, when people in California design stores they often overlook environmental issues that are rare there but common elsewhere, such as ice and snow:

Apple’s new flagship retail store in Chicago, the one with a MacBook-shaped rooftop, is nothing short of an architectural marvel. At least, that’s how some news reports put it when the store opened back in October. Beyond standing out among the less inspired buildings of the downtown Chicago area, the new Apple Store also happens to be very poorly thought through considering its thin roof now has dangerous icicles hanging perilously over public walkways.

Designed by Apple in a state that doesn’t have to deal with arctic bullshit. As a Minnesotan I can’t help but laugh at this.

Apple isn’t the first company to run into this problem and it won’t be the last. It’s too easy to take architecture for granted. An architect in California can easily overlook the effects harsh winters will have on their building. An architect in Minnesota can easily overlook the effects earthquakes will have on their building. If you’re tasked with designing a building that will be built in another region, it might be a good idea to contact some architects in that area and ask them about environmental issues they have to design around.

Written by Christopher Burg

December 29th, 2017 at 10:00 am

Physical Access Isn’t Necessarily Game Over


I swear Apple fanboys are some of the dumbest people on the planet. Quite a few of them have been saying, “If an attacker has physical access, it’s game over anyways,” as if that statement makes the root user exploit recently discovered in High Sierra a nonissue.

At one time that statement was true. However, today physical access is not necessarily game over. Look at all of the trouble the Federal Bureau of Investigation (FBI) has had accessing iOS devices. The security model of iOS actually takes physical access into account as part of its threat modeling and has mechanisms to preserve the integrity of the data contained on the device. iOS requires all code to be signed before it will install or run it, which makes it difficult, although far from impossible, to insert malicious software onto iOS devices. More importantly, iOS encrypts all of the data stored in flash memory by default. Fully encrypted disks protect against physical access both by preventing an attacker from getting any usable data off the disk and by preventing them from altering the data on it (such as writing malware directly to the disk).
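On top of that full-disk encryption, individual apps can opt files into even stronger protection classes tied to the user’s passcode. A minimal Swift sketch, assuming the app already has a file URL to write to (the helper function is my own; the .completeFileProtection option is the real Foundation API):

    import Foundation

    // Hypothetical helper: write data so it is only readable while the
    // device is unlocked; the per-file key becomes unavailable shortly
    // after the device locks.
    func saveProtected(_ data: Data, to url: URL) throws {
        try data.write(to: url, options: [.atomic, .completeFileProtection])
    }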

macOS has a boot mode called single user mode, which boots the computer to a root command prompt. However, if a firmware password is set, single user mode cannot be started without entering that password. The firmware password can be reset on machines with removable RAM (resetting it requires changing the amount of RAM connected to the mainboard), but most of Apple’s modern computers, some iMacs being the exception, have RAM modules soldered to the mainboard.

Physical access is especially dangerous because it allows an attacker to insert malicious hardware, such as a key logger, that would let them record everything you type, including your passwords. However, that kind of attack requires some amount of sophistication and time (at least if you want the malicious hardware to be difficult to detect), which is where the real problem with High Sierra’s root exploit comes in. The root exploit required no sophistication whatsoever. Gaining root access only required physical access (or remote access if certain services were enabled) to an unlocked Mac for a few seconds. So long as an attacker had enough time to open System Preferences, click one of the lock icons, and type in “root” for the user name a few times, they had complete access to the machine (and from there they could turn on remote access capabilities to maintain that access).

Writing off this exploit as a nonissue because it requires physical access takes willful ignorance of both modern security features that defend against attackers with physical access and the concept of severity (an unsophisticated attack can be far more severe than a time-consuming, sophisticated one under certain threat models).

Written by Christopher Burg

December 1st, 2017 at 11:00 am

The Fix for High Sierra’s Embarrassing Privilege Escalation Bug and the Fix for the Fix


Apple has already released a fix for its embarrassing privilege escalation bug. If you haven’t already, open the App Store, go to Updates, and install Security Update 2017-001. However, after installing that update you may notice that file sharing no longer works. To fix this problem, perform the following steps:

  1. Open the Terminal app, which is in the Utilities folder of your Applications folder.
  2. Type sudo /usr/libexec/configureLocalKDC and press Return.
  3. Enter your administrator password and press Return.
  4. Quit the Terminal app.

In conclusion High Sierra is still a steaming pile of shit and you should stick to Sierra if you can.

Written by Christopher Burg

November 30th, 2017 at 11:00 am

macOS High Sierra is Still Terrible


macOS High Sierra may go down in the history books as Apple’s worst release of macOS since the initial one. Swapping the graphical user interface over to the Metal API wasn’t a smooth transition to say the least, but the real mess is security. There was a bug where a user’s password could be displayed in the password hint field, so a malicious user only needed to enter the password incorrectly to trigger the hint and read it. But yesterday it was revealed that the root account, which is normally disabled entirely, could be activated in High Sierra by simply typing root into the user name field in System Preferences:

The bug, discovered by developer Lemi Ergin, lets anyone log into an admin account using the username “root” with no password. This works when attempting to access an administrator’s account on an unlocked Mac, and it also provides access at the login screen of a locked Mac.

The only good news is that you can defend against this bug by enabling the root account and giving it a password.

The security mistakes in High Sierra are incredibly amateur. Automated regression testing should have caught both the password hint mistake and this root account mistake. I can only assume that Apple’s quality assurance department took the year off because both High Sierra and iOS 11 are buggy messes that should never have been released in the states they were released in.

Written by Christopher Burg

November 29th, 2017 at 10:00 am

When You’re Trying to Be Very Smart™ but End Up Looking Stupid


The announcement of the iPhone X was one of the biggest product announcements of the year. Not only is it the latest iPhone, which always captures headlines, but it includes a new facial recognition feature dubbed Face ID. With the popularity of the iPhone it’s inevitable that politicians will try to latch onto it to capture some headlines of their own. Al Franken, one of Minnesota’s congress critters, did just that by expressing concern about the privacy implications of the Face ID feature. This may appear to have been a smart political maneuver, but the senator only managed to make himself appear illiterate since Apple had already published all of the technical information about Face ID:

Apple has responded to Senator Al Franken’s concerns over the privacy implications of its Face ID feature, which is set to debut on the iPhone X next month. In his letter to Tim Cook, Franken asked about customer security, third-party access to data (including requests by law enforcement), and whether the tech could recognize a diverse set of faces.

In its response, Apple indicates that it’s already detailed the tech in a white paper and Knowledge Base article — which provides answers to “all of the questions you raise”. But, it also offers a recap of the feature regardless (a TL:DR, if you will). Apple reiterates that the chance of a random person unlocking your phone is one in a million (in comparison to one in 500,000 for Touch ID). And, it claims that after five unsuccessful scans, a passcode is required to access your iPhone.
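For what it’s worth, the passcode fallback described above is the same behavior third-party apps get through Apple’s LocalAuthentication framework. A minimal Swift sketch (the reason string is made up):

    import LocalAuthentication

    let context = LAContext()
    var error: NSError?

    // .deviceOwnerAuthentication tries Face ID or Touch ID first and then
    // falls back to the device passcode, mirroring the behavior Apple describes.
    if context.canEvaluatePolicy(.deviceOwnerAuthentication, error: &error) {
        context.evaluatePolicy(.deviceOwnerAuthentication,
                               localizedReason: "Unlock your notes") { success, _ in
            if success {
                // Face ID matched or the passcode was entered correctly.
            } else {
                // Authentication failed or was cancelled.
            }
        }
    }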

Franken should feel fortunate that Apple even bothered entertaining his concerns. Were I Tim Cook I would have directed a member of my staff to send Franken links to the technical publications with a request to have a member of his staff read them to him and not bothered giving him a TL;DR. After all, Apple’s time is worth far more money than Franken’s since it’s actually producing products and services that people want instead of being a parasite feeding off of stolen money.

Still I admit that it was pretty funny seeing Franken make an ass of himself yet again.

Written by Christopher Burg

October 19th, 2017 at 11:00 am

APFS and FileVault


Apple released macOS High Sierra yesterday. Amongst other changes, High Sierra includes the new Apple File System (APFS), which replaces the decades-old Hierarchical File System (HFS). When you install High Sierra, at least if your boot drive is a Solid State Drive (SSD), the file system is supposed to be converted to APFS automatically. Although Apple’s website says that FileVault-encrypted drives will be converted automatically, it doesn’t give any details.

I installed High Sierra on two of my systems last night. One was a 2012 MacBook Pro and the other was a 2010 Mac Mini. Both contain Crucial SSDs. Since they’re third-party SSDs, I wasn’t sure whether High Sierra would convert them automatically. I’m happy to report that both were converted, and that FileVault didn’t throw a wrench into the conversion. I was worried that converting a FileVault-encrypted drive would require copying files from one encrypted container to a new one, but that wasn’t necessary.

If you’re installing High Sierra on a FileVault-encrypted drive, the conversion from HFS to APFS won’t take noticeably longer.

Written by Christopher Burg

September 26th, 2017 at 10:00 am

iOS 11 Makes It More Difficult for Police to Access Your Device


One reason I prefer iOS over Android is that Apple has invested more heavily in security than Google has. Part of this comes from the fact that Apple controls both the hardware and the software, so it can implement hardware security features such as its Secure Enclave chip, whereas the hardware security features available on an Android device depend largely on the manufacturer. However, even the best security models have holes in them.

Some of those holes are due to improperly implemented features while others are due to legalities. For example, here in the United States law enforcers have a lot of leeway in what they can do. One thing that has become more popular, especially at the border, is the use of devices that copy data from smartphones. This has been relatively easy to do on Apple devices if the user unlocks the screen because trusting a new connection has only required tapping a button. That will change in iOS 11:

For the mobile forensic specialist, one of the most compelling changes in iOS 11 is the new way to establish trust relationship between the iOS device and the computer. In previous versions of the system (which includes iOS 8.x through iOS 10.x), establishing trusted relationship only required confirming the “Trust this computer?” prompt on the device screen. Notably, one still had to unlock the device in order to access the prompt; however, fingerprint unlock would work perfectly for this purpose. iOS 11 modifies this behaviour by requiring an additional second step after the initial “Trust this computer?” prompt has been confirmed. During the second step, the device will ask to enter the passcode in order to complete pairing. This in turn requires forensic experts to know the passcode; Touch ID alone can no longer be used to unlock the device and perform logical acquisition.

Moreover, Apple has also included a way for users to quickly disable the fingerprint sensor:

In iOS 11, Apple has added an new emergency feature designed to give users an intuitive way to call emergency by simply pressing the Power button five times in rapid succession. As it turns out, this SOS mode not only allows quickly calling an emergency number, but also disables Touch ID.

These two features appear to be aimed at keeping law enforcers accountable. Under the legal framework of the United States, a police officer can compel you to provide your fingerprint to unlock your device but compelling you to provide a password is still murky territory. Some courts have ruled that law enforcers can compel you to provide your password while others have not. This murky legal territory offers far better protection than the universal ruling that you can be compelled to provide your fingerprint.

Even if you are unable to disable the fingerprint sensor on your phone, law enforcers will still be unable to copy the data on your phone without your password.

Written by Christopher Burg

September 15th, 2017 at 11:00 am

It’s Not Your Data When It’s in The Cloud


I’ve annoyed a great many electrons writing about the dangers of using other people’s computers (i.e. “the cloud”) to store personal information. Most of the time I’ve focused on the threat of government surveillance. If your data is stored on somebody else’s computer, a subpoena is all that is needed for law enforcers to obtain it. However, law enforcers aren’t the only threat when it comes to “the cloud.” Whoever is storing your data, unless you’ve encrypted it in a way that makes it inaccessible to others before you uploaded it, has access to it, which means that their employees could steal it:

Chinese authorities say they have uncovered a massive underground operation involving the sale of Apple users’ personal data.

Twenty-two people have been detained on suspicion of infringing individuals’ privacy and illegally obtaining their digital personal information, according to a statement Wednesday from police in southern Zhejiang province.

Of the 22 suspects, 20 were employees of an Apple “domestic direct sales company and outsourcing company”.

This story is a valuable lesson and warning. Apple has spent a great deal of time developing a reputation for guarding the privacy of its users. But data uploaded to its iCloud service is normally stored in a form Apple can read, so while a third party may not be able to intercept it en route, at least some of Apple’s employees have access to it.

The only way to guard your data from becoming public is to either keep it exclusively on your machines or encrypt it, before uploading it to “the cloud,” in such a way that third parties cannot access it.
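If you do have to put data on somebody else’s computer, encrypting it first is straightforward. A minimal Swift sketch using Apple’s CryptoKit framework (the function names are mine; generating, storing, and backing up the key are left to you):

    import CryptoKit
    import Foundation

    // Encrypt locally with a key that never leaves your machine, then upload
    // only the ciphertext; the provider stores bytes it cannot read.
    func sealForUpload(_ plaintext: Data, using key: SymmetricKey) throws -> Data {
        let box = try AES.GCM.seal(plaintext, using: key)
        return box.combined!  // nonce + ciphertext + tag; non-nil for the default nonce size
    }

    func openAfterDownload(_ blob: Data, using key: SymmetricKey) throws -> Data {
        let box = try AES.GCM.SealedBox(combined: blob)
        return try AES.GCM.open(box, using: key)
    }

    // Example: let key = SymmetricKey(size: .bits256)  // keep it in the local keychain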

Written by Christopher Burg

June 9th, 2017 at 10:00 am

Paranoia I Appreciate


My first Apple product was a PowerBook G4 that I purchased back in college. At the time I was looking for a laptop that could run a Unix operating system. Back then (as is still the case today, albeit to a lesser extent) running Linux on a laptop meant you usually had to give up sleep mode, Wi-Fi, the additional function buttons most manufacturers added to their keyboards, and a slew of power management features, which made the already pathetic battery life even worse. Since OS X was (and still is) Unix based and didn’t involve the headaches of trying to get Linux running on a laptop, the PowerBook fit my needs perfectly.

Fast forward to today. Between then and now I’ve lost confidence in a lot of companies whose products I used to love. Apple, on the other hand, has continued to impress me. In recent times my preference for Apple products has been influenced in part by the fact that the company doesn’t rely on selling my personal information to make money and displays a healthy level of paranoia:

Apple has begun designing its own servers partly because of suspicions that hardware is being intercepted before it gets delivered to Apple, according to a report yesterday from The Information.

“Apple has long suspected that servers it ordered from the traditional supply chain were intercepted during shipping, with additional chips and firmware added to them by unknown third parties in order to make them vulnerable to infiltration, according to a person familiar with the matter,” the report said. “At one point, Apple even assigned people to take photographs of motherboards and annotate the function of each chip, explaining why it was supposed to be there. Building its own servers with motherboards it designed would be the most surefire way for Apple to prevent unauthorized snooping via extra chips.”

Anybody who has been paying attention to the leaks released by Edward Snowden knows that concerns about surveillance hardware being added to off-the-shelf products aren’t unfounded. In fact, some companies, such as Cisco, have taken measures to mitigate such threats.

Apple has a lot of hardware manufacturing capacity and it appears that the company will be using it to further protect itself against surveillance by manufacturing its own servers.

This is a level of paranoia I can appreciate. Years ago I brought a lot of my infrastructure in house. My e-mail, calendar and contact syncing, and even this website are all hosted on servers running in my dwelling. Although part of the reason I did this was for the experience, another reason was to guard against certain forms of surveillance. National Security Letters (NSLs), for example, require service providers to surrender customer information to the State and legally prohibit them from informing the targeted customer. Since my servers are sitting in my dwelling, any NSL would necessarily require me to inform myself of receiving it.

Written by Christopher Burg

March 25th, 2016 at 10:00 am