Ready for the mobile security news that IT doesn’t want to hear about but needs to? When security firm Positive Technologies started pen testing various mobile apps, security holes were rampant.
We’ll plunge into the details momentarily, but here’s the upshot: “High-risk vulnerabilities were found in 38 percent of mobile applications for iOS and in 43 percent of Android applications” and “most cases are caused by weaknesses in security mechanisms (74 percent and 57 percent for iOS and Android apps, respectively, and 42 percent for server-side components). Because such vulnerabilities creep in during the design stage, fixing them requires significant changes to code.”
Here’s the most frightening line, and it’s frightening because it means there is no easy enterprise IT fix: “Risks do not necessarily result from any one particular vulnerability on the client or server side. In many cases, they are the product of several seemingly small deficiencies in various parts of the mobile application. Taken together, these oversights can add up to serious consequences.”
As I’ve argued before, enterprise IT — and certainly enterprise CISOs and CSOs — simply can no longer put any trust in an app from either Apple’s App Store or Google Play. This is a major nightmare, since those stores are where employees have to go to download apps, whether their devices are personal (BYOD) or corporate.
The security holes could be intentional malware, unintentional malware (an ISV developer leverages existing code for a common function, unaware that it includes malware), unintentional security holes or even perfectly fine code that is clean on its own but that accidentally creates problems when interacting with the rest of the mobile environment. That’s the “taken together” hole that Positive referenced.
What this means is that enterprises must hire and deploy their own penetration testing teams — either on staff or contracted — to test every app that they’re going to permit on a corporate device, even a BYOD device. So, yes, that very well might mean also testing every consumer app that some employee wants to download. (Won’t that make you ultra-popular?!)
It gets worse. To create and maintain a secure environment for your enterprise, you can’t merely pen test all of these applications once. Every time an app’s developer offers an update, that app must be pen tested all over again, given that you don’t really know what the update changes.
Some CIOs want to exclude from this list their major vendors, on the shaky rationale that they would never put out insecure code. First, sure they would. The larger the vendor, the more it is a target for the craftiest of cyberthieves and cyberterrorists. Unless you want to bet your company’s security on the ludicrous premise that your vendor’s systems are perfect, you simply cannot trust them. (At best, trust but verify, but even “trust but verify” exposes your systems initially.)
Would it be far more efficient and secure for Apple and Google to make a multibillion-dollar investment in security and pen test all of the apps in their stores before posting them for download? Of course it would, but until enterprises insist on it — and shareholders back them up — it won’t happen. Oh, both Apple and Google will spend plenty to protect their own interests — copyright and other policy issues — but protecting enterprises? Don’t hold your nerdy breath.
“You cannot predict if you’ll have inherited vulnerabilities or not,” said Nikolay Anisenya, mobile pentesting team lead at Positive Technologies, in a Computerworld interview. “If you want to lower the risk of your employees being attacked, it’s better to train them about mobile app security.”
The problem with training employees about app security is that employees are going to assume that a corporate-mandated enterprise app (such as, let’s say, a company-approved firewall) has gone through corporate testing. And an app such as a firewall is precisely the kind of app that a good cyberthief will manipulate. If an employee notices the company-endorsed firewall getting invasive, that employee will typically assume that it’s an approved bit of invasiveness.
So how bad were the Positive findings? Bad enough. Some of my least favorite favorites, with a few illustrative code sketches after the list:
- “Insecure interprocess communication (IPC) is a common critical vulnerability allowing an attacker to remotely access data processed in a vulnerable mobile application. Android provides Intent message objects as a way for application components to communicate with each other. If these messages are broadcast, any sensitive data in them can be compromised by malware that has registered a BroadcastReceiver instance.
“Interprocess communication is generally forbidden for iOS applications. However, there are times when it is necessary. In iOS 8, Apple introduced App Extensions. With them, apps can share their functionality with other apps on the same device. For instance, social networking apps can provide quick in-browser sharing of content.”
- “Deep linking is a common way for developers to implement communication between an app extension and its containing app. In this case, the app is called by a specific URL scheme registered in the system. During installation, the containing app registers itself as the handler for schemes listed in Info.plist. Such schemes are not tied to an application. So if the device contains a malicious app that also handles the same URL scheme, there is no telling which application will win out. This opens up opportunities for attackers to stage phishing attacks and steal user credentials.”
- “One third of vulnerabilities in Android mobile applications stem from configuration flaws. For example, our experts when analyzing AndroidManifest.xml often discover the android:allowBackup attribute set to ‘true.’ This allows creating a backup copy of application data when the device is connected to a computer. This flaw can be used by an attacker to obtain application data even on a non-rooted device.”
- “Mobile devices store data such as geolocation, personal data, correspondence, credentials, and financial data, but secure storage of that data by mobile applications is often overlooked. This vulnerability was found in 76 percent of mobile applications.”
- “If two identical requests are sent to the server one right after the other, with a minimal interval between them, one-time passwords are sent to the user’s device both as push notifications and via SMS to the linked phone number. The attacker can intercept SMS messages and impersonate the legitimate user, for instance, by cleaning out the user’s bank account.”
- “When support for TRACE requests is combined with a Cross-Site Scripting (XSS) vulnerability, an attacker can steal cookies and gain access to the application. Because the server-side component of the mobile application tends to share the same code as the website, Cross-Site Scripting allows attacking users of the web application.”
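To make the interprocess communication finding concrete, here is a minimal Kotlin sketch of the broadcast problem Positive describes. Everything in it is hypothetical (the LeakyLoginActivity class, the com.example.bank action, the session token extra); it only illustrates how an implicit broadcast hands data to any app that registered a matching BroadcastReceiver, and how scoping the Intent to your own package narrows the exposure.

```kotlin
import android.app.Activity
import android.content.BroadcastReceiver
import android.content.Context
import android.content.Intent

// Hypothetical activity in a banking app; it illustrates the IPC weakness
// Positive Technologies describes, not any specific vendor's code.
class LeakyLoginActivity : Activity() {

    fun announceLogin(sessionToken: String) {
        // INSECURE: an implicit broadcast is delivered to every app on the
        // device that registered a receiver for this action, so malware can
        // simply listen for it and harvest the token.
        val leaky = Intent("com.example.bank.LOGIN_COMPLETE")
            .putExtra("session_token", sessionToken)
        sendBroadcast(leaky)

        // SAFER: restrict delivery to your own package (or, better, keep
        // secrets out of broadcast Intents entirely).
        val scoped = Intent("com.example.bank.LOGIN_COMPLETE")
            .setPackage(packageName)
            .putExtra("session_token", sessionToken)
        sendBroadcast(scoped)
    }
}

// This is all a malicious app needs to register in order to receive the
// implicit broadcast above.
class SnoopingReceiver : BroadcastReceiver() {
    override fun onReceive(context: Context, intent: Intent) {
        val stolen = intent.getStringExtra("session_token")
        // exfiltrate the stolen token...
    }
}
```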
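The android:allowBackup flaw lives in AndroidManifest.xml rather than in code, but a pen-testing harness can flag it programmatically. A sketch, with a made-up target package name:

```kotlin
import android.content.Context
import android.content.pm.ApplicationInfo

// Hypothetical helper a pen tester might drop into an instrumentation test:
// returns true if the installed app still allows ADB backups, which lets an
// attacker pull application data even from a non-rooted device.
fun allowsAdbBackup(context: Context, targetPackage: String): Boolean {
    val info = context.packageManager.getApplicationInfo(targetPackage, 0)
    return (info.flags and ApplicationInfo.FLAG_ALLOW_BACKUP) != 0
}

// Usage, e.g. from an instrumentation test:
// if (allowsAdbBackup(context, "com.example.bank")) {
//     // flag it: the fix is android:allowBackup="false" in AndroidManifest.xml
// }
```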
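For the insecure-storage finding, the usual Android offender is plaintext SharedPreferences. Here is a sketch of the insecure pattern next to one hardened alternative, assuming the androidx.security:security-crypto library (MasterKey and EncryptedSharedPreferences) is available:

```kotlin
import android.content.Context
import androidx.security.crypto.EncryptedSharedPreferences
import androidx.security.crypto.MasterKey

// INSECURE: values land in a plaintext XML file under the app's private
// storage, readable via backup extraction or on a rooted/compromised device.
fun storeTokenInsecurely(context: Context, token: String) {
    context.getSharedPreferences("session", Context.MODE_PRIVATE)
        .edit()
        .putString("token", token)
        .apply()
}

// SAFER: encrypt keys and values with a key held in the Android Keystore.
fun storeTokenEncrypted(context: Context, token: String) {
    val masterKey = MasterKey.Builder(context)
        .setKeyScheme(MasterKey.KeyScheme.AES256_GCM)
        .build()

    val prefs = EncryptedSharedPreferences.create(
        context,
        "secure_session",
        masterKey,
        EncryptedSharedPreferences.PrefKeyEncryptionScheme.AES256_SIV,
        EncryptedSharedPreferences.PrefValueEncryptionScheme.AES256_GCM
    )
    prefs.edit().putString("token", token).apply()
}
```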
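And the duplicated one-time-password finding is a server-side race, so the fix has to live there too. One common mitigation is to make OTP issuance idempotent per user within a short window; here is a framework-agnostic Kotlin sketch (the OtpSender interface and the 30-second window are assumptions, not anything from the Positive report):

```kotlin
import java.util.concurrent.ConcurrentHashMap

// Hypothetical delivery interface; a real one would push and/or send SMS.
interface OtpSender {
    fun send(userId: String, code: String)
}

class OtpService(
    private val sender: OtpSender,
    private val minIntervalMillis: Long = 30_000  // assumed rate-limit window
) {
    private val lastIssued = ConcurrentHashMap<String, Long>()

    // Issue an OTP only if this user hasn't been sent one inside the window.
    // compute() is atomic per key, so two near-simultaneous requests for the
    // same user cannot both pass the check and trigger duplicate delivery.
    fun requestOtp(userId: String, generateCode: () -> String): Boolean {
        var issued = false
        lastIssued.compute(userId) { _, last ->
            val now = System.currentTimeMillis()
            if (last == null || now - last >= minIntervalMillis) {
                sender.send(userId, generateCode())
                issued = true
                now
            } else {
                last
            }
        }
        return issued
    }
}
```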