Smartphone Keyboards: The Achilles Heel of Data Privacy

Typewise
Published in Startup Grind
6 min read · Mar 16, 2021

With the recent WhatsApp policy change and the ongoing debate about data privacy, smartphone keyboard privacy is becoming an increasingly pressing security issue. The keyboard can essentially capture everything you type, even inside supposedly privacy-friendly apps such as Signal or your personal banking app.

Keyboard apps intrude on millions of users’ privacy

Over the past few years, new smartphone keyboard apps have emerged to improve the user experience when typing on a small screen. Some of them have become incredibly popular (e.g. Gboard claims over 1 billion users). Because the smartphone keyboard is not a typical app but is deeply embedded in the operating system, it can access very sensitive user data. Many keyboards have obtained intrusive permissions such as full internet access, GPS location, camera and microphone access, browser history, the address book, and more.

Apps ranging from messengers and enterprise software to banking and even health apps often contain personal information whose compromise would be devastating for users. And because the smartphone keyboard sees everything you type, it deserves special attention in any discussion of data privacy.

Some developers have taken advantage of this unique position, using keyboard apps as a means to unscrupulously gather and sell data. Others simply got unlucky, leading to unintentional data leaks. A brief overview:

Go Keyboard shared the personal information of 200 million users with advertising software; it was removed from the Play Store in 2017 but has been back in full force since 2020.

Ai.type, downloaded over 31 million times, was taken down in 2019 after making unauthorized in-app purchases worth at least $18m. It, too, came back in 2020.

TouchPal bundled its keyboard software with malicious adware, leading Google to ban its Chinese developer CooTek in 2019. Over 440 million users were affected. The app came back in 2020.

Kika Keyboard, which employed malicious advertising practices such as click flooding and click injection, was removed from the Google Play Store in 2018, affecting 200 million users. By 2020, it, too, had made its way back in.

Microsoft SwiftKey had to suspend cloud syncing for a while after a data leak became public in 2016, in which autocorrection suggested information (including email addresses) belonging to other users. At the time SwiftKey had roughly 300 million users, a fraction of whom were affected by the leak, according to a company statement.

Cheetah Mobile applied malicious advertising practices affecting 100 million users worldwide and was banned from the Google Play Store in 2020.

Wave Keyboard put all 10 million of its users at risk by lacking a coherent privacy policy: it neither asked for consent nor disclosed how it uses data. The Norwegian Consumer Council condemned this in 2020, but no further sanctions followed and the app is still operational.

In total, hundreds of millions of users have been affected by keyboard apps' insincere approach to data privacy. Gatekeepers like Google and Apple tend to react only once a third-party investigator exposes malicious practices, banning the guilty apps from their stores, and by then it is often too little, too late. The damage is done, the typical repercussion an app faces is a temporary removal from the store, and a permanent ban of the developer is rare.

The graph below highlights the scandals of the last few years. A word of caution, though: the graph could be read to imply that scandals have become fewer or less severe over time. This is not the case. While the number of affected users appears to have decreased in 2020, keyboard privacy remains just as important in 2021.

Selected keyboard app privacy breaches (2016–2020)

The issue with permissions

The aforementioned scandals are reason enough for a privacy-conscious user to be concerned. But it doesn't end there: malicious practices aside, intrusive permissions are perhaps even more insidious. Why does a keyboard app need access to the camera? Or to GPS location? While requiring full network access can make sense (to enable online features such as GIF search), the extent of the permissions requested often seems implausible to the user.

Often you find yourself at a crossroads: either grant the app the permissions it demands, or don't use it at all. This leads to a certain "data privacy fatigue" among users, who just want to enjoy the benefits of the keyboard. There are a few keyboards out there that can be used with no or only minimal permissions; one just has to look for them. Fortunately, Google and Apple now also require all apps to indicate the extent of the permissions they require.

Comparison of keyboard app permissions (Google Play)
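For readers who want to run this kind of comparison themselves, the check can be approximated directly on an Android device. The sketch below is illustrative only: the helper name logKeyboardPermissions is ours, it assumes you have an Android Context to call it from, and on Android 11+ package-visibility rules may hide some packages from the query. It uses the platform's InputMethodManager and PackageManager to list every enabled keyboard together with the permissions its package requests.

    import android.content.Context
    import android.content.pm.PackageManager
    import android.view.inputmethod.InputMethodManager

    // Illustrative helper (not code from any particular keyboard app):
    // lists every enabled keyboard (input method) on the device together
    // with the permissions its package requests.
    fun logKeyboardPermissions(context: Context) {
        val imm = context.getSystemService(Context.INPUT_METHOD_SERVICE) as InputMethodManager
        val pm = context.packageManager

        for (ime in imm.enabledInputMethodList) {
            val pkg = ime.packageName
            val requested: Array<String> =
                try {
                    pm.getPackageInfo(pkg, PackageManager.GET_PERMISSIONS)
                        .requestedPermissions ?: emptyArray()
                } catch (e: PackageManager.NameNotFoundException) {
                    // Package not visible or no longer installed.
                    emptyArray()
                }
            println("Keyboard $pkg requests ${requested.size} permissions:")
            requested.forEach { println("  $it") }
        }
    }

Running this for your own device gives you the same kind of side-by-side view as the comparison above, just limited to the keyboards you actually have enabled.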

What can users do?

As awareness of privacy infringements has increased, users have become more sensitive to the issue. This shift is reflected in government policy, such as the EU's General Data Protection Regulation (GDPR), and in companies adopting privacy-by-design architectures, which means safeguarding data privacy proactively and preventively instead of reacting only once the cat is out of the bag.

With privacy by design, privacy finally gets a front-row seat as the default setting rather than being hidden deep within. The app's design and its security architecture should embed privacy, creating a positive-sum game for users. In practice, this could mean using AI algorithms that don't require syncing with the cloud, or offering an offline mode that disables any internet connection. At the same time, visibility and transparency about what data is actually required keep privacy user-centric.
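To make the "offline" idea concrete: on Android, an app that never declares the INTERNET permission cannot open network sockets at all, so anything typed into such a keyboard stays on the device. The sketch below is a minimal illustration under that assumption, not code from any particular keyboard; the helper name isOfflineOnly is ours, it assumes an Android Context, and the same Android 11+ package-visibility caveat applies.

    import android.Manifest
    import android.content.Context
    import android.content.pm.PackageManager

    // Illustrative helper: returns true if the given package does NOT hold
    // the INTERNET permission, i.e. it cannot open network sockets itself
    // and therefore cannot upload what you type.
    fun isOfflineOnly(context: Context, packageName: String): Boolean {
        val status = context.packageManager.checkPermission(
            Manifest.permission.INTERNET,
            packageName
        )
        return status != PackageManager.PERMISSION_GRANTED
    }

Calling isOfflineOnly(context, "com.example.keyboard") with the package name of your current keyboard (the name here is a placeholder) returns true only if that keyboard never declared INTERNET in its manifest.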

You can protect yourself in different ways, including:

  • Check the permissions of apps before installing them
  • Don't grant permission requests unless they are genuinely needed
  • Report malicious activity to the app stores
  • Recommend privacy-friendly apps to your friends

By following these simple steps, you can do your part in changing how data privacy is approached across the globe. On a day-to-day basis, this means seeking out apps that follow privacy-by-design principles and keeping an eye on the permissions of the apps you already have.
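If you want to act on the first two points for apps that are already installed, Android exposes a per-app settings screen where permissions can be reviewed and revoked individually. The small sketch below shows one way to jump there from your own code; the helper name openAppPermissionSettings is ours, and it assumes you have a Context to call it from.

    import android.content.Context
    import android.content.Intent
    import android.net.Uri
    import android.provider.Settings

    // Illustrative helper: opens the system settings screen for the given
    // package, where its permissions can be reviewed and revoked one by one.
    fun openAppPermissionSettings(context: Context, packageName: String) {
        val intent = Intent(
            Settings.ACTION_APPLICATION_DETAILS_SETTINGS,
            Uri.fromParts("package", packageName, null)
        )
        // Needed when calling from a non-Activity context.
        intent.addFlags(Intent.FLAG_ACTIVITY_NEW_TASK)
        context.startActivity(intent)
    }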

At Typewise, a Swiss deep-tech start-up, we embrace the privacy-by-design mindset to ensure data privacy on smartphone keyboards. Alongside AI-based autocorrection and language detection, the keyboard works 100% offline: our AI runs entirely on the device, requires no cloud syncing, and transmits no typing data to the cloud. You can read more about how we ensure data privacy here.


Typewise is the true smartphone keyboard that helps you make 4x fewer typos. It’s 100% private as well. Try for free https://typewise.onelink.me/8Sag/mediumtw