
Protecting people's privacy from data-mining keyboards


After scandals such as WikiLeaks, the documents leaked by Edward Snowden, and many other privacy-related incidents, it is more essential than ever that anyone who uses a computer pays attention to who is watching and using their data. Everything you write and send over the internet can potentially be seen, stored, and shared.

Of the top three keyboards in the European market today, two are owned by multinational companies: SwiftKey (owned by Microsoft) and Gboard (owned by Google). In exchange for services such as a reminder to leave for your next meeting, Google requires access to your location, your maps activity, your meeting addresses, and your calendar appointments. Google also knows where your home and workplace are, since it tracks the times you leave and return every day.

The XDA Developers Portal has mentioned some of these privacy issues before, and so has HelpNet Security.

All this information might amount to nothing, but it can also be crunched, sold to advertisers, and put to many other uses; detailing them all is beyond the scope of this post. You get the point, though!

For our team

When we started Thingthing Ltd., we asked ourselves what we wanted our company to be, and we quickly concluded that it had to reflect our beliefs and values. As founders, we had a strong interest in privacy and encryption, and we knew that honoring it would build long-term relationships with our users. Just as friends trust each other, we wanted the same trust with our users.

As we looked into the software keyboard space, we realized how little attention keyboards paid to privacy, despite how much they can learn about us. At the time, end-to-end encryption for messaging apps barely existed, apart from a few players who cared about it; soon after, the others followed with their own encryption standards.

Interestingly, even when a chat or messenger app is encrypted, the user's keyboard is often neither encrypted nor private, and this should be a major concern for every reader!

In fact, this made us mad, and even more convinced that our company had to do something about it. We purposefully built our technology and algorithms to run locally on the device, never relying on server-side processing of personal data.
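To make the "everything stays on the device" idea concrete, here is a minimal, hypothetical sketch in Kotlin (not Fleksy's actual engine): a tiny next-word suggester that learns only from the user's own typing, keeps its model in local memory, and contains no networking code at all.

```kotlin
// Hypothetical sketch of on-device suggestion: the model is built from the
// user's own typing and never leaves the process. There is no network client.
class LocalSuggester {
    // Simple bigram counts; a real engine would persist them in app-private
    // storage, still on the device.
    private val bigrams = mutableMapOf<String, MutableMap<String, Int>>()

    // Update the local model from text the user has typed.
    fun learn(text: String) {
        val words = text.lowercase().split(Regex("\\s+")).filter { it.isNotBlank() }
        for (i in 0 until words.size - 1) {
            val next = bigrams.getOrPut(words[i]) { mutableMapOf() }
            next[words[i + 1]] = (next[words[i + 1]] ?: 0) + 1
        }
    }

    // Suggest the most frequent follow-up words seen after the previous word.
    fun suggest(previousWord: String, limit: Int = 3): List<String> =
        bigrams[previousWord.lowercase()]
            ?.entries
            ?.sortedByDescending { it.value }
            ?.take(limit)
            ?.map { it.key }
            ?: emptyList()
}

fun main() {
    val suggester = LocalSuggester()
    suggester.learn("see you at the office at the meeting")
    println(suggester.suggest("the")) // e.g. [office, meeting]
}
```

The point of the sketch is architectural: because the model lives and is queried entirely on the device, there is simply no personal data in transit to intercept or mine.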

As illustrated above, almost all of our competitors expose users' privacy in some way, from monitoring behavior to sending personal data to their cloud servers. Almost all players mine people's data for purposes such as advertising, selling information about you to third parties (ever wondered where some newsletters come from?), personal voice assistants, governments, or anyone they can get money from. We, on the flip side, make money from what people buy inside our app or, soon, when content such as GIFs and stickers is shared. And all of this relies purely on unidentifiable data: to us, our users are simply anonymous sets of data.

We have chosen to stand up for people's privacy and do all we can to build a private and secure experience, so that our users can type, chat, and message with peace of mind.

Happy typing!

PS: You can learn more by reading Fleksy's Privacy Policy.

— Olivier, CEO, on behalf of the Thingthing Ltd. team


✭ If you like Fleksy, give it a star on GitHub ✭