The unspoken costs of leaked data

Dhiraj Bhat
6 min read · Apr 3, 2023

Let’s call out the elephant in the room first, shall we?

Blogs look and feel so different since I last wrote one. ChatGPT and its younger, (mostly) fitter cousins have shaken my (and, I imagine, several of my rookie-blogger peers') conviction in what used to be so dear to me.

That being said, I realized that I could get GPT to co-pilot this latest piece with me. Earlier today, I was having a conversation with some dear friends about the struggles of being a young woman in metro cities in India.

(Of course, that’s not to say that it’s all fine and dandy if you, as a woman, live outside of urban India…)

During our conversation, there was inevitable discourse about the safety challenges that women face, particularly younger women in larger cities: women who toil to make an honest living, who sometimes don’t want to cook or drive themselves, and who turn for help to the system, be it the everything-all-at-once Uber, Dunzo, Swiggy, or the classic hold-your-arm-out auto rickshaw.

The advent of online, instant-settlement payments in India has eased the burden for the more tech-savvy, smartphone-wielding population of Indian residents. Every payment, from larger sums to petty change tendered for roadside vegetable sales, is now digital and instantaneous.

That’s how easy it can be!

Is that a sign of progress for the country, even on a global scale? Absolutely. I mean, the U.S. is rolling out FedNow just now (I couldn’t help myself), and several other payment systems lag behind India’s in quality, speed, and architectural sophistication. It’s 2023 and you have a browser, so verify for yourself; maybe I’m wrong, but hear (read) me out.

Does it still pose a risk to the safety and mental peace of women all over? Again, the answer is absolutely. I’ll get into the how below.

Time to co-pilot.

I asked ChatGPT ‘What are the unspoken costs of leaked data?’ — without going deeper. Here’s what I got.

The costs of leaked data can go beyond the obvious financial losses associated with theft, fraud, or other forms of cybercrime. There are also unspoken costs that can be difficult to quantify, but nevertheless have a significant impact on individuals and organizations. Here are a few examples:

1. Loss of trust: When data is leaked, people lose trust in the organization responsible for safeguarding that data. This can lead to a loss of business, negative publicity, and a damaged reputation.

2. Legal costs: Companies that experience data breaches can face costly legal battles. They may be sued by customers, shareholders, or regulatory agencies, and may also need to pay fines and penalties.

3. Reputational damage: A data breach can damage an organization’s reputation, leading to decreased sales and difficulty attracting new customers. It can also result in negative media coverage and a damaged brand image.

4. Employee morale: When data is leaked, employees may feel demoralized and disheartened. This can lead to decreased productivity and a higher rate of turnover.

5. Time and effort: Responding to a data breach can be a time-consuming and resource-intensive process. Organizations may need to devote significant resources to investigating the breach, notifying affected individuals, and implementing new security measures.

6. Lost opportunities: Data breaches can also result in lost opportunities, such as missed business deals or partnerships. This can have a long-term impact on an organization’s growth and success.

Overall, the unspoken costs of leaked data can be significant and far-reaching, affecting not just the organization responsible for the breach, but also its customers, employees, and other stakeholders.

Then, I went a bit deeper. Again, I’m not a prompt engineer, but perhaps I ought to be. I also remembered someone telling me to “add pictures to your blogs to make them engaging”, so here’s a screenshot of ChatGPT, like everyone’s uncle is doing.

These answers are well-written, generated faster than I could ever write them, and genuinely valuable. Where do I ‘co-pilot’, then? IMO, humans can stay on top as long as we have empathy, and that’s where this next bit is coming from.

Financial data privacy and security has been a debate, and a hotly contested one, for several decades. Perhaps few people in my network have borne the brunt of this to the extent that I have, having needed to convince hundreds of stakeholders at over a hundred companies that data privacy and security is paramount for our latest product at Unit21, the Fintech Fraud DAO.

Without GPT-ing, I can tell you what the high-level concerns are:

  • Data possessed by financial services firms about you or your organization is extremely valuable, and by extension, extremely sensitive
  • Information that is personally identifiable, such as your name, date of birth, SSN (or national ID), your phone number, and so on, can be used for both good and bad, as illustrated below

Good: Provide quality services that suit your needs, balancing transactional safety and quality of service.
Bad: Allow fraudsters and/or other bad actors to impersonate you, or otherwise victimize you.

  • From a company’s perspective, losing control of this sort of data could lead to several repercussions, mostly better explained by GPT above.

(If you’re lazy, it’s basically regulatory fines, legal hassles and reputational risk, plus the threat of a competitor poaching your clientele with better products or services, or just the illusion of the same)

Same old Dhiraj Bhat — get to the point, dude. How does this relate to women’s safety?

Here’s a screenshot from the account creation flow for UPI (I used PhonePe as an example).

Thanks, PhonePe’s YouTube admin

The software (a lot of the time) prompts you with a ‘sample/template’ UPI ID, which reads yourphonenumber@ybl or yourphonenumber@ibl.

Which means every autowallah you pay, every vendor you buy your carrots from… has your UPI ID in their transaction history.

Some apps do a good job of masking the PII associated with a payment, and some store it in good ol’ plain text. You do not need to be an engineer, or even fully literate, to decipher a phone number from phonenumber@ybl…
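
To make that concrete, here’s a minimal Python sketch (the phone number is made up and the function names are mine) of how trivially a bare-number UPI ID gives the phone number away, and what a masked display could look like instead:

```python
import re

def extract_phone(upi_id):
    """If the handle before '@' is a bare 10-digit number, it IS the phone number."""
    handle = upi_id.split("@")[0]
    return handle if re.fullmatch(r"\d{10}", handle) else None

def mask_upi_handle(upi_id):
    """What a payment app could display instead: keep only the last two digits."""
    handle, _, psp = upi_id.partition("@")
    if re.fullmatch(r"\d{10}", handle):
        handle = "*" * 8 + handle[-2:]
    return f"{handle}@{psp}"

print(extract_phone("9876543210@ybl"))    # 9876543210, no engineering degree required
print(mask_upi_handle("9876543210@ybl"))  # ********10@ybl
```

The first function is all a bad actor needs; the second is all it would take for an app to stop handing out phone numbers with every receipt.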

Cue the incessant phone calls, lewd messages, threatening and harmful content, and the loss of psychological and at times financial safety.

Loss of control over Personally Identifiable Information can lead to harmful, risky consequences, and the threat goes beyond tech-savvy fraudsters. Some of these folks have always known our home addresses, our daily routines, and even our quirks. Why arm them further with information that can lead to harm?

Privacy by design has never been more relevant, and privacy-preserving data sharing never sexier.

More relevant context (for a change): our new fraud product at Unit21 uses Privacy-Preserving Record Linkage to help customers de-identify end-user PII with Bloom filters before ingestion. Our system never sees raw PII, and all pattern creation and detection is probabilistic and cannot be reverse-engineered.
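
To give a flavor of how that works (this is a textbook Privacy-Preserving Record Linkage sketch, not Unit21’s actual implementation): names are split into bigrams, each bigram is hashed into a bit array, and matching is done on bit overlap instead of raw values.

```python
import hashlib

def bigrams(s):
    """Split a string into overlapping two-character grams."""
    s = s.lower().strip()
    return [s[i:i + 2] for i in range(len(s) - 1)]

def bloom_encode(value, size=256, k=4):
    """Hash each bigram k times into a bit array (returned as an int bitmask).
    The raw string cannot be recovered from the bits; similar strings set similar bits."""
    bits = 0
    for gram in bigrams(value):
        for seed in range(k):
            digest = hashlib.sha256(f"{seed}:{gram}".encode()).digest()
            bits |= 1 << (int.from_bytes(digest[:4], "big") % size)
    return bits

def dice_similarity(a, b):
    """Dice coefficient over the set bits: the usual PPRL match score."""
    overlap = bin(a & b).count("1")
    return 2 * overlap / (bin(a).count("1") + bin(b).count("1"))

# Two spellings of the same name score high; unrelated names score low.
print(dice_similarity(bloom_encode("dhiraj bhat"), bloom_encode("dhiraj bhatt")))
print(dice_similarity(bloom_encode("dhiraj bhat"), bloom_encode("jane doe")))
```

Real deployments key the hashes with a shared secret (HMAC) so the encodings can’t be dictionary-attacked, but the point stands: two parties can score likely matches without either one revealing a raw name, number, or ID.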

Actually, far smarter people than me have written about this on Twitter, where each individual also currently seems to be a banking, AI and Web3 expert rolled into one.

If a person, any person, wants you to contact them, they will give you a way to do so. Just as if AI wants to replace you, it will find a way.

Until that happens, let’s keep the mystery alive, shall we?

To all four readers out there: please verify that your UPI ID is strange and undecipherable (yes, it is customizable). To the two of you inspired to play around with ChatGPT: please ensure you don’t give it more ammo by sharing PII, or any information you wouldn’t give a stranger.

As I did here, let’s co-pilot the future of the world, at our pace, with our consent.
