Hi Friends,

Even as I launch this today ( my 80th Birthday ), I realize that there is yet so much to say and do. There is just no time to look back, no time to wonder, "Will anyone read these pages?"

With regards,
Hemen Parekh
27 June 2013

Now as I approach my 90th birthday ( 27 June 2023 ), I invite you to visit my Digital Avatar ( www.hemenparekh.ai ) and continue chatting with me, even when I am no longer here physically

Saturday, 8 July 2023

Monetizing User Data

Context :

How India can upend world’s data economy

HOW DO YOU PROTECT PERSONALLY IDENTIFIABLE USER DATA? AND IF SOMEONE AGREES TO THEIR DATA BEING MONETIZED, DOES A COMPANY HAVE TO SHARE A PART OF THE REVENUES WITH THE USER?

Hindustan Times ST (Mumbai) / 8 Jul 2023 / Author: Charles Assisi

Finally, the Personal Data Protection (PDP) Bill 2023 will be tabled during the monsoon session of Parliament. This is something we were expecting to happen during the Winter Session and had discussed earlier in October on these pages. Be that as it may, our conversations with those who worked behind the scenes to craft the Bill suggest that in its current avatar, this Bill holds the potential to “upend the Data Economy of the world.”

 

What is that supposed to mean? The answer comes from Sharad Sharma ( sharad@ispirt.in ), a volunteer at the technology think-tank iSpirt, who is among those who brainstormed on the Bill with other thought leaders in the tech ecosystem.

 

“If implemented, this can create ‘The India Way’ or ‘The Third Way’ on how to think about data,” he says.

 

The First Way meant ignoring the problem. When that ran its course, the Second Way came into being: anonymising data on a good-faith basis.

 

But ‘good faith’ need not compel action, because it may be at loggerheads with an entity’s commercial motive.

Consider India alone. Almost 90% of internet users there use the Chrome browser built by Google. It collects data that another division of Google uses to target users with personalized ads.

 

All the monies are pocketed by the company, and the user whose data is traded gets nothing.

 

Does it have to be this way? At the end of the day, all of this personal data is traded for profit by a third party.

 

Why shouldn’t Google be sharing a part of the profits with people whose data it sells? And how can we be sure Google isn’t sharing data if we don’t want it to be shared?

 

This means it is time to ask the big questions: How do you protect personally identifiable user data? And if someone agrees to their data being monetized, does a company have a responsibility to share a part of the revenues it earns with the user?

 

While most companies argue no personally identifiable data is collected or traded, those embedded in technology know this is untrue. In fact, back in October 2006, Reed Hastings, the co-founder of Netflix, announced the ‘Netflix Prize’: anyone who created a recommendation engine that beat Netflix’s existing algorithm by 10% would win $1 million. Only “anonymized data” would be released. It wasn’t long before coders built better algorithms, but they also identified individual users from that data. The prize has since been discontinued.

 

“What it showed,” says Sharma, “is that anonymization is an inexact science.” In fact, he argues, while a company may claim it does not collect any personally identifiable data, there is no incentive for it not to attempt privacy violations.

 

And if a user agrees to share their data, how much data must they share? Here again, Sharma explains, the problem has been that “no one knew how to restrict any entity from collecting more than what is needed.” And this, he says, is the hairy problem Indians have cracked.

 

When asked how, Sharma explains that the ‘India Way’ now coming up for scrutiny works differently. Earlier, when someone built a model on released data, as Netflix did, anyone could link that data with other datasets in the public domain and re-identify users.
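
To make that risk concrete, here is a minimal sketch of such a linkage attack, using toy records and hypothetical column names (not the actual Netflix data): an “anonymized” release is joined with a public dataset on shared quasi-identifiers, and names fall out.

```python
# Minimal sketch of a linkage (re-identification) attack.
# All records and column names below are illustrative, not real data.

# "Anonymized" release: names stripped, but quasi-identifiers remain.
anonymized_ratings = [
    {"zip": "400001", "birth_year": 1975, "gender": "M", "movie": "Film A", "rating": 5},
    {"zip": "400002", "birth_year": 1982, "gender": "F", "movie": "Film B", "rating": 3},
]

# Public dataset (say, a voter roll or social profile) with the same quasi-identifiers.
public_profiles = [
    {"name": "R. Kumar", "zip": "400001", "birth_year": 1975, "gender": "M"},
    {"name": "S. Mehta", "zip": "400002", "birth_year": 1982, "gender": "F"},
]

def link(anon_rows, public_rows):
    # Join the two datasets on (zip, birth_year, gender) to recover identities.
    index = {(p["zip"], p["birth_year"], p["gender"]): p["name"] for p in public_rows}
    for row in anon_rows:
        key = (row["zip"], row["birth_year"], row["gender"])
        if key in index:
            print(f'{index[key]} rated "{row["movie"]}" -> {row["rating"]}')

link(anonymized_ratings, public_profiles)
```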

 

In the new scheme of things, after someone builds a model, it is sent to a “Computationally Clean Room” (CCR).

 

The model can only inspect patterns, not the underlying data. This is computationally guaranteed via a mathematical framework called Differential Privacy (DP).
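
For readers who want the intuition behind DP, here is a minimal sketch of its most common building block, the Laplace mechanism (the epsilon value and the toy query are my own illustrative assumptions, not anything specified in the Bill): calibrated random noise is added to a query's answer, so aggregate patterns survive while any single individual's contribution is mathematically masked.

```python
import numpy as np

def dp_count(records, predicate, epsilon: float) -> float:
    """Epsilon-differentially-private count.
    A counting query has sensitivity 1 (adding or removing one person
    changes the count by at most 1), so Laplace noise with scale
    1/epsilon is enough to satisfy epsilon-DP."""
    true_count = sum(1 for r in records if predicate(r))
    return true_count + np.random.laplace(loc=0.0, scale=1.0 / epsilon)

# Toy example: how many users in this dataset are under 30?
users = [{"age": 24}, {"age": 31}, {"age": 28}, {"age": 45}]
noisy_answer = dp_count(users, lambda r: r["age"] < 30, epsilon=0.5)
print(f"DP answer: {noisy_answer:.1f}")  # hovers around the true count of 2
```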

 

When DP and the CCR are merged, it becomes practically impossible to identify a person. Work on this is in progress.
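
Again purely as an illustration (the class and method names below are hypothetical; the actual CCR design has not been published), a clean room can be pictured as a gateway that keeps raw rows inside, forces every query through a DP mechanism such as the dp_count sketch above, and refuses to answer once a cumulative privacy budget is spent:

```python
class CleanRoom:
    """Hypothetical sketch of a Computationally Clean Room gateway.
    Raw records never leave this object; callers only get DP answers."""

    def __init__(self, records, total_epsilon: float):
        self._records = records        # raw data stays inside the room
        self._budget = total_epsilon   # cumulative privacy budget

    def query_count(self, predicate, epsilon: float) -> float:
        if epsilon <= 0 or epsilon > self._budget:
            raise PermissionError("privacy budget exhausted or invalid epsilon")
        self._budget -= epsilon
        return dp_count(self._records, predicate, epsilon)  # DP sketch above

room = CleanRoom(users, total_epsilon=1.0)
print(room.query_count(lambda r: r["age"] < 30, epsilon=0.5))   # answered, 0.5 budget left
print(room.query_count(lambda r: r["age"] >= 30, epsilon=0.6))  # raises: budget spent
```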

 

Why does this matter and how does it hold the potential to upend the Data Economy of the world?

 

The first is that it opens the door to adequately compensating people who want to be paid for their data.

 

Open Source Software will keep them informed of how many times their data has been accessed and what they are owed.
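
The article does not say how such software would work, but as a thought experiment only, a bare-bones access ledger might look like the sketch below; the per-access rate and all names are entirely my assumptions:

```python
from collections import defaultdict
from datetime import datetime, timezone

RATE_PER_ACCESS = 0.05  # hypothetical payout per access, in rupees

access_log = defaultdict(list)  # user_id -> list of access records

def record_access(user_id: str, accessor: str) -> None:
    """Log that `accessor` used this user's data, with a UTC timestamp."""
    access_log[user_id].append({"by": accessor, "at": datetime.now(timezone.utc)})

def amount_owed(user_id: str) -> float:
    """Payout owed = number of logged accesses * per-access rate."""
    return len(access_log[user_id]) * RATE_PER_ACCESS

record_access("user-42", "ad-network-A")
record_access("user-42", "ad-network-B")
print(f"user-42 is owed Rs {amount_owed('user-42'):.2f}")  # Rs 0.10
```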

 

The second, Sharma says, is that with India’s large population, legislation in place, and no ambiguity over data ownership, the doors will begin to open for India to work at becoming the “Model Making Capital of the World” as well.

 

It appears all the ducks are lined up for India to shoot.

=========================================


MY TAKE :


Digital Dividend from Demographic Data [ 4 D ]

Right to Sell My Soul ?

Wealth of Nations
