Crypto Market Commentary 

17 September 2019

Doc's Daily Commentary

Look for the new “Options for Income Masterclass,” which is now live!

The 9/11 ReadySetLive session with Doc and Mav is listed below.

Mind Of Mav

How Much Is Your Privacy & Data Worth?

In July, Facebook agreed to pay $5 billion to settle charges that it had misled hundreds of millions of users about their data privacy. That same week, credit reporting agency Equifax was ordered to pay up to $700 million for negligence leading to a data breach that exposed the information of some 150 million people. Earlier this month, Google was fined $170 million for harvesting millions of kids’ data on YouTube without their parents’ consent. Back in February, TikTok’s parent company had to pay $5.7 million for similar violations.

Each of those fines, issued by the Federal Trade Commission (FTC), was record-breaking in some way. The $5 billion Facebook settlement, in particular, dwarfed any penalty a U.S. company had previously been forced to pay for privacy infractions. Yet privacy advocates found it woefully insufficient, and the two Democrats on the five-member commission publicly dissented. They argued that Facebook’s gains and the public’s injuries from the platform’s fast-and-loose privacy practices may have far exceeded the amount of the fine.

Behind the big numbers is a simmering debate in tech policy and consumer economics over the true value of online privacy. At a time when the states and the federal government are eyeing sweeping privacy laws and the FTC is trying to rein in big tech’s abuses with financial penalties, a growing body of research is trying to establish just how much data protection is worth to consumers. If we could establish that dollar figure, the thinking goes, we could find the right trade-offs between their interests and those of the companies that collect and traffic in their data.

One problem: Figuring out precisely what value consumers place on their personal data is starting to look like a fool’s errand. And the people who have studied the question the most are beginning to come to the conclusion that it’s the wrong approach altogether.

On one hand, we know people say they want more online privacy. Surveys find that the vast majority of Americans feel they’ve lost control of their data and worry about how companies are using it. We can find evidence of that in the rapid growth of encrypted communication, the rising use of VPNs, and market research showing that some people are avoiding smart speakers due to privacy concerns. It’s also telling that Apple has built a major marketing campaign around its privacy protections.

On the other hand, the vast majority of Americans still use free services such as Facebook and Google with business models that revolve around the collection and exploitation of our personal data. We post photos of ourselves and our families online, carry around customer loyalty cards that track our purchases, and sign up for apps that track our location wherever we go. If people really cared about online privacy, skeptics and industry advocates ask, wouldn’t more of us be deleting Instagram and doing our internet searches on the privacy-minded Google alternative DuckDuckGo? Instead, Facebook posted record profits amid a privacy backlash, and share prices for Google and Amazon are hovering close to their all-time highs. In a New York Times op-ed this week, author Rob Walker gazed across the consumer landscape and concluded, “There is no tech backlash.”

The apparent disconnect between most people’s stated desire for online privacy and their incautious online behavior has become known as the privacy paradox. There are competing explanations, but most experts agree it’s hard to draw conclusions about people’s values from their real-world actions. That’s because their choices are shaped by all kinds of factors: the limited available options (how many social networks have the majority of your friends and family on them?), the circumstances under which they make decisions (who has time to read a 4,000-word privacy policy before downloading each app?), the difficulty of predicting outcomes (who knew that Yahoo Mail would get hacked when they signed up for it?), and feelings of general helplessness (why spend a ton of energy protecting your data now when so much is already out there?).

For all those reasons, academics have been trying to get a bead on people’s true preferences by conducting controlled surveys and experiments. Typically, these are aimed at figuring out exactly how much monetary value people attach to various forms of privacy under specific conditions.

Historically, fines for data breaches have been based on the direct monetary impact on consumers, such as the time and money they could lose if their identity were stolen or the amount they’d have to pay for credit monitoring. That makes sense in a case like the Equifax breach. But the damages are much harder to assess when the data in question is your browsing history, location history, or address book. No one wants that stuff mishandled, but how much exactly is it worth to them?

A study published in the Journal of Consumer Policy this summer found that Americans say they would be willing to pay, on average, $5 a month to delete all their personal data from the companies that have collected it. That’s less than half the price of a subscription to Netflix or Amazon Prime, and it’s certainly less than the value that tech companies, marketers, and data brokers extract from this information. (Facebook alone hauls in well over $100 per year in revenue per North American user, upwards of $10 per month, and, of course, it’s just one of the countless companies tracking your online behavior.) So, if you took the $5 per month figure at face value, you might suspect that people don’t particularly value their privacy.

But when the question was flipped — asking how much money companies would have to pay an individual to receive full access to their personal data — the average answer was a hefty $80 per month. Multiply that by some 250 million American adults, and you’d get a value on the order of $240 billion per year for online privacy. That’s more than the combined annual revenue of Facebook and Google, including all their subsidiaries. At that price, stringent privacy laws might be justified on the grounds of welfare economics alone. And the FTC’s $5 billion Facebook fine would look like a pittance compared to the value of what users lost when their online privacy was violated.
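To make the arithmetic behind that $240 billion figure concrete, here is a minimal back-of-the-envelope sketch in Python. It simply scales the per-user monthly survey averages quoted above by 12 months and roughly 250 million U.S. adults; the variable names and the helper function are illustrative, and the dollar figures are the survey responses cited in the text, not market prices.

```python
# Back-of-the-envelope arithmetic using the survey figures quoted above.
# Illustrative only: these are stated preferences, not observed market prices.

ADULT_POPULATION = 250_000_000        # approximate number of U.S. adults

willingness_to_pay = 5                # $/month people say they'd pay to delete their data
willingness_to_accept = 80            # $/month people say they'd demand to sell full access

def annual_aggregate(per_user_monthly: float, population: int = ADULT_POPULATION) -> float:
    """Scale a per-user monthly dollar figure to an aggregate annual value."""
    return per_user_monthly * 12 * population

print(f"WTP aggregate: ${annual_aggregate(willingness_to_pay):,.0f} per year")       # ~$15 billion
print(f"WTA aggregate: ${annual_aggregate(willingness_to_accept):,.0f} per year")    # ~$240 billion
```

The gap between the two aggregates, more than an order of magnitude, is exactly the measurement problem the studies discussed below keep running into.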

Studies using those survey methods, including a seminal 2012 paper by Carnegie Mellon’s Alessandro Acquisti, have found that people will in fact part with specific types of personal data for much less than $80 per month. One study found that MIT students could be induced to give out their friends’ email addresses in exchange for a free pizza.

Another study concluded that privacy regulators risk damaging the economy by overestimating the value of protecting people’s data and underestimating the value of letting tech companies use it. Data is often called the new oil, but while each barrel of oil has a set value on its own, most forms of consumer data are only economically useful when collated and analyzed on a mass scale. Individual users of Facebook and Google would be hard-pressed to monetize their own online behavior as efficiently as those companies do. In that sense, you can argue that the mining of data, provided it could be done responsibly, creates trillions of dollars in economic value that otherwise wouldn’t exist.

A 2015 white paper by Benjamin Wittes and Jodie C. Liu of the Brookings Institution made a different case, arguing that many of the same technologies blamed for eroding our privacy are actually enhancing our privacy in less obvious ways. They suggest that people’s continued use of allegedly privacy-violating internet platforms can be explained by understanding that the type of privacy they’re risking isn’t the kind people really care about.

While the variation tends to be less extreme in experiments involving actual money, it makes sense that different people would value online privacy very differently. Some people who signed up for the controversial Facebook Research app, which granted the company deep access to their mobile activity in exchange for $20 per month, reported that they found it a bargain because they weren’t doing anything on their phones that they felt they needed to hide. On the other end of the spectrum, to a dissident using internet platforms to organize a political movement, online anonymity might be the difference between freedom and jail.

But that’s only part of the issue. Study after study has found that people’s valuations of data privacy are driven less by rational assessments of the risks they face than by factors like the wording of the questions they’re asked, the information they’re given beforehand, and the range of choices they’re presented with. And they’re easily manipulated by small, immediate incentives, whether it’s a free pizza or a minor inconvenience. A study published this year by Dan Svirsky of Harvard Law School found that requiring even a single extra click to learn what data they’re giving up can deter many people from making a privacy-protecting decision.

Where does that leave the FTC and lawmakers trying to pass privacy bills? For one thing, it suggests that the current framework for privacy regulations in the United States is fundamentally flawed.

Much of today’s U.S. consumer privacy law, as enforced by the FTC, is based on a principle known as notice and choice. Companies have to give notice of the data they plan to collect — usually as part of a lengthy privacy policy — and get users’ consent to do so. So, as long as Facebook was using data only in the ways clearly disclosed in its privacy policies, the FTC had no basis on which to penalize it, even if many users felt that its profiling of them for advertising purposes was intrusive or creepy. The $5 billion fine came only because the FTC found that Facebook had failed to properly inform users that it was sharing their data and that of their friends with third-party developers, such as Cambridge Analytica.

Notice and choice relies on consumers to make rational decisions about online privacy in their own self-interest based on the information in privacy policies. The most obvious problem is that few people have time to actually read those privacy policies, which would seem to suggest that clearer and simpler disclosures are the answer. But if people can’t make consistent or rational choices even when presented with clear information, as academic research indicates, then that won’t solve anything.

There are elements of the Constitution that bear on privacy, such as the Fourth Amendment protections against government search and seizure. But the right to privacy isn’t enshrined as directly as freedom of speech, which is why privacy advocates are pushing for new laws.

Which brings us back to the FTC fines. In her dissent from the $5 billion Facebook settlement, Democratic FTC commissioner Rebecca Kelly Slaughter acknowledged that “injury to the public” — part of the legal basis for fines — “can be difficult to quantify in monetary terms in the case of privacy violations.” But she implied that the harms from the Cambridge Analytica affair, which included the use of users’ data by political campaigns without their consent in the 2016 U.S. presidential election, went beyond anything that could be measured in a controlled economics study. Rohit Chopra, Slaughter’s fellow Democrat on the commission, warned that the monetary fine was not a strong deterrent to companies like Facebook, because it left intact a business model that allows the company to quickly recoup the loss through further monetization of users’ data. Both argued that a stronger approach would be to hold Facebook executives, such as CEO Mark Zuckerberg, personally liable.

Perhaps, then, the question of how much companies like Facebook and Google should be fined for privacy violations is the wrong one to ask in the first place. No dollar amount can perfectly reflect the value of data protection to citizens, and no reasonable dollar amount can deter companies for whom data is an inexhaustible gold mine. What the FTC fines amount to, then, is a sort of charade in which the government pretends to stand up for consumers’ online privacy while preserving the status quo that ensures it will continue to be violated.

Press the "Connect" Button Below to Join Our Discord Community!

Please DM us with your email address if you are a full OMNIA member and want to be given full Discord privileges.

An Update Regarding Our Portfolio

RSC Subscribers,

We are pleased to share with you our Community Portfolio V3!

Add your own voice to our portfolio by clicking here.

We intend for this portfolio to be balanced between the Three Pillars of the Token Economy & Interchain:

Crypto, STOs, and DeFi projects

We will also make a concerted effort to draw on community involvement and keep this portfolio community-driven.

Here are our past portfolios for reference:

RSC Managed Portfolio (V2)

[visualizer id="84848"]

RSC Unmanaged Altcoin Portfolio (V2)

[visualizer id="78512"]

RSC Managed Portfolio (V1)