
FTC: Vast Surveillance of Users by Social Media and Video Streaming Companies

To me, what's missing from that set of recommendations is some method to increase the liability of companies who mishandle user data.

It is insane to me that I can be notified via physical mail of months old data breaches, some of which contained my Social Security number, and that my only recourse is to set credit freezes from multiple credit bureaus.

15 hours ago | srndsnd

> To me, what's missing from that set of recommendations is some method to increase the liability of companies who mishandle user data.

As nice as this is on paper, it will never happen: lobbyists exist. Not to put on a tinfoil hat, but why would any lawmaker slap the hand that feeds them?

Until there is an independent governing body which is permitted to regulate the tech industry as a whole, it won't happen. Consider the FDA: it decides which drugs and ingredients are allowed, and that's all fine. There could be a regulating body which could determine, for example, the risk to people's mental health from 'features' of tech companies. But getting that body created will require a tragedy, like the one that led to the FDA being created in the first place. [1]

That's just my 2 cents.

1 : https://www.fda.gov/about-fda/fda-history/milestones-us-food....

12 hours ago | bilekas

>There could be a regulating body which could determine the risk to people's mental health for example from 'features' of tech companies etc.

I think ideas like this are why it's not going to happen.

Our understanding of mental health is garbage. Psychiatry used to be full of quackery and very well still might be. Treatment for something like depression boils down to "let's try drugs in a random order until one works". It's a field where a coin flip rivals the accuracy of studies. Any regulating body in that area will therefore just be political. It will be all about the regulators "doing something" because somebody wrote enough articles (propaganda).

Problems like this are why people aren't interested in supporting such endeavors.

8 hours ago | Aerroon

That is not the treatment for depression.

This argument reduces mental health to medication, which leaves aside everything from the history of mental health (asylums and witch burnings to today), to leaps in medicine (from lobotomies to SNRIs, bipolar medications, and more), to simply better diagnoses.

There are certainly tons of people here who have benefited from mental health professionals - overextending the flaws in psych simply to dismiss the idea of a watchdog is several unsupported arguments too far.

4 hours ago | intended

I disagree, in brief because the practical side of psychiatry is medication-dominated, mostly because medical research is difficult and expensive.

There are some non-medication treatments for some psychiatric symptoms, such as those caused by trauma (prominently, EMDR), that some hail as actual cures, and maybe even for depression (I am clearly not a doctor), but in the case of depression I think you'll find it's quite medication-heavy.

The reason for this is that psychiatrists are medical doctors, and psychiatry is a medical field, which is of course bounded by the means of medical science. This is not to say there is some "magic" at work which science could never understand--not at all. It is merely the case that medical doctors are a research-paper-oriented bunch, and most of the medical research which makes it into practice relates either to anatomy or to pharmaceutical interventions.

Most of the treatments we have are pharmaceutical medications because most of our research dollars have gone into pharmaceutical research.

I decided to edit this comment to add: in my personal opinion, it is probable that psychiatrists et al., writ large as it were, have already figured out how to cure depression. Only, we cannot really manage to employ it, because it isn't a pill, therapy, device, or surgery.

an hour ago | night862

YMMV, but it took me 15 minutes start to finish to freeze my credit with the 3 bureaus using the following instructions.

https://www.nerdwallet.com/article/finance/how-to-freeze-cre...

4 hours ago | runjake

Ok, but this is something that shouldn't be my problem. And it's not just that; I have to go unfreeze it if someone needs to run a credit check.

an hour ago | saagarjha

I’m completely sympathetic to making companies more liable for data security. However, until data breaches regularly lead to severe outcomes for subjects whose personal data was leaked, and those outcomes can be causally linked to the breaches in an indisputable manner, it seems unlikely for such legislation to be passed.

12 hours ago | layer8

I forgot where I saw this, but the US govt recently announced that they see mass PII theft as a legitimate national security issue.

It’s not just that you or I will be inconvenienced with a bit more fraud or email spam, but rather that large nation state adversaries having huge volumes of data on the whole population can be a significant strategic advantage

And so far we typically see email + password + SSN be the worst data leaked; I expect attackers will put in more effort to get better data where possible: images, messages, GPS locations, etc.

11 hours ago | wepple

yes, privacy is not an individual problem; it's a civil defense problem, and not just when your opponent is a nation-state. we already saw this in 02015 during the daesh capture of mosul; here's the entry from my bookmarks file:

https://www.facebook.com/dwight.crow/media_set?set=a.1010475... “#Weaponry and morale determine outcomes. The 2nd largest city of Iraq (Mosul) fell when 1k ISIS fighters attacked “60k” Iraqi army. 40k soldiers were artifacts of embezzlement, and of 20k real only 1.5k fought - these mostly the AK47 armed local police. An AK47 loses to a 12.7mm machine gun and armored suicide vehicle bombs. Finally, the attack was personal - soldiers received calls mid-fight threatening relatives by name and address. One army captain did not leave quickly enough and had two teenage sons executed.” #violence #Iraq #daesh

of course the americans used this kind of personalized approach extensively in afghanistan, and the israelis are using it today in lebanon and gaza, and while it hasn't been as successful as they hoped in gaza, hamas doesn't exactly seem to be winning either. it's an asymmetric weapon which will cripple "developed" countries with their extensive databases of personal information

why would a politician go to war in the first place if the adversary has the photos and imeis of their spouse, siblings, and children, so they have a good chance of knowing where they are at all times, and the politician can't hope to protect them all from targeted assassination?

the policy changes needed to defend against this kind of attack are far too extreme to be politically viable. they need to be effective at preventing the mere existence of databases like facebook's social graph and 'the work number', even in the hands of the government. many more digital pearl harbors like the one we saw this week in lebanon will therefore ensue; countries with facebook, credit bureaus, and national identity cards are inevitably defenseless

imposing liability on companies whose data is stolen is a completely ineffective measure. first, there's no point in punishing people for things they can't prevent; databases are going to get stolen if they're in a computer. second, the damage done even at a personal level can vastly exceed the recoverable assets of the company that accumulated the database. third, if a company's database leaking got your government overthrown by the zetas or daesh, what court are you going to sue the company in? one operated by the new government?

9 hours ago | kragen

Are you saying you think more critical government databases than OPM or security clearance rosters are inevitably going to be breached? I'd like to think the government or corporation can effectively protect some databases at least...

8 hours ago | treypitt

those are already pretty bad, but i think the really dangerous ones are things like verizon's billing records and customer location history, credit card transaction histories, license plate registrations, credit bureau histories, passport biometrics, enough voice recordings from each person for a deepfake, public twitter postings, etc.

consider https://en.wikipedia.org/wiki/1943_bombing_of_the_Amsterdam_...:

> The 1943 bombing of the Amsterdam civil registry office was an attempt by members of the Dutch resistance to destroy the Amsterdam civil registry (bevolkingsregister), in order to prevent the German occupiers from identifying Jews and others marked for persecution, arrest or forced labour. The March 1943 assault was only partially successful, and led to the execution of 12 participants. Nevertheless, the action likely saved many Jews from arrest and deportation to Nazi extermination camps.

to avoid partisan debate, imagine a neo-nazi group takes over the us, which presumably we can all agree would be very bad. after they took over, how hard would it be for them to find all the jews? not just make a list of them, but physically find them? (much easier than it was in 01943, i'm sure we can agree.) how hard would it be for them to find all the outspoken anti-fascists? where could those anti-fascists hide?

now, step it up a notch. how hard would it be for them to find all the jews before they take over? it wouldn't be that hard if the databases leak. and if you feel safe because you're not jewish, rest assured that neo-nazis aren't the only groups who are willing to use violence for political ends. someone out there wants you dead simply because of the demographic groups you belong to. the reason you haven't been seeing widespread political violence previously is that it hasn't been a winning strategy

the situation is changing very fast

7 hours ago | kragen

Hey, on a long enough timeline the answer will tend towards yes.

Do note that this isn't just an American problem.

Your data is probably on DBs in other nations.

Plus - the playbook is to target weaker nations and then use them for staging grounds to target stronger nations.

4 hours ago | intended

Perhaps you're not aware of https://en.wikipedia.org/wiki/Office_of_Personnel_Management...

9 hours ago | dantheman

Very aware of that. That to me seemed like a targeted attack by a tracked APT group. What I’m referring to above is that the more vanilla attacks (ex: popular online mattress store gets popped) actually have national security implications, despite seeming like just an inconvenience

8 hours ago | wepple

> Even minutiae should have a place in our collection, for things of a seemingly trifling nature, when enjoined with others of a more serious cast, may lead to valuable conclusion.

— George Washington.

2 hours ago | grugq

They’d need a lot less security if they stopped spying on us and saving all of our most critical ID data, period.

9 hours ago | EasyMark

Nearly everyone's data has been leaked already. Any strong protections would only protect people who haven't been born yet imo.

7 hours ago | deegles

Then instead of regulating the companies, make SSNs easily revokable and unique per service. I don't understand why Americans are so opposed to a national ID despite the fact that every KYC service uses SSNs and driver's licenses.

11 hours ago | Onavo
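The "unique per service" idea above can be sketched with a keyed hash. Everything here (the derivation scheme, the names, the truncation length) is an illustrative assumption, not any real government API or proposal:

```python
# Sketch: derive a distinct, revocable identifier per service from one
# master secret, instead of handing every service the same SSN.
# All names and parameters here are hypothetical.
import hashlib
import hmac

def service_id(master_secret: bytes, service_name: str) -> str:
    """Per-service pseudonym: leaking one service's copy reveals nothing
    about the identifiers held by other services."""
    digest = hmac.new(master_secret, service_name.encode(), hashlib.sha256)
    return digest.hexdigest()[:16]

secret = b"issued-by-id-authority-v1"  # hypothetical issued secret
bank = service_id(secret, "example-bank")
telco = service_id(secret, "example-telco")
assert bank != telco  # each service sees a different number

# "Revocation" = the authority issues a new master secret;
# every derived identifier changes, old leaked copies go stale.
assert service_id(b"issued-by-id-authority-v2", "example-bank") != bank
```

A leaked per-service number is then only useful at the service it was derived for, which is the property the comment is asking for.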

Because they're the mark of the beast or a step towards fascism or something.

I don't think it would take much to convert Real IDs into a national ID; they are as close as they can get without "freaking people out".

11 hours ago | candiddevmike

Emphasizing that the number can be changed would really help there.

People could even generate their own number (a private key), which they never gave out, and which appeared differently to each account manager verifying it, and still replace it.

When you choose your own number, it's only the Mark of the Beast if you are the Beast! * **

* 666, 13, 69 and 5318008 expressly prohibited.

** Our offices only provide temporary tattoos.

8 hours ago | Nevermark

The expansion of KYC and the hegemonic dominance of our global financial intelligence network is a recent infringement on our privacy that would not necessarily pass popular muster if it became well-known.

Most of our population is still living in a headspace where transactions are effectively private and untraceable, from the cash era, and has not considered all the ways that the end of this system makes them potential prey.

The fact is that the market is demanding a way to identify you both publicly and privately, and it will use whatever it needs to, including something fragile like telephone-number 2FA, where you have no recourse when something goes wrong. It's already got a covert file on you a mile long, far more detailed than anything the intelligence agencies have bothered putting together. The political manifestation of anti-ID libertarians is wildly off base.

11 hours ago | mapt

The concern about organizations' and the government's feeling that they need to track you is a very valid one. Why does the government need to make sure your "hand job from a friend" Venmo payment to your friend is "legally legit"? (You can get transactions flagged for this, and the moderator will shame you.)

Are you correct in what's going on? Yes. Are we placed in this with no option to resist? For the most part yes.

5 hours ago | monksy

"What fraction of the FBI and CIA do the Communists have blackmail material on?"

11 hours ago | mapt

I think the only reason we're seeing this revelation from a federal agency after 20 years is to boost the government's case against TikTok.

7 hours ago | nimbius

If your identity gets stolen, you should be able to sue all the companies that had a leak.

7 hours ago | bhhaskin

Sounds like a bunch of crap the industry is already trying to sell the public, and no, it's not working, and yes, we can do without it.

8 hours ago | trinsic2

I agree. Let me tell you about what just happened to me. After a very public burnout and spiral, a friend rescued me and I took a part-time gig helping a credit card processing company. About 2 months ago, the owner needed something done while I was out, and got their Uber driver to send an email. They emailed the entire customer database, including bank accounts, Social Security numbers, names, addresses, and finance data, to a single customer. When I found out (it was kept hidden from me for 11 days), I said "This is a big deal; here are all the remediations, and besides PCI, we have 45 days by law to notify affected customers." The owner said "we aren't going to do that", and thus I had to turn in my resignation and am now unemployed again.

So I, having tried to do the right thing, am now scrambling for work, while the offender pretends nothing happened after potentially violating the entire customer base, and will likely suffer no penalty unless I report it to PCI, which I would get no reward for.

Why is it that everywhere I go, management is always doing shady stuff? I just want to do linuxy/datacentery things for someone who's honest... /cry

My mega side project isn't close enough to do a premature launch yet. Despite my entire plan being to forgo VC/investors, I'm now considering compromising.

13 hours ago | arminiusreturns

>Why is it everywhere I go management is always doing shady stuff.

Well here's a cynical take on this - management is playing the business game at a higher level than you. "Shady stuff" is the natural outcome of profit motivation. Our society is fundamentally corrupt. It is designed to use the power of coercive force to protect the rights and possessions of the rich against the threat of violence by the poor. The only way to engage with it AND keep your hands clean is to be in a position that lets you blind yourself to the problem. At the end of the day, we are all still complicit in enabling slave labor and are beneficiaries of policies that harm the poor and our environment in order to enrich our lives.

>unless I report it to PCI, which I would get no reward for.

You may be looking at that backwards. Unless you report it to PCI, you are still complicit in the mishandling of the breach, even though you resigned. You might have been better off reporting it over the owner's objections, then claiming whistleblower protections if they tried to terminate you.

This is not legal advice, I am not a lawyer, I am not your lawyer, etc.

13 hours ago | aftbit

I did verify with an attorney that, since I wasn't involved and made sure the owner knew what was what, I had no legal obligation to disclose.

13 hours ago | arminiusreturns

The problem isn't society or profit motivation. It's people. Humanity itself is corrupt. There aren't "good people" and "bad people". There's only "bad people." We're all bad people, just some of us are more comfortable with our corruption being visible to others to a higher degree.

13 hours ago | positus

> We're all bad people, just some of us are more comfortable with our corruption being visible to others to a higher degree.

If the GP's story is true (and I have no reason to suspect otherwise), then there are clearly differences in the degree of "badness" between people. GP chose to resign from his job, while his manager chose to be negligent and dishonest.

So, even if we're all bad people, there are less bad and more bad people, so we might as well call the less bad end of the spectrum "good". Thus, there are good and bad people.

12 hours ago | ragnese

I understand your perspective, but I maintain that "good" (morally pure) isn't a category any of us belong to. We're all lying, hateful people to one extent or another, and lying hateful people aren't "good", even if we haven't lied or hated as much as other lying, hateful people. "Less evil" isn't synonymous with "good".

The argument that profit motivation is the origin of shady business practices ignores the existence of those businesses which pursue profit in an ethical manner. The company I work for, for instance, is highly motivated to produce a profit, but the way we go about obtaining that profit is by providing our customers with products that have real value, at fair (and competitive) prices, and by providing consistently excellent customer support. Our customers are *very* satisfied with our products and services, and they show their satisfaction with extreme brand loyalty. The profit we make year over year allows us to increase the quality of life for our employees, and keeps our employees highly motivated towards serving our customers. We pursue the good of our customers alongside our own, and we avoid shady business practices like the plague.

5 hours ago | positus

The DOJ has just launched a corporate whistleblower program; you should look into it, maybe it covers your case:

https://www.justice.gov/criminal/criminal-division-corporate...

>As described in more detail in the program guidance, the information must relate to one of the following areas: (1) certain crimes involving financial institutions, from traditional banks to cryptocurrency businesses; (2) foreign corruption involving misconduct by companies; (3) domestic corruption involving misconduct by companies; or (4) health care fraud schemes involving private insurance plans.

>If the information a whistleblower submits results in a successful prosecution that includes criminal or civil forfeiture, the whistleblower may be eligible to receive an award of a percentage of the forfeited assets, depending on considerations set out in the program guidance. If you have information to report, please fill out the intake form below and submit your information via CorporateWhistleblower@usdoj.gov. Submissions are confidential to the fullest extent of the law.

13 hours ago | ValentinA23

Why would you resign? You could have reported it yourself and then you would have whistleblower protections - if the company retaliated against you (e.g. fired you), you then would have had a strong lawsuit.

13 hours ago | TinyRick

Because I don't want to be associated with companies that break the law and violate regulations knowingly. I've long had a reputation of integrity, and it's one of the few things I have left having almost nothing else.

13 hours ago | arminiusreturns

So you would rather be known as someone who had an opportunity to report a violation and chose not to? From my perspective it seems like you decided against acting with integrity in this situation - the moral thing would have been to report the violation, but you chose to look the other way and resign.

13 hours ago | TinyRick

> it seems like you decided against acting with integrity in this situation ... you chose to look the other way and resign.

I agree with this statement.

This isn't a judgement; we all have to make choices. The "right" choice (the one that aligns with integrity) is usually the one that is the least self-serving, and even temporarily harmful. They did what was right for them, and that's okay, but it was not the choice of integrity.

8 hours ago | 1659447091

How is quitting right for them? They chose a path that's bad for the users and bad for them.

6 hours ago | Dylan16807

Because that is the choice they made for themselves.

How it plays out afterward is another matter entirely. But the choice was what they seemed to think was right, for them, at the time. Thus it was the right choice for them. It doesn't mean it was the right choice in terms of integrity, or the right choice for me, or you, or anyone whose data got caught up in it. Nor was it the right choice in terms of receiving a paycheck the next week.

But the way it was explained, it doesn't seem like they went out of their way to pick a "wrong" choice, specifically. They picked what they felt was the right one, for them, at that time. There were less ethical options to choose as well, and those were not picked either.

5 hours ago | 1659447091

Someone choosing an action does not at all mean it's the right choice for them.

5 hours ago | Dylan16807

I believe we are talking two separate things.

You appear to be talking about the external consequences of choices, while I am talking about them making a choice based on what they believed was the inner rightness of their choice. They did not want to be associated with a company like that, so they made the choice not to be -- because it aligned with their inner knowing of not wanting to be a part of that company. The right or wrongness in terms of external consequences is not what makes the choice right or wrong -- for them.

5 hours ago | 1659447091

But they left the vast majority of the morality on the table. They even talked to a lawyer to avoid reporting. So in the sense of making the choice that aligns with inner rightness and makes them moral, they still made a bad choice.

4 hours ago | Dylan16807

> making the choice that aligns with inner rightness

Again, I am talking about -- them -- not anyone one else or what anyone else thinks of it outside of them. I am not talking about "inner rightness" in general, I am talking "what they believed was the inner rightness of their choice" -- Their inner rightness. You seem to be talking about what -- you and/or others -- may believe from an outside perspective. My outside perspective is they made the choice that did not align with integrity. But that does not mean that was not the right choice for them.

And again, they made the right choice, for them -- at that time. How that plays out after is neither here nor there, and your labeling it a "bad" choice for them is akin to saying that they have no real agency over their choices, and that we outside of them are the final say in what is good or bad for that person.

3 hours ago | 1659447091

I wonder if I was part of the database that got emailed.

12 hours ago | qup

Very unlikely, this is a very small operation with a tiny customer base.

10 hours ago | arminiusreturns

As in... his actual Uber driver? He just handed his laptop over?

13 hours ago | mikeodds

Yes. The owner is old and going blind, but refuses to sell or hand over day-to-day ops to someone else, and thus must ask for help with almost everything. I even pulled on my network to find a big processor with a good reputation to buy the company, but after constant delays and excuses for not engaging with them, I realized that to the owner the business is both their "baby" and their social life, neither of which they want to lose.

13 hours ago | arminiusreturns

Regulation is key, but I don't see it as likely when our society is poisoned by culture-war BS. Once we put that behind us (currently unlikely), we can pass sane laws reining in huge corporations.

15 hours ago | alsetmusic

[flagged]

12 hours ago | OkeyDokey2

[flagged]

13 hours ago | OkeyDokey2

This does nothing for them being able to continue with shadow profiles and inferences about you based on data they gather from others in your social network. It is well beyond "data you provide". Like waaaaay beyond.

13 hours ago | dylan604

[flagged]

12 hours ago | OkeyDokey2

I get a feeling that liability is the missing piece in a lot of these issues. Section 230? Liability. Protection of personal data? Liability. Minors viewing porn? Liability.

Lack of liability is screwing up the incentive structure.

13 hours ago | 2OEH8eoCRo0

I think I agree, but people will have very different views on where liability should fall, and on whether there should be a malicious / negligent / no-fault model.

Section 230? Is it the platform or the originating user that's liable?

Protection of personal data? Is there a standard of care beyond which liability lapses (e.g. a nation state supply chain attack exfiltrates encrypted data and keys are broken due to novel quantum attack)?

Minors viewing porn? Is it the parents, the ISP, the distributor, or the creator that's liable?

I'm not here to argue specific answers, just saying that while everyone will agree liability would fix this, few will agree on who should be liable for what.

13 hours ago | brookst

It's not a solvable problem. Like most tech problems it's political, not technical. There is no way to balance the competing demands of privacy, security, legality, and corporate overreach.

It might be solvable with some kind of ID escrow, where an independent international agency managed ID as a not-for-profit service. Users would have a unique biometrically-tagged ID, ID confirmation would be handled by the agency, ID and user behaviour tracking would be disallowed by default and only allowed under strictly monitored conditions, and law enforcement requests would go through strict vetting.

It's not hard to see why that will never happen in today's world.

13 hours ago | TheOtherHobbes

> It's not a solvable problem

Lawnmower manufacturers said the same thing about making safe lawnmowers. Until government regulations forced them to

12 hours ago | malfist

https://i.imgur.com/mXU28ta.jpeg

Specifically, 1970.

11 hours ago | ToucanLoucan

Well, something to consider is that part of why everything is so much more expensive these days is that a lot of the solutions to those problems add costs. That cost needs to be absorbed by the price.

One of the reasons it's so expensive to build a house is safety regulations. They exist for a reason, but they nevertheless add a substantial cost to building a house. If you had mandated such a cost on people living in 1870, then a lot fewer people could've afforded a house.

8 hours ago | Aerroon

>Protection of personal data? Is there a standard of care beyond which liability lapses (e.g. a nation state supply chain attack exfiltrates encrypted data and keys are broken due to novel quantum attack)?

There absolutely should be, especially for personal data collected and stored without the express written consent of those being surveilled. They should have to get people to sign off on the risks of having their personal data collected and stored, be legally prevented from collecting and storing the personal data of people who haven't consented and/or be liable for any leaking or unlawful sharing/selling of this data.

12 hours ago | StanislavPetrov

If you aren’t directly harmed yet what liability would they have? I imagine if your identity is stolen and it can be tied to a breach then they would already be liable.

14 hours ago | zeroonetwothree

The fact that my data can be stolen in the first place is already outrageous, because I neither consented to allowing these companies to have my data, nor benefit from them having my data.

It's like if you go to an AirBNB and the owner sneaks in at night and takes photos of you sleeping naked and keeps those photos in a folder on his bookshelf. Would you be okay with that? If you're not directly harmed, what liability would they have?

Personal data should be radioactive. Any company retaining it better have a damn good reason, and if not then their company should be burned to the ground and the owners clapped in irons. And before anyone asks, "personalized advertisements" is not a good reason.

14 hours ago | kibwen

That's the big problem with relying on tort law to curb this kind of bad corporate behavior: The plaintiff has to show actual injury or harm. This kind of bad behavior should be criminal, and the state should be going after companies.

14 hours ago | ryandrake

I don't think that's a proper parallel.

I think a better example would be You (AirBnB Host) rent a house to Person and Person loses the house key. Later on (perhaps many years later), You are robbed. Does Person have liability for the robbery?

Of course it also gets really muddy, because you'll have been renting the house out for those years, and during that time many people will have lost keys. So does liability get divided? Is it the most recent lost key?

Personally, I think it should just be some statutory damages of probably a very small amount per piece of data.

14 hours ago | lesuorac

> I think a better example would be You (AirBnB Host) rent a house to Person and Person loses the house key.

This is not a direct analogue; a closer analogy would be the guest creating a copy of the key (why?) without my direct consent (signing a 2138-page "user agreement" doesn't count) and, at some later point when I am no longer renting to them, losing that copy.

13 hours ago | polygamous_bat

I'm still much more interested in the answer to who is liable for the robbery.

Just the robber? Or are any of the key-copiers (instead of key-losers, w/e) liable as well?

13 hours ago | lesuorac

I don't really care about the answer to that specific question, where there's only one household.

What I will say is that the guy who has copies of 20,000 people's keys should get in trouble if he loses his hoard.

6 hours ago | Dylan16807

The particular problem comes in because the amount of data lost tends to be massive when these breaches occur.

It's kind of like the idea of robbing a minute from someone's life. It's not very much to an individual, but across large populations it's a massive theft.

13 hours ago | pixl97

Sure, and if you pay a statutory fine times 10 million, it becomes a big deal, and companies would therefore be incentivized to protect data better the larger they get.

Right now they probably get some near free rate to offer you credit monitoring and dgaf.

13 hours ago | lesuorac
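As a back-of-the-envelope illustration of how a small per-record statutory amount scales with breach size (both figures here are hypothetical):

```python
# Even a modest per-record statutory fine becomes a real deterrent at
# breach scale. Both numbers are hypothetical, for illustration only.
records_leaked = 10_000_000
fine_per_record = 50  # USD per leaked record, hypothetical

total = records_leaked * fine_per_record
print(f"${total:,}")  # prints $500,000,000
```

A fine that is trivial per record is existential in aggregate, which is the incentive structure the comment is describing.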

This version loses multiple important parts:

1. I have no control over what was stored
2. I have no control over where the storage is

The liability in this case falls on the homeowner/host, as you had (and should have exercised) full ability to change out the locks.

To make it more similar, I think you'd need one of the guests to have taken some amount of art off the wall and brought it to a storage unit, and then the art was later stolen from the storage unit, and you don't have access to the storage unit.

It's not as good as the naked-pictures example, because what's been taken is a copy of something sensitive, not the whole thing.

13 hours ago | 8note

> before anyone asks, "personalized advertisements" is not a good reason

The good reason is growth. Our AI sector is based, in large part, on the fruits of these data. Maybe it's all baloney, I don't know. But those are jobs, investment, and taxes that e.g. Europe has skipped out on and that America and China are capitalising on.

My point, by the way, isn't pro surveillance. I enjoy my privacy. But blanket labelling personal data as radioactive doesn't seem to have any benefit to it outside emotional comfort. Instead, we need to do a better job of specifying which data are harmful to accumulate and why. SSNs are obviously not an issue. Data that can be used to target e.g. election misinformation are.

13 hours agoJumpCrisscross

See - your problem is you think you're talking to politicians, Facebook-era journalists, disinfo activists.

Most people here have thought about the topic of privacy in the modern era far more than some 70-year-old politician has.

33 minutes agorockskon

So you're saying it's all vastly valuable and that's why it is right that it is taken without consent or compensation?

10 hours agothfuran

> it's all vastly valuable and that's why it is right that it is taken without consent or compensation?

No, I'm saying it's a commons with a benefit to utilisation. A lot of discussions around data involve zealots on both sides. (One claiming it's their god-given right to harvest everyone's personal information. The other acting like it's the crime of the century for their email address to be leaked.)

8 hours agoJumpCrisscross

[flagged]

12 hours agoin2thec

I mean it's pretty clear that you are directly harmed if someone takes naked photos of you without your knowledge or consent and then keeps them. It's not a good analogy so if we want to convince people like the GP of the points you're making, you need to make a good case because that is not how the law is currently structured. "I don't like ads" is not a good reason, and comments like this that are seething with rage and hyperbole don't convince anyone of anything.

14 hours agopc86

What is the harm? It is not obvious to me, if the victim is unaware...unless you are alleging simply that there is some ill-defined right to privacy. But if that is so, why does it apply to my crotch and not my personal data?

14 hours agodrawkward

These are exactly my questions. If I never, ever know about those pictures and never, ever have my life affected by those pictures, what is the actual harm to me?

If the answer to them ends up being "Well, it's illegal to take non-consensual nudie pictures.", then my follow-up question is "So, why isn't the failure to protect my personal information also illegal?".

To be perfectly clear, I do believe that the scenario kibwen describes SHOULD be illegal. But I ALSO believe that it should be SUPER illegal for a company to fail to secure data that it has on me. Regardless of whether they are retaining that information because there is literally no way they could provide me with the service I'm paying them for without it, or if they're only retaining that information in the hopes of making a few pennies off of it by selling it to data brokers or whoever, they should have a VERY SERIOUS legal obligation to keep that information safe and secure.

13 hours agosimoncion

> to fail to secure data that it has on me

Just want to point out that the company is usually also doing what it can to get other information about you without your consent based on other information it has about you. It's a lot closer to the "taking non-consensual nudie pictures" than "fail to secure data" makes it sound.

12 hours agolcnPylGDnU4H9OF

[flagged]

12 hours agoin2thec

> it's pretty clear that you are directly harmed if someone takes naked photos of you without your knowledge or consent and then keeps them

Sure. In those cases, there are damages and that creates liability. I'm not sure what damages I've ever faced from any leak of e.g. my SSN.

13 hours agoJumpCrisscross

I mean, most people won't until the day they find out there's a house in Idaho under their name (and yes, I've seen just this happen).

The problem here is that, because of all these little data leaks, you as an individual now bear the cost of ensuring that others out there are not using your identity, and if it happens, you have to clean up the mess by pleading it wasn't you in the first place.

13 hours agopixl97

>I neither consented to allowing these companies to have my data, nor benefit from them having my data.

I think both of those are debatable.

13 hours agoranger_danger

[flagged]

12 hours agoin2thec

Go ahead, post your phone number here. It's not directly harmful.

14 hours agodrawkward

1-800-call-FEDS

10 hours agoblondelegs

Bahahaha :)

8 hours agodrawkward

This is the traditional way of thinking, and a good question, but it is not the only way.

An able-bodied person can freely make complaints against any business that fails its Americans with Disabilities Act obligations. In fact, these complaints by able-bodied do-gooders are the de facto enforcement mechanism, even though these people can never suffer damage from that failure.

The answer is simply to legislate the liability into existence.

13 hours agohalJordan

That's the whole problem with "liability", isn't it? If the harms you do are diffuse enough then nobody can sue you!

14 hours agoidle_zealot

The same way you can get ticketed for speeding in your car despite not actually hitting anyone or anything.

13 hours agosqueaky-clean

Surveillance apologist.

14 hours agodrawkward

This is exactly why thinking of it in terms of individual cases of actual harm, as Americans have been conditioned to do by default, is precisely the wrong way to think about it. We're all familiar with the phrase "an ounce of prevention is worth a pound of cure", right?

It's better to think of it in terms of prevention. This fits into a category of things where we know they create a disproportionate risk of harm, and we therefore decide that the behavior just shouldn't be allowed in the first place. This is why there are building codes that don't allow certain ways of doing the plumbing that tend to lead to increased risk of raw sewage flowing into living spaces. The point isn't to punish people for getting poop water all over someone's nice clean carpet; the point is to keep the poop water from soaking the carpet in the first place.

14 hours agobunderbunder

Safety rules are written in blood. After a disaster there’s a push to regulate. After enough years we only see the costs of the rules and not the prevented injuries and damage. The safety regulations are then considered annoying and burdensome to businesses. Rules are repealed or left unenforced. There is another disaster…

13 hours agosupertrope

Tangentially, there was an internet kerfuffle about someone getting in trouble for having flower planters hanging out the window of their Manhattan high rise apartment a while back, and people's responses really struck me.

People from less dense areas generally saw this as draconian nanny state absurdity. People who had spent time living in dense urban areas with high rise residential buildings, on the other hand, were more likely to think, "Yeah, duh, this rule makes perfect sense."

Similarly, I've noticed that my fellow data scientists are MUCH less likely to have social media accounts. I'd like to think it's because we are more likely to understand the kinds of harm that are possible with this kind of data collection, and just how irreparable that harm can be.

Perhaps Americans are less likely to support Europe-style privacy rules than Europeans are because Americans are less likely than Europeans to know people who saw first-hand some of what was happening in Europe in the 20th century.

12 hours agobunderbunder

Behind the ball by 15 years to start taking this seriously and beginning to think about pushing back, but better late than never.

Next, please rein in the CRAs.

15 hours agovundercind

I think Snowden was bang on when in 2013 he warned us of a last chance to fight for some basic digital privacy rights. I think there was a cultural window there which has now closed.

15 hours agoflycaliguy

Snowden pointed and everyone looked at his finger. It was a huge shame, but a cultural sign that the US is descending into a surveillance hell hole and people are ok with that. As someone who was (and still is) vehemently against PRISM and NSLs and all that, it was hard to come to terms with. I'm going to keep building things that circumvent the "empire" and hope people start caring eventually.

14 hours agoorthecreedence

> and people are ok with that

I've seen no evidence of this. People mostly either don't understand it or feel powerless against it.

14 hours agodigging

There's also a vast number of people who were just too young to be aware of Snowden's revelations. These people are now primarily on TikTok and whatnot, and I doubt there's much in those feeds to bring those revelations to light while directly feeding the beast of data hoarding.

13 hours agodylan604

>> and people are ok with that

> I've seen no evidence of this. People mostly either don't understand it or feel powerless against it.

Isn't feeling powerless ultimately the same thing as being ok with it: complacency?

3 hours agotommiegannert

> I've seen no evidence of this

Over 99% of Americans point a camera at themselves while they take a shit.

12 hours agodavisr

And I'd bet over 99% of those people have never once considered that said camera could even be capable of saving any data without them operating it.

12 hours agolcnPylGDnU4H9OF

Very doubtful they've not considered it. When I go to coffee shops, I see maybe a quarter-to-half the laptops have a shade over the webcam. But when I see people using their phones, I've never once seen them use a shade, piece of tape, or post-it note.

They use the front-facing camera of their phone so often that the recurring inconvenience of removing a shade outweighs, for them, the long-term risk of malware snapping an exposing photo.

12 hours agodavisr

But do you think they're taking a measured inventory of the possible consequences, both personal and societal, and saying, "No, I don't value that" ?

Extremely few decisions that people make are deeply calculated with cold logic. Most decisions are primarily unconscious, automatic, and emotional.

Example: A person hears it's good to have a webcam cover, so they get one. Nobody mentions doing it for their phone, so they never even think about it. Then someday a friend does mention it, but that would be an inconvenient change, so the person's gut puts up resistance against considering it too strongly. They give in to their emotional response, instead of doing the hard work of changing their emotions based on the knowledge they have.

At no point in the above scenario would the person state "I don't think mass surveillance is a bad thing." That's what I mean when I say people "aren't ok with it."

If one's definition of people being "ok with mass surveillance" just means they tolerate it, that they don't sufficiently resist it (and what level of resistance is sufficient? For a person with a webcam cover but no phone cam cover? Does adding a phone cam cover mean they've declared their opposition to mass surveillance?), then how can you say people aren't okay with literally everything evil or wrong? Most people just won't summon enough activation energy to fight any given injustice around them, no matter how egregious it is. That's not a reflection of their morals and values, it's a reflection of how fucking tired we all are.

I would challenge you to offer up in detail how strongly you have worked to resist mass surveillance in your life. You're logged in and posting on HN, so my guess is, you haven't worked hard enough at it according to someone's metric. Do you have a cover on your phone camera? Just the front one or both? Do you have a cover on the microphones? Do you let others add your number in their contacts or do you refuse to ever give out your real phone number?

10 hours agodigging

The cover over the webcam might not be for security per se. It could be that they don't want anyone at work, or at home, to accidentally see where they are. If you cover the camera, you don't have to worry about any such accidents.

My gut says that's the reason for most people.

11 hours agochiefalchemist

Snowden couldn't convince people that the privacy he was talking about meant a limit on government power. Not sensitive data. And honestly, nobody cares about anyone taking a shit.

You can advocate for limiting govt. power ("LGP") without leaking any NSA docs. I don't think a single story about "LGP" changed due to the leaks. Everyone knows the government can do a lot of violence on you. So it's very hard.

If you're a high drama personality, yeah you can conflate all these nuanced issues. You can make privacy mean whatever you want.

12 hours agodoctorpangloss
[deleted]
12 hours ago

I've seen no evidence people aren't ok with that. Most people around me didn't care about the Snowden revelations. It was only tech people who tightened up security.

13 hours agoimmibis

This is my experience as well. I talked to a LOT of people after the Snowden debacle (techies and otherwise) and the general attitude was "so what? they aren't using the information for anything bad!" or "I have nothing to hide!" (in this thread, for instance: https://news.ycombinator.com/item?id=41594775)

I think people don't really understand what an enormous sleeping dragon the entire thing is.

12 hours agoorthecreedence

> I think people don't really understand what an enormous sleeping dragon the entire thing is.

Isn't that what I said? Mostly we're debating semantics. My deeper point is that it's counterproductive and borderline misanthropic to argue "People just don't care about evil being done!" whereas the argument that "People seriously have no idea yet what they're 'agreeing' to" opens the door to actual solutions, for one inclined to work on them.

10 hours agodigging

But what is the "enormous sleeping dragon" my mom, dad, little sister, and teenage cousins need to understand? And even once it's patently clear, won't it just result in another "...and???"

7 hours agoneom

But won't you think of the children!

(EU is trying to implement chat control again...)

We need more real-world analogies... "see, this is like having a microphone recording everything you say in this bar"... "see, this is like someone ID-ing you in front of every store and recording which stores you've visited, and then following you inside to see what products you look at. See, this is like someone looking at your clothes and then pasting higher price tags on products. ..."

12 hours agoajsnigrutin

>and people are ok with that.

All the propagandists said he was a Russian asset, as if even if that were true, it somehow negated the fact that we were now living under a surveillance state.

>Snowden pointed and everyone looked at his finger.

This is a great way of putting it.

14 hours agoClubber

> it somehow negated the fact that we were now living under a surveillance state.

There have long been surveillance programs and also numerous laws outlining the responsibilities of telecom providers to enable wiretapping.

There's really nothing new from Snowden besides the names of a bunch of people to go kill cause they're spies.

FISA [1] isn't a private law either.

https://en.wikipedia.org/wiki/COINTELPRO

https://en.wikipedia.org/wiki/Mass_surveillance_in_the_Unite...

Note: 2006 (Klien) predates 2013 (Snowden)

https://en.wikipedia.org/wiki/Room_641A

[1]: https://en.wikipedia.org/wiki/Foreign_Intelligence_Surveilla...

14 hours agolesuorac

>There's really nothing new from Snowden besides the names of a bunch of people to go kill cause they're spies.

https://en.wikipedia.org/wiki/2010s_global_surveillance_disc...

14 hours agoClubber

You are dense. Imagine a government authorizes 10B for a bridge and then in 5 years a bridge shows up.

Now instead, imagine in 1978 [1] a government authorizes "United States federal law that establishes procedures for the surveillance and collection of foreign intelligence on domestic soil" and in 2008 [2] amends it to not be a big deal if they're foreign or not and then 5 years later it turns out they're doing just that.

These bills are not secret. Were not secret. Have never been secret. It's not my fault you didn't read them, but it doesn't make Snowden novel.

[1]: https://en.wikipedia.org/wiki/Foreign_Intelligence_Surveilla...

[2]: https://en.wikipedia.org/wiki/Foreign_Intelligence_Surveilla...

13 hours agolesuorac

>You are dense.

Well, maybe you're one of those propagandists. If you can't attack the idea, attack the person, right?

Hand waves, nothing new to see here, carry on.

The bills aren't what was exposed; it was more the techniques and scope. Like PRISM and XKeyscore, and circumventing laws by sharing intelligence on US citizens with allies who aren't restricted by US laws. Spying on allied governments, etc. You know, that stuff.

You should really click on the link.

https://en.wikipedia.org/wiki/2010s_global_surveillance_disc...

13 hours agoClubber

> There's long been surveillance programs and also numerous laws outlining the responsibilities of telecom provides to enable wire tapping.

Laws which the telecoms were knowingly and willfully breaking for years.

You do remember that Congress gave them retroactive immunity? [0][1] You do know that this was only granted because people COULD sue (and were suing) them because of the information made public by Snowden and others?

[0] <https://www.aclu.org/news/national-security/retroactive-tele...>

[1] See Title II of the this bill <https://www.congress.gov/bill/110th-congress/house-bill/6304>

12 hours agosimoncion

It makes me irrationally angry that I suddenly started getting spam emails from Experian. Like motherfucker I never consented for you to have my data, then you leak it all, now you're sending me unsolicited junk email? It's just such bullshit that I'm literally forced to have a relationship with these companies to freeze my credit or else I'm at the mercy of whoever they decide to release my information to without my authorization.

15 hours agodevonbleak

Yep. It sucks. Zero consequences of any import for those companies as far as I'm aware too. Tiny fines end up being "cost of doing business". Then they get to externalize their failures onto us by using terms like "Identity Theft", which indicates something was stolen from ME and is now MY problem.

In actuality some not-well-maintained systems owned by <corp> were hacked or exposed or someone perpetrated fraud on a financial institution and happened to use information that identifies me. It's really backwards.

PSA: If you haven't already, go freeze your credit at Experian, TransUnion, Equifax and Innovis. It will make the perpetration of this type of fraud much more difficult for adversaries.

15 hours agonicholasjarnold

PSA pro tip: they will try to steer you toward “locking” your account. Don’t fall for it. Freeze your account.

14 hours agosingleshot_

Do you know why they do this?

a minute agoHugsun

My pet solution has been to make the credit reporters liable for transmitting false information to the CRAs.

Chase tells Experian I opened a new line of credit with them, but it later is demonstrated that it was a scammer with my SSN? Congratulations, $5,000 fine.

Of course this all gets priced in to the cost and availability of consumer credit. Good! Now the lenders have an incentive to drive those costs down (cheaper, better identity verification) to compete.

15 hours agotwoodfin

Can you describe how you make them liable in this arrangement?

13 hours agotrinsic2

You can challenge entries on your credit report today. Win the challenge, and whoever reported the entry is liable to the Feds. Maybe add a modest bounty for the injured taxpayer.

12 hours agotwoodfin

The solution is much simpler. Put all of the consequences of being defrauded by a borrower onto the lender.

If a lender wants to be repaid, then they need to show the borrower all the evidence they have for proof that the borrower entered into the contract.

If all a lender has is the fact that a 9 digit number, date of birth, name, and address were entered online, then the borrower simply has to say “I did not enter that information”, and the lender can go pound sand.

Guarantee all the lenders will tighten up their operations very quickly, and consequently, so will the loans that appear on one’s credit report.

14 hours agolotsofpulp

Right. This is a problem between the lenders and the people who stole from the lenders. The person whose name/number was used shouldn't even be part of the transaction or part of the problem.

They call it "Identity Theft" instead of what it should be called: Bank fraud. The term "Identity Theft" 1. needlessly pulls an otherwise uninvolved person into the mix, suddenly making it their problem too, and 2. downplays the bank's negligence.

If someone uses my name to take out a loan, and the bank stupidly lets them, this shouldn't even remotely be my problem. I shouldn't even have to know about it. This is the bank's problem from their own stupidity.

13 hours agoryandrake

"Put all of the consequences of being defrauded by a borrower onto the lender" - that seems a bit strange.

Imagine saying "put all of the consequences of getting robbed onto the bank, not the robber"

11 hours agosib

Who bears the consequences of their home being robbed? Or mugged on the street? Or a contractor taking payment for services and then disappearing?

Why are we subsidizing lenders by putting this ridiculous burden on people who have nothing to do with the lender's business?

The lender can pay to appropriately verify their borrower’s identity, or go to court and sue for damages like everyone else has to.

10 hours agolotsofpulp

Lenders hand over bad loans to collection agencies (“accept the consequences”) all the time. Cost of doing business. That an innocent person’s credit is destroyed is just collateral damage from their perspective.

12 hours agotwoodfin

That's not an irrational reaction.

14 hours agorkagerer

The long term consequences of 9/11.

13 hours agonewsclues

Lina Khan has been on a tear. She actually seems to care about online human rights.

13 hours agocynan123

I think this effort is positive, but a bit misdirected. Think data breach liability. Facebook and YouTube are willing and capable defenders of sensitive customer data. Watch the AshleyMadison documentary. Arrogant disregard for customer privacy and almost no culpability. These smaller, irresponsible players are where consumers are most vulnerable.

9 hours agomontag

Agreed. Mid-sized/smaller players are the places which have very poor data & security practices. Especially when they require PII as part of their operations.

Meta, Google are much better stewards of their users data. One misconception I see is claiming these companies sell user data. I'd instead say that they sell user attention.

8 hours agogopkarthik

They don’t sell user data for a very simple reason: it’s a crappy business. You can charge much, much more via recurring sales of heavily obfuscated access to the data than by selling the data outright.

When you think about it, incentives are kind of aligned with user privacy (kind of, as there’s much more to the story than this simplistic point of view).

5 hours agojustapassenger

I will be surprised if she's still there six months from now. Trump will remove her if he becomes president; whereas if Harris wins, and the GOP take the Senate--a pretty likely scenario--I fear Harris won't hesitate to use Khan as a bargaining chip to gain confirmation of her appointments.

6 hours agoxhevahir

Ever stop and think it's funny that Meta, Google, etc. are worth billions because they figured out how to legally fill a database with information about you? In any other time in history some might call it spying, but well, they figured out how to do it legally, and it's worth billions. Meanwhile, from a technical standpoint, remotely logging your data is a trivial thing, with consent of course. It's like we made this imaginary wall (law) and spent billions building a road around that wall, and that's equivalent to economic prosperity. Similar idea with streaming services versus file sharing.

8 hours agodisambiguation

They are valuable because they built something billions of people use. I suspect the revenue loss from every one of these FTC recommendations being implemented would not have a material impact on either of the businesses you mentioned.

4 hours agoweixiyen

Spying is done without consent.

Why do people keep saying social media is just a database?

4 hours agocscurmudgeon

Facebook will create shadow profiles of you even if you've never signed up, never created a profile. They'll take your number from other people's contacts via WhatsApp. They'll do facial scans of you on photos other people upload.

Even if you've never visited their site.

Where's the consent there?

an hour agoWickyNilliams

The consent you give to web services isn't much better than if an electrician said "hey, can you tap this button to give me consent to work on your house?" and then installed undetectable hidden microphones inside every surface in your apartment.

All of the UX of online consent forms exists to misinform, trick, and get users used to agreeing to sell their digital soul.

an hour agoLlamamoe

This portion is particularly problematic:

> many companies engaged in broad data sharing that raises serious concerns regarding the adequacy of the companies’ data handling controls and oversight.

15 hours agoGeekyBear

It would be wonderful if the staff report recommendations were taken seriously by our legislators. I think I'll send a copy of this to my reps and say hi.

15 hours agomrmetanoia
[deleted]
7 hours ago

> The report found that the companies collected and could indefinitely retain troves of data, including information from data brokers, and about both users and non-users of their platforms.

As a non-user of many social media platforms, is there anything I can do to prevent companies from collecting data about me? It feels wrong that companies you do not sign up for are still finding and processing data about you.

9 hours agoSamuelAdams

Let’s add automakers to the list as well, with all the cameras and microphones spying in auto cabins.

9 hours agoEasyMark

This is truly interesting from a dialectical perspective. The current narrative is that data is simultaneously infinitely valuable and presents zero liability. This contradiction can't hold forever (though it can hold longer than any of us are alive, of course)

I suspect it will break in the direction of the narrative that "data wasn't that valuable anyway," regardless of how disingenuous this sentiment is. Nothing else preserves the economic machine while simultaneously dismissing the concerns of consumers. Perhaps we'll get special protection for stuff like SSNs to make it seem like politicians are acting on behalf of their constituents (even though a competent manager of a rational society would simply ban the use of SSNs as a form of identification, as this is basically public information).

7 hours agodarby_nine

A little hypocritical when it comes from various government organizations all over the western world. Surveillance companies are essential for police to be able to easily gather data when needed fast. It is a happy accident that surveillance is so lucrative for advertising and also so effective for policing.

13 hours agoseydor

Different parts of government might disagree on the best course of action but I wouldn’t call that disagreement hypocrisy per se.

It’s also not true that it’s an irresolvable conflict. Yes the cops can and do buy your phone location data, but even if we said that was fine and should continue, that doesn’t also mean that any schmuck should be able to buy real-time Supreme Court justice location data from a broker.

11 hours agojanalsncm
[deleted]
15 hours ago

Simple questions:

Should ad prices be lower or higher?

Should YouTube be free for everyone, or should it cost money?

13 hours agodoctorpangloss

Having ads does not require mass surveillance --- that's really just something that social media companies have normalized because that's the particular business model and practices they have adopted and which makes them the most amount of money possible.

13 hours agobeezlebroxxxxxx

Well put. Targeting and more specifically retargeting is the problem.

Most companies can't afford to not do this when their competitors are. Hence the need for regulation.

10 hours agogoosejuice

Those are useful questions but I don’t think they’re the only ones that matter. Here’s another one for consideration:

What is the minimum level of privacy that a person should be entitled to, no matter their economic status?

If we just let the free market decide these questions for us, the results won’t be great. There are a lot of things which shouldn’t be for sale.

13 hours agojanalsncm

> What is the minimum level of privacy that a person should be entitled to, no matter their economic status?

This is an interesting question: maybe the truth is, very little.

I don't think that user-identified app telemetry is below that minimum level of privacy. Knowing what I know about ad tracking in Facebook before Apple removed app identifiers, I don't think any of that was below the minimum level.

This is a complex question for sort of historical reasons, like how privacy is meant to be a limit on government power as opposed to something like, what would be the impact if this piece of data were more widely known about me? We're talking about the latter but I think people feel very strongly about the former.

Anyway, I answered your questions. It's interesting that no one really wants to engage with the basic premise, do you want these services to be free or no? Is it easy to conceive that people never choose the paid version of the service? What proof do you need that normal people (1) understand the distinction between privacy as a barrier to government enforcement versus privacy as a notion of sensitive personal data (2) will almost always view themselves as safe from the government, probably rightly so, so they will almost always choose the free+ads version of any service, and just like they have been coming out ahead for the last 30 years, they are likely to keep coming out ahead, in this country?

12 hours agodoctorpangloss

I didn’t mean to evade your questions, but my opinion is as follows:

Yes I want YouTube to be free, but not if that requires intrusive surveillance.

People who pay for YouTube aren’t opted out of surveillance as far as I can tell. So I reject the premise of your question, that people are choosing free because they don’t value privacy. They haven’t been given the choice in most cases.

On a tangential note, you previously asked if ads should be more expensive. It’s possible that ads should be less expensive, since they may be less effective than ad spend would suggest: https://freakonomics.com/podcast/does-advertising-actually-w...

11 hours agojanalsncm

The issue to me is that these companies have operated and continue to operate by obfuscating the nature of their surveillance to users. This isn’t a system of informed consent to surveillance in exchange for free services; it’s a system of duping ordinary people into giving up sensitive personal information by drawing them in with a free service. I’m almost certain this model could still exist without the surveillance. They could still run ads; the ads would be less targeted.

11 hours agoBriggyDwiggs42

Yes thank you for listening BRAVO BRAVO BRAVO

10 hours agoblondelegs

Please make it so my kids can watch a YouTube video required by school without watching 20 YouTube shorts after. That's all I want.

11 hours agoherf

Assuming it’s on a computer (a big assumption for kids), you can install this[0] extension and customize it to do things like remove Shorts from appearing, disable autoplay, hide recommended videos, etc. It’s a good way to not let YouTube pull your focus away.

[0] https://unhook.app/

7 hours agohackerdood

Download the video?

10 hours agogoosejuice
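For the "download the video" route, here is a minimal sketch, assuming the separate yt-dlp CLI is installed; the URL is a placeholder and the flags are illustrative, not something from the thread. It builds and prints the command rather than running it:

```python
# Hedged sketch: construct (but don't execute) a yt-dlp command line for
# grabbing a single video for offline, distraction-free viewing.
# Assumes the third-party yt-dlp CLI is installed; VIDEO_ID is a placeholder.
import shlex

def build_download_cmd(url: str) -> list[str]:
    return [
        "yt-dlp",
        "--no-playlist",            # only the linked video, not its playlist
        "-f", "bv*+ba/b",           # best video+audio, else best combined file
        "-o", "%(title)s.%(ext)s",  # save the file under the video's title
        url,
    ]

cmd = build_download_cmd("https://www.youtube.com/watch?v=VIDEO_ID")
print(shlex.join(cmd))  # paste into a terminal where yt-dlp is installed
```

The `--no-playlist` flag matters for the school-video use case: it avoids pulling in whatever playlist the link happens to be embedded in.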

The full report[0] is a good read; don't just read the summary.

> But these findings should not be viewed in isolation. They stem from a business model that varies little across these nine firms – harvesting data for targeted advertising, algorithm design, and sales to third parties. With few meaningful guardrails, companies are incentivized to develop ever-more invasive methods of collection.

[0]: https://www.ftc.gov/system/files/ftc_gov/pdf/Social-Media-6b...

15 hours agoshawn-butler

Surveillance is cancerous. It keeps on growing, feeding on justification for every data point "just because", and then eventually it kills you.

11 hours agoCatWChainsaw

Shocked, gambling, establishment, etc.

15 hours agoJackOfCrows

We really need e2ee social media that's designed to protect, not addict people.

12 hours agoianopolous

“E2ee social media” isn’t a coherent concept. E2ee has to do with how information is transferred not what is transferred.

11 hours agojanalsncm
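The transport-vs-content distinction can be made concrete with a toy sketch. This is deliberately NOT real cryptography (a one-time-pad XOR stands in for a proper cipher, and the key exchange is hand-waved); the point is architectural: in an end-to-end design the relay server only ever handles ciphertext, whatever kind of content the service carries.

```python
# Toy illustration, NOT real cryptography: a one-time-pad XOR stands in for
# a proper cipher. The architectural point is that in an e2ee design the
# relay server stores and forwards only ciphertext; the endpoints hold the key.
import secrets

def xor_bytes(data: bytes, key: bytes) -> bytes:
    # One-time-pad-style XOR; the key must be at least as long as the data.
    return bytes(a ^ b for a, b in zip(data, key))

post = b"my private status update"
shared_key = secrets.token_bytes(len(post))  # exchanged out of band (hand-waved)

ciphertext = xor_bytes(post, shared_key)             # sender encrypts
stored_on_server = ciphertext                        # the service sees only this
recovered = xor_bytes(stored_on_server, shared_key)  # recipient decrypts

print(recovered == post)  # the endpoints can read it; the server cannot
```

Whether that server is relaying a chat message or a social-media post is immaterial to the encryption, which is the point both commenters are circling.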

Imagine the respect the government has for your intelligence, publishing this while purchasing said surveilled user data.

11 hours agohnpolicestate

The government is large and consists of multiple organizations with different goals.

11 hours agocarom

There is no single "the government".

Instead, "The Government" is like a huge community. Its members are all supposed to adhere to the same code, but like any community, there are members who look for ways to get around the law without quite breaking it.

That's what said purchases are. And even parts of the community within the same branch or department may do things other parts aren't really aware of, or wouldn't agree with.

11 hours agobbarnett

Although you have a valid point, I object to your calling it a community because communities don't have constitutions and cannot throw people in jail if they break the community's rules. Also, a community has much less control over who becomes a member of the community than a government has over who it employs.

11 hours agohollerith

Wait till the FTC discovers Full Story

14 hours agoyieldcrv

Facebook Employees Explain Struggling To Care About Company's Unethical Practices When Gig So Cushy https://www.youtube.com/watch?v=-DiBc1vkTig

15 hours agonabla9

An Onion parody/satire video, lol

14 hours agoxyst
[deleted]
13 hours ago

Instead of stupid recommendations, which are laughable, the government should actually enforce them.

14 hours agorussdpale

“The government” isn’t a singular entity, and the FTC is an independent agency.

12 hours agolayer8

I love the cognitive dissonance on display within the federal government.

One arm: "everyone is a criminal; spy on everyone"

Other arm: "hey you shouldn't really harvest all of that data"

15 hours agoryanisnan

The cognitive dissonance is in the voters and users.

Even right here on HN, where most people understand the issue, you'll see conversations and arguments in favor of letting companies vacuum up as much data and user info as they want (without consent or opt-in), while also saying it should be illegal for the government to collect the same data without a warrant.

In practice, the corporations and government have found the best of both worlds: https://www.wired.com/story/fbi-purchase-location-data-wray-... Profit for the corporation, legal user data for the government.

15 hours agojlarocco

HN is filled with folks that wrote the code in question, or want to create similar products. And they hate to have it pointed out that these tools may cause harm so they thrash around and make excuses and point fingers. A tale as old as this site.

15 hours agospacemadness

I often have to remind myself who hosts this board and that I am hanging out on a site for successful and aspiring techno-robber-barons.

15 hours agomrmetanoia

> I am hanging out on a site for successful and aspiring techno-robber-barons.

that's how we all first arrive here. Time passes, though, and most of us fail; then we become proper people capable of reasoning.

15 hours agosabbaticaldev

Explaining that modern technology is user-hostile and destructive to society is nowhere more on-topic than Paul Graham's ego blog. While it might be true to say the site is "for" robber barons, there are a lot more users here than the ones you described.

14 hours agosingleshot_

Complete with egotistical and ironic appropriation of the word hacker.

15 hours ago2OEH8eoCRo0

>The cognitive dissonance is in the voters and users.

People really need to learn to say "no," even if that means an inconvenience. "Your personal information might be shared with our business partners for metrics and a customer-tailored experience"? No thanks. "What is your phone number, so I can give you a 10% discount?" No thanks. "Cash or credit?" Cash, thanks. "Log in with Google/Apple/blood sample"? No thanks.

12 hours agoneuralRiot

There isn't a single intellectually honest harm associated with the majority of app telemetry, or with almost all ad data collection. Go ahead and name one.

Once you say some vague demographic and bodily autonomy stuff: you know, if you’re going to invoke “voters,” I’ve got bad news for you. Some kinds of hate are popular. So you can’t pick and choose what popular stuff is good or what popular stuff is bad. It has to be by some objective criteria.

I disagree with your assessment of the popular position anyway. I don't think there is really that much cognitive dissonance among voters at all. People are sort of right not to care. The FTC's position is really unpopular when framed in the intellectually honest way, as it is in the EU: "here is the price of the web service if you opt out of ads and targeting."

You also have to decide if ad prices should go up or down, and think deeply: do you want a world where ad inventory is expensive? It is an escape valve for very powerful networks. Your favorite political causes like reducing fossil fuel use and bodily autonomy benefit from paid traffic all the same as selling junk. The young beloved members of Congress innovate in paid Meta campaign traffic. And maybe you run a startup or work for one, and you want to compete against the vast portfolio of products the network owners now sell. There’s a little bit of a chance with paid traffic but none if you expect to play by organic content creation rules: it’s the same thing, but you are transferring money via meaningless labor of making viral content instead of focusing on your cause or business. And anyway, TikTok could always choose to not show your video for any reason.

The intellectual framework against ad telemetry is really, really weak. The FTC saying it doesn’t change that.

13 hours agodoctorpangloss

> There isn’t a single intellectually honest harm associated with the majority of app telemetry and for almost all ad data collection. Like go ahead and name one.

You’ve already signaled that you’re ready and willing to dismiss any of the many obvious reasons why this is bad. But let’s flip it. What intellectually honest reason do you have for why it would be wrong if I’m watching you while you sleep? If I inventory your house while you’re away, and sell this information to the highest bidder? No bad intentions of course on my part, these things are just my harmless hobby and how I put bread on the table.

In my experience literally everyone who argues that we don’t really have a need for privacy, or that concerns about it are paranoid or that there’s no “real” threat.. well those people still want their own privacy, they just don’t respect anyone else’s.

More to the point though, no one needs to give you an “intellectually honest” reason that they don’t want to be spied on, and they don’t need to demonstrate bad intentions or realistic capabilities of the adversary, etc. If someone threatens to shoot you, charges won’t be dropped because the person doesn’t have a gun. The threat is extremely problematic and damaging in itself, regardless of how we rank that persons ability to follow through with their stated intent.

8 hours agophotonthug

> What intellectually honest reason do you have for why it would be wrong if I’m watching you while you sleep? If I inventory your house while you’re away, and sell this information to the highest bidder?

This is an interesting idea, but it's a pretty far analogy from app telemetry or ad data collection. If you're really saying, "would it be wrong for me as a camera app developer to collect the videos end users record?" I suppose the answer would really be, "It depends." Like that's what Instagram does, it collects videos end users record. But without their permission? I guess not, no, but that's pretty obvious. The same would be true if you made firmware for security cameras, which happened to be pointed at my bedroom. I suppose if you asked for permission, and I granted it, go ahead - if you didn't ask for permission, I would be surprised why you would need to collect the videos as a firmware developer. The house inventory thing is the same tack - are you talking about, does it make sense for Amazon to sell my purchase history, or something? I guess they asked for permission, go ahead... Nobody forces me to use Amazon or whatever.

Instagram, Amazon, etc. do the things they do with permission. And I don't think anyone who is fully educated is surprised what the idea is for the transactional attribution data it collects. There's lying by omission, which is bad, but that is an issue of leadership and education. Everyone in the EU still chooses telemetry and free over no telemetry and paid service, when it is spelled out to them. It's too bad that leadership has to be taken in that form, but there's no alternative in the regime they built there.

If this is just a competition over the leadership and education of laypeople, so be it, but this real life experiment keeps happening, and the people who try to inject drama into ad telemetry keep losing, so I really don't think it's just about lying. There is a real lack of harm.

> reason that they don’t want to be spied on

Nobody forces you to use Instagram. If you think ad data attribution is a form of spying, go for it. Delete the free social media apps. I don't use them. I don't have Instagram, TikTok, etc. I spend less than 10m a week watching something on YouTube. I don't even have a TV in my house. Do you see? They are not enriching your life.

> In my experience literally everyone who argues... well those people still want their own privacy, they just don’t respect anyone else’s.

In my experience this is pure projection. I respect when people don't want to give permission to Instagram to collect ad telemetry when they choose to not install the app. Of course, you say these things on the Internet, but you, you personally, are not going to migrate off of Gmail, which does all the same things. This is all really about vibes, about vibes being vibesy against social media, but not vibes being vibesy against Gmail, which would be a major inconvenience to say no to, and it would suck to have to pay $35/mo for e-mail - at the very least!

5 hours agodoctorpangloss

So basically your argument is everything is fine because consumers can opt out. Another tired old argument where even the people saying it don’t really believe it.

You can't even rent a hotel room without giving them an email and a phone number they don't need and are looking to sell. If this works for you, the person at the counter probably faked it rather than argue with you. Some people will be happy when menus disappear and you need to install an app. What happens when you can't check out of the grocery store without the requisite likes-and-subscribes? What happens when your flashlight app has a 37-page ToS that says they reserve the right to steal your contact list for the purposes of selling flashlight apps? All is well because you can see in the dark, and no one makes you choose anything? Well, I hope there's healthy competition among the manufacturers of your pacemaker, and that they don't inform your insurance company that your health is deteriorating.

If you’ve got no sense of right or wrong beyond what is legally permissible, just exercise your imagination a bit to look at the likely future, and ask yourself if that’s how you really want to live.

2 hours agophotonthug

The intelligence agencies literally use ad data to do "targeted killing" what are you even talking about?

Ex-NSA Chief: 'We Kill People Based on Metadata'...

13 hours agoarminiusreturns

Can you define a harm suffered by the people that the FTC represents? What about the EU beneficiaries of the GDPR? This is sincere, it is meant to advance to a real and interesting conversation.

12 hours agodoctorpangloss

I think privacy violations are a harm in themselves, but you seem to have already dismissed this issue, so I'll move on. How about behavioral manipulation via microtargeting, economic harm via price discrimination, reselling of the data via monetization to unscrupulous aggregators or third parties, general security reduction (data and metadata sets could be used for APT, etc), or the chilling effect of being tracked all the time in this way?

10 hours agoarminiusreturns

> How about behavioral manipulation via microtargeting...

I don't know. Ads are meant to convince you to buy something. Are they "behavioral manipulation?" Are all ads harmful?

> ...economic harm via price discrimination...

Should all price discrimination be "illegal?" This is interesting because it makes sense for the FTC and for anti-trust regulators to worry about consumer prices. Price discrimination in software services - the thing I know about - helps the average consumer, because it gets richer people to pay more and subsidize the poor.

> reselling of the data via monetization to unscrupulous aggregators or third parties

"Unscrupulous" is doing a lot of work here.

> ...general security reduction...

Gmail and Chrome being free ad subsidized has done a lot more for end user security than anything else. Do you want security to be only for the rich? It really depends how you imagine software works. I don't know what APT stands for.

> chilling effect of being tracked all the time in this way?

Who is chilled?

I guess talk about some specific examples. They would be really interesting.

9 hours agodoctorpangloss

Anti-disclaimer: I'm not one of those folks.

However, that's not at all a cognitive dissonance. Fundamentally, there's a difference between governments and private companies, and it is fairly basic to have different rules for them. The government cannot impinge on free speech, but almost all companies do. The government cannot restrict religion, but to some extent, companies can. Etc.

Of course, in this case, it's understandable to argue that neither side should have that much data without consent. But it's also totally understandable to allow only the private company to do so.

13 hours agoBeetleB

There is fundamentally a difference between corporations and the government, but it's still a cognitive dissonance. These aren't the laws of physics - we chose to have different rules for the government and corporations in this case.

There are plenty of cases where the same rules apply to both the government and corporations.

13 hours agojlarocco

And in Europe, everyone and their dog uses WhatsApp

13 hours agoitronitron

It isn’t cognitive dissonance, the state does lots of things we’re not supposed to do. Like we’re not supposed to kill people, but they have whole departments built around the task.

Should the state do surveillance? Maybe some? Probably less? But the hypocrisy isn’t the problem, the overreach is.

15 hours agobee_rider

The FTC is under the president's authority. This is election pandering, same as Zuckerberg's backpedaling on government censorship.

This is for getting votes from the undecided.

Everything will be back to normal (surveillance, data collection and censorship) after the election.

15 hours agocvnahfn

The FTC is bipartisan: no more than three of the five commissioners can belong to the same party. The present report was approved unanimously by all five.

11 hours agolayer8

This begs the question of agency authority, which is manifestly not resolved. You will find that the election's results will affect the eventual resolution of the question of the unitary executive quite dramatically.

14 hours agosingleshot_

I don't know if you've been watching but the FTC has actually been extremely proactive during this cycle. Lina Khan is an excellent steward and has pushed for a lot of policy improvements that have been sorely needed - including the ban (currently suspended by a few judges) on non-competes.

It is disingenuous to accuse the FTC of election pandering when they've been doing stuff like this for the past four years consistently.

14 hours agomunk-a

And it has sued Amazon for anti-competitive pricing.

This is just what Khan's FTC does.

13 hours agosrndsnd

[flagged]

13 hours agoOkeyDokey2

There are different organizations with different opinions. The government isn't a monolithic entity.

15 hours agokiba

It seems entirely reasonable/consistent that we would allow some capabilities among publicly sanctioned, democratically legitimate actors while prohibiting private actors from doing the same.

In fact, many such things fall into that category.

15 hours agowhimsicalism

I would be worried if the state was conscious of what it itself was doing as a whole

15 hours agodaedrdev

Since the federal government isn’t a single mind (nor a hive mind), a cognitive dissonance can only be meaningfully located on the observer’s side.

12 hours agolayer8

And it's not just here.

The EU: Unlike the barbarians across the pond, we actually protect people's privacy rights.

Also the EU: ChAt CoNtRoL

15 hours agobitwize

The problem seems deeply fundamental to what it means to be a human.

On one hand, there's a lack of clear leadership, unifying the societal approach, on top of inherently different value systems held by those individuals.

It seems like increasingly, it's up to technologists, like ones who author our anti-surveillance tools, to create a free way forward.

15 hours agoryanisnan

this view presupposes the state as “just another actor” as opposed to a privileged one that can take actions that private actors can’t

15 hours agowhimsicalism

In the matter of corporations vs. governments, if you tally up the number of people shot, it's clear which of the two is more dangerous. You would think Europe of all regions would be quick to recognize this.

I don't like corporations spying on me, but it doesn't scare me nearly as much as the government doing it. In fact, the principal risk from corporations keeping databases is giving the government something to snatch.

15 hours agolupusreal

Because the government has a monopoly on violence. I would much prefer that to corporations being able to wage war themselves.

14 hours agowhimsicalism

Who is arguing for corporations to wage war? What an absolutely insane strawman. What I am arguing against is letting governments grant themselves the ability to spy on their own populations on an unprecedented scale, because governments "waging war" (mass murder) against their own people is a historically common occurrence.

13 hours agolupusreal

Those privileged actions are mostly irrelevant when discussing mass surveillance. Doubly so since they can just buy or acquire the data from corps.

15 hours agoKarunamon

The EU has multiple parts. One part keeps asking for chat control, and another part keeps saying no.

13 hours agoimmibis
[deleted]
15 hours ago

"According to one estimate, some Teens may see as many as 1,260 ads per day.200 Children and Teens may be lured through these ads into making purchases or handing over personal information and other data via dark patterns"

There is a long trail of blood behind Google, Facebook, Amazon, etc.

13 hours agoDaleNeumann

Even with ad blockers, we still see tons of ads. Corporate news like CNN constantly has front-page stories that are just paid promotion for some product or service wrapped in a thin veil of pseudo-journalism. Product placement is everywhere too. Tons of Reddit front-page content is bot-upvoted content that is actually just a marketing campaign disguised as some TIL or meme or sappy story.

12 hours ago93po
[deleted]
13 hours ago

[flagged]

15 hours agoshort_sells_poo

> People criticize the clunky attempts by the EU to reign this in, and yes I agree the execution leaves much to be desired. It's still vastly better than the complete laissez-faire approach of the US authorities.

This is kind of weird as a response to a report by a US regulatory agency that is making specific policy requests for legislation to address this.

15 hours agodragonwriter

Apologies I was unclear: I'm not criticizing this report, I'm criticizing the lack of action over the past decade or so.

15 hours agoshort_sells_poo
[deleted]
15 hours ago

[flagged]

16 hours agoranger_danger

[flagged]

15 hours agomsarrel
[deleted]
15 hours ago

[flagged]

15 hours agomrmetanoia

[flagged]

13 hours agoflerchin
[deleted]
13 hours ago

"these surveillance practices can endanger people’s privacy, threaten their freedoms, and expose them to a host of harms, from identity theft to stalking."

Is there any evidence that any of these things have ever happened as a result of this sort of data collection? I'm not talking about data posted to social media, I'm talking about the specific data collection described in this FTC press release.

15 hours agomgraczyk

I have been stalked and harassed by an Apple employee using data they were able to glean from their access at Apple.

The impossible part is proving the abuse. All of these companies keep their database, access controls, and everything they possible can about these data lakes secret. The simple fact of the matter is that you will never have any evidence someone looked you up in a database.

It is really easy to walk the line, but be obvious enough to intimidate.

15 hours agomu53

Apple wasn't listed and (outside the app store) doesn't collect the data described in the press release.

14 hours agomgraczyk

They absolutely do, in fact they even tried to encrypt user data to not be as invasive as other companies but the FBI sued them and said no you can't do that, you need to keep that data so we can subpoena you.

14 hours agostiffenoxygen

They mentioned practices that corporations do. I think any corporation that collects data on you counts here. I don't think its worth it to only talk about the examples provided in the article.

14 hours agostiffenoxygen

So imagine the possible abuses by people at companies who do.

14 hours agodrawkward
[deleted]
14 hours ago

Not only is there evidence of harms, there is a whole industry focused on fixing the problem for those wealthy enough or incentivized enough to care.

Do a bit of googling, but ADINT and RTB tracking will get you there for search terms.

Or, continue being confidently dismissive of something serious people are taking very seriously. I am sorry if this FTC report targets the source of your RSUs or some other set of incentives, but there's no free lunch. The consequences of your viewpoint, held collectively over the last decade, are finally landing.

13 hours agodogman144

> targeted the source of your RSUs or otherwise motivated

I don't currently have any financial interest in any of these companies

> but ADINT and RTB tracking will get you there for search terms.

These are good things, do you have any examples of harm that has been caused by ADINT or RTB? Prosecuting criminals doesn't count for me

12 hours agomgraczyk

Your comment is really coming across as "well, nothing bad has happened yet so who cares?" If that's not the case, please let me know how you meant it. If it is the case, surely you can imagine a world in which dragnet surveillance of people who have an expectation of privacy can be abused by corporations, institutions, or private individuals. It really doesn't take a lot of imagination to picture this world.

14 hours agoorthecreedence

It's been ubiquitous for around 20 years now (Google started doing mass surveillance for display ads in the early 2000s) and nothing bad has happened, so yes that's my point.

If nothing bad happens for decades, and that is inconsistent with your model of danger, then the model is probably wrong

14 hours agomgraczyk

Your argument boils down to "yes, someone has had a gun pointed at my head for quite some time now, but they haven't pulled the trigger yet so I don't see the problem."

14 hours agoorthecreedence

No, I'm arguing that it's not actually a gun, and my evidence is that there are 2 billion "guns" that have been pointed at 2 billion people's heads for years, and nobody has been hurt.

It's more like a flashlight than a gun

14 hours agomgraczyk

> It's more like a flashlight than a gun

I disagree, and again, implore you to use your imagination. If private messages (not just yours but someone elses) were to suddenly be public or institutional knowledge, what damning things might happen? What influence might some have over others? What dynamics could or would shift as a result?

I'm comfortable making the claim that you aren't really thinking this through, at all, in any meaningful way.

14 hours agoorthecreedence

The FTC press release is not talking about private messages, that is not the kind of data they are asking to protect. Private messages are already generally protected in the way the FTC is asking for.

12 hours agomgraczyk

What was the fallout last time this happened? Was it like pulling the triggers of guns pointed at people's heads?

13 hours agoimmibis

If you don't think anything bad happens from personal data being accessed without one's consent, please reply to this comment and share:

1. Your full name

2. Your home address

3. Your social security number (if you're American)

4. Your mother's maiden name

If you're right, then you have nothing to worry about.

13 hours agoryandrake

None of this data is included in the FTC report. They are not talking about this.

My full name is Michael Graczyk, I live in San Francisco, none of these companies know any more detail than that about the questions you asked

13 hours agomgraczyk

Michael, I disagree with your point but I recognize your integrity. You just posted your name and city, and your HN profile shares more personal information.

I respect that you are willing to stand behind your claim. Best of success with your current venture.

11 hours agotway_GdBRwW

> none of these companies know any more detail than that about the questions you asked

I suspect you mean that you haven't provided these companies with these details. What reason do you have to think they don't know those details?

12 hours agolcnPylGDnU4H9OF

They don't know these details because they have never asked. It's not the sort of detail that would be useful for ads (except my home address)

9 hours agomgraczyk

> nothing bad has happened

ummm, WTF?

10x increase in teen suicide doesn't qualify as "bad"?

or repeated DOJ lawsuits against Facebook because their advertising practices result in highly effective racial discrimination?

11 hours agotway_GdBRwW

[flagged]

14 hours agostiffenoxygen

Wait for the AI tools Larry Ellison wants to give to law enforcement to retroactively connect/hallucinate the dots.

14 hours agodrawkward

[dead]

15 hours agokiloshib

[flagged]

14 hours agostiffenoxygen

Targeted advertising is a good thing. It lets people who make stuff more efficiently connect with people who want that stuff.

The FTC chair is complaining that companies "monetize that data to the tune of billions of dollars a year," but all this means is that this service is tremendously valuable.

The Internet's targeted advertising system is a major achievement of modern information technology and data science, and we dismantle it at our peril.

8 hours agonegativeonehalf

Really? Peril? We’ll be in perilous danger if we don’t maintain targeted, invasive ad tracking? Get a grip.

4 hours agoickelbawd

> Profound Threats to Users Can Occur When Targeting Occurs Based on Sensitive Categories

> Targeted ads based on knowledge about protected categories can be especially distressing. One example is when someone has not disclosed their sexual orientation publicly, but an ad assumes their sexual orientation. Another example is when a retailer identifies someone as pregnant and targets ads for baby products before others, including family, even know about the pregnancy. These types of assumptions and inferences upon which targeted advertising is based can in some instances result in emotional distress, lead to individuals being misidentified or misclassified, and cause other harms.

If this is one of the biggest harms the FTC can come up with, then honestly, as a consumer, I don't really care. Having free YouTube is worth getting a few mistargeted ads, or I CAN JUST TURN TARGETED ADS OFF. Advertising isn't someone harassing you; it's an ad that I can close or just report as not accurate. I'd really be interested to hear from someone who thinks getting a mistargeted ad is in the top 10 most stressful things in their life.

What I would really be interested in is the raw responses from the companies, not this report.

15 hours agokart23

> I CAN JUST TURN TARGETED ADS OFF

The only reason you have the option to do this is because of groups pushing back against advertising companies. Ad companies have no incentive to offer the option to disable targeting.

If you like having this option available, then you should like this FTC report and the position they are taking.

14 hours agocarb

> If you like having this option available, then you should like this FTC report and the position they are taking.

I can like other positions and actions the FTC has taken, like requiring the ability to turn off targeted ads, and not like others, like this one. This is among the biggest problems in politics right now. Supporting a political party doesn't mean you need to back 100% of their opinions and policies; that's how change is effected in successful democratic systems.

14 hours agokart23

> I can like other positions and actions the FTC has done, like requiring the ability to turn off targeted ads, and not like others, like this one

They weren't saying that was the case I think you're misunderstanding them here. But they are 100% correct, you are benefiting from other people fighting against this mass surveillance and yet speaking against it. I think you should do some research on why privacy is important and challenge yourself and your potentially entrenched beliefs.

14 hours agostiffenoxygen

Read my first comment. I definitely agree privacy is important. All I'm saying is that this is not one of the harms we should be worrying about when saying targeted advertising is a problem, and I don't understand why this is an important issue that we should care about when targeted advertising can be turned off:

"Profound Threats to Users Can Occur When Targeting Occurs Based on Sensitive Categories"

11 hours agokart23

Use your imagination?