+0.36 Microsoft AI spying scandal: time to rethink privacy standards (spectrum.ieee.org · S: -0.27)
913 points by walterbell · 632 days ago · 380 comments on HN · Moderate positive · Editorial · v3.7 · 2026-02-28 09:22:49
Summary — Digital Privacy & Surveillance Advocates
IEEE Spectrum publishes a 2024 opinion piece by security experts Bruce Schneier and Barath Raghavan that advocates for privacy rights protection through ecosystem-level regulatory frameworks. The authors argue that internet surveillance has become normalized through 'shifting baseline syndrome'—each generation accepts degraded privacy as normal—and call for democratic, scientifically-informed governance to preserve privacy as foundational. However, the site itself implements Sentry tracking, requires account login, uses cookies and ad targeting, and paywalls the privacy content, creating a stark contradiction between message and practice.
Article Heatmap
Preamble: +0.50
Article 1: No Data — Freedom, Equality, Brotherhood
Article 2: No Data — Non-Discrimination
Article 3: +0.30 — Life, Liberty, Security
Article 4: No Data — No Slavery
Article 5: No Data — No Torture
Article 6: No Data — Legal Personhood
Article 7: +0.30 — Equality Before Law
Article 8: No Data — Right to Remedy
Article 9: No Data — No Arbitrary Detention
Article 10: No Data — Fair Hearing
Article 11: No Data — Presumption of Innocence
Article 12: +0.50 — Privacy
Article 13: No Data — Freedom of Movement
Article 14: No Data — Asylum
Article 15: No Data — Nationality
Article 16: No Data — Marriage & Family
Article 17: No Data — Property
Article 18: +0.20 — Freedom of Thought
Article 19: +0.30 — Freedom of Expression
Article 20: No Data — Assembly & Association
Article 21: +0.50 — Political Participation
Article 22: No Data — Social Security
Article 23: No Data — Work & Equal Pay
Article 24: No Data — Rest & Leisure
Article 25: No Data — Standard of Living
Article 26: No Data — Education
Article 27: No Data — Cultural Participation
Article 28: +0.40 — Social & International Order
Article 29: +0.30 — Duties to Community
Article 30: No Data — No Destruction of Rights
Aggregates
Editorial Mean: +0.36 · Structural Mean: -0.27
Weighted Mean: +0.40 · Unweighted Mean: +0.37
Max: +0.50 (Preamble) · Min: +0.20 (Article 18)
Signal: 9 · No Data: 22
Confidence: 21% · Volatility: 0.10 (Low)
Negative: 0 · Channels: E 0.6 / S 0.4
SETL: +0.58 (Editorial-dominant)
FW Ratio: 66% (23 facts · 12 inferences)
Evidence: High 1 · Medium 3 · Low 5 · No Data 22
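Several of these aggregates can be reproduced directly from the nine scored entries in the heatmap above. A quick sanity check in Python (the weighting scheme behind the Weighted Mean is not documented, so only the unweighted figures are recomputed here):

```python
# The nine "signal" scores from the heatmap; all other articles are No Data.
scores = {
    "Preamble": 0.50, "Art. 3": 0.30, "Art. 7": 0.30, "Art. 12": 0.50,
    "Art. 18": 0.20, "Art. 19": 0.30, "Art. 21": 0.50, "Art. 28": 0.40,
    "Art. 29": 0.30,
}

signal = len(scores)                              # 9 scored articles
unweighted_mean = sum(scores.values()) / signal   # 3.30 / 9 ≈ +0.37
hi = max(scores, key=scores.get)                  # Preamble (+0.50, first of a tie)
lo = min(scores, key=scores.get)                  # Art. 18 (+0.20)

# FW Ratio: 23 facts out of 23 facts + 12 inferences.
fw_ratio = 23 / (23 + 12)                         # ≈ 0.657 → 66%

print(f"Signal: {signal}, Mean: {unweighted_mean:+.2f}, "
      f"Max: {hi}, Min: {lo}, FW: {fw_ratio:.0%}")
```

The recomputed values match the dashboard's Unweighted Mean (+0.37), Max/Min, Signal count, and FW Ratio (66%).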
Theme Radar
Foundation: 0.50 (1 article)
Security: 0.30 (1 article)
Legal: 0.30 (1 article)
Privacy & Movement: 0.50 (1 article)
Personal: 0.20 (1 article)
Expression: 0.40 (2 articles)
Economic & Social: 0.00 (0 articles)
Cultural: 0.00 (0 articles)
Order & Duties: 0.35 (2 articles)
HN Discussion 20 top-level · 30 replies
whoknowsidont 2024-06-06 02:50 UTC link
I agree. Philosophically, politically and financially (in terms of groups I donate to).

But it's too late. The frame shift is too great for most people. Getting people to care about this let alone vote in a manner that would cause actual change is so far out I have a better chance of winning the lottery.

I don't know what to do other than to support and educate where I can.

I do hope I'm wrong.

hiddencost 2024-06-06 02:51 UTC link
Why would anyone think that MSFT was doing anything other than everything they were allowed under the law?
EGreg 2024-06-06 03:02 UTC link
For that, we will need to get off Big Tech and embrace open source:

https://qbix.com/blog/2021/01/15/open-source-communities/

Maro 2024-06-06 03:22 UTC link
Companies like Google have had access to our full email, search, location, photo roll, video viewing, docs, etc history for 10+ years. I don't think also having our LLM prompts fundamentally changes this picture..

I guess this is what the shifting baseline argument refers to..

Having said that, I think it's rational for almost all people to not care to give up privacy for all these (addictively) amazing tools. Most people don't do anything worth privacy protecting. Having worked in data/engineering at bigtech, it's not like there's a human on the other side reviewing what each user is doing. For almost all people the data will just be used for boring purposes to build models for better marketing/ads/recommendations. A lot of the models aren't even personalised, the user is just represented by some not-human-readable feature vector that again nobody looks at.

Hell, I have multiple Google Home devices that are always on and listening, and the thing's internal model is so basic and not-personalized that after multiple years it still has trouble parsing me when I say "Play jazz" and "Stop", even though these are the 2 commands I exclusively use. Sometimes it starts playing acid rock, and when I say "Stop" it starts reading me stock quotes.

henriquez 2024-06-06 03:37 UTC link
Boiling frog metaphor is much more emotionally compelling for the same argument.
oefrha 2024-06-06 03:49 UTC link
Snowden has shown us that Microsoft and every other big tech will happily give NSA the keys. Given that OpenAI is fully beholden to Microsoft both in terms of ownership and compute, I have to assume NSA gets whatever they want from OpenAI, directly or through Microsoft.

Now, apps and services are integrating AI like crazy, penetrating just about every type of information, and much of these integrations are sending data to OpenAI, voluntarily. People are sending their private thoughts to OpenAI as they brainstorm, as they write, before fully fledged ideas are even formed. NSA must be enjoying this transparency now.

Narhem 2024-06-06 04:23 UTC link
What about Apple illegally going through my MacBook then my Linux laptop.

Forget these legal grey areas. These organizations actively deal with corporate espionage and no one does anything about it.

isodev 2024-06-06 04:33 UTC link
> one that respects people’s privacy rights while also allowing companies to recoup costs for services they provide

Big tech is a bit like tobacco companies at some point - we’ve convinced people and ourselves that certain things are good, even healthy while they’re actually the opposite.

We, the people who build and maintain these services, are in a position to offer pushback to ensure the humans and their privacy is higher than any other concerns.

a-dub 2024-06-06 04:36 UTC link
i thought they were getting flack for building a local activity log for a local assistant?

the linked sources don't really talk about any kind of user activity logging.

i suppose the point stands though: modern tech is kinda like 70s and 80s wingnut nightmares come true. the tv actually does watch you now, and when you read something in the standard way, detailed information about your demographics and behavior are shared and then stored within milliseconds across tens of parties.

how many entities with an ein get notified when you read that ieee spectrum article? too many!

kisamoto 2024-06-06 05:38 UTC link
People are individual, some really don't mind offering their privacy for lower costs (their data is mined and doesn't belong to them) or additional features.

That doesn't mean it should be the norm though. If enough people actually care about their privacy there is the possibility for cheap, privacy invasive and expensive, privacy respecting products to live in the market.

Shameless plug: I launched Cognos[0] a few days ago that encrypts your AI Prompts and Outputs. No risk of leaks/hacks/being used for training.

If you like generative AI but are worried about putting all your data into ChatGPT and don't know how/want to run inference infrastructure yourself it could be something for you.

- [0] https://cognos.io/cognos-beta-is-live/

openrisk 2024-06-06 05:58 UTC link
Trying to understand consumer privacy behaviors outside the prevalent social contract that the vast majority of people operate under is bound to misinterpret what is happening and why.

We live in a regulated "supermarket" economy. What surfaces on a screen is entirely analogous to what surfaces on a shelf: People check the price and make their choices based on taste, budget etc. They are not idiots, they operate under a simplifying assumption that makes life in a complex world possible.

The implicit assumption central to this way of organising the economy is that anything legally on sale is "safe". That it has been checked and approved by experts that know what they are doing and have the consumer interest as top priority.

People will not rush back home to their chemistry labs to check what is in their purchased food, whether it corresponds to the label (assuming that such a label even exists) and what might be the short or long term health effects. They don't have the knowledge, resources and time to do that for all the stuff they get exposed to.

What has drifted in the digital economy is not consumer standards, it is regulatory standards. Surfacing digital products with questionable short and long term implications for individuals and society has become a lucrative business, has captured its regulatory environment and will keep exploiting opportunities and blind spots until there is pushback.

Ultimately regulators only derive legitimacy from serving their constituencies, but that feedback loop can be very slow and it gets tangled with myriad other unrelated political issues.

rokizero 2024-06-06 06:28 UTC link
It would have been fair if the author had mentioned that Microsoft is very open about this. I went to a Microsoft training, where the instructor also made clear that the service is being monitored. Suspicious messages get flagged and are reviewed under a four-eyes principle. [1]

At least in the EU (I'm told) they are required by law to keep logs to ensure that their AI services are not being used to do bad.

I'm glad that their system worked.

[1] https://learn.microsoft.com/en-us/legal/cognitive-services/o...

crimsoneer 2024-06-06 07:56 UTC link
Saying something is a scandal in a headline doesn't actually make it a scandal.
notarobot123 2024-06-06 09:45 UTC link
> we need to step back and look at what a healthy technological ecosystem would look like: one that respects people’s privacy rights while also allowing companies to recoup costs for services they provide.

User-community owned services/infrastructure could reasonably be the way towards services that run well, recoup costs and prevent abuses (both of the service and of users). Are there any examples of this kind of thing for AI yet?

ajb 2024-06-06 10:48 UTC link
A lot of this is due to the failure of people's privacy instincts. People do have a strong instinctive desire for privacy, but only against other identifiable people. When private data are revealed to people they know or can be connected to their home address, then they are very upset. But they don't react when 'faceless' organizations collect their data, and use it in ways that are opaque, even when this is strongly to their disadvantage (such as HR departments sharing salary data).

Anti-privacy interests are completely aware of this and avoid triggering the visceral response. For example, the spy agencies always give the example of "no-one is listening to your calls" as if the risk of surveillance was the guys transcribing from reel-to-reel tape recordings, not the dossier available to the powerful on each citizen. But the guy listening to our words on the tape is the one we have a visceral response to. Similarly, Facebook advertising terms (and presumably those of other companies) are careful to ban advertisers from showing the targeting and giving the impression that there is a person at the other end who knows all about you.

I think the upshot of this is that we need to personalise it for the wider public, not just talk about abstracts like privacy. Bring out the fact that your next employer has that data when negotiating your salary. That companies spend thousands of hours figuring out how to manipulate you based on your data. That at the press of a button, the security services could assemble a dossier on you, and make it available to anyone powerful that you have annoyed.

codr7 2024-06-06 12:33 UTC link
If they want to get rid of Windows completely, they're doing a pretty good job.

It's turning into a total disaster.

And there are viable options these days...

Joel_Mckay 2024-06-06 13:33 UTC link
So it is a Key-logger rebranded as "AI", and I'm sure Microsoft protected themselves in the EULA.

Did this violate the legal definition of an "expectation of privacy"?

In most WIPO countries, you can't legally record other people's conversations unless both parties give you prior consent (hence the quality assurance disclaimer on phone services.) Single-party-consent does not mean what most people think it means, and has launched many lawsuits.

I usually recommend this list after a windows 11 "offline" install:

https://github.com/StellarSand/privacy-settings/blob/main/Pr...

They sure don't make it easy to un-dork your PC for sure, but it works if you need to run the OS for that one legacy application that people refused to port to Mac/Linux.

Good luck, and make sure to sue them in a jury trial... They won't change their behavior unless there are fiscal disincentives. =)

oglop 2024-06-06 14:28 UTC link
Good thing that we saved all those fish since learning about relative baselines…

A very nothing article. It amazes me people think some private company owes them anything like privacy or free speech. That’s not how that relationship works. It’s an exchange relationship. Transactional. Not an ethical relationship. The only motivation for ethics here is PR. That’s the best you can do.

cynicaltarsius 2024-06-07 23:02 UTC link
Strongly agree with this. Most folks assume that big tech will keep their data safe because well they're big corporations with 'rules' and therefore they must have people working 24/7 keeping customer data safe, right? Right?? This is just like how we assume the data centers and servers keeping all our data are safe and secure, when in fact tons of data have been accidentally deleted by companies all the time. We just don't hear about it as often.
arendtio 2024-06-08 05:48 UTC link
A reasonable first step would be to disallow targeted advertising (because it is the reason so much data is being collected).

I am not a big fan of advertising because its purpose is to manipulate thoughts. But I also like the idea of having a less formal form of paying for services. So, having ad-driven businesses should be okay as long as the thought manipulation machines have limits.

However, modern ad targeting has become so sophisticated and powerful that it creates problems on a global level (remember Cambridge Analytica). All the big ad companies have to play the targeting game because the others will eat their lunch if they don't.

If we want to protect users from ad companies' total exploitation, we need regulation. And if the rules change for all the players equally, it should stay a fair game.

INGSOCIALITE 2024-06-06 02:54 UTC link
...and more. because they work directly with/for "the law" - and have enough money to pay any fine that could possibly be levied on them
EGreg 2024-06-06 03:06 UTC link
Vegans can talk all they want

But the Impossible Burger did more than talking ever did

Government tried antitrust for decades w telcos

But the costs of long distance calls dropped to practically zero when open VoIP protocols came out

The Web disrupted AOL, MSN, CompuServe, TV, Newspapers, Magazines etc.

Wikipedia disrupted Britannica and Encarta

The answer: open source must get good enough that people can switch from big tech corporation-controlled solutions and then they will be in control of their own software !! :)

poikroequ 2024-06-06 03:19 UTC link
I've seen first hand just how little the average person cares about privacy. The type of people who believe they have nothing to hide. My friends and family don't understand why I don't post everything to social media. No I'm not going to install this app and scan all my receipts. BuT wHy?!?!?!

I think many people here on hackernews live in a bubble and just don't understand what is an average human being. They surround themselves with like-minded tech savvy individuals and fail to comprehend how there isn't stronger support for privacy.

pennybanks 2024-06-06 03:43 UTC link
is it such a problem if people dont care? its not like they arent aware.

i mean its not too complicated to get to a desired privacy standard for the most part so it seems like the privacy conscious in general get angry for other people, who dont even care.

its because of those people that we can even use these products for free. i mean they arent making ad revenue off of us....

JumpCrisscross 2024-06-06 03:48 UTC link
> it's too late. The frame shift is too great for most people

The capacity for the current crop of privacy advocates to make a cogent case is lacking. People's capacity to recognise and rearrange themselves in defence of a novel threat is not. If anything, our present malaise is one of allergic reactions to phantom threats.

benreesman 2024-06-06 04:00 UTC link
These concerns are doubtlessly well-founded based on disclosures, particularly the Snowden disclosures.

But as for infinite data and definitely smart enough people to have ten SOTA LLMs in flight?

I don’t think TAO/EG needs any help from fucking Microsoft.

TaylorAlexander 2024-06-06 04:17 UTC link
> I don't think also having our LLM prompts fundamentally changes this picture

Is that what is being discussed? The biggest issue with Microsoft's new AI announcement was that their system was going to take screenshots of your computer every second and process them with AI. That means they could have way, way more data about you than LLM prompts.

https://arstechnica.com/ai/2024/06/windows-recall-demands-an...

__MatrixMan__ 2024-06-06 04:36 UTC link
> Most people don't do anything worth privacy protecting.

That may be true, but most people still benefit indirectly from the actions of the few who do have something to hide (e.g. protestors, journalists, whistleblowers).

If we want the masses to continue to benefit from the actions of those few, then we need to find a middle ground. Someplace where the masses are private enough that a truly private individual can hide among them without sticking out like a sore thumb. You don't need to hide, you just need to be able to hide.

shiroiushi 2024-06-06 04:57 UTC link
>We, the people who build and maintain these services, are in a position to...

Speak for yourself. Most people in tech do not work for Google or Microsoft. Working in tech alone doesn't give you some kind of power to offer pushback in these tech companies, just like being a random worker in the agriculture sector doesn't give you power to offer pushback against the tobacco companies.

shiroiushi 2024-06-06 05:00 UTC link
How exactly did Apple go through your Linux laptop (assuming you didn't physically hand it to them)?
behnamoh 2024-06-06 05:21 UTC link
> we, the people

unfortunately most engineers I've seen (i come from an engineering background) care mostly about technical stuff and lack the EQ to see the bigger picture.

Eduard 2024-06-06 05:33 UTC link
amatecha 2024-06-06 05:41 UTC link
> Most people don't do anything worth privacy protecting.

That rationale sounds great (albeit dismissive/invalidating) until something you've done (and have provided ample digital evidence of) becomes illegal or is otherwise used against you.

Oh actually, what's your email password? I mean, since you're not doing anything worth keeping private, right?

You think there isn't a human reviewing the data of what each user is doing, but there absolutely could be, and there's no reason there can't be, like when Tesla employees were viewing and exfiltrating footage/imagery from customers' vehicles. Not just one or two people but apparently disparate _groups_ of employees. https://www.reuters.com/technology/tesla-workers-shared-sens...

newrotik 2024-06-06 05:59 UTC link
Privacy is (a) freedom.

The reason why people care about privacy is not necessarily because giving up privacy has some directly observable negative effect. But, simply, living without freedom sucks.

I don't want you to know my personal information not because you could/would do something nefarious with it. I don't want you to have it simply because it's none of your business.

nonrandomstring 2024-06-06 06:22 UTC link
Not sure that regulatory capture explains the poor quality of digital goods and how people value them. I think it's that cargo cults (what we have) are antithetical to true technological societies (which is what we say we want)

A fault I see in Bruce Schneier's article, and the general hypothesis of "frame/baseline shifting" - it's not that people have been conditioned or forgotten the value of privacy, but that we're looking at the world through ever smaller lenses. The story in the article is that science is guilty of that too. The decline has been around education and perspective in general, not just attitudes to a small issue like "privacy". We thought the internet would widen our scope. It narrowed it.

In the UK we have a great travel and culture show by Romesh Ranganathan. (I highly recommend it, so get on your VPN to watch BBC or find it on the torrents). It will cheer you up [0]. He's a funny guy. But also the cultural vista is breath-taking. Looking at life in central Africa it's great to be reminded of the diversity of humankind. Watch for the little things - like a whole bus queue of people, none of whom are on mobile phones.

What the Internet promised - the great "conversation of mankind" - never emerged. Instead we got cat memes and social control media that forced people into ever smaller parochial silos. HN is no different. Here it's cool to have a bleak outlook on humanity and technology. "Oh it's too late... we're doomed... oh woe is me!" C'mon hackers... what happened to the joy of shaping the world? :)

Bruce Schneier appeals to a macro systems theory metaphor, and mentions Daniel Pauly [1]. But he neglects some of the more profound lessons that Forrester, Meadows and actually Norbert Wiener gave us about feedback and the empty dream of cybernetic governance. Nothing as big as humanity will fit in a bottle that small unless you're willing to destroy it in the process. And what you're destroying is the very innovative base that gave you the technology in the first place. This is why they should teach history, geography and other cultures in schools.

[0] https://en.wikipedia.org/wiki/Romesh_Ranganathan

[1] https://oceans.ubc.ca/2023/05/19/daniel-pauly/

renegat0x0 2024-06-06 06:40 UTC link
Normalization of abusive behavior is not OK! If one company abuses you for 10+ years, then it is not OK for other companies to abuse you!

If big data were not lucrative, they could not sell your data. If your data was not valuable Facebook and Google would immediately remove it from their servers without hesitation.

Mantra.

- I don't care about privacy

- I don't care big tech has access to all my data

- I don't care that Google has access all my politicians data

- I don't care I am building a worse future for society

- I don't care that I am being recorded while having sex in Tesla

- I don't care I am building surveillance state

- I don't care that my data are being sold to China, to India, to wherever highest bidder lives

- I don't care how my data are being used. I don't care if my data are being used to train military robot dogs that will be used for wars

- I don't care that I will not receive insurance because my medical data are sold wherever

- I don't care about privacy

- I don't care about privacy

- I don't care about privacy

alvah 2024-06-06 08:00 UTC link
"The implicit assumption central to this way of organising the economy is that anything legally on sale is "safe". That it has been checked and approved by experts that know what they are doing and have the consumer interest as top priority.

People will not rush back home to their chemistry labs to check what is in their purchased food, whether it corresponds to the label (assuming that such a label even exists) and what might be the short or long term health effects. They don't have the knowledge, resources and time to do that for all the stuff they get exposed to."

What you describe is a feature of a high-trust society, where you don't have to double-check every single transaction or interaction you enter into, but can take most statements on trust. This allows people to get on with the fundamental task at hand, rather than dealing with the overhead of checking their food in the chemistry lab, or whatever the equivalent is for the specific transaction.

I have read suggestions that this was a major contributor to the growth of the Western economies, relative to other low-trust societies. If this was the case, we are in for a bumpy ride, as we seem to be rapidly changing from a high-trust to a low-trust society.

cryptonym 2024-06-06 08:16 UTC link
Throwing such statement with no beginning of proof is ridiculous.

Even *if* that happened, as long as you provide nothing tangible, you should keep that for yourself.

benterix 2024-06-06 08:57 UTC link
> The implicit assumption central to this way of organising the economy is that anything legally on sale is "safe". That it has been checked and approved by experts

This applies to certain products only, and even in the EU where these laws are more strict some products are regularly withdrawn from sale (these are mostly quality issues, not intentional actions), even as far as drugs are concerned. It's one of the reasons I tend to buy fresh products and from bio shops mostly - to increase my chances.

dkobia 2024-06-06 09:04 UTC link
They are indeed very open about this. More here [1]. You can also put in a request for limited monitoring [2].

[1] https://learn.microsoft.com/en-us/azure/ai-services/openai/c...

[2] https://customervoice.microsoft.com/Pages/ResponsePage.aspx?...

pjc50 2024-06-06 09:29 UTC link
Schneier used to talk about "the Exxon Valdez of privacy", the idea that there would be a single giant spill that had significantly bad effects enough that it would force change.

That has basically not happened. It sometimes seems that the situation has got worse in terms of public debate, due to the usual bad-faith actors. For example, the TikTok discussion is not framed around privacy in general but focuses on "China bad". With the implication that an algorithmic megacorp controlling political sentiment through feeds is completely fine so long as it's Americans doing it. And the voting security discussion: there were questions about voting machines long before 2020, but partisan attacks focused on discrediting valid results.

djtango 2024-06-06 09:37 UTC link
You chose an analogy close to my heart! I have managed to convince myself that most "food" in the supermarket is inedible poison!

It's exhausting and a real nuisance to my quality of life but I equally refuse to knowingly consume excess additives unless completely in a pinch.

Needless to say I'm also very suspicious of online businesses. Although I'm actually getting a bit fatigued/defeatist by privacy issues. We're all so overwhelmingly in this ship that I don't know what I really stand to gain by constantly hamstringing myself digitally...

If we wake up in the worst case scenario, I'm sure I have enough of a footprint I wouldn't be able to meaningfully hide much from a determined bad actor...

blitzar 2024-06-06 09:51 UTC link
Snowden has shown us that nobody cares.

The disclosures of Snowden should have brought about profound change in this entire debate (> 10 years ago).

Instead he is a refugee with not much more to do than to shill crypto.

TiredOfLife 2024-06-06 10:37 UTC link
Snowden showed only what his handlers told him to show.
oefrha 2024-06-06 11:22 UTC link
I had a look at your offering, and unfortunately I’m not convinced. Obviously you need to send plain text to the actual AI model in use, which is confirmed by

> Sending a message involves transmitting the message to the server in plaintext over TLS encrypted connections. Your plaintext message is encrypted with the conversation public key and saved before being forward to your AI model of choice, again in plaintext over an TLS encrypted connection. When the AI model has generated a response, this is returned to our server. This response is also encrypted with the conversation public key and saved before being forwarded back to you.

The model operator still gets everything, just not trivially traced back to the user and not collected in one place; in addition you get everything and there’s no way to verify you’re not inspecting and/or storing clear text user info, or hacked to do so. It’s basically a pinky promise from an unknown entity (no offense but you can see that from the user point of view) whose main value proposition is that pinky promise.
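The flow quoted in that comment is easy to trace end to end. A toy sketch (hypothetical names; encryption stubbed out as an opaque wrapper, since the point is which parties handle plaintext, not the cipher):

```python
# Toy trace of the described relay flow. This is NOT real cryptography:
# encrypt_at_rest is a stand-in for "encrypted with the conversation
# public key and saved". It only makes explicit who sees the plaintext.

def encrypt_at_rest(data: str, conv_pubkey: str) -> str:
    # Opaque stand-in for public-key encryption of the stored copy.
    return f"enc[{conv_pubkey}]({data})"

seen_by = {"relay": [], "model_operator": []}

def model_operator(prompt: str) -> str:
    seen_by["model_operator"].append(prompt)  # operator receives plaintext
    return f"response to: {prompt}"

def relay(prompt: str, conv_pubkey: str) -> str:
    seen_by["relay"].append(prompt)           # relay receives plaintext over TLS
    encrypt_at_rest(prompt, conv_pubkey)      # only the *stored* copy is encrypted
    reply = model_operator(prompt)            # forwarded onward in plaintext
    seen_by["relay"].append(reply)
    encrypt_at_rest(reply, conv_pubkey)
    return reply

relay("my private brainstorm", conv_pubkey="conv-key-1")
print(seen_by)  # both parties observed the plaintext prompt
```

Encryption at rest of this kind protects against a stolen database, but, as the comment argues, not against the relay or the model operator themselves, who necessarily handle the plaintext in transit.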

Hizonner 2024-06-06 12:46 UTC link
Very impressive. How many actual users waste days of their lives in Microsoft training to get basic information?

And, no, burying it in the ToS isn't being "very open" either. "Very open" would be putting a big visible banner on every page of the UI.

... and the abuse still wouldn't be acceptable even if Microsoft actually were being forthcoming about it.

immibis 2024-06-06 14:41 UTC link
Not really. There's Apple (less spying, more lock-in) and there's Linux (build-your-own operating system construction kit for DIY-loving folks. The prefabs aren't good enough that you can ignore what's underneath.).
ryukoposting 2024-06-06 18:30 UTC link
> ...as if the risk of surveillance was the guys transcribing from reel-to-reel tape recordings

At a minimum, the guy transcribing the tapes is an accomplice to the surveillance effort.

As a society, we desperately need privacy regulations to catch up to the digital age. Legislation will take years (decades?) to catch up, though. In the meantime, as tech industry professionals, we have a responsibility to not be the guy transcribing the tape.

Right now, there's a lot of incentive to be that guy, and good data ethics regulation will remove those incentives from our industry. Until those incentives go away, it's up to us to make good moral choices, even if that means turning down a fat paycheck. How do we, as an industry, self-regulate while legislation catches up?

FMecha 2024-06-06 18:44 UTC link
>We, the people who build and maintain these services, are in a position to offer pushback to ensure the humans and their privacy is higher than any other concerns.

So, a Hollywood-style strike with the aim to do so?

warkdarrior 2024-06-06 19:00 UTC link
Did you sue them or have a criminal case brought against them? I think corporate espionage is illegal in many countries, probably in yours too.
Editorial Channel — what the content says

+0.70 · Article 12: Privacy
Evidence: High · Advocacy, Framing, Practice
Editorial: +0.70 · SETL: +0.59

CORE ARTICLE. Entire argument centers on privacy erosion through normalized surveillance. Uses Daniel Pauly's 'shifting baseline syndrome' from fisheries to explain how each generation accepts lower privacy expectations as normal. References U.S. Supreme Court standard that privacy rights depend on 'reasonable expectation of privacy.' Advocates for ecosystem-wide regulatory approach to restore privacy protections.

+0.50
Preamble Preamble
Medium Advocacy Framing
Editorial
+0.50
SETL
ND

Article emphasizes privacy as a fundamental right aligned with preamble's emphasis on human dignity. Authors argue for 'stepping back' to examine healthy technological ecosystem and call for systemic protection of privacy as heritage for next generation.

+0.50
Article 21 Political Participation
Medium Advocacy Practice
Editorial
+0.50
SETL
ND

Central to authors' solution. Advocates for 'scientifically informed and democratic regulatory process' to govern privacy protection. Frames privacy as requiring democratic participation and oversight, analogous to fisheries management integration.

+0.40
Article 28 Social & International Order
Medium Advocacy Practice
Editorial
+0.40
SETL
ND

Authors advocate for ecosystem-wide approach to privacy governance, treating digital infrastructure as shared resource requiring coordinated order. Proposes regulatory frameworks analogous to international fisheries management.

+0.30
Article 3 Life, Liberty, Security
Low Framing
Editorial
+0.30
SETL
ND

Implicitly addresses security through discussion of hacker monitoring and threat detection. Privacy erosion implicitly framed as reducing digital security.

+0.30
Article 7 Equality Before Law
Low Advocacy Practice
Editorial
+0.30
SETL
ND

Authors advocate for 'scientifically informed and democratic regulatory process' to codify privacy protections, implicitly supporting equal protection through rule of law.

+0.30
Article 19 Freedom of Expression
Low Advocacy Framing
Editorial
+0.30
SETL
ND

Article exemplifies freedom of expression by critically analyzing corporate surveillance. Implicitly frames privacy as necessary condition for free expression by highlighting how ubiquitous surveillance can suppress speech.

+0.30
Article 29 Duties to Community
Low Framing Practice
Editorial
+0.30
SETL
ND

Implicitly addresses collective duties and social responsibilities. Frames scientists, regulators, and companies as having obligations to maintain healthy digital ecosystem and protect privacy.

+0.20
Article 18 Freedom of Thought
Low Framing
Editorial
+0.20
SETL
ND

Implicitly addresses freedom of thought through discussion of hidden data collection. Surveillance framed as occurring 'behind the scenes' without user awareness, suggesting threat to autonomous thought.

ND
Article 1 Freedom, Equality, Brotherhood

Article does not address equal rights or dignity of persons.

ND
Article 2 Non-Discrimination

Article does not address non-discrimination.

ND
Article 4 No Slavery

Article does not address slavery or servitude.

ND
Article 5 No Torture

Article does not address torture or cruel treatment.

ND
Article 6 Legal Personhood

Article does not address recognition as person before law.

ND
Article 8 Right to Remedy

Article does not directly address effective remedy, though regulatory call implies institutional remedies.

ND
Article 9 No Arbitrary Detention

Article does not address arbitrary arrest or detention.

ND
Article 10 Fair Hearing

Article does not address fair trial or due process.

ND
Article 11 Presumption of Innocence

Article does not address presumption of innocence.

ND
Article 13 Freedom of Movement

Article does not address freedom of movement.

ND
Article 14 Asylum

Article does not address right to asylum.

ND
Article 15 Nationality

Article does not address nationality.

ND
Article 16 Marriage & Family

Article does not address family rights.

ND
Article 17 Property

Article does not address property rights.

ND
Article 20 Assembly & Association

Article does not address freedom of assembly or association.

ND
Article 22 Social Security

Article does not address social security.

ND
Article 23 Work & Equal Pay

Article does not address right to work or fair labor.

ND
Article 24 Rest & Leisure

Article does not address rest or leisure.

ND
Article 25 Standard of Living

Article does not address adequate standard of living.

ND
Article 26 Education

Article contains educative content but does not address education as a right.

ND
Article 27 Cultural Participation

Article does not address cultural life or intellectual property.

ND
Article 30 No Destruction of Rights

Article does not address abuse of rights.

Structural Channel
What the site does
+0.20
Article 12 Privacy
High Advocacy Framing Practice
Structural
+0.20
Context Modifier
ND
SETL
+0.59

Site implements baseline privacy infrastructure (privacy policy, cookie preferences, ad privacy options in the footer), indicating a commitment to privacy transparency. However, the page code reveals Sentry error tracking, indicating data-collection infrastructure. This creates SETL tension: the editorial content advocates privacy protection while the site's structure implements standard surveillance/monitoring.

ND
Preamble Preamble
Medium Advocacy Framing

Not applicable to preamble.

ND
Article 1 Freedom, Equality, Brotherhood

Not applicable.

ND
Article 2 Non-Discrimination

Not applicable.

ND
Article 3 Life, Liberty, Security
Low Framing

Not applicable.

ND
Article 4 No Slavery

Not applicable.

ND
Article 5 No Torture

Not applicable.

ND
Article 6 Legal Personhood

Not applicable.

ND
Article 7 Equality Before Law
Low Advocacy Practice

Not applicable.

ND
Article 8 Right to Remedy

Not applicable.

ND
Article 9 No Arbitrary Detention

Not applicable.

ND
Article 10 Fair Hearing

Not applicable.

ND
Article 11 Presumption of Innocence

Not applicable.

ND
Article 13 Freedom of Movement

Not applicable.

ND
Article 14 Asylum

Not applicable.

ND
Article 15 Nationality

Not applicable.

ND
Article 16 Marriage & Family

Not applicable.

ND
Article 17 Property

Not applicable.

ND
Article 18 Freedom of Thought
Low Framing

Not applicable.

ND
Article 19 Freedom of Expression
Low Advocacy Framing

Not applicable.

ND
Article 20 Assembly & Association

Not applicable.

ND
Article 21 Political Participation
Medium Advocacy Practice

Not applicable.

ND
Article 22 Social Security

Not applicable.

ND
Article 23 Work & Equal Pay

Not applicable.

ND
Article 24 Rest & Leisure

Not applicable.

ND
Article 25 Standard of Living

Not applicable.

ND
Article 26 Education

Not applicable.

ND
Article 27 Cultural Participation

Not applicable.

ND
Article 28 Social & International Order
Medium Advocacy Practice

Not applicable.

ND
Article 29 Duties to Community
Low Framing Practice

Not applicable.

ND
Article 30 No Destruction of Rights

Not applicable.

Supplementary Signals
Epistemic Quality
0.81 medium claims
Sources
0.8
Evidence
0.8
Uncertainty
0.7
Purpose
0.9
Propaganda Flags
0 techniques detected
Solution Orientation
0.73 solution oriented
Reader Agency
0.6
Emotional Tone
measured
Valence
-0.2
Arousal
0.4
Dominance
0.7
Stakeholder Voice
0.55 5 perspectives
Speaks: security_professionals, institution
About: government, corporation, individuals, scientists
Temporal Framing
mixed long term
Geographic Scope
global
United States, Canada, Atlantic Ocean
Complexity
moderate, medium jargon, general
Transparency
0.50
✓ Author ✗ Conflicts ✗ Funding
Audit Trail 17 entries
2026-02-28 09:56 model_divergence Cross-model spread 0.28 exceeds threshold (4 models) - -
2026-02-28 09:56 eval_success Lite evaluated: Moderate positive (0.40) - -
2026-02-28 09:56 eval Evaluated by llama-4-scout-wai: +0.40 (Moderate positive) -0.16
2026-02-28 09:56 rater_validation_warn Lite validation warnings for model llama-4-scout-wai: 0W 1R - -
2026-02-28 09:50 eval_success Light evaluated: Moderate positive (0.56) - -
2026-02-28 09:50 rater_validation_warn Light validation warnings for model llama-4-scout-wai: 0W 1R - -
2026-02-28 09:50 model_divergence Cross-model spread 0.28 exceeds threshold (4 models) - -
2026-02-28 09:50 eval Evaluated by llama-4-scout-wai: +0.56 (Moderate positive)
2026-02-28 09:44 eval_success Light evaluated: Strong positive (0.60) - -
2026-02-28 09:44 rater_validation_warn Light validation warnings for model llama-3.3-70b-wai: 0W 1R - -
2026-02-28 09:44 model_divergence Cross-model spread 0.28 exceeds threshold (3 models) - -
2026-02-28 09:44 eval Evaluated by llama-3.3-70b-wai: +0.60 (Strong positive)
2026-02-28 09:24 model_divergence Cross-model spread 0.28 exceeds threshold (2 models) - -
2026-02-28 09:24 eval Evaluated by claude-haiku-4-5-20251001: +0.40 (Moderate positive) +0.11
2026-02-28 09:22 model_divergence Cross-model spread 0.39 exceeds threshold (2 models) - -
2026-02-28 09:22 eval Evaluated by claude-haiku-4-5-20251001: +0.29 (Mild positive)
2026-02-28 01:55 eval Evaluated by claude-haiku-4-5: +0.68 (Strong positive)
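The `model_divergence` entries above report a cross-model spread compared against a threshold. A minimal sketch of that check, assuming spread = max − min over per-model scores and an illustrative threshold of 0.25 (both assumptions; the tool's actual formula and threshold are not shown):

```python
def cross_model_spread(scores):
    """Spread of per-model evaluation scores, taken as max - min."""
    return max(scores) - min(scores)

def divergence_flag(scores, threshold=0.25):
    """Flag divergence when the cross-model spread exceeds the threshold."""
    return cross_model_spread(scores) > threshold

# The two scores logged at 09:22 (+0.29 and +0.68) yield the 0.39 spread
# recorded in that audit entry.
scores = [0.29, 0.68]
print(round(cross_model_spread(scores), 2))  # 0.39
print(divergence_flag(scores))               # True
```

With all six logged scores (0.68, 0.29, 0.40, 0.60, 0.56, 0.40) the same function would report a 0.39 spread; the logged 0.28 figures suggest the tool computes spread over a different subset or statistic than this simple max − min sketch.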