IEEE Spectrum published a 2024 opinion piece by security experts Bruce Schneier and Barath Raghavan that advocates for privacy rights protection through ecosystem-level regulatory frameworks. The authors argue that internet surveillance has become normalized through 'shifting baseline syndrome'—each generation accepts degraded privacy as normal—and call for democratic, scientifically informed governance to preserve privacy as foundational. However, the site itself implements Sentry tracking, requires account login, uses cookies and ad targeting, and paywalls the privacy content, creating a stark contradiction between message and practice.
I agree. Philosophically, politically and financially (in terms of groups I donate to).
But it's too late. The frame shift is too great for most people. Getting people to care about this, let alone vote in a manner that would cause actual change, is so far out of reach that I have a better chance of winning the lottery.
I don't know what to do other than to support and educate where I can.
Companies like Google have had access to our full email, search, location, photo roll, video viewing, docs, etc. history for 10+ years. I don't think also having our LLM prompts fundamentally changes this picture.
I guess this is what the shifting baseline argument refers to.
Having said that, I think it's rational for almost all people to not care to give up privacy for all these (addictively) amazing tools. Most people don't do anything worth privacy protecting. Having worked in data/engineering at bigtech, it's not like there's a human on the other side reviewing what each user is doing. For almost all people the data will just be used for boring purposes to build models for better marketing/ads/recommendations. A lot of the models aren't even personalised, the user is just represented by some not-human-readable feature vector that again nobody looks at.
Hell, I have multiple Google Home devices that are always on and listening, and the thing's internal model is so basic and not-personalized that after multiple years it still has trouble parsing me when I say "Play jazz" and "Stop", even though these are the 2 commands I exclusively use. Sometimes it starts playing acid rock, and when I say "Stop" it starts reading me stock quotes.
Snowden has shown us that Microsoft and every other big tech will happily give NSA the keys. Given that OpenAI is fully beholden to Microsoft both in terms of ownership and compute, I have to assume NSA gets whatever they want from OpenAI, directly or through Microsoft.
Now, apps and services are integrating AI like crazy, penetrating just about every type of information, and many of these integrations are sending data to OpenAI, voluntarily. People are sending their private thoughts to OpenAI as they brainstorm, as they write, before fully fledged ideas are even formed. NSA must be enjoying this transparency now.
> one that respects people’s privacy rights while also allowing companies to recoup costs for services they provide
Big tech is a bit like the tobacco companies were at some point: we've convinced people and ourselves that certain things are good, even healthy, while they're actually the opposite.
We, the people who build and maintain these services, are in a position to push back and ensure that humans and their privacy rank above any other concern.
i thought they were getting flak for building a local activity log for a local assistant?
the linked sources don't really talk about any kind of user activity logging.
i suppose the point stands though: modern tech is kinda like 70s and 80s wingnut nightmares come true. the tv actually does watch you now, and when you read something in the standard way, detailed information about your demographics and behavior is shared and then stored within milliseconds across tens of parties.
how many entities with an ein get notified when you read that ieee spectrum article? too many!
People are individuals; some really don't mind trading their privacy for lower costs (their data is mined and doesn't belong to them) or additional features.
That doesn't mean it should be the norm though. If enough people actually care about their privacy, there is room in the market for both cheap, privacy-invasive products and expensive, privacy-respecting ones.
Shameless plug: I launched Cognos[0] a few days ago that encrypts your AI Prompts and Outputs. No risk of leaks/hacks/being used for training.
If you like generative AI but are worried about putting all your data into ChatGPT and don't know how/want to run inference infrastructure yourself it could be something for you.
Trying to understand consumer privacy behaviors outside the prevalent social contract that the vast majority of people operate under is bound to misinterpret what is happening and why.
We live in a regulated "supermarket" economy. What surfaces on a screen is entirely analogous to what surfaces on a shelf: People check the price and make their choices based on taste, budget etc. They are not idiots, they operate under a simplifying assumption that makes life in a complex world possible.
The implicit assumption central to this way of organising the economy is that anything legally on sale is "safe". That it has been checked and approved by experts who know what they are doing and have the consumer interest as top priority.
People will not rush back home to their chemistry labs to check what is in their purchased food, whether it corresponds to the label (assuming that such a label even exists) and what might be the short or long term health effects. They don't have the knowledge, resources and time to do that for all the stuff they get exposed to.
What has drifted in the digital economy is not consumer standards, it is regulatory standards. Surfacing digital products with questionable short and long term implications for individuals and society has become a lucrative business, has captured its regulatory environment and will keep exploiting opportunities and blind spots until there is pushback.
Ultimately regulators only derive legitimacy from serving their constituencies, but that feedback loop can be very slow and it gets tangled with myriad other unrelated political issues.
It would have been fair if the author had mentioned that Microsoft is very open about this. I attended a Microsoft training where the instructor also made clear that the service is being monitored. Suspicious messages get flagged and are reviewed under a four-eyes principle. [1]
At least in the EU (I'm told) they are required by law to keep logs to ensure that their AI services are not being used for harm.
> we need to step back and look at what a healthy technological ecosystem would look like: one that respects people’s privacy rights while also allowing companies to recoup costs for services they provide.
User-community owned services/infrastructure could reasonably be the way towards services that run well, recoup costs and prevent abuses (both of the service and of users). Are there any examples of this kind of thing for AI yet?
A lot of this is due to the failure of people's privacy instincts. People do have a strong instinctive desire for privacy, but only against other identifiable people. When private data are revealed to people they know or can be connected to their home address, then they are very upset. But they don't react when 'faceless' organizations collect their data, and use it in ways that are opaque, even when this is strongly to their disadvantage (such as HR departments sharing salary data).
Anti-privacy interests are completely aware of this and avoid triggering the visceral response. For example, the spy agencies always give the example of "no-one is listening to your calls", as if the risk of surveillance was the guys transcribing from reel-to-reel tape recordings, not the dossier available to the powerful on each citizen. But the guy listening to our words on the tape is the one we have a visceral response to. Similarly, Facebook's advertising terms (and presumably those of other companies) are careful to ban advertisers from revealing the targeting and giving the impression that there is a person at the other end who knows all about you.
I think the upshot of this is that we need to personalise it for the wider public, not just talk about abstracts like privacy. Bring out the fact that your next employer has that data when negotiating your salary. That companies spend thousands of hours figuring out how to manipulate you based on your data. That at the press of a button, the security services could assemble a dossier on you, and make it available to anyone powerful that you have annoyed.
So it is a keylogger rebranded as "AI", and I'm sure Microsoft protected themselves in the EULA.
Did this violate the legal definition of an "expectation of privacy"?
In most WIPO countries, you can't legally record other people's conversations unless both parties give prior consent (hence the quality-assurance disclaimer on phone services). Single-party consent does not mean what most people think it means, and has launched many lawsuits.
I usually recommend this list after a windows 11 "offline" install:
They sure don't make it easy to un-dork your PC, but it works if you need to run the OS for that one legacy application that people refused to port to Mac/Linux.
Good luck, and make sure to sue them in a jury trial... They won't change their behavior unless there are fiscal disincentives. =)
Good thing that we saved all those fish since learning about relative baselines…
A very nothing article. It amazes me people think some private company owes them anything like privacy or free speech. That’s not how that relationship works. It’s an exchange relationship. Transactional. Not an ethical relationship. The only motivation for ethics here is PR. That’s the best you can do.
Strongly agree with this. Most folks assume that big tech will keep their data safe because, well, they're big corporations with 'rules', and therefore they must have people working 24/7 keeping customer data safe, right? right?? This is just like how we assume the data centers and servers keeping all our data are safe and secure, when in fact tons of data have been accidentally deleted by companies all the time. We just don't hear about it as often.
A reasonable first step would be to disallow targeted advertising (because it is the reason so much data is being collected).
I am not a big fan of advertising because its purpose is to manipulate thoughts. But I also like the idea of having a less formal form of paying for services. So, having ad-driven businesses should be okay as long as the thought manipulation machines have limits.
However, modern ad targeting has become so sophisticated and powerful that it creates problems on a global level (remember Cambridge Analytica). All the big ad companies have to play the targeting game because the others will eat their lunch if they don't.
If we want to protect users from ad companies' total exploitation, we need regulation. And if the rules change for all the players equally, it should stay a fair game.
But the Impossible Burger did more than talking ever did
Government tried antitrust for decades with telcos
But the costs of long distance calls dropped to practically zero when open VoIP protocols came out
The Web disrupted AOL, MSN, CompuServe, TV, Newspapers, Magazines etc.
Wikipedia disrupted Britannica and Encarta
The answer: open source must get good enough that people can switch from big-tech, corporation-controlled solutions, and then they will be in control of their own software! :)
I've seen first hand just how little the average person cares about privacy. The type of people who believe they have nothing to hide. My friends and family don't understand why I don't post everything to social media. No I'm not going to install this app and scan all my receipts. BuT wHy?!?!?!
I think many people here on hackernews live in a bubble and just don't understand what is an average human being. They surround themselves with like-minded tech savvy individuals and fail to comprehend how there isn't stronger support for privacy.
is it such a problem if people don't care? it's not like they aren't aware.
i mean it's not too complicated to get to a desired privacy standard for the most part, so it seems like the privacy conscious in general get angry on behalf of other people, who don't even care.
it's because of those people that we can even use these products for free. i mean they aren't making ad revenue off of us....
> it's too late. The frame shift is too great for most people
The capacity of the current crop of privacy advocates to make a cogent case is lacking. People's capacity to recognise and rearrange themselves in defence of a novel threat is not. If anything, our present malaise is one of allergic reactions to phantom threats.
> I don't think also having our LLM prompts fundamentally changes this picture
Is that what is being discussed? The biggest issue with Microsoft's new AI announcement was that their system was going to take screenshots of your computer every second and process them with AI. That means they could have way, way more data about you than LLM prompts.
> Most people don't do anything worth privacy protecting.
That may be true, but most people still benefit indirectly from the actions of the few who do have something to hide (e.g. protestors, journalists, whistleblowers).
If we want the masses to continue to benefit from the actions of those few, then we need to find a middle ground. Someplace where the masses are private enough that a truly private individual can hide among them without sticking out like a sore thumb. You don't need to hide, you just need to be able to hide.
>We, the people who build and maintain these services, are in a position to...
Speak for yourself. Most people in tech do not work for Google or Microsoft. Working in tech alone doesn't give you some kind of power to offer pushback in these tech companies, just like being a random worker in the agriculture sector doesn't give you power to offer pushback against the tobacco companies.
Unfortunately, most engineers I've seen (I come from an engineering background) care mostly about technical stuff and lack the EQ to see the bigger picture.
> Most people don't do anything worth privacy protecting.
That rationale sounds great (albeit dismissive/invalidating) until something you've done (and have provided ample digital evidence of) becomes illegal or is otherwise used against you.
Oh actually, what's your email password? I mean, since you're not doing anything worth keeping private, right?
You think there isn't a human reviewing the data of what each user is doing, but there absolutely could be, and there's no reason there can't be, like when Tesla employees were viewing and exfiltrating footage/imagery from customers' vehicles. Not just one or two people but apparently disparate _groups_ of employees. https://www.reuters.com/technology/tesla-workers-shared-sens...
The reason why people care about privacy is not necessarily because giving up privacy has some directly observable negative effect. But, simply, living without freedom sucks.
I don't want you to know my personal information not because you could/would do something nefarious with it. I don't want you to have it simply because it's none of your business.
Not sure that regulatory capture explains the poor quality of digital goods and how people value them. I think it's that cargo cults (what we have) are antithetical to true technological societies (which is what we say we want).
A fault I see in Bruce Schneier's article, and the general hypothesis of "frame/baseline shifting": it's not that people have been conditioned or have forgotten the value of privacy, but that we're looking at the world through ever smaller lenses. The story in the article is that science is guilty of that too. The decline has been around education and perspective in general, not just attitudes to a small issue like "privacy". We thought the internet would widen our scope. It narrowed it.
In the UK we have a great travel and culture show by Romesh Ranganathan. (I highly recommend it, so get on your VPN to watch BBC or find it on the torrents.) It will cheer you up [0]. He's a funny guy. But also the cultural vista is breath-taking. Looking at life in central Africa, it's great to be reminded of the diversity of humankind. Watch for the little things, like a whole bus queue of people, none of whom are on mobile phones.
What the Internet promised, the great "conversation of mankind", never emerged. Instead we got cat memes and social control media that forced people into ever smaller parochial silos. HN is no different. Here it's cool to have a bleak outlook on humanity and technology. "Oh it's too late... we're doomed... oh woe is me!" C'mon hackers... what happened to the joy of shaping the world? :)
Bruce Schneier appeals to a macro systems theory metaphor, and mentions Daniel Pauly [1]. But he neglects some of the more profound lessons that Forrester, Meadows and indeed Norbert Wiener gave us about feedback and the empty dream of cybernetic governance. Nothing as big as humanity will fit in a bottle that small unless you're willing to destroy it in the process. And what you're destroying is the very innovative base that gave you the technology in the first place. This is why they should teach history, geography and other cultures in schools.
Normalization of abusive behavior is not OK! If one company abuses you for 10+ years, then it is not OK for other companies to abuse you!
If big data were not lucrative, they could not sell your data. If your data were not valuable, Facebook and Google would immediately remove it from their servers without hesitation.
Mantra.
- I don't care about privacy
- I don't care that big tech has access to all my data
- I don't care that Google has access to all my politicians' data
- I don't care that I am building a worse future for society
- I don't care that I am being recorded while having sex in a Tesla
- I don't care that I am building a surveillance state
- I don't care that my data are being sold to China, to India, to wherever the highest bidder lives
- I don't care how my data are being used. I don't care if my data are being used to train military robot dogs that will be used in wars
- I don't care that I will be denied insurance because my medical data are sold to whoever
> The implicit assumption central to this way of organising the economy is that anything legally on sale is "safe". That it has been checked and approved by experts who know what they are doing and have the consumer interest as top priority.
> People will not rush back home to their chemistry labs to check what is in their purchased food, whether it corresponds to the label (assuming that such a label even exists) and what might be the short or long term health effects. They don't have the knowledge, resources and time to do that for all the stuff they get exposed to.
What you describe is a feature of a high-trust society, where you don't have to double-check every single transaction or interaction you enter into, but can take most statements on trust. This allows people to get on with the fundamental task at hand, rather than dealing with the overhead of checking their food in the chemistry lab, or whatever the equivalent is for the specific transaction.
I have read suggestions that this was a major contributor to the growth of the Western economies, relative to other low-trust societies. If this was the case, we are in for a bumpy ride, as we seem to be rapidly changing from a high-trust to a low-trust society.
> The implicit assumption central to this way of organising the economy is that anything legally on sale is "safe". That it has been checked and approved by experts
This applies to certain products only, and even in the EU, where these laws are stricter, some products are regularly withdrawn from sale (mostly over quality issues, not intentional actions), even as far as drugs are concerned. It's one of the reasons I tend to buy fresh products, mostly from bio shops, to increase my chances.
Schneier used to talk about "the Exxon Valdez of privacy", the idea that there would be a single giant spill that had significantly bad effects enough that it would force change.
That has basically not happened. It sometimes seems that the situation has got worse in terms of public debate, due to the usual bad-faith actors. For example, the TikTok discussion is not framed around privacy in general but focuses on "China bad". With the implication that an algorithmic megacorp controlling political sentiment through feeds is completely fine so long as it's Americans doing it. And the voting security discussion: there were questions about voting machines long before 2020, but partisan attacks focused on discrediting valid results.
You chose an analogy close to my heart! I have managed to convince myself that most "food" in the supermarket is inedible poison!
It's exhausting and a real nuisance to my quality of life but I equally refuse to knowingly consume excess additives unless completely in a pinch.
Needless to say I'm also very suspicious of online businesses. Although I'm actually getting a bit fatigued/defeatist by privacy issues. We're all so overwhelmingly in this ship that I don't know what I really stand to gain by constantly hamstringing myself digitally...
If we wake up in the worst case scenario, I'm sure I have enough of a footprint I wouldn't be able to meaningfully hide much from a determined bad actor...
I had a look at your offering, and unfortunately I’m not convinced. Obviously you need to send plain text to the actual AI model in use, which is confirmed by
> Sending a message involves transmitting the message to the server in plaintext over TLS encrypted connections. Your plaintext message is encrypted with the conversation public key and saved before being forwarded to your AI model of choice, again in plaintext over a TLS encrypted connection. When the AI model has generated a response, this is returned to our server. This response is also encrypted with the conversation public key and saved before being forwarded back to you.
The model operator still gets everything, just not trivially traced back to the user and not collected in one place. In addition, you get everything, and there's no way to verify you're not inspecting and/or storing cleartext user info, or haven't been hacked to do so. It's basically a pinky promise from an unknown entity (no offense, but that's how it looks from the user's point of view) whose main value proposition is that pinky promise.
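The quoted flow can be sketched in a few lines. Everything here is hypothetical (no real Cognos API, and base64 standing in for actual public-key encryption); the point is only to make the trust boundary visible: encrypting at rest changes what is *stored*, not what the model operator *receives*.

```python
import base64

def seal_for_conversation(plaintext: str) -> str:
    # Toy stand-in for sealing to a "conversation public key".
    # A real system would use an asymmetric scheme (e.g. libsodium
    # sealed boxes); base64 only marks where ciphertext would live,
    # it is NOT encryption.
    return base64.b64encode(plaintext.encode()).decode()

def relay_prompt(plaintext_prompt: str, storage: list) -> str:
    """Models the quoted flow: the relay encrypts the prompt at rest,
    but still forwards the plaintext to the AI model over TLS."""
    storage.append(seal_for_conversation(plaintext_prompt))  # saved encrypted
    return plaintext_prompt  # forwarded to the model in plaintext

storage = []
prompt = "my private brainstorm"
sent_to_model = relay_prompt(prompt, storage)

# The critique in one check: what the model operator sees is unchanged.
assert sent_to_model == prompt
print("model operator receives:", sent_to_model)
```

Under these assumptions, at-rest encryption protects against a database leak, but both the relay (pre-encryption) and the model operator still handle the cleartext, which is exactly the pinky-promise boundary described above.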
Not really. There's Apple (less spying, more lock-in) and there's Linux (build-your-own operating system construction kit for DIY-loving folks. The prefabs aren't good enough that you can ignore what's underneath.).
> ...as if the risk of surveillance was the guys transcribing from reel-to-reel tape recordings
At a minimum, the guy transcribing the tapes is an accomplice to the surveillance effort.
As a society, we desperately need privacy regulations to catch up to the digital age. Legislation will take years (decades?) to catch up, though. In the meantime, as tech industry professionals, we have a responsibility to not be the guy transcribing the tape.
Right now, there's a lot of incentive to be that guy, and good data ethics regulation will remove those incentives from our industry. Until those incentives go away, it's up to us to make good moral choices, even if that means turning down a fat paycheck. How do we, as an industry, self-regulate while legislation catches up?
>We, the people who build and maintain these services, are in a position to push back and ensure that humans and their privacy rank above any other concern.
So, a Hollywood-style strike with the aim to do so?
Did you sue them or have a criminal case brought against them? I think corporate espionage is illegal in many countries, probably in yours too.
Editorial Channel
What the content says
+0.70
Article 12: Privacy
High Advocacy Framing Practice | Editorial | SETL: +0.59
CORE ARTICLE. Entire argument centers on privacy erosion through normalized surveillance. Uses Daniel Pauly's 'shifting baseline syndrome' from fisheries to explain how each generation accepts lower privacy expectations as normal. References U.S. Supreme Court standard that privacy rights depend on 'reasonable expectation of privacy.' Advocates for ecosystem-wide regulatory approach to restore privacy protections.
FW Ratio: 71%
Observable Facts
Article states: 'Each generation of researchers came of age in a new ecological and technological environment, inadvertently masking an exponential decline' and applies to privacy expectations.
Authors write: 'Internet surveillance, and the resultant loss of privacy, is following the same trajectory' as fisheries collapse.
Article references: 'The U.S. Supreme Court has long held that our right to privacy depends on whether we have a reasonable expectation of privacy.'
Page footer displays links to 'IEEE Privacy Policy' and 'Cookie Preferences.'
Inferences
Fisheries analogy demonstrates authors view privacy erosion as a systematic, cumulative problem requiring structural intervention.
Simultaneous presence of privacy disclosures and tracking infrastructure indicates the publisher acknowledges privacy principles while participating in standard surveillance practices.
+0.50
Preamble
Medium Advocacy Framing | Editorial | SETL: ND
Article emphasizes privacy as a fundamental right aligned with preamble's emphasis on human dignity. Authors argue for 'stepping back' to examine healthy technological ecosystem and call for systemic protection of privacy as heritage for next generation.
FW Ratio: 60%
Observable Facts
Article concludes: 'A scientifically informed and democratic regulatory process is required to preserve a heritage—whether it be the ocean or the Internet—for the next generation.'
Authors frame privacy erosion as systematic problem requiring ecosystem-wide solutions.
Article emphasizes that privacy expectations are shaped by technological baselines rather than fixed principles.
Inferences
The framing of privacy as a 'heritage' to preserve suggests authors view privacy protection as foundational human concern aligned with UDHR preamble.
Call for systemic regulatory action implies belief that privacy protection requires structural changes beyond individual or market-based approaches.
+0.50
Article 21: Political Participation
Medium Advocacy Practice | Editorial | SETL: ND
Central to authors' solution. Advocates for 'scientifically informed and democratic regulatory process' to govern privacy protection. Frames privacy as requiring democratic participation and oversight, analogous to fisheries management integration.
FW Ratio: 67%
Observable Facts
Article concludes: 'A scientifically informed and democratic regulatory process is required to preserve a heritage—whether it be the ocean or the Internet—for the next generation.'
Authors describe fisheries model: 'They then turn these scientifically derived sustainable-catch figures into limits to be codified by regulators.'
Inferences
Emphasis on democratic processes implies privacy protection decisions should involve stakeholder participation and deliberation, not solely technical expertise or corporate self-regulation.
+0.40
Article 28: Social & International Order
Medium Advocacy Practice | Editorial | SETL: ND
Authors advocate for ecosystem-wide approach to privacy governance, treating digital infrastructure as shared resource requiring coordinated order. Proposes regulatory frameworks analogous to international fisheries management.
FW Ratio: 67%
Observable Facts
Article states: 'Instead of comparing to a shifting baseline, we need to step back and look at what a healthy technological ecosystem would look like: one that respects people's privacy rights.'
Authors draw parallel between fisheries management frameworks and needed privacy governance structures.
Inferences
Ecosystem framework implies recognition that privacy protection requires coordination across technological, regulatory, and social systems at global scale.
+0.30
Article 3: Life, Liberty, Security
Low Framing | Editorial | SETL: ND
Implicitly addresses security through discussion of hacker monitoring and threat detection. Privacy erosion implicitly framed as reducing digital security.
FW Ratio: 67%
Observable Facts
Article states: 'Microsoft recently caught state-backed hackers using its generative AI tools to help with their attacks' and discusses corporate monitoring necessity.
Authors note data collection happens 'behind the scenes' without explicit user awareness, suggesting threat to security.
Inferences
Discussion of surveillance as security measure suggests authors recognize tension between privacy protection and security monitoring.
+0.30
Article 7: Equality Before Law
Low Advocacy Practice | Editorial | SETL: ND
Authors advocate for 'scientifically informed and democratic regulatory process' to codify privacy protections, implicitly supporting equal protection through rule of law.
FW Ratio: 67%
Observable Facts
Article states: 'A scientifically informed and democratic regulatory process is required to preserve a heritage—whether it be the ocean or the Internet—for the next generation.'
Authors reference fisheries management where 'scientists derived sustainable-catch figures into limits to be codified by regulators.'
Inferences
Call for regulatory codification of privacy protections implies institutional mechanisms to ensure equal protection.
+0.30
Article 19: Freedom of Expression
Low Advocacy Framing | Editorial | SETL: ND
Article exemplifies freedom of expression by critically analyzing corporate surveillance. Implicitly frames privacy as necessary condition for free expression by highlighting how ubiquitous surveillance can suppress speech.
FW Ratio: 67%
Observable Facts
Article published as labeled 'Opinion' piece on major publication without apparent censorship.
Authors directly critique Microsoft's surveillance practices and normalization of data collection.
Inferences
Ability to publish corporate critique demonstrates freedom of expression, while content suggests surveillance threatens such freedom.
+0.30
Article 29: Duties to Community
Low Framing Practice | Editorial | SETL: ND
Implicitly addresses collective duties and social responsibilities. Frames scientists, regulators, and companies as having obligations to maintain healthy digital ecosystem and protect privacy.
FW Ratio: 67%
Observable Facts
Article describes fisheries scientists' responsibility to establish sustainable limits via regulatory codification.
Authors propose parallel responsibilities in tech sector to maintain ecosystem health and resist normalization of privacy erosion.
Inferences
Parallel between fisheries stewardship and digital ecosystem stewardship suggests privacy protection is collective duty incumbent on institutions.
+0.20
Article 18: Freedom of Thought
Low Framing | Editorial | SETL: ND
Implicitly addresses freedom of thought through discussion of hidden data collection. Surveillance framed as occurring 'behind the scenes' without user awareness, suggesting threat to autonomous thought.
FW Ratio: 50%
Observable Facts
Article notes: 'Most apps and services are designed to be always-online, feeding usage information back to the company' and 'behind the scenes there's a complex cloud-based system keeping track of that input.'
Inferences
Emphasis on hidden monitoring suggests authors view transparency in data collection as necessary for autonomous decision-making.
ND
Article 1Freedom, Equality, Brotherhood
Article does not address equal rights or dignity of persons.
ND
Article 2Non-Discrimination
Article does not address non-discrimination.
ND
Article 4No Slavery
Article does not address slavery or servitude.
ND
Article 5No Torture
Article does not address torture or cruel treatment.
ND
Article 6: Legal Personhood
Article does not address recognition as person before law.
ND
Article 8: Right to Remedy
Article does not directly address effective remedy, though its call for regulation implies institutional remedies.
ND
Article 9: No Arbitrary Detention
Article does not address arbitrary arrest or detention.
ND
Article 10: Fair Hearing
Article does not address fair trial or due process.
ND
Article 11: Presumption of Innocence
Article does not address presumption of innocence.
ND
Article 13: Freedom of Movement
Article does not address freedom of movement.
ND
Article 14: Asylum
Article does not address right to asylum.
ND
Article 15: Nationality
Article does not address nationality.
ND
Article 16: Marriage & Family
Article does not address family rights.
ND
Article 17: Property
Article does not address property rights.
ND
Article 20: Assembly & Association
Article does not address freedom of assembly or association.
ND
Article 22: Social Security
Article does not address social security.
ND
Article 23: Work & Equal Pay
Article does not address right to work or fair labor.
ND
Article 24: Rest & Leisure
Article does not address rest or leisure.
ND
Article 25: Standard of Living
Article does not address adequate standard of living.
ND
Article 26: Education
Article contains educative content but does not address education as a right.
ND
Article 27: Cultural Participation
Article does not address cultural life or intellectual property.
ND
Article 30: No Destruction of Rights
Article does not address abuse of rights.
Structural Channel
What the site does
+0.20
Article 12: Privacy
High Advocacy Framing Practice
Structural
+0.20
Context Modifier
ND
SETL
+0.59
Site implements baseline privacy infrastructure (privacy policy, cookie preferences, ad privacy options in the footer), indicating a commitment to privacy transparency. However, the page code reveals Sentry error tracking, indicating data-collection infrastructure. This creates SETL tension: the editorial channel advocates privacy protection while the structural channel implements standard surveillance/monitoring.
ND
Preamble
Medium Advocacy Framing
Not applicable to preamble.
ND
Article 1: Freedom, Equality, Brotherhood
Not applicable.
ND
Article 2: Non-Discrimination
Not applicable.
ND
Article 3: Life, Liberty, Security
Low Framing
Not applicable.
ND
Article 4: No Slavery
Not applicable.
ND
Article 5: No Torture
Not applicable.
ND
Article 6: Legal Personhood
Not applicable.
ND
Article 7: Equality Before Law
Low Advocacy Practice
Not applicable.
ND
Article 8: Right to Remedy
Not applicable.
ND
Article 9: No Arbitrary Detention
Not applicable.
ND
Article 10: Fair Hearing
Not applicable.
ND
Article 11: Presumption of Innocence
Not applicable.
ND
Article 13: Freedom of Movement
Not applicable.
ND
Article 14: Asylum
Not applicable.
ND
Article 15: Nationality
Not applicable.
ND
Article 16: Marriage & Family
Not applicable.
ND
Article 17: Property
Not applicable.
ND
Article 18: Freedom of Thought
Low Framing
Not applicable.
ND
Article 19: Freedom of Expression
Low Advocacy Framing
Not applicable.
ND
Article 20: Assembly & Association
Not applicable.
ND
Article 21: Political Participation
Medium Advocacy Practice
Not applicable.
ND
Article 22: Social Security
Not applicable.
ND
Article 23: Work & Equal Pay
Not applicable.
ND
Article 24: Rest & Leisure
Not applicable.
ND
Article 25: Standard of Living
Not applicable.
ND
Article 26: Education
Not applicable.
ND
Article 27: Cultural Participation
Not applicable.
ND
Article 28: Social & International Order
Medium Advocacy Practice
Not applicable.
ND
Article 29: Duties to Community
Low Framing Practice
Not applicable.
ND
Article 30: No Destruction of Rights
Not applicable.
Supplementary Signals
How this content communicates, beyond directional lean.
build 73de264+3rh4 · deployed 2026-02-28 13:33 UTC · evaluated 2026-02-28 13:37:02 UTC