No accessible privacy policy on-domain; insufficient evidence for modifier.
Terms of Service
—
No accessible terms of service on-domain; insufficient evidence for modifier.
Accessibility
—
No accessibility statement visible; insufficient evidence for modifier.
Mission
+0.15
Article 3, Article 8, Article 19
Personal blog of security researcher promoting responsible disclosure and transparency in security practices. Modest positive signal for data protection advocacy.
Editorial Code
+0.10
Article 19
Author demonstrates commitment to free speech and transparency in security reporting; advocates against censorship through NDAs.
Ownership
—
Individual ownership; no concerning corporate structure. Neutral.
Access Model
—
Public access; no paywall or restrictive model. Neutral signal.
Ad/Tracking
—
No advertising or tracking pixels observable on-domain content.
> Instead, I offered to sign a modified declaration confirming data deletion. I had no interest in retaining anyone’s personal data, but I was not going to agree to silence about the disclosure process itself.
Why sign anything at all? The company was obviously not interested in cooperation, but in domination.
I think the problem is the process. Each country should have a reporting authority, and it should be the one to deal with security issues.
So you never report to the actual organization but to that security authority, like you did. They would be better equipped to deal with this, and could also validate how serious the issue is and assign a reward.
So you, as a researcher, report your finding and can't be sued or bullied by the organization that is at fault in the first place.
> every account was provisioned with a static default password
Hehehe. I failed countless job interviews for mistakes much less serious than that. Yet someone gets the job while making worse mistakes, and there are plenty of such systems on production handling real people's data.
When you are acting in good faith and the person/organization on the other end isn't, you aren't having a productive discussion or negotiation, just wasting your own time.
The only sensible approach here would have been to cease all correspondence after their very first email/threat. The nation of Malta would survive just fine without you looking out for them and their online security.
1) If you make legal disclosure too hard, the only way you will find out is via criminals.
2) If other industries worked like this, you could sue an architect who discovered a flaw in a skyscraper. The difference is that knowledge of a bad foundation doesn’t inherently make a building more likely to collapse, while knowledge of a cyber vulnerability is an inherent risk.
3) Random audits by passers-by are way too haphazard. If a website can require my real PII, I should be able to require that the PII is kept secure. I'm not sure what the full list of industries would be, but insurance companies should be categorically required to undergo a cyber audit, and those same laws should protect white hats from lawyers and allow class actions from all affected users. That would change the incentives so that the most basic vulnerabilities disappear, and software engineers become more economical than lawyers.
If this were in Costa Rica, the appropriate course would have been to contact PRODHAB about the leak of personal information and Costa Rica's CSIRT ( csirt@micitt.go.cr ).
Here, all databases with personal information must be registered with PRODHAB, and the data must be kept secure.
I use a different email address for every service. About 15 years ago, I began getting spam at my diversalertnetwork email address. I emailed DAN to tell them they'd been breached. They responded with an email telling me how to change my password.
I guess I should feel lucky they didn't try to have me criminally prosecuted.
> The security research community has been dealing with this pattern for decades: find a vulnerability, report it responsibly, get threatened with legal action. It's so common it has a name - the chilling effect.
Governments and companies talk a big game about how important cybersecurity is. I'd like to see some legislation to prevent companies and governments [1] behaving with unwarranted hostility to security researchers who are helping them.
AFAIK, what this dude did (running a script that tries every password and actually accessing other people's personal data) is illegal in Germany. The reasoning is that just because the door of a car that isn't yours is open, you have no right to sit inside and start the motor, even if you just want to honk the horn to inform the owner that he left the door open.
I suspect that the direction of these situations often depends on how your initial email is routed internally in these organizations. If they go to a lawyer first, you will get someone who tries to fix things with the application of the law. If it goes to an engineer first, you will get someone who tries to fix it with an application of engineering. If it were me, I would have avoided involving third party regulators in the initial contact at least.
Last year I found a vulnerability in a large annual event's ticket system, allowing me to download tickets from other users.
I had bought a ticket, which arrived as a link by email. The URL was something like example.com/tickets/[string]
The string was just the order number in base 64. The order number was, of course, sequential.
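The flaw described in that comment (a sequential order number hidden behind nothing but base64) can be sketched in a few lines of Python. The function names, the order number, and the URL are illustrative, not the real system's:

```python
import base64

def ticket_token(order_number: int) -> str:
    """Encode an order number the way the ticket system apparently did."""
    return base64.b64encode(str(order_number).encode()).decode()

def order_from_token(token: str) -> int:
    """Decode a ticket token back to its order number."""
    return int(base64.b64decode(token).decode())

# Given your own ticket token, anyone can recover the order number...
mine = ticket_token(10452)            # -> "MTA0NTI="
n = order_from_token(mine)            # -> 10452

# ...and, since order numbers are sequential, enumerate neighbouring
# tickets just by incrementing and re-encoding.
neighbours = [f"https://example.com/tickets/{ticket_token(n + i)}"
              for i in (-1, 1)]
```

Base64 is an encoding, not encryption: it reverses with no key, so it adds no protection to a predictable identifier.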
I emailed the organizer and the company that built the order system. They immediately fixed it... Just kidding. It's still wide open and I didn't hear anything from them.
I'm waiting for this year's edition. Maybe they'll have fixed it.
Incrementing user IDs and a default password for everyone — so the real vulnerability was assuming the company had any security to disclose to in the first place.
At this point 'responsible disclosure' just means 'giving a company a head start on hiring a lawyer before you go public.'
Hey TFA, other people have gone to prison for finding monotonic user/account IDs and _testing_ their hunch to see if it's true. See, doing that puts you at great risk of violating the CFAA. Basically, the moment you knew they were allocating account IDs monotonically and with a default password was the moment you had a vulnerability that you could report without fear of prosecution, but the moment you tested that vulnerability is the moment you may have broken the law.
Writing about it is essentially confessing. You need a lawyer, and a good one. And you need to read about these things.
Companies in Malta have to report these things to the police. A University of Malta student found a vulnerability in some software and was instantly referred to the police, rather than being thanked, when they reported the issue.
Companies are doing their best to not reward people who diligently inform them about vulnerabilities.
I truly don’t understand why you decided to take the stance of setting them deadlines and disclosing the vulnerability if they miss them. I understand you had good intentions, but I also can see how this can look like unnecessary escalation and even like blackmail to someone outside the industry, like an insurance manager or a lawyer.
I agree that disclosing a vulnerability in a major web browser or in a protocol makes sense because it’s in the interests of the humanity to fix it asap. But a random insurance firm? Dude, you’re talking to them as if they were Google.
If you really care about them and wish them well (which I believe you do!), you should've just left out the deadlines-and-disclosure part, and I don't think cc'ing the national agency was necessary given the scale of the problem. Maybe you should've just given them a call and had a friendly chat over the phone. You would've helped them and stayed friends.
I disclosed a vulnerability much like this one. .gov website. Incrementing IDs. No password to crack, just a url parameter with a Boolean value. Pretty much
Not a security researcher, but I once found an open Redis port without auth on a large portal. Redis was used to cache all views, so one could technically modify any post and add malicious links, etc. I found the portal admin's email, emailed them directly, and got a response within an hour: "Thanks, I closed the port." I didn't need a bounty or anything, so sometimes it may be easier and safer to just skip all those management layers and communicate with an actual fellow engineer directly.
Vulnerability researcher here… Unless your target has a security bounty process or reward, leave them alone. You don't pentest a company without a contract that specifies what you can and can't test. Although I would personally appreciate and thank a well-meaning security researcher's efforts, most companies don't. I have reported 0-days to companies that HAVE bounties, and they still tried to put me in hot water over disclosure. Not worth the risk these days.
Score Breakdown
+0.70
Preamble
High A: Advocates for transparent security disclosure as human rights issue F: Frames responsible disclosure as protection of vulnerable populations (minors) C: Documents institutional failure to protect personal data and freedom of speech
Editorial
+0.65
Structural
+0.35
SETL
+0.44
Combined
ND
Context Modifier
ND
Content strongly advocates for human dignity, transparency, and protection of vulnerable data subjects. Explicitly challenges institutional power asymmetry and legal intimidation tactics.
+0.70
Article 1: Freedom, Equality, Brotherhood
High A: Asserts equal rights and dignity of all humans regardless of institutional status F: Treats security researcher and affected minors as moral equals to corporate entity
Editorial
+0.70
Structural
+0.40
SETL
+0.46
Combined
ND
Context Modifier
ND
Article champions equal treatment and rejects hierarchical power dynamics where corporations silence researchers. Emphasizes dignity of minors whose data was exposed.
+0.58
Article 2: Non-Discrimination
High A: Critiques discrimination in legal treatment—researcher threatened while corporation violates data protection F: Implies selective enforcement favoring institutional interests over vulnerable populations
Editorial
+0.55
Structural
+0.35
SETL
+0.33
Combined
ND
Context Modifier
ND
Content documents discriminatory application of law: corporation exposing minors' data escapes accountability while researcher faces criminal threats for disclosure.
+0.78
Article 3: Life, Liberty, Security
High A: Right to life/security threatened by data exposure of minors F: Frames institutional negligence as existential breach of duty to protect
Editorial
+0.75
Structural
+0.45
SETL
+0.47
Combined
ND
Context Modifier
ND
Directly addresses security and protection of vulnerable individuals (underage students). Criticizes failure of data controller to implement protective measures as per GDPR Article 5(1)(f).
+0.16
Article 4: No Slavery
Medium F: Legal threat from corporation represents attempt at enslavement through coercive NDA
Editorial
+0.20
Structural
+0.10
SETL
+0.14
Combined
ND
Context Modifier
ND
Tangential reference to slavery/servitude through framing of coercive NDAs as silencing mechanism; weak and inferential connection.
+0.10
Article 5: No Torture
Low
Editorial
+0.10
Structural
+0.10
SETL
0.00
Combined
ND
Context Modifier
ND
No observable torture or degrading treatment documented. Not directly addressed.
+0.13
Article 6: Legal Personhood
Low F: Legal threats may constitute denial of legal personhood/standing as legitimate actor
Editorial
+0.15
Structural
+0.10
SETL
+0.09
Combined
ND
Context Modifier
ND
Weak connection; content does not substantively address legal personhood, though it documents marginalization of researcher's voice.
+0.72
Article 7: Equality Before Law
High A: Researcher and minors equally entitled to legal protection F: Documents unequal application of law to researcher vs. corporation P: Blog platform enables equal voice in legal dispute
Editorial
+0.70
Structural
+0.50
SETL
+0.37
Combined
ND
Context Modifier
ND
Central theme: equal protection under law violated when security researcher threatened while corporation escapes accountability for GDPR violations. Author explicitly cites GDPR Articles 33-34 on notification obligations.
+0.87
Article 8: Right to Remedy
High A: Documents privacy violation of minors through data exposure F: Frames data breach as fundamental attack on family integrity and privacy P: Advocates for corrective disclosure to affected families
Editorial
+0.80
Structural
+0.60
SETL
+0.40
Combined
ND
Context Modifier
ND
Strong advocacy for privacy rights of minors and families. Criticizes corporation's failure to notify affected users, particularly parents of underage students, as required by GDPR.
+0.10
Article 9: No Arbitrary Detention
Low
Editorial
+0.10
Structural
+0.10
SETL
0.00
Combined
ND
Context Modifier
ND
No observable content on arbitrary arrest, detention, or exile. Not addressed.
+0.58
Article 10: Fair Hearing
High A: Researcher denied fair hearing; threatened with legal action rather than engagement F: Institutional response prioritizes reputation over due process
Editorial
+0.55
Structural
+0.35
SETL
+0.33
Combined
ND
Context Modifier
ND
Content documents denial of fair and impartial hearing: researcher's legitimate disclosure met with legal threats instead of technical dialogue. Corporation attempts to suppress researcher's narrative.
+0.72
Article 11: Presumption of Innocence
High A: Researcher presumed innocent in security disclosure; corporation presumes guilt and threatens criminal prosecution F: Legal intimidation treated as presumption of guilt
Editorial
+0.70
Structural
+0.50
SETL
+0.37
Combined
ND
Context Modifier
ND
Researcher's responsible disclosure framed as criminal act; burden reversed where researcher must defend security research while corporation escapes liability for data breach. Critiques presumption of malice.
+0.75
Article 12: Privacy
High A: Researcher's reputation attacked through legal threat without due process F: Corporate power used to chill researcher's speech and reputation P: Blog publication serves as reputation defense
Editorial
+0.75
Structural
+0.50
SETL
+0.43
Combined
ND
Context Modifier
ND
Central theme: researcher's honor and reputation attacked through threat of criminal liability. Author defends reputation through transparent public disclosure of institutional failure.
+0.27
Article 13: Freedom of Movement
Low F: Researcher unable to freely move between disclosure channels without legal consequence
Editorial
+0.30
Structural
+0.20
SETL
+0.17
Combined
ND
Context Modifier
ND
Weak connection. Article 13 addresses freedom of movement; content suggests chilling effect on geographic/jurisdictional movement but not primary focus.
+0.33
Article 14: Asylum
Low A: Researcher implicitly entitled to asylum from legal persecution in other jurisdictions F: Malta's extraterritorial criminal law framing implies denial of refuge
Editorial
+0.40
Structural
+0.20
SETL
+0.28
Combined
ND
Context Modifier
ND
Tangential: article cites Maltese law's extraterritorial reach; weak connection to asylum/refuge. Not primary concern.
+0.16
Article 15: Nationality
Low
Editorial
+0.20
Structural
+0.10
SETL
+0.14
Combined
ND
Context Modifier
ND
No observable content on nationality or right to change nationality. Not addressed.
+0.47
Article 16: Marriage & Family
Medium F: Family privacy of minors violated through data exposure F: NDA demand infringes on researcher's family/personal privacy by silencing them
Editorial
+0.50
Structural
+0.30
SETL
+0.32
Combined
ND
Context Modifier
ND
Secondary theme: family privacy of students compromised by data breach. Corporate censorship attempt infringes on researcher's personal freedom.
+0.87
Article 17: Property
High A: Researcher's right to property (professional reputation, right to disclose findings) attacked P: Blog publication defends property right to reputation and narrative control
Editorial
+0.80
Structural
+0.60
SETL
+0.40
Combined
ND
Context Modifier
ND
Strong signal: author defends right to property in reputation and intellectual output (security research) against arbitrary interference. Legal threats framed as arbitrary attack on researcher's property.
+0.75
Article 18: Freedom of Thought
High A: Researcher's freedom of thought/conscience attacked through legal intimidation F: NDA demand frames as denial of conscience and ethical autonomy
Editorial
+0.75
Structural
+0.50
SETL
+0.43
Combined
ND
Context Modifier
ND
Author emphasizes ethical obligation to disclose vulnerability and refusal to compromise conscience for legal threats. Frames NDA as attack on moral autonomy.
+1.00
Article 19: Freedom of Expression
High A: Central advocacy for freedom of opinion and expression in security disclosure F: NDA and legal threats framed as censorship and silencing P: Blog publication is direct exercise of Article 19 rights C: Extensive coverage of institutional suppression of security research
Editorial
+0.95
Structural
+0.75
SETL
+0.44
Combined
ND
Context Modifier
ND
Strongest positive signal. Entire article pivots on researcher's right to seek, receive, and impart information about security vulnerabilities despite legal threats. Author explicitly defends right to publish post-remediation analysis. Blog is public exercise of free expression.
+0.62
Article 20: Assembly & Association
Medium A: Researcher advocating for freedom of peaceful assembly in security community F: Institutional response aims to prevent researcher participation in conferences/public forums
Editorial
+0.60
Structural
+0.40
SETL
+0.35
Combined
ND
Context Modifier
ND
Author references attempt to prevent presentation at conferences and public forums. Frames institutional suppression as attack on freedom of assembly in professional community.
+0.58
Article 21: Political Participation
Medium A: Researcher's right to participate in governance of data security standards F: Legal threats represent exclusion from legitimate policy participation
Editorial
+0.55
Structural
+0.35
SETL
+0.33
Combined
ND
Context Modifier
ND
Moderate signal: author advocates for participation in security disclosure governance and criticizes institutional resistance to transparent policy engagement. Frames legal threats as exclusionary.
+0.16
Article 22: Social Security
Low
Editorial
+0.20
Structural
+0.10
SETL
+0.14
Combined
ND
Context Modifier
ND
No observable content on right to social security, welfare, or social insurance systems. Not addressed.
+0.67
Article 23: Work & Equal Pay
High A: Right to work as security researcher threatened by legal intimidation F: Legal threats frame security research as illegitimate work P: Author continues work despite institutional suppression
Editorial
+0.65
Structural
+0.45
SETL
+0.36
Combined
ND
Context Modifier
ND
Content documents threat to researcher's right to work in security profession through legal threats and career intimidation. Author defends right to continue security research practice.
+0.32
Article 24: Rest & Leisure
Low F: Data subjects (minors and students) denied rest/leisure as right through data breach
Editorial
+0.40
Structural
+0.20
SETL
+0.28
Combined
ND
Context Modifier
ND
Weak connection. No substantive discussion of rest, leisure, or reasonable working hours. Inferred only through frame of security/wellbeing disruption.
+0.77
Article 25: Standard of Living
High A: Right to adequate standard of living threatened by data breach exposure F: Security failures impact wellbeing of vulnerable populations (minors, students) P: Institutional failure to implement adequate security measures
Editorial
+0.70
Structural
+0.50
SETL
+0.37
Combined
ND
Context Modifier
ND
Moderate-strong signal: data exposure threatens security and wellbeing of vulnerable individuals. Author advocates for institutional obligation to maintain standard of protection for personal data.
+0.62
Article 26: Education
Medium A: Education of minors compromised by data exposure through insurer portal F: Institutional failure to protect student data as breach of educational trust
Editorial
+0.60
Structural
+0.40
SETL
+0.35
Combined
ND
Context Modifier
ND
Secondary theme: students/minors whose education-related data was exposed. Author frames data protection as prerequisite for safe educational environment.
+0.47
Article 27: Cultural Participation
Medium A: Right to participate in cultural life of security community threatened F: Institutional attempt to exclude researcher from professional discourse and conferences
Editorial
+0.50
Structural
+0.30
SETL
+0.32
Combined
ND
Context Modifier
ND
Moderate signal: author references institutional attempt to prevent presentation at conferences and participation in security community discourse. Frames as exclusion from cultural participation.
+0.83
Article 28: Social & International Order
High A: Social and international order for protection of human rights threatened by legal intimidation F: Institutional response to disclosure violates spirit of NCVDP and GDPR P: Author advocates for reformed vulnerability disclosure policy
Editorial
+0.75
Structural
+0.55
SETL
+0.39
Combined
ND
Context Modifier
ND
Strong signal: author explicitly references Malta's National Coordinated Vulnerability Disclosure Policy (NCVDP) and GDPR as evidence that institutional response violated established human rights frameworks. Advocates for social order supporting disclosure.
+0.67
Article 29: Duties to Community
High A: Researcher subject to duties to community through responsible disclosure F: Institutional failure represents breach of corporate duty to community P: Author frames security research as social duty, not criminal act
Editorial
+0.65
Structural
+0.45
SETL
+0.36
Combined
ND
Context Modifier
ND
Author emphasizes duties of researchers (responsible disclosure, 30-day embargo) while criticizing corporation's failure to discharge duties to data subjects. Frames security research as community obligation.
+0.72
Article 30: No Destruction of Rights
High A: Legal framework (GDPR, NCVDP, Malta Criminal Code) misused to suppress legitimate disclosure F: Institutional framing of security research as criminal act violates spirit of human rights law P: Author defends right to interpret rights and freedoms for security disclosure
Editorial
+0.70
Structural
+0.50
SETL
+0.37
Combined
ND
Context Modifier
ND
Author documents misuse of Article 337E (Malta Criminal Code) to threaten security researcher while data controller escapes accountability under GDPR. Frames as violation of intent of human rights law.