You can go even further with this in other languages, with things like dependent typing, which can assert (among other interesting properties) that, for example, something like `get_elem_at_index(array, index)` can never have `index` outside the bounds of `array` - but checked statically at compile time, and this is the key, without knowing a priori what the length of `array` is.
"In Idris, a length-indexed vector is Vect n a (length n is in the type), and a valid index into length n is Fin n ('a natural number strictly less than n')."
Similar tricks work for division that might result in inf/-inf, preventing such expressions from typechecking, with more subtle implications for e.g. higher-order types and functions.
Dividing a float by zero is usually perfectly valid. It has predictable outputs, and for some algorithms like collision detection this property is used to remove branches.
This reminds me a bit of a recent publication by Stroustrup about using concepts... in C++ to validate integer conversions automatically where necessary.
{
    Number<unsigned int> ii = 0;
    Number<char> cc = '0';
    ii = 2;      // OK
    ii = -2;     // throws
    cc = i;      // OK if i is within cc's range (i: an int declared earlier)
    cc = -17;    // OK if char is signed; otherwise throws
    cc = 1234;   // throws if a char is 8 bits
}
Note that the division-by-zero example used in this article is not the best example to demonstrate "Parse, Don't Validate," because it relies on encapsulation. The principle of "Parse, Don't Validate" is best embodied by functions that transform untrusted data into some data type which is correct by construction.
Alexis King, the author of the original "Parse, Don't Validate" article, also published a follow-up, "Names are not type safety" [0], clarifying that the "newtype" pattern (such as hiding a nonzero integer in a wrapper type) provides weaker guarantees than correctness by construction. Her original "Parse, Don't Validate" article also includes the following caveat:
> Use abstract datatypes to make validators “look like” parsers. Sometimes, making an illegal state truly unrepresentable is just plain impractical given the tools Haskell provides, such as ensuring an integer is in a particular range. In that case, use an abstract newtype with a smart constructor to “fake” a parser from a validator.
So, an abstract data type that protects its inner data is really a "validator" that tries to resemble a "parser" in cases where the type system itself cannot encode the invariant.
The article's second example, the non-empty vec, is a better example, because it encodes within the type system the invariant that one element must exist. The crux of Alexis King's article is that programs should be structured so that functions return data types designed to be correct by construction, akin to a parser transforming less-structured data into more-structured data.
The examples in question propagate complexity throughout related code. I think this is a case, common in Rust, of using too many abstractions and taking on their associated complexity.
I would just (as a default; the situation varies)... validate prior to the division and handle as appropriate.
The analogous situation I encounter frequently is indexing, e.g. checking whether an index is out of bounds. Similar idea: check, print or display an error, then fail that computation without crashing the program. It's usually an indication of some bug, which can then be tracked down. Or, if it's an array that is frequently indexed, use a `get` method (canonical in Rust's core) on whatever struct owns the array; it returns an Option.
I do think either the article's approach or validating is better than runtime crashes! There are many patterns in programming. Using types in this way is something I see a lot in OSS Rust, but it is not my cup of tea. Not heinous in this case, but I think not worth it.
This is the key to this article's philosophy, near the bottom:
> I love creating more types. Five million types for everyone please.
The `try_roots` example here is actually a _counterexample_ to the author's main argument. They explicitly ignore the "negative discriminant" case. What happens if we consider it?
If we take their "parse" approach, then the types of the arguments a, b, and c have to somehow encode the constraint `b^2 - 4ac >= 0`. This would be a total mess; I can't think of any clean way to do it in Rust. It makes _much_ more sense to simply return an Option and do the validation within the function.
In general, I think validation is often the best way to solve the problem. The only counterexample, which the author fixates on in the post, is when one particular value is constrained in a clean, statically verifiable way. Most of the time, validation is used to check (possibly complex) interactions between multiple values, and "parsing" isn't at all convenient.
This exact philosophy is why I started treating UI design systems like compilers.
Instead of validating visual outputs after the fact (like linting CSS or manual design reviews), you parse the constraints upfront. If a layout component is strictly typed to only accept discrete grid multiples, an arbitrary 13px margin becomes a compile error, not a subjective design debate. It forces determinism.
Every time you introduce a type for a "value invariant" you lose compatibility and force others to make cumbersome type conversions.
To me, invalid values are best expressed with optional error returns along with the value that are part of the function signature. Types are best used to only encode information about the hierarchy of structures composed of primitive types. They help define and navigate the representation of composite things as opposed to just having dynamic nested maps of arbitrary strings.
C# gets close to this with records + pattern matching, F# discriminated unions are even better for this with algebraic data types built right in. A Result<'T,'Error> makes invalid states unrepresentable without any ceremony. C# records/matching works for now, but native DUs will make it even nicer.
Parsing over validation, and newtypes for everything, fall over when you don’t know the full range of possibilities that can occur in the wild.
It is a handy way to prevent divide-by-zero as in the article, or to have fun with lambda calculus by asking the type system whether 3 + 4 == 8. You can reason about the full range of inputs. Same for file format parsing - making as many failure modes as possible fail as early as possible!
But be VERY wary of using them to represent business logic or state machines that allow only the transitions you believe can exist at this point in time. You just don’t know what wild things people will want to do in business logic, and if your software can’t handle those scenarios, people will just work around it and your stuff no longer matches reality.