
> - convert x and y from decimal to binary

What?! What exactly is at the memory address that you think it's converting?




I'm talking specifically about this formula[1] rather than what's happening at the hardware level!

1 - https://d262ilb51hltx0.cloudfront.net/max/800/1*Y5WC--jvMMdU...


But that's only happening in your head. It's an implementation detail of your shell/IDE that you are reading the numbers in base 10.


From the post: "Now, if you find that function dreadful — I am with you. When I first wrote it down, I found it both complicated and unhelpful; after formulating it, my mind was no closer to understanding the kind of pattern the AND function followed, if any."

That's what I'm railing against. If your objective is to understand how bitwise-AND works, trying to decipher it directly in base-10 is folly. I'm not a fan of the graphical approach taken by the post either, and prefer the algebraic way out, which is why I proposed an algebraic base-10 (any-base, really) version.
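
To make that concrete, here is a minimal Python sketch of the digit-wise MIN/MAX reading (the `digitwise` helper and the AND/OR aliases are hypothetical names, not from the thread); in base 2 it reduces to the ordinary bitwise operators:

  # Apply `op` to corresponding digits of x and y, least significant first.
  def digitwise(op, x, y, base=10):
      result, place = 0, 1
      while x or y:
          result += op(x % base, y % base) * place
          x, y, place = x // base, y // base, place * base
      return result

  AND, OR = min, max  # on digits {0, 1} these are exactly bitwise AND/OR

  assert digitwise(AND, 12, 10, base=2) == 12 & 10  # sanity check in base 2
  assert digitwise(OR, 12, 10, base=2) == 12 | 10
  print(digitwise(AND, 324, 310))  # 310: min of each decimal digit pair
  print(digitwise(OR, 324, 310))   # 324: max of each decimal digit pair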


But your algebraic arithmetic with max-wise and min-wise operators is not equivalent at all! In fact, it's a terrible analogy.

Bit-wise arithmetic is great because its operations are compatible with propositional calculus, but I cannot see how you can do anything productive with min/max-defined AND/OR in base 10. For example, how do you determine 324 XOR 310 in your notation?


In this decimal digit context it would be appropriate to define

  NOT d = 9 - d
Similar to

  NOT b = 1 - b 
for binary digits, and matching the standard negation in fuzzy logic.

Taking the logic identity

  a XOR b = ( a AND NOT b ) OR ( NOT a AND b )
it would be

  MAX( MIN( a, NOT( b ) ), MIN( NOT( a ), b ) )
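
Plugging in NOT d = 9 - d, a short sketch (the function names here are mine, not from the thread) answers the 324 XOR 310 question above:

  # Digit-wise XOR via MAX( MIN( a, NOT b ), MIN( NOT a, b ) ),
  # with NOT d = (base - 1) - d generalising NOT d = 9 - d.
  def xor_digit(a, b, base=10):
      NOT = lambda d: (base - 1) - d
      return max(min(a, NOT(b)), min(NOT(a), b))

  def digitwise_xor(x, y, base=10):
      result, place = 0, 1
      while x or y:
          result += xor_digit(x % base, y % base, base) * place
          x, y, place = x // base, y // base, place * base
      return result

  print(digitwise_xor(12, 10, base=2))  # 6, matching 12 ^ 10
  print(digitwise_xor(324, 310))        # 324

Note that, unlike binary XOR, this does not satisfy a XOR a = 0 on larger digits (3 XOR 3 = MAX( MIN( 3, 6 ), MIN( 6, 3 ) ) = 3), which is the usual behaviour of min/max connectives with the standard negation in fuzzy logic.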



