Not sure I buy that explanation. Looking through this repo, the kernel, drivers, and many core libraries seem to be written from scratch with no dependencies. There are some deps for reading images and other things outside of kernel/driver functionality, but this seems mostly written from the ground up (a Rust-based OS could bind to a C/C++ PNG or sound-file library in the same way).

Writing an OS is a fiddly task, to say the least. It's one thing to be motivated to create an OS regardless of language (such as this), vs. creating one to showcase a language. Frankly, it's harder to recruit people for the latter, and having written OS and kernel code, I can say it's work that more people talk about than actually enjoy doing over the long haul.

Also, in defense of Redox, I think it has far different goals than getting something interactive up and running quickly. SkiftOS has much more of a "toy"/fun-hacking vibe.


> I did fairly deep research on this beforehand, looking into actual pharmacological aspect of interactions with things like dopamine, etc

While you can read about mechanisms and in vitro this-and-that all day, that is not the same as quality applied clinical research, which frankly just doesn't exist in any meaningful quantity. I am a practitioner, and I am very sympathetic to the movement toward alternative treatments that run counter to established standards of care; however, at the end of the day that doesn't change the fact that the controlled research is nonexistent or poor, and things like the MAPS manual are essentially woo.

And specifically, the problem I have is that, ironically, these alternative treatments, which are supposedly trying to overcome the boundaries of established care, are extremely prescriptive themselves... they're limited, just differently, and the justifications have very little evidence-based backing.


"very little evidence based backing"

Many studies have been done, not just in vitro but in real people with just about the most severe mental illnesses there are: severe depression and PTSD. The results have been positive, and these studies have been published in peer-reviewed journals and well received by the scientific and medical community.

So can you describe what evidence you're looking for and what would satisfy you?

Also, you say you're a practitioner. I wonder what, exactly, you are a practitioner of. If it's some sort of traditional therapy, you should be aware that for PTSD and severe depression the evidence is that traditional therapy is very ineffective. So what would you suggest for those suffering from these conditions, if not psychedelic-assisted therapy, which from my reading is actually far more effective for them than traditional therapy?


Woo? Are you familiar with their clinical research/results and still saying it's woo?


That is not how safety margins work. Safety margins are meant to give a buffer for unforeseen circumstances; they are not a ticket to cheat. By this logic, why stop at a 25A breaker (30 in the US)? Why not just plug a 20A device into a 100A breaker, or use no breaker at all?

No safety margin can account for purposeful circumvention, which is what connecting a 15/20A outlet to a 30A circuit is.


You see no difference between 20 vs. 25 amps and 20 vs. 100?

I didn't say every difference was unacceptable, but a difference as significant as 20 vs. 25 should not be accepted.

If the danger gets gradually worse for every 5 amps on the fuse, that's fine. Then the excess danger at 25 or 30 amps is only a tiny fraction of the excess danger at 100 amps. Good work.

If the danger has a sudden sharp increase at a certain amperage, then that amperage threshold needs to be further away than a mere 20/25 difference. Or even 20/30.
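
To put rough numbers on that intuition: resistive heating in a fixed wire grows with the square of the current, so the hazard is anything but linear in breaker size. A minimal sketch in C (assuming a wire rated for 15 A and constant resistance; real failure modes are messier):

    #include <stdio.h>

    /* Relative I^2*R heating in a wire rated for 15 A, normalized to
     * the heat produced at the rated current. Constant resistance is
     * assumed for simplicity. */
    int main(void) {
        const double rated = 15.0;
        const double currents[] = {20.0, 25.0, 30.0, 100.0};
        for (int i = 0; i < 4; i++) {
            double ratio = (currents[i] / rated) * (currents[i] / rated);
            printf("%5.0f A -> %5.1fx the heat at %.0f A\n",
                   currents[i], ratio, rated);
        }
        return 0;
    }
    /* prints: 20 A -> 1.8x, 25 A -> 2.8x, 30 A -> 4.0x, 100 A -> 44.4x */

On this simple model the danger grows smoothly, which matches the first case: the excess heat at 25 or 30 A is a small fraction of the excess at 100 A.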


Anybody around long enough to remember Unix from the beginning, or even just the last 25 years (a tiny percentage of this site), should know that a unifying Unix, or Unix "tradition" as noted in a follow-up comment, is pretty much a myth. The tradition is whatever system you grew up on and whatever tribal biases you subscribe to; the only true Unix traditions are mostly trivialities like core shell syntax, a handful of commands, and an API that is woefully underpowered for modern purposes. And long option names are definitely not part of any tradition.

Myths like "everything is a file" or file descriptor is complete bollocks, mostly retconned recently with Linuxisms. Other than pipes, IPC on Unix systems did not involve files or file descriptors. The socket api dates to the early 80s and even it couldn't follow along with its weird ioctls. Why are things put in /usr/local anyway? Why is /usr even a thing? There's a history there, but these days I don't seem much of anything go into /usr/local on most Linux distributions.

It's also ironic to drag OS X into a discussion of Unix, because if there was one system that broke with Unix tradition (for the best, in some ways) -- no X11, launchd, a multi-fork FS, weird semantics to implement Time Machine, a completely non-POSIX low-level API, etc. -- it would be that one.

All this shit has been reinvented multiple times; the user-mode API on Linux has had more churn than Windows -- which never subscribed to a tradition. There's no issue of lack of familiarity here: the original Unix system, meant to run on a PDP-11 minicomputer, only meets modern needs in an idealized fantasy-land. Meanwhile, worse-is-better has been chugging along for 50 years while people try to meet their needs.


> more churn than Windows -- which never subscribed to a tradition.

My understanding is that Windows has always had a very strong tradition of backwards compatibility, even to the point of making prior bugs that vendors rely on keep functioning the same way for them (e.g., detect that it's Photoshop calling the buggy API, and serve it the buggy code path while everyone else gets the fixed one).

That's just as much a tradition as "we should implement this with file semantics because that's traditionally how our OS has exposed functionality".
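
To make the Photoshop example concrete, here is a hypothetical sketch of that pattern in C. The function names and the check are invented for illustration; the real mechanism is Windows' application-compatibility shim database, not literal name tests inside each API:

    #include <stdio.h>
    #include <string.h>
    #include <windows.h>

    /* Hypothetical app-compat quirk: a known legacy executable keeps the
     * old (buggy) behavior it depends on; everyone else gets the fix. */
    static int caller_is_legacy_app(void) {
        char path[MAX_PATH];
        GetModuleFileNameA(NULL, path, MAX_PATH); /* path of calling .exe */
        return strstr(path, "Photoshop") != NULL; /* invented match rule */
    }

    static long some_api_call(void) {
        if (caller_is_legacy_app())
            return -1; /* legacy path: preserve the bug the vendor relies on */
        return 0;      /* fixed path for everyone else */
    }

    int main(void) {
        printf("result: %ld\n", some_api_call());
        return 0;
    }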


> no X11

XQuartz if you want it

> completely non-POSIX low-level API

macOS has a POSIX layer.


> XQuartz if you want it

There are X server implementations for Windows, Android, AmigaOS, Windows CE!!, etc... I don't think this is relevant.

> macOS has a POSIX layer.

So do many systems, again including Windows in varying forms through the years. I think the salient issue is that BSD UNIX and "tradition" are in conflict here: the point of the original CMU Mach project was to replace the BSD monolithic kernel.


> you're doing it wrong.

What else do you suggest that is liquid and essentially risk-free? There is a purpose for this type of product. If not a high-yield savings account, what is it?


They have had these for years in OR suites to dispense surgical scrub soap. They use an air bulb and tubing rather than a mechanical linkage, since that is much more durable and cheaper to replace. These obviously present accessibility issues in public places that don't generally apply to ORs. I think hand-operated or IR dispensers are the good-enough solution in most cases, though.


> In reality you'd be one of thousands of people holding HTZ calls or TSLA puts in your Robinhood account. You could make a huge payday and be indistinguishable from the crowd.

This is utterly clueless.


I take it you haven't spent much time on wallstreetbets.


For one, having known many posters on wsb personally, I can say the number of people actually trading large positions is far smaller than the number participating in that forum.

Finally, people (apparently even the "informed" crowd here) greatly overestimate the difficulty of identifying individuals or actions in massive systems that are effectively recorded and completely surveilled, and they underestimate the resources of the feds. It's usually not that hard to whittle things down to a handful of actors, and even if there are hundreds, it takes only a tiny bit of taxpayer money to comb through them by hand.


why?


In the context of language design, the arguments for 0-based indexing rise above bikeshedding. https://www.cs.utexas.edu/users/EWD/transcriptions/EWD08xx/E...

Sure, it may not be the most critical point about language design, but it's not just bikeshedding either. For one thing, it is a language-semantics issue rather than a purely syntactic one.

I still like Lua, and maybe you don't even agree with Dijkstra's argument -- a position one can reasonably take -- but that doesn't mean it's bikeshedding, a term so overused it seems to mean "any argument I don't want to have."
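
The concrete payoff of Dijkstra's convention (zero-based indices with half-open ranges [lo, hi)) is that lengths, splits, and empty ranges all work without +1/-1 adjustments. A minimal sketch in C:

    #include <stdio.h>

    /* Half-open range [lo, hi): its length is hi - lo, [0,k) and [k,n)
     * tile [0,n) exactly, and [k,k) is a valid empty range. */
    static int sum(const int *a, int lo, int hi) {
        int s = 0;
        for (int i = lo; i < hi; i++)
            s += a[i];
        return s;
    }

    int main(void) {
        int a[] = {1, 2, 3, 4, 5};
        int n = 5, k = 2;
        /* splitting at k needs no off-by-one bookkeeping: */
        printf("%d == %d + %d\n", sum(a, 0, n), sum(a, 0, k), sum(a, k, n));
        return 0;
    }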


Roberto Ierusalimschy makes an interesting point on this in an interview[1] last year.

> When we started Lua, the world was different, not everything was C-like. Java and JavaScript did not exist, Python was in an infancy and had a lower than 1.0 version. So there was not this thing when all the languages are supposed to be C-like. C was just one of many syntaxes around.

> And the arrays were exactly the same. It’s very funny that most people don’t realize that. There are good things about zero-based arrays as well as one-based arrays.

> The fact is that most popular languages today are zero-based because of C. They were kind of inspired by C. And the funny thing is that C doesn’t have indexing. So you can’t say that C indexes arrays from zero, because there is no indexing operation. C has pointer arithmetic, so zero in C is not an index, it’s an offset. And as an offset, it must be a zero — not because it has better mathematical properties or because it’s more natural, whatever.

> And all those languages that copied C, they do have indexes and don’t have pointer arithmetic. Java, JavaScript, etc., etc. — none of them have pointer arithmetic. So they just copied the zero, but it’s a completely different operation. They put zero for no reason at all — it’s like a cargo cult.

[1] - https://habr.com/en/company/mailru/blog/459466/
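
The offset-vs-index distinction is visible right in the language: C defines a[i] as *(a + i), so the "index" is really a displacement from the array's address. A small illustration:

    #include <stdio.h>

    int main(void) {
        int a[] = {10, 20, 30};

        /* [] is sugar for pointer arithmetic: a[i] means *(a + i). */
        printf("%d %d\n", a[1], *(a + 1)); /* 20 20 */

        /* The definition is symmetric, hence the famous oddity: */
        printf("%d\n", 1[a]);              /* also 20: 1[a] is *(1 + a) */
        return 0;
    }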


I'm confused. Lisp does 0-based indexing. Lisp predates C and C-like languages by decades.


They don't say that C was the first. The claim is that C's influence was responsible for the proliferation.

(Bit of a tangent: Lisp predates C by at most 2 years if you count McCarthy's original 1960 paper, but afaict their respective implementations both got their first public distribution in 1962. Of course, Lisp was set to gain momentum from that time until the AI winter, while C was practically confined to UNIX until the 80s...)


Historically, everyone who studied computing seriously learned assembly language, and it was much more widely used.

In early computing, assembly language saw a lot of use. The navigation program that sent the Apollo missions to the Moon was written in assembly language. During the microcomputer boom, which echoed the history of the big-iron boom, a lot of applications and systems were again written in assembly language. Most commercial video games for 8-bit micros were written in assembly language. The famous WordPerfect word processor for the IBM PC was assembly language. The VMS operating system: assembly language.

C easily became popular because it gave a nice notation, and a sprinkling of types, to the assembly-language memory-manipulation concepts that most professional programmers already knew how to use.


C wasn't from 1972? Basically a decade later?


Oof yes, that's correct. The manual is dated 12 June 1972, which seems to be the release date.


Lisp isn't the reason modern-day languages are all offset-indexed, though; that's C's fault.


Are you bikeshedding bikeshedding??

Just kidding; I actually agree with your point that the term is just a way to show you don't want to have a conversation by diminishing the value of anybody having it.


Fwiw Dijkstra practically invented bikeshedding


bikeshedding considered harmful


Frankly, garbage. Serious bakers are capable of a lot more consistency than you realize.

> It turns out that not all eggs are the same.

You can weigh eggs, or, more often, it's sufficient to adjust the other liquids to offset the variation in eggs -- a lot of baking isn't just about taste but about consistency and proper proportions, to get repeatable texture and density. It is also, generally speaking, quite sufficient to measure liquids by volume, since room-temperature differences are controlled closely enough that they don't matter.

The difference in weight of a cup of water between 20 °C and 25 °C is negligible.

A cup of flour, on the other hand, can vary in actual material by over 20% because of numerous variables, from clumping to the type of flour.
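
To put numbers on both claims, here is a quick check in C (the water densities of about 0.99821 g/mL at 20 °C and 0.99705 g/mL at 25 °C are standard figures; the 120-150 g range for a cup of flour is a typical spread, not a measurement):

    #include <stdio.h>

    int main(void) {
        /* Water: density barely moves across kitchen temperatures. */
        const double cup_ml = 236.6;                   /* US cup */
        const double rho20 = 0.99821, rho25 = 0.99705; /* g/mL */
        double g20 = cup_ml * rho20, g25 = cup_ml * rho25;
        printf("water: %.1f g vs %.1f g (%.2f%% apart)\n",
               g20, g25, 100.0 * (g20 - g25) / g20);

        /* Flour: a "cup" can hold wildly different masses. */
        printf("flour: 120-150 g, a %.0f%% spread\n",
               100.0 * (150.0 - 120.0) / 150.0);
        return 0;
    }
    /* water: 236.2 g vs 235.9 g (0.12% apart); flour: 20% spread */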

> I don’t know how many times I’ve seen someone in a video boast about measuring the flour by weight for their bread, only to add a completely unmeasured amount when flouring the working surface or their hands.

Basically bullshit again, because in most cases the flour on the working surface won't amount to even 1% of the final product, whereas, as stated, volume vs. weight can make a double-digit difference.

Your whole "10 gram" of sugar claim is a straw-man.

Baking and pastry tend to require a lot more precision than basic cooking to get repeatable, edible results. It's one reason pre-made cake mixes and Bisquick are so popular, even with professional chefs.


There's a relatively old FOSS "band-in-a-box" program: https://www.mellowood.ca/mma/


Neat example of this (someone posted this as a response on Twitter):

https://www.youtube.com/watch?v=bvMW71BJofc

