Hacker News | markjgx's comments

> It's been flawless, more battery and memory efficient than Chrome

Is that actually true?


It was true for me, specifically for my workflow, the websites I use and how I leave some specific tabs in the background.

I used Chrome for many years before Firefox, and it always prioritized JS responsiveness even when the app was in the background and not needed, so it consumed CPU cycles and battery power needlessly. I see now that Chrome enables a Low Power mode by default on battery, and it's unusable as scrolling gets janky. I don't know if the overall experience on Chrome has gotten better in the last year.

Not sure what's different about memory, though, but Chrome always appeared to be a memory hog when I tested both browsers side by side on the same set of websites with the same few extensions. Could be that it just caches more and that's benefiting responsiveness.


True for me too, much more memory efficient, especially with content heavy websites.


"Surfer: The World's First Digital Footprint Exporter" is dubious—it's clearly not the first. Kicking off with such a bold claim while only supporting seven major platforms? A scraper like this is only valuable if it has hundreds of integrations; the more niche, the better. The idea is great, but this needs a lot more time in the oven.

I would prefer a CLI tool with partial gather support. Something that I could easily set up to run on a cheap instance somewhere, have it scrape all my data continuously at set intervals, and then give me the data in the most readable format possible through an easy access path. I've been thinking of making something like that, but with https://github.com/microsoft/graphrag at the center of it: a continuously rebuilt GraphRAG of all your data.
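The gather loop described above could be sketched roughly like this (a minimal sketch in Rust; the `Source` type and `run_once` function are hypothetical names, not from any real tool):

```rust
// Hypothetical sketch of an interval-driven gather loop: each
// "source" is a closure that exports one service's data.

type Source = Box<dyn Fn() -> Result<String, String>>;

// Run every source once and count how many exports succeeded.
fn run_once(sources: &[Source]) -> usize {
    sources.iter().filter(|s| s().is_ok()).count()
}

fn main() {
    let sources: Vec<Source> = vec![
        Box::new(|| Ok("github: 12 repos exported".to_string())),
        Box::new(|| Err("twitter: rate limited".to_string())),
    ];
    // A real tool would sleep for the configured interval
    // (e.g. six hours) between passes instead of running once.
    println!("{}/{} sources gathered", run_once(&sources), sources.len());
}
```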


Take a look at https://github.com/karlicoss/HPI

It builds an entire ecosystem around your data, where the data is programmatic rather than just dumped text files. The point of HPI is to build your own stuff on top of it, and it all integrates seamlessly into one Python package.

The next stop after Karlicoss is https://github.com/seanbreckenridge/HPI_API which creates a REST API on top of your HPI without any additional configuration.

If you want to get fancier / more antithetical to HPI, you can use https://github.com/hpi/authenticated_hpi_api or https://github.com/hpi/hpi-graph so you can theoretically expose it to the web (I am squatting the HPI org; I am not the creator of HPI). I made the authentication JWT-based, so you can create JWTs that grant access to only certain services' data. (Beware: hpi-graph is very out of date and I haven't touched it lately, but my HPI stuff has been chugging away downloading data.)

Some of the /hpi stuff I made is a bit of a mishmash because it was rip-and-replace from a project I was making, so you'll see references to "Archivist" or things that aren't local-first and depend on Vercel applications.


The number of built-in platforms isn't necessarily the problem. The best systems are those that establish a plugin ecosystem.


While I agree that it's not the first, I think it's unfair to say that it's not valuable without hundreds of integrations.


Yeah, it was honestly more of a marketing statement lol, but we're removing it for sure. Adding daily/interval exporting is one of our top priorities right now; after that and making the scraping more reliable, we'll add something similar to GraphRAG. Curious to hear what other integrations you would want built into this system.


Some players will also be convinced that you're cheating if you play the game really, really well, and they will get upset. CS:GO (now CS2) has a very fascinating way of determining whether someone is cheating: an ML-based heuristic that is constantly retrained and can accurately judge whether someone is actually cheating based on their Overwatch (not the game) replay system. https://www.youtube.com/watch?v=kTiP0zKF9bc


> Managing build configurations...

In terms of package management, you can apply rules to which crates you want to include, including platform-specific constraints:

  [target.'cfg(target_os = "linux")'.dependencies]
  nix = "0.5"
On the code side it's pretty much the same as C++. You have a module that defines an interface and per-platform implementations that are compiled in depending on a conditional-compilation attribute like #[cfg(target_os = "linux")].
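A minimal sketch of that pattern (the module and function names here are made up for illustration): each platform gets its own `mod platform`, and the compiler selects exactly one at build time, so callers use a single path everywhere.

```rust
// One public interface, per-platform bodies selected at compile
// time via #[cfg] attributes on the module definitions.

#[cfg(target_os = "linux")]
mod platform {
    pub fn os_name() -> &'static str { "linux" }
}

#[cfg(target_os = "macos")]
mod platform {
    pub fn os_name() -> &'static str { "macos" }
}

// Fallback so the sketch still builds on other targets.
#[cfg(not(any(target_os = "linux", target_os = "macos")))]
mod platform {
    pub fn os_name() -> &'static str { "other" }
}

// Callers never mention the platform; cfg already picked the impl.
pub fn describe() -> String {
    format!("running on {}", platform::os_name())
}

fn main() {
    println!("{}", describe());
}
```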

https://github.com/tokio-rs/mio/blob/c6b5f13adf67483d927b176...


Glad this was flagged. A lot of people have the misconception that managed languages are slow compared to your regular ol' binary program; these days that couldn't be further from the truth. In traditional high-performance C/C++ development you have to manually split your code into hot and cold paths, and static-analysis optimization can only go so far.

Do you want to inline this function in your loop? Yes and no: inlining might take up valuable registers in your loop, increasing register pressure. Time to pull out the profiler and experiment, wasting your precious time.

Managed languages have the advantage of knowing the landscape of your program exactly: that additional level of managed overhead helps the VM automatically split your code into hot and cold paths, and access to your program's runtime heuristics allows it to re-JIT your hot paths, inline certain functions on the fly, etc.


hckrnews.com + DarkReader. DarkReader is such a good extension, couldn't live without it!


"and it looks like they're finally improved on what was a terrible, clunky, extremely dated UI." doesn't hold true; it's mostly the same UI with a dark reskin. Certain editor hot paths have been reworked, and that's about it. The editor is almost exclusively written in Slate, Epic's in-house windowing framework / general GUI module. Slate has a pretty interesting nested macro system. It's most definitely not data-driven and pretty much as hard-coded as it gets, so redesigning the editor for real would be difficult, to say the least. In reality, most experienced developers don't want a new design; they are happy with the workflow they have, and I have to agree with them. I'm generally happy with the "if it ain't broke don't fix it" reskin decision.


Hey there, this looks great. I was wondering: why DeepSpeech 0.6? Why not the latest version, DeepSpeech 0.9?


I need to cycle back and update voice2json. Rhasspy (the full voice assistant) supports DeepSpeech 0.9.3.


Awesome, thanks.


What does HN think of Model M replicas? Had a few mechanical keyboards over my life but my Model M replica feels the best.


It's infeasible for most people, though, because of the incredible noise it generates. (Yeah, I get it, the sound's awesome and it's a feature, but you're not going to use it at work or at home with other people.)


I started out by writing a raytracer; graphics programming is really rewarding. https://raytracing.github.io/books/RayTracingInOneWeekend.ht...

