This is really cool. And as OP pointed out, I really like the pipeline integration: like when linting catches function-level complexity, but across module boundaries. I prefer to think of programs in layers where the top layers can import lower layers, but never the other way around (and I'm also very cautious about horizontal imports). Something like this would help track that. Unfortunately, I'd really need it to support Go. I find it interesting that the code is written in Go but doesn't support Go. But I will watch this project.
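To make that layering rule concrete, here's a minimal sketch of the kind of check I'd want a pipeline to run (package names and layer numbers are made up, and this isn't how dep-tree actually works, just the idea):

    package main

    import "fmt"

    func main() {
        // Lower number = lower layer. Imports must always point downward.
        layer := map[string]int{"store": 0, "service": 1, "handlers": 2}
        imports := map[string][]string{
            "handlers": {"service"},  // fine: layer 2 -> 1
            "service":  {"store"},    // fine: layer 1 -> 0
            "store":    {"handlers"}, // violation: layer 0 -> 2
        }
        for pkg, deps := range imports {
            for _, dep := range deps {
                // >= also flags horizontal (same-layer) imports, per the caution above.
                if layer[dep] >= layer[pkg] {
                    fmt.Printf("violation: %s (layer %d) imports %s (layer %d)\n",
                        pkg, layer[pkg], dep, layer[dep])
                }
            }
        }
    }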
From the visualization perspective, it reminds me a lot of Gource (https://github.com/acaudwell/Gource). Gource is a cool visualization showing contributions to a repo: you see individual contributors buzzing around, updating files commit by commit and merge by merge.
The visualization is actually inspired by Gource, but taken into 3D space. It's a really cool project.
Golang is very challenging to implement because dependencies between files inside a package are not explicitly declared: you can use any function from any file without importing it, as long as both files belong to the same package. So supporting Golang would probably require spawning an LSP and resolving symbols.
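To show what that means in practice, here's a hypothetical two-file package where one file silently depends on the other with nothing in the source declaring it, which is exactly the edge a file-level dependency graph can't see without symbol resolution:

    // a.go
    package store

    // UserNames is defined in a.go.
    func UserNames() []string {
        return []string{"alice", "bob"}
    }

    // b.go -- same package, different file
    package store

    // Count calls UserNames with no import or any other declaration
    // linking the two files; the file-to-file dependency is implicit.
    func Count() int {
        return len(UserNames())
    }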
The reason for implementing dep-tree in Go was that things were going to get algorithmic af, so it was better to choose a language that's as simple as possible, knowing that it also needed to be performant.
If Go treats all files inside a package the same, maybe you should use packages as the "unit" in Go instead of files? That would probably still be useful, at least for bigger projects...
Yeah, that's an option. It's not a perfect fit with the philosophy of the project, but definitely possible. Ideally, though, it would just work between files in a package.
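If you did go the package-as-unit route, a rough sketch of what it could look like with the standard library (the directory path is a placeholder, and this only collects declared imports, so it deliberately ignores the implicit file-to-file edges):

    package main

    import (
        "fmt"
        "go/parser"
        "go/token"
    )

    func main() {
        fset := token.NewFileSet()
        // Parse only the import declarations of every .go file in one directory.
        pkgs, err := parser.ParseDir(fset, "./internal/store", nil, parser.ImportsOnly)
        if err != nil {
            panic(err)
        }
        for name, pkg := range pkgs {
            seen := map[string]bool{}
            for _, file := range pkg.Files {
                for _, imp := range file.Imports {
                    seen[imp.Path.Value] = true // quoted path, e.g. "\"fmt\""
                }
            }
            fmt.Printf("package %s imports:\n", name)
            for path := range seen {
                fmt.Println("  ", path)
            }
        }
    }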
Another 3D codebase tool is Primitive. They were pushing it almost into IDE territory, but I'm not sure anything got beyond beta. Maybe with Apple Vision Pro they'll take another swing at it...
Mostly finished. However, this kind of software takes advantage of the accessibility features of each OS to emulate mouse and keyboard input, and clipboard access is also required. So as each OS changes the requirements for accessing those features, someone has to keep updating the software.
Synergy doesn't work on Wayland, so I can't use it on Fedora anymore (unless I switch it back to X).
There are always new feature requests. Drag-and-drop for files is a common one. I personally think that's scope creep, but I can see the appeal. Synergy and Barrier already establish an encrypted connection between machines, so copying a file over it seems a good fit. At the very least, a "Synergy send to ${computer}" share/send-to option would make sense.
Here's Synergy's roadmap. Since Synergy and Barrier are the commercial and open-source forks of the same ancestor, Barrier has probably received similar requests over time.
I liken this more to how vscode operates. I use that to develop remotely inside a vdi (or inside a local vm over ssh). It will install an instance of itself on the remote host and then it's like I'm operating "locally" out of that host. For work, I use one vm for all my development. For personal, I have a desktop computer I dedicate to development that I often access remotely via laptop/vscode.
But I wouldn't use vscode to connect to all the random hosts I need to reach throughout the day, especially if it's going to drop software onto them.
Where I might consider a tool like this is if I connected to something along the lines of a jumpbox: some primary host from which I launch most of my other work. Once it opens on that jumpbox, I could then ssh out from there. That would possibly make the session sidebar useless - unless I had a "jumpbox" per client/region/project.
Long before mobile smartphones, truckers were using apps like CoPilot GPS on their offline laptops for just this. I used to have it on a laptop for install work I did. In fact, I continued using it in areas with no cell coverage. Another app I used was Microsoft Streets and Trips; in 2006, they added a GPS version that did turn-by-turn nav.
That said, the point still stands. I did some searching, and Streets and Trips is discontinued. CoPilot was bought by Trimble, and AFAIK you can only get the mobile app versions (they still have offline support). I think their truck fleet software will still run on laptops, but it's sold as part of a bulk purchase.
I did find MapFactor, which I've never used; it works on Windows and Linux (via Wine) and provides real-time GPS navigation.
There are still more than a few great turn-by-turn navigation libraries that run on ruggedized Windows tablets. There are two in particular whose names escape me right now.
Quite often they work completely offline, and also cover more information related to topography and driving routes that may not be traditional roads on existing maps. For example, the path to a ___location in a farmer's field, etc.
Same here. I started with k8slens, but moved to k9s when k8slens started requiring a login. I didn't know about Openlens, which seems to be a fork of the open source portion of k8slens. But I've gotten pretty good at k9s now, so I'll probably stick with it.
In either case, both tools have made me less proficient with kubectl. Now I mostly use kubectl to apply yaml manifests. Editing a manifest in k9s is not as straightforward as I would like, and it's nicer to have a local copy of the file.
Outdated PHP is a big thing. So many times I've pulled a PHP file from an old server and it fails to run on the new one because some methods were deprecated back in twenty-dickety-two. Which isn't a criticism of the language, just how things go. You either have to rewrite it for a modern version or run an unsupported version of PHP.
When I was young and optimistic, I was doing a bunch of side IT jobs. At one, a local insurance company needed to replace their Windows NT server. Instead of going with Windows 2000, I talked them into me setting up a RedHat Linux server running Samba. I had a few hiccups as the workstations weren't actually connected to a ___domain originally, but I eventually got them all going with AD login, roaming profiles, tape backups, etc. The big selling point was the open source nature, free updates forever.
Less than six months later, RedHat announced that they were going to discontinue RedHat Linux and start releasing their new RedHat Enterprise Linux. This left me rather angry and embarrassed. It was definitely a turning point in my understanding of FOSS. It also made me a lifelong Debian/Debian-derivative user.
I've supported RHEL professionally, even getting RHCEs in it. RedHat has also contributed a lot to the open source community. But I've certainly never forgotten that first pivot, so I'm not surprised by their recent decisions regarding RHEL's source code.
I understand embarrassment, but anger might be misplaced. Red Hat was a newly public company trying to turn a profit. It identified its market and Red Hat Linux wasn't serving it, and the model they were pursuing with Red Hat Linux wasn't working.
But I am a solid supporter of the "pay for RHEL or use Debian" philosophy. If you need promises about the future, pay for RHEL or use a project that doesn't have commercial motives. Debian is great and I wish more companies would standardize on it and support it.
I'm not such a fan of the middle road of hoping that vendors will continue supplying things of value for free. It's especially ironic that an insurance vendor got burned by placing a bet on a free operating system with no assurances whatsoever. The RHEL subscription is insurance.
Both Canonical and Ubuntu have their own share of problems, and they've been struggling to monetize Ubuntu for the past few years. Once you get burned by one commercial vendor (not once actually… it's the third time afaik), it's probably wise to use this opportunity for migrating to a more stable distribution where commercial interest doesn't have much influence, and which has never intentionally burned its users (since Debian developers are users too and are also doing it for themselves).
Heh. Saw the edit, but responding anyway. Canonical has its own issues. I can overlook some, others (like forcing Snap on users) have put me off a lot.
Ubuntu did a lot to popularize Linux and make the Linux desktop experience more usable. It struggled for a long time to figure out how to monetize that, though. They seem to be profitable now[1], claiming revenue growth to $205.4m and an operating profit of $44m, with a headcount of 858 (up from 705 the prior year).
The "Ubuntu Pro" move this year (which also raised many hackles, briefly) will probably pad the coffers a bit more.
> The big selling point was the open source nature, free updates forever. Less than six months later, RedHat announced that they were going to discontinue RedHat Linux and start releasing their new RedHat Enterprise Linux.
I wish there were legal consequences when companies lied about future prices of things or durations of support.
Notes and presentations from various HEPiX conferences around 2003/2004 will reveal the reaction to academic licence fees for RHEL, and the birth of Scientific Linux as an EL rebuild.
Remember these were/are publicly funded projects with budget time-lines.