Even if this were technically possible with a standard subscription, I'm curious how you think trying to sell access would go, how seriously you think the people you approach would take you, etc.
I share your frustration, so I love that it's so fast and lightweight. Cool project. I personally don't think I would have much use for a built-in terminal. I would just like it to let you browse through the other images in the same folder as the one I opened, and to navigate them with the keyboard.
Images should also resize whenever the window is resized. Those two changes alone would make this very usable.
This looks nice and lightweight. It would be sweet to be able to import an OPML file, which is what most podcast services let you export.
EDIT: Noticed a little bug on Windows: you can't add a podcast while one is playing, at least not by pasting the address; it immediately wipes out the field. I had to pause playback in order to add another.
I have personally been hit in the head by a small rock and almost died.
It is about accuracy and how fast the other person can react or protect themselves. I would say that guns like these exceed throwing a rock by a wide margin on both counts. You can also fire many times in a short span.
Firearms are effective for that particular reason. They are accurate, and there is no way to protect against them because they are so fast to use. And you are not limited to a single throw.
He was overestimating the threat (Ed: or suggesting throwing it is more dangerous than firing it); the issue is that capacitors suck.
A 95 MPH fastball, which can seriously hurt, carries about 150 J of energy. Meanwhile a professionally made coil gun using a lot more capacitors hits about 10 J, and this one isn't nearly as powerful. https://en.wikipedia.org/wiki/GR-1_%22Anvil%22
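For reference, a rough back-of-the-envelope check of the fastball figure, assuming a regulation ~145 g baseball (the mass is my assumption, not from the comment):

```python
# Kinetic energy of a 95 mph fastball, assuming a ~0.145 kg regulation baseball.
mass_kg = 0.145
speed_ms = 95 * 0.44704                # mph -> m/s, about 42.5 m/s
energy_j = 0.5 * mass_kg * speed_ms ** 2
print(f"~{energy_j:.0f} J")            # ~131 J, the same order as the ~150 J cited above
```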
This was a long time coming and well announced (Steam showed a big, hard-to-miss warning whenever it was opened on Windows 7, for pretty much the entire year), but it's yet another reminder that when it comes to digital libraries (of games, apps, music, movies, books, etc.), your "ownership" of your titles depends on countless variables.
Even though GOG's own client (GOG Galaxy) has required Windows 10 for even longer, you could always just download the games from their website and manually install them, no problem.
Even if that were possible, the GPL license makes no sense for artwork or music, and is quite possibly legally unenforceable in such a context. You need a combination of GPL + Commons; and how many games are licensed that way?
But even then, GPL + Commons does not give you trademark rights, only the ability to reuse the assets under a different name. So unless you have GPL + Commons + Trademark, do you really have ownership?
But hold on: in Japan and in the US, game mechanics can be patented. So who cares if you have the code, assets, and trademark, if you don't have patent rights? I suppose you need GPL + Commons + Trademark + Patent Assignment; or maybe you swap out GPL for Apache 2.0, which at least includes a patent grant.
Now hopefully whoever made the artwork doesn't sue for unpaid royalties. You're relying on the declared licenses, but it's still possible that whoever made the game lied in one way or another. It's also possible there are applicable patents owned by other companies which weren't disclosed.
The point is: even with GPL code, it's still a long way from being "your game." I didn't even mention middleware like Havok Physics or the Unity engine, which would render your GPL game code pretty useless without a proprietary attachment, if using the GPL license at all is even legal with such a combination.
It's easy enough to maintain ABI compatibility layers for games and run them indefinitely, even cross-platform (e.g. Wine, emulators, etc.).
I don't demand GPL rights over the movies I watch, or the books I read, so I'm not sure why I'd require that for games. Of course source-available would be better than not but it's not a hill I'm willing to die on. I'd rather play some good games.
gpl is still subject to bit rot, which is likely the reason there aren't more gpl games
there’s misaligned incentives between the people writing the game and the people wielding c compilers as political weapons.
i swear every time i can't compile code it is not clear which aspect of c failed: the dynamic linker working around gpl limitations as technical debt, the kernel itself using a more advanced c with backwards-breaking changes, the code being written for the wrong architecture triple, or the code having been written for a different c compiler altogether
i don’t actually try and fix it because bit rot only gets worse if i notice the problem.
At least for Linux, it might make sense to containerize the build environment and store it. That way, if it doesn't build on the current system next time, one can fall back to the previous environment that did work.
i use linux on a daily basis and i will say microsoft’s solution to containerization was the .exe
linux has appimage, but containerization also falls prey to gpl.
exe, appimage, and containerization are ideologically opposed to the gpl... errr, the other way around. gpl essentially requires compiling from source, which is great, but the intention behind that is to disrupt software supply chain distribution. most people want to be able to upgrade their computer, which is architecturally challenging with gpl, and that's where nix, guix and the like can politely solve the compilation and distribution problem
but the core problem is my mom wanted a picture of me to know i’m alive, which is now entirely irrelevant to the topic at hand.
Isn't that the kind of description that fit the first iteration of many innovative products? The iPad was mocked just as much, if not more: it was a big iPhone without the phone, with no multitasking and a plethora of other limitations, and yet now tablets are ubiquitous.
It's not hard to see how this product could continue to be streamlined and made more accessible in the future.
Apple showing up to the party is usually a pretty good indicator of a technology having crossed a maturity threshold: smartphones, tablets, smart watches, wireless earbuds, TV streaming devices, ARM laptops, etc.
Even their “misses” have just been devices that were too niche or bad value propositions for the average consumer, rather than being technically immature (thinking of HomePod here). It’s rare for Apple to launch a device that’s just far too early to be useful even to its target audience.
It's not really the first iteration, though. The modern VR era started about eight years ago with the first consumer Oculus Rift, and in that time the idea has been iterated on numerous times by numerous players, and none of them have stuck.
"The iPad was mocked" is irrelevant. Many or most products are mocked by some people, even iPhone. Regardless of mocking, iPad was an immediate success. Vision Pro is not. I fully admit that the price of Vision Pro is the biggest problem. But you can't pretend that the first iteration of Vision Pro is just like the first iteration of iPad.
My point is that they are two very different products with substantially different target audiences.
Now, sure, you can say the Vision Pro was not as big a success as the iPad even if you account for that difference in markets, scale, price ranges, etc. But that doesn't mean it's a total failure either, or that there is no future for the product.
Most people who have a Vision Pro seem to like it. It's unsurprising that it's not flying off the shelves: at the moment it's little more than an expensive toy, once the novelty wears off there isn't that much to do with it, and it's also seemingly uncomfortable to wear for prolonged periods of time. But like I said, it's not hard to see how it could get better with future iterations.
So even if there is no perfect correlation between the shortcomings of the first iPad and the larger shortcomings of the first Vision Pro, there is a correlation.
Putting aside my bitter cynicism about "Apple hype culture": I do think VR just needs to wait for the tech to reach "put on snow goggles" levels of ease before we get wide adoption. But I'm also in the camp that doesn't see this being a market with desperate demand. The iPad is a great example because in many ways it's the same: some people read religiously on it, others are artists and Apple catered to that market. Then others just use it as a "cheap" computer to put in front of a kid.
These are diverse markets, but far from the general market. I think VR/AR will end up the same.
> These are diverse markets, but far from the general market.
What do you mean by the general market?
iPad has more unit sales than Mac. It's a massive market. The last time Apple reported unit sales, back in 2018, iPad was selling over 43 million units per year.
tablets are everywhere, can be shared, and do not make people look ridiculous. hell, even my cats have apps made just for them. haven't seen any viral videos of pets wearing a headset.
not being able to see how this is different is very disingenuous. when the ipad was released, nobody had a device like that. apple's headset was not the first, and even those that came before it did not gain a lot of traction. so apple is not blazing new trails here that people just don't understand yet. this is accepted as a niche product line for certain personalities.
They are now, and that's exactly the point. People don't look ridiculous now because tablets became widely adopted, but even the first versions of mobile phones made people look ridiculous.
Have some perspective; try to think across a span of more than two years, both backwards and forwards.
My 70+ year old aunt and (at the time) 75-year-old mom were using the very first (heavy and clunky) iPad. It was a device that immediately appealed to certain people for whom full-on computing was too much, when all they wanted to do was read newspapers, browse websites, and look at photos.
I cannot see this with a VR headset. It's a very geeky limited market, no matter the price point. But ESPECIALLY at Apple's price point.
So, what exactly is the point of this line of argument? That some niche forms of touchless interfacing existed already? And thus the interfacing of the Vision Pro is not innovative?
Reminds me of the Magic Leap. Or even the Kinect. That use case is even more niche than VR, but set up some tracking gloves and you can perform gesture-based actions on your PC (you don't really NEED the gloves, but they improve precision without needing a special spatial camera).
The UX itself is an iPhone-vs-Blackberry style leap compared to every other AR or VR device out there. It's just a fundamentally better paradigm for basic tasks and for mixing a headset (or future iGlasses) with non-VR activities.
Pretty interesting. If you add support for Pro Tools and Logic Pro, you'll have the attention of industry pros. Other important DAWs to consider supporting are Studio One, Cubase, Nuendo, and Adobe Audition (this one should be easy, their session files are basically XML).
To the extent that it does work, this works because it translates project formats between VST hosts, where you might run the same plugin on both sides, which can then read its own data payload in the output project.
But neither ProTools nor Logic supports those VST plugins, and the embedded data for AAX and AU plugins are each encoded differently. There's no guarantee -- and generally slim likelihood -- that you could coax an AU or AAX version of some plugin to read data written by its VST sibling, and likewise any other combination. Even if you could pull it off in some cases, you can't get it consistent enough for professional use.
This is sure to be a super useful project for some people, but it can only reach so far.
I don't see why it should be untenable. You can have equivalent parameters on identical plugins regardless of which plugin protocol they're built for, and the same plugin in AU could produce a signal that nulls against the exact same processing applied by its AAX or VST versions. So there has to be a way to translate between them. And it's a very worthwhile problem to solve, perhaps the most important one when it comes to converting sessions between DAWs.
Plugins encode their state as a combination of automation parameter values managed by the DAW and opaque binary blobs for everything else.
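As a loose illustration only (not any real host's format; all field names below are invented), the split between what a host can see and what it can't might look like this:

```python
# Hypothetical sketch of how a host might store one plugin instance in a session.
# The parameter values are host-visible and could in principle be mapped across
# formats; the "state_chunk" is an opaque blob only the plugin itself can decode.
plugin_instance = {
    "format": "VST3",                       # the AAX / AU builds may serialize differently
    "automatable_params": {                 # host-managed, typically normalized values
        "threshold": 0.42,
        "ratio": 0.75,
    },
    "state_chunk": b"\x02\x00\x1f...",      # opaque, plugin-defined serialization
}

# A converter can copy automatable_params, but it has no reliable way to
# reinterpret state_chunk for the AAX or AU build of the "same" plugin.
```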
In the general case of professional-grade use, you can't assume that two versions of the same plugin will handle any of those in compatible ways. In some cases they might, but there's nothing that guarantees it and many known cases where it doesn't hold.
And speaking to your broader point of "there has to be a way to translate between the two" -- this is likely true in a formal sense, but not a practical one. If it was 2030 and your job was to revive some dead AAX plugin and open its projects in some new one that you were writing for AUv4, you could do a whole bunch of bespoke debugging and forensics to make it happen. But there's no solution for the general case, as applies to a converter like this.
Also, there is at least one synth (Serum) where the developer declined to document the file format of the presets:
"I reached out to Steve through Xfer's forum and he was prompt and helpful. Unfortunately, the .fxp file format is completely dependent on the source code, and he can't release a spec for it without making the code open source. Which probably isn't happening any time soon."
VST plugin state is a binary blob owned by the plugin (the host's job is merely to read/write from/to disk when required). This is because the plugin is free to have hidden state that the host does not know about.
AAX and VST and AU (and other) plugin formats do not require any kind of interoperability in the way their plugin state is serialized and deserialized. Pulling the VST binary blob from a session in one DAW and giving it to the "same plugin" in AAX format running in ProTools is not even a thing.
Not only that, but even identifying "the same plugin" is far from trivial, because different plugin formats use entirely different models for identification. The fact that the session in DAW A uses a VST format plugin identified as "XXXX-YYYY-ZZZZ" gives you no clue how to identify the equivalent plugin in LV2 or AAX or AU formats.
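To make the identification problem concrete, here is a hypothetical sketch of how one and the same compressor might be identified under different formats (every code, name, and URI below is invented for illustration):

```python
# Invented identifiers for the "same" compressor across plugin formats.
# The schemes don't map onto each other, so finding the equivalent plugin
# requires a hand-maintained cross-reference table.
same_plugin_ids = {
    "VST2": "Cmp1",                                   # single 4-character code
    "VST3": "5653543F-0000-0000-0000-000000000000",   # 128-bit class ID
    "AU":   ("aufx", "cmp1", "Acme"),                 # type / subtype / manufacturer codes
    "LV2":  "http://example.com/plugins/compressor",  # identified by a URI
    # AAX uses yet another vendor/product ID scheme, omitted here.
}

# A converter needs an explicit mapping like this for every plugin it wants
# to translate, and even then only the host-visible parameters carry over.
cross_reference = {("VST2", "Cmp1"): ("AU", ("aufx", "cmp1", "Acme"))}
```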
Ah, I see, so it's a matter of not being able to get the data out of the plugin in the first place, in a way that's usable outside the DAW.
So, even if it were somehow possible, say, to reverse engineer one specific plugin across AAX, VST and AU, you'd still be no closer to solving true cross-compatibility between AAX, VST and AU in general.
The assumption here is that we care about plugin state. I have audio projects with minimal plugins, and the first job is converting the track structures (including cropped/transformed WAVs). Plugins are replaceable in many cases, but it perhaps depends on the genre and composition style you are targeting.