'Awful' seems awfully subjective there. Can Skype even reasonably be grouped with Teamspeak? If all you want to do is quickly chat with a group, is Discord not actually far more annoying, all else equal?
Windows isn't beating out Linux in the desktop space because Linux is awful. Network effects are much more significant.
Not that I'd call myself a buff, but I don't think my knowledge of larger numbers makes 10^24 a small number in any absolute sense. Cryptographic keyspaces are deliberately designed to be unfathomably large, whereas the number of stars in the universe is simply an observational fact. The number of stars really is huge in a human sense, as much as that's worth anything. There are more stars than there are grains of sand, etc.
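For a rough sense of scale, here's a back-of-the-envelope comparison. The 128-bit keyspace is just a representative modern key size I'm picking for illustration, not something from the discussion above:

    # Back-of-the-envelope, in Python. 10^24 stars is the figure from above;
    # 128 bits is simply a representative modern key size.
    stars = 10**24
    keyspace = 2**128
    print(f"stars            ~ {stars:.2e}")             # ~1.00e+24
    print(f"128-bit keyspace ~ {keyspace:.2e}")          # ~3.40e+38
    print(f"keys per star    ~ {keyspace / stars:.2e}")  # ~3.40e+14

So even with that rather generous star count, each star would have to account for hundreds of trillions of keys before the two are comparable, which is roughly what I mean by 'deliberately unfathomable' versus 'observationally huge'.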
The fact that we exist to discuss these odds means that whatever the probability distribution, at least one instance of life has occurred. Not only that, but life arose and eventually led to intelligence at our level - something that appears to be rare even on our own planet, but achieved relatively quickly all things considered (only a few hundred million years).
While the anthropic principle guarantees that we observe intelligence as we're defining it (since we include ourselves), I agree that doesn't mean intelligence is inevitable or common. A more likely model, in my opinion, is that worlds of microbial life are abundant, worlds with complex multicellular life are rarer, and intelligent civilizations are rarer still. Given the distribution of intelligence levels on Earth, it seems unlikely that we simply passed every constraint while no other planet gets close. Also, if we observed a planet with humans as they were 100,000 years ago, would we even consider them intelligent life? Probably just as intelligent as modern-day humans if raised the same, but literally nowhere near our technological level.
When scientists evaluate whether soil can support a certain thing, they don't treat each factor (like pH, moisture, nutrients, microbial conditions) as independent hurdles that must be overcome one by one. Instead, they see that multiple factors interact in complex ways. A deficiency in one area (e.g., nutrient content) can be mitigated by another factor (e.g., microbial activity enhancing nutrient cycling). If you extend this to the conditions in which life might arise, it suggests to me that planetary habitability may be more like a network of contributing conditions rather than a checklist -- actually much more difficult to calculate?
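To make that concrete, here's a toy sketch (all the numbers are made up) comparing a "checklist" model, where each factor must independently clear its own bar, with an "interaction" model, where a strong factor can offset a weak one:

    # Toy illustration in Python; the factor count and thresholds are invented.
    import random
    random.seed(0)

    N = 1_000_000
    p_factors = [0.3, 0.3, 0.3]   # hypothetical per-factor pass rates

    # Checklist model: every factor must independently clear its own bar
    # (equivalent to each uniform score being >= 0.7).
    p_checklist = 1.0
    for p in p_factors:
        p_checklist *= p          # 0.3^3 = 2.7%

    # Interaction model: only the combined score matters, so a high score on
    # one factor can compensate for a low score on another. The bar (sum >= 2.1)
    # is the same minimum total the checklist model requires.
    hits = sum(
        1 for _ in range(N)
        if sum(random.random() for _ in p_factors) >= 2.1
    )

    print(f"checklist model  : {p_checklist:.2%}")   # 2.70%
    print(f"interaction model: {hits / N:.2%}")      # ~12.1%

Same per-factor stringency, very different answers, which is roughly why I think treating the factors as a simple checklist could badly mis-estimate habitability.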
Also, mostly as an aside: we have the advantage of knowing that life and then intelligence arose relatively quickly once conditions stabilized - only a few hundred million years. n=1, but I think this is a promising indication of where any variables might lie.
I think you're broadly correct, although it's perhaps less applicable to games designed to hook you indefinitely.
I often think about film reviewers, and how the sheer volume of film they've watched means that their experience is likely further removed from an average person's potential experience than basically anyone else's.
Much like how if you're an average person who doesn't really go to magic shows, the opinion of another random person on a magic show is probably going to be more appropriate for you than that of Penn and Teller, who've seen it all before.
While it may be true that the Austrian school uses it like that, it's certainly not the case that they're the only ones. In fact, I suspect if you speak to anyone economically trained over a certain age, there would be a high chance of them defaulting to this.
Growing up, a close relation of mine was an economist, and certainly not of the Austrian school. As a teenager, I was basically ganged up on by a teacher and some kids when inflation was brought up in class. They seemingly had no concept of monetary inflation, and I was forced to swallow that it referred solely to prices going up. I obviously asked my relation about this incident, and he outlined that the "prices are going up" phenomenon is price/consumer inflation, and that increases to the money supply are monetary inflation.
Historically, monetary inflation and consumer inflation coincided (Supply of X goes up -> X is devalued -> consumables are now charged at higher X), and so distinguishing between the two wasn't particularly pertinent.
The Roman Empire's observation that debasing its coinage resulted in an increase in prices meant that the original conception of inflation really was as a monetary phenomenon, not just that prices were going up.
It's really only a relatively recent phenomenon, from the early 20th century, that you had two definitions trying to occupy the same word, although the idea that price inflation could deviate from monetary inflation was probably starting to be understood with the establishment of price indices in the 19th century.
Keynes arguing, post-Great Depression, that prices could rise independently of the money supply increased the focus on consumer inflation. It was around the 1970s that inflation more commonly came to mean consumer inflation in academia. The 'stagflation' of the 1970s is probably the tipping point in usage.
To conclude: it's not really wrong to use inflation to refer to monetary inflation, as it's the original usage, but considering consumer inflation as 'inflation' is definitely more in fashion (especially in the US).
Conversely, I feel like Lost is massively in the opposite category. They did get their massive up-front commitment for 6 seasons while in season 2. The character and thematic arcs are probably some of the best I've seen in TV, while handling a massive ensemble cast.
The Wire was definitely a masterclass in this, as each season has dangling threads that are so obviously picked up and driven forward in subsequent seasons, but nothing is left dangling to an extent that leaves you disappointed.
That said, it's also aided by primarily being a story about characters rather than plot.
I always thought Lost got a bad rap for that. It was actually a story about characters, but its fandom thought it was very much about plot. Hence the hate for the finale (which I thought was wonderful).
Lost SHOULD have been the opposite, but it instead just piled on new mysteries that were never answered, except 'Dallas' Who-Shot-JR-style, i.e. it's just a dream (or in this case, purgatory).
I'm pretty sure it was Bobby Ewing who turned out not to be dead at the end of the season, whereas JR was really shot (just not killed). However, my memories of 1980s soaps are somewhat hazy; like most people here I was more interested in things like Z80/6502 assembly code than big-haired women catfighting in ludicrous shoulder pads [0].
However, genuine thanks for the Lost finale spoiler, I never wanted to watch beyond season 2 anyway as it already became rather tedious.
Referring to the 'Who Shot JR' part was a mixup on my part; I was pretty young at the time and not actively watching weekly, so I forgot which was which...
Most companies certainly won't be using "commercial grade internet" in the way that term is usually used. That would usually be reserved for large enterprises, which really only covers a small part of the workforce in practice.
Many businesses don't bother even subscribing to a business package, because something like a static IP is unnecessary for them.
Further, the point regarding VPNs still stands -- think of the chaos it would cause for many people working from home (on residential connections). And that's just one example.
I don't find it plausible for an ISP to block this.
Actually, there is "commercial grade internet" at least in my country. The main difference is that it is several times more expensive, and in the office buildings the owner doesn't allow ISPs with cheaper "residential" plans.
Business, yes, that was the word I was looking for, thanks! So the ISP could just limit the residential packages, limit the business packages to actual businesses, and that's all.
I think it's clear that solar panels, while they work today, haven't been able to solve today's problems, or else this discussion wouldn't be happening. But we should keep investing in them, one way or another.
Similarly, we should keep investing in the prospect of commercially viable fusion reactors. Harnessing fusion would be instantly revolutionary, as opposed to the incremental progress solar promises. Therein lies the difference. One is not necessarily better than the other.
I would say it’s clear that solar panels are absolutely working extremely well today, at least as long as you don’t live too close to the poles.
Renewables as a whole are growing faster than nuclear ever did, and solar is now a huge part of that.
We have models where solar or solar+wind is providing all the power to everything from small remote weather stations through houses to large islands. Some small countries and regions are getting close too.
It's clear that we have all the technologies we need to do 100% renewables. There are studies indicating that the long-term costs of this are lower than those of traditional fossil and nuclear energy infrastructure. We just need to build the factories to continue scaling up. And of course the transition is more expensive than things will be once we're just maintaining and expanding the system.
I'm fine with gas plants available on standby. If they run 1% of the time, then they are no longer a significant contributor to climate change.
Even if they have to run 10% of the time, we've still taken an enormous cut out of greenhouse gases. We would turn our attention to many other sources of greenhouse gas (agriculture, concrete, transportation, etc.)
Batteries have huge potential, simply because they're so broadly defined: they must store energy, output it as electricity on demand, and be cheap. There's a high chance that we'll find some way to make grid-scale batteries extremely cheaply in the future.
In the meantime, getting to 90% will basically stop climate change in its tracks, giving us time to research dirt-cheap batteries.
It doesn't even have to always be electricity on demand; sometimes we also need heat. I wonder if heat storage will be a thing we'll have in households (or maybe it's enough to have it in district heating facilities?).
Less than 99% means not having a working fridge for 3-12 hours or more in 30°C heat, so there go all your perishable foods. It means no lights in the house will work. No cooling or heating of any kind. No computers. No phone. None of your other random appliances will work either. None of the stuff you use to navigate a city, like street lights, will be working. Of course it can be mitigated with a generator or an expensive battery bank with solar panels, provided you don't have too large a load. Of course solar panels only work during the day, so if the outage lasts into the night, you'd better hope to have a large enough bank to power all your essential equipment.
Suffice to say, less than 99% availability is pretty terrible. You should come down and talk to a South African.
The historical trend line (https://ourworldindata.org/battery-price-decline), the fact that those involved in this business seem optimistic that it will continue into the future, and that I'm not aware of any technical barriers preventing that (even with no technological breakthroughs, it seems likely that we'd continue to see declining costs for some time just due to economies of scale).
If we had one hour per day without power that would be about 95% availability. Most of us wouldn't even notice that if it happened in the middle of the night.
If we instead had an 18-day stretch without power in an otherwise perfect year, that would also be about 95% availability, but it would be hugely disruptive.
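A quick sanity check of those two figures (just the arithmetic from above, nothing new):

    # Availability arithmetic, in Python.
    HOURS_PER_YEAR = 365 * 24                        # 8760

    # One hour of outage every day:
    avail_hourly = 1 - 365 / HOURS_PER_YEAR
    print(f"1 h/day outage -> {avail_hourly:.1%} availability")    # ~95.8%

    # One continuous 18-day outage in an otherwise perfect year:
    avail_stretch = 1 - (18 * 24) / HOURS_PER_YEAR
    print(f"18-day stretch -> {avail_stretch:.1%} availability")   # ~95.1%

Same ballpark number on paper, completely different lived experience depending on how the downtime is distributed.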
Why? I don't agree that it wouldn't be at all acceptable.
And even > 90% would be very expensive to achieve in winter in much of Europe (of course there are alternatives to solar so it’s not such a huge issue)
Go to a country with just that and witness how stupidly wasteful it is to have an energy grid with regular outages. Everyone who can afford it has an expensive backup generator, batteries, etc.
For industry, it's a disaster.
I live off-grid, with solar and LiFePO4, but I'm not naive enough to think that would scale to an economy any time soon. And for the record: no, below 99% availability should be seen as unacceptable.
The implication is that it's a voluntary political decision to forgo more reliable sources of electricity. There's basically zero chance of that happening in any political entity, hence that expectation is optimistic.
Of course, it frequently happens involuntarily, and just saying "get used to it" is pessimistic, as you say.
Both reflect the same thing: it's politically untenable to voluntarily accept poor reliability of electricity supply.
Solar power can 100% solve our energy needs today. It’s cost effective at the unit level. It works at scale. It decentralizes nicely.
Did I mention that it works?
Every home could have rooftop solar for less than it costs to build centralized power plants. (I have rooftop solar and it cost significantly less than a new car; now my power costs won't go up for 20 years, at which point the panels might need a refresh, but that part of the system is the cheapest part.)
We could easily flip from subsidizing fossil fuels to subsidizing rooftop solar today and realize significant gains (higher ROI by shifting the investment). If you spent one year's investment in fusion and fossil fuel subsidies on deploying rooftop solar and grid-scale batteries, you'd change the energy story permanently.
Energy would suddenly be plentiful. Fossil fuels would permanently shift out of relevance.
Fission reactors would look like quaint and staggeringly expensive tools of a bygone age. And fusion, which DOES NOT WORK, would look even more like a silly dream. (We are no closer to fusion than we were 30 years ago.)
Why the fuck are we still talking about fusion when we have something that works?
I agree with everything you say about rooftop solar. If you have a suitably unshaded roof, and you are in a reasonably sunny climate, the current economic math works. And as energy costs go up and capital costs come down, it works better all the time.
That said, we still need a grid to distribute electricity to places that consume more energy than their roof space can supply. Think apartment blocks, factories, etc. And yes, a huge chunk of that load can still be supplied by grid-scale solar and wind etc.
Even with large-scale storage (another fruitful place to spend investment money), there's going to need to be peak generation.
However you look at it, I don't think fusion will be the answer. The landscape of requirements has shifted since fusion was first proposed. By the time it's practical, it'll be solving a problem we don't have.
The science may lead to a working reactor. But no one will build it at scale because it simply won't solve the problem we'll have then.
Grid scale storage has lagged behind solar and wind generation, but it's starting to catch up. By the time we have a viable commercial fusion power plant all of the grid storage issues will have been long since solved.
If you ever get involved in hiring people, you realise that there are people slipping through the cracks. I've come across freelancers or underemployed people making a fraction of what their skills should be earning them, and it seems totally arbitrary that they're in such a position. It's lucky - for both sides - when this match happens.
An efficient system would make this integration easy, but I agree, for a lot of people to get anywhere, they have to "insert themselves".
Simultaneously, it's actually quite easy to be oblivious to this -- if you go to school, get good grades, get a graduate job, and do well enough at your job (but remain a normal employee), you'll never have the opportunity to have this realisation.