
Those are great points! Another related law is from queuing theory: waiting time goes to infinity when utilization approaches 100%. You need your processes/machines/engineers to have some slack otherwise some tasks will wait forever.
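
To make that concrete, here's a minimal sketch, assuming an M/M/1 queue (Poisson arrivals, exponential service times, single server) with purely hypothetical numbers. The average wait grows roughly like 1/(1 − utilization), so it diverges as utilization approaches 100%:

```python
# Average queueing delay for an M/M/1 queue: Wq = rho / (mu * (1 - rho)),
# where rho = lambda / mu is utilization and mu is the service rate.
# Purely illustrative numbers.

def avg_wait(utilization: float, service_rate: float = 1.0) -> float:
    """Expected time a task waits in queue, in units of mean service time."""
    return utilization / (service_rate * (1.0 - utilization))

for rho in (0.5, 0.8, 0.9, 0.99, 0.999):
    print(f"utilization {rho:.1%}: average wait = {avg_wait(rho):.1f} service times")
# 50.0% -> 1.0, 90.0% -> 9.0, 99.0% -> 99.0, 99.9% -> 999.0: it diverges.
```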



I’m remembering reading once that cities are incredibly efficient in how they use resources (compared to the suburbs and rural areas, I guess), and, in light of your comment about waiting time, I’m realizing now why they’re so unpleasant: constant resource contention.


Amusingly this is something that I see as being a huge divide in rural and urban politics.

Yes, it’s inefficient. Yes, some people want that!


Right. Living is not an optimization problem.


At least not until the oil and other essential stuff run out.


Our problem is not that we are running out of stuff, but that we’re drowning in it.


Sorry to put it so bluntly, but you're basically saying:

"I don't care it the climate's fucked, I want to live away from civilization and drive 100 miles a day everywhere"

Of course we shouldn't hyper-optimize everything, but the sooner people realize our environment depends on not everyone getting exactly what they want whenever they want it, the better. Living in a (walkable) city is just one such concession towards the environment we ought to make, even if we don't "want" to.


Or we could just compete with each other for resources as we have since forever. I’d rather do that than have no choice but to live in Kowloon.

Just whack an externality tax on fossil fuels and things like cutting down wilderness, job done.


Or we can stop acting like there’s only two options: living in wide-open fields with a clear horizon or the fucking walled city of Kowloon.

Also, you have the mindset of a typical anti-social coastal elite who thinks “oh no big deal we can just raise the cost of living for all the poor rural types by sticking on a tax because I want to go LARP as a Victorian manor lord. And people don’t bend to my every whim immediately or live exactly like me so I want to be in total control of the 50 miles around me.”


Sure.

All I'm saying is that the efficiency arguments are silly unless you are comparing like for like. If we're suggesting that people simply do less because it's more efficient, well, no-one is going to do that without an incentive.

Everyone having 50 sqmi obviously isn't realistic (there actually is not enough space on the globe), but equally, if the idea is that everyone _has_ to live in a metropolitan apartment because each person has to use (1/7billion) of the resources, you're going to see an uprising; that just won't fly with people.

The best outcome is probably to convince as many people as possible to live in a shoebox so that the rest of us can still have a decent life. It seems to be working!


We're already competing under capitalism, and clearly the end-state isn't who'll get the most of what we already have, but who'll get the most of what's yet to be exploited. This competition doesn't have any upper bounds.


That’s not remotely what I’m saying. I live in a city and don’t drive most days because I can walk and take public transit and there’s never any parking. What I’m saying is that in the bigger picture, approaching life as a set of problems to be optimized is the wrong way to approach life.


If you think cities don't fuck the climate just as much as suburbs do, I have a well you can carry water up 40 flights of stairs from.


cities do, because they exist in a system that generates carbon, but they are vastly more resource & carbon efficient than suburbs per person

https://usa.streetsblog.org/2015/03/05/sprawl-costs-the-publ...

https://news.berkeley.edu/2014/01/06/suburban-sprawl-cancels...


Maybe, but the resources it takes to live are an optimization problem.


what it means to not optimise though is that some people end up better off and many others are worse off.


And what it means to optimise is also that some people end up better off and many others are worse off.


Yes, the point is to find a balance so that the first number is maximised.


On the other hand, in cities people are queueing up and talking at the bakery counter, while people in the suburbs are listening to the radio while driving to the bakery. I guess you choose to live where you feel most comfortable.


FWIW, my experience is that people are friendlier and more likely to make conversation outside of urban areas.


Maybe if you are a white male. Ever drive through a rural area a month out from this election? It’s a scary place, especially right now with the rhetoric being used just on the signs people are putting in their yards.


In fact that is also my experience, but urban people driving in their cars typically aren't talking to strangers either.


The efficiency results in abundance not possible in less dense areas; you are waiting for things that are simply not available elsewhere.


Sort of. Compare doing laundry at the laundromat to doing laundry in your basement.


They meant things like bars, restaurants, sports stadiums, concerts, plays. Things that require sufficient density to make economic sense.


LA has multiples of all of those and is nearly entirely suburbs


Right, but if I had 1 hour in NYC versus 1 hour in LA, how many clubs could I theoretically go to? Probably a dozen in NYC, provided I leave immediately. Probably about 0.5 in LA.

So while what you're saying is true, it doesn't disprove anything. LA is much less dense and therefore has much less "stuff" available for its inhabitants. But it's still more than a rural area.


This might surprise the nyc residents on hn but you can find bars next to bars next to bars next to bars in LA too.


It allows for a greater variety of things (museums, concerts, etc.), but to get that you have to deal with higher contention and, thus, costs across all things (whether in terms of time spent waiting or money spent outbidding others), including, crucially, the things you consume the most of (roads, housing, etc.). So maybe a good way to think about it is: if you have a lifestyle that requires a modest amount of most resources, then the variety provided by density may be worth the increased resource contention, but if you have a lifestyle that requires a lot of certain resources (like space for kids), then the tradeoff may no longer make sense.


Yep, I used to work in a factory. Target utilization at planning time was 80%. If you over-predict your utilization, you waste money. If you under-predict, a giant queue of “not important” stuff starts to develop.


This reminds me of something my mother told me she aimed for when she ran her catering businesses: she always wanted 1 serving of pie leftover at the end of every day.

If she had 0, she ran the risk of turning customers away and losing money. Any more than 1 is excess waste. Having just 1 meant she’d served every possible customer and only “wasted” 1 slice.
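
For what it's worth, her one-slice rule is a nice practical heuristic for what operations folks call the newsvendor problem: pick the stock level that balances the cost of turning a customer away against the cost of a wasted slice. A sketch under made-up assumptions (Poisson demand, hypothetical costs):

```python
# A hedged sketch of the same tradeoff, in the style of the classic
# newsvendor problem. All names and numbers here are hypothetical.
# Stock q slices; demand D is random. Cu = profit lost per turned-away
# customer, Co = cost of a wasted slice. The optimal q is the smallest
# one satisfying P(D <= q) >= Cu / (Cu + Co).

from math import exp, factorial

def poisson_cdf(k: int, mean: float) -> float:
    """P(D <= k) for Poisson demand with the given mean."""
    return sum(exp(-mean) * mean**i / factorial(i) for i in range(k + 1))

def optimal_stock(mean_demand: float, cu: float, co: float) -> int:
    """Smallest stock level whose demand CDF reaches the critical ratio."""
    critical_ratio = cu / (cu + co)
    q = 0
    while poisson_cdf(q, mean_demand) < critical_ratio:
        q += 1
    return q

# Hypothetical: ~30 slices of demand per day, $4 profit per sale,
# $1 per slice thrown away at closing time.
print(optimal_stock(30.0, cu=4.0, co=1.0))  # stocks a bit above mean demand
```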


And then you can eat the pie as a reward.


Customers don't want to buy the last one.


For some scenarios that's fine, and you can slash the queue whenever necessary.

Eg at Google (this was ten years ago or so), we could always spend leftover networking capacity on syncing a tiny bit faster and more often between our data centres. That would improve users' experience slightly, but it's also not something that builds up a backlog.

At a factory, you could always have some idle workers sweep the floor a bit more often. (Just a silly example, but there are probably some tasks like that?)


Unlike merchantmen, naval vessels were crewed at a level allowing for substantial attrition (bad attrition would be casualties; good attrition would be prize crews); I believe they traditionally (pace Churchill) had many, many activities which were incidental to force projection (eg polishing the brightwork) but could be used to occupy all hands.


Yes. And, well, you can also always train more. Especially in the age of sail.


You can add a measure of robustness to your optimization criteria. You can explicitly optimise for having enough slack in your utilisation to handle these unforeseen circumstances.

For example, you can assign priorities to the loads on your systems, so that you can shed lower priority loads to create some slack for emergencies, without having to run your system idle during lulls.

I get what the article is trying to say, but they shouldn't write off optimisation as easily as that.
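
As a toy illustration of the shedding idea (not any particular production system, and all the task names here are made up):

```python
# A toy sketch of priority-based load shedding. When the system is at
# capacity, the least important queued task is shed to make room for a
# more important arrival, so emergencies always find slack.

import heapq

class SheddingQueue:
    def __init__(self, capacity: int):
        self.capacity = capacity
        self.heap = []  # min-heap on priority; lower number = less important

    def submit(self, priority: int, task: str) -> bool:
        if len(self.heap) < self.capacity:
            heapq.heappush(self.heap, (priority, task))
            return True
        if priority > self.heap[0][0]:
            # Full: evict the least important task in favor of the new one.
            _, shed_task = heapq.heapreplace(self.heap, (priority, task))
            print(f"shed: {shed_task}")
            return True
        return False  # the new task itself is refused

q = SheddingQueue(capacity=2)
q.submit(1, "batch report")        # accepted
q.submit(2, "log compaction")      # accepted
q.submit(9, "emergency failover")  # accepted; "batch report" gets shed
```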


The problem is that people who agree to a task being low priority still expect it to be done in nine months, and all of a sudden it becomes high priority if that doesn’t happen.

So you’re fixing the micro economics of the queue but not the macro. Queues still suck when they fill up, even if they fill with last minute jobs.


This totally depends on the system in question and what the agreements with your users are.

Eg if you are running video conferencing software, and all of a sudden you are having bandwidth problems, you typically first want to drop some finer details in the video, and then you want to drop the audio feed.

In any case, if you dropped something, you leave it dropped, instead of picking it back up again a few seconds later. People don't care about past frames.

(However, queuing instead of outright dropping can still make sense in this scenario, for any information that's younger than what human reaction times can perceive.)

Similarly in your scenario, you'd want to explicitly communicate to people what the expectations are. Perhaps you give out deep discounts for tasks that can be dropped (that's what eg some electricity providers do), or you can give people 'insurance' where they get some monetary compensation if their task gets dropped. (You'd want to be careful how you design such a scheme, to avoid perverse incentives. But it's all doable.)

> So you’re fixing the micro economics of the queue but not the macro. Queues still suck when they fill up, even if they fill with last minute jobs.

I don't know, I've had pretty positive experiences so far when eg I got bumped off a flight due to overbooking. The airline offered decent compensation.

Overbooking and bumping people off _improves_ the macro situation: despite the occasional compensation you have to pay when unexpectedly everyone who booked actually shows up, overbooking still makes the airline extra money, and via competition this is transformed into lower ticket prices. Many people love lower airfares, and have shown a strong revealed preference for putting up with a lot of the stuff eg RyanAir pulls, as long as they get cheap tickets.


A task "shed" is one delivered with infinite latency. If that's fine for you then the theorem doesn't hurt you, do what's best for your ___domain. It's just something to be aware of.


I feel that a 100% efficient system is not resilient. Even minor disruptions in subsystems lead to major breakdowns.

There’s no room to absorb shocks. We saw a drastic version of this during the COVID-19-induced supply chain collapse. Car manufacturers had built such near-100% just-in-time manufacturing that they couldn’t absorb chip shortages, and it took them years to get back up.

It also leaves no room for experimentation. Any experiment can only happen outside the system, not from within it.


This coincides with my headcannon cause of the business cycle.

1. Firms compete

2. Firms either increase their efficiency or die

3. Efficient firms are more susceptible to shocks

4. Firm shutdowns and closures are themselves shocks

5. Eventually the system reaches a critical point where the aggregate susceptibility is higher than the aggregate of shocks that will be generated by shutdowns and closures

6. Any external shock will cause a cascade

There's essentially a "commons" where firms trade susceptibility for efficiency. Or in other words, susceptibility is pooled while the rewards for efficiency are separate.


It sounds similar to how animal/plant species often work.

A species will specialise for a niche, and outcompete a generalist. But when conditions change, the generalist can adapt and the specialist suffers.


*head canon

Something you personally (in your head) believe to be a general law, or rule, or truth (canon). It's roughly synonymous with "mental model".

A cannon is a weapon.


But in practice we see that:

1. Firms compete

2. Some firms get ahead

3. Accrued advantages to being ahead amplify

4. A small number of firms dominate

5. New competition is bought or crushed

6. Dominant firms become less efficient in a competition-free environment


They aren't mutually exclusive. And, not xor.


Good analysis, but one also needs to look at the definition of `efficiency`: what is your definition of efficiency in this context?


The ability to do more with fewer resources. Profit is a great starting point when answering, "What is efficiency to a firm?"


If only that weren't called a "cycle" as if it had a predictable periodicity.


It has inevitability, but you're right, not predictable periodicity. Is predictable periodicity a necessary part of a cycle? I feel like the rise and fall of nations is a cycle, but not necessarily one of predictable periodicity.


If not predictability, then regularity, and I believe that's a fundamental misunderstanding -- the system is chaotic.


There is a fundamental tension between efficiency and resilience, you are completely correct. And yea, it’s a systems problem, not limited to tech.

There is an odd corollary, which is that capitalistic systems, which reward efficiency gains and apply downward pressure to incentivize efficiency, deal with the resilience problem by creating entirely new subsystems rather than having more robust subsystems, which is fundamentally inefficient.


This is exactly the subthread of this conversation I’m interested in.

Is what you’re saying that capitalism breaks down resilience problems into efficiency problems?

I think that’s an extremely motivating line of thinking, but I’ll have to do some head scratching to figure out exactly what to make of it. On one hand, I think capitalism is really good at resilience problems (efficient markets breed resilience; there’s always an incentive to solve a market inefficiency), on the other (or perhaps in light of that) I’m not so sure those two concepts are so dialectically opposed.


To understand the effects, we first have to take a step back and recognize that efficiency and resiliency problems are both subsets of optimization problems. Efficiency is concerned with maximizing the ratio of outputs to inputs, and resiliency is concerned with minimizing risk.

The fundamental tension arises because risk mitigation increases input costs. Over a given time horizon, there is an optimal amount of risk mitigation that will result in maximum aggregate profit (output minus input, not necessarily monetary). The longer the time horizon, the more additional risk mitigation is required, to prevent things like ruin risk.

But here’s the rub: competition reduces the time horizon to “very very short” because it drives down the output value. So in a highly competitive market, we see companies ignore resiliency (they cannot afford to invest in it) and instead they get lucky until they don’t (another force at work here is lack of skin in the game). The market deals with this by replacing them with another firm that has not yet been subject to the ruinous risks of the previous firm. This cycle repeats again and again.

Most resilient firms have some amount of monopolistic stickiness that allows them to invest more in resiliency, but it is also easy to look at those firms and see they are highly inefficient.

The point is that the cycle of firms has a cost, and it is not a trivial one: capital gets reallocated, businesses as legal entities are created, sold, and destroyed, contracts have to be figured out again, supply chains are disrupted, etc. Often, the most efficient outcome for the system is if the firms had been more resilient.

So there is an inefficient Nash equilibrium present in those sort of competitive markets.


That’s a good clarification about firms vs. the broader system. I think that’s a pretty good breakdown, overall, and fits well with the general notion that capitalism is resilient, not efficient, by offloading efficiency onto smaller entities which are efficient, not resilient. You could compare to command economies where a firm failure is basically a failure of the state, and can destabilize the future of the entire system.


There was a sci-fi series I read (I want to say by Alastair Reynolds) which talked about planet-bound civilisations having an innate boom-and-bust cycle where the civilisation would inevitably get more and more efficient at utilising resources, while thereby becoming more fragile and susceptible to system shocks. It would then collapse and eventually the survivors would rebuild.


I mean, car companies also just straight out cancelled their chip orders because they initially thought people would stop buying cars during COVID.


That tracks. I worked at a lot of places/teams where anything but a P0 was something that would never be done.


Solution: everything is a P0!


Then you just get Little's law, which is not usually what people want. Preemption is usually considered pretty important... Much like preemptory tasks.
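
(For the uninitiated: Little's law says the average number of tasks in a system equals the arrival rate times the average time each task spends there, L = λ·W. A worked example with made-up numbers: if tasks arrive at λ = 10 per week and each sits for W = 39 weeks (~nine months), the backlog averages L = 10 × 39 = 390 tasks.)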


No, what you get is alcoholism. It was sarcasm.


¿Por qué no los dos? (Why not both?) The purpose of a beverage is what it does.


Interesting. My gut reaction is that this is true in reverse: infinite wait time leads to 100% utilization. However, I feel like you can also have 100% utilization with any queue length if input=output. Is that theory just a result of a first order approximation or am I missing something?


I think it comes from tasks not taking an equal amount of time, coming in at random, and not having similar priorities.


That's right; this is true no matter the queue length. If input=output on average, there is no limit on how long your queue will grow, and therefore no limit on how long a queued task will wait.

I don't know what you mean by reverse.
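
A quick way to see it is to simulate it; a sketch under simplifying assumptions (memoryless arrivals and services at equal mean rates, single server), purely illustrative:

```python
# Illustrative simulation, not a proof: with input = output on average,
# the queue length does an unbiased random walk reflected at zero, so it
# keeps reaching new highs the longer you let it run.

import random

def peak_queue(arrival_rate: float, service_rate: float, steps: int = 1_000_000) -> int:
    queue, peak = 0, 0
    p_arrival = arrival_rate / (arrival_rate + service_rate)
    for _ in range(steps):
        if random.random() < p_arrival:
            queue += 1          # an arrival joins the queue
        elif queue > 0:
            queue -= 1          # a departure (server idles when empty)
        peak = max(peak, queue)
    return peak

random.seed(0)
print("peak backlog at 90% utilization: ", peak_queue(0.9, 1.0))
print("peak backlog at 100% utilization:", peak_queue(1.0, 1.0))
```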


The average queue length is still infinity. Whatever the queue length happens to be at the start, it will stay there, and it could be any positive number up to infinity.

Besides, angels can't really balance on pinheads.


For some it may go without saying, but for the uninitiated, y’all should be reading https://en.wikipedia.org/wiki/The_Goal_(novel)


Slack __or__ lower priority tasks.


Tasks that never get done, yes. In other words, tasks that wait forever.



