You get what you measure, and you should expect people to game your metric.
Once upon a time only the brightest (and/or richest) went to college, so a college degree became a proxy for being clever.
Now since college graduates get the good jobs, the way to give everyone a good job is to give everyone a degree.
And since most people are only interested in the job, not the learning that underpins the degree, well, you get a bunch of students who care only about the pass mark and the certificate at the end.
When people are only there to play the game, then you can't expect them to learn.
However, while 90% will miss the opportunity right there in front of them, 10% will grab it and suck the marrow. If you are in college I recommend you take advantage of the chance to interact with the knowledge on offer. College may be offered to all, but only a lucky few see the gold on offer, and really learn.
That's the thing about the game. It's not just about the final score. There's so much more on offer.
> However, while 90% will miss the opportunity right there in front of them, 10% will grab it and suck the marrow.
Learning is not just a function of aptitude and/or effort. Interest is a huge factor as well, and even for a single person, what they find interesting changes over time.
I don't think it's really possible to have a large cohort of people pass through a liberal arts education, with everyone learning the same stuff at the same time, and have a majority of them "suck the marrow" out of the opportunity.
I did a comp science degree, so I can't speak for the liberal arts. However I imagine the same experience could apply.
For us the curriculum was the start of the learning, not the end. We'd get a weekly assignment that could be done in an afternoon. Most of the class did the assignments, and that was enough.
There was a small group of us that lived (pretty much) in the lab. We'd take the assignment and run with it, for days, nights, spare periods, whatever. That 10 line assignment? We turned it into 1000 lines every week.
For example the class on sorting might specify a specific algorithm. We'd do all of them. Compete against each other to make the fastest one. Compare one dataset to another. Investigate data distributions. You know, suck the marrow.
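For flavour, here's a minimal sketch in modern Python of the kind of comparison we'd run; the specific algorithms and data distributions are just illustrative, not what we actually used back then.

```python
import random
import time


def bubble_sort(a):
    """Classic O(n^2) bubble sort."""
    a = list(a)
    n = len(a)
    for i in range(n):
        for j in range(n - 1 - i):
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
    return a


def insertion_sort(a):
    """O(n^2) worst case, but close to O(n) on nearly-sorted input."""
    a = list(a)
    for i in range(1, len(a)):
        key, j = a[i], i - 1
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key
    return a


def merge_sort(a):
    """O(n log n) regardless of input distribution."""
    if len(a) <= 1:
        return list(a)
    mid = len(a) // 2
    left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    out.extend(left[i:])
    out.extend(right[j:])
    return out


def datasets(n):
    """Different data distributions expose different behaviour."""
    return {
        "random":        [random.random() for _ in range(n)],
        "nearly sorted": sorted(random.random() for _ in range(n))[:-5]
                         + [random.random() for _ in range(5)],
        "reversed":      sorted((random.random() for _ in range(n)), reverse=True),
        "few uniques":   [random.randint(0, 9) for _ in range(n)],
    }


if __name__ == "__main__":
    for name, data in datasets(2000).items():
        for sort in (bubble_sort, insertion_sort, merge_sort, sorted):
            start = time.perf_counter()
            sort(data)
            elapsed = (time.perf_counter() - start) * 1000
            print(f"{name:14} {sort.__name__:15} {elapsed:8.2f} ms")
```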
(Our professors would also swing by the lab from time to time to see how things were going, drop the odd hint, or prod the bear in a direction and so on. And this is all still undergrad.)
I can imagine a History major doing the same. Researching beyond the curriculum. Going down rabbit holes.
My point, though, is that you're right. You need to be interested. You need to have this compulsion. You can't tell a person "go, learn". All you can do is offer the environment, sit back, and see who grabs the opportunity.
I get that you can't imagine this playing out. To those interested only in the degree, it's unimaginable. And no, as long as burning desire is not on the entry requirements, it most certainly will not be the majority.
In truth the lab resources would never have coped if the majority did what we did.
> I did a comp science degree, so I can't speak for the liberal arts.
By 'liberal arts' I meant the common 4 year, non-vocational education. My major was CS too, but well over half of the time was spent on other subjects.
> I get that you can't imagine this playing out. To those interested only in the degree, it's unimaginable
I can easily imagine what you describe playing out. I just wouldn't call it 'sucking the marrow' (unless you were equally avid in all your classes, which time likely would not permit).
But as you allude to in your last point, the system isn't really designed for that. It's nice when it does effectively support the few who have developed the interest, and have extra time to devote to it, as it did for you.
I'd rather see systems that were designed for it though.
That's entirely the point. If you see the degree only as a stepping stone to the company job, then that's all you see and that's all you get.
If you accept that the degree/job relationship is the start, not end, of the reason for being there, then you see other things too.
There are opportunities around the student which are for them, not for their degree, not for their job. There are things you can learn, and never be graded. There are toys to play with you'll never see again. There are whole departments of living experts happy to answer questions.
For example (this was pre-Google), I wrote a program and so needed to understand international copyright. I could have gone to the library and read about it. Instead I went to the law faculty, knocked on the door, and found their professor who specialized in intellectual property.
Since the program I wrote was in the medical space, I went to the medical campus, to the medical research library, and found tomes that listed researchers who might benefit. I basically learned about marketing.
If all you care about is the company job, then all you'll see is the degree.
Right, and starting a family is also just a box to check, and eating food is a box to check, and brushing my teeth is just a box to check, and on it goes for every single thing in life. If we all just checked boxes then we'd not be human anymore.
I'm not downvoting you, because this is a common economic misconception, and I'm sure your opinion is shared by many.
Money is never wasted.
While the long explanation is somewhat technical and boring, the short version is this:
"Money is neither created nor destroyed, it simply moves from one hand to another".
Put another way, the National Gallery had 20 mil to spend. So they spent it. That 20 mil is now in the economy, and will travel further. The family that sold the painting might need a new roof, or a tractor, or whatever. They in turn spend the money and it flows.
An economy is just the flow of money. An economy stalls when the money stops flowing and is hoarded.
Fundamentally you want rich people to spend their money. On "what" is mostly irrelevant.
Here's another simplistic example. The US produces a surplus of wheat. USAID buys a lot of that wheat (using tax money), which is thus a round-about subsidization of wheat farmers. This is prudent because local food security, i.e. having farmers at all, is a good thing.
Now USAID has a pile of wheat, so it donates it to countries that can't afford it. This buys US prestige, both with those countries and their neighbors.
Now USAID stops. The govt "saves money". Farmers lose their subsidy. Long-term, US citizens lose their food security.
Money itself has no value. Spending that money has value. Because only by spending it can you realize that value.
Reminds me of a visit to a garden restaurant in Munich, back in the day. My friend ordered a soup but got a beer. When he pointed out the mistake to the waiter, he was told that it didn't matter, because the price was the same.
Mind sending me all your retirement savings? I'll even give you a pretty sweet drawing for it. It won't be money wasted[1]; I'll be sure to use it well.
[1] not sure how much you will be able to sell said sweet drawing for nor when, but by definition, it will be worth it.
Alas, I am not a gallery, and thus I don't make an income displaying drawings. I encourage you to target your sales at those best placed to profit from your product.
So from my perspective, I can get better value moving my cash to some other suppliers.
But even if I did buy your sweet drawing, the money itself is not wasted (I personally just control less of it.) The same money would now be controlled by you, and I'm sure you'll spend it, thus benefiting others.
The money itself cannot be wasted, it merely moves from one set of hands to another.
My personal control of money can indeed be wasted, since I can transfer it to another for insignificant value. But that's simply my control, not the money itself.
I already spent my retirement savings. I used them to buy index funds and some shares. I think the companies will use my money better than you would, at least in the sense they will give me a return on my investment.
It will be dispersed to more than a single family. That's the point.
In this case specifically it's unlikely the family sold an asset simply to buy another asset. They've had it a few hundred years, and the gallery has had their eye on it for decades. It's likely they sold it because they needed the cash, for a new roof or whatever.
If they spend it, then those people providing the goods and services will prosper. If they invest it in a business, then that business has capital to grow, and all those employees will benefit.
From the article: “The Virgin and Child with Saints Louis and Margaret and Two Angels was bought for just over $20m (around £16m at the time), funded by the American Friends of the National Gallery London.”
> "Money is neither created nor destroyed, it simply moves from one hand to another".
With regards to how money is created, you may want to read up on credit and how banks create money virtually out of nothing, or how the state has a monopoly on printing money (turning "not money" -- paper and ink -- into "money").
The destroying part is much simpler: you can perform an experiment of burning a banknote yourself.
But this money probably won't be spent further in any "productive" way; it will be locked up in financial instruments that only help extract funds from the real sector, which is what one probably really cares about when they say that "money should work". It's not like a government investment in building a bridge, which, while also spending state money, creates ripples of economic activity involving thousands of people and dozens of industries.
That is a very broad generalization. Even if it was 'put into some fund', that equates to a capital investment which can be used to deliver value elsewhere.
Money is complicated - the only way in which I would see it get truly wasted is if you took it out as cash and burnt it. Even then you'll be (marginally) raising the value of all other money left in the system.
There was a line from a movie: "You had all that money in the stock market. What happened?" "Oh, the money's still there; it just belongs to someone else now."
When the last tree is cut, the last fish is caught, and the last river is polluted; when to breathe the air is sickening, you will realize, too late, that wealth is not in bank accounts and that you can't eat money. -- Alanis Obomsawin
Taking your suggestion in good faith, I'm intrigued by your concept of democratically controlled, worker-owned. Please explore this further.
I guess I'm wondering primarily what "democratically controlled" even means. Like everyone votes on every decision? Or we elect people to make decisions? Or we vote on "big decisions"? (Who defines "big"?)
Most companies are democratic, in the sense that the shareholders appoint the decision makers: shareholders -> board -> management.
Your point about "worker owned" simply means the workers own the shares, and hence "democratic" would seem to be redundant. Unless you are suggesting that the democratic function is exercised in another way?
Now clearly Mozilla is a mix of non profit and for profit. A non profit doesn't really have shares (there's usually some other approach to appointing decision makers.)
So, I think you are suggesting that the voting rights move from "shareholders" to employees.
Naturally this opens the door to 51% attacks, or more specifically incentivises workers to coalesce into groups with mutual-support voting.
Given a reasonably high turnover in workers, we should therefore expect decision making to be mostly short-term not long term? (Simplistically, most people will vote to further their short term returns, ignoring long term goals because in the long run they're not here.)
In other words the company starts to behave a lot like a govt does. Regular elections promote short-term goals and results (don't start a project that will complete after you've left) at the expense of things like maintenance etc.
It also values political skills over say engineering skills. Being a good speaker counts for more than being competent.
Do you believe this structure will make a better browser? When funding runs low, will they make better decisions on which staff to cut?
> MONDRAGON is the outcome of a cooperative business project launched in 1956. Its mission is encapsulated in its Corporate Values: intercooperation, grassroots management, corporate social responsibility, innovation, democratic organisation, education and social transformation, among others.
> Organisationally, MONDRAGON is divided into four areas: Finance, Industry, Retail and Knowledge. It currently consists of 81 separate, self-governing cooperatives, around 70,000 people and 12 R&D centres, occupying first place in the Basque business ranking and tenth in Spain.
Or Scop-TI in France, a large worker cooperative in the IT and engineering sector.
As far as I recall, the Netscape browser was free. There may have been a paid one (for enterprise), but I'm pretty sure we had a free one.
They did charge OS makers to bundle it (via support contracts) but the biggest market there (Windows) wrote their own. By IE5 Netscape was basically gone, IE6 had no competition (and hence no development) until Firefox came along.
> Cutting funding essentially returns us to the IE6 monoculture with no progress.
1. It doesn't return us to monoculture - the monoculture of IE6 gave us multiple browsers, which have recently all merged into Chrome. We already have a monoculture, which will now lose funding.
2. We're not losing any of that progress. Actual documented standards exist now, all players implement the same basics, and you can create most websites without browser-specific quirks. That's not going away.
3. We've had so much progress that Electron is its own massive OS now. We could do with a bit less progress and a bit more "how do we make this mess maintainable".
Is the abuse you experienced a function of software development? Or perhaps some companies are shitty and some are not. Or perhaps it is NY (big city) that is the root cause.
Change is good, but you need to be sure you're changing the right thing.
Let's say I pay x for a product. Gross markup is, say, 100%, so I sell it for 2x. Let's say there's a tariff cost of y. That means the cost price is x + y. I mark that up to 2x + 2y. It's easy to up the price by 2y and disclose the tariff as "z%".
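A minimal sketch of that arithmetic with made-up numbers (the $10 cost and 25% tariff rate are purely illustrative):

```python
# Hypothetical numbers, just to make the markup arithmetic concrete.
cost = 10.00          # x: what I pay my supplier
markup = 1.00         # 100% gross markup
tariff_rate = 0.25    # assume a 25% tariff on the import cost

price_before = cost * (1 + markup)            # 2x       = 20.00
tariff = cost * tariff_rate                   # y        =  2.50
price_after = (cost + tariff) * (1 + markup)  # 2(x + y) = 25.00

increase = price_after - price_before         # 2y = 5.00: double the tariff actually paid
disclosed = tariff / price_after              # one reading of the "z%" on the receipt: ~10%

print(f"price before tariff:      {price_before:.2f}")
print(f"tariff actually paid:     {tariff:.2f}")
print(f"price after tariff:       {price_after:.2f} (+{increase:.2f})")
print(f"tariff as % of new price: {disclosed:.1%}")
```

The point being: the shelf price rises by twice the tariff, while the disclosed tariff line stays at y.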
But this of course presumes all your expenses remain flat. And they likely don't. As your expenses go up (2nd order effects) that 100% markup starts to not be enough. So the markup goes up a bit.
Plus since things are going up anyway, and since there's uncertainty (which has a cost) we need to bump the price up even more (because hey, free market.)
And when the tariffs go away, we can remove the primary cost, but all the secondary hikes remain. Because that's all just extra profit, and, like, free market right?
This round of inflation is going to make covid look mild. (And as I point out to my Republican friends, just remember, you voted for this.)
The way out of this is to devalue the dollar. That would erode the real value of the outstanding debt (which is denominated in dollars.) Alas the US has worked very hard to make the dollar the world currency, so devaluing it is complex.
The US consumer (voter) is of course the big loser. At least this generation is. Folk born around 2030 may be the big winners.
I'm rather aware of the concept of markup. Marking up itself isn't the problem; it's completely understandable -why- that must exist in most cases. But either way, companies don't like to disclose their landed costs for obvious reasons - people will think they're being ripped off.
Tariffs are in the news and the percentages are known. If I'm selling a wallet made in China, in the US for $80, and list a tariff line item of $2 - people will calculate and easily know that I imported said wallet from China for <$1 and start to question why I'm charging so much.
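Spelling out that back-calculation (the 200% tariff rate below is just an assumption for illustration, not an actual figure):

```python
# Hypothetical figures from the wallet example above.
retail_price = 80.00     # what the customer pays
tariff_line_item = 2.00  # the disclosed "tariff" on the receipt
tariff_rate = 2.00       # assume a published 200% tariff on the import cost

landed_cost = tariff_line_item / tariff_rate  # back out what the wallet cost to import
implied_markup = retail_price / landed_cost

print(f"implied import cost: ${landed_cost:.2f}")     # $1.00
print(f"implied markup:      {implied_markup:.0f}x")  # 80x -- the awkward question
```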
> Tariffs are in the news and the percentages are known. If I'm selling a wallet made in China, in the US for $80, and list a tariff line item of $2 - people will calculate and easily know that I imported said wallet from China for <$1 and start to question why I'm charging so much.
If they're clever enough to do that math, they're clever enough to infer the result from the quality and fact that it's made in China. The ways of obscuring that would be to have paid more for a higher quality item made in China (make a convincingly costly product), or make it difficult to evaluate the quality in the first place.
But I know you're talking hypotheticals and all. It is maybe worth wondering whether in aggregate it'll become more transparent that the U.S economy is based on adding a negligible amount of value to anything from top to bottom.
American Manufacturing never left. Total goods manufactured in the US peaked in 2018 and 2019. It dropped during covid but has returned to those levels now.
Of course manufacturing jobs left. Replaced by automation. A much smaller number of people are making things. Americans have moved on to Services jobs (many of which are poorly paid) and Knowledge Worker jobs (many of which are highly paid.)
Even industries that are traditionally thought of as solid blue collar (Boeing, Ford etc) are producing more, but with way fewer people.
Fundamentally of course, automation is cheaper, and more consistent than human labour.
Naturally the US does not make everything. Nowhere does. Some industries resist automation. Construction and some agricultural crops spring to mind. The high cost of US labor makes these attractive to foreign labor. Mexico, for example, produces 80% of the produce that is cultivated or picked by hand.
Incidentally, foreign labor doesn't have to be performed in foreign lands - the primary industries for undocumented (and hence cheap) labor in the US are agriculture, construction, child care and so on. Things that cannot be automated.
(On the agriculture front, the major outputs are crops that can be automated, think wheat, corn, chickens, pigs etc. The major imports are things that are more labor intensive to harvest, like vegetables and flowers.)
So no, factory jobs are not coming back. Because they were replaced with robots, not foreigners. You may see local production increase though as more robots come online.
> ...replaced by automation. A much smaller number of people are making things.
Some manufacturing was replaced by automation, but most of it was not. The jobs still exist, just not in the US. Worldwide, a much larger number of people are making things.
In China, manufacturing jobs account for 29% of total employment, according to UN data reported by Our World in Data.
Nonetheless, the UN still reports that many countries have more manufacturing jobs (relative to total employment) than the US. China, if 29% is correct, has the most, but almost all European nations also have more than the US.
I used to work in a factory (I was an engineer working upstairs, never on the floor). Employment peaked in the 1950s at just over 2000 humans - today there are just over 200 to make essentially the same output. The laser cutter replaced 70 humans running saws with 3 to run the machine. The paint system is entirely automated, with only off-hours maintenance done by humans. And so on.
That is how the US makes more than ever with a much smaller share of labor.
The technical term for where you've arrived at is "mid life crisis" - albeit a bit earlier than most.
Your search for success is over. So the next step becomes the search for significance.
First though, as others have said, your "what to do next" is colored by the natural grieving process you need to go through. Accept that your emotions will likely be all over the place for a while. This isn't a time for big decisions, but rather for smaller, quieter times in which you find a new, simple routine.
Once your life has returned to (emotional) normal, then you can start the next chapter. Being significant means making a difference: to the people around you, to your environment, and so on.
Like with most things, there's no "quick path". You need to try this, try that, and see what sticks. Local charities are always a good place to start (they can use the help) but they may just be a stepping stone till you find your place. Interacting with youth in some way (school, church, clubs etc) can also be very inspirational (for you and for them.) Kids like to hear stories of success (they're still on that path) and hearing it from someone young - closer to their own age - really sticks.
Most of all, don't panic if you wake up tomorrow with nothing to do. You've got a long road ahead. You don't need to rush. Take some time to just settle down. Maybe go on a trip to somewhere peaceful.
Charities, yes; and also consider the concept of mutual aid. In-the-mix helping others, and being helped reciprocally (but not transactionally), is potentially more meaningful than helping through charitable giving, though given our dominant paradigm of capitalism, charity is certainly helpful too.
As an employer we're not just looking for someone to fill a role now, we're considering the person's potential to grow and move onto more challenging roles.
As an employee money is clearly a driving factor, but other things also impact quality of life. Working conditions, work-life-balance, annual leave, scope for advancement, company structures etc all make a difference.
Yes money matters, but if you choose only based on that you end up in places where people (co workers and managers) only care about money. YMMV.
The average tenure of a software developer is 3 years. This is mostly because of salary compression where internal raises never keep up with the market and new hires come in making more than “loyal” employees. Companies don’t invest in their employees.
I’ve seen this personally in companies that had 20 people all the way to the US’s second largest employer. Not to mention, it’s much easier to get a promotion by changing employers than going through the internal promo process.
Unless you are working for a non profit, everyone mostly cares about money. Any halfway competent interviewee knows how to fake “passion about the mission”
I'm 50 now and there are things about my company I value more than money. An equivalent position to what I have now pays 30%-40% more at AWS (been there, done that) or GCP (I've turned down recruiters and managers). But I value autonomy, my real usable unlimited PTO, having influence on the company direction, work-life balance etc.
I can make those choices. When you are young and just starting out you can’t and honestly you shouldn’t.