
This is an emotional topic, the discussion of which is nearly always evidence-free. How could one study it objectively?



I tested it back when I ran a dev team.

We were faced with constant pressure to develop customer stuff faster, and the team were doing regular 60+ hour weeks. I put in some new rules on hours, including:

- no dev worked more than 40 hours on scheduled projects. Overtime was for disaster recovery or slippage repair to meet a deadline

- if the salesperson agreed to a new deadline, then we went over their other tasks and worked out what needed to shift to make room. If they insisted on overtime, then we insisted on lieu time to pay it back, from their scheduled project time.

- if someone came in to fix a disaster, we gave them time off in lieu. If we really couldn't do the time off, we paid them overtime for those hours.

It took a few months for this to settle in, and I got to test the productivity changes (by measuring project estimates vs actuals). Productivity improved as the hours worked decreased.

The productivity gains came mostly from more realistic project estimates, but a significant part came from increased dev productivity... they just wrote better code that needed less correction, faster.
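For anyone who wants to replicate the measurement, here's a minimal sketch of the estimate-vs-actual metric; the project names and hours below are purely illustrative, not our real figures:

    # Minimal sketch: compare estimate accuracy before and after the hours policy.
    # Project names and hours are made up for illustration only.
    from statistics import mean

    projects = [
        # (name, estimated_hours, actual_hours, after_policy_change)
        ("billing-rewrite", 120, 190, False),
        ("report-export",    40,  65, False),
        ("sso-integration",  80,  95, True),
        ("audit-log",        30,  33, True),
    ]

    def overrun_ratio(rows):
        """Mean actual/estimated hours; 1.0 means estimates were spot on."""
        return mean(actual / estimate for _, estimate, actual, _ in rows)

    before = overrun_ratio([p for p in projects if not p[3]])
    after = overrun_ratio([p for p in projects if p[3]])

    print(f"overrun before policy: {before:.2f}x")
    print(f"overrun after policy:  {after:.2f}x")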

I would post the data, but it's commercially sensitive and about ten years old now.


I'd like to second this - I ran the same stats once, while contracting for a customer with excessive demands.

I found that a week of working long hours was about my max, and after that I started making mistakes to the point that the extra hours actually became counterproductive.


You can debate the claims about poisonous workplace culture, broken management, unreasonable or unsustainable expectations, or other intangibles this piece doesn't mention (like people working extra because they're angling for a promotion, or trying to learn a new job skill that will be valuable in the future) until you're blue in the face -- but the math doesn't lie.

If you are salaried and you work overtime, you are working for free. If you're on an hourly wage and you don't get any sort of overtime pay, you're working for free.


Yeah, and if you're in a startup and you have a significant equity position then you are not working for free; you are directly monetarily motivated.

Also, people act to maximize their utility, not necessarily compensation. The act of working hard and hitting deadlines is a form of compensation in itself.

Finally, lots of people work a lazy 8 hours (with lots of time spent on the phone, Facebook, personal email) and need the extra hours to accomplish 8 hours' worth of real work.


Ah! The equity position as fake compensation. Unfortunately for the worker bees, said "equity" usually turns out to be worthless in an exit, except in unicorn situations (Facebook, etc.). The founders, early investors and insider shareholders are all first in line, so most likely you end up with shit, or maybe some options that might be worth something in a few years, if the company hasn't tanked by then.

No thanks, unless I can take your company stock and flip it the same day, I'll take cash.


That's not always true.


Your last example is kind of irrelevant, because if you're honest with yourself, you're not counting the not-work hours.


That's a fair point with your qualifier. I think a lot of people are not honest with themselves about this, though.


Pay is only one reason for work.

If pay is one's only reason, long hours for no extra pay are certainly objectionable. But that's a circular argument.

I like working long hours when it's work that I want to be doing. If people are working at something they don't want to do, maybe that is the real problem. For menial jobs that no one enjoys, there may be no easy way to fix the situation, but in an industry like tech there is little excuse for accepting it.


Well, sure -- I acknowledged more intangible benefits in passing -- but the poster wanted an objective way to measure it, so I gave them one. shrug


You gave a way of measuring something else.


Most large companies will grade you based on your work, adjusting your bonus and the amount of stock you receive. That way there is at least some monetary incentive for it as well.


Working extra hours with that in mind would only be wise if you could accurately predict the reward. It would be a rude surprise if you worked extra hard all year only to hear "Sorry, no performance bonuses this year, our CEOs decided they weren't earning enough."


>How could one study it objectively?

Here's a fix, and it comes from within: Give up the ideal of objectivity. Objectivity doesn't exist.

I see this fallacy all the time from hackers. We are not merely rational creatures; we are also feeling creatures. Embrace it.


I feel that you're wrong about that.


If you think there is such a thing as objective truth in this world, you must be some sort of postmodern miracle.


I think you missed his sarcasm. He feels that objective truth exists. :)

BTW, I have found Yudkowsky's "The Simple Truth" an effective argument against those who claim that objective truth doesn't exist. If you walk into the path of a speeding train, you will die irrespective of your beliefs. Beliefs don't alter reality. http://yudkowsky.net/rational/the-simple-truth/


Hmmm... yes, let's disregard the last century of philosophical thought and the postmodernism movement as a whole because the dude who writes Harry Potter fan fic wrote a cute story.

How would you convince a Solipsist that there is an objective truth?


> How would you convince a Solipsist that there is an objective truth?

I can't. But let me quote Bertrand Russell from "An Outline of Philosophy":

Solipsism (the theory that I alone exist) is a view which is hard to refute but still harder to believe. Solipsism is not really believed even by those who think they are convinced of its truth.

A solipsist would have no objection to jumping from a cliff, would he? But he won't actually do it.


Why would a solipsist have no objection to jumping off a cliff?

Just because you believe reality doesn't exist outside your own mind doesn't mean you can't die.


That's not the point. If you really are a solipsist, why would jumping off a cliff have more probability of leading to death than not jumping (because the cliff doesn't really exist)?


Ummm... what?

All solipsism says is that you can't be sure reality exists outside of your own mind. You can question whether or not you would be dead outside of your own experience, but you'd still be dead in your internal world.


The only reason to believe that jumping off a cliff leads to death is because that's what we observe happening to other people, and we assume we are like other people. But the solipsist has no reason to believe this, as other people don't exist outside of his own mind; therefore he is not like them.


"But the solipsist has no reason to believe this as the other people don't exist outside of his own mind therefore he is not like them."

Except the solipsist is also unsure he exists outside of his own mind too, so why would he believe he is the sole exception to everything he's observed in the only reality he can be sure of?


He already believes he's the sole exception to everything he's observed by definition of being a solipsist.


No? How does him being unsure that reality exists outside his own mind mean he believes he's the sole exception?

I mean if you're unsure there is a reality outside of your own mind, wouldn't that make you more risk averse? Your death could potentially mean that everyone you've ever known/loved would cease to exist, even if they are a product of your unconscious mind...


> No? How does him being unsure that reality exists outside his own mind mean he believes he's the sole exception?

A solipsist isn't "unsure", he definitely believes reality doesn't exist outside of his own mind. That alone sets him apart from everyone and everything else in the universe, because a solipsist believes he is the only one who actually exists.


That's a pretty unusual definition of solipsism. As far as I'm aware, a solipsist believes he can't know if reality exists outside his own mind.

But we've strayed pretty far from my original point, which was that you can't disregard centuries of philosophical thought with a story that doesn't address any of the arguments of the ontological frameworks it's trying to refute. I'm not personally a solipsist, but solipsism is by definition not falsifiable, so I'm a little confused about where this conversation is supposed to go...


Considering that everyone is a solipsist these days, that's an interesting problem but it's not a philosophical problem. If someone else is a solipsist I know from experience that they're wrong, and I'm not a solipsist.


> How would you convince a Solipsist that there is an objective truth?

"I refute you!"

Joking aside, I feel Solipsism is immature. Do you know which demographic also comprises Solipsists? Two-year-olds. "Mommy doesn't exist when I close my eyes." This gives Solipsists an excuse to stop thinking critically about things like ethics, etc. So in this sense, Solipsism's just a cop-out which justifies laziness. When someone says "objective reality doesn't exist", what I really hear is "everybody's equal; you all get a trophy; we all have a right to our opinions". You might recognize this as the Red Herring Fallacy. I think Solipsism is a subtle version of this same fallacy.

Paul Graham says something similar about the subjectivity of aesthetics * : "Your mother at this point is not trying to teach you important truths about aesthetics. She's trying to get the two of you to stop bickering." I think this applies to Moral Relativism, Solipsism, and generally most philosophies which deny the existence of objectivity.

* http://www.paulgraham.com/taste.html

Post Modernism is often said to have been a reaction to Modernism. But I think it's even more important to realize that Post Modernism is a reaction to World Wars I and II. After WWII, I imagine people realized "Science and objectivity gave us cars and electricity, but they also gave us mustard gas and nuclear weapons. Maybe this whole Modernism gig isn't so great after all..."

According to Literary Post Modernism, there are many conflicting narratives rather than a single objective perspective. During the chaos of World Wars I and II, I imagine war stories naturally contradicted one another. I like the conflicting-POV aspect because it can encourage the reader to question the author's reliability, like in Edgar Allan Poe's The Tell-Tale Heart. Unfortunately, Post Modernism can also have the opposite effect: encouraging readers to quit thinking too hard and to accept the text as it is since "it's all equally valid". This negative aspect fits really well with the self-esteem movement, which (I'll say again) I think is a cop-out from thinking critically.

You may notice I'm not refuting Solipsism per se, but ulterior motives for believing Solipsism is valid. This is because going down the road of "formal proofs" probably won't yield anything convincing. We'd simply talk past each other, and in circles, and bicker over definitions. But at the end of the day, it's pretty difficult to disprove abstractions and ideologies without finding an inherent contradiction. And I'll even admit, maybe Solipsism actually is true. Who knows? I won't claim outright that Solipsism is false. But in my own experience, I find it very unlikely that Solipsism is true * . And I do want to acknowledge a possible bias for believing "Solipsism is true" due to its convenience.

* http://yudkowsky.net/rational/bayes

p.s. I do find the Harry Potter fan-fic kinda weird...


I wasn't necessarily saying that solipsism is a good philosophical framework. I personally think it's rather limiting. My point was simply that you can't refute centuries of philosophical thought with a story; you actually have to put in the effort to critically examine existing theory and address the actual arguments of the ontological frameworks you are trying to refute.

"Unfortunately, Post Modernism can also have the opposite effect: encouraging readers to quit thinking too hard and to accept the text as it is since 'it's all equally valid'"

Isn't deconstructionism sort of a core component of Postmodernism? Doesn't the deconstructionist view say that you shouldn't accept the text as it is?


That's a good argument, and it may be generalized by quoting George E. P. Box: "Remember that all models are wrong; the practical question is how wrong do they have to be to not be useful."


I occasionally reread Isaac Asimov's The Relativity of Wrong. While Yudkowsky's The Simple Truth is more rigorous, I find the brevity of Asimov's essay more refreshing. Another perennial favorite of mine is George Orwell's Politics and the English Language.

http://chem.tufts.edu/answersinscience/relativityofwrong.htm

https://www.mtholyoke.edu/acad/intrel/orwell46.htm


Great links, thanks.


There is mathematics and physics, at least. A lot of arguing can be, and has been, had here but it's largely arguing about definitions; you cannot deny that they allow you to obtain the "truth" in the sense that that truth can be used to predict and affect what happens in the real world as far as your senses can best tell you. There's no arguing away fire or public key encryption.


Yes all of those things exist and can make predictions about "reality," but that doesn't mean reality exists outside of my own experience.

http://en.wikipedia.org/wiki/Phenomenology_(philosophy)


>that doesn't mean reality exists outside of my own experience.

I am aware of phenomenology and I've read some Merleau-Ponty. To my taste, however, the situation where an objective/"objective" scientific truth makes consistent predictions about my sensory experiences that suggest a physical outside world and another where that truth makes consistent predictions about the outside world itself, which in turn causes the sensory experiences, don't seem different in a meaningful way. Why make that distinction?


I mean yeah if you don't care about understanding the nature of existence, the distinction doesn't matter. If all you're concerned with is making useful predictions, there is no point in quibbling about the subjectivity of experience.

My point is simply that just because the scientific method leads to useful predictions, that doesn't make it objective truth, nor does it invalidate the importance of subjective experience.


>if you don't care about understanding the nature of existence, the distinction doesn't matter

Okay, I think I understand your position now. (Though I'm not sure if the lack of belief in objective truth is as common as you claim.) Sorry to be so persistent, but I have to ask: suppose you do care about the nature of existence; how do you tell if it's one or the other, or which one is more likely?


I'm not sure how to answer that. What is the purpose of determining which one is more likely? I don't think that's a question that's possible to answer nor one that reveals anything about the nature of existence.


Is that statement objective or subjective?


There's a good deal of middle ground between the self-evident

>We are not merely rational creatures; we are also feeling creatures

and a thought-terminating cliche like

>Objectivity doesn't exist.

The latter doesn't seem like a helpful response to someone who is apparently asking for scientific evidence for long working hours and/or overtime doing harm to you. If anything, knowing what science says on the matter is emotionally important as it could either (if the possibility of permanent harm is limited) provide some relief to or (otherwise) justify the indignation of a person currently doing overtime and unhappy about it (and hopefully lead to action on their part). Both scenarios seem desirable to me.

Edit: Could you explain the downvotes? In case it sounded dismissive I changed "thought-stopper" to the more formal "thought-terminating cliche" above and added the ellipses for clarity.


You pissed off those who are highly invested in either side without provoking any support from the middle; your rewording is no less offensive, especially with the pointer to the pre-edit version, and your usage of parentheses necessitates an above-average working memory or rereading.


Thanks for your insight. I'm genuinely surprised if I did come off as being against both sides. I thought my comment was obviously pro-rationalist (my reasoning being that a deliberately rational person would recognize the importance of one's emotional state and use rational thought to improve it). I may have inadvertently stumbled upon a trolling strategy: in what appears to be a binary choice don't be recognizably for either.

I was also apparently too sleepy to call parentheses by their name.


I'm in favor of what comes from within, but it's important not just to repeat what one already believes.


This is a billion or trillion dollar question, and there are many academics and industry analysts studying this and publishing results. It's just not in an area that the current HN audience is particularly competent in.



