Part of the Zen of Python is "Flat is better than nested". Until recently, I always thought of this at the code block/function level. Recently I realized that this is also important when modularizing and adding layers of abstraction. The more layers of abstraction (from foo.bar.baz import Baz) you add, the more code you write and have to "keep in your mind". This presentation by Jack Diederich at PyCon 2012 really opened my eyes to the potential problems of adding layers of abstraction via modules and classes: http://pyvideo.org/video/880/stop-writing-classes
Interesting you say that; I would argue (and have) the exact opposite. Good abstractions, by their very nature, reduce the number of moving parts you have to keep in your head at any one time. When the number of "moving parts" created by your abstractions outnumbers the actual bits they're abstracting, then you know your design took a wrong turn somewhere.
> When the number of "moving parts" created by your abstractions outnumbers the actual bits they're abstracting, then you know your design took a wrong turn somewhere.
This is an interesting point, which I agree with, and it is usually where I end up. In the past I've started from the opposite end, creating multiple modules, class hierarchies, etc. That future-proofing has made things harder for me to follow and understand when reading the code at a later date. It has become tiresome, and now I consciously start at the other end, making the simplest thing that could work. Layers of abstraction then come naturally as needed.
This is the best approach, but it is best combined with fairly comprehensive refactoring; otherwise you end up in a situation I have been in a few times, where you have new code that runs with a nice abstraction and old code that works with less abstraction.
At that point you can introduce bugs, because somebody modifies the old code in a way that would not be allowed under the new abstraction, and then the new code ends up reading data produced by the old code, leading to a cascade of failures.
I am genuinely surprised at how much others spend on software. I started totaling up his list (excluding monthly services like dropbox) and stopped when I reached $500, which was right around Cornerstone.
You've already sunk money into a computer. If you spend a little more per year on software, how much more useful does that computer become? It's a tradeoff that depends on how much money you have and how much utility you will derive.
For example, unlike Windows, there is no free and convenient svn utility on the Mac. If I were using svn on the Mac frequently, Cornerstone (which is quite expensive by my standards) would be a necessity.
Twenty years ago, computers cost $5000 and $500 worth of software was no big deal. Today, that $500 will buy you just as much of a productivity boost, so why is it no longer worth it? Don't compare the $500 to the price of your computer; compare it to your already-spent salary plus overhead.
In my view, most of us who love free software have a knee-jerk reaction to paying money for software. Try spending a little more per year on software and observe whether or not it improves your life. It did for me.
> For example, unlike Windows, there is no free and convenient svn utility on the Mac.
If you aren't command-line-phobic, svn has been included by default on OSX for a while.
However, I do agree about paying for software that improves your productivity... if you amortize it over even a year (think monthly cost vs. productivity/entertainment value), it's likely cheaper than a caffeine habit... and I usually use software for years.
Productivity is much better with a decent GUI like TortoiseSVN. Want to see the diffs for multiple files you have changed? Just double-click each. No mucking about with command-line windows and cutting and pasting filenames or what have you.
On Windows there is TortoiseSVN which makes svn more efficient. On *nix, I used the XEmacs module. Of course I know how to use the command line also and do use it for one-file commits on occasion.
If you are using SVN routinely and do not use a GUI, you are wasting your time.
I have never understood this phrase, and I see it a lot from Windows users. When I use the command line (and I spend ~12 hrs/day on it), there is no mucking. There's no frustration with it.
I'm honestly asking: what do you mean by mucking about?
Over time it isn't so bad. I'll buy a new $30-50 application maybe twice a year (and maybe a $20 upgrade another twice a year). Now, I've got a nice collection, and it never seemed like a lot of money.
> Money moving from one place to another does not always help the economy. When money flows from the corporate coffers directly into the private accounts of its executives, generally little of that money actually flows into the greater economy.
Can you give an example of how this would hurt the economy?
EDIT - those who downvoted, mind giving a reason?
Second, if we assume that a dollar in one place does not always provide the same value to the economy as a dollar in another place (which everyone seems to agree on; they just don't agree on which places provide the most value), then we must conclude that over the long term sending as much money as possible to a sub-optimal place is worse for the economy than sending it somewhere more optimal.
I think it's pretty obvious that taking money away from the rich and giving it to the poor, maybe even in the form of food stamps, has a very direct and immediate effect on GDP. So it is clearly optimal in the short run in terms of GDP, and also for helping those who really need it in times like these.
But in the longer run, some of the money needs to go into investment or society will stagnate. Some of that investment can be done by governments, but central planning isn't very good at exploring new ideas. Transparent, democratic governments can be pretty efficient in doing things that are already well known and institutionalized. For instance, European health care systems are hugely more efficient than the US one.
On the other hand, the Googles and Apples of this world are difficult to imagine as creatures of some government.
I don't know what optimal is, but the evidence is that giving more to the wealthy doesn't help much. My family is decidedly upper-upper-middle class, and I won't pretend that cutting my taxes will stimulate the economy. Extra money is going into loan repayment or savings. If my taxes went away entirely, I'd spend more, but not enough to offset the loss of revenue to the government.
I fully agree that there must be private investments, and indeed there must be wealthy people. Whenever I hear someone say "No one needs more than X dollars", I immediately know that I'm talking to an extremely naive person. However, I do think that the pendulum has swung too far in favor of the wealthy, not just in terms of taxes.
Edit: Just to be clear, dumping money into the pockets of average Joes isn't necessarily going to do anything to stimulate the economy either. The tax rebates showed that.
I didn't downvote your question. I don't believe HN lets you downvote someone who replies to you, and I don't have the karma to downvote anyway.
I don't think you would accept a concrete answer. Do you believe that every dollar in every place (federal tax revenue, corp coffers, billionaire's money market account, average joe's mortgage, food stamp, etc.) has the same value to the economy? If not, then you must agree that some of these are better places to send a dollar than others. Whether choosing a suboptimal flow "hurts" the economy is semantics.
If you think that putting another dollar in a wealthy man's account is the best use of the dollar, then I would say that history disagrees with you, as there's little evidence that "trickle down" economics work. Something like 80% of economists say it doesn't work.
Haskell is compiled. As I understand it, much of the point of Maybe is that it provides a formal language for expressing and manipulating unhandled edge cases. Then the compiler can spot those unhandled edge cases at compile time and make sure you handle them at one level of scope or another.
In Ruby the concept seems less useful because there is no compiler.
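A rough Haskell sketch of the compile-time angle (the names here are invented for illustration): the type checker will not let a Maybe Int be used where an Int is expected, so the missing-key case has to be dealt with explicitly, and GHC's -Wall will warn if a pattern match leaves it out.

    import qualified Data.Map as Map

    -- The key may be absent, so the result is wrapped in Maybe.
    findAge :: Map.Map String Int -> String -> Maybe Int
    findAge ages name = Map.lookup name ages

    -- This would be rejected at compile time: (+) wants an Int, not a Maybe Int.
    -- nextBirthday ages name = findAge ages name + 1

    nextBirthday :: Map.Map String Int -> String -> Int
    nextBirthday ages name =
      case findAge ages name of
        Just age -> age + 1
        Nothing  -> 0   -- the edge case is spelled out instead of lurking as a nil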
That's not the point; the point is that Maybe is the right way to handle what nil (or None or null) fails to handle properly. Scala gets it right too, IIRC, with Option.
1. A "null" instance of one type should not be conflated with a "null" instance of a separate type.
2. By type-wrapping in a Maybe, you declare where you need to be able to handle nulls and where you are free to ignore them (but can never pass them in). You confine the null to specific regions of your code.
3. You force your code's clients to think about the null case wherever you make it visible.
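A minimal sketch of points 2 and 3 (the function names are invented): the Maybe is unwrapped once at the boundary, and everything behind that boundary is guaranteed a real value.

    -- Core logic takes a plain String; a "null" name can never reach it.
    greet :: String -> String
    greet name = "Hello, " ++ name

    -- Boundary code is where the Maybe is visible, so this is the one place
    -- that has to decide what Nothing means.
    handleRequest :: Maybe String -> String
    handleRequest (Just name) = greet name
    handleRequest Nothing     = "Hello, anonymous"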
Maybe is really just a special case of Either, optimized for the situation where the second type is simply "failure" and carries no extra information. Either is necessary because proper type theory requires a single container type to wrap the two possibilities: every function has a return arity of one, which becomes immediately obvious if you try to write its type signature.
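One way to see that correspondence, with the conversion helpers written out by hand rather than taken from any library:

    -- Maybe a behaves like Either () a: Nothing plays Left (), Just x plays Right x.
    toEither :: Maybe a -> Either () a
    toEither Nothing  = Left ()
    toEither (Just x) = Right x

    fromEither :: Either () a -> Maybe a
    fromEither (Left _)  = Nothing
    fromEither (Right x) = Just x

The only thing Maybe gives up is the ability to say why something failed; Either's Left can carry that information.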
Maybe is still useful outside of compiled languages because it is also a way of composing and combining operations that might fail. The catch is that a great deal of its usefulness (in Haskell et al.) comes from static typing: any function which could return nil returns Maybe<SomeType> instead of SomeType, so you are forced to check against Nothing in order to have a valid program, thereby eliminating any possible NullPointerExceptions from your program. (The monadic syntax Haskell/Scala/&c provide makes this less tedious than it sounds.)
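For example, a small sketch using do-notation over Maybe (the lookup tables are invented stand-ins): if any step comes back Nothing, the rest of the chain is skipped and the whole result is Nothing.

    import qualified Data.Map as Map

    -- Two lookups, either of which might fail.
    userEmail :: Map.Map Int String -> Map.Map String String -> Int -> Maybe String
    userEmail users emails uid = do
      name <- Map.lookup uid users   -- Nothing here short-circuits the next line
      Map.lookup name emails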
However, you could still make use of the composition operations in dynamic, untyped languages, and simply use them as a toolkit for chaining together operations which could possibly return nil. You don't actually have any static guarantees like you have in a typed case, but it's possible to imagine use-cases where the combining operations are useful enough to reimplement in a dynamically typed language. I suspect that a dynamically typed language that somehow lets you use some kind of monadic notation would benefit from this (e.g. perhaps through the use of Lisp macros), but without extra language support, it probably would just be tedious and verbose (e.g. in Ruby.) I'd be happy to be proved wrong, though.
It has nothing to do with compilation. It has to do with static typing -- that, before ever evaluating something, you typecheck it. It is perfectly feasible to interpret a statically typed language -- SML/NJ for example, includes an interpreter for SML. However, statically typed languages are generally easier to fully compile than dynamic languages; SML/NJ also includes a compiler for SML.
Exactly. You can pass Maybe objects in Ruby as a convention, but the extra layers of indirection probably aren't buying you much. There is no way to enforce the contract.
Vala does something like that. If you have a method that returns type X, it always has to return an X - not null. The type of "an X or null" is indicated as X?. It makes you realize, "hey, this could be null, am I handling those?"
There is a slight advantage to Haskell/Scala/Caml's Maybe versus the nullable type system used in Vala and Groovy and so forth. If you have a hash table whose values are (e.g.) Integers, looking up a key returns a Maybe Integer (because there's no guarantee the key exists). If you want to allow null values in the hash table, then the values become Maybe Integer and looking up a key returns a Maybe (Maybe Integer), which could be Nothing (because the key was not present in the table), Just Nothing (because the key was present and a null value was stored there), or Just (Just x) where x is an Integer. As far as I know, nullable type systems don't allow Integer?? as a type. Still, it is a huge step above the nothing you're offered in other languages.
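Concretely, a small Haskell sketch (the table contents are made up):

    import qualified Data.Map as Map

    -- A table whose values are themselves optional.
    table :: Map.Map String (Maybe Integer)
    table = Map.fromList [("a", Just 1), ("b", Nothing)]

    -- Map.lookup "missing" table  ==  Nothing         (key absent)
    -- Map.lookup "b" table        ==  Just Nothing    (key present, "null" stored)
    -- Map.lookup "a" table        ==  Just (Just 1)   (key present, real value stored)

A nullable-only type system has to collapse the first two cases into one, which is exactly the ambiguity described above.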
I agree that the change will cause a lot of pain, however early adopters tend to have a high tolerance for pain. If you don't want to deal with the pain or are running a server (your EC2 mention sounds like a server), you probably want to avoid cutting edge distros like Fedora.
Check out Mondrian: http://mondrian.pentaho.com/ I used it a couple of years back at a startup. It's written in Java and IIRC works (only?) with MySQL. Mondrian took a bit of work to get set up, mostly due to my lack of OLAP knowledge at the time, but once set up, it was pretty nice and fast. I think it uses materialized views for the cube data.
I'm actually working on a new project where OLAP will be nice. Thanks for posting Brewery; I'll definitely give it a spin. OLAP tools don't get much love, but they are very useful for certain types of problems.
In all fairness, the guy was being a dick (no pun intended) to Zed. However, Zed should have kept the insults to the troll and left Github, Powerset, Engine Yard, and the Ruby community out.
The value proposition of the Ruby community has more to do with their mindshare than their technical merit.
Compare and contrast Scientology with Psychiatry. Both communities make similar value propositions: improve mental health.
Scientology is optimized to gain and keep converts and to spread like a viral meme.
Psychiatry is optimized to achieve good treatment outcomes.
Scientology makes more money despite being a less effective form of therapy.
Virality itself is adaptive, so the Ruby community thrives and survives despite ruby being a mediocre technology. Everything about ruby is optimized for gaining converts, attention, and cohesion. Hard technical merit is less important in this case than community cohesion, growth, and publicity. Flamewars bring the ruby community publicity, and this leads to growth, which leads to the survival and replication of ruby.
Erlang and C++ survive on hard technical merits. They take a different evolutionary strategy that requires less propaganda / groupthink.
Just look at the life cycle of communities as if they were a species and it all makes sense.
Scientology makes more money [than the field of psychiatry].
This seems really questionable to me. If we assume 13 psychiatrists per 100K people in 2005 ( http://answers.google.com/answers/threadview?id=523453 ), and just count Europe and the US, that's ~80K psychiatrists, and if they average 120K USD per year, that's nearly 10 billion USD. Scientology had a worldwide income of less than 400 million in 1993 ( http://webcache.googleusercontent.com/search?q=cache:jD-Xo-q... ), so it would have had to grow by 20 times to rival psychiatry circa 2005.
I think psychiatry (leaving aside everything but actual practicing psychiatrists) probably dwarfs Scientology in total income.
I don't know about the Erlang trolls, but the C++ trolls are out writing shitty internal corporate apps for Windows, leaving no source and forcing a company to use the buggy app in perpetuity.
I've been trolled by people in lots of different "communities." PHP, Postgres, Java, Perl and Lua just to name a few. IMHO if a software community doesn't have jerks and trolls, it's because nobody is using the software. Ruby has its share but I don't find it any worse than the others.
I guess he felt that others besides the troll were having a laugh at his expense (i.e., the HN Tips guy's comments, other penis-oriented repos connected to employees of the aforementioned companies) and were, if only indirectly, in on the joke.
Yep, that's what people seem to gloss over when they think I just overreacted. I knew for a fact that several Ruby people considered this hilarious, and that one or more of them worked at github. In that situation, it's either I leave silently (which everyone thinks I should have done), or bring the issue up and make sure everyone knows what's going on. I prefer the latter because it at least lets others come behind me and avoid the problem.
Thankfully, they've fixed the problem now and I don't have to worry about it anymore.
You and others who may have been trolled before and did not "make a fuss". I see many here with a bully mentality, saying that one should just keep their head down and endure the trolls' crap and they will go away. Except they won't.
That's a pretty lucid and well written account. I would like to know more about the connection between HackerNewsTips and Github though. Github employees have denied it here on HN.
We haven't shouted this from the rooftops primarily because we're tired of this being the story when it should have focused on the bullying and our addressing of the problem.
Looks interesting; however, I really dislike iTunes' "everything under the sun" approach. If you are looking for something much more lightweight for playing music, check out mpd: http://mpd.wikia.com/wiki/Music_Player_Daemon_Wiki . There is a really nice Mac mpd client called Theremin: http://theremin.sigterm.eu/