For people who like a bit more clarity, underscore also has a _.defaults syntax:
    function foo(opts) {
      var defaults = {bar: "baz", foo: "biff"};
      _.defaults(opts, defaults);
      // do things with opts here...
    }
The difference between _.defaults and _.extend is significant; _.extend will overwrite all properties in the first object with those defined in the second (and third, fourth, etc.), while _.defaults will only overwrite properties that are null or undefined.
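For illustration, here is a rough plain-JS sketch of the two behaviors (approximating underscore's documented semantics, not its actual source):

```javascript
// Rough stand-ins for _.extend and _.defaults semantics,
// written without the library so the difference is visible.
function extend(obj, source) {
  for (var key in source) {
    obj[key] = source[key]; // always overwrites
  }
  return obj;
}

function defaults(obj, source) {
  for (var key in source) {
    if (obj[key] == null) obj[key] = source[key]; // only fills null/undefined
  }
  return obj;
}

var a = {bar: "custom"};
var b = {bar: "custom"};
extend(a, {bar: "baz", foo: "biff"});   // a.bar is now "baz" (overwritten)
defaults(b, {bar: "baz", foo: "biff"}); // b.bar is still "custom"
```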
The advantage of using _.defaults is that you aren't passing a defaults object around within the body of your method and treating it as if it were the opts, or reassigning to yet another var (see `finalParams` defined in the last example). This makes the method a bit clearer.
As with _.extend, _.defaults takes 1..n arguments and will fold 2..n into 1. If you fear callers will pass in an object they rely on for function options (doubtful, but why not) you can just write:
    var actual_opts = _.defaults({}, opts, {
      // Your defaults here
    });
where ``opts`` is the options object passed in.
This also fixes the issue of a caller passing in no options (or `null` or `undefined`) without having to add an `||`.
zimbatm notes an issue with altering the object you're provided, but there's actually a bigger one with this method: if the function is called with no options or `null` or `undefined` it will blow up with a rather unhelpful message.
Unless you have a truly huge set of options, I'd suggest using
var final_opts = _.defaults({}, opts, defaults);
instead: `opts` being `null` or `undefined` will be equivalent to an empty object.
This is a good point, and unfortunately it means that most of the advantage of using _.defaults() goes by the wayside. It just becomes a matter of personal preference. Thanks for the tip.
> This is a good point, and unfortunately it means that most of the advantage of using _.defaults() goes by the wayside.
How and why? I don't consider mutating the original object to be an advantage of using _.defaults, unless the object is completely under your control and you're essentially calling `_.defaults(this, defaults)`.
and if you expect to accept multiple types of falsy values for any one argument (which requires a lot more kludge), you should probably change your API :)
You are correct, except this will not work in the significant case where any falsy value is passed ("", 0, null, NaN... I exclude undefined because that is the intended behavior) or with booleans.
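For example, with the common `||` idiom, any falsy value the caller passes is silently replaced (`setWidth` and its default are hypothetical names for illustration):

```javascript
function setWidth(width) {
  width = width || 100; // intended: default to 100 when omitted
  return width;
}

setWidth();    // 100, as intended
setWidth(0);   // also 100 -- the explicit 0 is lost
setWidth(50);  // 50
```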
A slightly more verbose, yet hardy syntax would go like this:
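The code for that comment doesn't survive here, but based on the surrounding discussion (checking explicitly for `undefined` rather than for truthiness), it was presumably something along these lines (`greet` and its default are hypothetical examples):

```javascript
function greet(name) {
  if (name === undefined) {
    name = "world"; // applied only when the argument is truly absent
  }
  return "Hello, " + name;
}

greet();    // "Hello, world"
greet("");  // "Hello, " -- the empty string is respected, unlike with ||
greet(0);   // "Hello, 0" -- falsy values pass through
```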
The assignment is hardly a performance concern in modern javascript. I imagine if we set up a jsperf test case we would find the difference negligible and the ops/s in the 10s of millions range. This is not the kind of thing that slows down JS apps.
I could see, using the if statement form, that you might have a programmer decide to compact it and drop the braces - and then you might fall into the trap where somebody thinks they're in an if block when they're really not. Happens pretty often.
In addition, I don't know if you really would want to check for null. I usually just check for undefined, which would be the absence of any provided value (as opposed to a provided value that just happens to be null). Of course, it depends on the app.
In addition, I really don't like using short-circuited boolean ops for assignment. The style is grating to me - it's not immediately obvious what it's really doing. To each his own.
EDIT: Created a simple jsperf test. Difference is about 6.8%, but we're talking hundreds of millions of ops/s. Not the kind of stuff that matters outside of raw benches.
I'm not talking about a single occurrence of the "bad" pattern. If you multiply it by its occurrences across a whole code base... well, you get the point. That's why underscore and CoffeeScript treat default assignment that way.
If you want to allow undefined to be passed as a valid non-default value for a trailing parameter, you can also use arguments.length to determine how many arguments were explicitly specified.
    var foo = function(a, b) {
      if (arguments.length < 2) {
        b = 'some default value';
      }
      // ...
    };
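One consequence of the `arguments.length` check, sketched below with the function returning `b` so the behavior is visible: an explicitly passed `undefined` still counts toward `arguments.length`, so it is kept rather than replaced.

```javascript
var applyDefault = function(a, b) {
  if (arguments.length < 2) {
    b = 'some default value'; // only when b was truly omitted
  }
  return b;
};

applyDefault(1);            // 'some default value'
applyDefault(1, undefined); // undefined -- explicitly passed, so kept
```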
I thought this was common knowledge? If you've read any JS library code you'd have seen this before. I'm not sure what the 'best' library to study is these days- I remember there was HN discussion about it and someone mentioned underscore.js since it was simple utilities.
It is certainly common knowledge in some groups, but not in others. There are plenty of people who are primarily designers who know just a little JS, as well as developers in other languages who have learned just enough JS to get by.