Default Parameters in JavaScript (designpepper.com)
26 points by joshuacc on March 19, 2013 | 26 comments



For people who like a bit more clarity, underscore also has a _.defaults method:

  function foo(opts){
    var defaults = {bar: "baz", foo: "biff"};
    _.defaults(opts, defaults);
    // do things with opts here...
  }
The difference between _.defaults and _.extend is significant; _.extend will overwrite all properties in the first object with those defined in the second (and third, fourth, etc.), while _.defaults will only overwrite properties that are null or undefined.

The advantage of using _.defaults is that you aren't passing a defaults object around within the body of your method and treating it as if it were the opts, or reassigning to yet another var (see `finalParams` defined in the article's last example). This makes the method a bit clearer.
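
To make the difference concrete (the values here are just illustrative):

    var defaults = {bar: "baz", foo: "biff"};
    var opts     = {bar: "qux"};

    // _.extend overwrites unconditionally, so argument order decides who wins:
    _.extend({}, defaults, opts);   // => {bar: "qux", foo: "biff"}
    _.extend({}, opts, defaults);   // => {bar: "baz", foo: "biff"}  (caller's value lost)

    // _.defaults never clobbers a key that is already set:
    _.defaults({}, opts, defaults); // => {bar: "qux", foo: "biff"}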


One problem though is that it modifies the opts parameter, so it might create unwanted side effects:

    var myopts = {foo: 1}
    foo(myopts)
    console.log(myopts) // => {foo: 1, bar: "baz"}


As with _.extend, _.defaults takes 1..n arguments and will fold 2..n into 1. If you fear callers will pass in an object they rely on elsewhere for the function's options (doubtful, but why not), you can just write:

    var actual_opts = _.defaults({}, opts, {
        // Your defaults here
    });
where ``opts`` is the options object passed in.

This also fixes the issue of a caller passing in no options (or `null` or `undefined`) without having to add an `||`.
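
For example (illustrative values; null and undefined source objects are simply skipped):

    _.defaults({}, null, {bar: "baz"});      // => {bar: "baz"}
    _.defaults({}, undefined, {bar: "baz"}); // => {bar: "baz"}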


zimbatm notes an issue with altering the object you're provided, but there's actually a bigger one with this method: if the function is called with no options or `null` or `undefined` it will blow up with a rather unhelpful message.
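
For example (a sketch; the exact message depends on the engine, and newer Underscore versions may simply return the null/undefined object instead of throwing):

    function foo(opts) {
      _.defaults(opts, {bar: "baz"});  // mutating form
    }
    foo(); // blows up: TypeError, e.g. "Cannot read property 'bar' of undefined"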

Unless you have a truly huge set of options, I'd suggest using

    var final_opts = _.defaults({}, opts, defaults);
instead: `opts` being `null` or `undefined` will be equivalent to an empty object.


This is a good point, and unfortunately it means that most of the advantage of using _.defaults() goes by the wayside. It just becomes a matter of personal preference. Thanks for the tip.


> This is a good point, and unfortunately it means that most of the advantage of using _.defaults() goes by the wayside.

How and why? I don't consider mutating the original object to be an advantage of using _.defaults, unless the object is completely under your control and you're essentially calling `_.defaults(this, defaults)`.
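
That pattern looks something like this (a hypothetical constructor, just to illustrate the `_.defaults(this, defaults)` case):

    function Widget(options) {
      _.extend(this, options);                          // copy the caller's options onto the instance
      _.defaults(this, {color: "red", size: "medium"}); // fill in whatever is still missing
    }

    new Widget({color: "blue"}); // => instance with color: "blue", size: "medium"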


Thanks for pointing out the _.defaults method. If I have time, I'll update the article to include that point.


for many cases the following will work well enough:

    function(arg1, arg2) {
      arg1 = arg1 || "foo";
      arg2 = arg2 || "bar";
    }
It will of course fail if you expect to accept falsy args, though; then you need a bit more kludge, one explicit check per falsy value you want to accept...

    arg1 = (!arg1 && arg1 !== 0)     ? 1234  : arg1;  // accept 0, default other falsy values
    arg1 = (!arg1 && arg1 !== false) ? "foo" : arg1;  // accept false
    arg1 = (!arg1 && arg1 !== null)  ? "foo" : arg1;  // accept null
    arg1 = (!arg1 && arg1 !== "")    ? "foo" : arg1;  // accept ""
and if you expect to accept multiple types of falsy values for any one argument (which requires a lot more kludge), you should probably change your API :)


You are correct, except this will not work in the significant case where a falsy value is passed ("", 0, false, null, NaN; I exclude undefined because falling back to the default is then the intended behavior).

A slightly more verbose, yet hardier syntax would go like this:

  function foo(arg1, arg2){
    arg1 = arg1 === undefined ? "foo" : arg1;
    arg2 = arg2 === undefined ? "bar" : arg2;
  }
This would catch the intended behavior, which is to fill a parameter when it is not supplied at all.


already was updating :)


If you don't mind, I'd like to correct the form a little. If this becomes a pattern in a big application, you are losing performance for nothing.

In fact you're just compacting an if:

    if (arg1 == null) { // check for both `undefined` and `null`
      arg1 = 'default';
    }
    // compacted to:
    arg1 = arg1 == null ? 'default' : arg1;
The difference is that you're always doing an assignment, even when it's unnecessary. A short-circuit form only assigns when needed:

    (arg1 == null) && (arg1 = 'default');


The assignment is hardly a performance concern in modern javascript. I imagine if we set up a jsperf test case we would find the difference negligible and the ops/s in the 10s of millions range. This is not the kind of thing that slows down JS apps.

I could see, using the if-statement form, that a programmer might decide to compact it and drop the braces, and then you might fall into the trap where somebody thinks they're in an if block when they're really not. Happens pretty often.

In addition, I don't know if you really would want to check for null. I usually just check for undefined, which would be the absence of any provided value (as opposed to a provided value that just happens to be null). Of course, it depends on the app.

In addition, I really don't like using short-circuited boolean ops for assignment. The style is grating to me - it's not immediately obvious what it's really doing. To each his own.

EDIT: Created a simple jsperf test. Difference is about 6.8%, but we're talking hundreds of millions of ops/s. Not the kind of stuff that matters outside of raw benches.

http://jsperf.com/ternary-vs-if-for-default-args


I'm not talking about a single occurrence of the "bad" pattern. If you multiply it by the number of occurrences in a whole code base... well, you get the point. That's why underscore and coffee-script treat default assignment that way.


Interestingly, in JSPerf, sometimes the ternary pattern is actually faster than the if statement. Must be something V8 is doing.


You may have to check for null if you want to use the default arg2 while providing an explicit arg3. I guess you can always do:

    myfn('a',undefined,'b');


Yeah. You see that sometimes in modern JS, e.g. when pretty-printing JSON:

  JSON.stringify(object, undefined, 2);


If you want to allow undefined to be passed as a valid non-default value for a trailing parameter, you can also use arguments.length to determine how many arguments were explicitly specified.

    var foo = function(a, b) {
        if (arguments.length < 2) {
            b = 'some default value';
        }
        // ...
    };


I wish jQuery's setter methods worked this way. It's too easy to accidentally turn a setter into a getter.


It should be noted that ES6 implements default parameters, so they are coming to javascript soon: http://wiki.ecmascript.org/doku.php?id=harmony:parameter_def...
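
The proposed syntax looks like this (and, in keeping with the discussion above, the default kicks in when the argument is `undefined`, not for other falsy values):

    function greet(name = "world") {
      return "Hello, " + name;
    }

    greet();          // "Hello, world"
    greet(undefined); // "Hello, world"
    greet("");        // "Hello, "  (falsy, but not undefined, so no default)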


But not reliably in browsers for quite some time.


TL;DR version:

    function(kwargs) {
      kwargs = _.extend({param1: true, param2: 42}, kwargs)
      // ...
    }


    var args = _.defaults({}, kwargs, {param1: true, param2: 42});


I thought this was common knowledge? If you've read any JS library code you'd have seen this before. I'm not sure what the 'best' library to study is these days; I remember there was an HN discussion about it and someone mentioned underscore.js, since it's mostly simple utilities.


It is certainly common knowledge in some groups, but not in others. There are plenty of people who are primarily designers who know just a little JS, as well as developers in other languages who have learned just enough JS to get by.


Library agnostic version if anyone cares for it: https://gist.github.com/jonjaques/3036701
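
For reference, a minimal library-agnostic helper might look like this (a sketch, not necessarily what the gist does):

    function defaults(opts, defs) {
      opts = opts || {};
      for (var key in defs) {
        // Only fill in keys the caller left out (or passed as undefined).
        if (defs.hasOwnProperty(key) && opts[key] === undefined) {
          opts[key] = defs[key];
        }
      }
      return opts;
    }

    // Usage:
    function foo(opts) {
      opts = defaults(opts, {bar: "baz", foo: "biff"});
      // do things with opts here...
    }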


Or you could just use CoffeeScript.



