While I agree, and I fully expected it to go to March 1st instead, the core of the problem is that dates don't obey basic arithmetic identities. Treating Feb 29 as a zero value is an elegant solution in some ways.
But say 2012-02-29 + 1 year went to 2013-03-01. Then what's 2012-03-01 - 1 year? Does it always go back to March 1st, ignoring the extra day (Feb 29th, 2012) that lies in between, or does it go back exactly 365 days, landing on March 2nd, 2011?
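For what it's worth, Ruby's stdlib Date sidesteps the rollover question by clamping to the end of the month, which makes the round trip visibly non-invertible. A quick sketch:

```ruby
require 'date'

# Ruby's Date#next_year (equivalent to `>> 12`) clamps an invalid
# day-of-month to the last day of the month instead of rolling into March.
leap_day   = Date.new(2012, 2, 29)
a_year_on  = leap_day.next_year   # 2013-02-28 (clamped)
back_again = a_year_on.prev_year  # 2012-02-28, not the leap day we started with

back_again == leap_day            # false: (d + 1 year) - 1 year != d
```

So whichever convention you pick, one of the arithmetic identities has to give.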
I suspect the only "solution" is to decide that calendar dates are for human consumption only. If you're doing any calculations, you do them on timestamps, where you can declare a 'year' to be one of 365.256363004, 365.24219, or 365.259636 days (the sidereal, tropical, and anomalistic years, respectively)[1]. Given the importance of calendars and seasons, you'd probably want the middle one, since both the calendar and the seasons effectively track the tropical year. That way you can just screw the whole leap-year concept entirely.
Of course, then you're left with year-long agreements that expire at odd hours of the night.
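A minimal sketch of that timestamp-only scheme, assuming we define a 'year' as exactly 365.24219 days' worth of seconds:

```ruby
# Hypothetical fixed-length year: the tropical year, in seconds.
TROPICAL_YEAR_SECONDS = 365.24219 * 86_400

start = Time.utc(2012, 2, 29)            # midnight UTC on the leap day
a_year_later = start + TROPICAL_YEAR_SECONDS

a_year_later.strftime('%F %T')           # "2013-02-28 05:48:45" -- an odd hour of the night
```

The arithmetic identities hold perfectly; the price is exactly the one noted above, agreements that expire at 05:48 in the morning.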
Different approaches, different bugs. The seconds-from-UTC approach (a 'month' as a fixed 31 days' worth of seconds, say) results in 1 month from Aug 31st being Oct 1st, and 1 month from Jan 31st being Mar 3rd.
Well, a "month", unqualified, is not really a unit of time. To know exactly how long one is, you have to know not only which month you're talking about but, in the case of February, also which year it's in. The same goes for years, if you want them to consist of an integral number of days.
So I think as a matter of API design, Ruby has made a wrong choice by making these operations look like arithmetic on known values. It's too tempting to think that they'll obey the arithmetic identities, when they don't. If you want to have an API function "same date N months later", I have no problem with that at all; then it's much less tempting to think that it's just doing addition.
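Something like this hypothetical helper (the name and shape are my invention, not Ruby's API) would make the intent explicit instead of dressing it up as addition:

```ruby
require 'date'

# Hypothetical named operation: "same day-of-month N months later,
# clamped to the end of the target month". Built on Date#>> here,
# which already clamps, but the name no longer suggests arithmetic
# identities like (d + n) - n == d.
def same_date_months_later(date, n)
  date >> n
end

same_date_months_later(Date.new(2012, 1, 31), 1)  # 2012-02-29 (leap year, clamped)
same_date_months_later(Date.new(2011, 1, 31), 1)  # 2011-02-28 (clamped)
```

Same behavior as the operator, but a reader sees a named, lossy operation rather than something that looks like reversible arithmetic.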