Hacker News
Understanding the corporate impact (of upgrading Firefox) (kaply.com)
68 points by Isofarro on June 23, 2011 | hide | past | favorite | 93 comments



Merits of the argument aside, I was surprised to see the tone that Asa Dotzler took on that thread. To me, it sounded like "you enterprises are dumb and we don't care about you."

If I were "Community Coordinator for Firefox marketing projects," I think I would have tried to be more tactful.

I'm not trying to throw darts at him, but it may be instructional for those of us who are geeks to consider how our words can come across online.


I didn't say anything that could be remotely construed as calling enterprises dumb. I said that they have not, and in my opinion, should not, be Firefox's target.


Well, you believe what you believe, and nobody can fault you for your beliefs, in and of themselves. I happen to disagree with you, but that's not my issue with your comment... my issue is that - regardless of what you intended - your tone came off as very arrogant, dismissive and mean-spirited. And, like it or not, there is a PR aspect to consider anytime someone - who is as visible and well known as you are - says something that sparks controversy.

I think this whole thing has done some damage to the Firefox brand, and that makes me sad, as I've been a Firefox (and before that, Mozilla... going back as far as Mozilla M2, IIRC) supporter for a long time.


Actually, from reading this blog post alone, it's unclear what you're trying to say.

It becomes clear if people read the end of your previous post[1] where you say:

"For corporate deployments, there has to be a stable branch. I bet someone is probably going to make a nice business out of creating and maintaining a stable branch…"

1. http://mike.kaply.com/2011/06/21/firefox-rapid-release-proce...


Now what you actually said is: ‘if you want to provide a standardised and stable non-Windows browsing environment, screw you, sucker’.

Yes, yes, I've heard that ‘someone will go and make a business of stabilising the crap out of Firefox’ song. Well, there's one problem – there's no such product yet.


Staying on FF3.6 wouldn't make them the most out-of-date by a long shot. A majority of our site's corporate users are on IE6!

But it seems to me that part of this problem is that they're trying to accomplish two things with the browser, that for their needs might be better separated into two.

For internal corporate apps, there's probably little reason to upgrade. In particular, the security risk posed by accessing internal systems on a backlevel browser is pretty minimal. They could retain FF3.6 for use in this context.

For general web browsing (research, etc.), you really want to be as up-to-date as possible, but there shouldn't be any repercussions to doing so. If you're accessing the internal apps on your old FF3.6, then you can have a separate "external browser" that's kept current.

So my suggestion is to stick where you are for the internal apps, and separately install Chrome or IE or something for external access. Set up configurations that restrict each browser to only be able to access its intended sphere.


I can't imagine trying to get everyone in a large corporation to use the right browser for the right thing. Most normal people barely understand that IE is not the internet, let alone understanding the difference between websites, web apps and internal and external servers.

In a very small company, maybe, but not in corpo-land.


There would certainly be a training issue, but I think people would get it if their system were configured so that each browser is only able to access its intended targets. Something as simple as giving the internal-only one a different, non-standard icon might help.

Using policies in IE6, it should be easy enough to disable access to internal sites. And in FF3.6 (for the internal apps) it's probably trickier (and I don't know how to deploy the policy), but one might point it at a non-existent proxy and tell it not to use that proxy for internal sites.
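One concrete way to do the Firefox half of that (a sketch only; it assumes Firefox's AutoConfig mechanism, and `.example.com` stands in for the real intranet domain) is to lock the proxy preferences in a `mozilla.cfg` so that everything except internal hosts goes through a proxy that doesn't exist:

```js
// mozilla.cfg sketch (Firefox AutoConfig). Point the browser at a
// proxy that nothing listens on, so external requests simply fail,
// and exempt only the internal domain. ".example.com" below is a
// placeholder for the actual intranet domain.
lockPref("network.proxy.type", 1);                 // manual proxy settings
lockPref("network.proxy.http", "127.0.0.1");       // dead proxy host
lockPref("network.proxy.http_port", 9);            // discard port; nothing answers
lockPref("network.proxy.ssl", "127.0.0.1");
lockPref("network.proxy.ssl_port", 9);
lockPref("network.proxy.no_proxies_on", "localhost, .example.com");
```

lockPref keeps users from flipping the settings back; actually distributing the file is left to whatever software deployment system the company already runs.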


Wait... what IS the difference between a webapp and a website?


Buzzwords, Ajax, HTML 5


Sounds like there's a business idea here.


The reason a lot of people are still stuck on IE6 is that they wrote apps that depended on a lot of non-standard IE6-specific behavior. Apps coded to today's web standards should age much more gracefully and place fewer constraints on browser upgrades going forward.


What exactly are those apps that truly require a browser? What's the benefit of binding a program to a web browser? There has to be one, otherwise it wouldn't be done so often, but I have a hard time seeing it. What are some concrete examples?


Just a list of things I came up with after thinking for a few moments. These are all my own thoughts and I don't have much to back up these positions, except life experience (and I'm pretty young):

* small download,

* controllable data,

* instant, transparent upgrades,

* easier to get good/better demographics on usage,

* can more easily charge for usage,

* easier to get money via advertisements,

* works on Windows/Linux/Mac, because it's browser-dependent,

* fewer steps to usability:

  - go to website, sign up; vs.

  - go to website, install this program, go to program, start program. It's confusing and scary for a non-power-user.

* people feel safer going to a website than installing some rogue application on their computer (though not so with a handheld device, it would seem).

[another one I thought of while reading what I wrote, after fixing some display errors]

* A/B testing is easier to evaluate


If the reason behind the upgrade is to take advantage of a new and more performant JavaScript interpreter, you might want that used for internal apps first, if anything. It depends on whether the sticking point is security or backward compatibility.


So here's my take on the matter. This is a damned-if-you-do, damned-if-you-don't problem.

a) Firefox must release frequently to keep up with the competition. A Firefox by any other version is just the same set of changes, whether it's labeled 4.1 or 5. I don't know why they changed their numbering system (to keep up with Chrome, I guess), but it makes absolutely zero difference.

b) Corporations care about stability over progress. They want to test now, be guaranteed that it will work for, say, 10 years, and just stick with it.

And now of course

c) The web needs to progress. IE6 is the example: it's still being used, and this horse has been beaten long and hard for years. The problem is not IE6, the problem is corporations. IE6 will die, and it'll die soon (I hope), as I still have to support that heap of crap. HOWEVER, tomorrow Firefox 3.6 will be it. FF 3.6 will have its own crap and work like shit 10 years from now, and some corporation will want to use it and nothing else, just like IE7 and IE8 and actually anything.

--

Anyone who embraces Chrome is embracing a mindset: it will always be up to date, our IT must support the latest version, and we'll accept a forced restart of Chrome at some point to update. Our sites MUST WORK WITH THE LATEST CHROME.

Here we have examples of companies complaining, and I guarantee that if Mozilla stated that 3.6 would be supported forever (7 yrs?) like IE6 was, they would get the same shit MS got for IE6. Sure, not today, but in a few years it'd be "omg fucking FF 3.6 can't do shit like rounded corners right, it needs to die", followed by the FF3.6 eulogy website, etc. The only way to win this is for companies to realize that the following will be true on the web:

a) Security vulnerabilities in browsers will be discovered and patched, quickly. You must upgrade frequently to keep up.

b) The web is moving forward, even if you dislike that. Because of this, rendering engines must be updated constantly. Sorry. Adopt best practices and release frequently. It's an unfortunate cold hard truth, and it's costly, but it's the nature of dealing with the web.

c) Mozilla can't support an old browser forever. Pay them and they will. But they are not Microsoft, and even MS can't do this effectively, so if you want something for "free" (IE included here), be prepared to pay by moving in the direction they take. Otherwise, contribute cash to them to support what you want. I'm sure Mozilla will happily maintain a special FF3.6 branch, back-porting fixes for years, as long as you pay them to.


Let's play a little game here.

Assume a corporation that creates over 100 custom applications to support businesses across dozens of product divisions, and at any given time supports over 1,000 internal applications, dating back to COBOL.

The COBOL applications keep working. The ColdFusion apps keep working. The VB6 apps are supported on an XP development box, but they still work on Windows 7. People are planning the replacement applications right now.

And then there's Mozilla. A feature introduced in 6 gets deprecated in 11, or changed just enough. Suddenly, 50 applications have an immediate need to change before any security bugs are found in 10. After all, 10 is EOL, and security bugs won't be patched.

The idea of a long term release (supported X years) is a good one. Enforced bit rot and continual QA on legacy apps sucks, but having to do it every few months is untenable.

On the other hand, about 50 people see a market opportunity right about now. :)


> On the other hand, about 50 people see a market opportunity right about now. :)

Well, enterprisefox.com and enterprisebrowser.com are both already registered, so basically, yeah. :-)


I would argue that the Chrome mindset may actually be a more sustainable, easier practice. Instead of a large "certification" process, you just fix things as they break on an ongoing basis. Small changes over time seem easier to absorb than a huge change once every few years.


Not to mention they would only have one version of Chrome (the latest) to support, and not multiple versions of Firefox still installed on various desktops around the office.


Let me put my flame-suit on ...

To further distill your points, there are fundamentally two problems:

1. Maintaining backwards compatibility while providing support for new features;

2. Providing security updates without breaking compatibility.

I need my flame-suit because, if Mozilla can't solve those two problems, then they seem to be ignorant of some very basic strategies in software development.

In web-based application development lately, you see a lot of "MVC" talk. MVC is far from a new concept, but more people are starting to see the wisdom of keeping your business logic separate from your UI logic.

There's no reason why a web browser shouldn't be constructed the same way. Internally, it should be separated into distinct parts -- the network interface, the rendering engine, the user interface, etc. -- with those parts communicating only through a largely static API. Each of those parts should then endeavor to separate the security-related logic from the rest of the application logic.

In other words, there is absolutely no reason why it should be impossible to apply and maintain security fixes to multiple versions of the code base, simultaneously.

Once upon a time, programmers saw the wisdom of maintaining separate versions of their applications, and they helped their system administration brethren by signaling the impact of changes in the version numbering:

major.minor.revision

Sysadmins running a specific major.minor of an application could always feel comfortable blindly updating .revisions, which typically consisted of things like security updates. If a .revision ever broke anything at all, the developers were generally considered to have screwed up.

Sysadmins could more leisurely apply major.minor updates, after double-checking that the .minor change didn't break any of the stuff that their network relied on.

And, meanwhile, sysadmins could even more leisurely take their time in developing comprehensive testing periods and update roadmaps for major releases, so that nobody in the company would get caught with their pants down.
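The sysadmin policy described above is simple enough to state as code. A minimal sketch (Python, assuming plain major.minor.revision version strings; nothing here is specific to any real update tool):

```python
# Sketch of the update policy above: auto-apply pure .revision
# bumps within the same major.minor, hold anything bigger for testing.

def parse_version(v):
    """Split 'major.minor.revision' into a tuple of ints."""
    parts = [int(p) for p in v.split(".")]
    while len(parts) < 3:        # pad, e.g. "5" -> (5, 0, 0)
        parts.append(0)
    return tuple(parts[:3])

def safe_to_auto_update(installed, candidate):
    """True only if the candidate is a pure .revision bump."""
    cur, new = parse_version(installed), parse_version(candidate)
    return cur[:2] == new[:2] and new[2] > cur[2]

# A .revision bump (a security fix) gets applied blindly:
print(safe_to_auto_update("3.6.17", "3.6.18"))  # True
# A major jump (3.6 -> 5.0) waits for the full test cycle:
print(safe_to_auto_update("3.6.18", "5.0.0"))   # False
```

The complaint in the thread is precisely that this distinction no longer exists: when every release is a major release, `safe_to_auto_update` never returns True.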

So, not only is Mozilla committing an egregious error in not being able to apply just security updates without changing any other application functionality, but they're committing a whole other sin of long-standing software best practices by abandoning the most sensible version numbering system that was ever developed, and developed for damn good reason.

I think it's clear that Mozilla simply doesn't give a shit about their users at this point. They are either accidentally or intentionally ignorant of why these practices were developed and what problems they were intended to solve, not only in corporate markets but for home users as well. Mozilla is also digging the graves of system administrators everywhere who pushed for deployment of Mozilla applications in corporate environments and are now going to be faced with nothing less than a huge fucking maintenance nightmare -- and I promise you that nearly every corporate IT department has an administrative overlord who is much less interested in why a particular software product is technically better than in what the cost will be or how many people are going to complain.

I'm beyond furious, personally. Way, way beyond furious. I've been making similar comments in every single Mozilla-related thread since this was announced. My company was still switching our clients -- clients who work with us because they trust our judgement -- to Firefox as of last week. This new release strategy has just fucked us in the ass. As of now, we've completely halted Firefox conversions, and I have to figure out what to do next.

As far as website compatibility goes, would it really be so hard to support a simple comment tag at the top of the page that listed the browser major versions that the site had been built for, and then -- because the application was designed well so as to be capable of such magic as this -- use the appropriate rendering engine, dynamically loaded?
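The tag itself would be trivial; something like this (entirely hypothetical, with the name and syntax invented for illustration; no browser implements anything of the sort):

```html
<!-- Hypothetical tag, invented for illustration only: declare which
     browser major versions this page was built and tested against,
     so a browser shipping several engine versions could pick one. -->
<meta name="x-built-for" content="firefox=4; ie=8; chrome=12">
```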


Your proposed approach of always shipping all the rendering engines you have ever had is exactly what IE is doing.

This has two drawbacks:

1) It makes your app huge (because you have to support many wildly different modes, some of which have different architecture requirements). Clearly less of a problem if your app comes preinstalled with the OS anyway.

2) If everyone were to adopt it, it significantly raises the bar for a new entrant into the browser market: instead of just implementing the specs, it would have to reverse-engineer a bunch of legacy modes. This is actively bad for the web, and a major reason why Gecko, WebKit, and Presto are NOT taking this approach.

For the rest, I understand the situation you're in, and it's unfortunate. But your claims about how browsers _should_ be architected assume that things like the web's security model or the general feature set browsers need to support are somewhat static. They're not, and changes to them can require significant architecture changes. Basic things like the in-memory representation of the DOM are being actively changed in WebKit and Gecko every so often, as new requirements come up; I can't speak to Presto and Trident because I can't see the code. Which means that porting security fixes becomes pretty difficult, because your APIs in the old release are different, your code flow in the old release is different, and most importantly the invariants the old release maintains are different, because the specs that determine those invariants have changed in the interim.


There is no way to separate security from the other software layers. In web applications and user agents, everything is involved with security. I agree it would be great if they provided security fixes for obsolete releases, but let's not pretend it would be cheap or easy. Someone would have to pay for that; the open source community is unlikely to deal with that level of configuration management drudgery for free.


> There is no way to separate out security from the other software layers.

Are you sure that's not the underlying cause of the problem?

Security is a lot like scalability and fault tolerance: you have to plan your software architecture to support it systematically. You can't just bolt it on later.


The corporate mindset is the problem. They've got to get away from the hubris of thinking their software testing process is so special or better than the community's.

Although it's a pretty cushy assignment. I'm sure plenty of bigcorp PMs would rather do that than manage something with more complicated deliverables.


I agree. The assumption here is that the corporate IT mindset is the constant factor, and Mozilla, Chrome, or whatever, have to adapt to it.

I don't agree with that assumption. Corporate IT has to get better and faster, or they will get marginalized by cloud hosted solutions that can move faster and deliver superior experiences at better cost.

If your internal software is so crappy you can't upgrade your browser, maybe you should try and fix your internal software.


> Corporate IT has to get better and faster, or they will get marginalized by cloud hosted solutions that can move faster and deliver superior experiences at better cost.

That's easy to say when you're not responsible for a system where an hour of downtime comes with a seven-figure number in brackets on this quarter's review.

Corporate IT guys get a lot of grief about imposing picky rules and making life difficult, but the bottom line is that they are the guys who are going to take the flak if stuff breaks. Most of these policies are not there because someone at the head office wants to throw their weight around, they are there because a lot of people on the corporate network really could cause serious damage without even knowing how or why if left to do whatever they wanted.

If you want to blame someone for this cultural problem, blame the big software and infrastructure providers, who haven't yet collectively invented a robust global IT architecture where risk can reliably be localised to the user at fault. Or step up and do something about it and get very rich, because it's a real problem facing literally billions of staff every day.

Alternatively, you could take the view that cloud computing is the way forward, releasing a new browser every three months will make these sorts of problems go away, and everyone who develops in-house tools and operates a conservative IT policy is a moron with no idea how to run a successful business. Good luck with that. :-)


> That's easy to say when you're not responsible for a system where an hour of downtime comes with a seven-figure number in brackets on this quarter's review.

In particular if you have such a critical system, you should have a plan on how to upgrade, test, and deploy it.

Releasing a browser every three months (or constantly, as Chrome does) makes a lot of problems go away. And having reasonably well-written web apps helps as well - just because you're developing in-house tools doesn't mean they have to suck and only work on IE6.00.14.13 patch level 3 with the right fonts and ActiveX version installed.

Corporate IT can learn a lot from how application development works on the web, the web is the more robust IT architecture you're asking for.


> In particular if you have such a critical system, you should have a plan on how to upgrade, test, and deploy it.

Indeed, but wouldn't such a plan start with building on a firm foundation that isn't going to get EOL'd and no longer support security updates after three months?

The reality is that building on FF or Chrome now means trusting potentially critical business functionality to an outside group you can't control with a history of pushing breaking changes.

(If anyone is about to pipe up with how they only release useful changes and the community testing is sufficient to prevent regressions, please keep in mind that both Firefox and Chrome have each outright broken both cosmetic rendering details and basic functionality recently, in minor/point releases that you wouldn't normally expect to change user-observable behaviour at all, plus of course there are many widely and less widely reported compatibility issues.)

> And having reasonably-well written web apps does help as well

Don't tell that to anyone who uses Java applets and has spent significant time over the past few months working around the repeated screw-ups made by major browsers as they implemented new internal details.

(If anyone is about to pipe up with how Java applets are yesterday's technology and browser vendors don't need to care about them any more, congratulations, you are a walking example of my point.)

> Corporate IT can learn a lot from how application development works on the web,

Or web application developers could learn a lot from corporate IT, depending on your point of view. Personally, I can't remember the last time I saw a business critical corporate IT system completely fail or a major deployment of old-fashioned desktop software block a whole company, while I see the trendy style of rapid-development cause major errors with alarming frequency. That applies to everything from Reddit falling over every ten minutes to Google Docs' seeming inability to display even basic documents and spreadsheets properly on all major browsers at any given time.


> Indeed, but wouldn't such a plan start with building on a firm foundation that isn't going to get EOL'd and no longer support security updates after three months?

Why? Why does it matter if 4 is EOL when 5 is available? Instead of obsessing about having security updates, why not just expect updates. If you're willing and able to test and deploy a new browser every 3 months, what does it matter if they call it 4.2 or 5.0? Test and deploy the new browser and stop worrying so much about the version number.

> If anyone is about to pipe up with how they only release useful changes and the community testing is sufficient to prevent regressions, please keep in mind that both Firefox and Chrome have each outright broken both cosmetic rendering details and basic functionality recently, in minor/point releases that you wouldn't normally expect to change user-observable behaviour at all, plus of course there are many widely and less widely reported compatibility issues.

So don't deploy on day 0. Breaking changes can be, and sometimes are, introduced in minor version increments. If you need to test, you need to do so whether you're going from 5 to 6 or 5.1 to 5.2.


> If you're willing and able to test and deploy a new browser every 3 months

I'm not, and I don't think a lot of other people are either. The version number is irrelevant. The frequency of releases is the problem here.

> So don't deploy on day 0. Breaking changes can be, and sometimes are, introduced in minor version increments.

OK, fine, but now everything is effectively a minor version increment in Firefox and Chrome, and anyone who doesn't update within 90 days is apparently going to lose all security updates. That is not a viable combination for any users who value a stable platform they can build on more than cutting edge toys.


> I'm not, and I don't think a lot of other people are either. The version number is irrelevant. The frequency of releases is the problem here.

A lot of people very clearly are willing to update every three months. A lot of users update Firefox regularly. And for people using Chrome, they get updates even more frequently. When the upgrading is painless, no one minds doing it.

As for corporations, if they aren't willing to roll out a new browser every three months, how often are they willing to do it? And how are they dealing with browser exploits in between these long cycles?

> OK, fine, but now everything is effectively a minor version increment in Firefox and Chrome, and anyone who doesn't update within 90 days is apparently going to lose all security updates.

So how is this different? If you are not willing to update, you're not getting security updates anyway.

> That is not a viable combination for any users who value a stable platform they can build on more than cutting edge toys.

This is either hyperbolic or delusional. You tell me which. I'm using the latest version of Chrome. I also have Firefox 5 installed. They both work on every site I've visited lately. The only exceptions are some internal corp sites that only work with IE, and those clearly didn't work with FF 3.6 either (also I think those were all recently upgraded or retired so they're no longer an issue, but I'm not certain).

If you want to build stable apps, then use the stable pieces of HTML/CSS/JS/whatever. No one says you have to use the latest feature that Chrome/Firefox/IE/Safari added. You can choose to build on a stable platform without running a 2-year-old browser.


> A lot of people very clearly are willing to update every three months.

...and a lot of people very clearly are not. So now what? We say, "this doesn't work for lots of people", you counter-point that it does work for a lot of people ... both statements are true and nothing has been accomplished.

> And for people using Chrome, they get updates even more frequently.

And for people using Internet Explorer, they get updates much less frequently. Guess which browser still has the lion's share of the market? (Hint: http://getclicky.com/marketshare/global/web-browsers/)

> When the upgrading is painless, no one minds doing it.

Right, and that's the rub! Upgrading is not painless! I think you, and I, and Silhouette are in agreement here: if upgrading were painless, no one would mind doing it. The problem seems to be that for you, "painless" means, "I have to download and install it", and for us, "painless" means, "we have to answer support calls about what happened to the bookmarks menu and why X page is no longer working even though it was two weeks ago and by the way the back button looks different and I don't think I like this new version..."

> If you are not willing to update, you're not getting security updates anyway.

Nobody's objecting to security updates!

This very statement is so indicative of what the problem is here: that security updates are being conflated with application updates. Security updates are fine! Corporate IT will almost always roll out a revision change, no problem!

Microsoft's Update Tuesdays? Usually OK!

Service Packs? Let's wait!

Now Mozilla's got a huge troll face on and is saying, effectively, "Hehehe, here, have an update ... it might be a security update, it might be a service pack! Enjoy!"

> This is either hyperbolic or delusional. You tell me which.

See, now here's where I want to make this personal now.

Don't pull that shit. Just because you don't understand someone else's problem, doesn't mean their problem is insignificant. OK?


> ...and a lot of people very clearly are not. So now what? We say, "this doesn't work for lots of people", you counter-point that it does work for a lot of people ... both statements are true and nothing has been accomplished.

Now nothing. I never said there weren't a lot of people on the other side. I was responding to Silhouette's comment: "I'm not, and I don't think a lot of other people are either." He's not willing, and doesn't think many others are. That's incorrect, because indeed many others are. "I don't think a lot are" is not the same as "I think a lot are not". (Compare: "I don't think a lot of people are fans of Justin Bieber." vs "I think a lot of people are not fans of Justin Bieber." These are very different statements.)

> And for people using Internet Explorer, they get updates much less frequently. Guess which browser still has the lion's share of the market?

And? Are you advocating that Firefox should follow IE's lead? It looks like IE's share is dropping almost as quickly as Chrome's share is rising.

> Right, and that's the rub! Upgrading is not painless!

It would be a hell of a lot less painful if it didn't involve a massive rollout of new software every 18 months. Chrome's always-updating model has proven to be painless for a lot of people.

> for us, "painless" means, "we have to answer support calls about what happened to the bookmarks menu and why X page is no longer working even though it was two weeks ago and by the way the back button looks different and I don't think I like this new version..."

So use IE and be done with it, or maintain FF 3.6 indefinitely yourself. If you want your browser to remain unchanged for very long periods of time, and Mozilla won't help you with the goal, I don't see what your other options are. I really don't see how Mozilla has an obligation here.

Also, if you upgraded frequently, the changes that arrived wouldn't be quite so large. When you follow the "big bang" software rollout technique, it's a lot of changes dumped on the user at once. If you roll out smaller changes, users have less to adapt to at any given time. Maybe just send an email telling them where the bookmarks moved to.

> Nobody's objecting to security updates!

My point was that you have to go through testing for security updates as well. I concede that you're less likely to have users calling to ask about why the bookmarks moved after a security update, though.

> Now Mozilla's got a huge troll face on and is saying, effectively, "Hehehe, here, have an update ... it might be a security update, it might be a service pack! Enjoy!"

That's hardly fair. What Mozilla is saying is more along the lines of "Here's the latest and greatest." And "It's too much effort to maintain a bunch of old branches indefinitely, so we're not doing that anymore."

> See, now here's where I want to make this personal now.

You want to make it personal because you took offense to something I said to someone else? How thin-skinned are you?

> Don't pull that shit. Just because you don't understand someone else's problem, doesn't mean their problem is insignificant. OK?

I did not say his problems were insignificant, and you need to stop getting your knickers in such a twist. I was responding to a specific statement: "OK, fine, but now everything is effectively a minor version increment in Firefox and Chrome, and anyone who doesn't update within 90 days is apparently going to lose all security updates. That is not a viable combination for any users who value a stable platform they can build on more than cutting edge toys." This is indeed hyperbolic (or possibly delusional). Minor versions with new functionality very clearly is a viable combination, because it's been working for Chrome for some time now.

The idea that you need everyone on FF4 for a year so that you can have a "stable platform" for development is ridiculous. If the move to FF5 is going to break everything, then you're already screwed, and your best bet isn't to dump a year into development for FF4, but to find a way to write apps that won't break every time the browser is upgraded. You cannot stay on FF4 forever (I hope), so at best you can postpone the problem and probably make it much harder to resolve in a year.


If you are using Java applets, then yes, migrate away. That's a technology that has effectively been deprecated for at least 5 years. And yes, my point is: move faster.

Regarding stability: yeah sure. Amazon never works, Google.com is always down, ebay doesn't render in half of the browsers. On top of that, all the owners of the millions of websites struggle every day to keep up with all the browsers. Clearly, the web is nonviable and should be shut down.


> If you are using Java applets, then yes, migrate away. That's a technology that's effectively deprecated since at least 5 years.

That's simply not true, though it is very revealing that you think it is.

In fact, every browser has made major developments in Java applet support far more recently than that, Java itself has evolved significantly in that time, and of course there are now several rather advanced programming languages that compile down to Java bytecode and the applet mechanism lets you use those languages in developing web-based tools.

Whether or not you personally know anything about it, Java applets are widely used in several significant industries, often for configuration of networked hardware or UIs for in-house tools. The global investment in this technology is probably rather large, and you are basically asking that everyone who has spent time and money on the technology should throw it away because your pet browser can't manage a point release without breaking it? Well, sorry, but other people's browsers can, and you just look like your quality control is broken with that argument.

> And yes, my point is: move faster.

Why? Quite a few of these tools just work, and need little if any maintenance, because the systems behind them also just work and are still in use. You are arguing that people should completely rewrite systems that are working fine and have no current issues, so your browser can advance into untold new territory without bothering about backward compatibility. Your reality check is about to bounce.

> Regarding stability: yeah sure. Amazon never works, Google.com is always down, ebay doesn't render in half of the browsers.

You do realise that both Amazon and Google have had major outages recently, taking down huge numbers of sites that rely on their service infrastructure, right?

In fact, contrary to all the advocacy about how cloud-based computing is less risky than doing things in-house because the Internet giants have redundant this and scalable that and on-call the other guy, the record so far is pretty poor.

As far as their front-line services, of course the real giants manage to keep something up pretty much all of the time, but even then Amazon has suffered from several obvious bugs in recent months, which must be hitting them in the budget for both the PR and customer support departments.

Google aren't much better. Their main site works fine, but for example one client of mine uses Google Docs and I'm not sure we have ever managed to get a meeting together when one of the team's systems didn't have basic access or drawing bugs while trying to use it. They fix one thing and break another. So do Facebook. Don't even get me started on smaller sites like Reddit falling over every ten minutes (remind me again what their infrastructure is and how often they've attributed total system failure to that outsourced infrastructure).

So yes, I stand by my comments on stability, and I think you have a very rose-tinted view of the reliability of big name sites such as those you mentioned. The facts just don't support your argument.


Certainly corporate IT guys sometimes take the flak when stuff breaks, but the really good* corporate IT guys manage to turn major breakage into a justification for a bigger budget.

"...blame the big software and infrastructure providers, who haven't yet collectively invented a robust global IT architecture where risk can reliably be localised to the user at fault."

So many of the problems of building a robust global IT architecture have already been solved in the Internet itself. When Big Enterprise resists using the Internet as it's designed, as a loosely coupled system with rules (RFCs, etc.), it ends up with the sorts of browser-testing problems the OP is talking about.

"Or step up and do something about it and get very rich..."

That's the best thing in your comment. I would have upvoted you just for that. If my previous thesis has any value, that a big problem with corporate IT is attitude, how do we fix it? Send everyone to reeducation camps in the gulag? I can't think of how to profit from that, though it might be fun to watch :)

*Depending on the interpretation of "good."


I get the feeling you're mocking me. :-) But I'll play along...

> So many of the problems of building a robust global IT architecture have already been solved in the Internet itself.

Ironically, I think the Internet is a fine example of what I mean. Before the Internet, no-one could DDoS a whole business out of existence, because business models weren't vulnerable to that sort of instant mass attack. Before the Internet, you didn't get rapidly evolving viruses that could sneak into your corporate networks because one user clicked the wrong Facebook link during their lunch break and cause you weeks of grief to remove because they could propagate quickly throughout a whole section of your LAN. Before the Internet, one executive doing something dumb didn't result in every confidential report to which that executive had access being uploaded to someone who would extort money from you to keep it quiet. Before the Internet, your database front-ends were in-house, and one error in a front-end didn't leave you vulnerable to having all your customers' credit card details stolen.

Does this mean the Internet is a bad thing? Of course not, it's overwhelmingly positive on balance. But it presents many fine examples of what can happen if the damage a single rogue element can cause isn't effectively contained.

> If my previous thesis has any value, that a big problem with corporate IT is attitude, how do we fix it?

Exactly. You are assuming that the problem here is the attitude of corporate IT groups, and not their choice of tools. I am suggesting that Firefox and Chrome are making themselves into unsuitable tools and the responsible thing for corporate IT to do is to not use them, presumably going back to using a stable IE version and Microsoft's well-developed management tools instead.


I didn't mean to mock you and I really did like your comment about how do we profit from this. That really is the important question for HNers. That and "how do we fix it."


> Corporate IT has to get better and faster, or they will get marginalized by cloud hosted solutions that can move faster and deliver superior experiences at better cost.

Who says the cloud-hosted solution isn't more expensive? Who says "experience" is the goal? A company cares about making money, and an internal browser-based app is simply a way to make money more efficiently.

Often, the "corporate mindset" means cheap. If you don't need to worry about upgrading your browser, you don't need to blow money on developers just to keep up with cutting edge browser design and you can focus on developing technology that actually makes you more money.

For example, suppose you have a system that involves scanning physical documents into an image repository, having individual workers review and annotate every single document (using a browser-based app), and then manually making corresponding changes to the customer database. You are paid under an SLA-based contract, based on the rate and number of images you are able to process in a billing period.

Where would you rather invest your money? Keeping the crusty annotation browser-app up to date with the latest shiny new browser changes, or implementing OCR to increase the speed and efficiency you can process documents?


Hubris has nothing to do with it. It's just the reality in companies with existing revenue streams.

If you have hundreds of agents using a web-based front-end to process loan applications, failure of the front-end app due to an unexpected change in the browser means an instant, measurable, and substantial loss in productivity and revenue.


The corporate mindset is the problem. They've got to get away from the hubris of thinking their software testing process is so special or better than the community's.

I don't entirely disagree with your sentiment, but there IS a difference that makes "their testing process" special... it's relative to their environment. Simply put, if there is a critical business app that XCorp is running, and it's closed-source FooApplication from MegaVendorCorp, then "the community" is unlikely to have tested their new browser against FooApplication. So XCorp would be foolish to roll out a browser without additional internal testing to verify that it works with FooApplication.

And that's only the most trivial example... when you start talking about the interactions between different apps and different systems (all the varied SSO systems in existence, for example), you can easily have a case where an organization's internal environment will result in failures that would never be caught by the external testing of the browser vendor (and this is true whether it's a F/OSS browser, or something from Microsoft, Apple, Opera, etc.)

Internal IT testing and certification isn't just a way for some bureaucrats to justify their positions... it really does serve an important purpose.

That said, none of this is meant to argue against the notion that corporate IT departments need to improve their processes to become more agile, and more able to adapt quickly.


To counter a point that has been made several times here: it's not just corporations which operate a "conservative" IT policy: many SMEs do so as well. Innovations are great at home but at work, they confuse users, break things which have worked forever, and represent an unknown quantity in support terms. I've already decided not to put FF4 in front of my users because of the interface changes (I know I can restore the address bar, etc, but I don't want to spend my time doing that for each user).

The Firefox team are solving problems most users don't know they have, i.e. I rarely hear complaints that "the address bar is confusing", or "the status-bar is in the way", but frequently "I went to Google Image search and it's trying to install something on my computer". Time spent solving the second problem is far more valuable to me than time spent on the first (non-) problem.


So? Let's all get stuck with FF 3.6 like IE6?

Please no. Mozilla shouldn't slow down the release cycle just because big corporations can't keep up with it.

Alternatively, what would help those big customers to keep up would be the opposite of what they require. Instead of having infrequent "big" releases every few months, having continuous updates ala Chrome, where each increment is almost compatible with the next one, could help to mitigate the problem. There will be no big breakages and everyone who wants to keep up will be able to when the time comes.


> Mozilla shouldn't slow down the release cycle just because big corporations can't keep up with it.

No-one can keep up with it. Not big corporations. Not people who make web sites and web apps. Not even people who make extensions for the browser itself.

The only people who can keep up are users who ticked the "update me whenever you feel like it" box. They only benefit in practice if the people providing the content can take advantage of any new functionality. Otherwise, it's just a no-win/lose proposition, a significant risk of breaking stuff that used to work for no real benefit.


It's a vast improvement for web developers who have to develop for 1 version of Chrome versus 4 versions of IE. It's a vast improvement for web developers who get to use new functionality to deliver better experiences to their users. And it's a vast improvement for users who get a faster, smoother, more featureful web experience.

The only people it sucks for are large corporations who for some reason have applications tied to 10-year-old technology, and extension developers. It's a shame, but certainly not worth holding back the web for.


> It's a vast improvement for web developers who have to develop for 1 version of Chrome versus 4 versions of IE. It's a vast improvement for web developers who get to use new functionality to deliver better experiences to their users. And it's a vast improvement for users who get a faster, smoother, more featureful web experience.

It would be a vast improvement if any of those things were actually true, but unfortunately none of them is.

Web developers aren't developing for one version of Chrome or Firefox now, they're developing for one version every three months. That is far worse than supporting, say, IE8 and IE9. (Please can we drop the IE6 nonsense now? The remaining market share is negligible and the arguments are academic outside of niche markets.)

Web developers can't use the new features if they can't keep up with them, and in any case they are typically not portable, robust or future-proof enough for production work when first released.

That means users don't get a more featureful experience. If anything, they get less, because trendy web sites use trendy CSS3 features and the like to save time, and those features don't work for a very substantial proportion of the web-surfing public where the older but tried-and-tested techniques did.


> Please can we drop the IE6 nonsense now? The remaining market share is negligible and the arguments are academic outside of niche markets

When last I checked in with some former co-workers, JPMChase still used IE6. It's a relevant problem.


The list of fatal regressions both Chrome and Firefox have introduced is pretty close to zero; developing a new version of a web app every 3 months for both browsers is beyond obtuse to the point of outright lying.

Web developers aren't as utterly hopeless as you seem to think: my Gmail supports drag-and-drop attachments, my IRC client supports WebSockets when available, my task tracker works offline (and on a mobile), my maps support geolocation.

And I personally don't have to support IE6, but a lot of people still have to. I haven't heard of anyone who has to support Chrome v1.


> The list of fatal regressions both Chrome and Firefox have introduced is pretty close to zero

Well, if they've only introduced a few here and there, I suppose that's OK then. </sarcasm>

> developing a new version of a web app every 3 months for both browsers is beyond obtuse to the point of outright lying.

It's a good thing that's not what I said then, I suppose.

> And I personally don't have to support IE6, but a lot of people still have to.

A few people still have to, I'm sure, but the market share has dropped like a stone over the past year or two. For most projects that aren't in-house jobs in the organisations that haven't managed to leave it behind yet, it is no longer relevant.


It works great right up until Chrome has a bug that prevents anyone from using your internal app.


The problem I see with this complaint is what "End of Life" means in corporate IT. What it boils down to in practical terms is that the vendor is no longer available for support calls.

We continue to run many EOL apps because of these kinds of issues -- we know the apps are stable for our usage of them, and it is a low risk that we would have needed vendor support anyway. We make an explicit decision to accept that risk.

So this is technical impact, but really it is a risk management issue.


The FF4 EOL announcement has a bit more impact than that. It has been announced that FF4 will not receive security updates. This means that if you deploy FF4 in your enterprise, you'll not only be committing to software without support, you'll have internet-facing software that contains published vulnerabilities (as they appear). That's just unacceptable.

Browser vendors would do well to adopt a release strategy that is similar to Ubuntu's model. You have a churn of releases with a periodic LTS release that will receive security patches for a longer term.

The key here is understanding that there is some middle ground. The "release often, period" method of software development is attractive to a lot of us individually, but it's nearly impossible to plan for when you manage hundreds of PCs per tech support staff member. Providing a consistent LTS release schedule would at least give corporations the ability to plan.
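To make the Ubuntu-style idea concrete, here is a minimal sketch of what such a schedule could look like for a browser. The six-week cadence, the every-eighth-release LTS designation, the support windows, and the version numbers are all illustrative assumptions for the sketch, not anything Mozilla has announced:

```python
from datetime import date, timedelta

# Illustrative assumptions only: a release every 6 weeks, every 8th release
# designated LTS, regular releases supported until the next release ships,
# LTS releases supported for a full year. None of these numbers are real
# Mozilla policy; they just show how an Ubuntu-style cadence could work.
RELEASE_INTERVAL = timedelta(weeks=6)
LTS_EVERY = 8
REGULAR_SUPPORT = RELEASE_INTERVAL        # dropped when the next release ships
LTS_SUPPORT = timedelta(weeks=52)

def schedule(first_release: date, count: int):
    """Yield (version, release_date, is_lts, end_of_support) tuples."""
    for n in range(count):
        released = first_release + n * RELEASE_INTERVAL
        is_lts = n % LTS_EVERY == 0
        eos = released + (LTS_SUPPORT if is_lts else REGULAR_SUPPORT)
        yield (5 + n, released, is_lts, eos)  # hypothetical version numbers

if __name__ == "__main__":
    for version, released, is_lts, eos in schedule(date(2011, 6, 21), 10):
        tag = "LTS" if is_lts else "   "
        print(f"Firefox {version} {tag} released {released}, supported until {eos}")
```

The point of the sketch is that an IT department could then plan around the LTS rows alone, while everyone else rides the regular train.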


Corporations don't want LTS. They want RidiculouslyLTS. Ubuntu's LTS releases are supported for 3 years. IE 6 is now almost 10 years old. This is what corporations want. They want to install software and forget about it for a decade. This is not a realistic model for a browser. Even 3 years is not very realistic. 3 years means running Firefox 3.0 now, in June 2011. Sure, some people do it, but it's not a great idea for many reasons.

When you're looking at Microsoft Word, expecting long-term support makes sense. It's shrink-wrapped software and upgrading is a big ordeal. You have to upgrade everyone at once, or someone's going to (repeatedly and consistently) send out docx files to coworkers who cannot open them. It can take a lot of time and money to upgrade everyone. Browsers are (or should be) different. The web is continually evolving. New exploits pop up fairly often. Regular upgrades are a necessity for security and functionality, so upgrading a browser should be a minor, easy thing, and it should happen frequently. The problem is not the lack of support from Mozilla, but the expectation from corporations that browsers should be treated like word processors.


"Corporations don't want LTS. They want RidiculouslyLTS."

Sure they do, but they also want 100% tax exemption, employees that will work 80 hours for 40 hours pay, and $0 fire and theft insurance. Companies want lots of stuff they don't get :)

My point is that it's not black and white unless Mozilla chooses to make it that way. Right now, it's white. For Mozilla to say: "Infinite incremental release is the way forward" is to say, "We don't care about your ability to plan releases in any way shape or form." That's pretty black & white. Any commitment to a release schedule would, at least, give them the ability to plan.

I agree that web browsers are not synonymous with word processing software, but the expectation that large corporations will simultaneously adopt web-based technologies while throwing out decades of change-management process is just naivety at its finest.

Large organizations benefit from economies of scale. This means reduced costs. This means one IT manager per five hundred PCs. This means you can't roll out software willy-nilly and expect your organization to stand. Any in-roads Mozilla gained with corporations will be quickly squandered if they stick to this strategy.

Again, I agree that the old pattern is not what we need, but this is upheaval, and that's not what works at corporations.


What's interesting is that in the past the approach corporations have taken to browsers is that they wait for the browser to be released, then test it in their environment to see whether it broke anything.

This is as opposed to testing the beta to see whether it's breaking anything and if so report a bug to the browser vendor so the final release _won't_ break anything.

Just making that one change to their rollout process would significantly reduce the breakage potential of updates....


Exactly, companies want a lot of stuff, but that doesn't mean they should get it.

> My point is that it's not black and white unless Mozilla chooses to make it that way. Right now, it's white. For Mozilla to say: "Infinite incremental release is the way forward" is to say, "We don't care about your ability to plan releases in any way shape or form." That's pretty black & white. Any commitment to a release schedule would, at least, give them the ability to plan.

Asking for a release timeline and asking for 3 years of support on a dead branch are entirely separate things. It's not unreasonable to ask that Mozilla keep users informed of the release schedule (don't they already do this?), but asking them to support an old version for 3 years seems silly. It takes a lot of manpower and there's no value in it for probably 95% of their customers.

> I agree that web browsers are not synonymous with word processing software, but the expectation that large corporations will simultaneously adopt web-based technologies while throwing out decades of change-management process is just naivety at its finest.

I would say it's practical, not naive. If you want a browser that allows you to stagnate while enjoying long-term support, then either start paying someone for support, or start using IE. You agree that web browsers and word processors are different, so why should Mozilla bend over backwards to allow corporations to try to treat them the same?

> Large organizations benefit from economies of scale. This means reduced costs. This means one IT manager per five hundred PCs. This means you can't roll out software willy-nilly and expect your organization to stand. Any in-roads Mozilla gained with corporations will be quickly squandered if they stick to this strategy.

Sure, so don't roll out willy-nilly. Test and then roll out, just as they do now. But instead of rolling out 4.1, roll out 5.0. I don't understand what the big issue is. I'm sure it's a bit more work if there's a breaking change, but frankly you're going to have to deal with those breaking changes eventually unless you're going to stick with the same browser for years (let's call this the IE6 model). You just deal with them incrementally rather than in one massive painful push. Frankly the incremental approach sounds less painful for everyone involved.

We're talking about corporations demanding that a free product meet their antiquated needs. There's no grounds for the demand. It reflects their broken rollout policies, not a deficiency on the part of Mozilla. Maybe if it takes you several months to approve a browser rollout, that's your real problem. And let's be honest, a lot of this is just baseless fear. We're talking about a change in the way version numbers are incremented. People are worrying because it's called 5.0 instead of 4.1. The number attached doesn't really matter.

> Again, I agree that the old pattern is not what we need, but this is upheaval, and that's not what works at corporations.

What else would work? For Mozilla to dedicate a lot of extra resources to maintaining dead branches for years? Not only does this not help move corporations forward, it doesn't help Mozilla, and it doesn't help the web.

If the current change management models are not working, then they need to be fixed. Test and rollout faster and more frequently. Roll out in stages, rather than with the "big bang" model. Have a roll-back plan and system in place. Really, the way people talk, it's as if every week the browser is going to break half the web. I hardly think that's the case, and I frankly think Mozilla's testing is probably more thorough than most any corporate IT department.
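For what it's worth, the staged-rollout idea can be sketched in a few lines. This is a hypothetical illustration (the wave names, cutoffs, and hostnames are all made up): machines are assigned deterministically to rollout waves, so each browser update hits a small canary group first, then a larger group, then the whole fleet, with time to watch for breakage and roll back between waves.

```python
import hashlib

# Hypothetical wave sizes: 5% canary, next 20% early, remaining 75% broad.
WAVES = [("canary", 0.05), ("early", 0.25), ("broad", 1.00)]

def wave_for(hostname: str) -> str:
    """Map a hostname to a rollout wave via a stable hash in [0, 1)."""
    digest = hashlib.sha256(hostname.encode()).digest()
    bucket = int.from_bytes(digest[:8], "big") / 2**64
    for name, cutoff in WAVES:
        if bucket < cutoff:
            return name
    return WAVES[-1][0]

if __name__ == "__main__":
    fleet = [f"pc{n:04d}.corp.example" for n in range(500)]
    counts = {}
    for host in fleet:
        counts[wave_for(host)] = counts.get(wave_for(host), 0) + 1
    print(counts)  # roughly 5% canary, 20% early, 75% broad
```

Because the assignment is a pure function of the hostname, every update lands on the same machines first, which makes regressions reproducible and rollback decisions cheap.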


I don't entirely disagree with most of what you said, but I think you underestimate the challenges. I own a small start-up, so I know how nice it is to operate at a small scale. I let my employees chose their own browser, but I require that they keep it up to date. I can do that because I don't have to worry about supporting 500 people.

I've also consulted for very large corporations. The same disruptive ideology that applies to start-ups falls flat with large corporations. The biggest mistake is to take a unilateral position with them. They will fall back to IE. They will stagnate on old versions if we don't work with them. Is that what you want?

Maybe your product can avoid the corporate space altogether, but there are a lot of us who rely on them. I don't want to re-live IE6. I want browser makers to acknowledge that changing the corporate world isn't the same as creating Google or Facebook. It's far less glamorous and usually means doing shit in a way that you don't like for longer than you like. Change is coming, but it's coming slowly.

I hate line-item rebuttals, because they just turn in to pissing matches, but I did want to offer some clarification on these two points.

> Asking for a release timeline and asking for 3 years of support on a dead branch are entirely separate things. It's not unreasonable to ask that Mozilla keep users informed of the release schedule (don't they already do this?), but asking them to support an old version for 3 years seems silly. It takes a lot of manpower and there's no value in it for probably 95% of their customers.

Let me clarify: I don't think 3 years is a viable LTS schedule for browsers. I don't know what the time frame is, but I'm sure it's not 3 years, and I'm sure it's not "We release continuously." I should have been more clear about what "similar" means when referencing Ubuntu. By similar, I mean that they should continue their march forward, but that there should be an occasional LTS release that is maintained for a longer, more planned interval.

> I would say it's practical, not naive. If you want a browser that allows you to stagnate while enjoying long-term support, then either start paying someone for support, or start using IE. You agree that web browsers and word processors are different, so why should Mozilla bend over backwards to allow corporations to try to treat them the same?

I'm saying it's naive (showing a lack of experience, wisdom, or judgment) to expect that this new development cycle won't have ramifications. Mozilla is free to do what they wish with their product. What I'm claiming is that it will come at the cost of market share in the corporate world. If they don't care about the market share, then go ahead and walk on it.


> I've also consulted for very large corporations. The same disruptive ideology that applies to start-ups falls flat with large corporations. The biggest mistake is to take a unilateral position with them. They will fall back to IE. They will stagnate on old versions if we don't work with them. Is that what you want?

What I want is for corp IT departments to not pretend as if this is somehow an insurmountable change. Yeah, corporations are slow and risk-averse. How risky is it to upgrade the browser once per quarter? Are they not testing and upgrading now for the sake of security fixes?

> Let me clarify: I don't think 3 years is a viable LTS schedule for browsers. I don't know what the time frame is, but I'm sure it's not 3 years, and I'm sure it's not "We release continuously." I should have been more clear about what "similar" means when referencing Ubuntu. By similar, I mean that they should continue their march forward, but that there should be an occasional LTS release that is maintained for a longer, more planned interval.

Okay, I can understand that. I can see the value in dropping a version every 9 months with a guarantee of 1 year of security fixes. Or maybe drops every 6 months that get 9 months of support.

I think if Mozilla is committed to backwards-compatibility moving forward, though, this shouldn't be necessary. If they can't avoid breaking changes for some reason, then it is an issue. (One exception would be breaking changes in beta/bleeding-edge features. If you build your product on unstable CSS or whatever, that's your burden. And if you built your product on features that were only around for 3 months, then you should be able to fix your product quickly.)

> I'm saying it's naive (showing a lack of experience, wisdom, or judgment) to expect that this new development cycle won't have ramifications. Mozilla is free to do what they wish with their product. What I'm claiming is that it will come at the cost of market share in the corporate world. If they don't care about the market share, then go ahead and walk on it.

Thanks for the definition of naive. :) I never said it wouldn't have ramifications. I actually said the opposite. Corporate IT shops need to find a way to move more quickly. They need to change. Running everything on a multi-year upgrade cycle is ridiculous, and borderline incompetent (though the incompetence might be coming from higher up than the IT guys).


> Corporate IT shops need to find a way to move more quickly. They need to change.

That is easy to say, and difficult to do.

The simplest way I can think of to explain why is that every single move, or change, comes with a nonzero cost-per-employee. While it may be possible to reduce that cost, nobody has yet invented a way to eliminate it.

When you have 5 employees, a small change that costs X-per-employee is not a big deal.

When you have a few thousand...


> That is easy to say, and difficult to do.

No doubt. That doesn't mean it shouldn't happen.


> IE 6 is now almost 10 years old. This is what corporations want.

Straw man. The only people I see bringing IE6 up in this discussion are those trying to create a false dichotomy between the now-trimonthly release schedules of Firefox and Chrome and something bad.

> They want to install software and forget about it for a decade. This is not a realistic model for a browser.

Well, if you want browsers to be taken seriously as a platform for running software applications and not just viewing static sites with occasional interactive features, then browsers had better become that long-lived.

Do you have any idea how much investment Microsoft makes in keeping each new version of Windows backward compatible with software and drivers from the last version (or, more likely, the last several versions)? There is no way they would be the dominant force on the desktop today if they had gone around releasing point updates to Windows every few months and arbitrarily breaking other people's software that was built on their platform.

Even your three year figure is telling. If Windows 7 hadn't been able to run existing applications written in the XP era, Microsoft would probably be toast today, having blown two big OS releases in a row and lost the confidence of the developer market. Three years is within the typical lifespan of a single PC in an office environment, particularly these days when IT departments are seriously questioning the value they get from spending time and money on the upgrade roundabout and paying for all these subscription plans. Everything else that came with the PC still runs three years later and still lets people get their jobs done. Why should browsers be anything special?

Your final paragraph is just outright denial. Microsoft can and do patch security and compatibility bugs in Word all the time. They just manage to do it without changing the UI, introducing new file formats while obsoleting ones that were fine just six months ago, and breaking a couple of key features that they don't care about because only 5% of users need them.


> Straw man. The only people I see bringing IE6 up in this discussion are those trying to create a false dichotomy between the now-trimonthly release schedules of Firefox and Chrome and something bad.

It's not a strawman. Why do you think IE6 is still around? It's not for the grandmothers who don't know how to run Windows Update. It's for corporations who demand that Microsoft support an ancient product because they're too lazy/cheap/incompetent to upgrade their internal apps to work on modern browsers.

Corporations want indefinite support on every piece of software they use (and why not, I guess). The question becomes how long does it make sense for vendors to support a product. When you're talking about a product that needs to be upgraded frequently to continue to do its job and to maintain security, I'd say the answer is "not very long".

> Well, if you want browsers to be taken seriously as a platform for running software applications and not just viewing static sites with occasional interactive features, then browsers had better become that long-lived.

You're the one creating a false dichotomy here, pretending that the only options are to build for a stagnant browser or to not build at all. The browser is not the platform, and that's what you don't get. The web is the platform. If you built a web app for Firefox 4, it's going to work in Firefox 5. The platform hasn't changed. It's gained some new features perhaps, but the original functionality is still there.

> Do you have any idea how much investment Microsoft makes in keeping each new version of Windows backward compatible with software and drivers from the last version (or, more likely, the last several versions)? There is no way they would be the dominant force on the desktop today if they had gone around releasing point updates to Windows every few months and arbitrarily breaking other people's software that was built on their platform.

An interesting point of comparison would be to note that Firefox 5 is backwards-compatible with Firefox 4. Unless you choose to use bleeding-edge beta functionality, your sites will still work just fine. Mozilla (and Google, and Apple, and Microsoft) work really hard to maintain backwards-compatibility in their browsers. You're pretending that upgrading to FF5 means that everything written for older browsers will stop working. We both know that's not true.

> Even your three year figure is telling. If Windows 7 hadn't been able to run existing applications written in the XP era, Microsoft would probably be toast today, having blown two big OS releases in a row and lost the confidence of the developer market. Three years is within the typical lifespan of a single PC in an office environment, particularly these days when IT departments are seriously questioning the value they get from spending time and money on the upgrade roundabout and paying for all these subscription plans. Everything else that came with the PC still runs three years later and still lets people get their jobs done. Why should browsers be anything special?

They aren't special. Firefox 5 will run all the same sites as Firefox 3.0, and it will run a lot more as well.

> Your final paragraph is just outright denial. Microsoft can and do patch security and compatibility bugs in Word all the time.

Of course they patch bugs. I never said they didn't. That doesn't mean the word-processor model makes sense for browsers. I'm using Microsoft Office 2008. Should I also be using Firefox 3.0? I haven't upgraded Office because it costs money. I have upgraded Firefox because it's free and easy. Browsers are no longer shrink-wrap software, and treating them as if they are is ridiculous.

> They just manage to do it without changing the UI, introducing new file formats while obsoleting ones that were fine just six months ago, and breaking a couple of key features that they don't care about because only 5% of users need them.

Can you point out what FF5 has done that's so terrible and backwards-incompatible?

Edit: I guess there are some add-on incompatibilities, and that is something that they need to address moving forward. Frequent releases are fine, but they should maintain compatibility.


> Why do you think IE6 is still around?

I really don't, at least not on the scale it was a couple of years ago. I expect there are a handful of organisations who are still stuck in IE6 land, but it's no longer the limiting factor it used to be.

> When you're talking about a product that needs to be upgraded frequently to continue to do its job and to maintain security, I'd say the answer is "not very long".

I guess I lack sympathy because, ironically, the vendors are still using old and underpowered tools and casual processes for their development, which is why we need so many security patches. Usually, those tools start with a C and end with the word "compiler". Often, the development processes are trendy, "Agile" things that emphasize prototyping and pushing rapid updates over clear specs and systematic designs, and that treat techniques like unit testing as some sort of divine correctness guarantee instead of a back-up to good basic development practices.

Well, you reap what you sow. If you don't invest in tools and processes that can build robust software that requires little maintenance, I'm going to bitch if you don't invest in supporting your existing software either.

We've had programming languages much better suited to application development than C and its derivatives for many years now, and there is really no excuse for still writing things like networking software using a model that has a terrible risk/benefit ratio for that kind of project.

It's also a common fallacy that writing code significantly less buggy than what we put up with today needs heavyweight processes that cost more. There are many people in many industries who do it every day and have the metrics to prove it.

> Unless you choose to use bleeding-edge beta functionality, your sites will still work just fine.

Sure, but unless people are going to use that bleeding edge technology, why push out new versions of the browser with the corresponding functionality changes? It would be far safer to push only such security patches and bug fixes in existing functionality as are necessary, and make functional changes at a pace that allows for the whole industry to develop sustainably and actually take advantage of the improvements.

> Can you point out what FF5 has done that's so terrible and backwards-incompatible?

Well, so far it seems to have broken quite a few plug-ins and it looks like some of the typography engine bugs are back again, but I've only been using it for two days so I haven't yet tried it with most of the projects I'm involved with.

In any case, shipping one version without breaking anything is nothing to be proud of. You have to ship every version without breaking anything, and neither Firefox nor Chrome has a good track record in that department.


I really wish you had your email address in your profile.


Sorry, I like my pseudonymity shield in this sort of discussion. It makes conflicts of interest far less likely, given that I have a number of professional interests and possibly not everyone I work with would like my honest views on this subject. (I don't think that matters as long as the work I do for them is my best effort at what they're paying me for, but they might disagree.)

Should I ask why you wanted my e-mail address...?


Because I'm seriously considering devoting some of my meager resources to forking and long-term maintaining the Firefox 3.6 and 4-5 versions, and wanted to get some non-public feedback on that from someone with the same problems I have.


Well, the nice thing about a pseudonym is I can tell you the same things here that I'd say privately. :-)

FYI in case you didn't know, it looks like they are planning to maintain the 3.x branch for a while, but 5 is being treated as the next step from 4 so they won't be maintaining a 4.x series independently.

Also FWIW, some of the specific problems encountered by projects I've worked on actually crept into the later 3.x series, notably the ones involving text rendering problems and the ones involving Java applets.

So I guess that's a vote from me for "probably more valuable uses for that much of your time", but I suppose YMMV depending on what specific issues you've encountered.


> I really don't, at least not on the scale it was a couple of years ago. I expect there are a handful of organisations who are still stuck in IE6 land, but it's no longer the limiting factor it used to be.

And we can all be thankful for that.

> I guess I lack sympathy because, ironically, the vendors are still using old and underpowered tools and casual processes for their development, which is why we need so many security patches. Usually, those tools start with a C and end with the word "compiler". Often, the development processes are trendy, "Agile" things that emphasize prototyping and pushing rapid updates over clear specs and systematic designs, and that treat techniques like unit testing as some sort of divine correctness guarantee instead of a back-up to good basic development practices.

I suppose you could use HotJava or Lobo if you want a browser not written in C(++). Really, if you think that Firefox is poorly written and managed, why not move to something else? I feel like software companies are always under fire for using outdated techniques and poor quality control and whatever else, and yet these complaints seem so often to come from people who haven't produced anything on the scale of the software they criticize.

> Well, you reap what you sow. If you don't invest in tools and processes that can build robust software that requires little maintenance, I'm going to bitch if you don't invest in supporting your existing software either.

What projects have you worked on that had 5MM lines of code? Considering how complex a modern browser is, I'm not sure the bug rate is especially high. How many people use Firefox every day and rarely encounter bugs? Seems pretty robust to me.

> We've had programming languages much better suited to application development than C and its derivatives for many years now, and there is really no excuse for still writing things like networking software using a model that has a terrible risk/benefit ratio for that kind of project.

I'm sure you'll be looking into Lobo, then.

> It's also a common fallacy that writing code significantly less buggy than what we put up with today needs heavyweight processes that cost more. There are many people in many industries who do it every day and have the metrics to prove it.

I'm not really sure why you're going down this path, since I don't think I ever said anything about this, but I'll bite. What industries are producing significantly less-buggy code at the same (or similar) cost?

> Sure, but unless people are going to use that bleeding edge technology, why push out new versions of the browser with the corresponding functionality changes? It would be far safer to push only such security patches and bug fixes in existing functionality as are necessary, and make functional changes at a pace that allows for the whole industry to develop sustainably and actually take advantage of the improvements.

Safer? Sure. More expensive? That, too. Maintaining multiple branches is not free. Mozilla has been playing that game for a long time, and I think they're sick of it. I doubt their decision was made indiscriminately.

> Well, so far it seems to have broken quite a few plug-ins and it looks like some of the typography engine bugs are back again, but I've only been using it for two days so I haven't yet tried it with most of the projects I'm involved with.

> In any case, shipping one version without breaking anything is nothing to be proud of. You have to ship every version without breaking anything, and neither Firefox nor Chrome has a good track record in that department.

If they can't ship without introducing functional bugs, then I agree they have issues they need to work out. The plugin thing isn't really acceptable, either. If you're going to have other people build on your product, you absolutely need to support them.


> I feel like software companies are always under fire for using outdated techniques and poor quality control and whatever else, and yet these complaints seem so often to come from people who haven't produced anything on the scale of the software they criticize. [...] What projects have you worked on that had 5MM lines of code?

I think you know that I'm not going to answer that when I'm posting under a pseudonym, but I've worked on some reasonably large-scale projects. I doubt anything I've written ultimately has as many end users as Firefox, but I'm definitely in both the millions-of-lines club and the millions-of-users club, probably multiple times by now, and I've worked on projects where faults can cause Very Bad Things to happen.

> I'm not really sure why you're going down this path, since I don't think I ever said anything about this, but I'll bite. What industries are producing significantly less-buggy code at the same (or similar) cost?

My point is that if you're going to write software with a huge cross-section for malware to attack, such as a web browser, and you choose to write it using decades-old technology, where it is difficult to write it securely, then it's hard to take seriously any complaints about how hard it is to maintain that code and keep up with security patches.

As for other industries, obvious ones would be defence, aerospace, finance, and infrastructure management. When was the last time you saw a plane fall out of the sky because of software failure, or your electricity supply cut off because the management software for the national grid crashed? (There are also oddities like TeX, but then that was written by Donald Knuth, so it's hardly representative.)

Software developers in these industries typically work to different standards than those developing consumer products, and yes, the up-front costs are typically significantly higher. The thing is, by the time you account for the time it takes to fix bugs post-release compared to fixing it at an earlier stage in development, the difference in the total effort isn't such a high factor. And at that stage, you still haven't counted the benefit to society (and to your PR) of not having perhaps millions of users disrupted by each serious bug.
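The claim above, that higher up-front costs can wash out once post-release bug-fixing is counted, can be made concrete with a back-of-envelope model. All the numbers below are assumptions chosen for illustration, not measurements from any real project:

```python
# Illustrative cost model (all figures are made-up assumptions): compare
# total effort when a heavier process costs more up front but leaves an
# order of magnitude fewer escaped defects to fix after release.

def total_effort(upfront_months, defects, months_per_defect_fix=0.25):
    """Up-front development effort plus post-release bug-fixing effort."""
    return upfront_months + defects * months_per_defect_fix

# "Normal" process: 3 months up front, say 40 defects escape to release.
normal = total_effort(upfront_months=3, defects=40)

# Heavier process: double the up-front cost, ten times fewer defects.
heavy = total_effort(upfront_months=6, defects=4)

print(normal)  # 13.0 total months
print(heavy)   # 7.0 total months
```

Under these (entirely hypothetical) inputs the heavier process wins on lifetime effort despite costing twice as much before release; the real argument is over what the input numbers actually are.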

The bottom line is that as an industry, we are mostly stuck in the past, but we are there only because of inertia and non-technical factors, not because of engineering merit. As I noted before, this is a rather ironic argument in a discussion about trying to move people onto newer and better browser technologies, but quite apt: it shows how little progress we make if all the better technologies live in their own little worlds.

> If you're going to have other people build on your product, you absolutely need to support them.

Exactly. I think that's the key point that I and a few other people in this thread have been trying to make.


> I think you know that I'm not going to answer that when I'm posting under a pseudonym, but I've worked on some reasonably large-scale projects. I doubt anything I've written ultimately has as many end users as Firefox, but I'm definitely in both the millions-of-lines club and the millions-of-users club, probably multiple times by now, and I've worked on projects where faults can cause Very Bad Things to happen.

Fair enough, but I stand by what I said. Most of the people who criticize large software projects have never worked on a similar scale. Nor have most of them worked on projects with similar performance needs. It's all fine and dandy to criticize the choice to use C++ when you've never had to write a large performant program (not that this necessarily applies to you).

> My point is that if you're going to write software with a huge cross-section for malware to attack, such as a web browser, and you choose to write it using decades-old technology, where it is difficult to write it securely, then it's hard to take seriously any complaints about how hard it is to maintain that code and keep up with security patches.

So what's the alternative? To write it in Java, so it's measurably slower and feels even slower than it measures? To write it in C# so it only runs reliably on Windows? To write it in Smalltalk (yeah, right)? To write it in Haskell so that only a handful of people have the skills to do the job? Oh, and let's not forget that all of those have major pieces written in C that would still be attack vectors.

> As for other industries, obvious ones would be defence, aerospace, finance, and infrastructure management. When was the last time you saw a plane fall out of the sky because of software failure, or your electricity supply cut off because the management software for the national grid crashed? (There are also oddities like TeX, but then that was written by Donald Knuth, so it's hardly representative.)

Oh, come on. I've worked in the Defense industry. The processes are heavier and I dispute that the quality of the resultant code is better. It's also a ton of C. If most of it were attached to the Internet, people would find holes all over it. Defense software most certainly has bugs, sometimes serious ones. The aerospace industry has similar issues. Tons of C and C++. If you're talking about NASA, let's not forget that Spirit landed on Mars and went into fault mode due to software problems. Finance? Seriously? How about the fact that a few months ago the DJIA lost a trillion dollars in value for half an hour due to a software malfunction?

On the other hand, the space shuttle software was a massive success, with relatively few bugs. It was also written in an extremely heavy-weight process that cost many times more per line than typical software development. And that's okay, because it's the space shuttle, but it's not realistic to expect this of normal software, and it's incorrect to claim that it doesn't cost more. It also has far fewer lines of code than most commercial aircraft systems, because NASA knows that simpler is generally safer. Fewer features mean less complexity, but people don't want a simple browser, because that means NCSA Mosaic.

A major difference between most of the things you mentioned and Firefox is that the things you listed are not constantly exposed to the Internet. The bugs in Firefox that matter are security bugs that don't affect Airbus 380s because A380s aren't browsing random Internet sites. The machines that control the electric grid don't have wide open ports that random attackers can reach. The attack surface of Firefox is huge, and that's why more security bugs are discovered there than in, say, Visa's payment centers. The high-reliability examples are also whole systems, not single programs. When Firefox fails, you see it and curse at Mozilla. When Visa's software fails, you probably never see it, because you get rerouted to a different machine. When an Airbus has a machine failure, the backup kicks in, or worst case the pilots go manual. That's redundancy that Firefox cannot reasonably provide. (Also, when Visa's software messes up, you just reswipe the card and never realize what actually happened.)

> Software developers in these industries typically work to different standards than those developing consumer products, and yes, the up-front costs are typically significantly higher. The thing is, by the time you account for the time it takes to fix bugs post-release compared to fixing it at an earlier stage in development, the difference in the total effort isn't such a high factor. And at that stage, you still haven't counted the benefit to society (and to your PR) of not having perhaps millions of users disrupted by each serious bug.

I'm going to disagree here. For one, I think you're just handwaving when you claim that the cost to fix bugs balances out the cost of a much heavier process. I don't think that's true, and I think the value proposition for most consumer software makes this clearly a bad idea. For another, you're comparing software that does one relatively unchanging thing against software that is constantly growing. Yeah, the electric grid is pretty reliable. It's also done the same job for decades. Firefox hasn't even existed for a decade (unless you count Netscape Navigator, which is iffy because so much was rewritten). Spending twice as long to build something to get a little more reliability can be a pretty good tradeoff when you'll use it nearly unchanged for 20 years. It's a pretty bad idea when you're going to be actively developing the next version as soon as the current one drops.

> The bottom line is that as an industry, we are mostly stuck in the past, but we are there only because of inertia and non-technical factors, not because of engineering merit. As I noted before, this is a rather ironic argument in a discussion about trying to move people onto newer and better browser technologies, but quite apt: it shows how little progress we make if all the better technologies live in their own little worlds.

I don't think we're stuck in the past. The industries you mentioned use the same languages and often older ones. I think our engineering and processes are much better than they were a decade or two ago.

> Exactly. I think that's the key point that I and a few other people in this thread have been trying to make.

And I don't dispute that at all. My point is that frequent drops and reliable support are not mutually exclusive. Whether Firefox will deliver both is a good question, but it certainly could at least in theory.


FWIW, I've spent a substantial chunk of my career working with C++, including on heavily mathematical code where C++'s performance did make it a suitable tool for the job. That taught me that advocates of other languages sometimes exaggerate their performance claims.

It also taught me that most software that "needs to be written in C++ for performance reasons" really doesn't. Most of the old arguments against VM-based languages, so-called scripting languages, and functional languages have been out of date for years now as technologies like JIT compilation have matured. As the coming generations of processors introduce more parallel processing power but don't speed up sequential execution much more, languages that provide a natural model for expressing parallel algorithms or for auto-parallelisation behind the scenes will become even more advantageous.

You mentioned some of the aerospace projects but you automatically said that it wasn't realistic to expect similar quality from "normal software". That is the mindset that I think needs to change in our industry. For example, every study I've ever read about Cleanroom Software Engineering has shown that its up-front development costs (both time and money) are typically within a factor of 2 or 3 of a "normal" software development process (often much closer) and while it doesn't completely eliminate bugs you do consistently see bug rates at least an order of magnitude lower for Cleanroom jobs. Over the lifetime of most projects, the total development+bug fixing time is certainly in the same ballpark at that point. There is no evidence that such code is any less maintainable in the face of changing requirements than code developed using any other process (on the contrary, the systematic and carefully documented structure gives you a great start) and the morale of the development team is usually noticeably higher (because they are spending most of their time building interesting stuff instead of fixing bugs that should never have been there).

Elsewhere, there is telecoms control software in the world, written in Erlang, that has been operating continuously for years with a few seconds of downtime in total since it went live. That's some absurd number of 9s of reliability, because the software architecture is fundamentally designed to be fault tolerant.

I guess I'm just trying to say that we shouldn't assume today's routine commercial practice is the most efficient, reliable way of doing things. We know, beyond any reasonable doubt, that it isn't. As an industry, we allow ourselves to be held back by non-technical issues like the availability of ready-trained staff, because development groups are too tight to provide training to improve their people's skills, and by preconceptions that say languages or development processes or software design principles that aren't today's mainstream must be too hard for everyday development tasks outside of niches where quality really matters.

I'd have a lot more sympathy for development groups that struggle to maintain shipping code in the face of evolving security threats and the like if those groups didn't shoot themselves in the foot, stick a noose around their necks and then tie their hands behind their back before they kicked the stool away.


> It also taught me that most software that "needs to be written in C++ for performance reasons" really doesn't.

Most software doesn't need to be written in C++ for the simple reason that most software doesn't actually need to be highly performant. Beyond that, many applications can be made sufficiently performant in other languages. However, I'm not convinced that for software such as a browser, where performance is key to the success of emerging web technologies, anything but C/C++ is really going to do the job. There's a lot of evidence that various languages are almost as fast as C or C++, but rarely as fast. That "almost" can add up in complex programs, especially when you're talking about programs that have to run other programs (i.e. Javascript). If you disagree, what language do you believe would be appropriate for building a cross-platform web browser?

> You mentioned some of the aerospace projects but you automatically said that it wasn't realistic to expect similar quality from "normal software". That is the mindset that I think needs to change in our industry. For example, every study I've ever read about Cleanroom Software Engineering has shown that its up-front development costs (both time and money) are typically within a factor of 2 or 3 of a "normal" software development process (often much closer) and while it doesn't completely eliminate bugs you do consistently see bug rates at least an order of magnitude lower for Cleanroom jobs. Over the lifetime of most projects, the total development+bug fixing time is certainly in the same ballpark at that point. There is no evidence that such code is any less maintainable in the face of changing requirements than code developed using any other process (on the contrary, the systematic and carefully documented structure gives you a great start) and the morale of the development team is usually noticeably higher (because they are spending most of their time building interesting stuff instead of fixing bugs that should never have been there).

That's not really what I said. Expecting commercial products to use the same heavyweight process that aerospace uses is not realistic. It is not reasonable to expect Mozilla to spend 6 to 9 months to implement the same features that Microsoft or Google develops in 3. This is a strategy for losing the entire market. If the quality bar needs to be raised, it must be done more efficiently than by adopting a "defense-grade" process.

Sure, when a code failure causes a missile to detonate inside a fighter jet, you can afford the additional cost (and why not, Uncle Sam is footing that bill). But when a code failure results in a browser restart, you can't justify the extra development effort. For much lower effort you can build a system that says "oops", saves state, and restarts right back to where the user was. Indeed, I think all the major browsers do that now, though I'm not sure, because I can't recall the last time I actually had a browser crash on the desktop.

I also disagree with your assertion that developers are happier on teams that practice heavyweight development processes. I've never heard anything but the opposite. Spending hour upon hour writing and refining specs is hell. It is not coding, and most coders don't like doing it to excess. Experienced coders should also know that a lot of it is wasted time, because inevitably half of the assumptions made turn out to be wrong. Some planning is a good thing. Heavyweight processes are something else entirely. Extra specs and documentation might help reduce bugs, but probably not nearly so much as more/better testing.

> Elsewhere, there is telecoms control software in the world, written in Erlang, that has been operating continuously for years with a few seconds of downtime in total since it went live. That's some absurd number of 9s of reliability, because the software architecture is fundamentally designed to be fault tolerant.

This is a false comparison. I could point out that Firefox has been running (some version, somewhere) continuously for almost a decade, and that would almost be more reasonable. At least then we'd be comparing a bunch of machines to a bunch of machines. Telecom software is not magically bug-free. Quite the opposite, languages like Erlang are designed with failure in mind. Machines fail. Networking devices fail. Antennas fail. Software fails. Rather than asserting that the solution is to write better software, Erlang is an acknowledgement that the solution is a better system. The software isn't better in the sense of having fewer bugs or using a DoD development process. It simply expects failure. When Erlang encounters an error, it can retry the operation, restart the process, or move to another machine. Firefox will retry, it will restart if necessary, and it is adding things like process separation for crash-prone plugins. These things are a net gain for users, but the reality is that faults still happen, and the gains in quality here do not come from fewer bugs but from better response to those bugs.
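The retry/restart/escalate behaviour described above can be sketched in a few lines. This is not real Erlang/OTP and certainly not Mozilla's code; it's a minimal Python illustration of the supervision idea, and the names (`supervise`, `WorkerCrashed`, `make_flaky_worker`) are all invented for the example:

```python
# Rough sketch, in Python rather than Erlang/OTP, of the "expect failure"
# idea: the quality gain comes not from having fewer bugs but from a
# defined response to each fault -- retry/restart, then escalate.

class WorkerCrashed(Exception):
    pass

def make_flaky_worker(fail_times):
    """Return a task that crashes on its first `fail_times` invocations."""
    calls = {"n": 0}
    def task():
        calls["n"] += 1
        if calls["n"] <= fail_times:
            raise WorkerCrashed("transient fault")
        return "ok"
    return task

def supervise(task, max_restarts=3):
    """Run the task; on a crash, restart it, up to max_restarts times."""
    for _ in range(max_restarts + 1):
        try:
            return task()
        except WorkerCrashed:
            continue  # "let it crash", then restart the worker
    raise RuntimeError("restart limit exceeded; escalate to parent supervisor")

print(supervise(make_flaky_worker(fail_times=2)))  # prints "ok" after two restarts
```

Real OTP supervisors add restart-rate limits, supervision trees, and distribution across machines, but the shape is the same: failure is a normal, handled event rather than a bug to be engineered out entirely.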

> I guess I'm just trying to say that we shouldn't assume today's routine commercial practice is the most efficient, reliable way of doing things. We know, beyond any reasonable doubt, that it isn't. As an industry, we allow ourselves to be held back by non-technical issues like the availability of ready-trained staff, because development groups are too tight to provide training to improve their people's skills, and by preconceptions that say languages or development processes or software design principles that aren't today's mainstream must be too hard for everyday development tasks outside of niches where quality really matters.

I've never said that today's practices are the most efficient, but going backwards in time to adopt DoD-style development is a move in the wrong direction. Spending 3 times as long on a given development effort may yield some small quality gains, but the overall effect is a loss of value. A product that has 10% fewer meaningful bugs but has 66% fewer necessary features is a failure. The move forwards is to build in more fault tolerance. When you run Firefox, it uses some fault-tolerant techniques already, and will hopefully add more. When you execute a search on Google, it uses fault-tolerant techniques. These are the same techniques that other high-reliability industries use. Stuff still breaks, but they recover. Our industry is not trailing the state of the art here. We're doing the same things. The state of the art can always improve, but that doesn't mean that we are doing a bad job now.

The security of Microsoft's Windows line went way up in the wake of all the XP exploits. It was certainly not a move towards heavy-weight development process that made these gains. In fact everything I've heard has indicated that Microsoft has moved the other direction toward "agile" development. What changed was the focus. When security is top priority, it unsurprisingly gets better. If you want fewer bugs, hire more great testers and have them work closely with developers. Give your developers security training. Hire security experts. Use the best tools you can get. And adopt a culture that places bug-fixing first on the list. But don't saddle your developers with an antiquated development system, and especially don't try to say that this system is somehow leading-edge when the industry has already tried it (numerous times).

> I'd have a lot more sympathy for development groups that struggle to maintain shipping code in the face of evolving security threats and the like if those groups didn't shoot themselves in the foot, stick a noose around their necks and then tie their hands behind their back before they kicked the stool away.

Meh. This is just being hateful.


Our posts are getting very long going point-by-point, so I won't address everything you've said individually now. However...

On the efficiency issue, IMHO you're still too focussed on the up-front cost, when it is the long term efficiency that really counts. It doesn't matter if it takes 6 months to develop something properly instead of 3 months to hack it, if the hackers then spend another 6 months patching bugs before anyone can use it.

Even if that weren't true, do we really believe that a tri-monthly release cycle is necessary to compete in the browser market today? It takes longer than that for new technologies to be used in production projects, even by the most die-hard of bleeding-edge early adopters. Wouldn't you rather code to a well-defined spec and know that it really was going to work in everyone's browser, even if it took another three months for your favourite new feature to be well-specified so you could use it? We've waited years for things like HTML5 and CSS3, so I think we could wait another few weeks to get them right!

Regarding the morale of developers, again, I think you're confusing heavyweight processes generally with my example of Cleanroom in particular. There are plenty of studies that show developers do enjoy working within that process; try a Google Scholar search for Cleanroom Software Engineering and read a few of the papers. Likewise, you also seem to be assuming that Cleanroom is heavy in the sense of being all about writing specs and management overhead, which suggests to me that you've never actually tried it to see what it feels like in practice.

This is exactly the sort of preconception and "I just don't believe it" reaction I think we need to overcome with evidence if our industry is going to improve its performance. Bizarrely, we seem to be spending far more time today worrying about issues like TDD and pair programming, which have much less proven benefit and get highly variable feedback from those who have used them.

I can't help but observe that those ideas are very accessible to a typical journeyman programmer versed in OO languages today, while shifting to something like a Cleanroom process, an Erlang-style software architecture, or the more declarative style that functional languages promote requires understanding concepts that are radical and unfamiliar to most people. We fear what we do not understand, and that is holding us back.

A final example on this:

> Extra specs and documentation might help reduce bugs, but probably not nearly so much as more/better testing.

On the contrary, working to good specs with a robust peer review culture is a highly effective quality strategy with an excellent RoI, and more than stands up to any test-based approach I've seen. The evidence for this is overwhelming, if you choose to seek it out. Again, I refer you to Google Scholar, or a good bookshop. But again, you have to be willing to look at a culture shift and trying something with a completely different philosophy to what you do today. You typically won't find this stuff in Cleany McCoder's Internet Echo Blog.

> Spending 3 times as long on a given development effort may yield some small quality gains, but the overall effect is a loss of value. A product that has 10% fewer meaningful bugs but has 66% fewer necessary features is a failure.

Perhaps, but what if it's more like a 10% overhead to better than halve the number of bugs (as in one small academic comparison I just quickly looked up on Google Scholar)? What if a factor of 2-3x in the up-front development costs really cuts your bug rate from 20 bugs/KLOC not to 18 bugs/KLOC but right down to about 1 bug/KLOC (not unusual for Cleanroom with established teams in industrial practice), with all the resulting savings in maintenance costs later as well as the user-visible improvement in quality?
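To put that back-of-envelope comparison in concrete terms, here is a toy lifetime-cost model. This is purely a sketch with hypothetical numbers (arbitrary "effort points", a 50 KLOC project, a made-up per-bug fix cost), not data from any study, but it shows how a 2.5x up-front cost can still win once maintenance is counted:

```python
def total_cost(dev_cost, kloc, bugs_per_kloc, cost_per_bug):
    # Toy lifetime-cost model: up-front development plus downstream bug-fixing.
    return dev_cost + kloc * bugs_per_kloc * cost_per_bug

# All numbers hypothetical, chosen only to restate the comparison above.
fast_and_loose = total_cost(dev_cost=1000, kloc=50, bugs_per_kloc=20, cost_per_bug=2)
careful = total_cost(dev_cost=2500, kloc=50, bugs_per_kloc=1, cost_per_bug=2)

print(fast_and_loose)  # 3000
print(careful)         # 2600
```

On these (invented) figures the "careful" process is cheaper overall despite costing 2.5x as much up front, and the gap only widens as the per-bug cost grows.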

I'm picking Cleanroom in particular here just because I happen to know a little about it, but my point isn't that we should all use Cleanroom. My real point is that there are different processes and design techniques and languages and tools out there, some of them very different to typical industrial practice today, and some of them objectively performing much better.

If any software group (I'm not just digging at Firefox, it's an industry-wide problem) wants to complain about maintenance costs and how it's impossible to do much better on quality without silly overheads, I think they should take a look outside the mainstream with an open mind before they make too many bold claims. In the meantime, claims about how it's too hard to both release quickly and maintain basic security patching on other branches (and here I am digging at Firefox's new release strategy) all sound a bit hollow.


For the process efficiency, I simply disagree with your assessment. I'm not aware of any unbiased works that indicate that using a heavyweight process results in even slightly higher quality without significantly slowing time to market. In fact it seems that there are almost no unbiased works in these areas. It's difficult to evaluate processes in general, because you're actually comparing teams. Most of the results I've seen have come from someone pushing their process, whether it's agile, or heavyweight, or whatever. There is a lot of anecdotal evidence that indicates faster iteration yields higher quality and more features, though. It's telling that so many of the industry leaders have moved that direction.

Yes, I think browsers need frequent releases in order to be competitive. I don't know if 3 months is the magic number, but it beats the pants off 1 year. Early standards proposals are refined inside the browsers that ship them first. This is one of the reasons HTML has been so successful, and it's what HTML5 has tried to continue: standardize what works. The opposite, where a committee pushes out a standard and everyone then tries to lamely implement it, is a broken process. Early shipping is a prerequisite for well-formed standards. Shipping the first beta standards also gives a browser a competitive edge, because they can then argue in favor of their particular flavor of implementation becoming the standard, and they already have that implemented. Late shipping loses developer mindshare. "Oh, I guess I have to install Chrome to try the latest CSS widgetmagicwaffles."

I'll look into the Cleanroom process you're describing, but so far you've made it sound unattractive. It seems like you've been describing a very heavyweight development process. As for TDD and pair programming, I haven't said anything about those. Pair programming is completely orthogonal to process weight, and I think TDD is overhyped. I feel its primary benefit is that it gets a bunch of unit tests written that help avoid regressions.

The problem with working with good specs is that they take a long time to get done, and they inevitably have mistakes that have to be corrected during dev, often major mistakes. While you're building this massive spec, all your engineers are sitting idle, or going brain dead doing spec reviews. I'm in favor of high-level specs, and the larger the project, the more specific the spec, but I don't for a moment believe that a complete spec is a worthwhile thing. Let's be honest, this is the waterfall model: gather requirements -> build insanely large spec -> implement to spec -> verify implementation -> maintain

I'll look into Cleanroom, because it's not something I'm very familiar with, but I have my doubts. The "complete spec" model is extremely expensive. It makes sense for avionics. It makes a lot less sense for desktop software.


Well, you're obviously not going to take my word for it, nor would I want you to, so please do go and read up on some of those other ideas. You're quite right that it takes more up-front work in a process like Cleanroom, but some of that can be repaid in practice in quicker testing cycles before shipping and in keeping delays to fix bugs fewer and shorter. But as I said before, please don't get too attached to Cleanroom or any particular figures I've mentioned. They're just examples, to try to demonstrate that some popular assumptions don't always stack up when faced with the facts.


If you don't learn from history, what are you planning on learning from?

Debian solved this sort of problem by forking Firefox to produce Iceweasel, which is a stable FF version plus security patches.

There's nothing to prevent someone from doing the same for a Windows build of Firefox. There might even be a plausible business plan in providing support and updates. The releases would be free, but you can charge a fee for integrating and writing new patches, plus .msi installer scripts, plus answering questions. Cygnus was doing this for a bunch of GNU projects up until their merger with Red Hat.


"Education programs, documentation updates, communications all are planned."

Couldn't you just fire all employees who need education programs and documentation updates for a browser update?

I know it sounds harsh and will get me downvotes, but think about it?

Maybe there are legit employees who are unable to grasp a new browser. But maybe for them a full blown browser is the wrong solution anyway. Perhaps somebody should step in and create dummy browsers that only have a couple of buttons for the limited set of activities those users are capable of performing. Seriously: the point of apps with nice UI is to make documentation superfluous. So instead of creating documentation nobody wants to read, create apps that make the docs superfluous.

It could be some kind of "browser generator".


One indeed might wonder at the people who need education programs for a browser.

seriously, the only major ui innovation since netscape 2.x? 3.x? was the merging of url bar and search bar.


You're thinking about it wrong. There are support documents with screenshots, self-help videos and things like that. All these things have to be redone when there are major visual changes to the browser.


What makes the web great is its constant change and innovation. How is it that a 2-person startup can deal with its users changing browsers whenever they want, but a giant corporation can't? When your browser is beholden to a corporate dev cycle, what you get is Internet Explorer.

What exactly would be wrong with a test system that included support for a variety of browsers? Perhaps they could automate this test suite. Any process that made your dev system more agile would be a step forward in this turbulent economy. The current rate of change in the world is going to keep increasing exponentially for the foreseeable future. Those well adapted to this environment will prosper; those who aren't will perish.

Furthermore, regarding "faced with deciding which is more important: security updates or the critical production web application needed to manufacture your product is not a happy place to be": if you're running the same browser to browse the web and to do process control, you're asking for XSS/CSRF attacks against your process control system. Set up a special link on the desktop that launches a custom browser; then you can keep that browser stable and have an up-to-date browser for the rest of the internet. Personally, if I were charged with securing a process control system, I would have it entirely disconnected from the rest of the corporate network. Only systems that need to access process control should be connected to that network. We live in the world of Stuxnet.
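The "special link that launches a custom browser" idea can be sketched concretely. Firefox's `-no-remote` and `-P` command-line flags are real; the binary path, profile name, and URL below are hypothetical placeholders (the profile would be created once with `firefox -CreateProfile intranet`):

```python
def pinned_browser_command(firefox_path, profile, url):
    # -no-remote: start a fully separate Firefox process instead of handing
    #             the URL to an already-running instance.
    # -P:         use a named profile, so the legacy app's cookies and
    #             sessions never mix with general web browsing.
    return [firefox_path, "-no-remote", "-P", profile, url]

# Hypothetical pinned build and profile for the legacy app; a desktop
# shortcut would wrap something like subprocess.Popen(cmd).
cmd = pinned_browser_command("/opt/firefox-pinned/firefox", "intranet",
                             "http://intranet.example/process-control")
print(" ".join(cmd))
```

The point of the separate profile is session isolation: a CSRF attack riding in from general browsing has no session to ride in the pinned instance.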


> What makes the web great is its constant change and innovation. How is it that a 2-person startup can deal with its users changing browsers whenever they want, but a giant corporation can't?

That two-person startup consists of two hackers who eat, sleep, and fucking breathe Web technologies. They have bookmarklets for six different sets of Web documentation, and they can quote chapter and verse from any of them. They work 18-hour days because they love their job.

The giant corporation's line-of-business app team consists of an IT guy, an accountant who owns a couple of dog-eared books on JavaScript, and maybe a couple of contractors. They work 8-hour days, only half of which are actually devoted to the application they're working on, and they don't care about standards as long as they can finish writing that Web app and get back to their actual job.

I'm certainly exaggerating a bit, but the point is that the giant corporation's business is almost certainly not writing Web apps for a living.


It appears to me that corporate wants to repeat the mistakes of the past. Is it really Microsoft's fault that IE6 won't die?

Why is IE8 still around? Microsoft, of all companies, has tried to support legacy software for as long as possible, and often longer than possible, by reversing EOL announcements. The fact that you can still run most 10+ year old software on a modern OS from Microsoft illustrates this.

Look where Microsoft is today; I certainly don't wonder why that is.

Corporations define policies and procedures for software releases internally. They find what is stable, slap a stamp of infinity approval on it, and forget about it. If this continues, there will be an IE6 problem again in a few years, only this time it will be Firefox that is being blamed.

Browsers need to be updated often, and this is only going to become more important as the web advances. If this breaks your stuff, if your intranet falls apart as a result, then more than likely this is the fault of your intranet, not of the vendor software. If it is the vendor software, a patch will auto-update within a few days.

As a corporation, you can choose to fix your intranet, put in the time and money to repair your mistakes, or not. Doing so will benefit you moving forward, and eventually your intranet will be modern enough that these issues will be no more than a small inconvenience. It is not the job of a browser vendor to stagnate their software because you want to allow your own software to stagnate.

The source is available, so you have options. You can fork your own, keep the software where you want it, and patch it internally. If you don't like that option, contribute back to the current version and help move things along. If you don't like that, write a check to someone to do this for you. I would bet that Mozilla will be more than happy to backport for the right price.

You don't get to have your cake and eat it too, especially at the expense of the rest of the internet using world.

All the suggestions in the comments of the linked article lean on Firefox developing different versions and augmented policies to somehow continue supporting an old version, to aid in keeping crappy intranets working. Why should Firefox spend a moment's thought on making a Corporate Legacy Infinity Support Humping Along version just to appease corporations' inability to progress?

Sure, it is a lot of work to roll out 500,000 updates to Firefox. Move to Chrome and that won't be an issue; you are always up to date. Or maybe there will be an auto-update mechanism for Firefox at that scale. Either way, that removes the need for a large staff taking months to years to plan a browser update. That sentence is insane: having a team to plan and document the upgrade of a browser!

People will find small portions of their jobs are irrelevant. Progress is like that. One day, many of our jobs will be replaced by robots. There must have been resistance to the copy machine, when all those secretaries no longer had to hand-type a letter from the boss. Carbon copy paper may have met equal resistance. Progress happens, sometimes at the expense of jobs. Stay on top of your industry and change with it, and your job should be secure. Don't get comfortable with the fact that your job is hanging by a thin thread of aged policy in a corporate environment that changes slowly. That will be forced to change, as we are seeing now.

Corporate IT workers are demanding to turn Firefox into the next IE6 problem. This has come at the expense of the entire internet-using public in the past. Firefox is not willing to let that happen.

Version numbers are going to stop mattering. When you check the version number in the future, it will say "current" or "needs updating". Embrace this, and your intranet will be modern and more bulletproof. Don't, and I hope you are confident there are other aspects of your business that do not rely on technology, so you will still be relevant in some other field. Someone will always need to put more paper in the copy machine; maybe you can create a procedure for making sure that all the copy machines waste a page per print. Your job is safe for at least your lifetime.


>"The fact you can still run most 10+ year old software on a modern OS from Microsoft illustrates this. Look where Microsoft is today."

38th largest company in the US in terms of revenue.

$18 billion in profits puts them 4th, behind Exxon Mobil, Chevron, and AT&T.

[http://money.cnn.com/magazines/fortune/fortune500/2011/full_...]


And the vast majority of corporate desktops.


> Why is IE8 still around?

On XP, you can't upgrade to IE9. The question that naturally follows is: Why is XP still around?


> Is it really Microsoft's fault that IE6 won't die?

IE6 stayed around for so long because people built in-house apps using its non-standard technologies and found themselves locked in when the standards and everyone else moved on.

So rather like almost every major functional change in Firefox or Chrome within the last year, then.

(That's not fair? Really? Take a look at your CSS and see how many -moz- or -webkit- entries it contains. And if it doesn't contain any, did you need a bleeding edge browser anyway?)


Not seeing some rounded corners or box shadows is different from not being able to log in because the site used ActiveX.


And if rounded corners and box shadows were all you lost, that would be a valid argument. But almost every new feature in recent Firefox and Chrome releases -- all these vague, unspecified things that the advocates in forum discussions this week keep telling us will drive the web forward -- is currently as non-standard and browser-specific as ActiveX ever was. You can no more build a future-proof system on the new technologies around HTML5 and CSS3 today than you could build one around ActiveX back in the IE6 days, and we all know how that ended.


Yup, you summed up everything I was contemplating saying. Well done.


I used to do IT in the Navy and participated in upgrade processes like this, and this is exactly why I didn't go work at a government software contractor like most of my shipmates did after we got out.


I think this is a challenge especially for open source projects. I'd rather write documentation than have to maintain an old, deprecated codebase.


Stuff changes. Corporates need to get used to that.

Add to that that the motivation for buying software is usually either a bonus for completing a project or a cash backhander from the vendor, rather than the merits of the tool.

Also, I know a certain large company that used paper envelopes for asset tracking. They bought in a fantastic electronic system to handle it, and all the staff sabotaged it because: A) it made them accountable, and B) it made what they were doing visible to everyone.

TBH I think most corps are deranged, as is any group of humans with N > 5.



