Wednesday, June 28, 2006

Backwards compatibility

One of the biggest lessons I've learned from the ECMA process is how critical backwards compatibility is. When you have millions of users, the enormous weight of existing software holds you back from doing just about anything that breaks compatibility. An email from Brendan on the ES4 public mailing list (you have to subscribe to the mailing list to view the original email) crystallizes the distinction between acceptable and unacceptable breaks:
...we are not breaking backward compatibility except in selected and few ways that actually improve the bugscape.
In other words, if you expect that a break in compatibility will cause fewer bugs than it fixes, then it's acceptable.

2 comments:

Anonymous said...

Well, backwards compatibility is very critical indeed. Once built, we often can't and won't be able to switch out a complete infrastructure (technologically, usability-wise, or policy-wise). Banks are still running their old Fortran code that wasn't even supposed to make it to the year 2000. All modern operating systems still use the C calling convention. We have the knowledge, we have the technology, yet we can't just upgrade everything at once.

So in a way, backwards compatibility is also the worst enemy of progress. It slows down progress until it bogs down completely. However good your idea is, if it's revolutionary it's not going to succeed. Better make sure it has a good and solid foundation before it ever gets popular and people start to depend on it.

Any new technology starts as a toy. At some point, we become dependent on it. From that point on, backwards compatibility will block any substantial growth.

So perhaps computer scientists are looking in the wrong direction. The current know-how and technology could offer much more, yet in everyday practice we see little of that potential. How did the worlds of theory and practice ever get so separated? Perhaps theorists (like yourself) should focus on a different problem.

I would formulate this problem as:
How do we evolve a not-so-great-solution-after-all x into a much-better-solution y?

Especially in programming languages you can often see this happening. Rather than switching to a pure OO system, most people switched from C to C++. Rather than switching from C++ to Haskell, most are switching from C++ to C# + Generics + LINQ.

The ideas will be incorporated, but the revolution will never take place. Backwards compatibility, on all the different meta-levels, rules with an iron fist.

Anonymous said...

Dave,

I'm surprised the committee is willing to break things, even with the reward of fixing something else.

Recently, the OCaml team silently changed the behavior of a library function to conform to the documentation. The function had been there for ten years, just as it was. Then they decided to change it, on the rationale that it should conform to the documentation. The scale of the change was something like turning a fold-left into a fold-right.
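
The comment doesn't name the actual function, so here is a purely illustrative OCaml sketch using the standard library folds: for any operation that isn't both associative and commutative, a left fold and a right fold over the same list give different answers, which is why silently swapping one for the other breaks callers.

    (* Illustration only: with a non-commutative operator like subtraction,
       fold_left and fold_right disagree, so code written against one ordering
       silently computes something else under the other. *)
    let xs = [1; 2; 3]

    let left  = List.fold_left  (fun acc x -> acc - x) 0 xs   (* ((0 - 1) - 2) - 3 = -6 *)
    let right = List.fold_right (fun x acc -> x - acc) xs 0   (* 1 - (2 - (3 - 0)) = 2  *)

    let () = Printf.printf "fold_left: %d  fold_right: %d\n" left right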

This broke a lot of programs needlessly (including ours), and we wasted time trying to figure out why one guy's build was behaving differently before realizing it was the standard library. Worse yet, now we have to do a version check to see which version of the function is running (and OCaml doesn't provide a reasonable facility for this, so we do string comparisons at runtime).
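
A rough sketch of the kind of runtime check being described, assuming the only signal available is the compiler version string; the cutoff release used here is invented for illustration.

    (* Hypothetical workaround sketch: there is no feature test for the library
       change, so guess the behavior from Sys.ocaml_version, a string such as
       "3.09.2". A plain string comparison is crude, but within one major
       series it orders releases well enough. The cutoff is made up. *)
    let has_new_fold_behavior = Sys.ocaml_version >= "3.09"

    let () =
      if has_new_fold_behavior
      then print_endline "stdlib fold: documented (new) behavior"
      else print_endline "stdlib fold: historical (old) behavior"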

When the OCaml developers realized that the function and its documentation had been out of sync, they should have immediately deduced that either (a) everyone was depending on the behavior and not the documentation, or (b) no one had ever used the function. Either way, they should have added a new function, with a new name, to do the other kind of fold, and fixed the documentation of the existing function.
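
That suggested fix might look something like this interface sketch (the names and the concrete type are invented for the example): keep the existing name with the behavior everyone depends on, correct its documentation, and give the documented-but-never-shipped behavior a new name.

    (* Hypothetical .mli sketch, not the actual OCaml library interface. *)
    type 'a t = 'a list

    val fold : ('acc -> 'a -> 'acc) -> 'acc -> 'a t -> 'acc
    (** Existing function: keeps its historical left-to-right behavior;
        the documentation is updated to say so. *)

    val fold_right_to_left : ('a -> 'acc -> 'acc) -> 'a t -> 'acc -> 'acc
    (** New function, new name: the right-to-left fold the old docs promised. *)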

Anyway, I hope the ECMA team is not going to do that: break a lot of programs and then point to some areas where the language is cleaner. In some sense, language standards don't have bugs; they JUST ARE!