DLL Pickle (31-July-1998)

The DLL model is broken. It always has been.

For those of you not deep into the Microsoft model of software computing, here’s a quick primer on the idea of DLLs: a DLL (so called because that is usually its three-letter file name extension) is a Dynamic Link Library, a collection of programming routines compiled together into one file. DLLs typically live in the Windows system directory, and are used by many programs.

The concept behind the DLL is attractive. Why not take a routine needed by many tasks, write it only once, install it in a place where it is accessible by all, and let every program share it? The idea is good. Sharing common routines allows them to be written and debugged once. This enhances code reuse, lowering the cost of writing later software that uses those routines, reducing hard disk usage, and simplifying the coding and testing of new software. In addition, if these DLLs are written so that they can be called from many applications but loaded into memory only once, significant savings in memory usage can be achieved. In theory, anyway.
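
To make the mechanics concrete, here is a minimal sketch in C of how a program picks up a shared routine from a DLL at run time, using the Win32 LoadLibrary and GetProcAddress calls. The DLL name (SHAREDFMT.DLL) and the FormatText routine are invented for illustration; the calling pattern is the point.

    #include <windows.h>
    #include <stdio.h>

    /* Hypothetical routine exported by a shared DLL; the DLL name and
       function name are made up for illustration. */
    typedef int (WINAPI *FORMATFUNC)(const char *input, char *output, int size);

    int main(void)
    {
        /* Ask Windows to load the shared library; the code is mapped into
           this process but kept as a single copy in memory. */
        HMODULE hLib = LoadLibrary("SHAREDFMT.DLL");
        if (hLib == NULL) {
            printf("Could not load SHAREDFMT.DLL\n");
            return 1;
        }

        /* Look the routine up by name and call it through a pointer. */
        FORMATFUNC FormatText = (FORMATFUNC)GetProcAddress(hLib, "FormatText");
        if (FormatText != NULL) {
            char buffer[128];
            FormatText("hello", buffer, sizeof(buffer));
            printf("%s\n", buffer);
        }

        FreeLibrary(hLib);
        return 0;
    }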

This model would be great if these DLLs could be written once, written right, and could support everything needed of them from the beginning. But the reality of software development is that these routines may not be coded to meet all needs the first time out. Plain old bugs that keep a routine from working correctly will slip through the first release, showing themselves only when the code runs on new hardware platforms, or perhaps only under unusual circumstances.

So the routines need to be rewritten and revised. Microsoft anticipated this problem and introduced support for a version number branded into each DLL, so that later versions of the DLL could be shipped with later software and replace the earlier, buggy versions. But there are two bad assumptions in this scheme.
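
For the curious, here is a rough sketch in C of the kind of check an installer is supposed to make: read the version stamp out of each file's VERSIONINFO resource and copy the new DLL over the old one only if its number is higher. This is an illustration of the rule, not Microsoft's actual installer code, and it assumes both files carry a proper version resource (the program would be linked against VERSION.LIB).

    #include <windows.h>
    #include <stdlib.h>

    /* Read the packed version number out of a file's VERSIONINFO resource.
       Returns 0 if the file has no version stamp. */
    static int GetDllVersion(const char *path, DWORD *ms, DWORD *ls)
    {
        DWORD handle;
        DWORD size = GetFileVersionInfoSize(path, &handle);
        if (size == 0)
            return 0;

        void *data = malloc(size);
        VS_FIXEDFILEINFO *info;
        UINT len;
        int ok = 0;

        if (data != NULL &&
            GetFileVersionInfo(path, handle, size, data) &&
            VerQueryValue(data, "\\", (LPVOID *)&info, &len)) {
            *ms = info->dwFileVersionMS;   /* major and minor, packed    */
            *ls = info->dwFileVersionLS;   /* build and revision, packed */
            ok = 1;
        }
        free(data);
        return ok;
    }

    /* The rule an installer should follow: copy the new DLL over the
       installed one only if its version number is strictly higher. */
    int ShouldReplace(const char *newDll, const char *installedDll)
    {
        DWORD newMS, newLS, oldMS, oldLS;
        if (!GetDllVersion(installedDll, &oldMS, &oldLS))
            return 1;   /* nothing installed, or no version stamp */
        if (!GetDllVersion(newDll, &newMS, &newLS))
            return 0;   /* new file has no version stamp; leave well alone */
        return (newMS > oldMS) || (newMS == oldMS && newLS > oldLS);
    }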

The first bad assumption is that any programmer could write routines that were “new and improved” and, at the same time, fully backward compatible. Software developers, living in the real world where real software actually has to be delivered, did their best to work with the original DLLs, occasionally writing work-arounds or depending on undocumented behaviors to get their software to work. When a DLL was revised, “fixed” in some people’s opinion, those behaviors changed and software broke. Who was at fault? The developer of the software or the author of the DLL? Certainly a point of debate.

The second bad assumption was that the version numbers would be controlled in such a way that a later version was always an improved copy of an earlier one. Mistakes happen, and DLLs were misnumbered, leading to a “rogue” DLL with an incorrect number taking over a system. If the number was greater than it should have been, newer software would be unable to replace the supposedly “later” DLL, and conflicts would occur when this older DLL failed to perform as anticipated.

The problem is most apparent in the “shared” or “common” DLLs that provide services used by a number of applications. Common DLLs, often prefixed with “COM”, such as COMCTL32.DLL and COMDLG.DLL, provide controls and dialogs used by many applications. A change in the error messages returned by these routines, or in the format of the information available from these DLLs, crashes systems left and right.

The Quality Assurance effort at Microsoft to avoid these problems is insufficient to the task. If a common control needs a change in order to work with a new version of Visual Fred, the DLL is tested with Visual Fred, but not necessarily with the hundreds of other products which may also be using it. As Microsoft is in the business of selling development tools, it needs to test not only its own tools, but also the software that can be generated with those tools, to ensure that changes to the DLL do not cause these products to fail as well. Any product which could use a DLL, no matter how obscure, needs to run a comprehensive test on that DLL, checking all of the functions, inputs, and outputs, to ensure full compatibility. If the DLL cannot survive these tests, it should not be released upon an unsuspecting public.

An alternative to this difficult regime is one used on other operating systems. When a new common module is installed, the older modules are archived and tracked by the system rather than being overwritten and obliterated. Each software module calling a routine in a DLL specifies not only the DLL name, but also the version number with which it was tested. In this way, an application which depends on a particular behavior of a specific version of a DLL will get the behavior it expects. Obviously, this is a major change to the way DLLs are managed and would require significant work at the operating system level, but it offers us the hope that our software will run – today, tomorrow, and in the future.
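
To give a flavor of what that might look like, here is a hypothetical sketch in C: archived copies of a DLL sit side by side, named with their version numbers, and an application asks for the version it was tested with, falling back to the currently installed copy only if the archived one is missing. None of these names or conventions exist in Windows today; it is purely an illustration of the idea.

    #include <stdio.h>
    #include <windows.h>

    /* Hypothetical version-aware loader. Instead of a single COMCTL32.DLL
       in the system directory, archived copies are kept side by side with
       their version numbers in the file name (e.g. COMCTL32.DLL.4.70), and
       each application asks for the version it shipped with. Nothing here
       is a real Windows convention; it is a sketch of the idea. */
    static HMODULE LoadLibraryVersion(const char *name, const char *version)
    {
        char path[MAX_PATH];

        /* Try the archived copy that matches the requested version first... */
        sprintf(path, "%s.%s", name, version);
        HMODULE hLib = LoadLibrary(path);
        if (hLib != NULL)
            return hLib;

        /* ...and fall back to whatever version is currently installed. */
        return LoadLibrary(name);
    }

    int main(void)
    {
        /* The application names the version it was actually tested against. */
        HMODULE hCommonControls = LoadLibraryVersion("COMCTL32.DLL", "4.70");
        printf(hCommonControls ? "loaded\n" : "not found\n");
        return 0;
    }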

If you examine the typical Microsoft installation routine, you will see that it installs not only the application itself, but also the latest DLLs with which that software was developed. While fine in theory, the effect can be devastating. Software installed on the machine years ago may suddenly cease working, or worse, crash the system with a loss of data.

No other vendor is in Microsoft’s position of being able to ship updates to the operating system with each new game, utility, or development package it releases. Keeping up in a game with such lopsided rules is difficult at best.

This is a situation that cannot be allowed to go on.


This work by Ted Roche is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 United States.