Microsoft Programming

"Side By Side Assemblies" Bring DLL Hell 2.0 433

Posted by kdawson
from the long-nightmare-is-not-over dept.
neutrino38 writes "This is an alert for all developers using Microsoft Visual Studio 2005. At the beginning of January, Microsoft issued a security fix for Visual Studio 2005 forcing the use of new dynamic libraries (DLLs) by all applications compiled with this IDE. Basically, applications compiled with Visual Studio 2005 will no longer work on an ordinary (non-dev) PC unless the newer DLLs are installed, and we found that this holds even on fully updated PCs. I just posted some details and some suggested fixes." Read below for some more background on Microsoft's so-called "side by side assemblies."


For those unfamiliar with the Microsoft world: native Microsoft applications written in C++ rely on dynamic libraries, two of which are infamous: MSVCRT.DLL and MFCxx.dll. Because of software evolution and security fixes, multiple versions of these DLLs were often present on the same system, causing application instability. Where Linux implemented a simple suffix notation on its dynamic libraries, Microsoft created a new beast in 2001: the Side By Side assembly. These are basically DLLs with a companion XML file that identifies them. The XML file contains a digital signature, and when the system binds these DLLs dynamically to an application, it checks that the signature matches the DLL itself. When everything runs well, this is pretty transparent. But when issues arise, it becomes excruciatingly difficult to troubleshoot and fix. DLL hell is not over.
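The mechanism described above can be sketched in a few lines. This is a deliberately simplified Python model, not the real WinSxS store or manifest format (the names, the SHA-256 digest standing in for the digital signature, and the version strings are all illustrative): every published version of a library lives in the store side by side, each application's manifest records the digest of the exact version it was built against, and the loader refuses to bind when the payload no longer matches.

```python
import hashlib

# Toy model of a side-by-side assembly store. Each assembly is keyed by
# (name, version); its manifest digest stands in for the real signature.
store = {}  # (name, version) -> (payload bytes, digest)

def publish(name, version, payload):
    """Add one version of a library to the store; return its digest."""
    digest = hashlib.sha256(payload).hexdigest()
    store[(name, version)] = (payload, digest)
    return digest  # an app's manifest would record this at build time

def bind(name, version, expected_digest):
    """Bind exactly the requested version, verifying it against the manifest."""
    payload, digest = store[(name, version)]
    if digest != expected_digest:
        raise RuntimeError(f"{name} {version}: signature mismatch")
    return payload

# Two versions coexist; each app binds whichever its manifest names.
d_old = publish("msvcrt", "8.0.50727.42", b"original runtime")
d_new = publish("msvcrt", "8.0.50727.762", b"patched runtime")
assert bind("msvcrt", "8.0.50727.42", d_old) == b"original runtime"
assert bind("msvcrt", "8.0.50727.762", d_new) == b"patched runtime"
```

The transparency and the fragility both fall out of the same check: as long as store and manifest agree, nothing is visible; the moment they disagree (as after the Visual Studio 2005 security fix), binding simply fails.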
  • by bit trollent (824666) on Sunday October 04, 2009 @03:55PM (#29638011) Homepage

    Instead of referencing the .dll in \Windows\System\ why don't you reference a copy of the .dll in \Program Files\Your Stupid App\ ?

    Seems like a simple fix to me, though I'll admit most of my .dll experience comes from the asp.net/c# world.
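The load order the parent comment is relying on can be sketched as follows; this is a hedged Python illustration with hypothetical paths, not the full Windows loader algorithm. The point is simply that the application's own directory is searched before the system directory, so an app-local copy of a DLL shadows the global one.

```python
# Toy model of a DLL search path: the loader walks the directories in
# order and takes the first one that contains the requested file.

def resolve(dll, search_path, present):
    """Return the first directory on search_path that contains dll."""
    for directory in search_path:
        if f"{directory}/{dll}" in present:
            return directory
    raise FileNotFoundError(dll)

present = {
    "C:/Program Files/YourApp/msvcr80.dll",  # app-local copy
    "C:/Windows/System32/msvcr80.dll",       # system copy
}
order = ["C:/Program Files/YourApp", "C:/Windows/System32"]

# The app-local copy wins because its directory is searched first.
assert resolve("msvcr80.dll", order, present) == "C:/Program Files/YourApp"
```

The catch, and part of the complaint in the story, is that manifest-bound (SxS) DLLs bypass this search entirely: the manifest, not the search path, decides which copy is loaded.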

  • .NET internals (Score:3, Interesting)

    by NoYob (1630681) on Sunday October 04, 2009 @03:59PM (#29638027)
    I don't know. Let's say you're developing in C# or VB and you make calls to a library that is really a bunch of C++ classes and methods with C# or VB wrappers. Then, I'd assume, you would indeed have this problem.

    Someone with knowledge of .NET internals care to comment?

  • by Anonymous Coward on Sunday October 04, 2009 @04:03PM (#29638067)

    But I always get a chuckle out of the clueless non-coders of slashdot tackling such discussions.

    Lol @ all the calls for statically linked libraries (yay, it's 1982 and I only run one app) or for each app to ship with its own version of the .dlls.

    Stories like this really do major harm to Slashdot's geek-cred. You people really expose yourselves as poseurs.

  • DLL hell never left (Score:5, Interesting)

    by eddy (18759) on Sunday October 04, 2009 @04:05PM (#29638093) Homepage Journal

    I upgraded my Fallout 3 installation yesterday. After patching, the game wouldn't run, returning some fairly obtuse message about import ordinals [google.com]. So I googled the message, and found out it's because the game now links against a newer version of "Microsoft(R) Games for Windows(TM) Live(TM)" whatever. Note that this wasn't some new patch, it's months old and yet this problem, which must realistically be hitting quite a few users, persists. This isn't something you get via Windows Update either, this is just some obscure 'distributable runtime' crap you should know you need?

    So let me repeat that: Super mainstream game on a super mainstream platform (Vista x64), no add-ons, I patch to the latest version and it won't start, nothing is mentioned at the developer's site.

    Now I recognize good old Bethesda again. Here's how they'd be able to repro: Fully updated Vista machine, install game from DVD, apply patch, notice it won't fucking run.

    I don't normally give much for the 'PC-gaming sucks' choir, but c'mon..

  • by Anonymous Coward on Sunday October 04, 2009 @04:26PM (#29638271)

    .NET 3.5 uses the 2.0 CLR. If you don't use LINQ or WPF or a few other features, then your code will run on 2.0. Also, Visual Studio can target whatever version you want.

  • Re:Also... (Score:3, Interesting)

    by Anonymous Coward on Sunday October 04, 2009 @04:27PM (#29638279)

    .NET's strong requirements mean it no longer matters if "foo.dll version 1.5" breaks backwards compatibility with "foo.dll version 1.4", because the developer has the option to say at compile time "ONLY use foo.dll version 1.4". Consider this from the perspective of the end user, the application developer, and the dll vendor.

    The dll vendor is happy, because they can release new versions into the wild on their own schedule. Win.

    The application developer may or may not have been aware that the vendor released a new version of the dll. Fortunately the new DLL can't break the old code, so all regression testing that passed on the dev machine will pass on every end user machine, provided the end user didn't delete the original dll. Typically the application developer has a license to redistribute the vendor's dll with the application; in that case running the setup application will restore perfect functionality. Win.

    The end user wants things to just work. Some "advanced" end users may know that the dll vendor released a new version that fixes some frustrating bug in the application, so they pressure the developer to release a new version of the application. Meanwhile, nothing broke. Win.

    Win. Win. Win.

    The only downside is that "advanced" users can't fuck with the application and try to make it use the wrong DLL. I see that as an added bonus.
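The exact-version binding this comment describes can be sketched in Python; the registry, version strings, and behaviors here are hypothetical stand-ins for strong-named assemblies. The idea is that every published version remains available, and each application loads the precise version recorded when it was built, so a newer release can never change the behavior of an already-shipped app.

```python
# Toy model of compile-time version pinning: the registry keeps every
# published version of a library, and each app binds its pinned version.
registry = {}  # name -> {version: implementation}

def publish(name, version, impl):
    """Release a new version without disturbing the old ones."""
    registry.setdefault(name, {})[version] = impl

def load(name, pinned_version):
    """Load exactly the version the app was compiled against."""
    return registry[name][pinned_version]

publish("foo.dll", "1.4", lambda x: x + 1)    # behavior the app tested against
publish("foo.dll", "1.5", lambda x: x + 100)  # incompatible new release

app_pin = "1.4"  # recorded at compile time
f = load("foo.dll", app_pin)
assert f(1) == 2  # still the 1.4 behavior, even with 1.5 installed
```

This is the "win-win-win" in miniature: the vendor ships 1.5 on its own schedule, the developer's regression tests stay valid, and the end user's app keeps running.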

  • by MioTheGreat (926975) on Sunday October 04, 2009 @04:30PM (#29638309)
    No. WinSxS is not 6 GB. That's just Explorer being stupid when it tells you how big it is. There are a lot of hard links in there.
  • by Entrope (68843) on Sunday October 04, 2009 @05:16PM (#29638635) Homepage

    This can bite you in a lot of conditions. One of the canonical examples is memory allocation. For example, foo.dll allocates memory and passes the pointer to bar.exe. To operate safely, bar.exe has to pass the pointer back to foo.dll so it can be freed. Otherwise, foo.dll might be using -- say -- malloc() and free() from one version of the C runtime library, and bar.exe might be using malloc() and free() from a different version. Because the different DLLs will end up allocating from different arenas, you'll corrupt both if you malloc() using one and free() using the other.

    There's a reasonable argument that passing memory ownership without providing allocation functions is a bad way to design libraries. Unfortunately, some interface standards specify bindings that forbid providing that kind of deallocation function in the DLL. (I'm looking at you, CCSDS SLE! I still haven't forgiven you for inflicting this form of DLL hell upon me so many years ago.)
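The ownership rule in the comment above can be simulated without touching a real CRT. In this hedged Python sketch (the names foo.dll and bar.exe are just labels), each Heap tracks its own allocations the way each C runtime instance tracks its own arena; freeing a block through the wrong heap fails, much as free()-ing across CRT versions corrupts both arenas.

```python
# Toy simulation of per-DLL C runtime heaps.
class Heap:
    def __init__(self, name):
        self.name = name
        self.live = {}      # this heap's own bookkeeping
        self.next_id = 0

    def malloc(self, size):
        ptr = (self.name, self.next_id)
        self.live[ptr] = size
        self.next_id += 1
        return ptr

    def free(self, ptr):
        # A real CRT would silently corrupt its arena here; we raise instead.
        if ptr not in self.live:
            raise RuntimeError(f"heap {self.name}: foreign pointer {ptr!r}")
        del self.live[ptr]

foo_heap = Heap("foo.dll")  # the DLL's CRT instance
bar_heap = Heap("bar.exe")  # the EXE's CRT instance

p = foo_heap.malloc(64)     # foo.dll allocates and hands the pointer out
try:
    bar_heap.free(p)        # bar.exe frees with its own CRT: disaster
except RuntimeError:
    pass
foo_heap.free(p)            # safe: ownership returned to the allocator
```

This is why well-behaved libraries export their own deallocation function (a hypothetical foo_free) rather than letting callers free with whatever CRT they happen to link.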

  • Re:Also... (Score:4, Interesting)

    by Entrope (68843) on Sunday October 04, 2009 @05:25PM (#29638691) Homepage

    The real problem is C version Y not being backward compatible to C version X, leading to this idiocy of piling more and more complexity on top of a totally rotten mechanism.

    It might surprise you, but Microsoft isn't actually to blame here. Rather, the legions of incompetent programmers that wrote DLLs such as C are to blame. We'd call them idiots, but Microsoft calls them paying customers, which is what prompted Microsoft to design SxS and incorporate it into WinXP.

    Also, SxS is what made the restyling of the UI (through common controls version 6) technically possible.

    Microsoft takes backwards compatibility very seriously.

    They don't take it seriously. They don't even try for DLL compatibility within a product release. Can you really say with a straight face that you can new using MSVCR70.DLL and delete using MSVCR40.DLL, and get away with it? What about MSVCR80D.DLL and MSVCR80.DLL?

    I'd bet dollars to donuts (adjusted for inflation) that the majority of problems caused by that incompatibility are due to Microsoft's C and C++ runtime libraries and their tendency to allocate memory from different arenas. My coworkers and I mostly program for Linux, but those few who have spent much time coding on Windows see that variant cause problems more often than practically anything else. How often do you see third-party libraries (that is, libraries not written by the application author) cause DLL hell when those libraries are written by someone other than Microsoft?

  • A question .... (Score:3, Interesting)

    by taniwha (70410) on Sunday October 04, 2009 @05:38PM (#29638763) Homepage Journal
    Not having programmed for Windows for many years now: what happens when these different versions of library C use different data structures or global variables?
  • Re:Also... (Score:4, Interesting)

    by jc42 (318812) on Sunday October 04, 2009 @05:50PM (#29638819) Homepage Journal

    Suppose you're building an application using two DLLs, let's call them A and B. Both depend on a third DLL named C. Now, suppose A uses C version X, and B uses C version Y. You're screwed, right? Not with SxS, since that allows multiple versions of C to be loaded. That's the real added value of SxS.

    Huh? Microsoft's compilers and linkers have allowed this for ages. The difference is that it's now a feature, not a bug.

    There was a study of MS's binary apps years ago (and I thought it was reported here, but I don't seem to find it in the archive), which analyzed the contents of the binary files. It was found that there were typically many copies of various library routines, sometimes 10 or more copies. They even had duplicated local data blocks. This was presented as a bug at the time, and an explanation of why MS's binaries required so much memory.

    But now this practice is called "side-by-side", and it's considered a feature. Which, of course, means that it never will be fixed, and users can look forward to even more bloated programs that contain copies of all the versions of a routine that have ever been released.

    It's sorta like extending the concept of a CMS into the binaries of apps. No longer will all those old versions be kept on the sidelines, off in obscure libraries. They will all be right there in the binary, ready to be loaded into memory if they are ever needed.

    Who was it that congratulated the software industry for consistently undoing all the amazing hardware advances of the past decades by responding to bigger memories and faster processors with software that uses more memory and CPU cycles? SxS is a major step forward in this technique, and promises to lead to major increases in memory sales in the near future. And all it took was realizing that a practice once considered sloppy wastefulness of memory was actually a valuable new feature for backwards compatibility.

  • by edxwelch (600979) on Sunday October 04, 2009 @06:54PM (#29639237)

    I hate to break it to you, but there is quite a lot of this going on. For instance, if you were to look at the source code of every app that needs to decode png files (just as example), you would probably find only about 50% use the libraries that come with the OS, and the reasons why vary:
    * avoiding dll hell (as mentioned)
    * the app is cross platform
    * poor implementation of official dlls
    * politics (at one stage Microsoft tied certain dlls to the installation of IE, even though they had nothing to do with browser functionality)

  • by JSBiff (87824) on Sunday October 04, 2009 @11:04PM (#29640725) Journal

    "The Windows API has 95% of what you would need the C runtime to do."

    Unless what you need the C runtime to do is be cross-platform compatible. Then it has 0% of what I need the C runtime to do. The whole reason to have a standard C library was to make applications significantly more portable; that's why it's part of the ANSI and ISO C specifications, is it not? Sure, any vendor can create their own proprietary, non-portable runtime library. I'm sure Microsoft would be delighted for you to use CreateFile, WriteFile, et al., because you've now tightly bound your C application to Windows. Of course, sometimes that's fine. Sometimes your app will be tightly bound by other things like DirectX anyhow, so you might as well go native APIs all the way.

    But, if I'm using the C runtime library, it's because I'm trying to write portable code.

    Wasn't one of the original goals of DLLs to avoid lots of duplicate copies of the same runtime libraries littering up the hard drive and RAM? It's my understanding that one of the 'advantages' of a DLL is that you could have 5, or 10, or 20 programs running, and they could all 'share' one copy of the DLL loaded in RAM? That if the DLL was already loaded in RAM by another program, it would also save a little bit of time that otherwise would have been required to load the DLL from disk? I suppose, however, that since both hard drive space and RAM have gotten orders of magnitude larger, faster, and cheaper than in the past, perhaps, for the sake of reliability, it does make sense to sacrifice some efficiency, as you suggest, by either having every program install its own directory-local copy, or even statically link.

  • Re:Also... (Score:3, Interesting)

    by DangerFace (1315417) on Monday October 05, 2009 @03:44AM (#29641991) Journal

    Office still boots twice as fast as OpenOffice on a typical computer ... in practice, it doesn't seem to be hurting them anyway.

    I have to disagree - OpenOffice on my Linux partition boots faster than Office on my Windows partition on the same computer. The problem is that MS gimps Windows to hell; they know the workarounds, so Office runs OK. OpenOffice just has to use trial and error and a whole lot of guesswork to rid itself of bugs that only exist to give MS an advantage. I completely understand that the average Joe Sixpack doesn't care, but that's why MS is still managing to sell gimped OSs.

  • by Cassini2 (956052) on Monday October 05, 2009 @11:12AM (#29645567)

    That's not a Visual Studio C++ issue, it's the way Windows memory management works. No matter what IDE/compiler/CRT you use, memory allocated by one dll cannot be (reliably) freed by another. It has to be freed by the same dll that allocated it.

    If you use GlobalAlloc() to allocate the memory, then GlobalFree() always frees it. (I'm quoting from my nightmares here.)

    The problem happens in VC++. The new operator eventually calls malloc(), which eventually calls GlobalAlloc(), through chains of function calls that are fairly non-obvious unless you read the disassembly or the source. GlobalAlloc() is a base Windows function, so every DLL links to the same system DLLs. The new and malloc calls are in the Microsoft Visual C++ libraries. Those libraries are loaded on a per-DLL/EXE basis. As such, different VC++ DLLs can link to different VC++ run-time libraries, containing identical (or nearly identical) new and malloc functions but with different data areas.

    Additionally, malloc() is optimized so it doesn't always call GlobalAlloc() whenever new memory is required. malloc() has its own list of memory allocations, and that is where the problem is. A malloc() from one DLL, with its own data memory area, knows nothing about another DLL's data memory area. As such, the free() call can't possibly succeed when the data was allocated in a different DLL.

    Unfortunately, the torture doesn't end there. There are only two ways around the problem. Firstly, you can never pass memory allocations across DLL boundaries. Unfortunately, for some applications this doesn't work, for example COM and ActiveX controls. Alternatively, you can create a new type of memory handler to handle inter-DLL memory allocations. Microsoft created the IMalloc API for this reason. However, it is impossible to make the IMalloc API work across all possible failure modes. Also, IMalloc is not used by default for either new, delete, malloc() or free(). As such, the IMalloc API does not completely solve the inter-DLL issues, and introduces new problems of its own.
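The second workaround above, routing cross-module allocations through one shared allocator, is the role IMalloc (and CoTaskMemAlloc) plays in COM. Here is a hedged Python sketch of the pattern only; the class and function names are illustrative and nothing here is the real COM API.

```python
# Toy model of a process-wide shared allocator: every module that passes
# memory across a boundary allocates and frees through this one object,
# instead of through its own private CRT heap.
class SharedAllocator:
    def __init__(self):
        self.live = set()

    def alloc(self, size):
        block = object()   # stands in for a real memory block
        self.live.add(block)
        return block

    def free(self, block):
        self.live.discard(block)

process_allocator = SharedAllocator()  # one instance for the whole process

def dll_produce():
    """A 'DLL' hands out a block from the shared allocator."""
    return process_allocator.alloc(32)

def exe_consume(block):
    """The 'EXE' can free it safely through the same allocator."""
    process_allocator.free(block)

b = dll_produce()
exe_consume(b)
assert not process_allocator.live  # nothing leaked, nothing corrupted
```

The pattern only works if every participant actually uses the shared allocator for boundary-crossing memory, which is exactly the discipline the comment says is so hard to enforce across new, delete, malloc() and free().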

    The IMalloc API is at the heart of COM, which is also at the heart of C#. Normally, C# might be a good language for soft real-time, long-life systems. However, if any bug exists in any control using the IMalloc API, then all of the CLR can become unstable. As such, C# is the home of my biggest programming disaster ever: a program that is less reliable and runs thousands of times slower than the equivalent in C. In C/C++, some freedom exists to properly handle memory allocations, and data types are checked at compile time. In COM, it is almost impossible to understand all of the complexities of all of the memory allocations and data type conversions. As such, it is both easy to make mistakes with COM and very difficult to work around them.

    ActiveX/COM/DLLs are the root source of many serious security and reliability issues inside Windows. Historically, reliability and security issues were at least traceable to an executable. Now, all bets are off. ActiveX is present in Windows Explorer, Office and Internet Explorer, making it very difficult to effectively lock down the system. As such, requiring "a solid application" on "Microsoft Windows" is close to an oxymoron. The result of these contradictory requirements is application separation: big and small applications use web servers running embedded operating systems or Linux (printers, Google, Bing), and display the results in a web browser on Windows.

    In short, this bug is not a side effect of the Windows memory allocation API, GlobalAlloc() and GlobalFree(). It is a side effect of DLLs under Microsoft Windows, and particularly of the C run-time library DLL implementation. As a result of the attempted workarounds, it has probably had as big an effect on Windows as the old Intel 8088 segmented memory architecture.
