
"Side By Side Assemblies" Bring DLL Hell 2.0

Posted by kdawson
from the long-nightmare-is-not-over dept.
neutrino38 writes "This is an alert for all developers using Microsoft Visual Studio 2005. At the beginning of January, Microsoft issued a security fix for Visual Studio 2005 forcing the use of new dynamic libraries (DLLs) by all applications compiled with this IDE. Basically, applications compiled with Visual Studio 2005 will not work anymore on an ordinary (non-dev) PC unless the newer DLLs are installed. And we found out that this is true on fully updated PCs. I just posted some details and some suggested fixes." Read below for some more background on Microsoft's so-called "side by side assemblies."


For those unfamiliar with the Microsoft world, native Microsoft applications written in C++ rely on dynamic libraries. Two of them are infamous: MSVCRT.DLL and MFCxx.dll. Because of software evolution and security fixes, multiple versions of these DLLs were often present on the system, causing application instability. Where Linux implemented a simple suffix notation on its dynamic libraries, Microsoft created a new beast in 2001: the Side By Side assembly. These are basically DLLs with a companion XML file that identifies them. The XML file contains a digital signature, and when the system binds these DLLs dynamically to an application, it checks that the signature matches the DLL itself. When everything runs well, this is pretty transparent. But when issues arise, it becomes excruciatingly difficult to troubleshoot and fix. DLL hell is not over.
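To make this concrete, here is a sketch of what a side-by-side assembly manifest looks like. This is an illustrative example only: the version number, public key token, and hash placeholder are not taken from a real system.

```xml
<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<assembly xmlns="urn:schemas-microsoft-com:asm.v1" manifestVersion="1.0">
  <!-- The identity the loader matches against: name, exact version,
       architecture, and the publisher's public key token. -->
  <assemblyIdentity type="win32"
                    name="Microsoft.VC80.CRT"
                    version="8.0.50727.4053"
                    processorArchitecture="x86"
                    publicKeyToken="1fc8b3b9a1e18e3b"/>
  <!-- Each member DLL is listed with a hash so the loader can verify
       that the file on disk matches the manifest. -->
  <file name="msvcr80.dll" hash="(hash of the DLL)" hashalg="SHA1"/>
</assembly>
```

If an application's embedded manifest asks for a version that is not installed (and no publisher policy redirects the request), the loader refuses to start the program, which is exactly the failure mode described above.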
  • by csueiras (1461139) on Sunday October 04, 2009 @04:48PM (#29637951)
    Microsoft FTL.
  • Speaking as a user (Score:5, Insightful)

    by Anonymous Coward on Sunday October 04, 2009 @04:52PM (#29637987)

    Speaking as a user, can we get statically linked libraries? I don't care if it's dependency hell or DLL hell. I want self-contained applications.

  • by Anonymous Coward on Sunday October 04, 2009 @04:55PM (#29638017)

    Guess what .NET VM is written in.

  • by igomaniac (409731) on Sunday October 04, 2009 @05:05PM (#29638085)

    Everybody who develops applications for the Windows platform should know that you need to include the merge module for the C/C++ runtime libraries in your installer. You've just been lucky so far that other applications have installed the DLLs you needed for you. Try your app the way it is on a clean install of Windows XP without the service packs and see how well that goes :P

    In fact the SxS assembly system in Windows is the only real way out of DLL hell, much better than the versioning scheme for shared libraries used in Linux. Get your facts straight before posting.
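The dependency that Visual Studio 2005 bakes into a compiled application is a manifest fragment along these lines (a sketch; the exact version string depends on which compiler update produced the binary, and this one is illustrative):

```xml
<dependency>
  <dependentAssembly>
    <!-- The app will not start unless an assembly matching this
         identity (or a policy redirect for it) is present. -->
    <assemblyIdentity type="win32"
                      name="Microsoft.VC80.CRT"
                      version="8.0.50727.762"
                      processorArchitecture="x86"
                      publicKeyToken="1fc8b3b9a1e18e3b"/>
  </dependentAssembly>
</dependency>
```

Shipping the matching merge module (or the VC++ redistributable) in the installer is what guarantees this dependency can be satisfied on a clean machine.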

  • by Anonymous Coward on Sunday October 04, 2009 @05:07PM (#29638105)

    This has been heavily debated in comments in the Visual C++ blog:
    http://blogs.msdn.com/vcblog/archive/2009/08/05/active-template-library-atl-security-updates.aspx

    Unfortunately, the VC++ team doesn't seem to understand what's wrong with pushing out a critical update through automatic updates that silently updates dependency requirements. I've personally seen projects that were suddenly broken by this update and the ensuing confusion that resulted.

  • by Tony Hoyle (11698) <tmh@nodomain.org> on Sunday October 04, 2009 @05:13PM (#29638147) Homepage

    Indeed.. this is a non-story. If the submitter had distributed his app correctly it would have worked out of the box. Instead he decided to rely on a specific version of a DLL being installed on the target system, then blames Microsoft when it all goes to hell.

  • by petes_PoV (912422) on Sunday October 04, 2009 @05:16PM (#29638169)
    Now that memory is so cheap and disk space even cheaper, do we still need the small process sizes that dynamic linking brings?
    Would it be worth burning more RAM (although in an on-demand paged system, there's obviously no need to have your entire process resident) to get rid of the problems associated with incompatible versions of libraries? Just go back to statically linking everything, so you only ever need one binary, as all the routines it will ever call are already part of it.
  • by Tony Hoyle (11698) <tmh@nodomain.org> on Sunday October 04, 2009 @05:17PM (#29638181) Homepage

    No, because that's what sxs solves. You can have multiple versions of the MSVCxxx libraries installed and each app can find the correct one.

    It's similar to the library versioning that Unix does, except that instead of changing the filename, each library has a manifest containing its version number and a checksum of the library, and the loader knows how to find the right one.

    This is a complete non-story written by someone who doesn't test applications on a clean system prior to distribution, then wonders why it doesn't work.

  • Re:Also... (Score:5, Insightful)

    by mwvdlee (775178) on Sunday October 04, 2009 @05:18PM (#29638193) Homepage

    You do understand that DLL Hell exists exactly because version X and Y DO have huge differences?

  • by Tony Hoyle (11698) <tmh@nodomain.org> on Sunday October 04, 2009 @05:22PM (#29638239) Homepage

    If they were statically linked you'd have way more than 11gb of applications..

  • by QuoteMstr (55051) <dan.colascione@gmail.com> on Sunday October 04, 2009 @05:23PM (#29638243)

    But the other side is that the OS is massive.

    It'd be more massive if everything were statically linked.

    Remember, shared libraries didn't come first. The world began with static libraries, and shared libraries came later. There were good reasons for the switch, and those reasons apply even today.

  • by JohnFen (1641097) on Sunday October 04, 2009 @05:34PM (#29638329)

    In fact the SxS assembly system in windows is the only real way out of DLL hell, much better than the versioning scheme for shared libraries used in Linux.

    Better than the Linux scheme + proper shared library design? How? I've done extensive work with both, and the SxS scheme seems like a gigantic, fairly ugly hack to me (albeit not as ugly as a lot of other hacks), while Linux's scheme, while not perfect, seems much more elegant and reliable.

    I'm not trolling or picking a fight, I really want to know.

  • by heffrey (229704) on Sunday October 04, 2009 @05:34PM (#29638331)

    Yeah SxS works a treat. No more dll hell. Great for servicing too. The problem here is moronic devs not shipping the libraries that they link against. MS would be castigated if they didn't fix security holes. Why oh why does kdawson think this is a return to dll hell? Does he actually know what SxS is? Does he even have experience of windows development?

  • by QuoteMstr (55051) <dan.colascione@gmail.com> on Sunday October 04, 2009 @05:35PM (#29638343)

    Honestly there are now very few problems which can't be solved more quickly and far more effectively in managed code. The difference is even bigger when talking about multithreaded code.

    Writing "managed" code has nothing to do with using sane concurrency abstractions. You can do one without the other. Hell, you can easily write buggy managed code that relies on raw semaphores and deadlocks often, and you can write elegant message-passing C++ code. The key is the abstraction, not the runtime mechanism.

  • by Anonymous Coward on Sunday October 04, 2009 @05:36PM (#29638345)

    So how is this better? How is it better that when a library has a vulnerability, one has to go over every single application in their system because they all decided to bundle their own, to be sure it won't end up affecting him/her? Not saying you're wrong, maybe I am missing something about what SxS means, but I'm not convinced it is a better scheme.

  • by igomaniac (409731) on Sunday October 04, 2009 @05:49PM (#29638421)

    Why do you think it's a hack? I mean, the manifest files used by the SxS assembly system are much more expressive than the three digits used by libtool versioning to tell which shared libraries can be loaded for a specific process. Also note that two DLLs loaded into a process can reference different major versions of the same third DLL without a name clash (leading to two versions of it being loaded), while that's AFAIK not possible with shared libraries.

    http://www.freesoftwaremagazine.com/books/agaal/building_shared_libraries_once_using_autotools

    The SxS system also has some additional security since it uses signatures for the DLLs when loading your process, so it's much harder for a hacker to replace the library you're using behind your back (by setting LD_LIBRARY_PATH for example).

  • by Cyberax (705495) on Sunday October 04, 2009 @05:54PM (#29638453)

    Have you tried to install DLLs without using MSI?

    It's not so easy with NSIS, for example. And don't get me started on shared DLL usage counters...

  • Then don't bother (Score:3, Insightful)

    by OrangeTide (124937) on Sunday October 04, 2009 @06:00PM (#29638501) Homepage Journal

    Everyone having their own DLL would be the same as just statically linking everything: you'd have tons of code duplicated and loaded, and no easy way to patch common code system-wide.

    People suffer DLL hell because it is better than not using DLLs at all.

  • by QuoteMstr (55051) <dan.colascione@gmail.com> on Sunday October 04, 2009 @06:01PM (#29638503)

    The SxS system also has some additional security since it uses signatures for the DLLs when loading your process, so it's much harder for a hacker to replace the library you're using behind your back (by setting LD_LIBRARY_PATH for example).

    Funny, it's only proprietary software authors that think this way. Over here in the free world, application flexibility is seen as a Good Thing. LD_* hacks might not be the most elegant way to implement certain functionality, but the approach certainly makes hard things possible.

    And again, the SxS signing approach doesn't actually add any real security. Someone wanting to modify an application will find a way to do it regardless of any special "don't modify me" bits the application might contain.

    (Go ahead and moderate this troll. That doesn't make it any less true.)

  • by igomaniac (409731) on Sunday October 04, 2009 @06:06PM (#29638547)

    And again, the SxS signing approach doesn't actually add any real security. Someone wanting to modify an application will find a way to do it regardless of any special "don't modify me" bits the application might contain.

    You think public key signatures of the executable and its dependencies are not real security? ... Then what is?

  • by nstlgc (945418) on Sunday October 04, 2009 @06:09PM (#29638569)
    If your installer doesn't make sure that is the case, you're an idiot, because that's what installers are for.
  • by jma05 (897351) on Sunday October 04, 2009 @06:24PM (#29638685)

    Dynamic linking is being used because static linking has been denied as a choice in most of the current dev tools that matter on Windows. In Delphi, I had a choice of static and dynamic linking. I always chose static linking. Most Delphi users did the same. I didn't have that choice with VB6, Java, and .NET.

    Static linking is not bad. When smart linking, only the used routines from the library are bundled along, not everything in the library. When I used dynamic linking, to simplify installation for the user, I had to also distribute the full DLLs along (they would not be installed if they already exist on the target machine), even when I used only a small portion of their functionality. Consequently, my installers were always SMALLER when using static linking.

    If you are developing in-house applications, this is less of a concern though since you can reasonably assume their presence on the target machines; and because you will be using the same dev tools consistently. Dynamic linking is only efficient when the involved DLLs are relatively unchanging and necessary by many apps. This also works well in Linux where a package manager with dependency tracking is an assumed part of the system. Dynamic linking has its advantages and disadvantages. But it is not a solution that uncritically deserves its current dominance.

  • by bonch (38532) on Sunday October 04, 2009 @07:31PM (#29639075)

    If we're having to store different versions of libraries, the whole purpose of dynamic linking has already been defeated.

  • by JohnFen (1641097) on Sunday October 04, 2009 @07:46PM (#29639183)

    It seems like a hack to me because it's overengineering a solution to a problem that seems to be caused by poor design choices. The more elegant solution is to make good design choices. Admittedly, Linux's solution is of the same vein, but at least it's not overengineered, and so it doesn't actively encourage poor programming practices.

    As others have pointed out, your DLL example is itself an example of suboptimal engineering. Your situation should never arise in the first place, and is an example of underlying problems. To make it work without fixing the underlying problems is a hack.

    But I'm fascinated by this part:

    The SxS system also has some additional security since it uses signatures for the DLLs when loading your process, so it's much harder for a hacker to replace the library you're using behind your back (by setting LD_LIBRARY_PATH for example).

    Why do you think this is a good thing? In my opinion, the flexibility of being able to replace shared libraries "behind an application's back" is highly desirable, and is one of the benefits of using shared libraries.

    If your app is performing a function that really needs to be as locked-down and secure as possible, then you shouldn't be using shared libraries or DLLs at all. Fortunately, there are very, very few types of apps where such concerns are valid.

  • Re:Also... (Score:2, Insightful)

    by Blakey Rat (99501) on Sunday October 04, 2009 @08:14PM (#29639327)

    And yet Office still boots twice as fast as OpenOffice on a typical computer, IIS and MS SQL are completely neck-and-neck with their competitors, and Outlook completely trounces its closest competitor, performance-wise.

    I mean, I completely understand what you're saying: having multiple copies of the same function/code block in memory is inefficient. But in practice, it doesn't seem to be hurting them anyway.

  • Re:Also... (Score:5, Insightful)

    by Niten (201835) on Sunday October 04, 2009 @08:15PM (#29639335)

    MS shouldn't really be allowing such poor practices. Why should my memory be eaten up by loads of DLL files that are nearly identical? Let's face it, there aren't going to be huge differences between version X and Y.

    Versions X and Y of a DLL will be flat-out incompatible if that DLL is written in C++ and the author has changed the number of attributes in an interface class (unless he uses tricks such as pimpl), or if he's added or removed any virtual functions.

    And the fact that Microsoft is so good at preserving application backward compatibility, even in the face of "poor practices", is frankly one of the main reasons that Windows is the #1 business desktop operating system in the world.

  • by fluffy99 (870997) on Sunday October 04, 2009 @08:27PM (#29639425)

    Microsoft did this intentionally. They deprecated the vulnerable version of the DLL. Your "solution" to the problem of your customers still running the vulnerable version of the VC DLLs should be to either force them to upgrade or install the new DLLs for them. Instead you decide the security is a hassle and undo the fix on your developer machine, so you can ignore the larger issue that you are building insecure software and your customers are running insecure computers. Fix the problem, instead of whining about it and continuing to crank out crappy .NET software. How hard would it be to have your software check for the problem DLL versions, and direct the customer to download/install the new version? Cripes, games do it all the time when they check what version of DirectX is installed.

  • by roemcke (612429) on Sunday October 04, 2009 @08:32PM (#29639473)

    You think public key signatures of the executable and its dependencies are not real security? ... Then what is?

    Security never, ever works if the attacker and the one being attacked are the same person.

  • by Blakey Rat (99501) on Sunday October 04, 2009 @08:34PM (#29639487)

    A thing that a LOT of Linux programmers (and a lot of programmers in general) seem to miss is this simple fact, bolded for emphasis:

    Most programmers suck.

    The very fact that you're here reading Slashdot means it's likely that you do not suck. Most programmers do. For most programmers, "cargo cult" programming is a way of life. The majority of programmers do not, and never will, fully understand pointers to the level where they would be able to re-create the C++ STL by themselves. Relevant to this discussion: most programmers don't know how linking works, they just hit the "Play" button on the IDE and off it goes. Most programmers have zero knowledge of user permissions or fast user switching, and see nothing wrong with writing their application data in the Program Files folder.

    Most programmers never, ever read the API documentation. Not only do they have no problem using deprecated functions, but they don't even know what functions are deprecated.

    And when their programs break because of this? They blame Microsoft! It's Microsoft's fault, always Microsoft's fault!

    Now the open source community might be lucky enough that it has no bad programmers. (I doubt it, but let's play along.) Good for you. Microsoft, unfortunately, isn't that way: one of their biggest challenges is to keep terrible programmers from constantly breaking their own apps and/or Windows itself.

    What I'm getting at here is that Microsoft's goal is to make programming for Windows as easy and hands-off as possible. Any solution to this problem that requires the programmer to fix their application is far inferior to a solution that works "automatically."

    The programmer who posted this topic didn't read Microsoft's documentation, and screwed up his application's installer so that it links directly to a specific library version DLL instead of to the version-agnostic DLL. He's one of the bad programmers I've been talking about, but to be fair: considering he's actually posting here, he's probably one of the best of the bad. Hopefully he'll take something away from this and Slashdot won't spend the entire thread bashing Microsoft for no reason, but I doubt it.

  • by Anonymous Coward on Sunday October 04, 2009 @08:49PM (#29639575)

    They're virtually identical. Linux encodes the version number in the filename, which is bad practice for the same reason that file extensions defining file types are bad practice, but it's human-readable. Windows encodes the version number in human-readable XML, and goes one step further by giving it a checksum.

    The one other visible difference is if you try to gaze into the internal implementation details of Windows by navigating to %windir%\SxS and then take a naive view of the size of the directory, you might get your panties in a twist. Now, granted, the SxS structure there also essentially encodes version information in the directory structure, not so different from Linux, but abstracted away by the OS.

  • by Anonymous Coward on Sunday October 04, 2009 @10:02PM (#29640019)

    Can we stop bringing up Brainfuck every time a programming-related topic is discussed here?

    Yes, we already know about Brainfuck. Most of us have known about it for many years now. We don't need people bringing it up so bloody often.

  • by terjeber (856226) on Monday October 05, 2009 @02:20AM (#29641251)

    Static linking is not bad ... my installers were always SMALLER when using static

    How easy it is to show that you do not understand anything at all... who cares what size your installers are? If everybody followed the "I'll just statically link everything" the average Windows computer would need 32G of memory just to function (exaggerated to make a point).

    DLLs are good, but they have problems. Static linking is bad for anything slightly more advanced than a "Hello World" application.

  • Re:.NET internals (Score:4, Insightful)

    by shutdown -p now (807394) on Monday October 05, 2009 @03:06AM (#29641491) Journal

    Frankly speaking, any sane person compiling C++ code for Windows will just statically link the standard library, rendering this a non-issue.

  • by shutdown -p now (807394) on Monday October 05, 2009 @03:25AM (#29641591) Journal

    Most coders just write against API and should not care about binary compatibility.

    A well-designed stable API should also ensure binary compatibility. That's almost a given with C APIs, fairly trivial to get with Java or .NET, and needs special attention but definitely possible with C++.

  • by Twylite (234238) <twylite@crypt.coCHICAGO.za minus city> on Monday October 05, 2009 @03:51AM (#29641729) Homepage

    I really hate this /lib stuff. I remember the first time I made a C binary executable with GCC. It worked fine on my Linux box, so I thought it must work fine on any other Linux box. Wrong! Turned out I needed to apt-get a whole bunch of libraries...

    Seriously, all you are saying is that you didn't understand that your compiler was linking to a bunch of libraries, some of which were distributed with the OS and others were your responsibility to distribute when you created the application's setup/install package.

  • Re:Source Engine (Score:3, Insightful)

    by Xest (935314) on Monday October 05, 2009 @05:04AM (#29642077)

    "I was really pissed when I discovered that Microsoft had discontinued all versions of Visual Studio Express under the most recent one (2008, I believe?). I had to go and get a copy of VSE2005 off of bittorent since you could no longer download it from the Microsoft web site."

    Then you got pissed and used BitTorrent for absolutely no reason whatsoever, because Microsoft are touting Visual Studio Express more than ever nowadays, it's become a core product for them.

    http://www.microsoft.com/express/ [microsoft.com]

    What gave you the impression they'd discontinued them? Even on the standard Visual Studio pages links to express are and always have been clearly visible.

  • by gbjbaanb (229885) on Monday October 05, 2009 @05:37AM (#29642197)

    "It works fine for me" said the MS developer.

    I put it down to the decline of Microsoft. I've been working as an MS dev for the past 15+ years, and since Bill left (a coincidence, I feel) the company has started a steady decline: wasting money on frippery, attempting to get a new growth market, screwing with established systems in place of selling new stuff, and generally trying desperately to get your money off you. At least in the past, they were also focused on making good technical stuff too.

  • by dr_dex (49357) on Monday October 05, 2009 @07:14AM (#29642637) Homepage

    Put it in stackoverflow.com as a question. And then just answer it yourself.

  • by radtea (464814) on Monday October 05, 2009 @08:58AM (#29643163)

    If everybody followed the "I'll just statically link everything" the average Windows computer would need 32G of memory just to function (exaggerated to make a point).

    Wanna provide some data for that claim? And any guesses as to the number of people who ship private versions of the DLLs they need to ensure their app behaves properly because it depends on bugs in that specific DLL version? In my experience that's a pretty common move for anything above a "Hello World" application.

    Also, a number of people the GP was responding to were making points about how long it would take to download stuff, so I hope you replied to them as well, pointing out what dummies they are.

  • Re:Also... (Score:2, Insightful)

    by Blakey Rat (99501) on Monday October 05, 2009 @09:58AM (#29643723)

    The problem is when MS gimp Windows to hell - they know the workarounds, so Office runs OK.

    Oh come on! You can't just say stuff like that without providing any evidence. Do you have any? At all?

    It seems to me that Windows performance, in general, is at worst on par with the competition. Even Vista, if your computer meets the hardware specs. Now you're telling me that the only reason OpenOffice runs slow is because all of Windows is "gimped", and Office contains some kind of cheat code that "un-gimps" it? Seriously?
