
"Side By Side Assemblies" Bring DLL Hell 2.0 433

neutrino38 writes "This is an alert for all developers using Microsoft Visual Studio 2005. At the beginning of January, Microsoft issued a security fix for Visual Studio 2005 forcing the use of new dynamic libraries (DLLs) by all applications compiled with this IDE. Basically, applications compiled with Visual Studio 2005 will no longer work on an ordinary (non-dev) PC unless the newer DLLs are installed. We found this to be true even on fully updated PCs. I just posted some details and some suggested fixes." Read below for some more background on Microsoft's so-called "side by side assemblies."


For those unfamiliar with the Microsoft world, native Microsoft applications written in C++ rely on dynamic libraries. Two of them are infamous: MSVCRT.DLL and MFCxx.dll. Because of software evolution and security fixes, multiple versions of these DLLs were often present in the system, causing application instability. Where Linux implemented a simple suffix notation on its dynamic libraries, Microsoft created a new beast in 2001: the Side By Side assembly. These are basically DLLs with a companion XML file that identifies them. The XML file contains a digital signature, and when the system binds these DLLs dynamically to an application, it checks that the signature of the DLL matches the DLL itself. When everything runs well, this is pretty transparent. But when issues arise, it becomes excruciatingly difficult to troubleshoot and fix. DLL hell is not over.
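
To give a flavor of what such a companion XML file (the "manifest") looks like, here is a minimal sketch for the VC8 C runtime. The schema and the VC80 assembly name follow Microsoft's documented format, but the exact version number below is illustrative:

    <?xml version="1.0" encoding="UTF-8" standalone="yes"?>
    <assembly xmlns="urn:schemas-microsoft-com:asm.v1" manifestVersion="1.0">
      <!-- Identity of the shared assembly: name + exact version + signing key token -->
      <assemblyIdentity type="win32" name="Microsoft.VC80.CRT"
          version="8.0.50727.762" processorArchitecture="x86"
          publicKeyToken="1fc8b3b9a1e18e3b"/>
      <!-- The DLLs this assembly provides (real manifests also carry per-file hashes) -->
      <file name="msvcr80.dll"/>
      <file name="msvcp80.dll"/>
      <file name="msvcm80.dll"/>
    </assembly>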
    • by Omni-Cognate ( 620505 ) on Monday October 05, 2009 @06:08AM (#29642611)

      Hands up anyone who knows what an "activation context" is! If you don't, you have no idea what WinSxS does, how it does it, or how to diagnose it when it goes wrong.

      In my opinion, WinSxS is a good mechanism, or at least as good as Microsoft could have made it while working within the constraints of history. However, WinSxS cannot be used in the real world without properly understanding it, and achieving that understanding is very painful indeed. The MSDN documentation is piecemeal, appears incomplete and inaccurate in a few places and lacks a proper overview. I think the only reason I properly twigged what activation contexts are about is that I had recently written a mechanism that operated on similar principles (a thread-local stack of context-related information).

      I wrote a Wiki page at work describing what (I think) WinSxS's motivation is, how it works and some of the problems it suffers from. I'd like to put it somewhere on the public internet - any suggestions? It should ideally be somewhere wiki-like where people can correct its inevitable inaccuracies without me having to maintain it, but I'm not sure it's appropriate for wikipedia.
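
      Since the term will be new to many: an activation context is essentially a thread-local redirection table (a stack of them, in fact) that the loader consults when resolving DLL and COM references. Purely as a sketch of the moving parts, here is the real Win32 API for creating one by hand; the manifest and DLL file names are hypothetical:

          #include <windows.h>

          // Build an activation context from a manifest and make it current on
          // this thread, so LoadLibrary resolves through its version bindings.
          int main() {
              ACTCTXW actctx;
              ZeroMemory(&actctx, sizeof(actctx));
              actctx.cbSize = sizeof(actctx);
              actctx.lpSource = L"myapp.manifest";      // hypothetical manifest

              HANDLE hCtx = CreateActCtxW(&actctx);
              if (hCtx == INVALID_HANDLE_VALUE) return 1;

              ULONG_PTR cookie = 0;
              if (ActivateActCtx(hCtx, &cookie)) {
                  // Loads here are redirected per the manifest's bindings.
                  HMODULE mod = LoadLibraryW(L"somelib.dll");  // hypothetical DLL
                  if (mod) FreeLibrary(mod);
                  DeactivateActCtx(0, cookie);  // pop the thread-local stack
              }
              ReleaseActCtx(hCtx);
              return 0;
          }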

  • This is C++-only, right? Cuz I develop .NET code all day in VS2005, and it works very happily on all sorts of messed up machine configurations.
    • .NET internals (Score:3, Interesting)

      by NoYob ( 1630681 )
      I don't know. Let's say you're developing in C# or VB and you make calls to a library that is really a bunch of C++ classes and methods with C# or VB wrappers. Then, I'd assume, you would indeed have this problem.

      Someone with knowledge of .NET internals care to comment?

    • Which version of .net? Which build of that version? No good trying to run code written on .NET 3.5 on .NET 2 for example.

      Same with other elements like DirectX.

      If your installer doesn't make sure that the right version (or later, if applicable) is installed you'll get exactly the same problems.

    • by Entrope ( 68843 ) on Sunday October 04, 2009 @05:16PM (#29638635) Homepage

      This can bite you in a lot of conditions. One of the canonical examples is memory allocation. For example, foo.dll allocates memory and passes the pointer to bar.exe. To operate safely, bar.exe has to pass the pointer back to foo.dll so it can be freed. Otherwise, foo.dll might be using -- say -- malloc() and free() from one version of the C runtime library, and bar.exe might be using malloc() and free() from a different version. Because the different DLLs will end up allocating from different arenas, you'll corrupt both if you malloc() using one and free() using the other.

      There's a reasonable argument that passing memory ownership without providing allocation functions is a bad way to design libraries. Unfortunately, some interface standards specify bindings that forbid providing that kind of deallocation function in the DLL. (I'm looking at you, CCSDS SLE! I still haven't forgiven you for inflicting this form of DLL hell upon me so many years ago.)
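
      To make the hazard concrete, here is a rough sketch of the unsafe and safe patterns; foo.dll and its exports are hypothetical names:

          #include <stdlib.h>

          // DLL-side code (hypothetical foo.dll), built against CRT version A.
          extern "C" __declspec(dllexport) char* foo_make_buffer(size_t n) {
              return (char*)malloc(n);   // allocated from CRT A's heap
          }
          extern "C" __declspec(dllexport) void foo_free_buffer(char* p) {
              free(p);                   // freed by the same CRT that allocated it
          }

          // EXE-side code (bar.exe), possibly built against CRT version B.
          void use_buffer() {
              char* p = foo_make_buffer(64);
              // free(p);            // WRONG: frees via CRT B, corrupting both heaps
              foo_free_buffer(p);    // RIGHT: hands the pointer back to CRT A
          }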

    • Re: (Score:3, Informative)

      by gbjbaanb ( 229885 )

      This particular issue is C++ only, but that's simply because MS has screwed the way things used to work for unmanaged binaries.

      You may still get this issue the next time MS issues a patched version of any of the .NET libraries. Imagine that your code will only run on .NET 3.5 SP1 patch 1234; then you'll see the exact same problem. I guess we're all used to installing all manner of .NET framework libs through WU, so no one has seen this issue so far, but it's only a matter of time.

      Of course, if binaries used the previ

      • Re: (Score:3, Informative)

        It's as though any developer who doesn't work directly for Microsoft is an enemy of the state (of Microsoft).
        • by gbjbaanb ( 229885 ) on Monday October 05, 2009 @04:37AM (#29642197)

          "It works fine for me" said the MS developer.

          I put it down to the decline of Microsoft. I've been working as an MS dev for the past 15+ years, and since Bill left (a coincidence, I feel) the company has started a steady decline: wasting their money on frippery, chasing a new growth market, screwing with established systems in place of selling new stuff, and generally trying desperately to get your money off you. At least in the past, they were also focused on making good technical stuff.

    • by Anonymous Coward on Sunday October 04, 2009 @06:31PM (#29639077)
      I'm not sure what the authors of these posts were trying to accomplish, but there's some serious misinformation here. I've written a bunch of C++ apps on Windows using Visual Studio 2005, none of which rely on MFCxx.dll - that's only required when your application uses the MFC libraries. Also, the need for the redistributables for the C++ runtime has been around for a lot longer than this patch released in January (plus, it is now October) - not exactly "news". There's a reason the redistributables come with Visual Studio from the get-go; they aren't there for show.

      New redistributables are required for each version of Visual Studio, and even for each service pack (the alternative is to statically compile the runtime into the application, which many people do). The requirement isn't because of the implementation of side-by-side assemblies; it's because changes have occurred in the C++ runtime which are not binary compatible. If developers who use Visual Studio 2005 for commercial products still aren't aware of this and need this sensationalist story on Slashdot to warn them, they have bigger problems.

      But when issues arise, it becomes excruciatingly difficult to troubleshoot and fix.

      The version information is written in plain text in the manifest. The files in the WinSxS folder have names based on the version information. If you get the error and notice the files aren't there, it's fairly trivial to troubleshoot and fix.

      I'm not a fan of side by side assemblies, I just hate to see issues like this blown out of proportion as it obscures some of the real issues that developers face when developing for Windows (such as just about every bug filed on Microsoft Connect being closed as "by design" instead of being worked on or at the very least closed as "can't fix, compatibility issues", for example).
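
      As an illustration of that naming scheme, the versioned folders under WinSxS follow the pattern arch_name_publicKeyToken_version_locale_hash. The sketch below uses the real VC8 CRT names, but the trailing hash is made up:

          C:\Windows\WinSxS\
              x86_microsoft.vc80.crt_1fc8b3b9a1e18e3b_8.0.50727.762_x-ww_6b128700\
                  msvcr80.dll
                  msvcp80.dll
                  msvcm80.dll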

      • by shutdown -p now ( 807394 ) on Monday October 05, 2009 @02:10AM (#29641505) Journal

        You are 100% correct. The "problem" with those guys is that they did not even understand that they had to redistribute the CRT DLLs with their binaries even before the update. They got lucky there because on most systems the DLLs are already present (I believe Vista has them out of the box, and probably some MS software will install them on XP - maybe even IE7/8?), so they got away with not distributing them. Now the update brought in a newer version of the runtime, which of course the clients don't have, and they suddenly realize that they didn't distribute their dependencies properly. What does "DLL Hell" or side-by-side assemblies even have to do with it?

        And, if they find it so hard to comprehend the idea that they have to give users the DLLs they link against, they could always just compile CRT statically and be done with it.

  • Also... (Score:5, Informative)

    by SigILL ( 6475 ) on Sunday October 04, 2009 @03:52PM (#29637981) Homepage

    What might be interesting to note here is that the summary isn't everything there is to side-by-side (SxS) assemblies.

    Suppose you're building an application using two DLLs, let's call them A and B. Both depend on a third DLL named C. Now, suppose A uses C version X, and B uses C version Y. You're screwed, right? Not with SxS, since that allows multiple versions of C to be loaded. That's the real added value of SxS.

    All this is in theory of course, which as we all know is in theory equal to practice, but in practice is not...
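
    Concretely, A and B would each carry an embedded manifest pinning their own version of C, and the loader honors both at once. The assembly name, versions, and key token below are invented for the example, but the schema is the real one:

        <!-- Embedded in A.dll: binds to C version 1.0 -->
        <dependentAssembly>
          <assemblyIdentity type="win32" name="Contoso.C"
              version="1.0.0.0" processorArchitecture="x86"
              publicKeyToken="0123456789abcdef"/>
        </dependentAssembly>

        <!-- Embedded in B.dll: binds to C version 2.0; both can be loaded at once -->
        <dependentAssembly>
          <assemblyIdentity type="win32" name="Contoso.C"
              version="2.0.0.0" processorArchitecture="x86"
              publicKeyToken="0123456789abcdef"/>
        </dependentAssembly>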

    • A question .... (Score:3, Interesting)

      by taniwha ( 70410 )
      Not having programmed for Windows for many years now, I have to ask: what happens when these different versions of library C use different data structures or global variables?
    • Re:Also... (Score:4, Interesting)

      by jc42 ( 318812 ) on Sunday October 04, 2009 @05:50PM (#29638819) Homepage Journal

      Suppose you're building an application using two DLLs, let's call them A and B. Both depend on a third DLL named C. Now, suppose A uses C version X, and B uses C version Y. You're screwed, right? Not with SxS, since that allows multiple versions of C to be loaded. That's the real added value of SxS.

      Huh? Microsoft's compilers and linkers have allowed this for ages. The difference is that it's now a feature, not a bug.

      There was a study of MS's binary apps years ago (and I thought it was reported here, but I don't seem to find it in the archive), which analyzed the contents of the binary files. It was found that there were typically many copies of various library routines, sometimes 10 or more copies. They even had duplicated local data blocks. This was presented as a bug at the time, and an explanation of why MS's binaries required so much memory.

      But now this practice is called "side-by-side", and it's considered a feature. Which, of course, means that it never will be fixed, and users can look forward to even more bloated programs that contain copies of all the versions of a routine that have ever been released.

      It's sorta like extending the concept of a CMS into the binaries of apps. No longer will all those old versions be kept on the sidelines, off in obscure libraries. They will all be right there in the binary, ready to be loaded into memory if they are ever needed.

      Who was it who congratulated the software industry for consistently undoing all the amazing hardware advances of the past decades by responding to bigger memories and faster processors with software that uses more memory and CPU cycles? SxS is a major step forward in this technique, and promises to lead to major increases in memory sales in the near future. And all it took was realizing that a practice once considered sloppy wastefulness of memory was actually a valuable new feature for backwards compatibility.

      • Re: (Score:3, Informative)

        by drsmithy ( 35869 )

        Who was it that has congratulated the software industry for consistently undoing all the amazing hardware advances of the past decades by responding to bigger memories and faster processors by making the software use more memory and cpu cycles?

        That would be all the people for whom software is a tool, not intellectual masturbation. The people who consider "how soon can you deliver", "how much will it cost" and "how long can you maintain it" substantially more important than "will it run on my 10 year old

  • Speaking as a user (Score:5, Insightful)

    by Anonymous Coward on Sunday October 04, 2009 @03:52PM (#29637987)

    Speaking as a user, can we get statically linked libraries? I don't care if it's dependency hell or DLL hell. I want self-contained applications.

    • by Darkness404 ( 1287218 ) on Sunday October 04, 2009 @04:00PM (#29638041)
      All that is just great until you realize that each application may be -huge-. If someone can make a small script that converts OGG files into MP3 files using the OGG decoder and MP3 encoder that are already on my system, it saves me from downloading ~20 MB of extra files.
      • Re: (Score:3, Informative)

        by WarwickRyan ( 780794 )

        But the other side is that the OS is massive.

        c:\windows alone is 11 GB, with \winsxs being around 6 GB.

        Googling shows that \winsxs is where all these dll versions are being stored.

        I haven't got close to 11 GB of applications installed (and I've got VS, SQL Server + Office on here).

        • by Tony Hoyle ( 11698 ) <tmh@nodomain.org> on Sunday October 04, 2009 @04:22PM (#29638239) Homepage

          If they were statically linked you'd have way more than 11 GB of applications.

        • by QuoteMstr ( 55051 ) <dan.colascione@gmail.com> on Sunday October 04, 2009 @04:23PM (#29638243)

          But the other side is that the OS is massive.

          It'd be more massive if everything were statically linked.

          Remember, shared libraries didn't come first. The world began with static libraries, and shared libraries came later. There were good reasons for the switch, and those reasons apply even today.

          • Re: (Score:3, Funny)

            And now you've wasted another 638 of my bytes storing the extra pixels in your bold text, you insensitive clod!
          • by jma05 ( 897351 ) on Sunday October 04, 2009 @05:24PM (#29638685)

            Dynamic linking is being used because static linking has been denied as a choice in most of the current dev tools that matter on Windows. In Delphi, I had a choice of static and dynamic linking. I always chose static linking. Most Delphi users did the same. I didn't have that choice with VB6, Java, and .NET.

            Static linking is not bad. With smart linking, only the routines actually used from the library are bundled in, not everything in the library. When I used dynamic linking, to simplify installation for the user I had to distribute the full DLLs as well (they would not be installed if they already existed on the target machine), even when I used only a small portion of their functionality. Consequently, my installers were always SMALLER when using static linking.

            If you are developing in-house applications, this is less of a concern, since you can reasonably assume their presence on the target machines and you will be using the same dev tools consistently. Dynamic linking is only efficient when the involved DLLs are relatively unchanging and needed by many apps. It also works well on Linux, where a package manager with dependency tracking is an assumed part of the system. Dynamic linking has its advantages and disadvantages. But it is not a solution that uncritically deserves its current dominance.

            • Re: (Score:3, Insightful)

              by terjeber ( 856226 )

              Static linking is not bad ... my installers were always SMALLER when using static

              How easy it is to show that you do not understand anything at all... who cares what size your installers are? If everybody followed the "I'll just statically link everything" approach, the average Windows computer would need 32G of memory just to function (exaggerated to make a point).

              DLLs are good, but they have problems. Static linking is bad for anything slightly more advanced than a "Hello World" application.

              • Re: (Score:3, Insightful)

                by radtea ( 464814 )

                If everybody followed the "I'll just statically link everything" the average Windows computer would need 32G of memory just to function (exaggerated to make a point).

                Wanna provide some data for that claim? And any guesses as to the number of people who ship private versions of the DLLs they need to ensure their app behaves properly because it depends on bugs in that specific DLL version? In my experience that's a pretty common move for anything above a "Hello World" application.

                Also, a number of people th

        • Re: (Score:3, Interesting)

          by MioTheGreat ( 926975 )
          No. Winsxs is not 6 GB. That's just Explorer being stupid when it tells you how big it is. There are a lot of hard links in there.
      • Re: (Score:3, Informative)

        Also keep in mind you're looking at a huge RAM footprint as well, since multiple copies of the same library will be loaded per application, whereas with shared libraries the OS can sometimes map a single copy into many processes with paging magic.

        That said, I thought that OS X apps were statically linked (except with OS libs), and thus tend to be large, but reduce this issue since so much functionality is based on system libs. I could be wrong; I don't really ever work with non-Linux systems.

        • by onefriedrice ( 1171917 ) on Sunday October 04, 2009 @06:26PM (#29639043)

          That said, I thought that OS X apps were statically linked (except with OS libs), and thus tend to be large, but reduce this issue since so much functionality is based on system libs. I could be wrong.

          Yeah, you are wrong. Mac OS X apps are definitely dynamically linked (the equivalent of .so on Mac OS X is .dylib). The reason Mac OS X apps can be larger than executables on other platforms is that they often contain machine code for multiple architectures (PPC, x86, x86_64). That only translates to a large footprint in storage.

          Now you know.

    • by Timothy Brownawell ( 627747 ) <tbrownaw@prjek.net> on Sunday October 04, 2009 @04:03PM (#29638069) Homepage Journal
      That's great until some common library needs to be updated for a security hole, and you have to download 20 updates from different vendors (assuming they even provide updates...) rather than 1 item in Windows Update.
    • by moon3 ( 1530265 )
      self-contained applications

      The TFA problem is easily resolved by statically linking the stuff. And the thing you pointed out is the Achilles' heel of pretty much any *NIX system, where the app's dependencies are often much more widespread and the app itself is not self-contained, but rather disseminated over the whole system.
    • Re: (Score:2, Informative)

      by seneces ( 839286 )
      Applications can statically link the CRT with /MT or /MTd instead of the (default) /MD and /MDd. It's pretty common, and I've found that the actual increase to binary size is very small. It often cuts down on distribution size anyway, since that allows /OPT:REF (eliminate unreferenced code) to take effect. It'd be nice if the CRT were available on all systems by default and we didn't have to worry about it, but failing that, static linking is a *necessity* for anything that doesn't use a full installer.
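
      For anyone who hasn't met those flags, the two linkage modes look like this on the compiler command line (these are the real cl.exe options; app.cpp is a placeholder):

          rem Dynamic CRT (the VS2005 default): needs the matching redistributable
          cl /MD /O2 app.cpp

          rem Static CRT: runtime code linked into the binary, nothing to redistribute
          cl /MT /O2 app.cpp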
  • Instead of referencing the .dll in \Windows\System\, why don't you reference a copy of the .dll in \Program Files\Your Stupid App\ ?

    Seems like a simple fix to me, though I'll admit most of my .dll experience comes from the asp.net/c# world.

    • by Anonymous Coward on Sunday October 04, 2009 @04:01PM (#29638053)

      That defeats the whole purpose of a DLL anyway. The thought was that you wouldn't have to "reinvent the wheel" and you could reuse code. However, Microsoft's failure to document their operating system's API thoroughly in a public manner led to developers relying on undocumented features that were later changed. Then, those applications needed older versions of those libraries and would install them over the newer versions. This, of course, crashed the newer applications. Ugh.

      • Re: (Score:3, Interesting)

        by edxwelch ( 600979 )

        I hate to break it to you, but there is quite a lot of this going on. For instance, if you were to look at the source code of every app that needs to decode png files (just as example), you would probably find only about 50% use the libraries that come with the OS, and the reasons why vary:
        * avoiding dll hell (as mentioned)
        * the app is cross platform
        * poor implementation of official dlls
        * politics (at one stage Microsoft tied certain DLLs to the installation of IE, even though they had nothing to do wi

    • Then don't bother (Score:3, Insightful)

      by OrangeTide ( 124937 )

      Everyone having their own DLL would be the same as just statically linking everything: you'd have tons of code duplicated and loaded, and no easy way to patch common code system-wide.

      People suffer DLL hell because it is better than not using DLLs at all.

  • by igomaniac ( 409731 ) on Sunday October 04, 2009 @04:05PM (#29638085)

    Everybody who develops applications for the Windows platform should know that you need to include the merge module for the C/C++ runtime libraries in your installer. You've just been lucky so far that other applications have installed the DLLs you needed for you. Try your app the way it is on a clean install of Windows XP without the service packs and see how well that goes :P

    In fact the SxS assembly system in Windows is the only real way out of DLL hell, much better than the versioning scheme for shared libraries used in Linux. Get your facts straight before posting.
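
    For reference, pulling that merge module into an installer is only a few lines. The sketch below assumes WiX; the .msm file name is the real VS2005 one, and the Id values are arbitrary:

        <DirectoryRef Id="TARGETDIR">
          <Merge Id="VC8CRT" SourceFile="Microsoft_VC80_CRT_x86.msm"
                 DiskId="1" Language="0"/>
        </DirectoryRef>
        <Feature Id="VC8CRTFeature" Title="VC8 CRT" Display="hidden" Level="1">
          <MergeRef Id="VC8CRT"/>
        </Feature>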

    • by Tony Hoyle ( 11698 ) <tmh@nodomain.org> on Sunday October 04, 2009 @04:13PM (#29638147) Homepage

      Indeed.. this is a non-story. If the submitter had distributed his app correctly it would have worked out of the box. Instead he decided to rely on a specific version of a DLL being installed on the target system, then blames Microsoft when it all goes to hell.

    • by Gerald ( 9696 )

      How does that work for PortableApps or U3 packages?

    • by JohnFen ( 1641097 ) on Sunday October 04, 2009 @04:34PM (#29638329)

      In fact the SxS assembly system in windows is the only real way out of DLL hell, much better than the versioning scheme for shared libraries used in Linux.

      Better than the Linux scheme + proper shared library design? How? I've done extensive work with both, and the SxS scheme seems like a gigantic, fairly ugly hack to me (albeit not as ugly as a lot of other hacks), while Linux's scheme, while not perfect, seems much more elegant and reliable.

      I'm not trolling or picking a fight, I really want to know.

      • Re: (Score:3, Insightful)

        by igomaniac ( 409731 )

        Why do you think it's a hack? I mean, the manifest files used by the SxS assembly system are much more expressive than the three digits used by libtool versioning to tell which shared libraries can be loaded for a specific process. Also note that two DLLs loaded into a process can reference different major versions of the same third DLL without a name clash (leading to two versions of it being loaded), while that's AFAIK not possible with shared libraries.

        http://www.freesoftwaremagazine.com/books/agaal/buil

        • by QuoteMstr ( 55051 ) <dan.colascione@gmail.com> on Sunday October 04, 2009 @05:01PM (#29638503)

          The SxS system also has some additional security since it uses signatures for the DLLs when loading your process, so it's much harder for a hacker to replace the library you're using behind your back (by setting LD_LIBRARY_PATH for example).

          Funny, it's only proprietary software authors that think this way. Over here in the free world, application flexibility is seen as a Good Thing. LD_* hacks might not be the most elegant way to implement certain functionality, but the approach certainly makes hard things possible.

          And again, the SxS signing approach doesn't actually add any real security. Someone wanting to modify an application will find a way to do it regardless of any special "don't modify me" bits the application might contain.

          (Go ahead and moderate this troll. That doesn't make it any less true.)

          • Re: (Score:3, Insightful)

            by igomaniac ( 409731 )

            And again, the SxS signing approach doesn't actually add any real security. Someone wanting to modify an application will find a way to do it regardless of any special "don't modify me" bits the application might contain.

            You think public key signatures of the executable and its dependencies are not real security? ... Then what is?

      • by Blakey Rat ( 99501 ) on Sunday October 04, 2009 @07:34PM (#29639487)

        A thing that a LOT of Linux programmers (and a lot of programmers in general) seem to miss is this simple fact, bolded for emphasis:

        Most programmers suck.

        The very fact that you're here reading Slashdot means it's likely that you do not suck. Most programmers do. For most programmers, "cargo cult" programming is a way of life. The majority of programmers do not, and never will, fully understand pointers to the level where they would be able to re-create the C++ STL by themselves. Relevant to this discussion: most programmers don't know how linking works, they just hit the "Play" button on the IDE and off it goes. Most programmers have zero knowledge of user permissions or fast user switching, and see nothing wrong with writing their application data in the Program Files folder.

        Most programmers never, ever read the API documentation. Not only do they have no problem using deprecated functions, but they don't even know what functions are deprecated.

        And when their programs break because of this? They blame Microsoft! It's Microsoft's fault, always Microsoft's fault!

        Now the open source community might be lucky enough that it has no bad programmers. (I doubt it, but let's play along.) Good for you. Microsoft, unfortunately, isn't that way: one of their biggest challenges is to keep terrible programmers from constantly breaking their own apps and/or Windows itself.

        What I'm getting at here is that Microsoft's goal is to make programming for Windows as easy and hands-off as possible. Any solution to this problem that requires the programmer to fix their application is far inferior to a solution that works "automatically."

        The programmer who posted this topic didn't read Microsoft's documentation, and screwed up his application's installer so that it links directly to a specific library version's DLL instead of to the version-agnostic DLL. He's one of the bad programmers I've been talking about, but to be fair: considering he's actually posting here, he's probably one of the best of the bad. Hopefully he'll take something away from this and Slashdot won't spend the entire thread bashing Microsoft for no reason, but I doubt it.

    • Re: (Score:3, Insightful)

      by Cyberax ( 705495 )

      Have you tried to install DLLs without using MSI?

      It's not so easy with NSIS, for example. And don't get me started on shared DLL usage counters...

  • DLL hell never left (Score:5, Interesting)

    by eddy ( 18759 ) on Sunday October 04, 2009 @04:05PM (#29638093) Homepage Journal

    I upgraded my Fallout 3 installation yesterday. After patching, the game wouldn't run, returning some fairly obtuse message about import ordinals [google.com]. So I googled the message, and found out it's because the game now links against a newer version of "Microsoft(R) Games for Windows(TM) Live(TM)" whatever. Note that this wasn't some new patch, it's months old and yet this problem, which must realistically be hitting quite a few users, persists. This isn't something you get via Windows Update either, this is just some obscure 'distributable runtime' crap you should know you need?

    So let me repeat that: Super mainstream game on a super mainstream platform (Vista x64), no add-ons, I patch to the latest version and it won't start, nothing is mentioned at the developer's site.

    Now I recognize good old Bethesda again. Here's how they'd be able to repro: Fully updated Vista machine, install game from DVD, apply patch, notice it won't fucking run.

    I don't normally give much for the 'PC-gaming sucks' choir, but c'mon..

    • Re: (Score:3, Informative)

      by causality ( 777677 )

      I upgraded my Fallout 3 installation yesterday. After patching, the game wouldn't run, returning some fairly obtuse message about import ordinals [google.com]. So I googled the message, and found out it's because the game now links against a newer version of "Microsoft(R) Games for Windows(TM) Live(TM)" whatever. Note that this wasn't some new patch, it's months old and yet this problem, which must realistically be hitting quite a few users, persists. This isn't something you get via Windows Update either, this is just some obscure 'distributable runtime' crap you should know you need?

      So let me repeat that: Super mainstream game on a super mainstream platform (Vista x64), no add-ons, I patch to the latest version and it won't start, nothing is mentioned at the developer's site.

      Now I recognize good old Bethesda again. Here's how they'd be able to repro: Fully updated Vista machine, install game from DVD, apply patch, notice it won't fucking run.

      I don't normally give much for the 'PC-gaming sucks' choir, but c'mon..

      I had the same problem. Only, I run Fallout3 in Linux via WINE and there is apparently no way whatsoever to get xlive.dll to work in WINE. In addition, you do need the latest Fallout3 patches in order to install the expansions. Personally, I found it unacceptable that I would not be able to use any of the expansions merely because someone decided to add functionality that I never asked for, do not need, and will never use.

      I found a solution. There is a patch for Fallout3 that removes all Live functio

  • by Anonymous Coward on Sunday October 04, 2009 @04:07PM (#29638105)

    This has been heavily debated in comments in the Visual C++ blog:
    http://blogs.msdn.com/vcblog/archive/2009/08/05/active-template-library-atl-security-updates.aspx

    Unfortunately, the VC++ team doesn't seem to understand what's wrong with pushing out a critical update through automatic updates that silently updates dependency requirements. I've personally seen projects that were suddenly broken by this update and the ensuing confusion that resulted.

    • Re: (Score:3, Informative)

      I agree that updates to the IDE & toolchain that introduce new dependencies for binaries produced by that toolchain shouldn't be silent. On the other hand, installing the application on a clean system (for every OS you support, counting each major Windows version separately) and checking if it runs is one of the most basic tests that every new build of the product should go through. It's trivial to automate, too (if the installer is written properly and allows for silent installation) so there's no excu

  • by petes_PoV ( 912422 ) on Sunday October 04, 2009 @04:16PM (#29638169)
    Now that memory is so cheap and disk space even cheaper, do we still need the small process sizes that dynamic linking brings?
    Would it be worth burning more RAM (although in an on-demand paged system, there's obviously no need to have your entire process resident) to get rid of the problems associated with incompatible versions of libraries? Just go back to statically linking everything, so you only ever need 1 binary - all the routines it will ever call are already part of it.
    • problems associated with incompatible versions of libraries.

      We'll never get rid of the problem with incompatible libraries until processes stop communicating with each other. What if two processes communicate over a shared memory segment where one version is using a data structure with an extra struct field? What about window message differences, drag-and-drop, and file formats? Sure, static linking might paper over some problems, but it won't free programmers from having to think about backwards compatibil

    • by Jahava ( 946858 ) on Sunday October 04, 2009 @05:09PM (#29638571)
    It's not just a matter of disk / process size. DLLs (and dynamically-linked libraries in general) are also an important buffer between the API [wikipedia.org] / ABI [wikipedia.org] layer - which differs between OSes (Windows 7, Vista, XP, etc.) and OS versions (SP1, SP2, and even minor updates...) - and the application layer. They allow the OS to fundamentally change how it works (e.g., add a field to the structure behind the "create a window" call), update the associated DLL(s) (e.g., rewrite the "CreateWindow" function in core.dll), and have all of the applications installed on the system still work perfectly without needing a single update.

      Additionally, DLLs can enhance security. If there's a bug in MSVCRT.DLL, Microsoft can patch it and release it as an update. In a static world, every application that had that buggy code compiled into it would have to be rebuilt and redistributed.

      In a statically-linked world, every application would need to be rebuilt (on Windows, redistributed) every time an API / ABI change was released, as well as every time a bug was fixed. Furthermore, download sites would either have to release binaries for every API / ABI combination (that's a ton of releases per version) and deal with users downloading the wrong one, or do the open-source model and release the source, forcing the users to build the application on their system and rebuild every API / ABI update. And somehow I don't think the latter solution would fly with the Windows community.

      Like other posters have said, Microsoft's solution is actually not a bad one. Allowing multiple DLLs to be loaded simultaneously is not a pretty solution, but it's not a pretty problem that they have to solve, either. Advance with backwards-compatibility in mind as much as it makes sense to, and use SxS DLLs when it doesn't.
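
      The extreme form of that decoupling is explicit runtime binding, where the application doesn't even hard-code the import. Just as a sketch - the Win32 calls are real, while the DLL name and export are the hypothetical ones from the example above:

          #include <windows.h>

          // Bind at run time to whatever implementation the OS currently ships.
          typedef int (WINAPI *CreateThingFn)(int flags);

          int call_create_thing(int flags) {
              HMODULE mod = LoadLibraryW(L"core.dll");   // hypothetical DLL
              if (!mod) return -1;
              CreateThingFn fn =
                  (CreateThingFn)GetProcAddress(mod, "CreateThing");  // hypothetical export
              int result = fn ? fn(flags) : -1;
              FreeLibrary(mod);
              return result;
          }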

  • I read the linked article [com.com], and don't see anything exciting. How is this any different from the shared libraries that have been used in the Unix world for 23 years [faqs.org]? "Private assemblies" can be achieved with rpath or a simple manipulation of LD_LIBRARY_PATH (see Mozilla). And what does the asymmetric cryptography buy you over simply using the appropriate soname?
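
    For comparison, the Unix versioning scheme in question is the usual soname symlink chain; the library name below is made up:

        libfoo.so.1.2.3                   # the real file, full version
        libfoo.so.1 -> libfoo.so.1.2.3    # soname: what binaries record and load
        libfoo.so   -> libfoo.so.1        # dev link: what -lfoo resolves at build time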

  • by heffrey ( 229704 ) on Sunday October 04, 2009 @04:34PM (#29638331)

    Yeah, SxS works a treat. No more DLL hell. Great for servicing too. The problem here is moronic devs not shipping the libraries that they link against. MS would be castigated if they didn't fix security holes. Why oh why does kdawson think this is a return to DLL hell? Does he actually know what SxS is? Does he even have experience of Windows development?

    • Re: (Score:3, Funny)

      by Rogerborg ( 306625 )
      If everyone ships their own copies of shared libraries, then shared libraries won't be a problem any more? You've cleaved that Gordian knot in twain.
  • Additional reading (Score:5, Informative)

    by QuoteMstr ( 55051 ) <dan.colascione@gmail.com> on Sunday October 04, 2009 @04:48PM (#29638413)

    Everyone (even Windows programmers) should read Ulrich Drepper's piece on how to write shared libraries [slashdot.org]. (Warning: PDF.) (Likewise, even free software developers should read Raymond Chen's blog [msdn.com].)

  • by fluffy99 ( 870997 ) on Sunday October 04, 2009 @07:27PM (#29639425)

    Microsoft did this intentionally. They deprecated the vulnerable version of the DLL. Your "solution" to the problem of your customers still running the vulnerable version of the VC DLLs should be to either force them to upgrade or install the new DLLs for them. Instead you decide the security fix is a hassle and undo it on your developer machine, so you can ignore the larger issue: that you are building insecure software and your customers are running insecure computers. Fix the problem, instead of whining about it and continuing to crank out crappy .NET software. How hard would it be to have your software check for the problem DLL versions and direct the customer to download/install the new version? Cripes, games do it all the time when they check what version of DirectX is installed.

  • First off, why on earth is the developer still using Visual Studio 2005? We're on Visual Studio 2008, SP1. That, right there, raises a red flag. If someone compiled something with an ancient version of gcc and found out it didn't work, when distributed, on more up to date Linux distributions, wouldn't you think that the appropriate response would be for our man to get his tools straightened out first?

    I would think that if the author shipped his system with a copy of the runtimes and had them install in his application directory, he would have no problem at all. The Windows DLL load order is application directory first, then some other locations, so his application should always find the right libraries, if he shipped them. In fact, I even think there's some sort of a doohickey that you can do to have Windows look for COM components first in your own directory before it looks for them in common areas. There's no need to have "DLL hell" at all, unless the developer really asks for it.
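
    For the VC8 runtime specifically, the documented app-local ("private assembly") layout is a subfolder named after the assembly, next to the executable; MyApp is a placeholder:

        C:\Program Files\MyApp\
            MyApp.exe
            Microsoft.VC80.CRT\
                Microsoft.VC80.CRT.manifest
                msvcr80.dll
                msvcp80.dll
                msvcm80.dll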

    Frankly, I doubt DLLs of the relatively small size of the CRT should even be used any more across different applications from different vendors.

    1. First, you cannot possibly test your application with all the different versions and patch levels of DLLs that are out there, because patch releases come far too fast now. Reliability, not performance, is the pre-eminent problem in the software community right now.

    2. The static linker is now actually capable of removing unused code from an image; it could not do that before.

    3. DLLs have to be relocated when an application loads them into its own process space, so you take a performance hit there.

    4. The Windows API has 95% of what you would need the C runtime to do. This isn't like Linux, where you would die without libc trying to make heads or tails of syscalls and whatnot. On Windows, I struggle to think of a CRT call that could not be done nearly as simply with the SDK directly. For files there's CreateFile, WriteFile, etc. (see the sketch below). All of the basic string functions exist within the SDK, and the stuff to display formatted strings in the SDK is better than what the CRT gives you anyway. It's a bit more involved, but there are articles out there on how to not have a CRT at all. In fact, applications that use the ATL and WTL frameworks even support not having the CRT code, just so you can write really, really tiny applications and, gasp, COM components.
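
    As a rough sketch of that equivalence, here is the same append-to-a-log-file operation written once against the CRT and once against the raw Windows API (the file name is a placeholder):

        #include <windows.h>
        #include <cstdio>

        void write_with_crt(const char* msg) {
            FILE* f = fopen("log.txt", "a");   // CRT: pulls in msvcr80.dll under /MD
            if (f) { fputs(msg, f); fclose(f); }
        }

        void write_with_sdk(const char* msg) {
            // Same effect using only the Windows API: no C runtime involved.
            HANDLE h = CreateFileA("log.txt", FILE_APPEND_DATA, 0, NULL,
                                   OPEN_ALWAYS, FILE_ATTRIBUTE_NORMAL, NULL);
            if (h == INVALID_HANDLE_VALUE) return;
            DWORD written = 0;
            WriteFile(h, msg, (DWORD)lstrlenA(msg), &written, NULL);
            CloseHandle(h);
        }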

    • by JSBiff ( 87824 ) on Sunday October 04, 2009 @11:04PM (#29640725) Journal

      "The Windows API has 95% of what you would need the C runtime to do."

      Unless what you need the C runtime to do is to be cross-platform compatible. Then it has 0% of what I need the C runtime to do. The reason to have a standard c library, at all, was to make applications significantly more portable. That's why it's, I believe, part of the ANSI and ISO specifications of C, is it not? Sure, any vendor can create their own proprietary, non-portable, runtime library. I'm sure Microsoft would be delighted for you to use CreateFile, WriteFile, et al., because you've now tightly bound your C application to Windows. Of course, sometimes that's fine. Sometimes your app will be tightly bound by other things like DirectX, anyhow, so, might as well go native API's all the way.

      But, if I'm using the C runtime library, it's because I'm trying to write portable code.

      Wasn't one of the original goals of DLLs to avoid lots of duplicate copies of the same runtime libraries littering up the hard drive, and RAM? It's my understanding that one of the 'advantages' of a DLL is that you could have 5, or 10, or 20 programs running, and they could all 'share' one copy of the DLL loaded in RAM. And if the DLL was already loaded in RAM by another program, it would also save a little bit of time that otherwise would have been required to load the DLL from disk, etc. I suppose, however, that since both hard drive space and RAM have gotten orders of magnitude larger, faster, and cheaper than in the past, perhaps, for the sake of reliability, it does make sense to just go ahead and sacrifice some efficiency, as you suggest, by either having every program install its own directory-local copy, or even statically link.

"The vast majority of successful major crimes against property are perpetrated by individuals abusing positions of trust." -- Lawrence Dalzell

Working...