"Side By Side Assemblies" Bring DLL Hell 2.0 433
neutrino38 writes "This is an alert for all developers using Microsoft Visual Studio 2005. At the beginning of January, Microsoft issued a security fix for Visual Studio 2005 forcing the use of new dynamic libraries (DLLs) by all applications compiled with this IDE. Basically, applications compiled with Visual Studio 2005 will no longer work on an ordinary (non-dev) PC unless the newer DLLs are installed. We found this to be true even on fully updated PCs. I just posted some details and some suggested fixes." Read below for some more background on Microsoft's so-called "side by side assemblies."
For those unfamiliar with the Microsoft world: native Windows applications written in C++ rely on dynamic libraries, two of which are infamous: MSVCRT.DLL and MFCxx.dll. Because of software evolution and security fixes, multiple versions of these DLLs were often present on a system, causing application instability. Where Linux settled on a simple suffix notation for its shared libraries, Microsoft created a new beast in 2001: the side-by-side assembly. These are basically DLLs with a companion XML manifest that identifies them. The manifest contains a digital signature, and when the system binds these DLLs dynamically to an application, it checks that the signature matches the DLL itself. When everything runs well, this is pretty transparent. But when issues arise, it becomes excruciatingly difficult to troubleshoot and fix. DLL hell is not over.
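(For the curious, such a manifest is just XML. A minimal sketch of what one looks like; the version and hash values here are illustrative, from memory, rather than copied off a real system:)

    <?xml version="1.0" encoding="UTF-8" standalone="yes"?>
    <assembly xmlns="urn:schemas-microsoft-com:asm.v1" manifestVersion="1.0">
        <assemblyIdentity type="win32"
                          name="Microsoft.VC80.CRT"
                          version="8.0.50727.4053"
                          processorArchitecture="x86"
                          publicKeyToken="1fc8b3b9a1e18e3b"/>
        <!-- the loader checks each member file against its recorded hash -->
        <file name="msvcr80.dll" hash="..." hashalg="SHA1"/>
    </assembly>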
Oh the beauty of fixing what was already working (Score:2, Insightful)
The most important concept in WinSxS (Score:5, Informative)
Hands up anyone who knows what an "activation context" is! If you don't, you have no idea what WinSxS does, how it does it, or how to diagnose it when it goes wrong.
In my opinion, WinSxS is a good mechanism, or at least as good as Microsoft could have made it while working within the constraints of history. However, WinSxS cannot be used in the real world without properly understanding it, and achieving that understanding is very painful indeed. The MSDN documentation is piecemeal, appears incomplete and inaccurate in a few places, and lacks a proper overview. I think the only reason I properly twigged what activation contexts are about is that I had recently written a mechanism that operated on similar principles (a thread-local stack of context-related information).
I wrote a Wiki page at work describing what (I think) WinSxS's motivation is, how it works and some of the problems it suffers from. I'd like to put it somewhere on the public internet - any suggestions? It should ideally be somewhere wiki-like where people can correct its inevitable inaccuracies without me having to maintain it, but I'm not sure it's appropriate for wikipedia.
And by all developers you mean (Score:2)
.NET internals (Score:3, Interesting)
Someone with knowledge of .NET internals care to comment?
Re:.NET internals (Score:4, Insightful)
Frankly speaking, any sane person compiling C++ code for Windows will just statically link the standard library, rendering this a non-issue.
Re:.NET internals (Score:4, Funny)
Re: (Score:2)
Which version of .NET? Which build of that version? No good trying to run code written for .NET 3.5 on .NET 2, for example.
Same with other elements like DirectX.
If your installer doesn't make sure that the right version (or later, if applicable) is installed you'll get exactly the same problems.
Re:And by all developers you mean (Score:5, Interesting)
This can bite you in a lot of conditions. One of the canonical examples is memory allocation. For example, foo.dll allocates memory and passes the pointer to bar.exe. To operate safely, bar.exe has to pass the pointer back to foo.dll so it can be freed. Otherwise, foo.dll might be using -- say -- malloc() and free() from one version of the C runtime library, and bar.exe might be using malloc() and free() from a different version. Because the different DLLs will end up allocating from different arenas, you'll corrupt both if you malloc() using one and free() using the other.
There's a reasonable argument that passing memory ownership without providing allocation functions is a bad way to design libraries. Unfortunately, some interface standards specify bindings that forbid providing that kind of deallocation function in the DLL. (I'm looking at you, CCSDS SLE! I still haven't forgiven you for inflicting this form of DLL hell upon me so many years ago.)
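(A sketch of the defensive design described above - all names hypothetical: the DLL that allocates also exports the matching deallocator, so both calls hit the same CRT heap no matter which runtime the EXE linked against:)

    /* foo.h -- interface of a hypothetical foo.dll */
    #pragma once
    #include <stddef.h>
    #ifdef FOO_EXPORTS   /* defined only while building foo.dll itself */
    #  define FOO_API __declspec(dllexport)
    #else
    #  define FOO_API __declspec(dllimport)
    #endif
    extern "C" {
        FOO_API char* foo_create_buffer(size_t size); /* allocates from foo.dll's CRT heap */
        FOO_API void  foo_destroy_buffer(char* p);    /* the ONLY safe way to release it   */
    }

    /* foo.cpp -- compiled into foo.dll with /DFOO_EXPORTS */
    #include "foo.h"
    #include <stdlib.h>
    char* foo_create_buffer(size_t size) { return (char*)malloc(size); }
    void  foo_destroy_buffer(char* p)    { free(p); } /* same CRT as the malloc above */

If bar.exe calls free() on the pointer directly, it frees into whatever CRT heap bar.exe happens to link against, which is exactly the corruption the parent describes.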
Re: (Score:3, Informative)
That's not a Visual Studio C++ issue, it's the way Windows memory management works. No matter what IDE/compiler/CRT you use, memory allocated by one dll cannot be (reliably) freed by another. It has to be freed by the same dll that allocated it. I'm not 100% sure, but I think you might also run into the same issue if you unload the dll and reload it before freeing the memory.
Re:And by all developers you mean (Score:5, Interesting)
If you use GlobalAlloc() to allocate the memory, then GlobalFree() always frees it. (I'm quoting from my nightmares here.)
The problem happens in VC++. The new operator eventually calls malloc(), which eventually calls GlobalAlloc(), through chains of function calls that are fairly non-obvious unless you read the disassembly or the source. GlobalAlloc() is a base Windows API function, so every DLL links to the same system DLLs. The new and malloc calls are in the Microsoft Visual C++ libraries. Those libraries are loaded on a per-DLL/EXE basis. As such, different VC++ DLLs can link to different VC++ run-time libraries, containing identical (or nearly identical) new and malloc functions but with different data areas.
Additionally, malloc() is optimized so it doesn't always call GlobalAlloc() whenever new memory is required. malloc() has its own list of memory allocations, and that is where the problem is. A malloc() from one DLL with its own data memory area knows nothing about another DLL's data memory area. As such, the free() call can't possibly succeed when the data was allocated in a different DLL.
Unfortunately, the torture doesn't end there. There are only two ways around the problem. Firstly, you can never pass memory allocations across DLL boundaries. Unfortunately, for some applications this doesn't work, for example COM and ActiveX controls. Alternatively, you can create a new type of memory handler to handle inter-DLL memory allocations. Microsoft created the IMalloc API for this reason. However, it is impossible to make the IMalloc API work across all possible failure modes. Also, IMalloc is not used by default for either new, delete, malloc() or free(). As such, the IMalloc API does not completely solve the inter-DLL issues, and introduces new problems of its own.
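(The usual escape hatch, sketched below: allocate anything that crosses a module boundary with the process-wide COM task allocator - the allocator that IMalloc wraps - so both sides are guaranteed to use the same heap. Function name is made up:)

    #include <objbase.h>  /* CoTaskMemAlloc / CoTaskMemFree; link against ole32.lib */
    #include <string.h>

    /* In the producing DLL: */
    extern "C" __declspec(dllexport) wchar_t* make_greeting(void)
    {
        const wchar_t src[] = L"hello";
        wchar_t* p = (wchar_t*)CoTaskMemAlloc(sizeof src); /* process-wide allocator */
        if (p) memcpy(p, src, sizeof src);
        return p;
    }

    /* In the consuming EXE:
         wchar_t* s = make_greeting();
         ...use s...
         CoTaskMemFree(s);   // safe: same allocator on both sides
    */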
The IMalloc API is at the heart of COM, which is also at the heart of C#. Normally, C# might be a good language for soft real-time, long-life systems. However, if any bug exists in any control using the IMalloc API, then the whole CLR can become unstable. As such, C# is the home of my biggest programming disaster ever: a program that is less reliable and runs thousands of times slower than the equivalent in C. In C/C++, some freedom exists to properly handle memory allocations, and data types are checked at compile time. In COM, it is almost impossible to understand all of the complexities of all of the memory allocations and data type conversions. As such, it is both easy to make mistakes with COM and very difficult to work around them.
ActiveX/COM/DLLs are the root source of many serious security and reliability issues inside Windows. Historically, reliability and security issues were at least traceable to an executable. Now, all bets are off. ActiveX is present in Windows Explorer, Office and Internet Explorer, making it very difficult to effectively lock down the system. As such, the requirement for "a solid application" on "Microsoft Windows" is an oxymoron. The result of these contradictory requirements is application separation: big and small applications alike run on web servers using embedded operating systems or Linux (printers, Google, Bing), and display the results in a web browser on Windows.
In short, this bug is not a side effect of the Windows memory allocation API, GlobalAlloc() and GlobalFree(). It is a side effect of DLLs under Microsoft Windows, and in particular of the C run-time library DLL implementation. As a result of the attempted workarounds, it has probably had as big an effect on Windows as the old Intel 8088 segmented memory architecture.
Re: (Score:3, Informative)
This particular issue is C++ only, but that's simply because MS has screwed up the way things used to work for unmanaged binaries.
You may still get this issue next time MS issues a patched version of any of the .NET libraries. Imagine that your code will only run on .NET 3.5 SP1 patch 1234; then you'll see the exact same problem. I guess we're all used to installing all manner of .NET framework libs through WU, so no one has seen this issue so far, but it's only a matter of time.
Of course, if binaries used the previ
Re: (Score:3, Informative)
Re:And by all developers you mean (Score:5, Insightful)
"It works fine for me" said the MS developer.
I put it down to the decline of Microsoft. I've been working as an MS dev for the past 15+ years, and since Bill left (a coincidence, I feel) the company has started a steady decline - wasting their money on frippery, attempting to get a new growth market, screwing with established systems in place of selling new stuff, and generally trying desperately to get your money off you. At least in the past, they were also focussed on making good technical stuff too.
Re:And by all developers you mean (Score:5, Informative)
But when issues arise, it becomes excruciatingly difficult to troubleshoot and fix.
The version information is written in plain text in the manifest. The files in the WinSxS folder have names based on the version information. If you get the error and notice the files aren't there, it's fairly trivial to troubleshoot and fix.
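(On Vista and later there's also a loader trace tool for exactly this. From memory the incantation is roughly the following, so double-check against the docs:)

    sxstrace Trace -logfile:SxsTrace.etl
    rem ...reproduce the "side-by-side configuration is incorrect" error...
    sxstrace Parse -logfile:SxsTrace.etl -outfile:SxsTrace.txt

The parsed log names the exact assembly identity the loader was looking for and where the resolution failed.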
I'm not a fan of side by side assemblies, I just hate to see issues like this blown out of proportion as it obscures some of the real issues that developers face when developing for Windows (such as just about every bug filed on Microsoft Connect being closed as "by design" instead of being worked on or at the very least closed as "can't fix, compatibility issues", for example).
Re:And by all developers you mean (Score:5, Informative)
You are 100% correct. The "problem" with those guys is that they did not even understand that they had to redistribute the CRT DLLs with their binaries even before the update. They got lucky there because on most systems it is already there (I believe Vista has it out of the box, and probably some MS software will install it on XP - maybe even IE7/8?), so they got away with not distributing it. Now the update brought in a newer version of the runtime, which of course the clients don't have, and they suddenly realize that they didn't distribute their dependencies properly. What does "DLL Hell" or side-by-side assemblies even have to do with it?
And, if they find it so hard to comprehend the idea that they have to give users the DLLs they link against, they could always just link the CRT statically and be done with it.
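(For reference, a sketch of that route: in VC++ the static CRT is the /MT switch family instead of the default /MD, or Runtime Library under C/C++ > Code Generation in the project settings:)

    rem static CRT, release and debug: no MSVCR80.DLL dependency to ship
    cl /MT  app.cpp
    cl /MTd app.cpp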
Re: (Score:3, Informative)
Indeed. A standard Windows DLL is not quite the same thing as an assembly, and as far as I know only assemblies have ever been "side by side." If you're doing plain old C++ on Windows, SxS doesn't enter into it, because side by side is a feature of the .NET runtime.
Sorry, but you're wrong here. Microsoft has been using the term "assembly" to refer to both managed and native DLLs since XP was released (which was before the first version of .NET was even out). Specifically, when you write a manifest file for your binary - which is something that first appeared in XP - you use the <assembly> element as the root.
Also, SxS is quite specifically designed to work with native assemblies (for managed ones, the GAC takes care of it).
Here [microsoft.com] is the reference:
Side-by-side (SxS) assemblies allow a
Re:And by all developers you mean (Score:4, Insightful)
I really hate this /lib stuff. I remember the first time I made a C binary executable with GCC. It worked fine on my Linux box, so it must work fine on any other Linux box, I thought. Wrong! Turned out I needed to apt-get a whole bunch of libraries...
Seriously, all you are saying is that you didn't understand that your compiler was linking to a bunch of libraries, some of which were distributed with the OS and others were your responsibility to distribute when you created the application's setup/install package.
Re:And by all developers you mean (Score:5, Funny)
Java?
Re: (Score:3, Funny)
Re:And by all developers you mean (Score:5, Funny)
A) A programming language called Brainfuck.
B) Homophobia.
C) How long the parent has known about Brainfuck.
It seems clear to me that there is a language called Brainfuck that everyone needs to be made aware of.
Re:And by all developers you mean (Score:5, Funny)
Also... (Score:5, Informative)
What might be interesting to note here is that the summary isn't everything there is to side-by-side (SxS) assemblies.
Suppose you're building an application using two DLLs, let's call them A and B. Both depend on a third DLL named C. Now, suppose A uses C version X, and B uses C version Y. You're screwed, right? Not with SxS, since that allows multiple versions of C to be loaded. That's the real added value of SxS.
All this is in theory of course, which as we all know is in theory equal to practice, but in practice is not...
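(To make the A/B/C scenario concrete: each dependent DLL carries its own pinned dependency in its embedded manifest. A rough sketch, with the names, versions and publicKeyToken made up:)

    <!-- embedded manifest of A.dll: pins version X of C -->
    <dependency>
      <dependentAssembly>
        <assemblyIdentity type="win32" name="C" version="1.0.0.0"
                          processorArchitecture="x86" publicKeyToken="0123456789abcdef"/>
      </dependentAssembly>
    </dependency>

    <!-- embedded manifest of B.dll: pins version Y; the loader keeps
         both versions alive in the same process via separate
         activation contexts -->
    <dependency>
      <dependentAssembly>
        <assemblyIdentity type="win32" name="C" version="2.0.0.0"
                          processorArchitecture="x86" publicKeyToken="0123456789abcdef"/>
      </dependentAssembly>
    </dependency>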
A question .... (Score:3, Interesting)
Re:Also... (Score:4, Interesting)
Suppose you're building an application using two DLLs, let's call them A and B. Both depend on a third DLL named C. Now, suppose A uses C version X, and B uses C version Y. You're screwed, right? Not with SxS, since that allows multiple versions of C to be loaded. That's the real added value of SxS.
Huh? Microsoft's compilers and linkers have allowed this for ages. The difference is that it's now a feature, not a bug.
There was a study of MS's binary apps years ago (and I thought it was reported here, but I don't seem to find it in the archive), which analyzed the contents of the binary files. It was found that there were typically many copies of various library routines, sometimes 10 or more copies. They even had duplicated local data blocks. This was presented as a bug at the time, and an explanation of why MS's binaries required so much memory.
But now this practice is called "side-by-side", and it's considered a feature. Which, of course, means that it never will be fixed, and users can look forward to even more bloated programs that contain copies of all the versions of a routine that have ever been released.
It's sorta like extending the concept of a CMS into the binaries of apps. No longer will all those old versions be kept on the sidelines, off in obscure libraries. They will all be right there in the binary, ready to be loaded into memory if they are ever needed.
Who was it that has congratulated the software industry for consistently undoing all the amazing hardware advances of the past decades by responding to bigger memories and faster processors by making the software use more memory and cpu cycles? SxS is a major step forward in this technique, and promises to lead to major increases in memory sales in the near future. And all it took was realizing that a practice that had been considered sloppy wastefulness of memory was actually a valuable new feature for backwards compatibility.
Re: (Score:3, Informative)
Who was it that has congratulated the software industry for consistently undoing all the amazing hardware advances of the past decades by responding to bigger memories and faster processors by making the software use more memory and cpu cycles?
That would be all the people for whom software is a tool, not intellectual masturbation. The people who consider "how soon can you deliver", "how much will it cost" and "how long can you maintain it" substantially more important than "will it run on my 10 year old
Re: (Score:3, Interesting)
Office still boots twice as fast as OpenOffice on a typical computer ... in practice, it doesn't seem to be hurting them anyway.
I have to disagree - OpenOffice on my Linux partition boots faster than Office on my Windows partition on the same computer. The problem is when MS gimp Windows to hell - they know the workarounds, so Office runs OK. OpenOffice just has to use trial and error and a whole lot of guesswork to rid itself of bugs that only exist to give MS an advantage. I completely understand that the average Joe Sixpack doesn't care, but that's why MS is still managing to sell gimped OSs.
Re:Also... (Score:5, Informative)
It might surprise you, but Microsoft isn't actually to blame here. Rather, the legions of incompetent programmers that wrote DLLs such as C are to blame. We'd call them idiots, but Microsoft calls them paying customers. Thus prompting them to design SxS and incorporate it in WinXP.
Also, SxS is what made the restyling of the UI (through common controls version 6) technically possible.
Microsoft takes backwards compatibility very seriously.
Re: (Score:2)
Re:Also... (Score:5, Insightful)
You do understand that DLL Hell exists exactly because version X and Y DO have huge differences?
Re: (Score:2)
What are you proposing exactly, have them automatically analyse all code that is being released (and solve the halting problem at the same time)? And how would you propose they deal with the ensuing shitstorm? There are actually legitimate reasons for DLLs changing you know, like the common controls I mentioned.
Besides, what would you rather have, a program not working in its entirety, or working properly and using a meg more memory (that'd be a huge DLL;
Re:Also... (Score:5, Insightful)
MS shouldn't really be allowing such poor practices. Why should my memory be eaten up by loads of DLL files that are nearly identical? Let's face it, there aren't going to be huge differences between version X and Y.
Versions X and Y of a DLL will be flat-out incompatible if that DLL is written in C++ and the author has changed the number of attributes in an interface class (unless he uses tricks such as pimpl), or if he's added or removed any virtual functions.
And the fact that Microsoft is so good at preserving application backward compatibility, even in the face of "poor practices", is frankly one of the main reasons that Windows is the #1 business desktop operating system in the world.
Re:Also... (Score:4, Interesting)
It might surprise you, but Microsoft isn't actually to blame here. Rather, the legions of incompetent programmers that wrote DLLs such as C are to blame. We'd call them idiots, but Microsoft calls them paying customers. Thus prompting them to design SxS and incorporate it in WinXP.
Also, SxS is what made the restyling of the UI (through common controls version 6) technically possible.
Microsoft takes backwards compatibility very seriously.
They don't take it seriously. They don't even try for DLL compatibility within a product release. Can you really say with a straight face that you can new using MSVCR70.DLL and delete using MSVCR40.DLL, and get away with it? What about MSVCR80D.DLL and MSVCR80.DLL?
I'd bet dollars to donuts (adjusted for inflation) that the majority of problems caused by that incompatibility are due to Microsoft's C and C++ runtime libraries and their tendency to allocate memory from different arenas. My coworkers and I mostly program for Linux, but those few who have spent much time coding on Windows see that variant cause problems more often than practically anything else. How often do you see third-party libraries (that is, libraries not written by the application author) cause DLL hell where those libraries are written by someone other than Microsoft?
Re:Also... (Score:4, Informative)
>>Microsoft takes backwards compatibility very seriously.
They always say this, and then fairly fundamental programs always seem to break during one of their OS upgrades. Like my very expensive copy of MATLAB. Won't run on Vista. Period. And the code we've licensed won't run on more modern versions of MATLAB. And it's impossible to find high end visualization laptops these days with XP on them.
And yeah, I am a little bitter about it.
Re: (Score:2)
Oh, yeah, you're right, sorry. Somehow the myriad SxS calls I've seen littering the MFC source must've slipped my mind.
Re: (Score:3, Informative)
Because you can't really expect your customers to upgrade their apps,
No, you can't. Really. Remember, the vast majority of business software, in sales and in number of titles, is proprietary software written in-house or by some consulting dude. A significant portion of this software is mission-critical, and replacing it is usually not practical (cost) or outright impossible, since the original author, and often the source code, is long gone/dead etc.
Backwards compatibility is a huge issue for business apps, and Microsoft has been doing a phenomenal job of it. Not that this stops th
Re: (Score:3, Interesting)
.NET's strong requirements mean it no longer matters if "foo.dll version 1.5" breaks backwards compatibility with "foo.dll version 1.4", because the developer has the option to say at compile time "ONLY use foo.dll version 1.4". Consider this from the perspective of the end user, the application developer, and the dll vendor.
The dll vendor is happy, because they can release new versions into the wild on their own schedule. Win.
The application developer may or may not have been aware that the vendor released
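(For the .NET case, the knob lives in app.config. A sketch, with the assembly name and token made up: by default a strong-named reference binds to exactly the version it was compiled against, and a bindingRedirect is how you consciously override that:)

    <configuration>
      <runtime>
        <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
          <dependentAssembly>
            <assemblyIdentity name="foo" publicKeyToken="0123456789abcdef"/>
            <!-- opt in to the newer foo.dll instead of the pinned 1.4 -->
            <bindingRedirect oldVersion="1.4.0.0" newVersion="1.5.0.0"/>
          </dependentAssembly>
        </assemblyBinding>
      </runtime>
    </configuration>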
Re: (Score:3, Informative)
The rest of your post makes sense (though Unix-style versioning can do the same thing), but this point is wrong. Don't fool yourself or anyone else into thinking you can't modify any application regardless of this stuff. Of course you can: all the SxS stuff might do is make it a touch more difficult.
Speaking as a user (Score:5, Insightful)
Speaking as a user, can we get statically linked libraries? I don't care if it's dependency hell or DLL hell. I want self-contained applications.
Re:Speaking as a user (Score:4, Informative)
Re: (Score:3, Informative)
But the other side is that the OS is massive.
C:\Windows alone is 11 GB, with \WinSxS being around 6 GB.
Googling shows that \WinSxS is where all these DLL versions are being stored.
I haven't got close to 11 GB of applications installed (and I've got VS, SQL Server + Office on here).
Re:Speaking as a user (Score:5, Insightful)
If they were statically linked you'd have way more than 11 GB of applications...
Re:Speaking as a user (Score:5, Insightful)
It'd be more massive if everything were statically linked.
Remember, shared libraries didn't come first. The world began with static libraries, and shared libraries came later. There were good reasons for the switch, and those reasons apply even today.
Re: (Score:3, Funny)
Re:Speaking as a user (Score:5, Insightful)
Dynamic linking is being used because static linking has been denied as a choice in most of the current dev tools that matter on Windows. In Delphi, I had a choice of static and dynamic linking. I always chose static linking. Most Delphi users did the same. I didn't have that choice with VB6, Java, and .NET.
Static linking is not bad. With smart linking, only the routines actually used are bundled in, not everything in the library. When I used dynamic linking, to simplify installation for the user I had to distribute the full DLLs as well (they would not be installed if they already existed on the target machine), even when I used only a small portion of their functionality. Consequently, my installers were always SMALLER when using static linking.
If you are developing in-house applications this is less of a concern, since you can reasonably assume their presence on the target machines, and because you will be using the same dev tools consistently. Dynamic linking is only efficient when the involved DLLs are relatively unchanging and needed by many apps. This also works well on Linux, where a package manager with dependency tracking is an assumed part of the system. Dynamic linking has its advantages and disadvantages. But it is not a solution that uncritically deserves its current dominance.
Re: (Score:3, Insightful)
Static linking is not bad ... my installers were always SMALLER when using static
How easy it is to show that you do not understand anything at all... who cares what size your installers are? If everybody followed the "I'll just statically link everything" the average Windows computer would need 32G of memory just to function (exaggerated to make a point).
DLLs are good, but they have problems. Static linking is bad for anything slightly more advanced than a "Hello World" application.
Re: (Score:3, Insightful)
If everybody followed the "I'll just statically link everything" the average Windows computer would need 32G of memory just to function (exaggerated to make a point).
Wanna provide some data for that claim? And any guesses as to the number of people who ship private versions of the DLLs they need to ensure their app behaves properly because it depends on bugs in that specific DLL version? In my experience that's a pretty common move for anything above a "Hello World" application.
Also, a number of people th
Re: (Score:3, Informative)
Just like in the physical world, bad packaging can fuck up an otherwise good product.
Incidentally, on Fedora, the scenario you describe doesn't occur.
Re: (Score:3, Interesting)
Re: (Score:3, Informative)
Also keep in mind you're looking at a huge RAM footprint as well, since multiple copies of the same library will be loaded per-application, whereas sometimes you can make optimizations for libraries shared across applications with paging magic.
That said, I thought that OS X apps were statically linked (except with OS libs), and thus tend to be large, but reduce this issue since so much functionality is based on system libs. I could be wrong; I don't really ever work with non-linux systems.
Re:Speaking as a user (Score:5, Informative)
That said, I thought that OS X apps were statically linked (except with OS libs), and thus tend to be large, but reduce this issue since so much functionality is based on system libs. I could be wrong.
Yeah, you are wrong. Mac OS X apps are definitely dynamically linked (the equivalent of .so on Mac OS X is .dylib). The reason Mac OS X apps can be larger than executables on other platforms is that they often contain machine code for multiple architectures (ppc, x86, x86_64). That only translates to a large footprint on disk.
Now you know.
Re: (Score:3, Informative)
You seem to be under the mistaken impression that if a library is dynamically linked, then the entire library must be loaded before any program can use a portion of the library. That's not true.
Consider a library with symbols A, B, and C, with programs P and Q. Program P uses symbol A from the library, and program Q uses symbol B from the library.
If these programs are statically linked and are run simultaneously, the memo
Re:Speaking as a user (Score:5, Informative)
Re: (Score:2)
TFA's problem is easily resolved by statically linking the stuff. And the thing you pointed out is the Achilles' heel of pretty much any *NIX system, where the app dependencies are often much more widespread and the app itself is not self-contained, but rather disseminated over the whole system.
Re: (Score:2, Informative)
Re: (Score:3, Informative)
Re: (Score:2, Informative)
Totally agreed - and additionally, rigid static linking doesn't work in a plugin environment where each plugin (which is in itself a shared library/DLL) brings in its own (shared) dependencies based on the user's requirements. Apologies to the OP, but users really shouldn't make demands at this level - there are many reasons why developers and packagers prefer dynamic loading, and likewise it's *their* responsibility to get it right such that users aren't even aware of this level of detail.
Why don't apps just use their own copy of the .dll (Score:2, Interesting)
Instead of referencing the .dll in \Windows\System\ why don't you reference a copy of the .dll in \Program Files\Your Stupid App\ ?
Seems like a simple fix to me, though I'll admit most of my .dll experience comes from the asp.net/c# world.
Re:Why don't apps just use their own copy of the . (Score:5, Informative)
That defeats the whole purpose of a DLL anyway. The thought was that you wouldn't have to "reinvent the wheel" and you could reuse code. However, Microsoft's failure to document their operating system's API thoroughly in a public manner led to developers relying on undocumented features that were later changed. Then, those applications needed older versions of those libraries and would install them over the newer versions. This, of course, crashed the newer applications. Ugh.
Re: (Score:3, Interesting)
I hate to break it to you, but there is quite a lot of this going on. For instance, if you were to look at the source code of every app that needs to decode PNG files (just as an example), you would probably find only about 50% use the libraries that come with the OS, and the reasons why vary:
* avoiding dll hell (as mentioned)
* the app is cross platform
* poor implementation of official dlls
* politics (at one stage Microsoft tied certain dlls to the installation of IE, even though they had nothing to do wi
Then don't bother (Score:3, Insightful)
Everyone having their own DLL would be the same as just statically linking everything: you'd have tons of code duplicated and loaded, and no easy way to patch common code system-wide.
People suffer DLL hell because it is better than not using DLLs at all.
You should not blame Microsoft for this (Score:5, Insightful)
Everybody who develops applications for the Windows platform should know that you need to include the merge module for the C/C++ runtime libraries in your installer. You've just been lucky so far that other applications have installed the DLLs you needed for you. Try your app the way it is on a clean install of Windows XP without the service packs and see how well that goes :P
In fact the SxS assembly system in windows is the only real way out of DLL hell, much better than the versioning scheme for shared libraries used in Linux. Get your facts straight before posting.
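(For anyone wondering what "include the merge module" looks like in practice, a sketch in WiX; the .msm name and path are what a typical VS2005 install ships, so verify against your own machine:)

    <!-- in your .wxs, alongside your own components -->
    <DirectoryRef Id="TARGETDIR">
      <Merge Id="VC80CRT" Language="1033" DiskId="1"
             SourceFile="C:\Program Files\Common Files\Merge Modules\Microsoft_VC80_CRT_x86.msm"/>
    </DirectoryRef>
    <Feature Id="MainFeature" Level="1">
      <MergeRef Id="VC80CRT"/>
    </Feature>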
Re:You should not blame Microsoft for this (Score:5, Insightful)
Indeed.. this is a non-story. If the submitter had distributed his app correctly it would have worked out of the box. Instead he decided to rely on a specific version of a DLL being installed on the target system, then blames Microsoft when it all goes to hell.
Re: (Score:2)
How does that work for PortableApps or U3 packages?
Re:You should not blame Microsoft for this (Score:5, Insightful)
In fact the SxS assembly system in windows is the only real way out of DLL hell, much better than the versioning scheme for shared libraries used in Linux.
Better than the Linux scheme + proper shared library design? How? I've done extensive work with both, and the SxS scheme seems like a gigantic, fairly ugly hack to me (albeit not as ugly as a lot of other hacks), and Linux's scheme, while not perfect, seems much more elegant and reliable.
I'm not trolling or picking a fight, I really want to know.
Re: (Score:3, Insightful)
Why do you think it's a hack? I mean, the manifest files used by the SxS assembly system are much more expressive than the three digits used by libtool versioning to tell which shared libraries can be loaded for a specific process. Also note that two DLLs loaded into a process can reference different major versions of the same third DLL without a name clash (leading to two versions of it being loaded), while that's AFAIK not possible with shared libraries.
http://www.freesoftwaremagazine.com/books/agaal/buil
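(For contrast, the Linux scheme being compared against fits in three lines of directory listing; libtool's three digits end up encoded in the soname:)

    libfoo.so.1.2.3                  # the real file
    libfoo.so.1 -> libfoo.so.1.2.3   # soname; the '1' bumps on ABI breaks
    libfoo.so   -> libfoo.so.1.2.3   # dev-time symlink used by -lfoo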
Re:You should not blame Microsoft for this (Score:5, Insightful)
Funny, it's only proprietary software authors that think this way. Over here in the free world, application flexibility is seen as a Good Thing. LD_* hacks might not be the most elegant way to implement certain functionality, but the approach certainly makes hard things possible.
And again, the SxS signing approach doesn't actually add any real security. Someone wanting to modify an application will find a way to do it regardless of any special "don't modify me" bits the application might contain.
(Go ahead and moderate this troll. That doesn't make it any less true.)
Re: (Score:3, Insightful)
And again, the SxS signing approach doesn't actually add any real security. Someone wanting to modify an application will find a way to do it regardless of any special "don't modify me" bits the application might contain.
You think public key signatures of the executable and its dependencies are not real security? ... Then what is?
Re:You should not blame Microsoft for this (Score:4, Insightful)
You think public key signatures of the executable and its dependencies are not real security? ... Then what is?
Security never, ever works if the attacker and the one being attacked are the same person.
Re:You should not blame Microsoft for this (Score:5, Insightful)
A thing that a LOT of Linux programmers (and a lot of programmers in general) seem to miss is this simple fact, bolded for emphasis:
Most programmers suck.
The very fact that you're here reading Slashdot means it's likely that you do not suck. Most programmers do. For most programmers, "cargo cult" programming is a way of life. The majority of programmers do not, and never will, fully understand pointers to the level where they would be able to re-create the C++ STL by themselves. Relevant to this discussion: most programmers don't know how linking works, they just hit the "Play" button on the IDE and off it goes. Most programmers have zero knowledge of user permissions or fast user switching, and see nothing wrong with writing their application data in the Program Files folder.
Most programmers never, ever read the API documentation. Not only do they have no problem using deprecated functions, but they don't even know what functions are deprecated.
And when their programs break because of this? They blame Microsoft! It's Microsoft's fault, always Microsoft's fault!
Now the open source community might be lucky enough that it has no bad programmers. (I doubt it, but let's play along.) Good for you. Microsoft, unfortunately, isn't that way: one of their biggest challenges is to keep terrible programmers from constantly breaking their own apps and/or Windows itself.
What I'm getting at here is that Microsoft's goal is to make programming for Windows as easy and hands-off as possible. Any solution to this problem that requires the programmer to fix their application is far inferior to a solution that works "automatically."
The programmer who posted this topic didn't read Microsoft's documentation, and screwed up his application's installer so that it links directly to a specific library version DLL instead of to the version-agnostic DLL. He's one of the bad programmers I've been talking about, but to be fair: considering he's actually posting here, he's probably one of the best of the bad. Hopefully he'll take something away from this and Slashdot won't spend the entire thread bashing Microsoft for no reason, but I doubt it.
Re:You should not blame Microsoft for this (Score:4, Informative)
I don't think they actually suck. I think they are simply pressured to produce results in the quickest time possible. I think part of the blame goes to the commissioning clients.
You just haven't worked in enough offices. Believe me, we have people working for our company, producing software, who couldn't code their way out of a wet paper bag. And they're not under any particular time pressure. (We also have a guy who does really good work, but he's so slooow... in that case, you're right, but from my experience that's the minority.)
And part of the blame goes to the "simple is best" mentality in some schools of thought. Simplicity is NOT elegance.
Possibly; but if you know that your shop has coders who won't understand the elegant code, you're better off writing simple code to accomplish the same task. Otherwise, they'll fuck up your elegant code and produce buggy, bloated results. Joel wrote an article about this recently: http://www.joelonsoftware.com/items/2009/09/23.html [joelonsoftware.com] making that very point.
Personally, I don't think elegance should be the goal. Who cares if it's elegant? The finished product is the important part, not the code, and the UI is what people judge. I'm sure a lot of products with extremely elegant code have godawful UIs.
Also, opinion is NOT fact.
Shocker!
Surprisingly, I see Cobian Backup, Avast! antivirus, and some other software being rather multi-user aware. Cobian v8 is open source, Avast! Home is freeware. This attention to little details impressed me a lot.
Being multi-user aware isn't "a little detail", it's an industry standard! If they weren't previously, for the NT series, they should have been patched to be multi-user aware for Windows 2000 Pro, which was mainstream.
That's not "impressive" that's just par for the course. That's like being "impressed" that it uses menus and buttons, or being "impressed" that it lets you switch which language the UI is in.
Then what do they rely on? Their IDE's ability to suggest methods and properties when they type?
That. Or Googling it. Or yelling, "hey Bob, what do you use to sort a list?" over the cubicle top. Or they keep an extensive collection of code snippets and they just copy and paste those in randomly until something works, or appears to work at first glance.
Re: (Score:3, Insightful)
Have you tried to install DLLs without using MSI?
It's not so easy with NSIS, for example. And don't get me started on shared DLL usage counters...
DLL hell never left (Score:5, Interesting)
I upgraded my Fallout 3 installation yesterday. After patching, the game wouldn't run, returning some fairly obtuse message about import ordinals [google.com]. So I googled the message, and found out it's because the game now links against a newer version of "Microsoft(R) Games for Windows(TM) Live(TM)" whatever. Note that this wasn't some new patch, it's months old and yet this problem, which must realistically be hitting quite a few users, persists. This isn't something you get via Windows Update either, this is just some obscure 'distributable runtime' crap you should know you need?
So let me repeat that: Super mainstream game on a super mainstream platform (Vista x64), no add-ons, I patch to the latest version and it won't start, nothing is mentioned at the developer's site.
Now I recognize good old Bethesda again. Here's how they'd be able to repro: Fully updated Vista machine, install game from DVD, apply patch, notice it won't fucking run.
I don't normally give much for the 'PC-gaming sucks' choir, but c'mon..
Re: (Score:3, Informative)
I upgraded my Fallout 3 installation yesterday. After patching, the game wouldn't run, returning some fairly obtuse message about import ordinals [google.com]. So I googled the message, and found out it's because the game now links against a newer version of "Microsoft(R) Games for Windows(TM) Live(TM)" whatever. Note that this wasn't some new patch, it's months old and yet this problem, which must realistically be hitting quite a few users, persists. This isn't something you get via Windows Update either, this is just some obscure 'distributable runtime' crap you should know you need?
So let me repeat that: Super mainstream game on a super mainstream platform (Vista x64), no add-ons, I patch to the latest version and it won't start, nothing is mentioned at the developer's site.
Now I recognize good old Bethesda again. Here's how they'd be able to repro: Fully updated Vista machine, install game from DVD, apply patch, notice it won't fucking run.
I don't normally give much for the 'PC-gaming sucks' choir, but c'mon..
I had the same problem. Only, I run Fallout3 in Linux via WINE and there is apparently no way whatsoever to get xlive.dll to work in WINE. In addition, you do need the latest Fallout3 patches in order to install the expansions. Personally, I found it unacceptable that I would not be able to use any of the expansions merely because someone decided to add functionality that I never asked for, do not need, and will never use.
I found a solution. There is a patch for Fallout3 that removes all Live functio
Discussed on Visual C++ blog (Score:5, Insightful)
This has been heavily debated in comments in the Visual C++ blog:
http://blogs.msdn.com/vcblog/archive/2009/08/05/active-template-library-atl-security-updates.aspx
Unfortunately, the VC++ team doesn't seem to understand what's wrong with pushing out a critical update through automatic updates that silently changes dependency requirements. I've personally seen projects suddenly broken by this update, and the confusion that ensued.
Re: (Score:3, Informative)
I agree that updates to the IDE & toolchain that introduce new dependencies for binaries produced by that toolchain shouldn't be silent. On the other hand, installing the application on a clean system (for every OS you support, counting each major Windows version separately) and checking that it runs is one of the most basic tests that every new build of the product should go through. It's trivial to automate, too (if the installer is written properly and allows for silent installation), so there's no excu
Time to question if DLLs are still needed (Score:3, Insightful)
Would it be worth burning more RAM (although in an on-demand paged system, there's obviously no need to have your entire process resident) to get rid of the problems associated with incompatible versions of libraries? Just go back to statically linking everything, so you only ever need one binary - as all the routines it will ever call are already part of it.
Re: (Score:2)
We'll never get rid of the problem with incompatible libraries until processes stop communicating with each other. What if two processes communicate over a shared memory segment where one version is using a data structure with an extra struct field? What about window message differences, drag-and-drop, and file formats? Sure, static linking might paper over some problems, but it won't free programmers from having to think about backwards compatibil
Re:Time to question if DLLs are still needed (Score:5, Informative)
Additionally, DLLs can enhance security. If there's a bug in MSVCRT.DLL, Microsoft can patch it and release it as an update. In a static world, every application that had that buggy code compiled into it would have to be rebuilt and redistributed.
In a statically-linked world, every application would need to be rebuilt (on Windows, redistributed) every time an API / ABI change was released, as well as every time a bug was fixed. Furthermore, download sites would either have to release binaries for every API / ABI combination (that's a ton of releases per version) and deal with users downloading the wrong one, or do the open-source model and release the source, forcing the users to build the application on their system and rebuild every API / ABI update. And somehow I don't think the latter solution would fly with the Windows community.
Like other posters have said, Microsoft's solution is actually not a bad one. Allowing multiple DLLs to be loaded simultaneously is not a pretty solution, but it's not a pretty problem that they have to solve, either. Advance with backwards-compatibility in mind as much as it makes sense to, and use SxS DLLs when it doesn't.
Re: (Score:3, Funny)
I see you've been working with C++ templates recently.
Oh? (Score:2)
I read the linked article [com.com], and don't see anything exciting. How is this any different from the shared libraries that have been used in the Unix world for 23 years [faqs.org]? "Private assemblies" can be achieved with rpath or a simple manipulation of LD_LIBRARY_PATH (see Mozilla). And what does the asymmetric cryptography buy you over simply using the appropriate soname?
SxS is a fine technology (Score:5, Insightful)
Yeah SxS works a treat. No more dll hell. Great for servicing too. The problem here is moronic devs not shipping the libraries that they link against. MS would be castigated if they didn't fix security holes. Why oh why does kdawson think this is a return to dll hell? Does he actually know what SxS is? Does he even have experience of windows development?
Re: (Score:3, Funny)
Additional reading (Score:5, Informative)
Everyone (even Windows programmers) should read Ulrich Drepper's piece on how to write shared libraries [slashdot.org]. (Warning: PDF.) (Likewise, even free software developers should read Raymond Chen's blog [msdn.com].)
Re:dead link (Score:5, Informative)
Oops. Correct link [redhat.com]. (I wish Slashdot would warn about obviously incorrect links.)
You sir are part of the problem! (Score:5, Insightful)
Microsoft did this intentionally. They deprecated the vulnerable version of the DLL. Your "solution" to the problem of your customers still running the vulnerable version of the VC DLLs should be to either force them to upgrade or install the new DLLs for them. Instead you decide the security is a hassle and undo the fix on your developer machine, so you can ignore the larger issue: that you are building insecure software and your customers are running insecure computers. Fix the problem, instead of whining about it and continuing to crank out crappy .NET software. How hard would it be to have your software check for the problem DLL versions and direct the customer to download/install the new version? Cripes, games do it all the time when they check what version of DirectX is installed.
Is the OP not doing something wrong? (Score:3, Informative)
First off, why on earth is the developer still using Visual Studio 2005? We're on Visual Studio 2008, SP1. That, right there, raises a red flag. If someone compiled something with an ancient version of gcc and found out it didn't work, when distributed, on more up to date Linux distributions, wouldn't you think that the appropriate response would be for our man to get his tools straightened out first?
I would think that if the author shipped his system with a copy of the runtimes and had them install in his application directory, he would have no problem at all. The Windows DLL load order is application directory first, then some other stuff, so his application should always have the right libraries, if he shipped them. In fact, I even think there's some sort of a doohickey that you can do to have Windows look for COM components first in your own directory before it looks for them in common areas. There's no need to have "DLL hell" at all, unless the developer really asks for it.
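(From memory, that search order with SafeDllSearchMode enabled - the default since XP SP2 - is roughly the following; note that SxS-manifested dependencies are resolved through the activation context before this search even starts, which is why the CRT case behaves differently:)

    1. The directory the EXE was loaded from   <- ship your DLLs here
    2. The system directory (system32)
    3. The 16-bit system directory
    4. The Windows directory
    5. The current directory
    6. The directories listed in PATH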
Frankly, I doubt DLLs of the relatively small size of the CRT should even be used any more across different applications from different vendors.
1. First, you cannot possibly test your application with all the different versions and patch versions of DLLs that are out there, because patch releases now are way too fast. Reliability, right now, not performance, is the pre-eminent problem in the software community.
2. The static linker is now actually capable of removing unused code from an image; it could not do that before.
3. DLLs have to be relocated when an application loads them into its own process space, so you take a performance hit there.
4. The Windows API has 95% of what you would need the C runtime to do. This isn't like Linux, where you would die without libc trying to make heads or tails of syscalls and whatnot. On Windows, I struggle to think of a CRT call that could not be done nearly as simply with the SDK directly. For files there's CreateFile, WriteFile, etc. All of the basic string functions exist within the SDK, and the stuff to display formatted strings in the SDK is better than what the CRT gives you anyway. It's a bit more involved, but there are articles out there on how to not have a CRT at all. In fact, applications that use the ATL and WTL frameworks even support not having the CRT code, just so you can write really, really tiny applications and, gasp, COM components.
Re:Is the OP not doing something wrong? (Score:4, Interesting)
"The Windows API has 95% of what you would need the C runtime to do."
Unless what you need the C runtime to do is to be cross-platform compatible. Then it has 0% of what I need the C runtime to do. The reason to have a standard c library, at all, was to make applications significantly more portable. That's why it's, I believe, part of the ANSI and ISO specifications of C, is it not? Sure, any vendor can create their own proprietary, non-portable, runtime library. I'm sure Microsoft would be delighted for you to use CreateFile, WriteFile, et al., because you've now tightly bound your C application to Windows. Of course, sometimes that's fine. Sometimes your app will be tightly bound by other things like DirectX, anyhow, so, might as well go native API's all the way.
But, if I'm using the C runtime library, it's because I'm trying to write portable code.
Wasn't one of the original goals of DLLs to avoid lots of duplicate copies of the same runtime libraries littering up the hard drive and RAM? It's my understanding that one of the 'advantages' of a DLL is that you could have 5, or 10, or 20 programs running, and they could all 'share' one copy of the DLL loaded in RAM? That if the DLL was already loaded in RAM by another program, it would also save a little bit of time that otherwise would have been required to load the DLL from disk, etc.? I suppose, however, that since both hard drive space and RAM have gotten orders of magnitude larger, faster, and cheaper than in the past, perhaps, for the sake of reliability, it does make sense to just go ahead and sacrifice some efficiency, as you suggest, by either having every program install its own directory-local copy, or even statically link.
Re: (Score:2)
For most projects, upgrading sources from VS2005 to VS2008 is really quite painless, no harder than opening the .SLN in 2008 and letting it save the converted result. Are the Source sources really that problematic?
Re: (Score:2)
Upgrading projects is not equal to the change in the compilers. Here's the guide on getting the Source SDK working in VS2008: Compiling under VS2008 [valvesoftware.com].
Actually, upgrading from 2003 to 2005 was far worse, as it required changes to individual files. Fortunately Valve cleaned most of these problems up, making 2005 to 2008 easier.
Re: (Score:3, Insightful)
"I was really pissed when I discovered that Microsoft had discontinued all versions of Visual Studio Express under the most recent one (2008, I believe?). I had to go and get a copy of VSE2005 off of bittorent since you could no longer download it from the Microsoft web site."
Then you got pissed and used BitTorrent for absolutely no reason whatsoever, because Microsoft are touting Visual Studio Express more than ever nowadays, it's become a core product for them.
http://www.microsoft.com/express/ [microsoft.com]
What gave yo
Re: (Score:2)
Well, how do you deal with a user who has applications dependent on different versions of libraries then?
Do you just say "FU" to them?
Re:Non-issue for actual msdn coders like myself (Score:5, Insightful)
No, because that's what sxs solves. You can have multiple versions of the MSVCxxx libraries installed and each app can find the correct one.
It's similar to the library versioning that Unix does, except instead of changing the filename, each library has a manifest containing its version number and a checksum of the library, and the loader knows how to find the right one.
This is a complete non-story written by someone who doesn't test applications on a clean system prior to distribution, then wonders why it doesn't work.
Re: (Score:3, Informative)
Whether or not that's true, either way this is a non-story. The point is that the OS makers already address this exact situation with technology X, and since he's not using technology X he has a program that no longer runs. That's not Microsoft's fault, any more than it would be Linus's fault if this guy's buggy application broke on Linux.
Re: (Score:3, Informative)
YHBT... anyhow, on OS X you can have multiple versions within a framework, and frameworks private to an application. So "it just works."
Packages work beautifully. Anything I need for my app can be bundled right along with it, or in any of the standard directories.
Re: (Score:3, Informative)
Multiple versions per framework is essentially how sxs works.. it's the same idea.
Private frameworks for an application aren't new - you can put DLLs local to a Windows app too if you want. OS X has a global frameworks directory that is similar to the SxS directory, and it has the same problems, e.g. if you're linking against the iTunes lib you have to check that the one you're writing with is on the target system in the package installer (or install it, but for something like iTunes that's probably a bad idea).
Re:Non-issue for actual msdn coders like myself (Score:5, Informative)
Oh jesus christ I can't stand how fucking idiotic everything in this discussion is.
What happened: In August, they released a patch to the IDE that required a new version of the shared libraries. Binaries built with this patched version of the IDE/compiler/toolchain will by default require a version of DLLs that aren't installed by default on many systems.
What would have prevented this asshat's blog post: Write a fucking installer that includes the DLLs you use. Thus, when your compiler changes, and your new packages rely on some stuff, magically that 'some stuff' is bundled along with your binary, and everything _still works_. Relying on OTHER packages to have good installers that properly put stuff in to SxS is just idiotic. "Wah, I rely on stuff that I'm specifically told I can't rely on, Microsoft sucks, and SxS is evil!"
No. SxS has been SAVING your ass from updates like this since Windows XP. Surprisingly, you can now NOT recompile your project and have it still work, even though there's a newer version of the DLLs you depend on, because the backwards-compatibility problem of these DLLs is just gone. The version YOU want and need and depend on is still there, still usable. Something else on the system that wants the newer version gets it, in all its patched glory. Unless the old version is actually a security hole; in THAT case, the DLL you depend on WILL change, to fix the security problem. Nothing in the August 2009 update to VS 2005 seems to indicate that this happened; it only affects newly compiled binaries.
So, in summary: The blog poster is an idiot, he hates his user, he's too lazy to write a real installer, he has incompetent developers that don't care about the security of their product, and he asked for help and got 4 completely idiotic responses. Now everyone on slashdot who knows nothing about SxS and the actual problems it causes (and yes there are some) think that it's the same problem we've always had, but worse.
Note: I'm not a Windows developer (Linux developer), and I've not used Windows in well over a year (Mac user). But I've been bitten by SxS in my last job and dealing with VB.Net, back when I didn't know I had to make a real installer for my stuff. I somehow managed to learn to not be an idiot, and the blog poster should too.
Re: (Score:3, Insightful)
Most coders just write against API and should not care about binary compatibility.
A well-designed stable API should also ensure binary compatibility. That's almost a given with C APIs, fairly trivial to get with Java or .NET, and needs special attention but definitely possible with C++.
Re: (Score:2)
Yes it does - if you don't have the correct version of .NET installed your app won't run.
It affects *any* external dependency. It's not unique to the MSVCxx libraries at all.
Re:Use managed code.. (Score:5, Insightful)
Writing "managed" code has nothing to do with using sane concurrency abstractions. You can do one without the other. Hell, you can easily write buggy managed code that relies on raw semaphores and deadlocks often, and you can write elegant message-passing C++ code. The key is the abstraction, not the runtime mechanism.
Re: (Score:3, Informative)