Microsoft

Microsoft Declines To Make a 64-Bit Visual Studio (uservoice.com) 359

OhPlz writes: A request was made back in 2011 for Microsoft to provide a 64-bit version of Visual Studio to address out-of-memory issues. After sitting on the request for all that time, Microsoft is now declining it, stating that it would not be good for performance.
After almost five years, the request received 3,127 votes on the UserVoice forum for Visual Studio. Microsoft instead recommended the vsFunnel extension to optimize memory by filtering low-priority projects, adding "we highly value your feedback." They cited a December MSDN post that had argued "smaller is faster," and that no performance benefits would be realized for users whose code and data already fit into a 32-bit address space, while most other issues could be addressed with better data design.
This discussion has been archived. No new comments can be posted.

  • In other words... (Score:4, Insightful)

    by ChodaBoyUSA ( 2532764 ) on Saturday June 04, 2016 @02:32PM (#52249125)
    We don't want to do the work.
    • by fyngyrz ( 762201 )

      "and we have a really, really lame explanation for our underlying laziness and/or incompetence"

      "also, fuck you, developers."

      • Re: (Score:3, Informative)

        Microsoft develops software the same way the British Army fought the Somme Offensive. They use massive amounts of cheap programmers, and just pound away until they have something to release. The code quality is so poor that they often just throw it away and do a complete rewrite for the next version. You may think that supporting 64-bit would mean just changing a few header files, tweaking some compiler flags, and typing "make world", but it would not be that simple. The code is likely riddled throughout

        • Re: (Score:3, Insightful)

          by Anonymous Coward

          I did a couple of 64-bit ports in the past and that's pretty much what it took. The first application was about 50000 lines of code that I had worked on for about a year, i.e., I didn't write it, but I fixed it and refactored it. That conversion took about 2 days, and most of that was making small fixes because the header files were reorganized between gcc 3 and gcc 4.

          The second time I did it was for a code base of about 2 million lines of code that I did not "own" in the same way. That took about three

        • by Fragnet ( 4224287 ) on Saturday June 04, 2016 @05:06PM (#52249743)
          I've been a developer for 15 years and I can say the Visual Studio environment is way better than anything available on Linux. The code dependencies can be resolved for 64-bit, but Microsoft is a business and there's an opportunity cost associated with doing that rather than something else more people actually need.
          • by maugle ( 1369813 ) on Saturday June 04, 2016 @07:25PM (#52250305)
            I'm sure Visual Studio works quite well for you. But, to counter one anecdote with another, I found Visual Studio to be lackluster and irritating in a thousand little ways, and its marginally-better code completion isn't enough to make me prefer it over either Eclipse or QT Creator.

            As for a 64-bit Visual Studio, my guess is that the code problems of porting to 64-bit are dwarfed by the bureaucratic maze involved in releasing a new edition of a product.
            • by Xest ( 935314 )

              Sure, each to their own, but I have a hard time believing anyone who's done serious work (i.e. 12 months+ development, using the bulk of the functionality) in Eclipse and Visual Studio could ever objectively argue in favour of Eclipse. Eclipse is slow, cumbersome, has a broken plugin system that often results in you requiring multiple installs for using multiple languages/technologies (try merging STS and Zend if you use Java/Spring and PHP/Zend), and it's also much more buggy. Of course, you don't even have

        • by thegarbz ( 1787294 ) on Saturday June 04, 2016 @07:09PM (#52250235)

          The code quality is so poor that they often just throw it away and do a complete rewrite for the next version.

          Based on what? For the most part when MS projects have either had source code leaked or made open source the analysis of the code has shown some pretty fine programming. Re-writes seem to be based on strategic decisions and also decisions to not end up with endless layers upon layers of cruft (something that also supports code quality in a positive way).

      • by lloydchristmas759 ( 1105487 ) on Saturday June 04, 2016 @04:21PM (#52249537)

        "also, fuck you, developers."

        Shouldn't that be:
        "also, fuck you, developers, developers, developers" ?

    • by Anonymous Coward on Saturday June 04, 2016 @02:52PM (#52249229)

      It's not about doing the work, it's that the outsourced Indian team doesn't know how to do the work.

      • by PimpBot ( 32046 ) on Saturday June 04, 2016 @02:59PM (#52249259) Homepage

        Just tell them to do the needful!

      • by Austerity Empowers ( 669817 ) on Saturday June 04, 2016 @03:24PM (#52249331)

        It's not about doing the work, it's that the outsourced Indian team doesn't know how to do the work.

        I am pretty sure this is the actual reason, not just cynicism. MS has fired or encouraged significant attrition amongst its can-do types, focusing those few who remain and who know how to do things into certain areas. The rest...they've backfilled with H1B or outright offshoring. Any other company would have collapsed by now, but monopolies are a powerful thing.

        • Re: (Score:3, Insightful)

          by justthinkit ( 954982 )

          Microsoft is pushing purchasable apps in Windows 10. It has created a (crappy) new interface, so programs stupid enough to comply with the new interface must be written for it before they can be sold to everyone.

          So Microsoft wants a head start -- "Sorry, our compiler is not available (to those outside Microsoft)". A couple of years from now, with the apps market saturated, and Microsoft dominating once again in familiar (and new) categories, "Will you look at that, we WILL be shipping VS64. Here you go,

    • That's exactly how I read it. All the excuses given are weak at best. Microsoft doesn't want to invest the time and money. But it figures; Microsoft's non-Office products are mostly turning into trash.
      • by Anonymous Coward on Saturday June 04, 2016 @03:26PM (#52249343)

        This right here.

        This is IE 6.0 all over again. Major products they push on everyone. Then let them rot in place.

        VS2015 is very cool. It is also *VERY* flaky. I have had to reinstall it no less than 5 times now because 'something' breaks. Woe unto you if you have to bring up the repair screen. Plan on that bitch taking 3-4 hours to change 1 package.

        They neglected C++ for so long on their train to .net (which is 64 bit hmmm). clang and gcc now regularly destroy them on compatibility and speed.

        They are so hell bent on making platforms they forgot to make product. I remember standing in long ass lines to buy windows 98. Fast forward to today. Very few actually *want* windows 10. For me getting network backups back again was worth the upgrade.

        Also they are making a rather extraordinary claim that x64 is slower than x86 with Visual Studio. They should do something like, oh I don't know, 'recompile it as 64-bit' and PROVE IT and show their work?! Perhaps, oh I don't know, ON A BLOG POST?! You couldn't find a few interns and a couple of seasoned guys to make it work? Out of a company that big? I call shenanigans.

        MS, there is a reason everyone is jumping to other platforms. Yours is just not up to date and you change your mind every 3-4 years on what platform you want to push. Then the platforms MS comes up with are pretty much 0% compatible with the old ones. So you cannot even reuse your code. You have to throw it all out and start over. MS, this is why devs no longer want to work with your crap.

        • Comment removed (Score:5, Informative)

          by account_deleted ( 4530225 ) on Saturday June 04, 2016 @07:16PM (#52250263)
          Comment removed based on user account deletion
          • Re: (Score:3, Insightful)

            by AReilly ( 9339 )

            a) 64 bit processors can do 64-bit arithmetic in a single cycle.
            b) The 64-bit processors in question have more named registers (fewer stack spills), and a significantly more efficient function calling convention (ABI)
            c) 64-bit ABI doesn't touch the old x87 register set, which is another net performance win. (Not that VS2015 will use this much.)
            Ergo: most of the time they are faster.
            The only way to make a 64-bit program slower than a 32-bit one is to have enough pointer-chasing and associated irregular dynam
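
            A quick way to sanity-check point (a) is a micro-benchmark built once as Win32 and once as x64 and timed side by side -- a sketch only, with an arbitrary iteration count; on a 32-bit build each 64-bit multiply is emulated with several 32-bit instructions:

            // 64-bit integer arithmetic micro-benchmark (build as x86 and as x64, compare timings)
            #include <chrono>
            #include <cstdint>
            #include <cstdio>

            int main()
            {
                const uint64_t prime = 1099511628211ULL;      // FNV-1a 64-bit prime
                uint64_t hash = 14695981039346656037ULL;      // FNV-1a offset basis

                auto start = std::chrono::steady_clock::now();
                for (uint64_t i = 0; i < 200000000ULL; ++i)
                {
                    hash ^= i;
                    hash *= prime;                            // a single MUL instruction on x64
                }
                auto stop = std::chrono::steady_clock::now();

                auto ms = std::chrono::duration_cast<std::chrono::milliseconds>(stop - start).count();
                std::printf("hash=%llu took %lld ms\n", (unsigned long long)hash, (long long)ms);
                return 0;
            }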

          • This is absolutely false, and I read your other comments on the subject as well. I am not sure whether it is deliberate, but what you are saying is absolute nonsense made to sound credible by inserting a random technical term like "pipeline" here or there. It is sad that the quality of posters on Slashdot has decreased to the point where laughable misinformation is moderated as "informative".

            64 bit code might get slower because pointers occupy more memory. Frequently that is offset by more registers and smart

      • by gweihir ( 88907 )

        That's exactly how I read it. All the excuses given are weak at best. Microsoft doesn't want to invest the time and money. But it figures; Microsoft's non-Office products are mostly turning into trash.

        I agree, but I think the same is happening with Office, just slower.

      • Microsoft doesn't want to invest the time and money.

        It's not just that, it is also clear that the time and money required is significant, which in turn indicates that the code base is a rambling disaster. Big surprise or what?

        Note: the Linux ecosystem accomplished the 64 bit transition for essentially all projects with just a handful of developers per project, suggesting that an open source code base tends to be better structured than Microsoft's.

      • Is there an advantage to converting to 64 bits? Or is it more the newer-is-better type of thinking? Seriously, if your project is running out of memory while building then maybe the problem is with the project.

    • by OhPlz ( 168413 )

      That's really what it feels like. Visual Studio has a lot of powerful built-in tools and most enterprise users add in even more. I have a solution with a half dozen projects that can hit the memory wall in a couple of days or so. Perhaps it's that the IDE leaks memory badly, in which case 64-bit would let it leak worse. Either way though, it needs to be addressed. Their response was completely lame. They've done great things with C# and C++ lately which means they're putting money into the product, it's

      • It seems like you (and many other people here) are decidedly in a pro-64-bit camp and ready to have a go at the traditional pastime of MS bashing, but I haven't seen any hard examples of what would make a 64-bit VS better. Can you name some?

        64-bit certainly has advantages, but it also has disadvantages. It really depends on the app how they'd balance out. I can't imagine they looked at this problem lightly.

    • Re:In other words... (Score:5, Interesting)

      by Z00L00K ( 682162 ) on Saturday June 04, 2016 @03:04PM (#52249271) Homepage Journal

      No, it's because they want to drive people to use cloud services instead, so that they can get control of all your data.

      To Microsoft and Oracle the desktop operating system is a necessary evil and they want a transition to thin clients. But they don't want users to realize that's what they're doing; instead it's a free "upgrade".

    • by PolygamousRanchKid ( 1290638 ) on Saturday June 04, 2016 @04:11PM (#52249487)

      Oh, it's not on the top of the list of their priorities. Just take a look at Microsoft's recent behavior, and it becomes crystal clear. Satya Nutella is not forcing Windows 10 down everyone's throat because he wants to annoy his customers: He is doing it because there is a clause in his contract that gives him a big bonus, if Windows 10 reaches a significant market share.

      Even if that means feeding folks Windows 10 like the way a Foie Gras goose is fed.

    • Comment removed based on user account deletion
  • Summary : (Score:5, Funny)

    by SuperKendall ( 25149 ) on Saturday June 04, 2016 @02:34PM (#52249135)

    32 bits ought to be enough for anyone.

    • by hey! ( 33014 ) on Saturday June 04, 2016 @04:23PM (#52249549) Homepage Journal

      32 bits ought to be enough for anyone.

      ... said the lead programmer after glancing through the megabytes of 20 year-old legacy code, and then realizing every single variable had been named with Hungarian Notation.

    • by Desler ( 1608317 )

      Nah, rewrite it in 16-bits. Then by the logic of that MSDN post it will be even faster!

    • by rcase5 ( 3781471 )

      ...and nobody will ever need more than 640k of memory.

  • by heldal ( 2015350 ) on Saturday June 04, 2016 @02:37PM (#52249149)
    The day they broke that limit, some cheered. Others looked upon it with dread, knowing the hellspawn that would follow.
  • 64-bit designer mode (Score:5, Informative)

    by Anonymous Coward on Saturday June 04, 2016 @02:45PM (#52249185)

    The real problem is that the fancy built-in designers, such as the WPF designer, only work with 32-bit components in 32-bit Visual Studio. When someone writes a component that is 64-bit -- because it references a DLL that in turn references, say, a 64-bit Oracle driver, which is code we don't have control over -- the designer won't load and shows a cryptic error message.

    • by Anonymous Coward

      But there's no reason they couldn't handle that the way lots of us do. If you need to load 32-bit modules in a 64-bit program you just launch a 32-bit host process and reparent the UI onto the window handle of your main process.

      That would be really useful anyway, because those controls are where the worst memory leaks in Studio can be found. Putting them into external processes makes it easier to throw them out and free memory.
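
      A minimal sketch of that approach, assuming a hypothetical 32-bit helper executable (designer_host32.exe) and window class name (DesignerHostWnd), both made up for illustration; a real implementation would synchronize with the helper over an event or pipe rather than waiting:

      // 64-bit side: launch the 32-bit helper and reparent its window into our UI.
      #include <windows.h>

      HWND EmbedLegacyDesigner(HWND parentPane)
      {
          STARTUPINFOW si = { sizeof(si) };
          PROCESS_INFORMATION pi = {};

          // Start the 32-bit process that owns the legacy designer controls.
          if (!CreateProcessW(L"designer_host32.exe", nullptr, nullptr, nullptr,
                              FALSE, 0, nullptr, nullptr, &si, &pi))
              return nullptr;

          // Crude wait for the helper to create its window.
          WaitForInputIdle(pi.hProcess, 5000);

          HWND child = FindWindowW(L"DesignerHostWnd", nullptr);
          if (child)
          {
              // Reparent the helper's top-level window into the 64-bit UI pane.
              SetParent(child, parentPane);
              SetWindowLongPtrW(child, GWL_STYLE, WS_CHILD | WS_VISIBLE);
          }

          CloseHandle(pi.hThread);
          CloseHandle(pi.hProcess);
          return child;
      }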

  • visual studio (Score:5, Insightful)

    by Anonymous Coward on Saturday June 04, 2016 @02:48PM (#52249211)

    I have a quad-core 3.5 GHz with 32GB of RAM, VS2015 installed on an SSD. It takes over 15 seconds to get to a working state. After the last update (2), then the Update patch, it now pops up some idiotic error about scc something-or-other every time it starts, and every time I open a project. An update for this POS software takes at least 15 minutes -- with far and away most of the time spent sitting at 99% complete with the status message: "Visual Studio is configuring itself -- this might take a while."

    Yeah no shit it might take a while.

    In summary Visual Studio has become one of the worst programs I use. It is horrifically bad in all aspects: Hard to use, impossible to navigate, useless documentation.

    When I wander over to the C++ forums on reddit I frequently see their runtime library/compiler guy -- I think his name is STL -- sheepishly saying what an antiquated POS their C++ compiler is. That doesn't give me warm fuzzy feelings either.

    It's all just more nails in the coffin as far as I'm concerned. I rarely develop for Windows anymore and when I have to it's because I'm forced to. The entire Windows platform is a complete disaster from a developer's point of view. All the years of MSDN trying to sell whatever the current darling language, what's-old-is-new-again (C++ is back, did you notice?), terrible, TERRIBLE API design, and just general CRUFT (did I mention that COM is back too..?) have finally caught up to them.

    • Re:visual studio (Score:5, Insightful)

      by Dutch Gun ( 899105 ) on Saturday June 04, 2016 @04:14PM (#52249499)

      I've been using VS 2015 since it was released, and using VS professionally for close to 20 years. The quality of VS and the compiler has waxed and waned over the years, but I'm pretty happy with the way the current version is looking, given the general responsiveness, functionality, and stability of the IDE and the C++ compiler itself. It has pretty close to full C++14 compliance at this point - and certainly has the features *I* care about. If you're seeing such poor performance and odd error messages, there's something strange going on, as that's not a typical experience. My machine is rather *less* powerful than yours, and I just counted - it takes six seconds to get VS up and running for me.

      Stephan T Lavavej is the maintainer you're thinking of, and he maintains the standard C++ library, so it's pretty awesome his initials are STL. He's one of the few MS devs that interacts regularly with developers and speaks candidly about MS's development, and has given some pretty interesting talks. True, the MS C++ compiler is quite old, but they've been modernizing it in light of C++'s renewed interest and features, with pretty decent results (if slower than many wish, probably including STL). GCC is also quite old, if you recall, but software can be updated. Clang is the only "new" C++ compiler in widespread use I'm aware of, so it obviously benefits from a more modern implementation, having lessons learned applied during initial development.

      And COM never really went away (very few things in Windows do), but its claim to fame was interprocess communication and a language-independent execution model. New libraries have superseded the former, and .NET has largely replaced the latter. I'm not sure what constitutes "is back", but I've not heard anyone talking about it, and certainly nothing *new*. Maybe you're just hearing more about what's always been there, since natively compiled code is making something of a minor comeback, with interpreted code like .NET never really having met promised performance expectations, even when JIT-compiled.

      As for 64-bit Visual Studio, it's strange that they're not looking more seriously at this, but my guess is they ran into some severe technical hurdles in early efforts to port it, so are downplaying the importance. I mean, who doesn't develop on 64-bit machines at this point? I'll bet they're working on it internally, but are not ready to commit to anything.

      • by AmiMoJo ( 196126 )

        The GP probably has a load of plugins installed. In my experience they are the cause of most slow loading and random error issues. Like you, VS loads in a few seconds for me, unless I have heavy plugins enabled.

    • by Elledan ( 582730 )
      I have been doing C/C++ development primarily on Linux/QNX (embedded) for a while, but recently figured that I'd try this new-fangled VS 2015 (Community Edition) after having used VS 2005 and 2010 (Pro) in the past with no complaints.

      Even though I am using a modern, high-end system with Windows 7 Ultimate, I didn't get VS 2015 to even install. It'd just flake out with a cryptic error message about a problem with a module or something, then after that I couldn't get the installer to get past that point, de
      • The problem here is you, not the tools. The tools install fine for me and absolutely everybody at work.
        • So Microsoft have now reached a level where the whole Visual Studio suite is 100% bug-free, so any problems whatsoever must always lie with the user. Never thought we would ever see that day coming.
  • They cited a December MSDN post that had argued "smaller is faster," and that no performance benefits would be realized for users whose code and data already fit into a 32-bit address space, while most other issues could be addressed with better data design.

    Anyone who actually believes this has no business working on development tools.

    • by west ( 39918 )

      Moving to 64-bit will massively reduce stability for years (given the thousands of pieces of 32-bit code they'll need to be working with for generations). Unless going to 64-bit gives me some big feature win that I don't know about, I'll vastly prefer that MS put the resources that it invests into VS into something that will *increase* stability.

      I am certain that there are a few folk who would benefit from this so much it would make up for the 32-64 bit hash that VS would turn into for a release or two, bu

      • by Man On Pink Corner ( 1089867 ) on Saturday June 04, 2016 @04:16PM (#52249513)

        With a very few specific possible exceptions, x64 code is indisputably faster than x86. The reduction in register pressure buys you a speed increase of about 10% in my experience.

        The code is somewhat larger, of course, but instruction caches have also grown in size since this was observed to be an issue.

        Meanwhile, recompiling correctly-written C++ code to target x64 amounts to changing a couple of flags in the build script, so laziness is no real excuse.
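
        As a rough illustration of "changing a couple of flags" (assuming a hypothetical MySolution.sln whose projects already define an x64 platform -- added via the Configuration Manager if the project template didn't include one), the build invocation is the only thing that changes:

        msbuild MySolution.sln /p:Configuration=Release /p:Platform=Win32
        msbuild MySolution.sln /p:Configuration=Release /p:Platform=x64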

        • by west ( 39918 )

          Do you find VS CPU-bound in any meaningful way? I've always found it was I/O that cost.

          As for converting it to 64-bit, I would be *massively* surprised if updating the code didn't involve messing with hundreds, if not thousands of components, many of which haven't been touched in decades. There are amazing amounts of legacy in there that are absolutely essential to some customer somewhere, and *that* is where I figure you could spend man-decades of development, reduce stability and get almost nothing in r

          • Honestly, I live at the CLI and only use the IDE when I need to debug something, so I couldn't say if it would actually be worthwhile to recompile it for x64 or not. It's not true that there is no performance benefit in doing so, but I agree that it's far from clear that anyone would notice or care.

            The compiler itself (cl.exe) is already built as an x64 executable according to dumpbin, even though it still resides in the legacy c:\program files (x86) folder. That's where the performance really matters.
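
            For what it's worth, that check looks something like this (the exact path varies by Visual Studio version and install location):

            dumpbin /headers "C:\Program Files (x86)\Microsoft Visual Studio 14.0\VC\bin\amd64\cl.exe" | findstr machine

            ...which typically prints "8664 machine (x64)" for the 64-bit-hosted compiler.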

        • All your pointers are twice as large but your cache didn't increase in size. I benchmarked my math library at 32 and 64 bit and the speed difference was statistically insignificant.
          • Yeah, the benefits (or lack thereof) will depend on register usage to a large extent.

            A well-optimized math library is probably using SIMD operations that work the same in either 32-bit or 64-bit code, so even small penalties in code and pointer size may erase any performance gains when you move to x64. But most applications don't spend most of their time doing vectorized math, and I imagine that includes VS.
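
            For example (a sketch, not from the poster's actual library): a kernel written with SSE intrinsics like the one below compiles to essentially the same 128-bit vector instructions whether it's built as x86 (with /arch:SSE2, the VS default) or as x64, so pointer width is about the only thing that changes around it:

            #include <xmmintrin.h>   // SSE intrinsics (SSE2 is baseline on x64)

            // 4-element dot product; identical vector width in 32- and 64-bit builds.
            float Dot4(const float* a, const float* b)
            {
                __m128 va = _mm_loadu_ps(a);
                __m128 vb = _mm_loadu_ps(b);
                __m128 m  = _mm_mul_ps(va, vb);

                // Horizontal add without SSE3: shuffle, add, fold down to lane 0.
                __m128 shuf = _mm_shuffle_ps(m, m, _MM_SHUFFLE(2, 3, 0, 1));
                __m128 sums = _mm_add_ps(m, shuf);
                shuf = _mm_movehl_ps(shuf, sums);
                sums = _mm_add_ss(sums, shuf);
                return _mm_cvtss_f32(sums);
            }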

  • No 64-bit version "because not needed"; meanwhile, on Linux you have to install 64 multilib packages (and compile many of those from source on Arch Linux) just to run Steam. Come on guys, you can argue that staying 32-bit is smaller, but then it actually becomes larger because you have to install tons more 32-bit libraries to support legacy code on our shiny 64-bit systems. In effect it is smaller and leaner to go all 64-bit.
  • 64-bit Windows (Score:4, Insightful)

    by Yurka ( 468420 ) on Saturday June 04, 2016 @02:58PM (#52249255) Homepage

    is and has always been a disgusting kludge on top of the 32-bit one (which they finally managed to get into decent shape). Faced with the necessity to build something just as ugly for another product, I'm not sure I would have reacted differently than they did.

    Oh, and the bit about valuing the feedback is priceless, of course.

  • This'll change... (Score:2, Insightful)

    by RyanFenton ( 230700 )

    Microsoft makes very good developer tools - but the decision on how to make them, unfortunately, has constantly been shaped by a LONG series of internal political decisions on what products they want to be promoting at that very second.

    DirectX, Crystal Reports, 'Modern' (Metro to everyone else), the .NET framework, phones, MS-specific Java, etc., etc... Some of them ended up good ideas, some of them were just what they wanted to push to capture some market. It's a big part of why it's actually somethin

    • They don't want to enable development of binaries which are large. They may want to force developers to break projects into smaller modules. MSVC will produce 64-bit executables and DLLs. So it's not like you can't use more space than that during the execution. But they want to force the design to be more modularized. So people would not do things like embed images in their executables as text-encoded binaries, but would rather have resource files and load them at run time. The runtime of any one exe
  • GCC and LLVM have full 64-bit capabilities and are free.

    • by DaHat ( 247651 )

      You must not have read even the summary.

      There not being a native 64-bit version of Visual Studio is the beef... the backend build system (MSBuild) has long supported running native 64-bit toolchains.

    • by Desler ( 1608317 )

      GCC and LLVM are not IDEs. So your comment actually has fuck-all to do with the story.

    • by AmiMoJo ( 196126 )

      Visual Studio supports them both, but that's not the problem. It's the IDE itself that is running out of memory. People have massive projects with many sub projects, databases, GUI designs, web layout editors and previews...

  • 4194304k ought to be enough for anyone

    • by ADRA ( 37398 )

      Your joke would've been better if accurate. Only 2GB of the 32-bit address space is application-addressable; the rest is reserved for the system. So you'd need to cite 2GB of application address space for it to make sense.

      • Re: (Score:2, Funny)

        by cfalcon ( 779563 )

        I can't be bothered to keep track of EVERYTHING Microsoft does wrong, I am just one man!

    • If your source code is larger than that, you may wanna break it up into smaller modules. Just saying. MSVC will produce 64-bit binaries. It just isn't itself a 64-bit binary.
  • by melted ( 227442 ) on Saturday June 04, 2016 @04:33PM (#52249587) Homepage

    I worked at MS in the early '00s. We didn't use VS to write code. We used a stripped-down version of VS or even WinDbg to debug. Nowadays I use Vim and YouCompleteMe (on Linux) and it just works. Zero dollars, easy to set up.

  • by grahamtriggs ( 572707 ) on Saturday June 04, 2016 @04:42PM (#52249643)

    The immediate reaction is to say it is silly that they are not offering a 64-bit version.

    But many developers are likely using a number of extensions - which will currently be 32-bit because that's what Visual Studio is - and 64-bit would require that all the extensions provide new versions as well.

    It probably wouldn't take a huge effort to offer Visual Studio as both 32-bit and 64-bit versions (just like Office). But a more useful - although much longer - use of engineering effort would be to take the full Visual Studio experience onto the CLR.

  • Developers....?

    Developers....?

    Developers....?
  • How ridiculously broken does your overall OS and Application Development design have to be to make the transition from 32 to 64 bit this hard?

    I'm REALLY not Trolling; but OS X went through an entire CPU ARCHITECTURE change (and actually TWO, if you count iOS as an offshoot of OS X, and THREE if you count the "Classic (Blue Box)"/Carbon APIs), PLUS handled 32 to 64 bit transition (some of it happening at the SAME TIME as the PPC to Intel Transition), and almost ALL OS X users noticed NOT A SINGLE THING.

    I
    • As I mentioned in the other post, it's a 32-bit compiler binary... not a binary only capable of producing 32-bit binaries. MSVC will happily produce 64-bit binaries. This isn't laziness. It actually requires more discipline from their compiler team to fit the compilation process within a 32-bit memory space per running compiler process.
  • I think their argument is that if your project is so large that it needs a 64-bit memory space to parse and generate the binary, then it's too large and should be broken up into smaller modules (DLLs, resource files, etc.). MSVC will definitely produce 64-bit executables if you configure it to. It's actually more a limitation they put on themselves, because it means that their running compiler must run as a 32-bit binary, which requires a higher level of discipline from their compiler designers.
  • It wants its newfangled CPU architecture back.

"Protozoa are small, and bacteria are small, but viruses are smaller than the both put together."

Working...