GCC 4.9 To See Significant Upgrades In 2014

noahfecks writes "It seems that the GCC developers are taking steps to roll out significant improvements after Clang became more competitive. 'Among the highlights to look forward to right now with GCC 4.9 are: The Undefined Behavior Sanitizer has been ported to GCC; Ada and Fortran have seen upgrades; Improved C++14 support; RX100, RX200, and RX600 processor support; and Intel Silvermont hardware support.'"
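As a rough, non-authoritative sketch of two of those highlights (the file name and the g++-4.9 command below are placeholders for whatever a 4.9 build ends up being called), the snippet uses a C++14 generic lambda, which 4.9 accepts under -std=c++1y, and a signed-integer overflow that the newly ported Undefined Behavior Sanitizer flags at runtime when built with -fsanitize=undefined:

    // ubsan_demo.cpp -- hypothetical example, not from the article.
    #include <cstdio>

    int main() {
        auto add = [](auto a, auto b) { return a + b; };  // C++14 generic lambda
        int big = 2147483647;                             // INT_MAX
        std::printf("%d\n", add(big, 1));                 // signed overflow: undefined behaviour
        return 0;
    }

    // Build and run (assuming the 4.9 driver is installed as g++-4.9):
    //   g++-4.9 -std=c++1y -fsanitize=undefined ubsan_demo.cpp && ./a.out
    // UBSan should then report a "signed integer overflow" runtime error.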
  • by Anonymous Coward on Sunday October 27, 2013 @02:40AM (#45249703)

    Most projects don't use C99 anyway.

  • by Anonymous Coward on Sunday October 27, 2013 @04:01AM (#45249861)

    What's your point? 4.8.2 is the second bugfix/stabilization release of 4.8.0 which was released in March this year. Should they stop releasing bug fixes as soon as they start developing the next generation compiler? Should they refrain from any new developments until the old version has proven to be bug free?

    What's wrong with continuing development that will likely result in a new version release next year?

  • by gweihir ( 88907 ) on Sunday October 27, 2013 @07:41AM (#45250389)

    My guess would be that some people just do not understand how release numbers work...

  • by Electricity Likes Me ( 1098643 ) on Sunday October 27, 2013 @07:55AM (#45250447)

    GPL does one very important thing well: it keeps the community alive. As long as people are modifying GPL code, they're obliged to contribute those modifications back.

    The problem with the BSD license is that, as soon as Apple feels they have the market sewn up, those patches are going to stop flowing very quickly, and probably not for the most rational reasons; remember, it's managers and executives who make these decisions, not coders.

  • by loufoque ( 1400831 ) on Sunday October 27, 2013 @10:44AM (#45251139)

    As a member of both the C and C++ standards committees, and as the CEO of a company that sells C++ libraries to businesses for high-performance computing, I have to disagree with you.

    The Oracle/Sun and IBM compilers are the worst C++ compilers available.
    Intel's is also pretty bad: despite touting good standards conformance and being designed for runtime speed, it deals very badly with abstraction penalties (see the sketch after this comment) and is extremely slow to compile.
    Microsoft's compiler is also pretty bad at compilation speed, standards conformance, and runtime speed, with each new version introducing quirks and regressions (they have acknowledged major codegen regressions in their recent releases and are investigating them).

    If you want a good C++ compiler, GCC and Clang are the only tools available.
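For readers wondering what "abstraction penalties" means in practice, here is a minimal, illustrative sketch (the function names are mine, not loufoque's): a compiler that handles abstraction well should compile both functions below to essentially the same machine code, while one that handles it badly leaves iterator and call overhead behind in the first.

    // abstraction_penalty.cpp -- illustrative only.
    #include <array>
    #include <numeric>

    // Sums through library abstractions: templates, iterators, std::accumulate.
    int sum_abstract(const std::array<int, 1024>& v) {
        return std::accumulate(v.begin(), v.end(), 0);
    }

    // Sums with a raw loop and no abstraction at all.
    int sum_raw(const int* p) {
        int s = 0;
        for (int i = 0; i < 1024; ++i)
            s += p[i];
        return s;
    }

    // Comparing the generated assembly, e.g. "g++ -O2 -S abstraction_penalty.cpp",
    // across compilers is one crude way to see how well each collapses the abstraction.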

  • by epyT-R ( 613989 ) on Sunday October 27, 2013 @03:00PM (#45252605)

    Yeah, because those interpreted/bytecode 'point-and-stick' languages are the wave of the future, right? Most of their interpreters are written in C++ too. Now we have applications that needed 1 MB in 1998 requiring hundreds of MB of RAM to do the same rudimentary things. Also, don't forget to add all the 'binding' dependencies needed to link that script-land with the real system libraries, which are also C/C++, so that it is actually useful. In most cases, a competent programmer can put together an equivalent C/C++ program with a binary size in the low hundreds of kB. It's smaller, faster, and has fewer dependencies and potential bugs, because there's less code running in the first place.

    There will always be at least one 'bare metal' language around, whether it be C/C++ or something else, because we have to be able to write for the hardware, and every programmer should be familiar with at least its basics.

  • by epyT-R ( 613989 ) on Sunday October 27, 2013 @03:04PM (#45252631)

    "We should forget about small efficiencies, say about 97% of the time: premature optimization is the root of all evil." -- Knuth

    I'd say misinterpretations of this statement are the root of all evil. They have led to a world full of slow, bloated runtimes doing little more than shoving strings around, because today's programmers were quoted this line by professors and interpreted it to mean "never bother, because the user will always have more RAM/CPU." It's the reason for all the trashy software out there.

  • by Wootery ( 1087023 ) on Sunday October 27, 2013 @05:26PM (#45253643)

    Why? Compilers are pretty simple; difficult for a lot of people to conceptualize, yes, but for those who can make that leap of understanding, not terribly difficult to design.

    Err, no. Let's look at C++ in particular, as it's pretty much a worst case when it comes to compiler implementation.

    These guys [wikipedia.org] make a living working on a C++ front-end. A front-end only. Intel licenses it because writing their own C++ front-end would be a tremendous effort; C++ is a hugely complex language, for machines (i.e. compiler front-ends) as well as for humans (see the parse example after this comment). The optimisation and back-end work is even more effort, especially if you want to be a serious competitor among today's compilers, which gcc certainly does.

    Getting these things right is, to put it mildly, not easy. Bugs in optimising compilers really do happen. Here [sourceforge.net]'s a compiler-bug warning I ran into just this week.

    Let's also not forget the scope of the gcc project: it's not 'just' a C++ -> x86/AMD64/IA-64 compiler, the way ICC is. It reads in [gnu.org] source-code in C, C++, Objective-C, Fortran, Java (in theory...), Ada, and Go, and emits machine code for a great many CPU architectures.

    Compilers are a legitimate sub-field of computer science, in the same way operating systems are. IBM invested in JikesRVM [wikipedia.org], a 'Research Virtual Machine' (for Java) for a reason. It's something some academics specialise in. Dismissing the field as "pretty simple" is hardly fair to its researchers and implementers.
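As one small, self-contained illustration of why a C++ front-end alone is such an effort (this example is mine, not Wootery's): even deciding whether a line declares an object or a function requires look-ahead over the whole declarator, the so-called "most vexing parse".

    // vexing_parse.cpp -- illustrative only; needs -std=c++11 or later for the brace syntax.
    struct Gadget {};
    struct Widget { Widget(Gadget) {} };

    Widget w(Gadget());   // declares a FUNCTION named w, not a Widget object
    Widget w2{Gadget{}};  // C++11 brace syntax gives the intended object

    int main() { return 0; }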

  • by RR ( 64484 ) on Sunday October 27, 2013 @05:56PM (#45253861)

    All that extra complexity you ascribe to them is based solely on looking at the size of it and going "zomfg! Where do I start?" ... which is probably why I got downmodded. Nobody understands that just because something is _big_ does not mean it is _complex_.

    No, I think you got downmodded because you have a habit of throwing around technical terms like you understand what you're saying, but you don't.

    Making something like GCC is not simple. The language specifications are written by humans and have a whole lot of edge cases and undefined behaviors. So you have to decide what to do in the edge cases, and how to take advantage of the undefined behaviors to improve performance (see the sketch after this comment). And then, a few years later, the specifications change and you have the fun of deciding, on a case-by-case basis, how to both maintain backwards compatibility and support the new features. Not to mention all the interactions with other people on the project and in the standards committees.

    A compiler is an upper-class undergraduate project. A competitive optimizing compiler with front-ends for C, C++, and several other languages and back-ends for a lot of different processors and operating systems is a very complicated job.
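To make the "take advantage of undefined behaviors" point concrete, here is a small, illustrative sketch (again mine, not RR's): because signed overflow is undefined in C and C++, GCC at -O2 is allowed to assume it never happens, which lets it fold the first function to a constant and reason cleanly about the loop's trip count in the second; compiling the same file with -fwrapv, which defines overflow as wrapping, takes that assumption away.

    // ub_optimization.cpp -- illustrative only.

    // With signed overflow assumed impossible, this can be folded to "return true".
    bool always_greater(int i) {
        return i + 1 > i;
    }

    // The same assumption lets the optimizer treat i as a non-wrapping induction
    // variable when deriving the loop's trip count.
    int count_up(int n) {
        int c = 0;
        for (int i = 0; i < n; i += 2)
            ++c;
        return c;
    }

    // Compare "g++ -O2 -S ub_optimization.cpp" with and without -fwrapv.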

"Engineering without management is art." -- Jeff Johnson

Working...