Programming

Elon Musk and John Carmack Discuss Neuralink, Programming Languages on Twitter (twitter.com)

Friday night CNET reported: With a device surgically implanted into the skull of a pig named Gertrude, Elon Musk demonstrated his startup Neuralink's technology to build a digital link between brains and computers. A wireless link from the Neuralink computing device showed the pig's brain activity as it snuffled around a pen on stage Friday night.
Some reactions from Twitter:

- "The potential of #Neuralink is mind-boggling, but fuckkkk why would they use Bluetooth???"

- "they're using C/C++ too lmao"

But then videogame programming legend John Carmack responded: "Quality, reliable software can be delivered in any language, but language choice has an impact. For me, C would be a middle-of-the-road choice; better than a dynamic language like JavaScript or Python, but not as good as a more modern, strongly statically typed language.

However, the existence of far more analysis tools for C is not an insignificant advantage. If you really care about robustness, you are going to architect everything more like old Fortran, with no dynamic allocations at all, and the code is going to look very simple and straightforward.

So an interesting question: What are the aspects of C++ that are real wins for that style over C? Range checked arrays would be good. What else?
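Carmack's range-checked-arrays point is easy to illustrate. A minimal C++ sketch (std::array::at is the bounds-checked accessor; a real embedded build would likely disable exceptions, so treat this as illustrative only):

    #include <array>
    #include <cstdio>
    #include <stdexcept>

    int main() {
        std::array<int, 4> a{1, 2, 3, 4};
        // a[7] would be silent undefined behavior, exactly as with a raw C array
        try {
            std::printf("%d\n", a.at(7));   // .at() checks the index and throws
        } catch (const std::out_of_range&) {
            std::printf("out-of-range access caught\n");
        }
    }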

When asked "What's a better modern choice?" Carmack replied "Rust would be the obvious thing, and I don't have any reason to doubt it would be good, but I haven't implemented even a medium sized application in it."

Then, somewhere in the discussion, Elon Musk joked about C's lack of "class" data structures: "I like C, because it avoids class warfare."

Musk also gave interesting responses to two more questions on Twitter. The first: "Which is your fav programming language? Python?"

Elon Musk: Actually C, although the syntax could be improved esthetically

Could Neuralink simulate an alternate reality that could be entered at will, like Ready Player One? Implications for VR seem to be massive. Essentially, a simulation within a simulation if we're already in one ...

Elon Musk: Later versions of a larger device would have that potential

This discussion has been archived. No new comments can be posted.

  • by NoSleepDemon ( 1521253 ) on Saturday August 29, 2020 @11:37PM (#60454464)
    Spoken with the ignorance of actual, useful programming work that only a lifetime in front-end web development could afford. Go back to murdering my phone's performance, you silly muppet.
    • by ArchieBunker ( 132337 ) on Saturday August 29, 2020 @11:53PM (#60454488)

      To be fair, Twitter gives everyone a voice. That voice is cancer.

      • Malignant is certainly an apt description.
      • by BrainJunkie ( 6219718 ) on Sunday August 30, 2020 @12:44AM (#60454550)

        That voice is cancer.

        Not really. Like pretty much all communication on the web ever, you just have to filter it.

        Usenet gave everyone a voice long before Twitter and the same thing was needed to sort the good stuff from the bad. I don't really know what Carmack has been up to lately, but you are getting to see him and several other people who have a clue chat about cutting edge tech in a very direct way, with the guy who leads the company developing it being involved (and actually seeming to have a clue as well). Pretty cool and well worth the need to filter out the bozos.

        • by ed1park ( 100777 )

          This! Slashdot was a better form of Usenet because of modding points. Now Twitter offers the advantage of hearing people like Carmack and Elon communicate directly with each other and with everyone else, independent of any topic or post, with the added ability of modding posts with hearts.

        • Comment removed based on user account deletion
        • Usenet at least took a little effort to get set up and working.

      • I have seen the Harry Potter author flamed by a reader telling her to read the book. There is also a tweet telling the Pope to read the Bible.

    • by Twinbee ( 767046 )
      I'll never forgive C for forcing you to write those stupid header files. Boilerplate / duplicate code is perhaps my number one enemy.
      • That's where you write your documentation. If you don't write documentation, I don't forgive you.
        • Absolutely. Though I've got to admit that I have come to appreciate the JavaDoc, etc. technique of embedding documentation into the code itself. Keeps everything that needs to be updated at the same time, in the same place. Plus adds the benefits of hyperlinks and (sometimes temperamental) images, and nice HTML formatting.

          • by andi75 ( 84413 )

            Javadoc is basically a subset of Doxygen (https://en.wikipedia.org/wiki/Doxygen) built into the language. Both came out in the mid-'90s, Doxygen a bit after Java, but I don't remember if Doxygen was actually inspired by Javadoc or something else at that time.

            Anyway, keeping your documentation inside the code files is not an idea that's limited to the Java language.

            • Yes, thank you! I couldn't remember that name offhand.

              And embedded documentation certainly isn't restricted to Java - I've even dabbled at using it with C++. But if you have it, it makes requiring separate header files even more redundant and annoying.

        • by Twinbee ( 767046 )
          You can write the docs in the main code, and then the IDE can display the summary of headers (with accompanying docs) automatically. Leave duplication and automation to the computers.
  • "Elon Musk: Actually C, although the syntax could be improved esthetically"

    So, Elon isn't an æsthete.

    Yeah, I know dropping the a is an American English spelling, but it's also an oxymoron, being un-æsthetic. One of his kids' names is Æ, not E, so no excuse.
    • If you're building embedded systems, there are only two languages: C and C++.

      The syntax could be improved esthetically regardless which you choose.

      They could use another language but it would inflate the hardware costs needlessly. Developers are not customers.

      • And assembly - though modern C compilers have gotten good enough that it's very rarely justified.

        I've heard Rust compilers can actually get pretty close to C-level optimization as well, though I've gotten the impression that the language is still in enough flux that you probably wouldn't want to tackle any potentially long-term project using it.

        • by sjames ( 1099 )

          Rust has a long way to go for that. First it has to be stable enough that people have confidence that if they dust off the source 15 years from now and change one line, they can compile a new version of the firmware without drama.

          Next, most BSPs for embedded devices are in C or C++ and assume the code that links them in will be too. The same goes for developers' and third-party libraries.

          • Well, I mentioned the stability, but that's a good point about libraries.

            Neither inflates the hardware costs though, which is what I was specifically responding to. (Well, maybe a little - mature libraries do tend to be pretty well optimized)

            • by sjames ( 1099 )

              Agreed that the libraries probably have a lot more influence over development and support costs than they do direct hardware costs.

  • by ClickOnThis ( 137803 ) on Sunday August 30, 2020 @12:02AM (#60454500) Journal

    ...they should program in Brainfuck. [wikipedia.org]

    • by Anonymous Coward
      A "discussion" on Twitter is as useful as a program in Brainfuck.
  • by Cassini2 ( 956052 ) on Sunday August 30, 2020 @12:02AM (#60454502)

    For fast, hard real-time code, pure C code has advantages. Firstly, it is important to avoid malloc and new. Memory layouts need to be fixed, such that ISRs and DMA devices know that the memory is in a fixed location and ready for access. There are special memory allocation calls for non-paged memory (memory that can't be swapped out to disk.) It is really hard to beat C for this type of code.

    C++ also works. However, for low-level code speed is important. The biggest danger with C++ is someone using the standard template library or standard C++ calls, which implicitly call new and malloc. These calls will often work, but create latent performance errors. It's nice to be able to view the C code and understand what is happening.

    I encountered one company determined to write a hard real-time system in C#. Not so sure about that ...
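    A minimal sketch of that static, fixed-layout style (the linker-section name, alignment, and ISR are illustrative; the details depend on the MCU and linker script):

        #include <stdint.h>

        /* Everything statically allocated at link time; no malloc/new anywhere.
           The DMA engine is pointed at rx_buf once at init; the address never moves. */
        __attribute__((section(".dma_buffers"), aligned(32)))   /* GCC-style attribute */
        static volatile uint8_t rx_buf[256];

        static volatile uint16_t rx_len = 0;   /* written by the ISR, read by the main loop */

        void uart_rx_isr(void) {               /* hypothetical interrupt handler */
            rx_len = 256;                      /* e.g. read the DMA transfer-count register */
        }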

    • by ShanghaiBill ( 739463 ) on Sunday August 30, 2020 @12:16AM (#60454514)

      The biggest danger with C++ is someone using the standard template library or standard C++ calls, which implicitly call new and malloc.

      If you are writing mission-critical code, you will be using a coding standard such as MISRA, linking to a library that has no malloc, and the STL header files will not exist.

      There are static analysis tools that can detect MISRA violations, including recursion, pointer abuse, and naked if-else blocks.
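      A toy illustration of the style such a standard enforces (the trailing-else and single-exit requirements are in MISRA C:2012; this is a sketch from memory, not the standard text):

          #include <stdint.h>

          static int32_t clamp_speed(int32_t speed) {
              int32_t result;
              if (speed > 100) {
                  result = 100;
              } else if (speed < 0) {
                  result = 0;
              } else {
                  result = speed;   /* trailing else is mandatory, even when it looks redundant */
              }
              return result;        /* single point of exit; no recursion; fixed-width types */
          }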

      • by DrXym ( 126579 )
        MISRA more or less demonstrates why C is a horrible language to program anything safely.
        • MISRA more or less demonstrates why C is a horrible language to program anything safely.

          MISRA-C is used to develop safe software all the time.

          Do you remember when the Ariane-5 malfunctioned and exploded? Do you remember when Air France Flight 447 fell out of the sky?

          Both were coded in Ada.

          Most software failures are because of high-level design flaws, not coding errors or problems with a specific language.

          • by DrXym ( 126579 )
            MISRA-C is regular C with validation thrown to the static analysis gods in the hope that it's done safely. It's a super-dumb way to develop. Perhaps it was understandable at the time, but these days there are better options.
        • by sjames ( 1099 )

          Not necessarily. Some of the "features" of more managed code can cause all sorts of havoc on an embedded device. You do NOT want a hard real time system deciding at random to pause for garbage collection, for example. Sometimes it's better (and faster) to just accept that there may be a memory leak or two in less critical functions and hard reset periodically.

          • by DrXym ( 126579 )
            I didn't say anything about garbage collection or managed code.
            • by sjames ( 1099 )

              But those are common features of higher level languages.

              • by DrXym ( 126579 )
                I didn't say anything about higher level languages either. If you absolutely need real-time performance you could get it in languages like Ada or Rust that have no additional runtime overheads compared to C. It's just that the compiler does more up front to stop you producing dangerous code. If you need *hard* real time then you might even use languages designed for that purpose, e.g. industrial control devices tend to use an IEC-1131 dialect like IL or ST, or visual languages like ladder logic.

                The problem ...

    • malloc is not "pure C code"? (There might also be D's -betterC mode for those who can appreciate it.)
      • No, but if your goal is to avoid malloc, pure C code makes calls explicit enough to be easy to avoid, including in most of the standard library. In C++ malloc is embedded into a lot of standard language features, as well as the standard library.

        • by DrXym ( 126579 )
          Since C++ is almost a superset of C, it's no harder to avoid memory allocation there than it is in C. You would need additional rules to follow, e.g. don't use std collections and don't use new & delete, but it's mostly the same.

          However, if the constraints on your programming are such that you cannot allocate memory, then really you shouldn't be using C at all. Use a language designed around those constraints. For example, most industrial control devices use an IEC-1131 language such as ST or IL, where you ...

          • Who says you can't allocate memory? Malloc is only for allocating memory from the heap, with all the complexities that can create. Stack-based allocation may still be just fine.

            And yes, it's possible to avoid malloc in C++ - right up until you involve a non-guru programmer who isn't aware of all the places it gets used. The fact is that a whole lot of the "best practices" in C++ are going to invoke malloc by default. And if you're going to ban the best practices of a language for technical reasons ...

            • by DrXym ( 126579 )
              The way to "allocate" from the stack is to move the stack pointer, a la alloca(). That's not really memory allocation in the normal sense of the word, although it might be adequate in some cases. But note that embedded software tends to have a smaller stack, so not always. And usually memory allocation means the software has a reserved chunk of memory called a heap; when you "allocate" memory from it, the space is reserved and then released when you free() it. The issue is that aside from leaks ...
    • by CaptainLugnuts ( 2594663 ) on Sunday August 30, 2020 @12:44AM (#60454554)

      You can do hard real time in C++ just fine.
      You need to replace the allocators you supply to the STL with deterministic ones.
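      A minimal sketch of that approach: a toy bump allocator over a static pool (deallocate is a deliberate no-op, so this only suits containers sized once at startup; all names here are invented):

          #include <cstddef>
          #include <new>
          #include <vector>

          alignas(std::max_align_t) static std::byte g_pool[4096];  // one static pool
          static std::size_t g_used = 0;

          template <typename T>
          struct PoolAlloc {
              using value_type = T;
              PoolAlloc() = default;
              template <typename U> PoolAlloc(const PoolAlloc<U>&) noexcept {}

              T* allocate(std::size_t n) {
                  constexpr std::size_t A = alignof(std::max_align_t);
                  const std::size_t bytes = (n * sizeof(T) + A - 1) & ~(A - 1);
                  if (g_used + bytes > sizeof(g_pool)) throw std::bad_alloc{};
                  T* p = reinterpret_cast<T*>(g_pool + g_used);
                  g_used += bytes;                          // O(1) and deterministic
                  return p;
              }
              void deallocate(T*, std::size_t) noexcept {}  // deliberate no-op
          };
          template <class T, class U> bool operator==(PoolAlloc<T>, PoolAlloc<U>) { return true; }
          template <class T, class U> bool operator!=(PoolAlloc<T>, PoolAlloc<U>) { return false; }

          std::vector<int, PoolAlloc<int>> readings;        // reserve() once at init, then fixed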

      • Or just avoid dynamic allocation altogether. This is the same in C or C++,
        or in any programming language for that matter.

        C++ provides plenty of improvements over C in realtime or embedded systems,
        e.g. RAII, more abstraction with templates, polymorphism etc. etc.

        I've worked on an embedded project limited by policy to C. Guess what: the
        project had polymorphism shoehorned back into C with a set of handcrafted
        macros.
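        For the curious, that shoehorned-polymorphism pattern usually looks something like this (a sketch with invented names; compiles as either C or C++):

            #include <stdio.h>

            /* Hand-rolled "virtual functions": a struct of function pointers plays the vtable. */
            struct sensor_ops {
                int  (*read)(void *self);
                void (*reset)(void *self);
            };

            struct temp_sensor {
                const struct sensor_ops *ops;   /* dispatch table */
                int last_value;
            };

            static int  temp_read(void *self)  { return ((struct temp_sensor *)self)->last_value; }
            static void temp_reset(void *self) { ((struct temp_sensor *)self)->last_value = 0; }

            static const struct sensor_ops temp_ops = { temp_read, temp_reset };

            int main(void) {
                struct temp_sensor t = { &temp_ops, 42 };
                printf("%d\n", t.ops->read(&t));   /* the "virtual call" */
                t.ops->reset(&t);
                return 0;
            }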

      • by AmiMoJo ( 196126 )

        You can use C++ but you can't just drop any random C++ developer in and expect them to write good embedded code, which is the mistake a lot of companies make because HR sees "C++" on the requirements and the CV.

        C++ on embedded has some nice features, but you also end up working around its limitations, such as restrictions on type punning that is valid in C.

        It's the same with people who insist that you must use -Wall. All you end up doing is throwing in loads of #pragmas to disable certain warnings ...

    • For fast, hard real-time code, pure C code has advantages. Firstly, it is important to avoid malloc and new. Memory layouts need to be fixed...

      That's actually one of the advantages of C++: you can override the new allocator method for classes and assign addresses from a fixed memory layout and also cause inappropriate use of 'new' to fail in a helpful way. I used this in an embedded C++ project years ago so that there was a nice and simple interface for my fellow physicists that hid all the required memory shenanigans behind the standard new/delete interface they were used to.

      You do have to be careful with C++ in an embedded setting, but you ...
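      A minimal sketch of that class-level operator new trick (pool geometry invented; a real pool would be pinned via the linker script):

          #include <cstddef>
          #include <new>

          class Event {
              static constexpr std::size_t kSlots = 8, kSlotSize = 32;
              alignas(std::max_align_t) static inline unsigned char pool[kSlots][kSlotSize];
              static inline bool used[kSlots] = {};
              int payload[4] = {};

          public:
              static void* operator new(std::size_t sz) {
                  if (sz <= kSlotSize) {
                      for (std::size_t i = 0; i < kSlots; ++i)
                          if (!used[i]) { used[i] = true; return pool[i]; }
                  }
                  throw std::bad_alloc{};   // inappropriate or excess use fails visibly
              }
              static void operator delete(void* p) noexcept {
                  for (std::size_t i = 0; i < kSlots; ++i)
                      if (p == pool[i]) { used[i] = false; return; }
              }
          };

          // Callers write `new Event` / `delete e` as usual; the heap is never touched.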

      • by DrXym ( 126579 )
        You could do that, but at that point you've more or less invented a crude heap. If you look at complex embedded software like Marlin, they basically avoid using new & delete, or classes that do, altogether. Everything is declared on the stack or globally.

        This would be typical in embedded programming and programmable logic controllers - declare what you use up front and nothing else. If C or C++ is being used against bare metal then likely the thing doesn't even have a proper C-lib to start with.

    • I think it's also easier to write bad code in C++ than in C - there are just so many more ways for the software engineer to make mistakes.

      For some types of projects C code is a good tool.

    • C++ also works. However, for low-level code speed is important. The biggest danger with C++ is someone using the standard template library or standard C++ calls, which implicitly call new and malloc. These calls will often work, but create latent performance errors. It's nice to be able to view the C code and understand what is happening.

      It's not a huge danger; code reviews exist. Beyond that, on many deep embedded systems you don't have the STL or memory allocation anyway. It's pretty annoying TBH, and the ...

    • There was this guy at a C++ conference a few years ago who wrote "Pong" for the C64 in C++17 with classes and STL (no containers, only algorithms and iterators over const arrays, so it ended up all being done at compile time). I think it compiled down to a few hundred bytes of code that used about 6 bytes of (mutable) RAM (statically allocated at a few 0x000x addresses).
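      Not that demo, but a minimal C++17 sketch of the compile-time flavor being described (table values are placeholders):

          #include <array>
          #include <cstdint>

          // The whole computation runs in the compiler; only the result occupies the binary.
          constexpr std::array<std::uint8_t, 16> make_table() {
              std::array<std::uint8_t, 16> t{};
              for (std::size_t i = 0; i < t.size(); ++i)
                  t[i] = static_cast<std::uint8_t>((i * i) & 0xFF);
              return t;
          }
          constexpr auto kTable = make_table();   // lives in ROM; no mutable RAM used
          static_assert(kTable[3] == 9);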
    • by sjames ( 1099 )

      In many cases for embedded devices, the C++ is really a subset that's more like C with fancy scoping of variables.

    • For fast, hard real-time code, pure C code has advantages. Firstly, it is important to avoid malloc and new. Memory layouts need to be fixed, such that ISRs and DMA devices know that the memory is in a fixed location and ready for access. There are special memory allocation calls for non-paged memory (memory that can't be swapped out to disk.) It is really hard to beat C for this type of code.

      C++ also works. However, for low-level code speed is important. The biggest danger with C++ is someone using the standard template library or standard C++ calls, which implicitly call new and malloc.

      Duuude, that's so wrong. First of all, your C code runs in C++. Period. Second of all, whatever C can do, C++ can do better, because templates can actually let compilers optimize based on the data type, as well as a few other things. And that thing about "someone using the STL, which calls new and malloc" is like saying that the standard C library has malloc, so someone might call it.

      No, wherever you have C, C++ can do a better job. This has been proven time and time again over the past 20 years; there's ...

    • The biggest danger with C++ is someone using the standard template library or standard C++ calls, which implicitly call new and malloc.

      No, they don't. They implicitly use an allocator, and the default allocator calls new. But you can provide your own allocator that doles out memory from statically allocated pools. Placement new is also heavily used in bare-metal C++ coding.

      Having done plenty of both, I'll take C++ over C every time, in every context. C++ gives me expressiveness and safety that C does not, and imposes no additional costs unless I choose to accept them. There's simply no benefit to using pure C, unless a team of C++ ...
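      A minimal placement-new sketch (invented names): construct the object in storage you already own, so the heap never enters the picture:

          #include <new>

          struct MotorController {
              explicit MotorController(int channel) : ch(channel) {}
              int ch;
          };

          alignas(MotorController) static unsigned char slot[sizeof(MotorController)];

          MotorController* init_motor() {
              return new (slot) MotorController(2);   // constructor runs at a fixed, pre-reserved address
          }
          // Teardown, if ever needed, is an explicit destructor call: p->~MotorController();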

  • Uh, yeah, Elon. That's what we've been saying about Artificial Intelligence and computers for about 80 years now.

    • Yeah, there have been no advances in machine learning in the last 80 years.

      • Small incremental advances, with a lot of crowing about it.

        • The term incremental is wrong here. "incremental: relating to or denoting an increase or addition, especially one of a series on a fixed scale."

          The growth of anything in computing is by nature exponential instead. Part of the problem of early AI predictions was that they *did* take an incremental view of the problem but also vastly underestimated the amount of processing power of the brain, while also underestimating the complexity of various domains. It's largely limited by the hardware. A desktop computer ...

        • I'll also point out that critics keep shifting the goal posts.

          The history of artificial intelligence is as much marked by what computers can’t do as what they can.

          That’s not to say that the history of A.I. is a history of failure, but rather that, as a discipline, it has been driven forward in its quest for machine intelligence by a constant series of skeptical statements suggesting that “a computer will never [insert feat here].”

          A computer will never learn. A computer will never play chess. A computer will never win at the game show Jeopardy! A computer will never be any good at translating languages. A computer will never drive a car. A computer will never win at Go, or StarCraft, or Texas Hold ‘Em.

          So, a lot of the criticism is of the form "a computer will never do X" and there's always going to be a thing that they haven't done yet, so the anti-AI crowd end up sounding like creationists with their god-of-the-gaps arguments. I've had conversations where they end up saying "well ... a computer will never fall in love!" as the fallback. Ok, so a machine can beat 99% of people at every task except falling in love. Seems like a useful machine.

          • forgot to link source:
            https://www.digitaltrends.com/... [digitaltrends.com]

            I'll also point out some other fallbacks rather than love:
            - a computer will never write a sonnet
            - a computer will never paint a painting

            Well, done, and done. But of course, once computers do write sonnets, the goal-posts shift: now it needs to be a sonnet that you can't tell was written by a machine. But ... that's getting illogical. If two men, Jim and Barry, both write sonnets, and Barry cannot write a sonnet that could pass for one of Jim's, that in no ...

  • I think they should be looking at languages that are inherently secure.
    Otherwise it will only be a matter of time before someone hacks their pig.
  • Rockets! John Carmack's Armadillo Aerospace was a low-budget pioneer of the VTVL rocketry path that Elon followed.

  • Presumably it can learn to grab intent. I mean, look at your arm and think about moving it without actually moving it. Practice that enough and there is a point where you can't will your arm to move. A bit scary to say the least. Not sure where in the depths of willpower the signal to move a muscle emerges. It's locked in the consciousness somewhere; maybe Neuralink can uncover that -- and it may hold the key to unraveling consciousness and sentience itself.

  • "Could Neuralink simulate an alternate reality that could be entered at will, like Ready Player One?
    Elon Musk: Later versions of a larger device would have that potential"

    Elon sure knows how to play millennials. His results are impressive, but to be the hero millennials want he has to hype everything to these ridiculous levels. Yeah, today we're at detecting a pig touching things with its snout, but we're looking at simulating alternate realities. We have a car that can follow lines, but we will have 1000000 self ...

    • That's why Musk hires literally thousands of programmers and you just are one. See the difference?

      • by jandoe ( 6400032 )

        Of course I see it. As I've said, I'm impressed by his results. He's a great businessman. It's just irritating that everything has to be hyped to these absurd levels. I think it's because millennials are very hype-driven. They don't just want nice things. They want the 'greatest thing ever created omg so awesome best innovation ever'. It's silly.
        But it's just my opinion...

  • While C++ and classical OO has become widely taught and used, it has nevertheless failed in literally every originally stated objective. I first programmed in Basic, then assembly, then C, then C++, and on it went. Today, mostly JavaScript and C# with a little PowerShell on the side.

    Software used to be much easier. I found that it was with C++ that simple things started to become extremely complicated. I fled to JavaScript for applications. I could just get so much done so much faster and, it turned out, ...

  • The pigsty hasn't got 5G yet, some pigs burned down the pole.

  • Isn't that kind of like discussing leashes for Unicorns?
  • John Carmack is good at bringing out Elon Musk's best side: that of an ambitious and competent engineer.

    I hate his fandom, the way he communicates, and the way he does business. However, when someone gets him to talk like an engineer and not like a salesman, he becomes really interesting.
