Elon Musk and John Carmack Discuss Neuralink, Programming Languages on Twitter (twitter.com)
Friday night CNET reported:
With a device surgically implanted into the skull of a pig named Gertrude, Elon Musk demonstrated his startup Neuralink's technology to build a digital link between brains and computers. A wireless link from the Neuralink computing device showed the pig's brain activity as it snuffled around a pen on stage Friday night.
Some reactions from Twitter:
- "The potential of #Neuralink is mind-boggling, but fuckkkk why would they use Bluetooth???"
- "they're using C/C++ too lmao"
But then videogame programming legend John Carmack responded: "Quality, reliable software can be delivered in any language, but language choice has an impact. For me, C would be a middle-of-the-road choice; better than a dynamic language like javascript or python, but not as good as a more modern, strongly statically typed language.
However, the existence of far more analysis tools for C is not an insignificant advantage. If you really care about robustness, you are going to architect everything more like old Fortran, with no dynamic allocations at all, and the code is going to look very simple and straightforward.
So an interesting question: What are the aspects of C++ that are real wins for that style over C? Range checked arrays would be good. What else?"
When asked "What's a better modern choice?" Carmack replied "Rust would be the obvious things, and I don't have any reason to doubt it would be good, but I haven't implemented even a medium sized application in it."
Then, somewhere in the discussion, Elon Musk made a joke about C's lack of "class" data structures: "I like C, because it avoids class warfare."
Musk also gave interesting responses to two more questions on Twitter: "Which is your fav programming language? Python?"
Elon Musk: Actually C, although the syntax could be improved esthetically
Could Neuralink simulate an alternate reality that could be entered at will, like Ready Player One? Implications for VR seem to be massive. Essentially, a simulation within a simulation if we're already in one ...
Elon Musk: Later versions of a larger device would have that potential
"They're using C/C++ Too LMAO" (Score:5, Funny)
Re:"They're using C/C++ Too LMAO" (Score:5, Funny)
To be fair twitter gives everyone a voice. That voice is cancer.
Re: "They're using C/C++ Too LMAO" (Score:3)
Re:"They're using C/C++ Too LMAO" (Score:4, Insightful)
That voice is cancer.
Not really. Like pretty much all communication on the web ever, you just have to filter it.
Usenet gave everyone a voice long before Twitter and the same thing was needed to sort the good stuff from the bad. I don't really know what Carmack has been up to lately, but you are getting to see him and several other people who have a clue chat about cutting edge tech in a very direct way, with the guy who leads the company developing it being involved (and actually seeming to have a clue as well). Pretty cool and well worth the need to filter out the bozos.
Re: (Score:2)
This! Slashdot was a better form of Usenet because of modding points. Now Twitter offers the advantage of hearing people like Carmack and Elon directly communicate with each other and everyone independent of any topic or post with the added ability of modding posts with hearts.
Re: (Score:2)
Re: (Score:2)
Usenet at least took a little effort to get set up and working.
Re: "They're using C/C++ Too LMAO" (Score:1)
I have seen the Harry Potter author flamed by a reader telling her to read the book. There is also a tweet telling the Pope to read the Bible.
Re: (Score:3)
Re: "They're using C/C++ Too LMAO" (Score:2)
Re: (Score:2)
Absolutely. Though I've got to admit that I have come to appreciate the JavaDoc, etc. technique of embedding documentation into the code itself. It keeps everything that needs to be updated at the same time in the same place. Plus it adds the benefits of hyperlinks, (sometimes temperamental) images, and nice HTML formatting.
Re: (Score:2)
Javadoc is basically a subset of Doxygen (https://en.wikipedia.org/wiki/Doxygen) built into the language. Both came out in the mid-'90s, Doxygen a bit after Java, but I don't remember if Doxygen was actually inspired by Javadoc or something else at that time.
Anyway, keeping your documentation inside the code files is not an idea that's limited to the Java language.
Re: (Score:2)
Yes, thank you! I couldn't remember that name offhand.
And embedded documentation certainly isn't restricted to Java - I've even dabbled at using it with C++. But if you have it, it makes requiring separate header files even more redundant and annoying.
Re: (Score:2)
Gonna say... (Score:2)
So, Elon isn't an æsthete.
Yeah, I know dropping the a is an American English spelling, but it's also an oxymoron, being un-æsthetic. One of his kid's names is Æ, not E, so no excuse.
Re: (Score:2)
If you're building embedded systems, there are only two languages: C and C++.
The syntax could be improved esthetically regardless which you choose.
They could use another language but it would inflate the hardware costs needlessly. Developers are not customers.
Re: (Score:2)
And assembly - though modern C compilers have gotten good enough that it's very rarely justified.
I've heard Rust compilers can actually get pretty close to C-level optimization as well, though I've gotten the impression that the language is still in enough flux that you probably wouldn't want to tackle any potentially long-term project using it.
Re: (Score:2)
Rust has a long way to go for that. First it has to be stable enough that people have confidence that if they dust off the source 15 years from now and change one line, they can compile a new version of the firmware without drama.
Next, most BSPs for embedded devices are in C or C++ and assume the code that links them in will be also. Same for developers' and 3rd-party libraries.
Re: (Score:2)
Well, I mentioned the stability, but that's a good point about libraries.
Neither inflates the hardware costs though, which is what I was specifically responding to. (Well, maybe a little - mature libraries do tend to be pretty well optimized)
Re: (Score:2)
Agreed that the libraries probably have a lot more influence over development and support costs than they do direct hardware costs.
For the win... (Score:5, Funny)
...they should program in Brainfuck. [wikipedia.org]
Re: (Score:1)
Fast, Hard Real-time = No malloc, No New (Score:4, Interesting)
For fast, hard real-time code, pure C code has advantages. Firstly, it is important to avoid malloc and new. Memory layouts need to be fixed, such that ISRs and DMA devices know that the memory is in a fixed location and ready for access. There are special memory allocation calls for non-paged memory (memory that can't be swapped out to disk.) It is really hard to beat C for this type of code.
C++ also works. However, for low-level code speed is important. The biggest danger with C++ is someone using the standard template library or standard C++ calls, which implicitly call new and malloc. These calls will often work, but create latent performance errors. It's nice to be able to view the C code and understand what is happening.
I encountered one company determined to write a hard real-time system in C#. Not so sure about that ...
Re:Fast, Hard Real-time = No malloc, No New (Score:5, Informative)
The biggest danger with C++ is someone using the standard template library or standard C++ calls, which implicitly call new and malloc.
If you are writing mission-critical code, you will be using a coding standard such as MISRA, linking to a library that has no malloc, and the STL header files will not exist.
There are static analysis tools that can detect MISRA violations, including recursion, pointer abuse, and naked if-else blocks.
Re: (Score:2)
Re: (Score:2)
MISRA more or less demonstrates why C is a horrible language to program anything safely.
MISRA-C is used to develop safe software all the time.
Do you remember when the Ariane-5 malfunctioned and exploded? Do you remember when Air France Flight 447 fell out of the sky?
Both were coded in Ada.
Most software failures are because of high-level design flaws, not coding errors or problems with a specific language.
Re: (Score:2)
Re: (Score:2)
Not necessarily. Some of the "features" of more managed code can cause all sorts of havoc on an embedded device. You do NOT want a hard real time system deciding at random to pause for garbage collection, for example. Sometimes it's better (and faster) to just accept that there may be a memory leak or two in less critical functions and hard reset periodically.
Re: (Score:2)
Re: (Score:2)
But those are common features of higher level languages.
Re: (Score:2)
The problem wi
Re: (Score:2)
Re: (Score:2)
No, but if your goal is to avoid malloc, pure C code makes calls explicit enough to be easy to avoid, including in most of the standard library. In C++ malloc is embedded into a lot of standard language features, as well as the standard library.
Re: (Score:2)
However, if the constraints on your programming are such that you cannot allocate memory then really you shouldn't be using C at all. Use a language designed around those constraints. For example most industrial control devices use an IEC-1131 language such as ST or IL where you static
Re: (Score:2)
Who says you can't allocate memory? Malloc is only for allocating memory from the heap, with all the complexities that can create. Stack-based allocation may still be just fine.
And yes, it's possible to avoid malloc in C++ - right up until you involve a non-guru programmer who isn't aware of all the places it gets used. The fact is that a whole lot of the "best practices" in C++ are going to invoke malloc by default. And if you're going to ban the best practices of a language for technical reasons... pe
Re: (Score:2)
Re:Fast, Hard Real-time = No malloc, No New (Score:4, Informative)
You can do hard real time in C++ just fine.
You need to replace the allocators you supply to the STL with deterministic ones.
Re: (Score:1)
Or just avoid dynamic allocation altogether. This is the same in C or C++,
or in any programming language for that matter.
C++ provides plenty of improvements over C in realtime or embedded systems,
e.g. RAII, more abstraction with templates, polymorphism etc. etc.
I've worked in an embedded project limited by policy to C. Guess what, the
project had polymorphism shoehorned back into C with a set of handcrafted
macros.
Re: (Score:2)
You can use C++ but you can't just drop any random C++ developer in and expect them to write good embedded code, which is the mistake a lot of companies make because HR sees "C++" on the requirements and the CV.
C++ on embedded has some nice features but also you end up working around its limitations, such as restrictions on type punning that is valid in C.
It's the same with people who insist that you must use -Wall. All you end up doing is throwing in loads of #pragmas to disable certain warnings (e.g. using
Override new (Score:2)
For fast, hard real-time code, pure C code has advantages. Firstly, it is important to avoid malloc and new. Memory layouts need to be fixed...
That's actually one of the advantages of C++: you can override the new allocator method for classes and assign addresses from a fixed memory layout and also cause inappropriate use of 'new' to fail in a helpful way. I used this in an embedded C++ project years ago so that there was a nice and simple interface for my fellow physicists that hid all the required memory shenanigans behind the standard new/delete interface they were used to.
You do have to be careful with C++ in an embedded setting but you ca
Re: (Score:2)
This would be typical in embedded programming and programmable logic controllers - declare what you use up front and nothing else. If C or C++ is being used against bare metal then likely the thing doesn't even have a proper C-lib to start with.
Re: (Score:2)
I think it's also easier to write bad code in C++ than in C - there are just so many more ways for the software engineer to make mistakes.
For some types of projects C code is a good tool.
Re: (Score:3)
C++ also works. However, for low-level code speed is important. The biggest danger with C++ is someone using the standard template library or standard C++ calls, which implicitly call new and malloc. These calls will often work, but create latent performance errors. It's nice to be able to view the C code and understand what is happening.
It's not a huge danger; code reviews exist. Beyond that on many deep embedded systems, you don't have the STL or memory allocation anyway. It's pretty annoying TBH, and the
Re: (Score:2)
Re: (Score:2)
"Rich Code for Tiny Computers" [youtube.com] presented by Jason Turner.
Yeah, I fondly remember that one. 8)
Re: (Score:2)
In many cases for embedded devices, the C++ is really a subset that's more like C with fancy scoping of variables.
Re: (Score:1)
For fast, hard real-time code, pure C code has advantages. Firstly, it is important to avoid malloc and new. Memory layouts need to be fixed, such that ISRs and DMA devices know that the memory is in a fixed location and ready for access. There are special memory allocation calls for non-paged memory (memory that can't be swapped out to disk.) It is really hard to beat C for this type of code.
C++ also works. However, for low-level code speed is important. The biggest danger with C++ is someone using the standard template library or standard C++ calls, which implicitly call new and malloc.
Duuude, that's so wrong. First of all, your C code runs in C++. Period. Second of all, whatever C can do, C++ can do better because templates can actually let compilers optimize based on the data type, as well as a few other things. And that thing about "someone using the STL which calls new and malloc" is like saying that the standard C library has malloc, so someone might call it.
No, wherever you have C, C++ can do a better job. This is already proven time and time again in the past 20 years, there's
Re: (Score:2)
The biggest danger with C++ is someone using the standard template library or standard C++ calls, which implicitly call new and malloc.
No, they don't. They implicitly use an allocator and the default allocator calls new. But you can provide your own allocator that doles out memory from statically-allocated pools. Placement new is also heavily-used in bare metal C++ coding.
Having done plenty of both, I'll take C++ over C every time, in every context. C++ gives me expressiveness and safety that C does not, and imposes no additional costs unless I choose to accept them. There's simply no benefit to using pure C, unless a team of C++ progr
"Later Versions Of A Larger Device" (Score:1)
Uh, yeah, Elon. That's what we've been saying about Artificial Intelligence and computers for about 80 years now.
Re: (Score:2)
Yeah, there have been no advances in machine learning in the last 80 years.
Re: (Score:1)
Small incremental advances, with a lot of crowing about it.
Re: (Score:2)
The term incremental is wrong here. "incremental: relating to or denoting an increase or addition, especially one of a series on a fixed scale."
The growth of anything in computing is by nature exponential instead. Part of the problem of early AI predictions was that they *did* take an incremental view of the problem but also vastly underestimated the amount of processing power of the brain, while also underestimating the complexity of various domains. It's largely limited by the hardware. A desktop computer
Re: (Score:2)
I'll also point out that critics keep shifting the goal posts.
The history of artificial intelligence is as much marked by what computers can’t do as what they can.
That’s not to say that the history of A.I. is a history of failure, but rather that, as a discipline, it has been driven forward in its quest for machine intelligence by a constant series of skeptical statements suggesting that “a computer will never [insert feat here].”
A computer will never learn. A computer will never play chess. A computer will never win at the game show Jeopardy! A computer will never be any good at translating languages. A computer will never drive a car. A computer will never win at Go, or StarCraft, or Texas Hold ‘Em.
So, a lot of the criticism is of the form "a computer will never do [insert feat here]" and there's always going to be a thing that they haven't done yet, so the anti-AI crowd end up sounding like creationists with their god-of-the-gaps arguments. I've had conversations where they end up saying "well ... a computer will never fall in love!" as the fallback. Ok, so a machine can beat 99% of people at every task except falling in love. Seems like a useful machine.
Re: (Score:2)
forgot to link source:
https://www.digitaltrends.com/... [digitaltrends.com]
I'll also point out some other fallbacks rather than love:
- a computer will never write a sonnet
- a computer will never paint a painting
Well, done, and done. But of course, once computers do write sonnets, the goal-posts shift that it needs to be a sonnet that you can't tell was written by a machine. But ... that's getting illogical. If two men, Jim and Barry both write sonnets, and Barry cannot write a sonnet that could pass for one of Jim's that in no
The need for security (Score:2)
Otherwise it will only be a matter of time before someone hacks their pig.
These two have something else in common: (Score:2)
Rockets! John Carmack's Armadillo Aerospace was a low-budget pioneer of the VTVL rocketry path that Elon followed.
learning mode (Score:2)
Presumably it can learn to grab intent. I mean, look at your arm and think about moving it without actually moving it ... practice that enough and there is a point where you can't will your arm to move. A bit scary to say the least. Not sure where in the depths of willpower the signal to move a muscle emerges. It's locked in the consciousness somewhere; maybe Neuralink can uncover that -- and it may hold the key to unraveling consciousness and sentience itself.
Re: (Score:2)
He made his reputation in graphics processing for games, inventing a number of algorithms that allowed for high-speed rendering back in the day before you could just throw GPU power at everything. He's more of a business leader now - he has a ton of money from the sale of id Software - but he does have a programming background.
Re: (Score:2)
"Carmack has pioneered or popularized the use of many techniques in computer graphics, including "adaptive tile refresh" for Commander Keen, ray casting for Hovertank 3-D, Catacomb 3-D, and Wolfenstein 3-D, binary space partitioning which Doom became the first game to use, surface caching which he invented for Quake, Carmack's Reverse (formally known as z-fail stencil shadows) which he devised for Doom 3, and MegaTexture technology, first used in Enemy Territory: Quake Wars."
"early in the development of Doo
Re: (Score:2)
Seriously? What are you a fool? By that measure Pythagoras (or whoever invented his theorem) was an idiot. In addition, given the fact that the only way a brain could generate such a stupid statement is if its IQ is in the low double digits, I doubt YOU could implement an FPS or game engine even after being shown how to do it.
Also, Carmack did do most of the game engine coding and given that he's smarter than you his statements have more value than yours.
I can't even... (Score:2)
"Could Neuralink simulate an alternate reality that could be entered at will, like Ready Player One?
Elon Musk: Later versions of a larger device would have that potential"
Elon sure knows how to play millennials. His results are impressive but to be the hero millennials want he has to hype everything to these ridiculous levels. Yeah, today we're at detecting a pig touching things with its snout but we're looking at simulating alternate realities. We have a car that can follow lines but we will have 1000000 self
Re: (Score:2)
That's why Musk hires literally thousands of programmers and you are just one. See the difference?
Re: (Score:2)
Of course I see it. As I've said, I'm impressed by his results. He's a great businessman. It's just irritating that everything has to be hyped to these absurd levels. I think it's because millennials are very hype-driven. They don't just want nice things. They want the 'greatest thing ever created omg so awesome best innovation ever'. It's silly.
But it's just my opinion...
Re: (Score:2)
Who's the cranky one? Projection much?
I'm not saying Musk is a millennial. I'm saying he's hyping things to get to millennials. Every sane person knows that emulating alternative realities with a chip in your brain is not in the reach of this tech, not now, not in 50 years. So why make a fool of yourself like that? Because millennials will buy that. Hype it up, they will eat it.
Also, I was born in 84.
I Vote for Rust (Score:2)
While C++ and classical OO has become widely taught and used, it has nevertheless failed in literally every originally stated objective. I first programmed in Basic, then assembly, then C, then C++, and on it went. Today, mostly JavaScript and C# with a little PowerShell on the side.
Software used to be much easier. It was with C++ that simple things started to become extremely complicated. I fled to JavaScript for applications. I could just get so much done so much faster and, it turned out, t
Why bluetooth? (Score:2)
The pigsty hasn't got 5G yet, some pigs burned down the pole.
Hm, (Score:2)
I love Musk / Carmack discussions (Score:2)
John Carmack is good at bringing out Elon Musk's best side: that of an ambitious and competent engineer.
I hate his fandom, the way he communicates and the way he does business. However when someone gets him to talk like an engineer and not like a salesman, he becomes really interesting.