'Mojo May Be the Biggest Programming Language Advance In Decades' (www.fast.ai)
Mojo is a new programming language developed by Modular that aims to address the performance and deployment limitations of Python in areas like AI model development. After demoing Mojo prior to its launch, Jeremy Howard from the non-profit research group fast.ai said it feels like coding will never be the same again. Here's an excerpt from Howard's article: Modular is a fairly small startup that's only a year old, and only one part of the company is working on the Mojo language. Mojo development was only started recently. It's a small team, working for a short time, so how have they done so much? The key is that Mojo builds on some really powerful foundations. Very few software projects I've seen spend enough time building the right foundations, and as a result they tend to accrue mounds of technical debt. Over time, it becomes harder and harder to add features and fix bugs. In a well designed system, however, every feature is easier to add than the last one, is faster, and has fewer bugs, because the foundations each feature builds upon are getting better and better. Mojo is a well designed system.
At its core is MLIR (Multi-Level Intermediate Representation), which has already been developed for many years, initially kicked off by Chris Lattner at Google. He had recognized what the core foundations for an "AI era programming language" would need, and focused on building them. MLIR was a key piece. Just as LLVM made it dramatically easier for powerful new programming languages to be developed over the last decade (such as Rust, Julia, and Swift, which are all based on LLVM), MLIR provides an even more powerful core to languages that are built on it. Another key enabler of Mojo's rapid development is the decision to use Python as the syntax. Developing and iterating on syntax is one of the most error-prone, complex, and controversial parts of the development of a language. By simply outsourcing that to an existing language (which also happens to be the most widely used language today) that whole piece disappears! The relatively small number of new bits of syntax needed on top of Python then largely fit quite naturally, since the base is already in place.
The next step was to create a minimal Pythonic way to call MLIR directly. That wasn't a big job at all, but it was all that was needed to then create all of Mojo on top of that -- and work directly in Mojo for everything else. That meant that the Mojo devs were able to "dog-food" Mojo when writing Mojo, nearly from the very start. Any time they found something didn't quite work great as they developed Mojo, they could add a needed feature to Mojo itself to make it easier for them to develop the next bit of Mojo! You can give Mojo a try here.
Maybe, maybe not (Score:5, Funny)
How about you kids go duke it out with the rust kids, eh?
"Son of a bitch. He stole my line" (Score:5, Funny)
https://m0.joe.ie/wp-content/u... [m0.joe.ie]
Re: (Score:2)
There, I just took AI's job of writing every clickbait headline about a new programming language ever. The robots are unemployed now.
So it's Python ... (Score:4, Insightful)
and a lot of current buzzwords - Whoop de do ...
Re: (Score:2)
Does it rely on whitespace?
Re: (Score:2)
IIUC, it's using the LLVM compiler and has a few changes to make it friendlier to optimizers. I've no idea how it would compare against Pypy or Cython.
Re: (Score:3)
> IIUC, it's using the LLVM compiler
No. As it says right in the article, it uses a new intermediate representation, and most of the work was building a compiler for that. What it does *not* say in the summary is that the intermediate in question is designed to run AI-like massive threading tasks, as opposed to the finely tuned, smaller application-like code that LLVM specializes in.
> I've no idea how it would compare against Pypy or Cython
Apparently it crushes them, at least according to the people who have reviewed it.
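To put a shape on what's being compared: the LLVM-based tools they benchmark against (Numba, Cython, etc.) already compile numeric kernels like the one below to native code. This is just an illustrative sketch using Numba, not one of their actual benchmarks:

    from numba import njit

    @njit  # JIT-compiles the function to native code via LLVM on first call
    def mandel_point(cr: float, ci: float, max_iter: int) -> int:
        zr = zi = 0.0
        for i in range(max_iter):
            zr, zi = zr * zr - zi * zi + cr, 2.0 * zr * zi + ci
            if zr * zr + zi * zi > 4.0:
                return i  # escaped: return the iteration count
        return max_iter

    print(mandel_point(-0.75, 0.1, 1000))

The question the reviewers are answering is whether Mojo beats this kind of JIT output on massively parallel workloads, not whether it beats plain CPython.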
Not this again (Score:5, Insightful)
If you read the full article, it's essentially just a solution to Python having poor performance.
I mean, okay, but being able to write in a high-level language yet get the performance of a low-level one has been the pipe dream of language development for decades now. Ultimately, unless there is some step change in computing power, you just can't abstract away the underlying von Neumann architecture and then get a big shock when that architecture chokes on some high-level code sequence that hammers dynamic memory or pipelines, etc.
In many respects, the reasons why Python is so successful now is that we have very powerful machines compared to most of the tasks it is used for. In the 1980s a language like Python would have required very careful use to avoid a single overly ambitious line destroying performance or consuming all the memory. Programmers had to understand the hardware to do most useful things.
Today, AI is at the bleeding edge of hardware capabilities, so really you just need to have someone who has a bit of an idea of how the hardware is working. Trying to 'solve' this problem of ignorance with fancy solutions is unlikely to work that well, and ultimately, I just don't understand why it's so hard for someone to learn the basics of CPU/GPU architecture. If you're doing serious AI research this should not be beyond you - it's pretty simple stuff.
I see the same ignorance in a lot of Javascript. It is possible to absolutely destroy a modern CPU doing something very simple if you just ignore the underlying hardware, while on the other hand you can write code that runs quite well by being a little respectful. Unfortunately, a quick perusal of most webpages suggests that ignorance is bliss for most developers.
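To make that concrete in Python terms (a toy sketch; exact numbers vary by machine): both passes below compute the same total over the same array, but one reads memory in the order it's laid out while the other strides across it, and the cache punishes the difference.

    import time
    import numpy as np

    a = np.random.rand(4000, 4000)  # C order: each row is contiguous in memory

    t0 = time.perf_counter()
    by_rows = sum(float(a[i, :].sum()) for i in range(a.shape[0]))  # sequential reads
    t1 = time.perf_counter()
    by_cols = sum(float(a[:, j].sum()) for j in range(a.shape[1]))  # strided reads
    t2 = time.perf_counter()

    print(f"rows: {t1 - t0:.3f}s  cols: {t2 - t1:.3f}s")  # column pass is typically slower

Same arithmetic, same result, very different behavior once you remember there's actual hardware underneath.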
Re:Not this again (Score:5, Interesting)
My take is that today, too many coders cannot do low-level coding (C, some assembler) at all. And hence they fanatically cheer for any perceived solution to their incompetence.
Of course, this is not a problem technology can solve. You either understand the hardware and code for it, or you do not and lose performance. There really is no way around that until we get AGI. If we ever get it and if it then wants to work for us in the first place.
Re: (Score:3)
My take is that today, too many coders cannot do low-level coding (C, some assembler) at all
Job security FTW!
Re: (Score:2)
Nobody is designing complex modern applications in pure C. You're either using frameworks by coding in a language that provides them, or you're building your own inner platform from scratch every time you start a new project. Layers of abstraction are necessary to prevent you spending all day reinventing the wheel.
Greenspun's 10th rule (Score:2)
https://en.wikipedia.org/wiki/... [wikipedia.org]
The only thing is that Common Lisp offers high computational performance; Python, not so much.
Re: (Score:2)
I'm by no means awesome, but I can write C and assembler (or rather, did a lot of it in my yoof), now I write some Python - and man oh man, some stuff is painfully slow in Python. I don't know if that's an inherent problem with high level languages, but I think Python has been a particularly bad solution to any sort of performance critical code (I believe latest versions are better - they may be, I don't know).
I also used to write quite a bit of Perl - that too had its issues, but there, somehow I never fel
Re: (Score:2)
Python is very nice as glue code and very nice for non-performance-critical stuff. For any heavy lifting, it is just not the tool to use. But embedding C is easy (I have done it for the high-performance parts of a simulator), and as glue, Python gives you excellent flexibility and still somewhat reasonable performance. I agree that Perl is easier for a lot of simple things, but have you ever tried to embed C in Perl? It can be done, but I gave up on it pretty fast.
Re: (Score:2)
One selling point for Mojo is it is easier to debug the system if you don't have to cross between worlds. Their argument is you can still use an existing performance critical C library for one task but you can develop a new performance critical library in Mojo for another task and the latter would be superior in terms of software maintenance. I kinda buy that.
Re: (Score:3)
When was the last time you wrote optimized assembler? For me it's been over 4 decades. These days compilers not only do a better job, but the source code is more portable.
Whether one *should* write the code at a lower level is a problem-dependent variable, and technology can definitely shift where on that curve the answer lands. Currently I have a piece of code that I'm writing in both C++ and Python, trying to decide which is the most appropriate. (C's out because I use too many data structures that aren't portable
Re: (Score:2)
Of course, this is not a problem technology can solve. You either understand the hardware and code for it, or you do not and lose performance.
Or you outsource the low-level coding to someone else who is more knowledgable than you, in the form of using a library. Numpy would be the canonical example of that approach in the Python world.
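A minimal sketch of that division of labor (illustrative only): the loop pays Python's per-iteration interpreter overhead, while the numpy call hands the identical computation to compiled C.

    import numpy as np

    xs = np.arange(1_000_000, dtype=np.float64)

    # Interpreted loop: per-element bytecode dispatch and boxing
    total = 0.0
    for x in xs:
        total += x * x

    # Same sum of squares, delegated to numpy's compiled kernels
    total_np = float(np.dot(xs, xs))

    print(total, total_np)  # same answer, wildly different speed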
Re: (Score:2)
> too many coders cannot do low-level coding (C, some assembler) at all
C is essentially useless for the tasks this system is aimed at. While one can, of course, write the same code in C, doing so would require an enormous amount of said code, and it would perform at about the same speed. So lots of work for little benefit.
The idea of using a language to reduce the coding workload while reaping the benefits of optimizations you may not even be aware of is the reason computer languages exist.
> You either understand
Re: (Score:2)
The fact that so much of AI is based on Python rather proves this point, and a system that can improve the performance of those exact programs without inflicting any additional programming overhead is a godsend.
The reason that so much AI is built in Python has absolutely nothing to do with technical details at all. The reason Python is used in AI is because there are so few actual ML people in the world. They all get hired by about 5 companies. Those 5 companies haven't used Python in almost a decade at this point. The other companies want that AI thing, so they hire Physicists to do ML (which is weird because ML is based upon entirely different stats and math than Physics). The only language Physicists know
Re: (Score:2)
The fact that so much of AI is based on Python rather proves this point,
I'll wager none of the heavy lifting in AI is done in actual Python. It's just too bloody slow.
Re: (Score:2)
My take is that today, too many coders cannot do low-level coding (C, some assembler) at all.
If you are right about that - and I defer to your experience - it's probably just the market working as it is supposed to. In programming as in most other lines of work, "more means worse". (That is, with more programmers overall the average performance will be worse, although the best individuals may be better).
If programmers are hired according to very strict requirements for high-level expertise, they cannot be blamed for not taking precious time to learn about hardware architecture. Such knowledge might
Re: (Score:2)
Probably. There is certainly a lot of really stupid hiring done in the IT space.
Re: (Score:2)
SQL? The Simple Query Language is too hard for these people? (Yes, I know the "S" is "Structured".)
Strikes me as if such people have no business programming anything.
Re: (Score:2)
I disagree. Any competent programmer should be able to master at least adequate skills in SQL. Yes, I know, the execution model is a bit different, but anybody that cannot master this (or functional programming, come to think of it) is just not a competent programmer. I do understand if somebody struggles with PROLOG, but functional or SQL? Come on!
I do agree that we have a ton of _incompetent_ programmers out there and that needs to stop.
Re: (Score:2)
SQL is hard, no doubt about it. I'd argue the benefits are vast though. Not only for performance but also data security. If your interface to the data is SPs (or perhaps SELECTs on views), there are many security pitfalls, especially when dealing with attribute-level authz, that can be avoided: if you can't get the data, you can't leak it. Not only does relational algebra make it easy (well, easy-ish) to reason about, it's also pretty trivially testable. It's rare that I walk into a situation where muc
Re: (Score:2)
You took my comment and ran with it without even realizing what I was trying to say: I firmly believe the ideal is to use the right tool for each job, while the new generation tries to use the same thing (Hibernate, for example, which I think is pathetically and ridiculously bad) for everything that comes their way, even when it is not a good idea.
Re: (Score:2)
Especially when you can't really do that.
For simple queries or data entry, anyone can use SELECT foo, bar FROM baz WHERE baf = something. No need to really set up another abstraction layer. Unless you consider views to be such - anyway, it's so simple you hardly need to put anything in between.
I've not really seen someone doing a proper abstraction layer for things like conditional aggregations, not to mention CTEs.
Then again, I hear that SQL injections are still a thing...
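They shouldn't be, because the fix is a one-liner. A small sqlite3 sketch (table and payload invented for illustration):

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE baz (foo TEXT, bar TEXT, baf TEXT)")
    conn.execute("INSERT INTO baz VALUES ('a', 'b', 'something')")

    user_input = "nothing' OR '1'='1"  # classic injection payload

    # Unsafe: interpolating input lets the payload rewrite the query
    rows_bad = conn.execute(
        f"SELECT foo, bar FROM baz WHERE baf = '{user_input}'").fetchall()

    # Safe: a placeholder keeps the value out of the SQL text entirely
    rows_ok = conn.execute(
        "SELECT foo, bar FROM baz WHERE baf = ?", (user_input,)).fetchall()

    print(rows_bad)  # [('a', 'b')] -- the payload matched every row
    print(rows_ok)   # [] -- the payload was treated as a literal string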
Re: (Score:2)
Those who don't know SQL are doomed to recreate it, poorly. For the life of me I can't understand so-called developers who aggregate by hand in the application code. Alas.
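Side by side, the hand-rolled version and the SQL version (a sqlite3 sketch with a made-up orders table):

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, total REAL);
        INSERT INTO orders (customer, total)
            VALUES ('alice', 10.0), ('alice', 5.0), ('bob', 7.5);
    """)

    # Recreating SQL, poorly: drag every row over and aggregate by hand
    sums = {}
    for customer, total in conn.execute("SELECT customer, total FROM orders"):
        sums[customer] = sums.get(customer, 0.0) + total

    # Let the database do it: one statement, one aggregation
    sums_sql = dict(conn.execute(
        "SELECT customer, SUM(total) FROM orders GROUP BY customer"))

    print(sums == sums_sql)  # True

On a toy table nobody cares; on a big one, the first version ships the whole table across the wire.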
Re: (Score:2)
Preach it!
I've spent a lot of my career fixing problems with "software" built or maintained by "developers" with no understanding of data.
Like getting a list of all row IDs, grabbing one row at a time across a high-latency network, hydrating child objects that aren't even needed, until as much as an entire multi-terabyte table, not even in 1NF, is loaded: 10% in memory and 90% in swap or tempfiles. Then iterating through it to pull just a handful of those rows. Taking hours to do what a WHERE and a JOIN c
Re: (Score:2)
unless they can be bothered to learn a little bit of basic SQL and normalization.
And generate and interpret a query plan. Most developers these days don't even know what that is...
Re: (Score:2)
Query plan? Wuzz dat? :)
(Seriously: I do this from time to time if I get less performance from the DB server than expected . . . however, most of the time, any modern query optimizer does a pretty good job, and, when I see bad query plans, it's usually because there is an underlying problem like missing index, VACUUM or DBCC needed, excessively large intermediate sets, or, sometimes, someone else's bad queries taking all available IO, or something of that sort.)
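For anyone who's never looked at one: generating a plan costs a single keyword. A sqlite3 sketch; the detail strings vary by SQLite version:

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, baf TEXT)")

    # Without an index, the plan is a full table scan
    for row in conn.execute(
            "EXPLAIN QUERY PLAN SELECT id FROM t WHERE baf = ?", ("x",)):
        print(row)  # detail reads something like 'SCAN t'

    conn.execute("CREATE INDEX idx_t_baf ON t(baf)")

    # With the index, the optimizer switches to an index search
    for row in conn.execute(
            "EXPLAIN QUERY PLAN SELECT id FROM t WHERE baf = ?", ("x",)):
        print(row)  # ... 'SEARCH t USING INDEX idx_t_baf (baf=?)'

The missing-index case is exactly the kind of underlying problem described above.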
Re: (Score:2)
Bad programmers write bad programs. No surprise there.
What you're missing is that when you tweak even a decent number of database queries to get just the right columns for the task at hand, you end up with a sprawl of data models in your code that all map back to the same columns. Composite models, where the properties come from multiple tables because the underlying query has a join, compound this. Tracking what database change will affect what code becomes a truly cumbersome process, and updating data that
Re: (Score:2, Insightful)
The new generation seems to be unwilling (or unable) to understand the basics of how a computer works. As an example, they try to invent bizarre and extremely inefficient ways to abstract SQL
SQL is not how a computer works, it's how SQL works
Re: (Score:2)
The new generation seems to be unwilling (or unable) to understand the basics of how a computer works. As an example, they try to invent bizarre and extremely inefficient ways to abstract SQL
SQL is not how a computer works, it's how SQL works
You are a really good example of why software professionals need a real CS degree. If you don't understand the role relational algebra plays in how software is designed and developed, then you probably are terrible at programming...you just don't know it.
Re: (Score:3)
What I wonder, is why not contribute to Julia instead. It seems Julia does what Mojo aims to do, and has a nicer paradigm/syntax.
Business reasons, of course -- Mojo's developers want it to be widely adopted, and leveraging Python's large existing user base is the fast way to achieve that.
Of course there's no reason that the same techniques couldn't be used in other languages as well (although perhaps other languages that are already more efficient than Python won't see as big of a benefit, simply because they have less of a performance problem to remedy)
Re: (Score:2)
Creating low-level implementations of a model for inference and distributed training is error-prone regardless of whether you know the basics of GPU programming.
As the article says, it's not really pure Python that the other solutions can compile to, say, CUDA either. It's already a restricted subset.
Re: (Score:3)
If you're doing serious AI research this should not be beyond you - it's pretty simple stuff.
True, but I don't think "serious AI researchers" are the target for this language. It's more like "the other 99% of the world who doesn't know all that much about AI but still wants to incorporate it into their app and get good performance out of it".
Re: (Score:2)
With the terrible dynamic typing and god-awful, inconsistent, "VB5"-style syntax ("str", "len", etc.) I thought Python *was* a 1980s language?
And don't get me started on how brain-dead the whitespace indenting is. There's a reason the Goddess gave humans curly braces and semicolons. They make it bloody obvious where code lines/blocks start and end.
Having said that, it's a really useful language, as at least it's reasonably cross-platform and quick to develop in. It's just a shame it's so fugly to look at!
Re: (Score:2)
Do you have an example of a short Python program and a short JavaScript program where knowledge of the hardware below it improves its performance?
How do you "port" it to other hardware?
Re: (Score:2)
No, it does not really count.
The exact same would have happened in any language.
So it is definitely not a JavaScript fault that could be avoided by knowing more about the system/hardware.
but it is below the operating knowledge of even most C++ programmers so I think it counts
Exactly: the language one uses is irrelevant. And the idea that a C/C++ programmer automatically, somehow magically, knows something about the hardware that someone else does not is nonsense. You only know what you learned or looked up. I
Re: (Score:2)
Exactly: the language one uses is irrelevant. And the idea that a C/C++ programmer automatically, somehow magically, knows something about the hardware that someone else does not is nonsense
Tell me you don't know C/C++ without telling me you don't know C/C++. I assure you, you need to know more about how the hardware works in a language with raw pointers than in a JVM or other GC language. Also, the switch to kernel space is an entirely hardware driven thing as is the mechanism with which you access the clock to get a timestamp. You can have an OS which runs entirely in userland and that would change the timing of this call. And how the hardware handled switches between processes would abs
Re: (Score:2)
I'm not sure why you're so upset by being given alternatives. Isn't a larger ecosystem better?
Crappy, well-marketed alternatives are how we got such a terrible language into widespread use in the first place. Alternatives aren't always better. Alternatives from Google are rarely better. This alternative is likely reinventing the wheel multiple times. If you want a good language for writing GPU code in, Futhark is far better than whatever this team will make. Those stats packages needed for ML, the ones used in Python, are already written in C and can be integrated into any language. Python is by
Again with this shit? (Score:4, Insightful)
Based on their record, I cannot believe it is simply incompetence.
https://developers.slashdot.org/story/23/05/07/0544230/swift-creators-company-builds-new-programming-language-mojo---a-python-superset
Re: (Score:2)
How much are they bribing the editors?
bribes! why so dramatic? it's just regular advertising, it has been their business model since like forever ...
you can promote anything on /. too, it's not illegal. just hit the "advertising" linky at the bottom of the page to get a quote.
Great. Yet another PL to join ... (Score:2)
... the pretentious crowd of Haskell, Scheme, Lisp, Clojure, Scala, Elixir, etc. "Algebraic Software Development" / purely functional snobs who celebrate meetups specifically held to intellectually masturbate and make IT regulars feel notably dumber than they thought they were.
Nope, this PL is going nowhere and will vanish into obscurity faster than seasonal fashion.
Re:Great. Yet another PL to join ... (Score:5, Interesting)
> Nope, this PL is going nowhere
Yeah totally, just look at how everything else Chris has worked on has failed.
You know, LLVM, Clang, Swift, the OpenGL JIT, LLDB, RISC-V, TensorFlow.
Oh, they are the most-used products in all of their niches, you say?
Jumping on the latest fad I see (Score:2, Interesting)
I bet there's nothing "AI" about their language. They're working on a python alternative and marketing changed direction.
Re: (Score:3)
I bet there's nothing "AI" about their language.
Well, you bet wrong. Regardless of whether the language is any good, it's clearly aimed primarily at the niche that's mostly filled with Python and PyTorch.
Dog-food? (Score:3)
Re: (Score:2, Troll)
All nouns can be verbed, all verbs can be nouned. But dogfooding has been a well-known term in the tech world for decades now, so my only question is, what are the property taxes like under that rock? And I guess followup, are the neighbors nice?
Re: (Score:2)
instead of being a sanctimonious cunt, you could have simply explained what "dogfooding" is, but no you are just being a cunt.
Instead of defending people playing stupid on a site for nerds you could expect them to know how to search the web for the things they're ignorant about, but you are just being an enabler of willful ignorance. Thanks for making Slashdot grate.
Re: (Score:2)
> The dog-food concept (consume your own product) has been around in different forms for a while
1988. Likely before the OP was born, so it's not new.
Or the 1970s if you use it in the original sense.
Re: (Score:2)
And I tend to be annoyed by this verbification trend.
Re: (Score:2)
When did dog-food become a verb and what am I to make of it?
You'll just have to use the term yourself for a while and see if you like it or not!
This Headline May Be the Most Exaggerated In Decad (Score:2)
what a load of... hype (Score:2, Insightful)
oh man... if you think that's something, imagine the company was more than one year old and the whole company was working on whatever the heck this is lol
got so distracted by the buzzwords that I lost interest within the first sentence, slashvertisement at its best. I vaguely remember seeing the word python
Judging from previous posts, I guess I am not the only one that felt annoyed
Vaporware (Score:5, Informative)
"The Mojo standard library, compiler, and runtime are not available for local development yet".
There's literally nothing you can download and try. They have a very limited online playground and a long list of things that don't work [modular.com].
At this point it's impossible to tell for yourself if it's a promising new language in early alpha or just Python but broken.
Re: Vaporware (Score:2)
The developer does have quite the history of success. He has accomplished things much more complex, so I would give him the benefit of the doubt.
Re: (Score:2)
We are talking about the Father of LLVM (or co-father) here. You might as well call K&R as developers who just worked at Bell Labs "and got a lot of projects on their CVs".
LLVM is the foundation of Rust, Julia, Numba etc. Swift was also his project. So people take him seriously unlike our pet projects.
Re: (Score:2)
No. At this point it's NOT a promising new language. It may eventually become so. Perhaps. Using an emoji as a file extension is not a promising sign.
good advice for this ad writer (Score:3)
Just don't cross the line from ads into propaganda, because the latter drives people away (like shoveling a certain reddish language down our throats for some time now)
Oh, and a wiki page would've been in order, because - you know - "it's such a revolution in computing"
Programming language du jour (Score:2)
Just as LLVM made it dramatically easier for powerful new programming languages to be developed
Edit out the word "powerful" and that quote pretty much sums up the recent explosion of "new" programming languages. New is in quotes because most of these are just variations on a theme. The developers of Mojo seem to be actually proud of the fact that they are attempting to ride Python's coattails.
Why so modest? (Score:2, Funny)
Why, oh, why ... (Score:2)
did that video have to have that annoying noise in the background ? It was distracting. It did not add anything. Did the authors think that viewers would get bored and switch off without that crap sound ?
Is mojo a superset of python or not??? (Score:2)
Then I looked at Mojo's "sharp edges" doc, and I see all these Python things that are not supported in the Mojo runtime, as I understand what they were saying.
So which is it? Strict superset (syntactically and semantically) or not? It's kind of important to know, with regard to feasibility of adapting existing pyth
History of programming languages (Score:2)
Any time they found something didn't quite work great as they developed Mojo, they could add a needed feature to Mojo itself to make it easier for them to develop the next bit of Mojo!
LISP did that. Smalltalk did that. C did that. C++ did that. Hell, C++ is the dominant language of compilers these days, with every major language getting a frontend on Clang and GCC. Most compilers aim for self-hosting as a goal.
This is like the "Uber startup disruptor" of PR. Acting like they have invented a new practice, when it's been done for decades.
Awesome! (Score:2)
Their full launch video [youtube.com] is even flashier and made me laugh out loud, saying "[there's] a crazy complex and fragmented ecosystem where developers are constantly brute forcing everything[....] this is slowing down everyone's ability to innovate!" Because if there's one problem with AI today, it's that the tech
No Mojo (Score:2)
When a headline says it "may be" something, that means it's not.
Re: (Score:2)
That is what extensions in C are for.
Writing C extensions sucks. It's certainly doable, but it's not a smooth process by any measure.
(Or C++ if you want another hyped language that underdelivers.)
There's a reason all those C compilers are written in C++, not C.
Re: (Score:2)
Ctypes simplifies that process considerably. (Or JNA if Java is your thing.)
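A minimal sketch of the ctypes route, assuming a hypothetical libsim.so built from C that exports double step(double dt):

    import ctypes

    # Built beforehand with something like: cc -shared -O2 -o libsim.so sim.c
    lib = ctypes.CDLL("./libsim.so")       # assumed library name and path
    lib.step.argtypes = [ctypes.c_double]  # declare the C signature up front
    lib.step.restype = ctypes.c_double

    print(lib.step(0.01))  # calls straight into the compiled C code

No extension module, no compiling against Python headers; the interpreter just loads the shared library at runtime.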
Re: (Score:3)
But you don't even need to go the C/C++ route.
In many places, rewriting "loopy" code to numpy solves the problem most of the time.
In other places numba might fit fine, or cython.
For machine learning, there are already well established frameworks (pytorch, jax, tf, to name a few) with plenty of existing code researchers can leverage from.
meh.
I've used these things. I rarely use numpy because I think pytorch is basically lik
Re: (Score:3)
All kinds of code is still written for Linux in C and C++. Usually badly. And, I'm convinced, usually because of inertia and/or lack of robust bindings to higher-level languages.
Anything that doesn't need to run close to the metal doesn't need these. A kernel, Web browser, or database engine probably does need to run close to the metal, but the vast majority of code does not.
Like most developers, I work on bespoke custom line-of-business apps that spend almost 100.00% of their time waiting for a network
Re: (Score:2)
> requires every computer "look like" a fucking PDP-11 underneath
Hate to break it to you, but that's "fucking PDP-7 underneath".
C's string model with the trailing null exists because the PDP-7 was an 18-bit machine with 6-bit characters, so any string with a non-modulo-3 number of chars was padded with zero in the lower... I think they still called it a byte even though it was 6 bits. So 2 of 3 strings were zero-padded anyway. The PDP-8 also had a single op to compare the bytes of a word to zero, so you can e
Re: (Score:2)
For a PDP-7 they probably called it chars. I think bytes came in with the IBM 360, and wasn't the PDP-7 around before that?
Re: (Score:2)
No, bytes did not mean "eight bits" until the late 1960s, maybe even early 1970s. Before that it just meant "the natural sub-word". In the UK they apparently called this a "syllable", which is a good term for it, but not living in the UK I can't confirm this.
Re: (Score:2)
Au contraire. Idiots like _you_ are part of the problem. C is not only still taught today, it is essential for many things. The reason why many people do not like C is simple: it makes coder incompetence quite obvious. There is a reason why _any_ established engineering discipline teaches its students the very basics. Only in CS/IT/Software Engineering are there numerous idiots who think that is not necessary.
Re: (Score:2)
Go-style concurrency isn't necessarily the best. I prefer data-flow. And there's nothing really wrong with Python syntax. But I don't really expect this language to go anywhere. The github page is full of promises but no source, and the company web-site is so bad it's nearly unreadable, and definitely uninformative. I didn't even find out what the license is like.