Programming Python

'Mojo May Be the Biggest Programming Language Advance In Decades' (www.fast.ai) 126

Mojo is a new programming language developed by Modular that aims to address the performance and deployment limitations of Python in areas like AI model development. After demoing Mojo prior to its launch, Jeremy Howard from the non-profit research group fast.ai said it feels like coding will never be the same again. Here's an excerpt from Howard's article: Modular is a fairly small startup that's only a year old, and only one part of the company is working on the Mojo language. Mojo development was only started recently. It's a small team, working for a short time, so how have they done so much? The key is that Mojo builds on some really powerful foundations. Very few software projects I've seen spend enough time building the right foundations, and as a result they tend to accrue mounds of technical debt. Over time, it becomes harder and harder to add features and fix bugs. In a well designed system, however, every feature is easier to add than the last one, is faster, and has fewer bugs, because the foundations each feature builds upon are getting better and better. Mojo is a well designed system.

At its core is MLIR (Multi-Level Intermediate Representation), which has already been developed for many years, initially kicked off by Chris Lattner at Google. He had recognized what the core foundations for an "AI era programming language" would need, and focused on building them. MLIR was a key piece. Just as LLVM made it dramatically easier for powerful new programming languages to be developed over the last decade (such as Rust, Julia, and Swift, which are all based on LLVM), MLIR provides an even more powerful core to languages that are built on it. Another key enabler of Mojo's rapid development is the decision to use Python as the syntax. Developing and iterating on syntax is one of the most error-prone, complex, and controversial parts of the development of a language. By simply outsourcing that to an existing language (which also happens to be the most widely used language today) that whole piece disappears! The relatively small number of new bits of syntax needed on top of Python then largely fit quite naturally, since the base is already in place.

The next step was to create a minimal Pythonic way to call MLIR directly. That wasn't a big job at all, but it was all that was needed to then create all of Mojo on top of that -- and work directly in Mojo for everything else. That meant that the Mojo devs were able to "dog-food" Mojo when writing Mojo, nearly from the very start. Any time they found something didn't quite work great as they developed Mojo, they could add a needed feature to Mojo itself to make it easier for them to develop the next bit of Mojo!
You can give Mojo a try here.
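To make the "Python as syntax" point concrete: the function below is ordinary Python, the kind of loop-heavy numeric kernel featured in Modular's launch demos, and since Mojo is pitched as a Python superset, code in this style is intended to run unchanged, with Mojo's typed, systems-level additions layered on top. This is an illustrative sketch, not code from the article:

    # Plain Python: the style of numeric loop Mojo claims to accept as-is
    # and to speed up dramatically once types are added.
    def matmul_naive(a, b):
        """Naive matrix multiply over nested lists."""
        n, m, p = len(a), len(b), len(b[0])
        c = [[0.0] * p for _ in range(n)]
        for i in range(n):
            for k in range(m):
                for j in range(p):
                    c[i][j] += a[i][k] * b[k][j]
        return c

    print(matmul_naive([[1, 2], [3, 4]], [[5, 6], [7, 8]]))  # [[19.0, 22.0], [43.0, 50.0]]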
This discussion has been archived. No new comments can be posted.

  • by Anonymous Coward on Wednesday May 17, 2023 @05:13AM (#63528023)

    How about you kids go duke it out with the rust kids, eh?

  • So it's Python ... (Score:4, Insightful)

    by JasterBobaMereel ( 1102861 ) on Wednesday May 17, 2023 @05:15AM (#63528025)

    and a lot of current buzzwords - Whoop de do ...

    • Does it rely on whitespace?

    • by HiThere ( 15173 )

      IIUC, it's using the LLVM compiler and has a few changes to make it friendlier to optimizers. I've no idea how it would compare against Pypy or Cython.

      • > IIUC, it's using the LLVM compiler

        No. As it says right in the article, it uses a new intermediate representation, and most of the work was building a compiler for that. What it does *not* say in the summary is that the intermediate in question is designed to run AI-like massive threading tasks, as opposed to the finely tuned, smaller, application-like code that LLVM specializes in.

        > I've no idea how it would compare against Pypy or Cython

        Apparently it crushes them, at least according to the people who have reviewed it.

  • Not this again (Score:5, Insightful)

    by monkeyxpress ( 4016725 ) on Wednesday May 17, 2023 @05:42AM (#63528063)

    If you read the full article, it's essentially just a solution to Python having poor performance.

    I mean, okay, but being able to write in a high-level language yet have the performance of a low-level one has been the pipe dream of language development for decades now. Ultimately, unless there is some step change in computing power, you just can't abstract away the underlying von Neumann architecture and then act shocked when that architecture chokes on some high-level code sequence that hammers dynamic memory or pipelines, etc.

    In many respects, the reason Python is so successful now is that we have very powerful machines compared to most of the tasks it is used for. In the 1980s a language like Python would have required very careful use to avoid a single overly ambitious line destroying performance or consuming all the memory. Programmers had to understand the hardware to do most useful things.

    Today, AI is at the bleeding edge of hardware capabilities, so really you just need to have someone who has a bit of an idea of how the hardware is working. Trying to 'solve' this problem of ignorance with fancy solutions is unlikely to work that well, and ultimately, I just don't understand why it's so hard for someone to learn the basics of CPU/GPU architecture. If you're doing serious AI research this should not be beyond you - it's pretty simple stuff.

    I see the same ignorance in a lot of Javascript. It is possible to absolutely destroy a modern CPU doing something very simple if you just ignore the underlying hardware, while on the other hand you can write code that runs quite well by being a little respectful. Unfortunately, a quick perusal of most webpages suggests that ignorance is bliss for most developers.
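    (To make the "single overly ambitious line" point concrete, here is a small illustrative Python sketch, not taken from the thread; the first loop looks harmless but is quadratic, because one innocuous line rebuilds the list on every iteration.)

      import time

      N = 20_000

      # One overly ambitious line: rebuilds the whole list each pass, O(n^2).
      t0 = time.perf_counter()
      slow = []
      for i in range(N):
          slow = slow + [i]
      t_slow = time.perf_counter() - t0

      # The boring version: amortized O(1) per iteration.
      t0 = time.perf_counter()
      fast = []
      for i in range(N):
          fast.append(i)
      t_fast = time.perf_counter() - t0

      print(f"rebuild: {t_slow:.3f}s  append: {t_fast:.4f}s")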

    • Re:Not this again (Score:5, Interesting)

      by gweihir ( 88907 ) on Wednesday May 17, 2023 @06:02AM (#63528075)

      My take is that today, too many coders cannot do low-level coding (C, some assembler) at all. And hence they fanatically cheer for any perceived solution to their incompetence.

      Of course, this is not a problem technology can solve. You either understand the hardware and code for it, or you do not and lose performance. There really is no way around that until we get AGI. If we ever get it and if it then wants to work for us in the first place.

      • My take is that today, too many coders cannot do low-level coding (C, some assembler) at all

        Job security FTW!

      • Nobody is designing complex modern applications in pure C. You're either using frameworks by coding in a language that provides them or you're building your own inner platform from scratch every time you start a new project. Layers of abstraction are necessary to keep you from spending all day reinventing the wheel.

        • https://en.wikipedia.org/wiki/... [wikipedia.org]

          The only thing is that Common Lisp offers high computational performance; Python, not so much.

        • You are incorrect for any reasonable definition of "designing", "complex", and "pure". Bonus points for saying the same things I heard 20 years ago at Uni. The C language is still widely used for many things and layers of abstraction are available that don't include whatever sugar-syntax you might be jonesing over today. I know, I interact with a couple and even have to help out with one.
      • I'm by no means awesome, but I can write C and assembler (or rather, did a lot of it in my yoof), now I write some Python - and man oh man, some stuff is painfully slow in Python. I don't know if that's an inherent problem with high level languages, but I think Python has been a particularly bad solution to any sort of performance critical code (I believe latest versions are better - they may be, I don't know).

        I also used to write quite a bit of Perl - that too had its issues, but there, somehow I never fel

        • by gweihir ( 88907 )

          Python is very nice glue code and very nice for not performance critical stuff. For any heavy lifting, it is just not the tool to use. But embedding C is easy (Have done if for the high-performance parts of a simulator) and as a glue, Python gives you excellent flexibility and still somewhat reasonable performance. I agree that Perl is easier for a lot of simple things, but have you ever tried to embed C in Perl? It can be done, but I gave up on it pretty fast.
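          (For the record, the usual zero-compilation route for embedding C in Python is ctypes from the standard library. A minimal illustrative sketch, with the caveat that library name resolution is platform-dependent:)

            import ctypes
            import ctypes.util

            # Locate and load the C math library; the name varies by platform.
            libm = ctypes.CDLL(ctypes.util.find_library("m"))

            # Declare the C signature before calling: double cos(double).
            libm.cos.argtypes = [ctypes.c_double]
            libm.cos.restype = ctypes.c_double

            print(libm.cos(0.0))  # 1.0, computed by C code, not Python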

          • One selling point for Mojo is it is easier to debug the system if you don't have to cross between worlds. Their argument is you can still use an existing performance critical C library for one task but you can develop a new performance critical library in Mojo for another task and the latter would be superior in terms of software maintenance. I kinda buy that.

      • by HiThere ( 15173 )

        When was the last time you wrote optimized assembler? For me it's over 4 decades. These days compilers not only do a better job, but the source code is more portable.

        Whether one *should* write the code at a lower level is a problem-dependent variable, and technology can definitely shift where on the curve that answer lies. Currently I have a piece of code that I'm writing in both C++ and Python, trying to decide which is the most appropriate. (C's out because I use too many data structures that aren't portable

      • by Jeremi ( 14640 )

        Of course, this is not a problem technology can solve. You either understand the hardware and code for it, or you do not and lose performance.

        Or you outsource the low-level coding to someone else who is more knowledgable than you, in the form of using a library. Numpy would be the canonical example of that approach in the Python world.
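        (An illustrative sketch of that approach, not from the thread: the same reduction written as an interpreted Python loop and as a NumPy call that runs in compiled C.)

          import time
          import numpy as np

          n = 1_000_000

          # Pure Python: every multiply and add goes through the interpreter.
          t0 = time.perf_counter()
          total = sum(x * x for x in range(n))
          t_py = time.perf_counter() - t0

          # NumPy: the same sum of squares runs as compiled loops over a C array.
          arr = np.arange(n, dtype=np.int64)
          t0 = time.perf_counter()
          total_np = int(np.dot(arr, arr))
          t_np = time.perf_counter() - t0

          assert total == total_np
          print(f"python loop: {t_py:.3f}s  numpy: {t_np:.4f}s")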

      • > too many coders cannot do low-level coding (C, some assembler) at all

        C is essentially useless for the tasks this system is aimed at. While one can, of course, write the same code in C, doing so would require an enormous amount of said code, and it would perform at about the same speed. So lots of work for little benefit.

        The idea of using a language to reduce the coding workload while reaping the benefits of optimizations you may not even be aware of is the reason computer languages exist.

        > You either understand

        • by sfcat ( 872532 )

          The fact that so much of AI is based on Python rather proves this point, and a system that can improve the performance of those exact programs without inflicting any additional programming overhead is a godsend.

          The reason that so much AI is built in Python has absolutely nothing to do with technical details at all. The reason Python is used in AI is because there are so few actual ML people in the world. They all get hired by about 5 companies. Those 5 companies haven't used Python in almost a decade at this point. The other companies want that AI thing, so they hire Physicists to do ML (which is weird because ML is based upon entirely different stats and math than Physics). The only language Physicists know

        • by pz ( 113803 )

          The fact that so much of AI is based on Python rather proves this point,

          I'll wager none of the heavy lifting in AI is done in actual Python. It's just too bloody slow.

      • My take is that today, too many coders cannot do low-level coding (C, some assembler) at all.

        If you are right about that - and I defer to your experience - it's probably just the market working as it is supposed to. In programming as in most other lines of work, "more means worse". (That is, with more programmers overall the average performance will be worse, although the best individuals may be better).

        If programmers are hired according to very strict requirements for high-level expertise, they cannot be blamed for not taking precious time to learn about hardware architecture. Such knowledge might

    • I've been getting that impression too. The new generation seems to be unwilling (or unable) to understand the basics of how a computer works. As an example, they try to invent bizarre and extremely inefficient ways to abstract SQL when they could just learn SQL itself.
      • by gweihir ( 88907 )

        SQL? The Simple Query Language is too hard for these people? (Yes, I know the "S" is "Structured".)

        Strikes me as if such people have no business programming anything.

        • by sfcat ( 872532 )
          SQL isn't really that simple. It is more like pure functional programming than anything. Unless you are writing a stored proc, there are no variables. That really throws a lot of people. For all the hype that FP gets, there are very few programmers who can actually do it. Writing entire apps in SQL is even harder than that (this is something you can do). And BTW, that SQL code is probably 10x as fast as most code written in Turing-complete languages yet people still don't do it. That should tell you
          • by gweihir ( 88907 )

            I disagree. Any competent programmer should be able to master at least adequate skills in SQL. Yes, I know, the execution model is a bit different, but anybody that cannot master this (or functional programming, come to think of it) is just not a competent programmer. I do understand if somebody struggles with PROLOG, but functional or SQL? Come on!

            I do agree that we have a ton of _incompetent_ programmers out there and that needs to stop.

            • by sfcat ( 872532 )
              So I have written PROLOG professionally and it might be the only language of the dozen or so I use that is harder than SQL in a real apples to apples comparison. ATM, I use Scala (2, 3 is just Kotlin with bad support) and that is easy compared to SQL. As for your "anybody can master FP", I have to call BS on this one. Only about 10% of the programmers I have worked with over the last decade or so can do FP. About 30% fake it and the rest just say they can't do it. Writing entire apps in SQL is even ha
              • SQL is hard, no doubt about it. I'd argue the benefits are vast though. Not only for performance but also data security. If your interface to the data is SPs (or perhaps SELECTs on views), there are many security pitfalls, especially when dealing with attribute-level authz, that can be avoided: if you can't get the data you can't leak it. Not only does relational algebra make it easy (well, easy-ish) to reason about, it's also pretty trivially testable. It's rare that I walk into a situation where muc

                • by sfcat ( 872532 )
                  I couldn't agree more, well said. FP is pushed by programming language people largely because it makes the compiler far easier to write so more effort can be done on optimization. Relational algebra even more so. The problem is that it is really hard to work in such programming models and most programmers can't do it.
          • You don't have to do your entire application in SQL; another problem with the new generation is that they want to use the hammer for every problem, even where you should use the hacksaw or the pliers. Just not making bad mistakes like pulling everything and the kitchen sink from the database (instead of properly using SQL to pull only what you really need at the moment) would already help a lot.
            • by sfcat ( 872532 )
              First, my UID is lower than yours so I'm pretty sure that "problem with this generation" comment is out of line. Second, sure, but your application would likely run faster and be more stable if it were entirely in SQL. This topic is about comparing programming languages. You can do more in SQL than just Hibernate and ORMs. I know this is the extent of most people's experience with SQL but that is about 30% of the language at best. I know the hammer and nail analogy is popular so in the language of this an
              • You need to chill down. A lot.

                You took my comment and ran with it without even realizing that what I was trying to say was that I firmly believe that the ideal is to use the right tool for each job while the new generation tries to use the same thing (hybernate for example, which I think is pathetically and ridiculously bad) for everything that comes their way even when it is not a good idea.
                • by sfcat ( 872532 )
                  Hibernate (no y) is a fine ORM system. If that's what you need, its quite nice. If you need something else, then its not going to be very good at that. The problem is when that's all people know, that's all SQL gets used for and that's all of the language programmers learn. So popular opinions on the language become a bit slanted because nobody really knows it. Truth is SQL isn't a great language. What is great are languages based upon relational algebra and SQL is the only one of those with any real
      • by Zarhan ( 415465 )

        Especially when you can't really do that.

        For simple queries or data entry, anyone can use SELECT foo, bar FROM baz WHERE baf = something. No need to really set up another abstraction layer. Unless you consider views to be such - anyway, it's so simple you hardly need to put anything in between.

        I've not really seen someone doing a proper abstraction layer for things like conditional aggregations, not to mention CTEs.

        Then again, I hear that SQL injections are still a thing...

        • Those who don't know SQL are doomed to recreate it, poorly. For the life of me I can't understand so-called developers who aggregate by hand in the application code. Alas.

          • Preach it!

            I've spent a lot of my career fixing problems with "software" built or maintained by "developers" with no understanding of data.

            Like getting a list of all row IDs, grabbing one row at a time across a high-latency network, hydrating child objects that aren't even needed, until as much of an entire multi-terabyte table, not even in 1NF, is loaded, 10% in memory and 90% in swap or tempfiles. Then iterating through it to pull just a handful of those rows. Taking hours to do what a WHERE and a JOIN c

            • unless they can be bothered to learn a little bit of basic SQL and normalization.

              And generate and interpret a query plan. Most developers these days don't even know what that is...

              • Query plan? Wuzz dat? :)

                (Seriously: I do this from time to time if I get less performance from the DB server than expected . . . however, most of the time, any modern query optimizer does a pretty good job, and, when I see bad query plans, it's usually because there is an underlying problem like missing index, VACUUM or DBCC needed, excessively large intermediate sets, or, sometimes, someone else's bad queries taking all available IO, or something of that sort.)

                • This is another area where the query plan helps a lot: identifying a problem in your query, or some unexpected problem in the database itself, like a missing index (one you were sure you had put in).
                  • That happens from time to time, possibly because we lag behind a lot of other places in terms of automated migrations and deployments. Works great on the dev and QA boxes, but grinds the production server to a halt, and typically because we forgot to add an index.
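                    (For anyone who hasn't met one: a query plan is the strategy the optimizer picks, and most databases will show it to you. An illustrative sketch using Python's built-in sqlite3; the exact plan wording varies by SQLite version.)

                      import sqlite3

                      conn = sqlite3.connect(":memory:")
                      conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, email TEXT)")

                      query = "SELECT id FROM users WHERE email = ?"

                      # No index yet: the plan is a full table scan.
                      for row in conn.execute("EXPLAIN QUERY PLAN " + query, ("a@b.c",)):
                          print(row)  # e.g. (..., 'SCAN users')

                      conn.execute("CREATE INDEX idx_users_email ON users(email)")

                      # With the index: the planner switches to an index search.
                      for row in conn.execute("EXPLAIN QUERY PLAN " + query, ("a@b.c",)):
                          print(row)  # e.g. (..., 'SEARCH users USING INDEX idx_users_email (email=?)')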
            • by Jaime2 ( 824950 )

              Bad programmers write bad programs. No surprise there.

              What you're missing is that when you tweak even a decent number of database queries to get just the right columns for the task at hand, you end up with a sprawl of data models in your code that all map back to the same columns. Composite models, where the properties come from multiple tables because the underlying query has a join, compound this. Tracking which database change will affect which code becomes a truly cumbersome process, and updating data that

          • That's exactly right. And I've seen even worse things like code pulling data from the database with something like "SELECT * FROM table" (you read right, without any WHERE) and then doing the filters... in the code.
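            (An illustrative Python/sqlite3 sketch of that anti-pattern next to the set-based version; the table and column names are made up for the example.)

              import sqlite3

              conn = sqlite3.connect(":memory:")
              conn.execute("CREATE TABLE orders (id INTEGER, region TEXT, amount REAL)")
              conn.executemany(
                  "INSERT INTO orders VALUES (?, ?, ?)",
                  [(i, "EU" if i % 2 else "US", float(i)) for i in range(100_000)],
              )

              # Anti-pattern: drag every row into the application, filter and sum there.
              total = 0.0
              for _id, region, amount in conn.execute("SELECT * FROM orders"):
                  if region == "EU":
                      total += amount

              # Set-based: let the database do the filtering and the aggregation.
              (total_sql,) = conn.execute(
                  "SELECT SUM(amount) FROM orders WHERE region = 'EU'"
              ).fetchone()

              assert abs(total - total_sql) < 1e-6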
      • Re: (Score:2, Insightful)

        by Anonymous Coward

        The new generation seems to be unwilling (or unable) to understand the basics of how a computer works. As an example, they try to invent bizarre and extremely inefficient ways to abstract SQL

        SQL is not how a computer works, it's how SQL works

        • by sfcat ( 872532 )

          The new generation seems to be unwilling (or unable) to understand the basics of how a computer works. As an example, they try to invent bizarre and extremely inefficient ways to abstract SQL

          SQL is not how a computer works, it's how SQL works

          You are a really good example of why software professionals need a real CS degree. If you don't understand the role relational algebra plays in how software is designed and developed, then you probably are terrible at programming...you just don't know it.

    • What I wonder is why not contribute to Julia instead. It seems Julia does what Mojo aims to do, and has a nicer paradigm/syntax.
      • by Jeremi ( 14640 )

        What I wonder is why not contribute to Julia instead. It seems Julia does what Mojo aims to do, and has a nicer paradigm/syntax.

        Business reasons, of course -- Mojo's developers want it to be widely adopted, and leveraging Python's large existing user base is the fast way to achieve that.

        Of course there's no reason that the same techniques couldn't be used in other languages as well (although perhaps other languages that are already more efficient than Python won't see as big of a benefit, simply because they have less of a performance problem to remedy).

    • Creating low-level implementations of the model for inference and distributed training is error-prone even if you know the basics of GPU programming.

      As the article says, it's not really pure Python that the other solutions compile to, say, CUDA, either. It's already a restricted subset.

    • by Jeremi ( 14640 )

      If you're doing serious AI research this should not be beyond you - it's pretty simple stuff.

      True, but I don't think "serious AI researchers" are the target for this language. It's more like "the other 99% of the world who doesn't know all that much about AI but still wants to incorporate it into their app and get good performance out of it".

    • With the terrible dynamic typing and god-awful, inconsistent, "VB5"-style syntax ("str", "len", etc.), I thought Python *was* a 1980s language?

      And don't get me started on how brain-dead the whitespace indenting is. There's a reason the Goddess gave humans curly braces and semicolons. They make it bloody obvious where code lines/blocks start and end.

      Having said that, it's a really useful language, as at least it's reasonably cross-platform and quick to develop in. It's just a shame it's so fugly to look at!

    • Do you have an example of a short Python program and a short JavaScript program where knowledge of the hardware below it improves its performance?
      How do you "port" it to other hardware?

      • by sfcat ( 872532 )
        I once fixed a bug for another developer in JS. The problem was that he was looking up the system time inside of a loop. Since looking up system time is a syscall, it takes milliseconds to complete. If his inner loop ran 1000 times, the entire loop (which, because JS is single-threaded, locks the browser) took several seconds. We switched the lookup of system time to outside of the loop and it ran in a few ms instead. I had to know how getting system time worked and that it required a context switch int
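        (A Python analogue of that fix, purely illustrative; on modern systems a clock read is usually far cheaper than milliseconds thanks to vDSO-style fast paths, but hoisting a loop-invariant call out of the loop is the general cure either way.)

          import time

          N = 1_000_000
          deadline = time.time() + 60

          # Clock read inside the loop: one lookup per iteration.
          t0 = time.perf_counter()
          for _ in range(N):
              expired = time.time() > deadline
          t_inside = time.perf_counter() - t0

          # Hoisted: read the clock once, then compare against a local.
          t0 = time.perf_counter()
          now = time.time()
          for _ in range(N):
              expired = now > deadline
          t_hoisted = time.perf_counter() - t0

          print(f"inside: {t_inside:.3f}s  hoisted: {t_hoisted:.3f}s")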
        • No, it does not really count.
          The exact same would have happened in any language.
          So it is definitely not a JavaScript fault that could be avoided by knowing more about the system/hardware.

          > but it is below the operating knowledge of even most C++ programmers so I think it counts

          Exactly: the language one uses is irrelevant. And the idea that a C/C++ programmer automatically somehow magically knows something about the hardware someone else does not: is nonsense. You only know what you learned or looked up. I

          • by sfcat ( 872532 )

            Exactly: as the language one uses is irrelevant. And the idea that a C/C++ programmer automatically somehow magically knows something about the hardware someone else does not: is nonsense

            Tell me you don't know C/C++ without telling me you don't know C/C++. I assure you, you need to know more about how the hardware works in a language with raw pointers than in a JVM or other GC language. Also, the switch to kernel space is an entirely hardware driven thing as is the mechanism with which you access the clock to get a timestamp. You can have an OS which runs entirely in userland and that would change the timing of this call. And how the hardware handled switches between processes would abs

  • by greytree ( 7124971 ) on Wednesday May 17, 2023 @06:11AM (#63528095)
    How much are they bribing the editors?
    Based on their record, I cannot believe it is simply incompetence.

    https://developers.slashdot.org/story/23/05/07/0544230/swift-creators-company-builds-new-programming-language-mojo---a-python-superset
    • by znrt ( 2424692 )

      How much are they bribing the editors?

      bribes! why so dramatic? it's just regular advertising, it has been their business model since like forever ...

      you can promote anything on /. too, it's not illegal. just hit the "advertising" linky at the bottom of the page to get a quote.

  • ... the pretentious crowd of Haskell, Scheme, Lisp, Clojure, Scala, Elixir, etc. "Algebraic Software Development" / purely functional snobs, who celebrate meetups specifically held to intellectually masturbate and make IT regulars feel notably dumber than they thought they were.

    Nope, this PL is going nowhere and will vanish into obscurity faster than seasonal fashion.

    • Re: (Score:3, Informative)

      by dargaud ( 518470 )
      I think it's the opposite of the 'pure' crowd. They have vectorize and parallelize instructions to more closely match and use the underlying hardware. And also a system to auto-optimize code based on real during-execution benchmarks. Seems like a very pragmatic approach, something that Lisp never was.
    • by Maury Markowitz ( 452832 ) on Wednesday May 17, 2023 @10:02AM (#63528781) Homepage

      > Nope, this PL is going nowhere

      Yeah totally, just look at how everything else Chris has worked on has failed.

      You know, LLVM, Clang, Swift, the OpenGL JIT, LLDB, RISC-V, TensorFlow.

      Oh, they are the most-used products in all of their niches, you say?

      • by sfcat ( 872532 )
        Chris used those things, he didn't make (many) of those things. What the hell are you talking about? Some were developed before Chris was a professional programmer.
  • I bet there's nothing "AI" about their language. They're working on a python alternative and marketing changed direction.

    • I bet there's nothing "AI" about their language.

      Well, you bet wrong. Regardless of whether the language is any good, it's clearly aimed primarily at the niche that's mostly filled with Python and PyTorch.

      • by sfcat ( 872532 )
        Pytorch isn't a language. And Python is a terrible language for AI that gets used for ML entirely for accidental reasons that have absolutely nothing to do with the language or any technical aspect of any programming language. You can write ML in any language. AI folks kept FP around on life-support for 30 years. We didn't do that for fun. We did it because FP languages are nearly ideal for writing AI and ML code. Python isn't FP and doesn't have good typing. It is a terrible language for AI/ML and i
  • by methano ( 519830 ) on Wednesday May 17, 2023 @06:59AM (#63528165)
    When did dog-food become a verb and what am I to make of it?
    • Re: (Score:2, Troll)

      by drinkypoo ( 153816 )

      All nouns can be verbed, all verbs can be nouned. But dogfooding has been a well-known term in the tech world for decades now, so my only question is, what are the property taxes like under that rock? And I guess followup, are the neighbors nice?

      • by sfcat ( 872532 )
        It's called a gerund (a word with an -ing ending).
    • The dog-food concept (consume your own product) has been around in different forms for a while. I have seen it become more common in the last 5 years, probably due to the increase in Cloud offerings.
      • > The dog-food concept (consume your own product) has been around in different forms for a while

        1988. Likely before the OP was born, so it's not new.

        Or the 1970s if you use it in the original sense.

      • by methano ( 519830 )
        OP here. Thanks for the clarification. I'm a chemist and only a very part-time coder. This seems to be a bit of a mixed metaphor. I'm familiar with one eating his own cooking. The dog food metaphor is more often, in my mind, associated with what happens to your menu if you get old and run out of money.

        And I tend to be annoyed by this verbification trend.
    • by Jeremi ( 14640 )

      When did dog-food become a verb and what am I to make of it?

      You'll just have to use the term yourself for a while and see if you like it or not!

  • by youn ( 1516637 )

    oh man... if you think that's something, imagine the company was more than one year old and the whole company was working on whatever the heck this is lol

    got so distracted by the buzzwords that I lost interest within the first sentence, slashvertisement at its best. I vaguely remember seeing the word python

    Judging from previous posts, I guess I am not the only one that felt annoyed

  • Vaporware (Score:5, Informative)

    by SQL Error ( 16383 ) on Wednesday May 17, 2023 @08:14AM (#63528355)

    "The Mojo standard library, compiler, and runtime are not available for local development yet".

    There's literally nothing you can download and try. They have a very limited online playground and a long list of things that don't work [modular.com].

    At this point it's impossible to tell for yourself if it's a promising new language in early alpha or just Python but broken.

    • The developer does have quite the history of success. He has accomplished things much more complex, so I would give him the benefit of the doubt.

      • by sfcat ( 872532 )
        The developer worked at Google and got a lot of projects on his CV. That isn't the same thing.
        • by jma05 ( 897351 )

          We are talking about the Father of LLVM (or co-father) here. You might as well call K&R developers who just worked at Bell Labs "and got a lot of projects on their CVs".

          LLVM is the foundation of Rust, Julia, Numba, etc. Swift was also his project. So people take him seriously, unlike our pet projects.

    • by HiThere ( 15173 )

      No. At this point it's NOT a promising new language. It may eventually become so. Perhaps. Using an emoji as a file extension is not a promising sign.

  • by John Cavendish ( 6659408 ) on Wednesday May 17, 2023 @09:12AM (#63528557)

    Just don't cross the line of ads with propaganda - because the latter puts people off (like shoveling certain reddish language down our throats for some time now)
    Oh, and a wiki page would've been in order, because - you know - "it's such a revolution in computing"

  • Just as LLVM made it dramatically easier for powerful new programming languages to be developed

    Edit out the word "powerful" and that quote pretty much sums up the recent explosion of "new" programming languages. New is in quotes because most of these are just variations on a theme. The developers of Mojo seem to be actually proud of the fact that they are attempting to ride Python's coattails.

  • Just claim that it is the most significant development since the invention of fire.
  • Did that video have to have that annoying noise in the background? It was distracting. It did not add anything. Did the authors think that viewers would get bored and switch off without that crap sound?

  • I read an article on it that said it is, and if you don't want to use the new typesafe, high performance side of it, you can just stick to python syntax and it will all still work (albeit slow maybe).

    Then I looked at the Mojo "sharp edges" doc, and I see all these Python things that, as I understand it, are not supported in the Mojo runtime.

    So which is it? Strict superset (syntactically and semantically) or not? It's kind of important to know, with regard to feasibility of adapting existing pyth
  • Any time they found something didn't quite work great as they developed Mojo, they could add a needed feature to Mojo itself to make it easier for them to develop the next bit of Mojo!

    LISP did that. Smalltalk did that. C did that. C++ did that. Hell, C++ is the dominant language of compilers these days, with every major language getting a frontend on Clang and GCC. Most compilers aim for self-hosting as a goal.

    This is like the "Uber startup disruptor" of PR. Acting like they have invented a new practice, when it's been done for decades.

  • This seems like basically the same concept as the Halide DSL for C++ [halide-lang.org], with two huge improvements:
    1. Python, not C++!
    2. It's a true language, so its feature set can be expanded to various new areas.

    Their full launch video [youtube.com] is even flashier and made me laugh out loud, saying "[there's] a crazy complex and fragmented ecosystem where developers are constantly brute forcing everything[....] this is slowing down everyone's ability to innovate!" Because if there's one problem with AI today, it's that the tech

    • by Qwertie ( 797303 )

      ...and yes, I get that it's not ready to use yet, and that they are overselling it, and it's not even a superset of Python yet. But I've been following advanced languages for many years and never seen a language that was fast, easy-to-use, with strong metaprogramming AND low-level features that was funded or marketed this well, so I'm impressed! Plus I've long advocated new languages that are supersets of existing languages - hello TypeScript and Enhanced C# - so forgive my enthusiasm. I can see negativity

  • When a headline says it "may be" something, that means it's not.
