Deep Learning May Need a New Programming Language That's More Flexible Than Python, Facebook's Chief AI Scientist Says (venturebeat.com) 263

Deep learning may need a new programming language that's more flexible and easier to work with than Python, Facebook AI Research director Yann LeCun said today. From an interview: It's not yet clear if such a language is necessary, but the possibility runs against very entrenched desires from researchers and engineers, he said. LeCun has worked with neural networks since the 1980s. "There are several projects at Google, Facebook, and other places to kind of design such a compiled language that can be efficient for deep learning, but it's not clear at all that the community will follow, because people just want to use Python," LeCun said in a phone call with VentureBeat. "The question now is, is that a valid approach?" Further reading: Facebook joins Amazon and Google in AI chip race.

  • What we need is... (Score:3, Insightful)

    by Anonymous Coward on Monday February 18, 2019 @04:09PM (#58141620)

    What we need is less Facebook, less Google, less Amazon.

  • Easy (Score:5, Funny)

    by mobby_6kl ( 668092 ) on Monday February 18, 2019 @04:11PM (#58141638)

    Just use C++

    • Re:Easy (Score:4, Insightful)

      by Anonymous Coward on Monday February 18, 2019 @04:22PM (#58141708)

      True, but even C/C++ and Assembly don't provide an "easy" way to do threading, which is the issue.

      Scripting languages basically do not do threading of any kind. They're too slow to synchronize across threads, which makes spawning threads inside them fruitless.

      While C is ultimately the right language to do everything in (not C++), the real issue is that CPUs are expanding in cores just like GPUs have, yet GPUs have standardized more or less on just three APIs: OpenGL, Direct3D, and Vulkan. So if you can write a program against Vulkan, you have as close to bare hardware as you are going to get. But for CPUs, there are still 57 flavors of rubbish programming languages and no standard runtime that works for all of them; at best, most of these programming languages are still developed in C or C++ and thus require a complete C AND C++ runtime to function.
      Python is not written in Python. Java is not written in Java. If a language cannot compile itself, it's not flexible enough to be used for any of the three main cornerstones of software development: Operating Systems, Applications, and Games. While you certainly can write an application or game with a scripting language, it will be slow, it will be limited by the operating system's own libraries (e.g. 32-bit libraries on a 64-bit OS, as just one example) and generally require more maintenance than simply writing it in C to begin with.

      Memory overflow errors are caused by people learning programming languages like Java first instead of C, because if you learn C first, you then learn how to initialize memory, and how big memory chunks actually are.
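
      For what it's worth, a minimal C sketch of the explicit memory handling being described (the element count is made up): you say how many bytes you want, check that you got them, and initialize them yourself.

      #include <stdio.h>
      #include <stdlib.h>
      #include <string.h>

      int main(void) {
          size_t count = 1000;                      /* hypothetical element count */
          int *buf = malloc(count * sizeof *buf);   /* the size is explicit: count * sizeof(int) bytes */
          if (buf == NULL)
              return 1;                             /* allocation can fail; you have to check */
          memset(buf, 0, count * sizeof *buf);      /* malloc'd memory is uninitialized until you say otherwise */
          printf("allocated %zu bytes\n", count * sizeof *buf);
          free(buf);                                /* and every allocation needs a matching free */
          return 0;
      }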

      • Re:Easy (Score:5, Insightful)

        by thereddaikon ( 5795246 ) on Monday February 18, 2019 @04:46PM (#58141848)

        True, but even C/C++ and Assembly don't provide an "easy" way to do threading, which is the issue.

        Nothing is easy to multithread in, because multithreading anything more than the most basic of processes is inherently complex. This doesn't apply to SIMD structures, which is the main reason GPUs can be so parallel. One pixel doesn't care what color the other one is.

        While C is ultimately the right language to do everything in (not C++), the real issue is that CPUs are expanding in cores just like GPUs have, yet GPUs have standardized more or less on just three APIs: OpenGL, Direct3D, and Vulkan. So if you can write a program against Vulkan, you have as close to bare hardware as you are going to get. But for CPUs, there are still 57 flavors of rubbish programming languages and no standard runtime that works for all of them; at best, most of these programming languages are still developed in C or C++ and thus require a complete C AND C++ runtime to function.

        A lot to dissect here. For starters, APIs and languages aren't the same thing, and the main graphics APIs can be used from multiple languages; I've seen programs written in everything from JavaScript and Python to C and Java talk to the OpenGL API. Second, there is no such thing as a C/C++ runtime. Runtimes only exist in interpreted languages like JS with its DOM, or in ones that compile to bytecode for a nonexistent VM like Java. C/C++ compile to binary for a given architecture. That's the fundamental difference between compiled and interpreted languages.

        Python is not written in Python.

        Actually, some runtimes are.

        If a language cannot compile itself, it's not flexible enough to be used for any of the three main cornerstones of software development: Operating Systems, Applications, and Games.

        I don't know about flexibility; one of the biggest strengths of these dinky languages is their flexibility. Their biggest weakness, however, is a lack of efficiency and performance. If you had said they are unsuitable due to performance issues or an inability to run directly on the metal, then I would agree with you. It's impossible to write an OS kernel that runs on a real machine by itself in JavaScript or Python. It can't be done, because they both require JITs to work.

        While you certainly can write an application or game with a scripting language, it will be slow,

        Not a guarantee, but likely, depending on complexity. There have been many successful games written in Java; Minecraft, for example.

        it will be limited by the operating system's own libraries (e.g. 32-bit libraries on a 64-bit OS, as just one example) and generally require more maintenance than simply writing it in C to begin with.

        Eh, any program referencing external libraries has this problem. See the issues with old C++ programs referencing deprecated Win32 APIs trying to run on Windows 10. However, it is possible, with some careful coding and luck, to write a complex application in C that works for decades unmodified. The same cannot be said for JS. If you write something complex in JS and don't touch it, three years later it won't work. This is especially true if you use some idiotic technology like NPM. Because storing your dependencies in the cloud is a great idea.

        Memory overflow errors are caused by people learning programming languages like Java first instead of C, because if you learn C first, you then learn how to initialize memory, and how big memory chunks actually are.

        While I do appreciate C, I think you give it too much credit. For one, any serious Java programmer has to learn memory management eventually, because the built-in garbage collection is trash and once you get to a certain complexity level it's no longer good enough. On top of that, C's memory model is not an accurate representation of how a computer's memory actually works anyway. I think this is one of C and C++'s biggest weaknesses, and it contributes to many of the mistakes programmers make with memory management when using them.

        • Re:Easy (Score:5, Informative)

          by Pseudonym ( 62607 ) on Monday February 18, 2019 @05:14PM (#58141984)

          Second, there is no such thing as a C/C++ runtime.

          Yes, that thing called crt0 that you've seen all your life is an illusion.

          On a modern CPU, the C runtime doesn't have to do much. It has to set up the stack, argc/argv/envp, atexit handlers, and a few more random things. But it very much exists.

          Also consider that C compilers are allowed to generate calls to memcpy if you assign a large struct or something, and many of them do.
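
          A tiny sketch of what that looks like (the struct size is arbitrary); whether a given compiler emits an actual memcpy call or an inline copy depends on the target and optimization level:

          #include <string.h>                           /* many toolchains expect memcpy to exist even if you never call it */

          struct big { char payload[4096]; };           /* large enough that a copy isn't a couple of register moves */

          void copy_big(struct big *dst, const struct big *src) {
              *dst = *src;                              /* plain struct assignment; compilers may lower this to memcpy */
          }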

          • Re: Easy (Score:2, Troll)

            You clearly don't understand C, or what constitutes a runtime environment.
            • Re: (Score:3, Insightful)

              by Anonymous Coward

              To recap: The first comment mentioned the C/C++ runtime, i.e. the distributable implementation of the standard, providing things like malloc. The second comment misinterpreted this to mean a runtime environment, e.g. the Java executable or the .NET Common Language Runtime (CLR). The third comment noted that the second comment was mistaken, and once again referred to the runtime. Then your comment again mistakes that for a runtime environment.

              Conclusion: Words are hard.

              Captcha: obvious.

              • The common meaning of the word "environment" implies that "runtime" includes "runtime environment", as in the *environment* (set of objects or bindings) available at a program's *run time*. If the environment is null, then there's no runtime environment, but that is mostly not the case even with C programs.
              • Re: Easy (Score:4, Informative)

                by Pseudonym ( 62607 ) on Monday February 18, 2019 @11:04PM (#58143252)

                To a compiler writer (which is where I got my start) a compiler's runtime is any code that is needed to run a program but isn't generated by the compiler when an end user compiles their program.

                C runtimes used to be a lot bigger than they are now. In the days of MS-DOS, you couldn't assume the presence of an FPU, so floating point was often compiled into calls into the FP runtime. Even today, microcontrollers often don't have instructions which directly implement basic C operations (e.g. 64-bit integer division), so these operations are typically compiled as calls to runtime routines.
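
                For instance, a sketch of that 64-bit division case (the function name is made up): on a 32-bit target this one-liner typically becomes a call into the compiler's support library rather than a single divide instruction.

                #include <stdint.h>

                uint64_t per_part(uint64_t total, uint64_t parts) {
                    return total / parts;   /* on a 32-bit target, GCC typically emits a call to a
                                               libgcc helper such as __udivdi3 instead of a hardware
                                               divide instruction */
                }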

                As CPUs get more powerful, C runtimes get smaller. But to say there's no such thing is flatly untrue.

            • C is my primary language, and I gotta say, you seem pretty confused. Somebody pointed at the C runtime, and you start crying and claiming they don't understand C. That's just daft.

              Run time, compile time. In C this is not complicated.

              If you didn't have a runtime, how would you assign constants at compile time and have them exist in RAM at runtime? Simple, you wouldn't.
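
              A small illustration of that point (names made up): the initial values below exist in the program image at compile time, and it's startup code -- the loader on a hosted system, or crt-style init code on a microcontroller copying .data from flash to RAM -- that makes them exist in memory before main() runs.

              static int thresholds[4] = { 10, 20, 40, 80 };   /* initialized at compile time, placed in .data */

              int first_threshold(void) {
                  return thresholds[0];                        /* readable (and writable) in RAM at run time */
              }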

            • You clearly don't understand C, or what constitutes a runtime environment.

              Read the standard; there is a C runtime in any hosted implementation. The standard literally refers to a runtime, calling it the hosted implementation. The same is true for C++.

              For freestanding implementations a smaller runtime is compiled in.

          • Re: (Score:3, Informative)

            by raymorris ( 2726007 )

            The following is the C runtime, crt0. It is 9 lines of assembler:

            .text
            .globl _start

            _start: # _start is the entry point known to the linker
            mov %rsp, %rbp # set up a new stack frame
            mov 0(%rbp), %rdi # get argc from the stack
            lea 8(%rbp), %rsi # get argv from the stack
            call main # %rdi, %rsi are the first two args to main

            mov %rax, %rdi # move the return value of main into the first argument
            call exit # hand main's return value to exit as the process's exit status

            • So the rest of the program is expected to bring its own runtime and OS interface?
              • Yep. The runtime starts main(), and returns the return value of main() to the OS. So basically it does START and END.

                Everything in between is the responsibility of main(). It need not interact with the OS at all, other than being started by the OS and telling the OS when it's done. Only for standalone programs, though - kernel modules don't need to do those two things.

                There is a very useful kernel module which does nothing but allocate some memory. That's all it does. :)
                It's used when you have a few byt

            • All mammals have boobs.

            • It's worth comparing this with the C runtime for a 32-bit CPU or a microcontroller. It wasn't so long ago that 64-bit division was a function call.

              One more thing that I didn't cover is the C++ runtime, which is a little larger due to exception handling.

            • Darn you!!! You beat me to it. So much posting space wasted by the others going off-topic... I was thinking "It's a 15-line answer!!!" (It was less than 15, but whatever.)

              Is this no longer covered in second or third tier CS classes?

            • Cute. That isn't the same thing as an interpreter runtime, which is a hardware abstraction layer. If you know enough to recognize assembly, then I know you know the difference between the Java virtual machine and crt0. Stop being obstinate.
            • Are you purposely forgetting about libc? malloc/free IS a runtime, for example.

              I have coded in C without a runtime, writing bare-metal code for microcontrollers, but the code you run on your PC has a runtime.

      • Re: Easy (Score:5, Insightful)

        by reanjr ( 588767 ) on Monday February 18, 2019 @06:33PM (#58142326) Homepage

        Threading in ANSI C is pretty straightforward. There are inherent complexities in multithreaded code, but those can't be ignored by any language. In the context of what C code looks like, I think the threading interface is about as simple as you can get. Do you have any examples of a simpler model?
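
        For reference, a minimal sketch using C11's <threads.h> (support varies by libc; pthreads is the usual fallback). It just creates a thread, waits for it, and collects its return value.

        #include <stdio.h>
        #include <threads.h>

        static int worker(void *arg) {
            int id = *(int *)arg;
            printf("worker %d running\n", id);
            return id;                              /* the return value comes back through thrd_join */
        }

        int main(void) {
            thrd_t t;
            int id = 1, result = 0;
            if (thrd_create(&t, worker, &id) != thrd_success)
                return 1;
            thrd_join(t, &result);                  /* wait for the thread and collect its result */
            printf("worker returned %d\n", result);
            return 0;
        }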

        • Threading in C is pretty straightforward.

          The problem is, it is up to the programmers, all of them, to be very careful. Threading in C is unsafe. You have to take every step carefully, or some part of the code will smash the toes of some other part.

          That's fine for me, it sounds like it is fine for you. But it gets really hard to do something like deep learning that way without risking deadlocks.

          That's why something like Go-lang is better for that sort of application; it is easier to avoid deadlocks while onl

      • Re:Easy (Score:5, Informative)

        by Dunbal ( 464142 ) * on Monday February 18, 2019 @08:09PM (#58142710)

        There is no easy way to do threading, by its very nature. It's not a language thing, it's a computer thing. If more than one thread is trying to access the same resource at the same time, the headaches begin: who goes first, how do you get the threads to accept their place in the queue and "know" when the other is done, etc. Languages that support memory locking of course make things easier, but you still have to think the program through very clearly, and you often end up with rather unusual and non-reproducible bugs.
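
        A minimal sketch of that locking case (C11 <threads.h>, assuming it's available): two threads bump the same counter, and the mutex is what settles "who goes first".

        #include <stdio.h>
        #include <threads.h>

        static long counter = 0;
        static mtx_t counter_lock;

        static int bump(void *arg) {
            (void)arg;
            for (int i = 0; i < 100000; i++) {
                mtx_lock(&counter_lock);            /* only one thread at a time gets past this point */
                counter++;                          /* the shared resource is touched only while locked */
                mtx_unlock(&counter_lock);
            }
            return 0;
        }

        int main(void) {
            thrd_t a, b;
            mtx_init(&counter_lock, mtx_plain);
            thrd_create(&a, bump, NULL);
            thrd_create(&b, bump, NULL);
            thrd_join(a, NULL);
            thrd_join(b, NULL);
            printf("counter = %ld\n", counter);     /* 200000 with the lock; unpredictable without it */
            mtx_destroy(&counter_lock);
            return 0;
        }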

        However, I posit that many people who use higher-level languages have no actual idea of how a computer works or what they are actually doing, unlike us old-timers who grew up with assembler.

        • Re: (Score:2, Insightful)

          by Anonymous Coward

          You are approaching the problem in the wrong way.

          You capture the problem with this line "If more than one thread is trying to access the same resource at the same time the headaches begin".

          By far the best solution is to avoid this issue in the first place. I.e. write in a functional style, prefer immutability over mutability, and don't share mutable state in concurrent code. Then you will find that reasoning about concurrent code is much much easier.

          Of course, C doesn't really support this paradigm, which
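
          Even in plain C you can approximate that style (a sketch, using C11 <threads.h>; the names are made up): give each thread its own input slice and its own output slot, and only combine results after joining, so there is no shared mutable state to lock.

          #include <stdio.h>
          #include <threads.h>

          struct job { const int *data; int len; long sum; };   /* one independent slot per thread */

          static int partial_sum(void *arg) {
              struct job *j = arg;
              long s = 0;
              for (int i = 0; i < j->len; i++)
                  s += j->data[i];
              j->sum = s;                       /* written only by this thread, read only after join */
              return 0;
          }

          int main(void) {
              int data[8] = { 1, 2, 3, 4, 5, 6, 7, 8 };
              struct job jobs[2] = { { data, 4, 0 }, { data + 4, 4, 0 } };
              thrd_t t[2];
              for (int i = 0; i < 2; i++)
                  thrd_create(&t[i], partial_sum, &jobs[i]);
              long total = 0;
              for (int i = 0; i < 2; i++) {
                  thrd_join(t[i], NULL);
                  total += jobs[i].sum;         /* results are combined only after the workers finish */
              }
              printf("total = %ld\n", total);
              return 0;
          }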

      • The constraints on deep learning AI aren't going to be solved by squeezing 10% better performance out of the software.

        And you don't program AI. You educate an AI.

    • After 2 years of R&D we finally have string concatenation and parsing that isn't locking up the computer!
      What were we doing again? Oh, the Deep Learning algorithm. I guess we now need to figure out how to hook the code up to a data source, and maybe we can find a way to interact with the GPU CUDA cores.

      Yes, I am exaggerating. But C++ doesn't solve the problems Python has with deep learning systems. Sometimes we need a non-general-purpose programming language that is optimized for the type of coding involved. These pro

    • by L_R_Shaw ( 5544684 ) on Monday February 18, 2019 @04:40PM (#58141822)

      Over the past few months I have spoken to a few recruiters I know who were asking me to give them all the senior C++ engineers I know, or whether I personally was interested.

      In what?

      Doing complete rewrites of giant mountains of garbage Python code written by twenty something year old hipsters or older researcher type people.

      It is boring as fuck work but companies and organizations are desperate and willing to pay huge amounts of money to rid themselves of the clusterfuck that is Python.

      • by tomhath ( 637240 ) on Monday February 18, 2019 @07:26PM (#58142530)
        Those companies should be thankful they aren't rewriting giant mountains of garbage C++ code written by twenty something year old hipsters or older researcher type people.
      • If they were smart, they'd do it in ANSI C and just hire consultants.

        It isn't that hard, but who wants to learn all of C++, or risk writing something in the "wrong" subset? Total PITA. Lots of people love C++, but they all love and hate different parts. It is the plague.

        And why would career-oriented people want to get stuck doing it? It is a job for legacy code wranglers, not engineers.

        But in most cases, they should be using Golang or Ruby or something. Python is the how-to language, the BASIC of the modern

    • Re:Easy (Score:5, Interesting)

      by serviscope_minor ( 664417 ) on Monday February 18, 2019 @05:57PM (#58142158) Journal

      Just use C++

      Indeed, and the library you want in particular is called DLib.

      http://dlib.net/ [dlib.net]

      Specifically:

      http://blog.dlib.net/2016/06/a... [dlib.net]

      The networks are represented as templates. It's pretty cool and very high-performance. Particularly impressive given the resources invested relative to TensorFlow and PyTorch/Caffe.

    • Everyone is already using C++ when they need efficiency, even Google. TensorFlow is C++; then they use Python for the times when efficiency doesn't matter.
  • by williamyf ( 227051 ) on Monday February 18, 2019 @04:12PM (#58141652)

    or perhaps prolog...

    • I have one character for you ")"

      Unfortunately, procedural and object-oriented languages have gotten a foothold, and training someone to code functionally is quite difficult; for parallel processing it may be even trickier.

      • I have one character for you ")"

        That's not a character, that's a SIMPLE-STRING of length 1. ;)

    • or ruby..

    • by jythie ( 914043 )
      I am picturing the current crop of 'the past was stupid, never learn from it!' programmers diving into LISP, Smalltalk, or Prolog.

      But as always, designing new languages is more fun than learning old ones, and learning from past mistakes makes you uncool.
  • If indeed "deep learning" has hit the wall with current languages... perhaps the hardware itself in its current configuration isn't up to the task.
    Since all languages simply compile to machine code, which runs on an OS making hardware calls, a top-level restriction implies a deep-level constraint.
    To develop the next -real- AI requires a different hardware approach running some as-yet-unconceived OS - followed by a language.
  • by Etcetera ( 14711 ) on Monday February 18, 2019 @04:35PM (#58141796) Homepage

    ... for building Skynet, and it'll be Lisp or perl.

    And we all know which one the Lord used: https://xkcd.com/224/ [xkcd.com]

  • Julia anyone? (Score:5, Insightful)

    by jgfenix ( 2584513 ) on Monday February 18, 2019 @04:41PM (#58141824)
    I think it fits the requirements quite well.
    • Re:Julia anyone? (Score:5, Interesting)

      by TimothyHollins ( 4720957 ) on Monday February 18, 2019 @05:08PM (#58141946)

      Julia has a tremendous problem. It's not designed for users, it's designed for Julia designers. If they had said "let's create an environment for deep learning that is great for threading", everything would be fine. But instead they went with "Hey, let's do all those awesome and cool things that we always wanted to see in a programming language, and also it should be great with deep learning and implicit threading". The result is a (possibly great) environment that takes far too long to learn, has way too many individual quirks and ways of doing things that differ from the standard approach, and is just a bitch to understand intuitively.

      The environment may or may not be great, but the designers made sure that you couldn't just pick it up and go; you have to go "ahaaaa, so that's how you do that" for every single thing. And when the choice is between using Python/R that you already understand and using Julia that you have to learn from scratch, the choice is easy, especially for people who are scientists and not programmers at heart - which is the exact audience Julia is targeting.

      • Re:Julia anyone? (Score:5, Informative)

        by thejam ( 655457 ) on Monday February 18, 2019 @05:59PM (#58142164)
        You can absolutely use Julia productively without getting into all the extra stuff. You can write code similar to Matlab or Numpy. Later, when you want more performance, you can delve into types more. Admittedly, the documentation emphasizes the multiple dispatch sophistication, and maybe Julia has a longer on-ramp than Python. In the past Julia was evolving very quickly, but now that 1.0 has been released, you can stick with that. But there is no other new language that has as strong a community dedicated to readable, powerful, high-performance numerics. And the appeal of Julia is not that it does what Python or R currently can, but that it's a better place for libraries to be written by experts in the language itself. I can't think of a better language for doing research in numerical optimization, when you're really exploring new ideas and not just plugging into someone else's canned, but confining, "solutions". Most Python numerical libraries must, for performance, ultimately rely on C or C++ underneath, so becoming expert at Python does not help you in contributing to new high-performance libraries. By contrast, high-performance libraries for Julia can be written in Julia itself, so Julia can be a very good long-term investment. Please, tell me what high-performance Python numerical libraries are written in Python, without C or C++?
        • Re:Julia anyone? (Score:5, Interesting)

          by Anonymous Coward on Monday February 18, 2019 @07:15PM (#58142478)

          You can write code similar to Matlab

          MATLAB is a terrible example of an "easy to learn" language. It's full of shitty hacks like "putting a semicolon after an expression suppresses output; deleting the semicolon causes the expression to dump its output to console." It's loaded with arcane semantics like the differences between a normal array and a cell array. For fuck's sake, it permits semicolons in the middle of a parameter list, like this [mathworks.com].

          MATLAB has the quintessential property of an overdesigned language, which is: if I leave it alone for a few months and come back to it, I have to re-learn the whole damn thing. The syntax, the library, and the UI are all unintuitive and illogical. I rely heavily on my cheatfile that I've built up from this iterative exercise to get back up to speed faster. I don't have to do that with Python or C.

          or Numpy

          Numpy is a terrible example of an "easy to learn" API. Numpy arrays use semantics that are logically similar to Python arrays - but of course they're nothing like each other, none of the libraries work the same way, etc. And some core features are so poorly documented that you have to dig through sample code until you find something that vaguely resembles what you wanna do, and then shoehorn it into the shape you want. And if you want to stare into the abyss of insanity, try looking into using the Numpy array-sharing options to circumvent the cross-thread limitations of the Python global interpreter lock.

          Don't get me wrong: both are powerful and fun when you're acclimated to them. My first Numpy experience was porting a super-simple image processing technique, and my amateur, first-attempt Numpy code finished 10,000 times faster, and that's not an exaggeration. But they're both crappy languages from a model-of-clarity-and-consistency perspective.

      • Re:Julia anyone? (Score:5, Informative)

        by Pseudonym ( 62607 ) on Monday February 18, 2019 @07:38PM (#58142572)

        Julia has a tremendous problem. It's not designed for users, it's designed for Julia designers.

        I have to disagree with that. Julia is designed for users, but it knows that its use case is not Python's use case.

        Julia was designed as an upgrade for Fortran programmers. Like all good upgrade languages, it learned from all of the languages which tried (and failed) to replace Fortran previously, like SISAL and Fortress.

        There is a cohort of programmers for whom "the standard approach" means Python's highly idiosyncratic approach. In my (admittedly limited) experience, anyone who predates the rise of Python tends to have no problem picking up Julia.

    • Re: (Score:3, Interesting)

      by qdaku ( 729578 )

      Futhark looks cool: https://futhark-lang.org/ [futhark-lang.org] and promising in this realm.

      "Futhark is a small programming language designed to be compiled to efficient parallel code. It is a statically typed, data-parallel, and purely functional array language in the ML family, and comes with a heavily optimising ahead-of-time compiler that presently generates GPU code via CUDA and OpenCL"

      The ML family of languages includes things like Standard ML, Haskell, and OCaml.

  • by Patrick May ( 305709 ) on Monday February 18, 2019 @04:54PM (#58141882)
    it's called Common Lisp.
    • it's called Common Lisp.

      Coming soon from Facebook - HHCL.

    • Alternatively, the first working version of Racket-on-Chez-Scheme just came out. Now if only they removed Chez's pesky all-flonums-are-boxed limitation... There's really no need for that.
  • Everyone is apparently using an interpreted language? Sounds to me like they need to apply the second word of A.I., since they keep insisting on calling it that.
    • by ceoyoyo ( 59147 )

      There isn't really any such thing as an interpreted language anymore. There are Python compilers. There are even Python compilers that can be told to compile single functions or methods.

      Deep learning code is absolutely compiled, but for the major libraries the models are specified in Python.

    • People are using scripting languages to piece together subsystems written in low-level code.

      This is the way that tasks like this have pretty much always been done. Your web browser isn't written in JavaScript, even though its performance may make it seem that way.

    • The heavy lifting is done by compiled code in the libraries / modules. Python's just the glue language that gives an easy entry point.
    • "Intelligence" may very well involve just using environments that can figure out themselves what to compile, and when. (And don't need to be spoon-fed like C compilers.)
    • LLVM has helped to blur the line.

  • by Locke2005 ( 849178 ) on Monday February 18, 2019 @05:28PM (#58142024)
    LISP, then?
  • That's definitely what we need. A new programming language.

    • If the only other alternative is Python, then that is precisely what we need.

  • by bahwi ( 43111 ) on Monday February 18, 2019 @07:44PM (#58142610)

    I mean, sure, a single data processing pipeline might have to use 6 different conda environments, each with different dependencies and Python versions, because tools and libraries are often deprecated by even minor point changes to Python versions... oh yeah, all that, and then you have to shoehorn in TensorFlow (or something else).

  • Comment removed (Score:3, Insightful)

    by account_deleted ( 4530225 ) on Monday February 18, 2019 @08:15PM (#58142726)
    Comment removed based on user account deletion
    • Deep Learning is stuck again because there is no fundamental theory of learning. We have no idea how we learn.

      Actually, we know how learning works just fine. They are called neural networks and our implementations of them work as expected.

      In truth, what we don't understand is how to build neural networks that give rise to a general intelligence. My theory is that the predefined (via evolution) generic brain structures that segment every animal brain are the key to general intelligence, while hardwired instincts drive its advancement.

      Neuroscience will probably discover that learning is a very biological trait, not one that can be copied or simulated in discrete mechanical systems.

      That's a very silly argument because absolutely anything can be

  • If someone develops a language that offers true leverage over Python, then the transition *will* happen, and all that will be left unanswered is "how long will it take for the transition to happen?"
  • I say this every other day in one form or another, but let me try again: we all keep jumping up and down and shouting about whether we need a more flexible language or a fussier language with super-strong type-checking and encapsulation and whatnot. Some are deeply convinced that Python's fascist attitude toward formatting is excellent, and some are not -- one of the reasons the Go project was started, from what I understand, was to get away from syntactic whitespace--

    None of us really know who's rig

  • I agree with Yann that our current crop of languages aren't well suited to deep learning, but I'm not sure it's a Python-specific problem. I don't think whitespace, threading, syntax, etc. are the barrier.

    It's much more that Python is fundamentally an imperative language, and deep learning doesn't fit neatly into either the imperative or the functional category. I really think DL deserves its own category, designed from the ground up for manipulating tensor data structures of unknown shapes.

    I haven't come across a
  • ... there was a programming language [wikipedia.org] designed specifically for that
  • by sad_ ( 7868 )

    Not Invented Here syndrome at work?
    People want to keep using Python; it must be doing something right.
