AI / Education / Programming

'Picat' Programming Language Creators Surprised With A $10,000 Prize (bcexcelsior.com)

An anonymous reader writes: "I didn't even know they gave out prizes," said a Brooklyn College CS professor, remembering how he'd learned that a demo of the Picat programming language won a $10,000 grand prize last month at the NYC Media Lab Summit. Professor Neng-Fa Zhou created Picat with programmer Jonathan Fruhman; together with graduate student Jie Mei, they built a demo titled "The Picat Language and its Application to Games and AI Problems" to showcase the language's ability to solve combinatorial search problems, "including a common interface with CP, SAT, and MIP solvers."
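
For a sense of what such a model looks like, here is a rough sketch in the style of the classic N-queens example from the Picat documentation (not the prize-winning demo itself):

    import cp.

    % Place N queens on an N x N board so that no two attack each other.
    queens(N) =>
        Qs = new_array(N),       % Qs[I] = row of the queen in column I
        Qs :: 1..N,              % each row index is between 1 and N
        foreach (I in 1..N-1, J in I+1..N)
            Qs[I] #!= Qs[J],                 % no two queens share a row
            abs(Qs[I] - Qs[J]) #!= J - I     % no two queens share a diagonal
        end,
        solve(Qs),               % hand the constraints to the imported solver
        writeln(Qs).

Calling queens(8) from the Picat shell prints one solution. Changing the first line to "import sat." (or, for linear models, "import mip.") runs the same model against a different solver, which is the common interface the demo's title refers to.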

Mei tells the Brooklyn College newspaper that Picat "is a multi-paradigm programming language aimed for general-purpose applications, which means theoretically it can be used for everything in life," and Zhou says he wants to continue making the language more useful in a variety of settings. "I want this to be successful, but not only academically... When you build something, you want people to use it. And this language has become a sensation in our community; other people have started using it."


Comments:
  • by Anonymous Coward on Sunday October 23, 2016 @03:42PM (#53135433)

    Computer, make it so!

  • In a proper microservices world, there is indeed room for unpopular languages. To be a citizen of this world, a language merely needs IPC message-passing support: REST, WebSocket, RPC, pubsub, etc. Giving individual developers the freedom to work with their language of choice for each subproject is immensely rewarding and powerful.
    • Yep, I agree, although I think "microservices" is better described as "any sort of specialized language domain", and is a little less buzzword-bingo-ready.

      I recently finished work on a small embedded language that I'm using in my own projects. I published it on GitHub. Interest? Zero. Quite literally, no one else is using it, as far as I can tell. No worries, it's got one satisfied customer, and it's available for others to use if they want. It was hugely satisfying to design the language and work thr

    • Old schoolers will inevitably complain about high latency between microservices. If this strongly matters, there are InfiniBand and PCIe fabrics for very low latency.
      • by Anonymous Coward

        Old schoolers will inevitably complain about high latency between microservices. If this strongly matters, there are InfiniBand and PCIe fabrics for very low latency.

        Service-oriented architecture has its place. I just get tired of it being branded as the cure-all for everything. I've spent a lot of time working toward low-latency, near-real-time solutions, and now we have people wanting to push SOA everywhere, largely dismissing any latency or performance concerns as me just being an idiot or something. Sometimes all the extra layers do matter, and sometimes it just isn't worth the problems created for the problems solved. Of course a problem doesn't have

        • Lock-free shared-memory writes seem like a black art. I don't know how it works or how you did it.
          • The one example that I've seen basically uses shared memory as an expandable pool of fixed-size elements (structs), using an enum to control the size. The real trick is to structure the thing so that one-and-only-one thread/process controls the writing, whereas many others can read with abandon. Those readers could send messages through a separate channel to request a status change, but they didn't directly write to shmem. I'm not aware of any method to handle true multiple shmem write access without some form of locking, though....
            • by Anonymous Coward

              The one example that I've seen basically uses shared memory as an expandable pool of fixed-size elements (structs), using an enum to control the size. The real trick is to structure the thing so that one-and-only-one thread/process controls the writing, whereas many others can read with abandon. Those readers could send messages through a separate channel to request a status change, but they didn't directly write to shmem. I'm not aware of any method to handle true multiple shmem write access without some form of locking, though....

              The one-producer, many-consumer model works well with it. If you assume the producer and the consumer are disconnected, where the consumer _must_ keep up or simply drop data, then you can do it with basically a 64-bit integer that keeps track of the points added. The consumers each look at that integer and do a mod operation on it to find the real data in memory, as well as check for likely data loss. (The consumer should not read a point that may be in the process of being written.) The consumer

  • > Compared with Prolog, Picat is arguably more expressive and scalable: it is not rare to find problems for which Picat requires an order of magnitude fewer lines of code to describe than Prolog and Picat can be significantly faster than Prolog because pattern-matching facilitates indexing of rules. Ref: http://picat-lang.org/
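
    A minimal sketch of the pattern-matching that quote refers to, adapted from the fib example in the Picat user's guide:

        % Rules are tried top-down and match on the shape of the argument,
        % which lets the compiler index them directly.
        fib(0) = F => F = 1.
        fib(1) = F => F = 1.
        fib(N) = F, N > 1 => F = fib(N-1) + fib(N-2).

    In Prolog the equivalent clauses unify rather than match, so the recursive clause needs an explicit guard (and usually a cut) to keep it from re-trying the base cases on backtracking.
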
  • by Anonymous Coward

    And this language has become a sensation in our community; other people have started using it.

    That's a pretty low bar for 'sensation'

  • by JustNiz ( 692889 ) on Sunday October 23, 2016 @04:55PM (#53135693)

    > theoretically it can be used for everything in life,

    VM? Garbage collector? No pointers? I'm not seeing anyone writing OSes, device drivers, or any other low-level stuff in this anytime soon.

  • by Gravis Zero ( 934156 ) on Sunday October 23, 2016 @05:20PM (#53135791)

    Picat looks like what you get after Python eats Javascript and then vomits. I give it s/[stars]/1 star.

    • by Anonymous Coward

      you forgot to end each sentence in a comma,

    • Picat looks like what you get after Python eats Javascript and then vomits.

      Looks a lot more like Prolog which has swallowed an OO-flavour of Pascal actually.

      • Either way it is puke-worthy. Redundant characters in code make my blood boil. Python and Nim almost got it right. But Picat had to pickaxe those lessons.

"Protozoa are small, and bacteria are small, but viruses are smaller than the both put together."

Working...