Python

'Faster, Leaner' Python 3.12 Released Today with Improvements to Speed, Multiprocessing (infoworld.com) 53

Python 3.12 was released today, with improvements to speed and efficiency, reports InfoWorld. Core developers explained the improvements at this year's PyCon convention in Salt Lake City, Utah, including efforts to reduce Python's memory use, make the interpreter faster, and optimize compilation for more efficient code: Subinterpreters are a mechanism whereby the Python runtime can have multiple interpreters running together inside a single process, as opposed to each interpreter being isolated in its own process (the current multiprocessing mechanism)... While subinterpreters have been available in the Python runtime for some time now, they haven't had an interface for the end user. Also, the messy state of Python's internals hasn't allowed subinterpreters to be used effectively. With Python 3.12, core Python developer Eric Snow and his cohort cleaned up Python's internals enough to make subinterpreters useful, and they are adding a minimal module to the Python standard library called interpreters. This gives programmers a rudimentary way to launch subinterpreters and execute code on them.
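
The interpreters module described here tracks PEP 554, and its public interface was still being settled at the time of writing. As a minimal sketch, the private _xxsubinterpreters module that backs this work in CPython 3.12 can be exercised directly; the underscore prefix marks it as an unstable implementation detail, so the exact names may change between versions.

    # Sketch only: launch a subinterpreter via CPython's private plumbing.
    # This is the module behind the proposed "interpreters" API; being
    # private, it may change or be renamed in later releases.
    import _xxsubinterpreters as interpreters

    interp_id = interpreters.create()      # start an isolated interpreter
    try:
        # run_string() executes source text inside that interpreter; it
        # shares the process, but not Python objects, with the caller.
        interpreters.run_string(interp_id, "print('hello from a subinterpreter')")
    finally:
        interpreters.destroy(interp_id)    # tear the interpreter down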

Snow's own initial experiments with subinterpreters significantly outperformed threading and multiprocessing. One example, a simple web service that performed some CPU-bound work, maxed out at 100 requests per second with threads, and 600 with multiprocessing. But with subinterpreters, it yielded 11,500 requests per second, with little to no drop-off when scaled up from one client. The interpreters module has very limited functionality right now, and it lacks robust mechanisms for sharing state between subinterpreters. But Snow believes that by Python 3.13 a good deal more functionality will appear, and in the interim developers are encouraged to experiment...
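
Those figures come from Snow's benchmark, not anything reproduced here, but the threads-versus-processes half of the comparison is easy to sketch with the standard library alone. The harness below is an illustration under that assumption; absolute numbers vary by machine, and the point is only that CPU-bound work serializes on the GIL with threads while processes (and, eventually, subinterpreters with per-interpreter GILs) can run it in parallel.

    # Illustrative harness: the same CPU-bound job run on a thread pool
    # and on a process pool. Subinterpreters aim to offer process-like
    # parallelism with lower per-worker overhead.
    import time
    from concurrent.futures import ThreadPoolExecutor, ProcessPoolExecutor

    def cpu_bound_task(n):
        # Busy work that holds the GIL for its whole duration.
        total = 0
        for i in range(n):
            total += i * i
        return total

    def run(executor_cls, jobs=32, workers=4, n=200_000):
        start = time.perf_counter()
        with executor_cls(max_workers=workers) as pool:
            list(pool.map(cpu_bound_task, [n] * jobs))
        return time.perf_counter() - start

    if __name__ == "__main__":
        print("threads  : %.2fs" % run(ThreadPoolExecutor))
        print("processes: %.2fs" % run(ProcessPoolExecutor))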

Python 3.11 introduced new bytecodes to the interpreter, called adaptive instructions. These instructions can be replaced automatically at runtime with versions specialized for a given Python type, a process called quickening. This saves the interpreter the step of having to look up what types the objects are, speeding up the whole process enormously. For instance, if a given addition operation regularly takes in two integers, that instruction can be replaced with one that assumes the operands are both integers... Python 3.12 has more adaptive specialization opcodes...
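
Specialization can be observed directly with the dis module, which grew an adaptive flag in Python 3.11. A small sketch (the specialized opcode names and the warm-up threshold are CPython internals and differ between versions):

    import dis

    def add(a, b):
        return a + b

    # Warm the function up with integer operands so the adaptive
    # interpreter has a chance to specialize the addition.
    for _ in range(1000):
        add(2, 3)

    # With adaptive=True, dis shows the quickened bytecode actually in
    # use; after warm-up the generic BINARY_OP is typically replaced by
    # an integer-specialized variant (exact names are version-specific).
    dis.dis(add, adaptive=True)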

And starting with Python 3.12, object headers now use 96 bytes, which InfoWorld reports is "slightly less than half of what it was before."
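
A rough way to compare per-instance overhead on your own builds is sys.getsizeof, which reports only an object's direct footprint; the exact byte counts depend on the Python version, build, and platform, so treat the sketch below as a way to look, not as a reproduction of InfoWorld's figure.

    import sys

    class Point:
        def __init__(self, x, y):
            self.x = x
            self.y = y

    # Direct size of a bare object and of a small instance. Run the same
    # script under 3.11 and 3.12 to compare per-object overhead; the
    # numbers are platform- and version-dependent.
    print("object():", sys.getsizeof(object()), "bytes")
    print("Point   :", sys.getsizeof(Point(1, 2)), "bytes")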

Comments Filter:
  • Why would anyone hold a convention in Utah?

  • Good (Score:3, Interesting)

    by gweihir ( 88907 ) on Saturday May 06, 2023 @11:00PM (#63503249)

    A much-needed cleanup. Of course, Python will never be "high performance", but for a lot of things it is good enough and most coders cannot do "high performance" anyways.

    • Re:Good (Score:4, Interesting)

      by sg_oneill ( 159032 ) on Saturday May 06, 2023 @11:11PM (#63503259)

      Yep. We do a lot of data analysis at work, and we've had slow algorithms where I've gone and rewritten them in C++ and the speedup wasn't substantial, because the majority of the work was done in well-optimized libraries anyway. However, there's always going to be a category of algorithms where it's the actual high level of the algorithm that slows it down, and while some just need to be rewritten to purge the stupid, some genuinely benefit from a faster runtime. Considering the plethora of science and math libraries in Python, it's in everyone's interest to figure out where the snares are. (Afaik it's mostly in the typing system, which recent advances in explicit static types have helped, but yeah. The GIL hasn't been much of a practical issue in quite some time; it's there, but it's largely tamed by libraries such as multiprocessing.)

      • by groebke ( 313135 )

        Choose one: quick coding, cheap hardware, fast processing.

        Though Python does offer ease in duct-taping something functional together quickly, multipurpose languages are not good at evaluating large analysis datasets speedily. This is where R, or better, SAS comes into play. Also, there is only so much one can do from a performance standpoint on x86 hardware.

        • Re:Good (Score:4, Insightful)

          by jma05 ( 897351 ) on Sunday May 07, 2023 @12:19AM (#63503323)

          How is R faster than Python? You can absolutely evaluate large datasets in the Python ecosystem. Note that I said ecosystem, not just Python. There is a native Python module for just about everything. If I want language-level speed, I would use Julia, but R and SAS aren't built for speed any more than Python is.

          I use R for exploring data, simply because I find statistical and plotting code quicker in it. The reason isn't computational speed.

        • Na. R is a lame dog.
          If you're looking for flexibility in large dataset processing, and native languages aren't something you're... proficient in, JIT Python is king.
          And I say that strongly holding my nose.
        • by gweihir ( 88907 )

          You do not seem to have much experience with large-scale data analysis. What you claim could be straight out of a marketing brochure making bogus claims to rope in customers.

        • Also, there is only so much one can do from a performance standpoint on X86 hardware.

          Before amd64, x86 dominated the top500. You have no idea what you're talking about at all.

        • by dfghjk ( 711126 )

          Or choose none, depending on how bad you are at your job.

          It's not insight, it's dogma, and it comes from ignorance.

          • by gweihir ( 88907 )

            Indeed. People that do not understand things, but cannot see or admit that, typically go for dogma. It is rather pathetic in anybody working in technology.

      • Science bitches!!

      • by dfghjk ( 711126 )

        "However theres always going to be a category of algorithms where its the actual high level of the algorithm that slows it down..."

        LOL couldn't prove you're not a programmer any more clearly.

        The fact that high performance code is isolated to libraries has always been the glaring evidence of just how bad Python performance is.

        • Just facepalm.

          Hint: those libraries can be used from any language.

          In other words: their existence has nothing to do with Python.

          Lucky that opinions like yours show us how irrelevant you are as:
          - a programmer
          - a scientist

    • High performance isn't that important when most of the time is spent writing the code, and the real gains come from the better asymptotic behavior of a more carefully designed algorithm.

      Unless it is something that absolutely needs to run as fast as possible, damn the costs, Python is a perfectly fine language. If you just need to write something to answer a question once, Python probably gets you there faster even if the program executes slower.
  • by fahrbot-bot ( 874524 ) on Saturday May 06, 2023 @11:38PM (#63503281)

    'Faster, Leaner' Python

    Wouldn't that just be Perl? :-)

    • by jma05 ( 897351 )

      No, that would be Julia, if you are looking outside Python versions.

    • No. Perl is faster and the executable is smaller, but in order to make Python like perl, you'd have to not just make it faster, you'd have to make it funner.

      Think of some more rigid language for your comparison. Really, just about any of them will work -- whatever you pick, it's probably faster than Python. As in
      "Wouldn't that just be RPG? :-)"

      • Perl is faster and the executable is smaller, but in order to make Python like perl, you'd have to not just make it faster, you'd have to make it funner.

        Writing python is more unfun than writing perl. Neither one is really party time though.

        Reading python is more fun than reading perl, on average.

    • No, Python is readable.

  • really? (Score:5, Funny)

    by pz ( 113803 ) on Sunday May 07, 2023 @12:03AM (#63503313) Journal

    "Quickening"?

    Why do the people developing Python have such a need to invent new words for already-established ideas? The rest of the world calls this particular idea specialization, and it's decades old.

  • Alpha version (Score:5, Informative)

    by Equuleus42 ( 723 ) on Sunday May 07, 2023 @12:52AM (#63503357) Homepage

    Python 3.12 was released today

    Python 3.12.0 alpha 7 was released today. The official release of 3.12.0 is slated for 2 October 2023 [python.org].

    • Re:Alpha version (Score:5, Informative)

      by rta ( 559125 ) on Sunday May 07, 2023 @02:42AM (#63503463)

      Yeah, what the heck is with this post?

      Not only is 3.12 not being released until Q4 but all the items in the article aren't even slated for 3.12 afaict. The "interpreters" module they mention seems to be PEP 554 (https://peps.python.org/pep-0554/) which is labeled as targeted for 3.13 and is nowhere mentioned in the linked "what's new in 3.12" page. It is also not among the ones mentioned as still being considered for inclusion before the 3.12 feature freeze in ~ a week ( https://discuss.python.org/t/p... [python.org] )

      The "separate GIL per sub-interpreter" is PEP-684 (https://peps.python.org/pep-0685) and IS accepted for 3.12 according to its page and the message it links to. (https://discuss.python.org/t/pep-684-a-per-interpreter-gil/19583/40) though it still hasn't made it to the "what's new in 3.12" page. This is pretty cool as some future speedup of the multiprocessing module, But w/o 554 it still won't be accessible from python code.

  • I suggest you check out, and stay tuned for, Chris Lattner's latest child, Mojo. It's a superset of Python, with amazing new capabilities... And way, way faster than current Python solutions. Go to www.modular.com/mojo and check it out!
  • I wonder if Python could be made to run on top of a modified LuaJIT.
  • Or is Emperor Guido still sticking to his government-provided 40-column CRT utopia?

  • >
    Snow's own initial experiments with subinterpreters significantly outperformed threading and multiprocessing. One example, a simple web service that performed some CPU-bound work, maxed out at 100 requests per second with threads, and 600 with multiprocessing. But with subinterpreters, it yielded 11,500 requests, and with little to no drop-off when scaled up from one client.

    lol they invented mod_perl

    • lol they invented mod_perl

      Yeah, but still slower. With real, consistent effort they may get Python to the point where it is half as fast!

  • It's not just having the sped-up Python code; it's really about how quickly libraries adopt it. Many haven't even caught up to 3.11 yet, let alone 3.12. Seriously, how many coders out there are still using 3.9 or older because "it just so happens I need this one library that hasn't been updated since a fair number of Tuesdays ago"? It's not just the language. It's how fast the rest of the community catches up to it.
    • by pz ( 113803 )

      I understand that with 3.10 they tried to coordinate with some of the major libraries to have them release updates at about the same time 3.10 went out.

      As far as I can tell, that effort was not very successful, as some of the primary libraries I use still haven't caught up.

  • Can anyone explain why sub-interpreters are so much faster than multiprocessing? The demo cited in the OP reports a 25x speedup on a CPU-bound web service. I can find comments from 2017 [1] predicting that sub-interpreters will be similar to multiprocessing but require less overhead from the OS for inter-process communication, but I find it hard to believe this is the cause of such a large speedup, except in a contrived example. [1] https://mail.python.org/piperm... [python.org]
