Is Python About to Get Faster? (zdnet.com)
"Python 3.11 will bear the fruits of CPython's multi-year effort to make Python a faster programming language," reports ZDNet.
"Core Python (CPython) developer Mark Shannon shared details about the project to make Python faster at the PyCon 2022 conference this week..." Last year, Microsoft funded a project for the Python Software Foundation (PSF), led by Python creator Guido van Rossum and Shannon, to make Python twice as fast as the current stable 3.10 series. The vision is to nudge Python towards the performance of C. Microsoft hired van Rossum in 2020 and gave him a free hand to pick any project. At last year's PyCon 2021 conference, he said he "chose to go back to my roots" and would work on Python's famed lack of performance....
The Faster CPython Project provided some updates about CPython 3.11 performance over the past year. Ahead of PyCon 2022, the project published more results comparing the 3.11 beta preview to 3.10 on dozens of performance metrics, showing that 3.11 was overall 1.25 times faster than 3.10. Shannon is realistic about the project's ability to improve Python performance, but believes the improvements can extend Python's viable use to more virtual machines. "Python is widely acknowledged as slow. Whilst Python will never attain the performance of low-level languages like C, Fortran, or even Java, we would like it to be competitive with fast implementations of scripting languages, like V8 for Javascript or luajit for lua," he wrote last year in the Python Enhancement Proposal (PEP) 659.
"Specifically, we want to achieve these performance goals with CPython to benefit all users of Python including those unable to use PyPy or other alternative virtual machines...."
On the question of a just-in-time (JIT) compiler for Python's performance, Shannon suggested it was not a priority and would likely not arrive until Python 3.13, according to the Python Software Foundation's coverage of the event.... According to the Faster Python implementation plan, CPython 3.12 might gain a "simple JIT compiler for small regions" that compiles small regions of specialized code, while 3.13 would enhance the compiler to extend the regions for compilation.
"Core Python (CPython) developer Mark Shannon shared details about the project to make Python faster at the PyCon 2022 conference this week..." Last year, Microsoft funded a project for the Python Software Foundation (PSF), led by Python creator Guido van Rossum and Shannon, to make Python twice as fast as the current stable 3.10 series. The vision is to nudge Python towards the performance of C. Microsoft hired van Rossum in 2020 and gave him a free hand to pick any project. At last year's PyCon 2021 conference, he said he "chose to go back to my roots" and would work on Python's famed lack of performance....
The Faster CPython Project provided some updates about CPython 3.11 performance over the past year. Ahead of PyCon 2022, the project published more results comparing the 3.11 beta preview to 3.10 on dozens of performance metrics, showing that 3.11 was overall 1.25 times faster than 3.10. Shannon is realistic about the project's ability to improve Python performance, but believes the improvements can extend Python's viable use to more virtual machines. "Python is widely acknowledged as slow. Whilst Python will never attain the performance of low-level languages like C, Fortran, or even Java, we would like it to be competitive with fast implementations of scripting languages, like V8 for Javascript or luajit for lua," he wrote last year in the Python Enhancement Proposal (PEP) 659.
"Specifically, we want to achieve these performance goals with CPython to benefit all users of Python including those unable to use PyPy or other alternative virtual machines...."
On the question of a just-in-time (JIT) compiler for Python's performance, Shannon suggested it was not a priority and would likely not arrive until Python 3.13, according to the Python Software Foundation's coverage of the event.... According to the Faster Python implementation plan, CPython 3.12 might gain a "simple JIT compiler for small regions" that compiles small regions of specialized code, while 3.13 would enhance the compiler to extend the regions for compilation.
It couldn't get much slower (Score:4, Interesting)
From various benchmarks I've seen, it runs anywhere from 1/20th to 1/100th the speed of C/C++, depending on the task.
Re: (Score:1)
There is absolutely no point comparing it to a low-level language. CPython is implemented in C, so there is always interpreter overhead.
Re: (Score:3, Insightful)
The problem is that because of its extensive (and very good) standard library and all the other 3rd-party libraries, it's increasingly being used for tasks that require speed, e.g. servers and AI. And yes, most (all?) AI libraries are C++ at their core, but transferring data from the C++ level to the Python level, and Python processing it at the high level, all takes up CPU cycles.
Re:It couldn't get much slower (Score:4, Interesting)
For most things the transfer takes up very little time. The data is normally stored in numpy arrays and can be handed directly to C++ without conversion or even a copy needed. For AI libraries you are mostly designing the network and data loading but once training or inferencing starts you are mostly just running pure native code.
My experience is that with Python you can get the design working first and then fix the algorithm issues and then hand off to high performance libraries for most of the heavy lifting and get something that gets close to the performance of a native solution at a fraction of the effort.
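That zero-copy handoff rests on Python's buffer protocol, which numpy arrays implement. A stdlib-only sketch of the same idea, with the `array` module standing in for numpy:

```python
from array import array

# A C-contiguous block of doubles, like a small numpy array.
data = array('d', [1.0, 2.0, 3.0, 4.0])

# memoryview wraps the same memory -- no copy is made.
view = memoryview(data)

# Mutating through the view is visible in the original buffer,
# which is exactly what lets native extensions work in place.
view[0] = 42.0
print(data[0])        # 42.0
print(view.nbytes)    # 32 (4 doubles * 8 bytes each)
```

Anything that speaks the buffer protocol (numpy, ctypes, most C extensions) can read or write that memory directly, which is why the handoff costs so little.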
Re:It couldn't get much slower (Score:4, Insightful)
I do most of my algorithm development in python. Sometimes, when it matters I will re-write it in C or C++ when it's working, but often I rely on Python features like the built in large integer support and MPFR for high precision floats, so mapping to a compiled language is a problem and usually not needed. When I take an O(n^2) algorithm down to O(n.log(n)) or even O(n) with some neat algorithm trick, it usually computes fast enough in Python. Often I'm looking for an answer, not code that I'm going to run many times, so coding speed and convenience is the primary utility of the language.
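The built-in large integer support the parent mentions needs no setup at all; a trivial sketch:

```python
# Python ints grow as needed -- no bigint library required.
p = 2**521 - 1                 # a Mersenne prime, 157 decimal digits
print(p.bit_length())          # 521

# Modular exponentiation on huge numbers is also built in:
print(pow(2, p - 1, p))        # 1, by Fermat's little theorem
```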
Re: (Score:2)
I do most of my algorithm development in python. Sometimes, when it matters I will re-write it in C or C++ when it's working, but often I rely on Python features like the built in large integer support and MPFR for high precision floats, so mapping to a compiled language is a problem and usually not needed.
There's a variety of C++ interfaces for MPFR and the like. They're all much of a muchness, really: they supply a C++ type with operator overloading, so you just write your code as you'd expect and it works.
Re:It couldn't get much slower (Score:4, Interesting)
Yes. I've tried a few. MPFR in C++ is way simpler to use than in C where you don't have operator overloading.
However with the dynamic typing of python I can inject MPFR numbers into existing code and it will be handled just fine.
Similarly I needed to do matrix arithmetic in Galois extension fields (which numpy doesn't do). So I wrote a GF library and overloaded the four arithmetic operations. Then I wrote a basic matrix library, and the GF elements just worked within it without the matrix library needing to know. Then I do my ECC (both the error-correction and elliptic-curve-crypto variants) and matrix-based crypto (like 2-EXT) in simple Python expressions using the matrix library with the GF library. That is not so simple in C++, although it is doable, but the off-the-shelf libraries tend not to handle that kind of thing.
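A toy sketch of that duck-typing pattern (GF(2) for brevity; the class and function names here are illustrative, not from the poster's actual libraries):

```python
from functools import reduce

class GF2:
    # Toy element of GF(2); a real GF(2^k) class follows the same
    # pattern, just with polynomial arithmetic inside.
    def __init__(self, v):
        self.v = v & 1
    def __add__(self, other):
        return GF2(self.v ^ other.v)   # addition is XOR in GF(2)
    def __mul__(self, other):
        return GF2(self.v & other.v)   # multiplication is AND
    def __eq__(self, other):
        return isinstance(other, GF2) and self.v == other.v
    def __repr__(self):
        return f"GF2({self.v})"

def mat_mul(A, B):
    # Generic matrix multiply: it only asks the elements for + and *,
    # so GF2 values work without the matrix code knowing about them.
    return [[reduce(lambda a, b: a + b,
                    (A[i][k] * B[k][j] for k in range(len(B))))
             for j in range(len(B[0]))]
            for i in range(len(A))]

I2 = [[GF2(1), GF2(0)], [GF2(0), GF2(1)]]   # identity matrix
M  = [[GF2(1), GF2(1)], [GF2(0), GF2(1)]]
print(mat_mul(I2, M))   # same entries as M
```

The matrix code never imports or mentions GF2; dynamic typing does the wiring.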
So it's the relative speed and ease of getting the code to run and work that usually matters to me. Once it's running, I'll get my output and I'm done. My 2-EXT code takes a few seconds to find all the matrices for a large order extractor and spit out the logic operations that I can put in hardware.
I don't know why lots of people think Python is only used for ML and numpy. It's used for lots of things.
I've been programming since the late 70s and I've used my share of languages. But Python is the one I fall back on when I need quick development. C is the one I fall back on when I need speed or I'm writing reference code for customers. I've tried the Javas and Rusts and Smalltalks and Haskells and occams and pilots and prologs and UTC&A and they don't really do that much for my type of work. That's not to say that Rust isn't good for large projects that need security and stability, or Haskell isn't good for maintaining functional consistency. Just use what works for you.
Re: (Score:2)
An example for the curious.
MPFR code written in Python to develop and test the algorithm : https://github.com/dj-on-githu... [github.com]
MPFR code written in C in the application : https://github.com/dj-on-githu... [github.com]
Re: (Score:2)
The transfer usually costs: exactly nothing
It is just a pointer you pass to a C/C++ function.
If you do not work with Python, then at least have the dignity to stop spreading your misunderstandings.
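A stdlib illustration of the "just a pointer" point: `ctypes` hands a C routine the raw buffer address, with no per-element conversion (here `ctypes.memmove`, a real ctypes export, stands in for an arbitrary C/C++ function):

```python
import ctypes

# A writable C buffer of 5 ints, allocated once.
src = (ctypes.c_int * 5)(1, 2, 3, 4, 5)
dst = (ctypes.c_int * 5)()

# memmove is a C routine; Python passes it raw addresses,
# not copies of Python objects.
ctypes.memmove(dst, src, ctypes.sizeof(src))
print(list(dst))   # [1, 2, 3, 4, 5]
```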
Re: (Score:2)
There is absolutely no point comparing it to a low-level language.
A language is a language. You use a language to create programs. In that sense they are comparable. Now there is no inherent speed in a language; they can all in theory be translated, and all the proper ones are Turing complete. But when every implementation of Python is fairly consistently slower than Java or C/C++, you have to wonder if the savings in programming effort are worth the overhead. Sometimes, probably; other times, if a program is written once but run a billion times, the combined overhead beco
Re: (Score:2)
How is it hard to install on Windows? I use miniconda on Windows and Linux; the install process is basically identical, and it is very rare to run into something that only works on one of them. All the commands are the same, the dev process is the same, etc. I have switched between Windows and Linux for Python development for decades now, depending on what I am doing.
Re: (Score:2)
On Linux I type 'python'. ;o)
On Windows it was recommended to install Chocolatey then use that to install Python. A bit of a cop out that we dealt with for a very long time. Another option is Anaconda and then later Miniconda because Anaconda was kind of a pain in the ass too.
If you want to take a trip down memory lane. Grab the windows installer for Python 2.0.1 and try it. It sucked and what you ended up with wasn't a very practical installation of Python. Fast forward a decade and it was still not great.
Re: (Score:3)
I don't use the system Python on Linux. I find that makes things too hard when trying to make reproducible setups. With conda based systems I can quickly replicate the entire environment elsewhere. I can also clone the environment and make some changes and verify that all tests still pass.
For Windows I have used Anaconda/Miniconda for a LONG time. It has high performance versions of numpy and other scientific libraries by default which is a huge gain over using pip.
Re: (Score:2)
Yes. My day job includes getting a flavor of conda pushed out to a cluster. A lot of this stuff took years to come around. I think people hate Python based on its decades of bad practices rather than any single workflow you happen to use.
I still consider the tools around Python to be immature. If they were better a big part of my day job would have been eliminated. This isn't your grandpa's FORTRAN, but also they don't make 'em like they used to.
Re: (Score:2)
Re: (Score:2)
Use conda. Don't use the system python for anything. Think of the system python as the python needed by other stuff in the repos. If you upgrade it or install packages in it you can actually break your distro.
Re: (Score:2)
And for 99 uses out of 100, that is fast enough.
Re: (Score:2)
The difference between pure Python and C-based Python libraries has actually had a nice effect. This difference means that there is a strong motivation to produce C-based libraries that provide a nice high-level abstraction. For nearly every computationally-intensive task that one might commonly want to do, someone has created a C-based Python library with a simple interface.
In C, if I need to do a one-off matrix multiply, it might be easier to inline the code than it is to try to set up a separate module
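For comparison, the Python one-off really is just a few dependency-free lines (a sketch, not a library recommendation):

```python
def matmul(A, B):
    # Inline, dependency-free matrix multiply for a one-off calculation.
    # zip(*B) iterates over the columns of B.
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
print(matmul(A, B))   # [[19, 22], [43, 50]]
```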
Re: (Score:2)
Good observation.
For clever programmers it is not C vs Python, it is C AND Python.
Re: (Score:2)
And that's why datacentres are now using a significant percentage of world power generation, with the accompanying emissions.
Re: (Score:2)
Citation for this being the fault of higher-level languages ?
Re: (Score:2)
Code that works, will be used in companies. And initially it works well enough, so more and more people inside the company start to use this code.
And that keeps growing until it hits a certain scale. But now the use of that code is so entrenched in the company's business flow that you suddenly need to do some serious hardware upgrading to the server this code runs on.
A one time rewrite in a much faster language would have saved a lot of time, computing resources and hardware upgrades.
TCO of software in
Re: (Score:2)
My approach is to first code in Python and make sure it works with a test suite. Then I profile and optimize with algorithms and common high-speed libraries. Then I optimize with more high-speed libraries depending on what additional profiling shows up. If I do have to recode it, or just part of it, in a low-level language at that point, at least I then have a well-designed set of algorithms, tests, interfaces, etc., which is a MUCH better target for a low-level language. I find coding from scratch in a low le
Re: (Score:2)
"I find coding from scratch in a low level language is more likely to lead to project failure."
Amen to that - it is simply your old Premature Optimisation writ large.
I was coding in C++ until I came here to comment. I wish we had prototyped in Python.
Julia (Score:2)
Julia is wickedly fast out of the box. Most of Julia is actually Julia under the hood! This isn't merely a boast: it enables the most essential feature of Julia, which is that it can automatically recompile on the fly for any new type, long after the original library was written.
Julia is not pythonic, so pythonistas give it a lot of grief. When you first encounter it you may even think of it as a throwback. But it's not; it's just a different mindset. Think of your initial gag when you kinda mocked python for it
Re: (Score:3)
I have looked at Julia a number of times now, and while it is interesting, it mostly seems like it needs time to mature, both the language and the ecosystem. When I looked at using it for a project, it still seemed pretty difficult for multiprocessing due to the slow startup. There was work being done on saving the compilation so the workers could start up quickly, but it seemed kind of hacky still.
I also REALLY dislike 1-based indexing, and Julia would be the only langua
Re: (Score:2)
Fortran isn't locked into a particular base, whether 0 or 1. 1 is the default, but you could start at 0, or 4, or -72,339, or . . .
Re: (Score:2)
Okay so just Julia and MATLAB then.
Re: Julia (Score:2)
R also uses 1-indexing
Re: (Score:2)
Julia is not locked to origin 1 indexing.
Re: (Score:2)
You obviously have a limited familiarity with languages. You should try some Smalltalk programming. (There are other languages that are worse, and there's an optimized version that I haven't looked at.)
Re: (Score:2)
I used smalltalk once for something that, by concept, was purely objects talking to one another.
It was painfully slow on a fast machine for the time (an alpha).
I did a line by line translation into fortran, using array references rather than an array of objects. No effort, *at all*, at optimizing.
The speedup was 45,000:1
PyPy has existed for years (Score:5, Informative)
And averages 4.5x faster than CPython. [pypy.org]
Re: (Score:2)
Re: (Score:2)
PyPy doesn't provide much speedup for the Python parts of a program if the program uses a C extension module that uses Python C API. Unless a module is ported to CFFI [pypy.org], PyPy loses a lot of time marshaling objects back and forth between Python and C environments in order to emulate the Python C API. My experience is with Pillow (the Python imaging library) in a program that needed to make a bunch of calls to PIL.Image.Image.crop() and PIL.ImageStat.sum2() to determine which of four candidate images had the lo
Re: PyPy has existed for years (Score:2)
Sure, there are cases where PyPy doesn't run faster. However, the beauty of PyPy (in my opinion) is that it requires zero modification to your code: you simply run it with PyPy instead of CPython. So you can always simply try, just to see if it runs faster, and you can do that at various stages of your development.
CPython does not mean "Core Python" (Score:2, Informative)
Stop talking bullshit, editors. Seriously.
Here's an excerpt from Wikipedia, which you could have checked in about 30 seconds:
"CPython is the reference implementation of the Python programming language. Written in C and Python, CPython is the default and most widely used implementation of the Python language."
Re: (Score:3)
Hey, this is Slashdot, editorial inaccuracy is part of its anarchic charm! :)
Re: (Score:2)
You misspelt "annoying incompetence" there, old chap.
Hmm not sure what you mean:
Re: (Score:3)
Hey, this is Slashdot, annoying incompetence is part of its annoying incompetence! :)
Re: (Score:2)
"It's not the editors, it's a direct quote from TFA (note the double quotes)"
Quoting stories that are inaccurate is acceptable.
What is NOT acceptable is the Slashdot "Editors", when informed here about the errors in the story, not going back and EDITING the story to correct them.
THAT is *annoying incompetence*.
Basic types (Score:2)
Re: (Score:3)
This is also why high-performance code uses numpy. You don't reference-count every item in a numpy array (unless it is an array of objects).
Re: (Score:2)
Think of the way int/float/boolean are handled as objects, allocated and reference-counted for each simple statement. Ints are even worse because of their arbitrary length. This takes an awful lot of time. Some data-flow analysis could spare a lot of this useless handling.
When I need to use huge integers or exact rationals, python is perfect. It supports them natively. So instead of Googling for some C bigint library and trying to get it to compile, then learning it's call structure, I just use python and the code is written in a couple of minutes. Those calculations may be slow in computer terms but if it takes less than a second to compute the result, it's fine for getting the result and all the time is in the development.
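For exact rationals specifically, the stdlib `fractions` module is a minimal sketch of that convenience:

```python
from fractions import Fraction

# Exact rational arithmetic from the standard library -- no
# hunting for a C bignum/rational library.
x = Fraction(1, 3) + Fraction(1, 6)
print(x)              # 1/2

# Harmonic number H_5, computed exactly:
h5 = sum(Fraction(1, k) for k in range(1, 6))
print(h5)             # 137/60
```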
Re: (Score:2)
Most languages support both natively and easily. In Haskell, Integer is the type for a bignum integer and Int for a machine integer, and both can be used as easily as the other.
Re: (Score:2)
Most languages support both natively and easily. In Haskell, Integer is the type for a bignum integer and Int for a machine integer, and both can be used as easily as the other.
Haskell is all fun and games until you want to change some state.
Re: (Score:2)
Changing state is very easy in Haskell, but that's hardly the issue; the issue is whether a language can have native support for both machine-level integers and bignums. Indeed it's not quite clear what “native support” would even mean in Haskell beyond being part of the standard library. Even booleans in Haskell are simply defined like any other constants in a library.
Rust also does not natively supply bignums, but the interfaces to them are no different than arithmetic on any other type.
Re: (Score:2)
Haskell doesn't map to my problem domains very well.
Rust was a bit of a bear to use when I tried it with long compile times for small programs and doesn't supply features I need with the immediacy of Python.
If I wrote large, security critical SW, Rust might be my go-to. But I don't. I create security critical hardware (system verilog mostly, but I sneak in code generators like confluence and myhdl when no one is looking) and the software I write it usually is in aid of the design process, manufacturing test
Will they improve its memory footprint as well? (Score:2)
And no, Numpy isn't a universal solution; there are many use cases their arrays can't handle that I've hit several times.
16 bytes: a=None
28 bytes: a=3
49 bytes: a=""
56 bytes: a=[]
64 bytes: a=[1]
64 bytes: a={}
232 bytes: a={1:1}
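The figures above come from `sys.getsizeof`; exact byte counts vary by CPython version and platform, but the per-object overhead is easy to reproduce:

```python
import sys

# Exact numbers differ between CPython versions and platforms,
# but the ordering (and the surprising minimums) is stable.
for obj in (None, 3, "", [], [1], {}, {1: 1}):
    print(f"{sys.getsizeof(obj):4d} bytes: {obj!r}")
```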
Re: (Score:2)
>And no, Numpy isn't a universal solution
True dat. Try to use numpy for matrix operations with elements in Galois fields. It ain't happening.
Development time versus execution speed (Score:4, Insightful)
The strength of a language like Python is getting code that works, in a reasonable timescale. Benchmark execution speed is probably irrelevant in many cases. I would have been wasting my time to code one of my recent in-house applications in C++ rather than Python, because the Python code was fast enough, and there is every indication that it will scale well.
With other dynamically typed languages I use, such as GNU Octave and Ngspice, execution speed close to C code can be achieved, because the data is vectorised. For example, a few statements can set up a loop over thousands of items, in which case, most of the overheads of dynamic type checking and garbage collection disappear. Of course, that depends on having libraries or modules whose core functions are implemented in an efficient way, and then interfaced to the dynamic language.
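The vectorisation point shows up even with the stdlib: a built-in like `sum()` runs its loop in C, so the per-iteration interpreter overhead disappears (a minimal sketch; the exact speed ratio varies by machine):

```python
from timeit import timeit

data = list(range(100_000))

def loop_sum(xs):
    # Explicit Python loop: the interpreter dispatches every iteration.
    total = 0
    for x in xs:
        total += x
    return total

assert loop_sum(data) == sum(data)

# sum() runs the same loop in C, like a tiny vectorised kernel.
print(timeit(lambda: loop_sum(data), number=50))
print(timeit(lambda: sum(data), number=50))   # typically several times less
```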
Re: (Score:3)
The strength of a language like Python is getting code that works, in a reasonable timescale.
And that's the problem: people use it for rapid development of a proof of concept - and the Python proof of concept ends up in production.
Re: (Score:2)
Python's strengths are overrated and misstated (Score:2)
Why is that a problem? A lot of production stuff is just fine in Python despite not being as fast as a lower-level language. If speed was truly the ultimate goal, then it was stupid to start coding in Python in the first place. Not a fault of the language but rather of the person leading the project.
My concern with Python is not the runtime performance, but the management after it's been in production for 5 years. Here's the problem: that prototype, if successful, almost never gets rewritten in a production-grade language. You can blame whomever you want and you're right, someone was an idiot along the way, but I've inherited many python apps that are nightmares to maintain in production.
My other problem with this is that Python IS NOT FASTER TO WRITE CODE IN. There are many languages that are j
In 20 years, I have seen the opposite (Score:2)
There really aren't other popular languages as productive as Python. Python is a really, really concise language compared to C / C++ / Java / C#.
Language really doesn't matter. Nearly everything has been invented already. Most professionals stitch together calls to existing APIs. If you're writing a ton of code in any language, my guess is you're reinventing the wheel and probably need to get more familiar with the language libs and ecosystem. For example, I am sure I could write a sorting algorithm more concisely in Python...but that would be stupid...just call the method already built into the language. The same principle applies to nearly ev
Why isn't Python as fast as C? False Dichotomy (Score:2)
When static typing is belatedly added to Python, there will be no good reason why it cannot be statically compiled to code that is as fast as, or faster than, C.
It is a false dichotomy to say that one needs to be either easy to use or efficient. You can have both. C#.Net comes close, and you can even turn off array bounds checking if you really want to.
C can never do modern garbage collection which moves object in memory. That is where Java and .Net (but not Python implementations) can have a big
Re: (Score:3)
Most of the time that is not a problem at all.
As you run the Python script only once or twice on huge data sets, and then you archive the result.
You Python haters simply have no clue about who uses Python, or why.
If I need a scientific result tomorrow, I have no time to wait for you to write a perfect C++ version that is finished in 4 weeks and only has the benefit of being 10 times faster on your machine.
What counts is: I have the result tomorrow.
Re: (Score:2)
the Python proof of concept ends up in production.
In my case, "production" is just an application that I run to take some drudgery out of my job, and as long as that works, Python is the right tool for the job.
I think what you are talking about is the kind of bad management that says "It works, so we ship it". Developers are not provided the budget for turning a prototype into a sound production design. Instead, money is squandered fixing the shortcomings of the unfinished product, your company gets a reputation for shipping unreliable tat, and so on. I am
Re: (Score:3)
I'll just add that having source code and object code in the same place also has benefits for home-grown stuff. Finding a binary called "doit" which does most of what you need, but then not being able to find the source for it isn't a pleasant day.
Like Perl, for Python the (probably vast) majority of programs often don't really care about execution speed. However, for Python to become the One True Language, it needs to do a lot better in web server apps and the like. I have no idea what sorts of things the pyth
Re:Development time versus execution speed (Score:5, Informative)
Like Perl, Python the (probably vast majority of) programs often don't really care about execution speed.
Unlike Python, Perl is reasonably performant. It's usually 4x faster than Python... or more.
Re: (Score:2)
Unlike Python, Perl is reasonably performant. It's usually 4x faster than Python... or more.
Perl was the first modern scripting language I learned, when we converted all our business stuff to Linux. I was highly impressed with its handling of regexes, which I had not come across before with my code in BASIC running on DOS. However, I eventually found that my terse Perl scripts for munching text data tended to become incomprehensible gibberish, when I wanted to work on them later. There are just too many ways to do things in Perl, and you are never entirely sure what the interpreter will do when pr
Re: (Score:2, Insightful)
Unlike Python, Perl is reasonably performant. It's usually 4x faster than Python... or more.
Perl is a write-only language.
Perl is a dead language with less than 1% market share [statisticstimes.com].
The war between Perl and Python is over and Python is the clear winner.
If performance is important to you, you are better off using Julia, which has much better performance than Perl and python, it is very expressive, and it has a bigger market share than Perl at this point.
Re: (Score:2)
Most people grow out of childish habits.
Program control by indentation is one of those. It was literally a major advance in computing when we got rid of that. I've literally had Python code samples flattened when copying and pasting them from a browser, rendering them worthless. I don't have that problem with perl.
Another childish habit is when you assume that because you write obfuscated perl with no comments, that everyone does that. I have never had the slightest problem maintaining any of my perl scripts because I never try to be cleverer tha
Crooked (Score:2, Interesting)
It's a horribly crooked language.
Rather than trying to mend the horrible mess, would it not be better to just abandon it?
Re: (Score:2)
Lots of people seem to like it, so there is a market for it. Me, I've never warmed up to Python. Its ways and my thinking are just not compatible.
Re: (Score:2)
Abandon it for what though?
There's C# and .NET Core, I bet you love those.
There are dozens of competitors (Score:2)
Abandon it for what though?
There's C# and .NET Core, I bet you love those.
I'm personally a JVM guy, so Kotlin and Scala come to mind if you don't like Java. Groovy was great, but the industry seems to have moved on from it.
However, there's Go, Dart, Swift, many others....and honestly, while I haven't worked in C# much and avoid MS products, it does seem like a pretty good language and most people I know who gave it an open minded try said it was great. I've seen greater success in C# shops than any Python shop I've ever heard of.
Re: (Score:2)
Dude, that's so cruel. Think of the abandonment trauma, the loneliness.
Better to give it an overdose of sleeping pills.
Surely you mean less slow? (Score:3)
Especially after a meal (Score:5, Funny)
Pythons can spend about 3 weeks digesting a meal. During this time they are extremely slow, in fact they are not moving at all.
On the other hand, Python gives the programmers quite a bit of freedom on how to attempt to solve a problem, so Python programmers do not feel constricted.
Misguided (Score:4, Insightful)
Don't break compatibility (Score:2)
Do whatever, but don't fuck off compatibility like the whole Python 2.7 to 3.0 thing. My God did that suck. Yes it was mostly print statements, but jeeze it caused 2.7 to stick around forever.
Re: (Score:2)
Oh, it went much deeper than print statements. Print statements were perhaps the tamest of all the changes.
The most widespread impact was the whole string/bytes junk, where they made unicode the new str, without recognizing the old type name (unicode) but at least catering to the prefix (u). More frustrating is that some libraries decided to require str and others bytes that formerly worked together. Sometimes in odd ways. For example, base64 encoding requires the input be binary 'bytes', despite the whole poi
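The bytes/str split is easy to demonstrate with the stdlib's base64 module, where encoding insists on bytes:

```python
import base64

# b64encode requires bytes; encode the str first.
encoded = base64.b64encode("hello".encode("utf-8"))
print(encoded)                  # b'aGVsbG8='
print(encoded.decode("ascii"))  # 'aGVsbG8=' as a str again

# Passing a str directly raises TypeError.
try:
    base64.b64encode("hello")
except TypeError as e:
    print("str rejected:", e)
```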
GIL (Score:2)
Re: (Score:3, Interesting)
I've been honestly impressed with how badly simple Ruby code performed, and blown away by large Python application stacks needing 30GB of RAM and 5 minutes just to start.
Sure, they're interpreted languages, and they're not meant to perform at native levels... but fuck, do they really need to give me traumatic flashbacks to Windows booting on my fucking Pentium 90?
Re:Not just about speed either (Score:5, Informative)
> with how bad simple ruby code performed, and blown away large Python application stacks needing 30GB
These guys are keeping a small army of Perl coders going. It's ludicrous how mod_perl still blazes in 2022.
Re: (Score:3)
Perl is still the glue that keeps the internet running.
Re: (Score:2)
If it used 30 GB of RAM, that most likely only meant that the process space was:
a) that big
b) garbage collection was never triggered
So? What is your damn problem with that? Why do you have cheap RAM if the process is not allowed to use it?
Re: (Score:2)
What is your damn problem with that? Why do you have cheap RAM if the process is not allowed to use it?
If it needs it, by all means, use it - but using it for the sake of using it, and ignoring the need to play nicely with the rest of the system/other tasks that might run/tasks that ARE running as well... that's not smart, that's plain stupid (and dare I question if people who think that is OK have studied computer architecture and/or computer operating system design).
Re: (Score:2)
That is not how RAM (and the rest of the system) works.
And most certainly it is not the problem of the language used but of the algorithm.
Re: (Score:2)
I do not have 30 GiB of working memory, do you? I do not think that's common on home machines.
I have no idea why people often say that working memory is cheap; the last time I bought 16 GiB more it set me back 120 euro, if memory serves.
Re: (Score:2)
The last time I bought 16 GiB more it set me back 120 euro, if memory serves.
That is not much.
I have no idea why people often say that working memory is cheap,
Because on the Cyber 205 I started programming FORTRAN on in 1987, 16 GB would have cost the GDP of a sizeable country.
Re: (Score:2)
Ah, I see, so those are the prices people mean when they say memory is “cheap”.
Because on the Cyber 205 I started programming FORTRAN on in 1987, 16 GB would have cost the GDP of a sizeable country.
Back then no program would take up 30 GiB.
I'm not even sure modules exist that would allow many notebooks to grow beyond 32 GiB, and this is just one piece of software. Obtaining a machine that can comfortably support software that takes 30 GiB of memory is not, by any notion, cheap.
Re: (Score:2)
Back then no program would take up 30 GiB.
Because you did not have that much RAM.
Tautology; I suggest googling the word.
Back then people would have been happy to write a program that could use so much RAM :P
As I mentioned before, your program most likely did not use 30 GB of RAM (btw: it is GB, not GiB). Perhaps I should have explained it: just because 30 GB is listed in the task display does not mean the program uses that much. If it was an interpreted program in a kind of VM, the VM did an sbrk() system call to grow its address space.
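A minimal sketch of that distinction (assumptions: Linux only, field names as exposed by /proc/self/status): the size a task display typically reports is the virtual size (VmSize), which can dwarf the resident set (VmRSS), i.e. the memory actually backed by physical RAM.

```python
# Linux-only sketch: compare a process's virtual size (VmSize) with its
# resident set (VmRSS). Task displays often show the former, which can be
# far larger than the RAM the process actually occupies.
def proc_mem_kb(pid="self"):
    sizes = {}
    with open(f"/proc/{pid}/status") as f:
        for line in f:
            if line.startswith(("VmSize:", "VmRSS:")):
                key, value = line.split(":", 1)
                sizes[key] = int(value.split()[0])  # values are reported in kB
    return sizes

if __name__ == "__main__":
    mem = proc_mem_kb()
    print(f"virtual: {mem['VmSize']} kB, resident: {mem['VmRSS']} kB")
```

On any running process the virtual size is at least as large as the resident set, and for interpreters that reserve address space up front the gap can be enormous.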
Re: (Score:2)
So, what's the problem with that? It's obscene for the job being done.
It means you must dedicate a significant chunk of resources to something that should not need it.
Equivalent software tools that are written in PHP are snappy and instant, and more importantly have a working set of ~2 GB. This isn't me saying PHP is good, by any means, and $DEITY knows bad PHP can be written too. But the problem with Python is that all of you only know how to write bad code
Re: (Score:2)
So, what's the problem with that? It's obscene for the job being done.
You do not know that without measuring it.
It makes a difference whether I write a C++ program that slurps in two files line by line and compares them line by line, or a Python program that slurps in the same two files at once and then compares them line by line.
Or I just do the opposite.
If you have a program in language A that uses a lot more RAM than the functionally same program in language B, then it is most certainly the algorithm and not the language that changes the runtime behaviour.
Re: (Score:2)
You do not know that without measuring it.
I mentioned analogous software in PHP that doesn't require that kind of memory footprint.
It makes a difference whether I write a C++ program that slurps in two files line by line and compares them line by line, or a Python program that slurps in the same two files at once and then compares them line by line.
That's an example of a program that is nearly entirely syscall-bound, which is a perfect job for Python, because the interpreter isn't the bottleneck.
If you have a program in language A that uses a lot more RAM than the functionally same program in language B: then it is most certainly the algorithm and not the language that changes runtime behaviour.
That's simply untrue.
Objects in Python have a larger memory footprint than in any other language known to man, period.
A standard OO implementation of a large collection of objects representing, for example, monitoring points on a network, can take obscene amounts of RAM where it wouldn't
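A hedged illustration of that claim (class names are made up; exact numbers vary by CPython version and platform): every ordinary instance drags along its own attribute dict, which `__slots__` eliminates, and `tracemalloc` makes the difference visible across a large collection.

```python
import tracemalloc

class Monitor:                 # ordinary class: each instance carries a __dict__
    def __init__(self, host, port, status):
        self.host, self.port, self.status = host, port, status

class SlimMonitor:             # __slots__: fixed attribute layout, no per-instance dict
    __slots__ = ("host", "port", "status")
    def __init__(self, host, port, status):
        self.host, self.port, self.status = host, port, status

def footprint(cls, n=100_000):
    # Total bytes allocated while building n instances, as seen by tracemalloc.
    tracemalloc.start()
    items = [cls("10.0.0.%d" % (i % 255), 8080, "up") for i in range(n)]
    size, _peak = tracemalloc.get_traced_memory()
    tracemalloc.stop()
    del items
    return size

if __name__ == "__main__":
    print(footprint(Monitor), footprint(SlimMonitor))
```

The slotted version allocates noticeably less for the same data; whether that gap counts as "obscene" is what the measurement settles.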
Re: (Score:2)
s/CPython/PyPy/g;
Re: (Score:2)
Any example of that?
Sorry, it makes no sense.
What? Of course it does.
You're fluent in C, so imagine we're writing an object system for a language.
A lean object system may contain nothing but a list of statically compiled pointers to methods (whether resolved at compile time or dispatched virtually).
A complex object system may include a hash table to dynamically store a symbol table for the object. That hash can be a heavy one (generally faster; allocate more buckets) or a light one (fewer buckets, but slower due to list traversal).
I just did some googling [pythonspeed.com]
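Python is the hash-table variant of that trade-off: an ordinary instance carries a per-instance symbol table (`__dict__`) that can grow at runtime, while `__slots__` opts into the lean, fixed layout. A small sketch (class names are mine):

```python
class Dynamic:
    pass                    # ordinary class: attributes live in a per-instance dict

class Lean:
    __slots__ = ("x",)      # fixed layout: only 'x' has a reserved slot

d = Dynamic()
d.anything = 42             # a new symbol is simply added to the instance's hash table
print(d.__dict__)           # the symbol table is an ordinary dict

l = Lean()
l.x = 1                     # fine: 'x' has a slot
try:
    l.y = 2                 # no hash table to grow, so this is rejected
except AttributeError:
    print("no slot for 'y'")
```

The slotted class trades the dynamic symbol table for a smaller, faster fixed layout, exactly the lean end of the design space described above.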
Re: (Score:2)
Sadly, several academic HPC clusters are suffering reduced throughput in terms of jobs completed after allowing Python, because too many Pytards can't effectively make use of all the performance enabled by InfiniBand etc.
Re: (Score:3)
There is certainly some horrible code I have run into which is slow as hell. My program spends about 99% of its time in C++ libraries, for long stretches at a time. Writing all the code in C++ would not make it faster, but it would make it much harder to develop.
However, I have also worked with people that wrote C and C++ code with horrible scaling problems on the clusters. You won't believe how bad the MPI and OpenMP code I have run into is.
Mostly what I see is people being pushed to get their work done right
Re: (Score:2)
Scaling is indeed a separate problem from single-threaded performance. I have little experience with OpenMP, but I have an asston of experience with OpenCL with C shuttling data between work kernels, and I've spent a lot of time tuning the performance of that. So I get it. It takes work. I moved away from pyopencl because it was wildly too slow, by integer multiples, but it was excellent for the
Re: (Score:2)
Re: (Score:2)
Python bindings to Fortran are what gets used, which is entirely different.
The bottleneck in quite a bit of software is not massive vectorizable operations on arrays but internal logic, for which Python of course cannot offer an easy solution.
Re: (Score:2)
Higher level languages that compile to a lower level form CAN be as energy-efficient as the low level language, depending on how good the compiler is. (This is because it is - in principle - easier to write high level abstractions, so the energy cost of compiling a few times to an intermediate form and then compiling that in turn is lower than the cost of repeatedly trying to develop in the lower level form directly.)
But it requires a truly intelligent compiler that can convert the code into something that
Re: (Score:2)
I think my point is, how complex or removed from the computer architecture your runtime is isn't really an excuse for a terribly fucking performing runtime/interpreter.
I think that people who get defensive over criticisms of Python's world-leading bad performance are just making it harder to get the problem fixed.
Re: (Score:2)
Agreed it's not an excuse for terrible performance. A good scripting language can always get pretty decent performance, which is why the MICE videoconferencing suite was largely written in Tcl/Tk.
Re: (Score:2)
Re: (Score:2)
Add parentheses, eliminate the indent requirements, add some native syntax for regex.
And THEN make it take less than geologic time to do anything.
Re: (Score:2)