
Python

Level 16
Joined
Aug 7, 2009
Messages
1,403
I recently got into Python programming (thanks to my university and my teacher/friends: the course is optional, but other students recommended it and the teacher invited me), and I must admit, I really like it. It's incredibly simple and easy to use, yet very versatile and powerful. I'm still picking it up, but I like it more and more as I learn new things. (Now that I'm forced to learn interpreted languages [Java & C#] the way I look at them has completely changed, so it's also thanks to this).

If you're not familiar with it: http://python.org/

There's a crapton of free libraries available for it; you can pretty much find something for everything you could ever need. It's also getting more and more popular, and it's a multi-platform (scripting) language (it ships natively with Ubuntu, and they keep rewriting a lot of their own code in Python), which, in my opinion, is a huge plus.

So I thought I'd open this thread, since there might be other people around here who are familiar with Python and could share their thoughts with us, and also to spark interest in those who haven't heard of the language before.
 

peq

Level 6
Joined
May 13, 2007
Messages
171
I like Python because it has a nice standard library ("batteries included"), a clean syntax, and strong typing. For me it is very nice for writing small scripts. However, I prefer statically typed languages for bigger projects.
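To make the "strong typing" point concrete, here is a minimal sketch (standard CPython behaviour, nothing project-specific): unrelated types are never coerced implicitly, even though the checks only happen at runtime.

```python
# Python is strongly but dynamically typed: mismatches are refused,
# but only when the offending line actually runs.
print(1 + 2.5)          # 3.5 -- numeric types do mix
print("1" + str(1))     # "11" -- explicit conversion is required

try:
    "1" + 1             # no implicit str/int concatenation
except TypeError as err:
    print("refused:", err)
```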

I believe writing "indentation based scopes" summarizes what I think about Python.

So do you think it is good or bad? If bad: why?

The only disadvantage I see is with copy & paste, where the indentation can get messed up.

The language trying to look as low level as it can does not help either.

I think the language looks very high-level for an imperative language. (For me "low level" means close to the hardware with few abstractions; are we talking about the same thing?)
 
Level 2
Joined
Aug 24, 2013
Messages
3
I believe writing "indentation based scopes" summarizes what I think about Python.

Are you afraid of your code sticking to the right side of your editor? If so, that's not Python's fault.
https://www.kernel.org/doc/Documentation/CodingStyle said:
The answer to that is that if you need more than 3 levels of indentation, you're screwed anyway, and should fix your program.
While I wouldn't take it that seriously, needing more than 3 or 4 levels is (and should be) rare.

Besides, Python is a high-level programming language aiming for readable code while still providing pretty good performance. Personally, I like it for the possibility of writing functional-style code (without being forced to go fully functional).
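As a rough illustration of that functional flavour (just a sketch, not a style recommendation), the same transformation can be written with a comprehension, with map/filter, or as a fold:

```python
from functools import reduce

numbers = range(10)

# comprehension style
squares_of_evens = [n * n for n in numbers if n % 2 == 0]

# map/filter style
same_thing = list(map(lambda n: n * n, filter(lambda n: n % 2 == 0, numbers)))

# a fold, for when you really want one
total = reduce(lambda acc, n: acc + n, squares_of_evens, 0)

print(squares_of_evens, same_thing, total)  # [0, 4, 16, 36, 64] [0, 4, 16, 36, 64] 120
```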
 
Python is my favorite. It is just plain clean, and it is very extensible given all the support it's gotten (e.g. Cython). I'm thinking of making a StormLib wrapper in Python just because I love the language so much.
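For anyone curious what the skeleton of such a wrapper looks like, here is a minimal ctypes sketch. It calls libc's strlen as a stand-in, since the actual StormLib exports and signatures would have to be taken from StormLib.h; a real wrapper would follow the same load/declare/call pattern.

```python
import ctypes
import ctypes.util

# Load the C runtime (falling back to msvcrt on Windows).
libc = ctypes.CDLL(ctypes.util.find_library("c") or "msvcrt")

# Declare the prototype so ctypes can convert and check arguments.
libc.strlen.argtypes = [ctypes.c_char_p]
libc.strlen.restype = ctypes.c_size_t

print(libc.strlen(b"StormLib"))  # 8

# A StormLib wrapper would do the same with ctypes.CDLL("StormLib.dll")
# plus prototypes for the functions it exposes.
```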

And another thing, py2exe and py2app are not too shabby if you are looking to create native binaries.

I haven't really looked into the GUI libraries though (aside from Tkinter), let me know if you find any that you like. :)
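In case it helps anyone comparing GUI options, a minimal Tkinter window really is just a few lines (Python 3 spelling shown; the module is capitalised Tkinter on Python 2):

```python
import tkinter as tk   # "import Tkinter as tk" on Python 2

root = tk.Tk()
root.title("Hello from Tkinter")

label = tk.Label(root, text="Hello, Hive!")
label.pack(padx=20, pady=10)

button = tk.Button(root, text="Quit", command=root.destroy)
button.pack(pady=(0, 10))

root.mainloop()
```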
 

Dr Super Good

Spell Reviewer
Level 64
Joined
Jan 18, 2005
Messages
27,199
(Now that I'm forced to learn interpreted languages [Java & C#] the way I look at them has completely changed, so it's also thanks to this).
Java and C# are not truly interpreted languages. They use a virtual machine to interpret byte code, which is closer to an emulator than to an actual interpreter. Even that interpretation is not used much, since Java compiles the byte code into native machine code at runtime (just-in-time compilation) to allow for faster, near-compiled execution speed.

So I thought I'd open this thread, since there might be other people around here who are familiar with Python and could share their thoughts with us, and also to spark interest in those who haven't heard of the language before.
Python is great as a scripting language, but that's it. Every single thing you do performs a hash-table namespace lookup and type checks, making referencing even the simplest variable a time-consuming process for the processor. It is also not well suited to large projects, as it lacks programming-time type safety, which can easily result in annoying and hard-to-debug type mismatch errors.

Although optimizations can be added to the Python language (and in fact many have been), the underlying lack of type safety still limits the maximum performance obtainable from the language. This is why compile-time (programming-time) type checking is so dominant in performance-oriented programming: well-defined types allow much more efficient assembly or interpreters to be produced.
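The name-lookup overhead is easy to observe with timeit; caching an attribute in a local variable is a classic Python micro-optimization precisely because locals are resolved by index rather than by dictionary lookup. A rough sketch (absolute timings vary by machine and interpreter):

```python
import timeit

setup = "import math; data = list(range(10000))"

# math.sqrt repeats an attribute (dict) lookup on every iteration...
attr_lookup = timeit.timeit(
    "for n in data: math.sqrt(n)", setup=setup, number=1000)

# ...while a local alias is resolved by index into the frame's locals.
local_alias = timeit.timeit(
    "sqrt = math.sqrt\nfor n in data: sqrt(n)", setup=setup, number=1000)

print("attribute lookup per call:", attr_lookup)
print("local alias:              ", local_alias)
```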

Tasks where Python is good to use:
- As a game scripting language: few calls are made per unit time, and they perform logic that is difficult to define at compile time, such as special environment behaviours or story progression.
- Build-time generation of special files based on values from other files. The execution speed of the make process does not really matter, and the flexibility offered by interpreted languages far outweighs the extra few seconds when compiling a project.
- Simple tasks where performance matters little, such as simple button commands or customizable button behaviour.

Tasks that Python should never be used for:
- Game engines. The bloat and execution speed of Python are completely unsuited to game engines. Everything from the interpreted nature and sparse data density to the memory allocation abuse and constant recycling makes it bad for this.
- Real-time tasks. You cannot accurately predict timing behaviour in Python, which is important for real-time systems. Combined with all the unpredictable work Python runs in the background, it is not the kind of language you want to trust with controlling your car.
- Performance-intensive applications. You never want to use a language where shortening variable names counts as an optimization for anything computationally intensive.
- Applications that need high fault tolerance. Passing your rocket-thrust calculation function a list of velocities instead of a tuple of velocities will not end well and will likely end in an explosion. It is hard to accommodate every parameter of every logically possible type with well-defined behaviours.
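The usual defence is explicit runtime validation at every boundary, which is exactly the kind of overhead I mean; a rough sketch (thrust_from_velocities and its formula are invented purely for illustration):

```python
def thrust_from_velocities(velocities):
    """Toy example: reject anything that is not a tuple of numbers."""
    if not isinstance(velocities, tuple):
        raise TypeError("expected a tuple of velocities, got %s"
                        % type(velocities).__name__)
    if not all(isinstance(v, (int, float)) for v in velocities):
        raise TypeError("velocities must all be numbers")
    return sum(v * v for v in velocities) ** 0.5  # made-up formula

print(thrust_from_velocities((3.0, 4.0)))   # 5.0

try:
    thrust_from_velocities([3.0, 4.0])      # a list slips through until runtime
except TypeError as err:
    print("caught at runtime, not at compile time:", err)
```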


And another thing, py2exe and py2app are not too shabby if you are looking to create native binaries.
I am pretty sure all they do is bundle a native Python interpreter and your Python byte-code files together into a single executable package. As far as I know they still execute the code using the standard Python interpreter, so your code will still have pathetic performance. The only advantage is that it makes distribution more or less portable, since people no longer need the interpreter installed (which not everyone has), but at the cost of the package becoming platform specific.
 
Level 2
Joined
Aug 24, 2013
Messages
3
Java and C# are not truly interpreted languages. They use a virtual machine to interpret byte code, which is closer to an emulator than to an actual interpreter.
You make it sound like this sets them apart from Python. Have a link. Python creates those nasty little .pyc files for a reason.

It is also not well suited to large projects, as it lacks programming-time type safety, which can easily result in annoying and hard-to-debug type mismatch errors.
Well, that's what one calls 'dynamic typing', though the rest of your sentence is filled with arbitrary prejudice. There's nothing hard to debug about a language this verbose with its exceptions, stack traces and whatnot. Python does check its types (not statically, of course), and the whole thing blows up in your face in case of a mismatching type. There are close to no implicit conversions, either.

Regarding large projects, you'd better go and tell Google, Django, etc. about it; they might be making a huge mistake without knowing it.

Every single thing you do performs a hash-table namespace lookup and type checks, making referencing even the simplest variable a time-consuming process for the processor.
[...]
Tasks that Python should never be used for:
[...] The bloat and execution speed of Python [...] Everything from the interpreted nature and sparse data density to the memory allocation abuse and constant recycling makes it bad for this.
Oh, all this bloat and slowness and inefficiency and wasted time and... please allow me to stop here.
Yes, there are indeed things that take more time in Python than in your <insert favorite random language> language, but you can avoid them if you know what you are doing (or what you are doing it with). Just a random example:
https://wiki.python.org/moin/PythonSpeed said:
Membership testing with sets and dictionaries is much faster, O(1), than searching sequences, O(n). When testing "a in b", b should be a set or dictionary instead of a list or tuple.
...and in cases where you don't know what you're doing, speed shouldn't be your top priority.
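That particular difference is easy to measure yourself; a quick timeit sketch (absolute numbers depend on your machine):

```python
import timeit

setup = """
items = list(range(100000))
as_list = items
as_set = set(items)
needle = 99999          # worst case for the linear scan
"""

list_time = timeit.timeit("needle in as_list", setup=setup, number=1000)
set_time = timeit.timeit("needle in as_set", setup=setup, number=1000)

print("list membership, O(n):", list_time)
print("set membership,  O(1):", set_time)
```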
Python packages are often just thin wrappers around a C library, achieving pretty impressive speed.
Using Python + NumPy for huge data processing and numerical calculations is quite common, and not just because people like indentation-based languages. The performance is significantly better than e.g. Octave and on par with Matlab, languages specifically built for such tasks. Of course they all pale compared to C++, but that's not the point when comparing against an interpreted language. For numerics, C++ is rarely used anyway; people who need excellent performance go straight to OpenCL/CUDA (at least the people I know do).
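To give a feel for why NumPy closes most of that gap, here is a minimal sketch (assumes NumPy is installed; the point is where each version spends its time, not the exact numbers):

```python
import numpy as np

data = np.random.rand(1000000)

# Pure-Python loop: every single element goes through the interpreter.
slow = sum(x * x for x in data)

# Vectorised: one call, executed inside NumPy's compiled C code.
fast = float(np.dot(data, data))

print(np.isclose(slow, fast))   # same result, wildly different runtimes
```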

On a side note: you can make it (way) faster by using e.g. PyPy (which is to Python what LuaJIT is to Lua), or get really C-like (not only performance-wise) with Cython.

Tasks where Python is good to use:
As a game scripting language, [...]
Actually, that's Lua. The things Python does well as a game scripting language, Lua does better.

Passing your rocket-thrust calculation function a list of velocities instead of a tuple of velocities will not end well and will likely end in an explosion.
Yeah, such a horrible thing would result in... well, just an exception. As opposed to a language like Ada, which incorporates an incredibly strong, generally-considered-very-safe and mighty type system, in conjunction with a compiler that hunts down even the most difficult statically inferable variable types and actually goes as far as injecting runtime checks in cases where it fails to perform a proper static check...
https://en.wikipedia.org/wiki/Cluster_%28spacecraft%29 said:
Specifically, the Ariane 5's greater horizontal acceleration caused the computers in both the back-up and primary platforms to crash and emit diagnostic data misinterpreted by the autopilot as spurious position and velocity data.
...uhh, well, but at least the types were fine? :/

but at the cost of the package becoming platform specific.
What, why? You just package your existing (platform-independent) code into a bundle designated for one specific environment (Windows in the case of exe, OS X in the case of app). Your code stays the very same; only your 'binary' is platform dependent (and I guess nobody should be surprised by an OS-dependent binary?).


And on a final note: this is a thread about Python, an interpreted language; IMO it makes little sense to moan about its performance being inferior to compiled languages and to blame the type system for it. Most of these traits, especially the latter, are the reasons why so many people actually like the language. I don't complain about the lack of dynamics in C's type system in a C thread and try to use it as an objective argument against the language, either. It's an inherent fact/part of the language's design.

inb4: no Python fanboy here; I'm more on the Ada side of the Force.
 

Dr Super Good

Spell Reviewer
Level 64
Joined
Jan 18, 2005
Messages
27,199
Python creates those nasty little .pyc files for a reason.
Which is still byte code that has to be interpreted. Any kind of named lookup is still done using a hash table, which means every single function call is still a slow lookup process.

Standard Python byte-code files can be reverse engineered line for line to recreate their source Python file almost perfectly. They even keep comment lines inside them, at least in standard Python byte code compiled a few years ago.

In short, Python byte code is closer to an interpreter cache (avoiding having to re-parse the syntax) than to true byte code. You can still instruct the Python interpreter to parse and run Python source line by line without using a byte-code file at all. Java, on the other hand, only works with Java byte-code files; you cannot get it to interpret the original source line for line.
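The lookup issue is visible in the byte code itself; a small sketch with the standard dis module (the opcode names are CPython's and shift a little between versions):

```python
import dis

def f(x):
    y = x + 1           # locals: resolved with LOAD_FAST (an array index)
    return len(str(y))  # globals/builtins: resolved with LOAD_GLOBAL (dict lookups)

dis.dis(f)
# Typical CPython output shows LOAD_FAST for x and y,
# but LOAD_GLOBAL for len and str on every single call.
```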

There's nothing hard to debug about a language this verbose with its exceptions, stack traces and whatnot.
It is if you have >100,000 lines of code and suddenly get a type mismatch error. Yes, you get exceptions that tell you where the error occurred, but they tell you nothing about the function that actually caused it (which function put the wrong type into the list). And given how much small syntax details matter, you can easily add a wrapped value (a one-element tuple or list) by mistake when you meant to add the value itself.

Regarding large projects, you'd better go and tell Google, Django, etc. about it; they might be making a huge mistake without knowing it.
You would be fired from Google for even suggesting Python for many of the tasks they perform. It just will not cut it for processing millions of search queries every minute. Sure, there are tons of places where you can use Python at Google, but not for most of their stuff.

Using Python + NumPy for huge data processing and numerical calculations is quite common, and not just because people like indentation-based languages. The performance is significantly better than e.g. Octave and on par with Matlab, languages specifically built for such tasks. Of course they all pale compared to C++, but that's not the point when comparing against an interpreted language.
That is only the case because of what I mentioned... It works fine if the underlying code is fast and you just need something top-level to control it (which is what Matlab does). If each function takes 3 seconds to execute, then whether calling it takes 50 ns or 1 ns makes practically no difference. However, if you construct an entire program out of millions of 50 ns calls which each do very little, it makes a huge difference. Again, the core performance parts are written in other, compiled languages (like you said), with Python pulling it all together.

Yeah, such a horrible thing would result in... well, just an exception. As opposed to a language like Ada, which incorporates an incredibly strong, generally-considered-very-safe and mighty type system, in conjunction with a compiler that hunts down even the most difficult statically inferable variable types and actually goes as far as injecting runtime checks in cases where it fails to perform a proper static check...
Most of the errors I encountered when using Python were because I accidentally used the wrong type at some stage (such as not converting to a tuple, or passing a tuple instead of one of its elements). These errors can only be picked up efficiently at run time in Python, which is one reason why using it can be difficult at times. In a strictly typed language such as Java, most IDEs will flag a type mismatch as you type, and the compiler will refuse to build the program while any remain, so such errors are simply not possible in those languages.

The reason I raised this is that there was exactly such an occurrence in real life with a rocket. Sure, it did not involve Python, but it did involve this kind of stupid typing mistake, caused by people ignoring strict typing procedure. A dated rocket design was fitted with a new sensor which used higher-precision types. Instead of updating the rocket controller to support the higher-precision type, or writing a range-bound adapter, they decided to interpret the bits directly as a different type. This appeared to work until the sensor started outputting large numbers which lost all meaning to the controller. The result was that the rocket thought it was the wrong way round (when it was not) and so it crashed into the ground, luckily injuring no one.

Sure, such a situation cannot occur directly in Python due to how it handles numbers, but the broader problem of incorrect types being passed around and misinterpreted does. A simple syntax slip can mean that tuples are created, or not created, which is disastrous when they are later processed. Sure, testing should catch all such cases before a system goes live, but clearly that does not always happen, since a rocket blew up over a type conversion mistake.

...uhh, well, but at least the types were fine? :/
They were not, as they were passing off a larger type as a smaller type without an appropriate adapter (probably reinterpreting it by discarding the most significant bits). This produced an 'overflow' anomaly, with the result that the computer systems thought that, instead of going somewhere very fast, the rocket was suddenly going the opposite direction very fast. All that was needed was a proper adapter on the interface to keep the values in bounds, so that instead of overflowing or underflowing, the value would simply saturate at the maximum or minimum possible value for that type.

Since each type should have strict behavioural requirements, any deviation in those behaviours should go through some form of adapter that converts between the types. An example would be converting from an unsigned to a signed int, where a direct reinterpretation may not be desirable, since a very large unsigned value becomes negative; you may instead want to re-map the range so that the zero point of the signed type sits in the middle of the unsigned type's range:

0 -> 255
would become -128 -> 0 -> 127 (re-mapped range)
as opposed to 0 -> 127 ~> -128 -> -1 (direct re-interpret)
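A tiny sketch of the two conversions being described, for an 8-bit value (struct is only used here to force the raw re-interpretation):

```python
import struct

def reinterpret_u8_as_s8(value):
    """Keep the raw bits: 0..127 stay put, 128..255 wrap around to -128..-1."""
    return struct.unpack("b", struct.pack("B", value))[0]

def remap_u8_to_s8(value):
    """Re-map the range instead: 0..255 lands on -128..127 in order."""
    return value - 128

print(reinterpret_u8_as_s8(200))  # -56 (wraps, ordering is broken)
print(remap_u8_to_s8(200))        #  72 (ordering preserved)
```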

Anyway, I digress. The fact is that it is far easier to get a type mismatch at runtime in Python than in languages such as C or Java, which refuse to compile unless all types match to some degree. Sure, Python throws an exception which can be handled gracefully, but can every such error really be handled gracefully enough?

What, why? You just package your existing (platform-independent) code into a bundle designated for one specific environment (Windows in the case of exe, OS X in the case of app). Your code stays the very same; only your 'binary' is platform dependent (and I guess nobody should be surprised by an OS-dependent binary?).
That is what I meant... Instead of the distribution being portable, it becomes platform dependent. The advantage is that not everyone has Python installed, so running a platform-specific binary may be more convenient for them.
 
Level 2
Joined
Aug 24, 2013
Messages
3
Do you mind if I keep this short? Neither of us is going to change their point of view, so I'd rather not waste too much time writing walls of text for nothing. ;)

Which is still byte code that has to be interpreted.
But it is byte code run in a virtual machine. Though I fail to see the point here: once the file is parsed, it's... parsed. CPython is designed for an as-small-as-possible footprint and fast interpretation of files. If you don't like the downsides of this approach, just go with some other implementation.

Standard Python byte-code files can be reverse engineered line for line to recreate their source Python file almost perfectly.
Fortunately that's not the case for Java ...oh, wait? It is.

Java, on the other hand, only works with Java byte-code files; you cannot get it to interpret the original source line for line.
You make it sound like a disadvantage.

It is if you have >100,000 lines of code and suddenly get a type mismatch error. Yes, you get exceptions that tell you where the error occurred, but they tell you nothing about the function that actually caused it (which function put the wrong type into the list).
I don't know which implementation you're using, but mine does exactly what you want. And since Python 3 you no longer lose that information when exceptions occur while handling other exceptions (previously only the last one was kept).
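A quick sketch of that Python 3 behaviour: the original exception stays attached to the new one, and the printed traceback shows both.

```python
def load_config(path):
    try:
        return open(path).read()
    except IOError as err:
        # In Python 3 `err` stays attached as __context__; the traceback
        # prints the original IOError followed by the RuntimeError,
        # instead of only keeping the last exception as Python 2 did.
        raise RuntimeError("could not load configuration: %s" % path)

load_config("does-not-exist.cfg")   # prints both exceptions, chained
```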

You would be fired from Google for even suggesting Python for many of the tasks they perform.
Oh come on, try harder. Citation needed.

It just will not cut it for processing millions of search queries every minute. Sure, there are tons of places where you can use Python at Google, but not for most of their stuff.
I can't recall saying to use Python on one of the (probably) largest datasets in the world. There are domain-specific languages for such tasks; this is not about multi-purpose languages (and you are well aware of that).

That is only the case because of what I mentioned... It works fine if the underlying code is fast and you just need something top-level to control it (which is what Matlab does). If each function takes 3 seconds to execute, then whether calling it takes 50 ns or 1 ns makes practically no difference. However, if you construct an entire program out of millions of 50 ns calls which each do very little, it makes a huge difference. Again, the core performance parts are written in other, compiled languages (like you said), with Python pulling it all together.
What's your point? That's how all these languages work. They're built on top of existing, reliable (and preferably fast) parts, providing a convenient interface. I don't want you to reimplement GMP in Python; that's not what the language is intended for, and better results are achievable with languages that are intended for such tasks. But if it's about implementing something using GMP, Python would be one way to go (and I guess not the worst).

Sure, such a situation cannot occur directly in Python due to how it handles numbers, but the broader problem of incorrect types being passed around and misinterpreted does.
No, it does not. Use try blocks if you want to be safe.
May I ask what you're trying to achieve here? All I see is you arguing for static typing and railing against the ever-so-bad dynamic typing. This discussion has been held at great length all over the internet; there is no need for us to mirror it here (especially in a Python thread). Both approaches have their advantages and disadvantages. Both.

They were not, as they were passing off a larger type as a smaller type without an appropriate adapter (probably reinterpreting it by discarding the most significant bits). This produced an 'overflow' ...
They used Ada. Ada is highly sophisticated when it comes to such things (type equivalence via structural or name equivalence) and catches most of these errors at compile time. Yet they resorted to guarding the assignments by hand with if blocks for the greater good of faster execution speed, ultimately making a huge mistake. What does that tell us about a super-safe, statically typed language like Ada? Well, I guess using C would've resulted in far more crashed rockets...

That is what I meant... Instead of the distribution being portable, it becomes platform dependent. The advantage is that not everyone has Python installed, so running a platform-specific binary may be more convenient for them.
(Technically speaking, the distribution is still platform independent; the wrapped interpreter is not.)
 

Dr Super Good

Spell Reviewer
Level 64
Joined
Jan 18, 2005
Messages
27,199
Fortunately that's not the case for Java ...oh, wait? It is.
So Java also stores comment lines in its byte code?! I could have sworn it did not.

You make it sound like a disadvantage.
Performance-wise it is.

Oh come on, try harder. Citation needed.
And where is your citation that Google search is powered by Python?

I don't want you to reimplement GMP in Python; that's not what the language is intended for, and better results are achievable with languages that are intended for such tasks.
Which is exactly what I have been trying to say all along; you just seem to want to argue over it despite saying the same thing.

May I ask what you're trying to achieve here?
I could ask you the same thing. There was no reason for you to derail this thread like you have. I am guessing you get some kind of kick out of trying to contradict everything I say while still agreeing with 90% of it.

Well, I guess using C would've resulted in far more crashed rockets...
Considering when the first space rockets were made, this is hardly a valid argument. The ROM used to be hand-woven, so there were far more things that could fail. The point is that it is an example of how bypassing or ignoring strict type interactions leads to possible and unpredictable errors. The number of times I have seen Python programs crash due to type mismatches (including games like Civilization 4, which was scripted with it) is not to be joked about. Sure, you can handle exceptions, but often there is no reasonable way to handle such an exception, as there may be no way to recover in a meaningful way.

(Technically speaking, the distribution is still platform independent; the wrapped interpreter is not.)
The reason it is called a distribution is that it is the distributable: the program used by the end user. Since the Python script is wrapped inside a platform-specific executable, the distribution is platform specific, and there may be no simple way to tell that the executable even contains Python code without resorting to file content analysis (which the average user, who has no idea about computers, clearly will not do).

I do agree that the source code itself is still platform independent, due to its Python nature.
 