Java and C# are not truly interpreted languages. They use a virtual machine to interpret byte code which is closer to an emulator than to an actual interpreter.
You make it sound like this aspect sets them apart from Python.
Have a link. Python creates these little nasty .pyc files for a reason.
It is also not very suited for large projects as it lacks programming time type safety which can easily result in annoying and hard to debug type mismatch errors.
Well, that's what one calls 'dynamic typing', though the rest of your sentence is filled with arbitrary prejudice. There's nothing hard to debug about a language
this verbose with its exceptions, stack traces and whatnot. Python does check its types (just not statically, of course), and the whole thing blows up in your face the moment a type mismatches. There are close to no implicit conversions, either.
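A minimal sketch of what that "blowing up" looks like (a made-up snippet, not from the quoted post):

```python
# Python refuses implicit conversions between unrelated types and fails loudly.
try:
    result = "altitude: " + 42  # str + int: no silent coercion
except TypeError as exc:
    print(type(exc).__name__)  # TypeError

# The mismatch is reported immediately, with a full traceback in real code,
# rather than silently producing a corrupted value.
```

That's the trade-off in a nutshell: no compile-time check, but a clean, immediate runtime failure instead of garbage data.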
Regarding large projects, you'd better go and tell Google, Django, etc. about it - they might be making a huge mistake without knowing it.
Every single thing you do performs a hash table name space lookup and type checks making referencing even the most simple variable a time consuming process for the processor.
[...]
Tasks that Python should never be used for...
[...] The bloat and execution speed of Python [...] Everything from the interpreted nature, sparse data density to memory allocation abuse and constant recycling makes it bad for this.
Oh all this bloat and slow and inefficient and time consuming and... please allow me to stop here.
Yes, there are indeed things that take more time in Python than in your <insert favorite random language> language, but you can avoid them if you know what you are doing (or what
you are doing it with). Just a random example:
https://wiki.python.org/moin/PythonSpeed said:
Membership testing with sets and dictionaries is much faster, O(1), than searching sequences, O(n). When testing "a in b", b should be a set or dictionary instead of a list or tuple.
...and in cases where you don't know what you're doing, speed shouldn't be your top priority.
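The wiki's point is easy to demonstrate; here's a rough sketch (the sizes and repeat counts are arbitrary choices of mine):

```python
import timeit

data = list(range(100_000))
as_set = set(data)

needle = 99_999  # worst case for the list: it gets scanned front to back

# "needle in data" is O(n) on the list, O(1) on the set.
t_list = timeit.timeit(lambda: needle in data, number=200)
t_set = timeit.timeit(lambda: needle in as_set, number=200)

print(f"list: {t_list:.4f}s  set: {t_set:.6f}s")
```

On any machine the set lookup wins by orders of magnitude - same language, same interpreter, just knowing which container to reach for.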
Python packages are often just thin wrappers around a C library, achieving pretty impressive speed.
Using Python + NumPy for huge data processing and numerical calculations is quite common - and not just because people like indentation-based languages. The
performance is significantly better than e.g. Octave's and on par with Matlab's - languages specifically built for such tasks. Of course they all turn pale compared to C++, but that's not the point when comparing against an interpreted language. And for numerics C++ is rarely used anyway; people who need excellent performance turn straight to OpenCL/CUDA (at least the people I know do).
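A hypothetical taste of why that works (the mass value is made up): the loop runs in compiled C inside NumPy, not one bytecode-interpreted iteration per element.

```python
import numpy as np

velocities = np.linspace(0.0, 100.0, 1_000_000)

# One vectorized expression over a million elements; the per-element
# arithmetic happens in NumPy's compiled inner loop, not in the interpreter.
kinetic = 0.5 * 2.5 * velocities ** 2  # 0.5 * m * v^2 with an assumed m = 2.5

print(kinetic.shape)  # (1000000,)
```

Write the same thing as a pure-Python for loop and you pay the interpreter's per-iteration overhead a million times over.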
On a side note: you can get it (way) faster by using e.g. PyPy (which is to Python what LuaJIT is to Lua), or go really C-like (and not only performance-wise) with Cython.
Tasks where Python is good to use for...
Game scripting language,
Actually, that's Lua. The things Python does well as a game scripting language are done better by Lua.
Passing your rocket thrust calculation function a list of velocities instead of a tuple of velocities will not end well and will likely end in an explosion.
Yeah, such a horrible thing would result in... well, just an exception. Compare that to a language like Ada, which incorporates an incredibly strong type system, generally considered very safe and mighty, together with a compiler that hunts down even the most difficult statically inferable variable types - and goes as far as injecting runtime checks where a proper static check wasn't possible...
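To put numbers on the rocket example (a made-up function, not anyone's real code): a list instead of a tuple is a non-event in Python, and an actual type mismatch raises a clean exception instead of exploding silently.

```python
def total_thrust(velocities):
    # Duck typing: any iterable of numbers works, list or tuple alike.
    return sum(v * 2.5 for v in velocities)  # hypothetical thrust = m * v, m = 2.5

print(total_thrust((1.0, 2.0, 3.0)))  # 15.0 - tuple works
print(total_thrust([1.0, 2.0, 3.0]))  # 15.0 - list works just the same

try:
    total_thrust(["fast", "faster"])  # an actual type mismatch...
except TypeError:
    print("caught")  # ...raises immediately instead of computing garbage
```

So the list-vs-tuple scenario isn't where Python hurts you; genuinely wrong types fail fast and loud.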
https://en.wikipedia.org/wiki/Cluster_%28spacecraft%29 said:
Specifically, the Ariane 5's greater horizontal acceleration caused the computers in both the back-up and primary platforms to crash and emit diagnostic data misinterpreted by the autopilot as spurious position and velocity data.
...uhh, well, but at least the types were fine? :/
but at the cost of it becoming platform specific.
What, why? You just package your existing - platform-independent - code into a bundle designated for one specific environment (Windows in the case of an exe, OS X in the case of an app). Your code stays the very same; only your 'binary' is platform-dependent (and I guess nobody should be surprised by an OS-dependent binary?).
And on a final note: this is a thread about Python - an interpreted language - so imo it makes little sense to moan about its performance being inferior to compiled languages and to blame the type system for it. Most of these traits, especially the latter, are the reasons why so many people actually like the language. I don't complain about the lack of dynamism in C's type system in a C thread and try to use it as an objective argument against the language either. It's an inherent part of the language's design.
inb4: no Python fanboy here, I'm more on the Ada side of the force.