Master_Aricitic has written:
Some languages are interpreted; these are also commonly called scripting languages. JavaScript is an (obvious) example; Delphi, Perl, Python and Lisp are other examples. Interpreted languages run slower than near-native and certainly slower than native languages because they have to be parsed before they can be run. Similar to Java, interpreted languages use something like a virtual machine to run, as they have to be interpreted and converted into machine code to be used. Unlike Java, they cannot be and are not directly converted into machine code (recent versions of Java will actually automatically compile into machine code, rather than run within the virtual machine). This is true of Lisp, though I am unsure of the statement's validity for Python, Perl and Delphi (as I haven't studied those languages).
That's actually not quite the reason interpreted languages are slow. True, they have to be parsed from source, but that is a one-time overhead and is usually relatively insignificant. Modern interpreters usually use several "stacks" to keep track of the program space. Java, for example, uses an indexable heap-like structure that acts nominally as a stack, holding either (byte)strings (all object signatures and references are serialized as strings, which is a rather clumsy way of doing it, to be perfectly honest) or bitstrings of varying sizes for large integers and references. Similarly, Python dynamically allocates blocks of memory with the heap at the bottom and the stack on top, the latter containing references to the absolute locations of Python objects inside the CPython interpreter (a non-primitive object occupies quite a lot of memory, and a single lookup can amount to hundreds of reference lookups, which makes it comparatively slower than both Java's and Lua's implementations). Because a program running in a managed environment requires many services from its host (GC, dynamic memory, Java's thread scheduler and Python's GIL come to mind immediately), a significant amount of overhead accompanies each cycle of interpretation, and that is why interpreted languages are usually slower than native code.
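To make that per-cycle overhead concrete, here is a toy stack-based bytecode interpreter (the opcodes and names are my own invention, not any real VM): every single instruction pays for a fetch, a decode, and a dispatch branch before any useful arithmetic happens, and that fixed cost is exactly the kind of overhead a managed host adds to each cycle.

```python
# A toy stack-based bytecode interpreter (hypothetical opcodes, not a real VM).
# Every cycle pays for fetch, decode, and dispatch on top of the actual work.

PUSH, ADD, MUL, HALT = range(4)

def run(bytecode):
    stack = []   # the interpreter's operand stack, itself a heap-allocated object
    pc = 0       # program counter
    while True:
        op = bytecode[pc]                    # fetch
        if op == PUSH:                       # decode + dispatch
            stack.append(bytecode[pc + 1])
            pc += 2
        elif op == ADD:
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
            pc += 1
        elif op == MUL:
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
            pc += 1
        elif op == HALT:
            return stack.pop()

# (2 + 3) * 4
program = [PUSH, 2, PUSH, 3, ADD, PUSH, 4, MUL, HALT]
print(run(program))  # 20
```

A native compiler would reduce that whole program to a single constant; the interpreter re-pays the dispatch cost on every instruction, every run.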
Now, since interpreted source is usually represented as a tree, this structure allows for inline optimization of the program's logic without explicit instructions from the programmer (GCC at its default optimization level will usually strip out unreachable code, remove empty loops, optimize arithmetic, etc.). At the interpreted level, since the interpreter is itself native code, this cannot be done (except for bytecode optimization). A native application, however, can emit machine code and inject it into its own heap, so interpreters can inline some instructions as machine code. But due to the complexity of object orientation, this is only done on primitive types (Java's JIT cannot optimize reference manipulations save for Integer, Float, and the other wrapper types that autobox to primitives). I believe this is what you meant by Java being capable of producing machine code. This is not exclusive to Java, however; other managed environments have also been known to JIT part of their bytecode into machine code (LLVM is one of the most sophisticated platforms out there; CPython also does a little if you ask it to, as do LuaJIT, a few JavaScript engines, and MS's CLR, which popularized the idea at the beginning of the millennium).
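You can actually watch one of these tree-level optimizations happen in CPython (a small illustration, using only the standard `compile` built-in and `dis` module): the compiler folds constant arithmetic while lowering the source tree to bytecode, so the interpreter never evaluates `2 * 3` at run time.

```python
import dis

# CPython folds constant expressions at compile time, so the literal 6
# lands directly in the code object's constants tuple.
code = compile("2 * 3", "<example>", "eval")
print(code.co_consts)  # the folded constant 6 appears here
dis.dis(code)          # the disassembly loads/returns the constant; no multiply
```

This is bytecode optimization in the sense mentioned above: the structure of the parse tree makes the transformation safe without any hint from the programmer.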
PS: Java's VM is an interpreter in the full sense of the word. If you really want an easy way to learn the limitations of Java, recode a non-optimizing version of the JVM in Java itself. It's actually not as difficult as you'd think, since the entire JVM specification is open.
And while we're on the topic of Java, it's tradition for developers to make fun of the languages they use, so here's my go at it.
Often in computational analysis, we encounter rate equations for which there are no easy closed-form solutions. The traditional way of finding unique solutions to such DEs is Euler's method of successive approximation, which can be implemented like so:
def euler(dy, yn, tmax, h, tn=0):
	yn += h*dy(tn, yn)
	return yn if tn >= tmax else euler(dy, yn, tmax, h, tn+h)

print("Approximation for y' = y (e^t): %s" % euler(lambda t, y: y, 1., 2., 0.1))
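As a quick sanity check (an iterative variant with an integer step count, `euler_iter` being my own helper name, to sidestep the floating-point drift you get from accumulating `tn += 0.1`): with h = 0.1 the recursion amounts to multiplying by 1.1 twenty times, which lands near e^2 ≈ 7.389 but noticeably under it, and shrinking h closes the gap.

```python
import math

def euler_iter(dy, y0, tmax, h):
    # An integer step count avoids the `tn >= tmax` comparison drifting
    # when 0.1 is repeatedly accumulated in floating point.
    y = y0
    for n in range(round(tmax / h)):
        y += h * dy(n * h, y)
    return y

coarse = euler_iter(lambda t, y: y, 1.0, 2.0, 0.1)    # (1.1)**20, about 6.73
fine = euler_iter(lambda t, y: y, 1.0, 2.0, 0.001)    # much closer to e**2
print(coarse, fine, math.exp(2.0))
```

Euler's method has global error proportional to h, which is why the h = 0.001 run tracks the true exponential far more closely than h = 0.1.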
But coming from Java, where we had to use an ugly workaround of instantiating anonymous interfaces (hell, even C# had delegate methods...):
public class EulerSolve {
	public interface DifferentialEquation{
		double function(double t, double y);
	}
	
	public static double approximate(DifferentialEquation dy, double yn, double tn, double tmax, double h){
		yn = yn+h*dy.function(tn,yn);
		if (tn < tmax){
			return approximate(dy, yn, tn+h, tmax, h);
		}else{
			return yn;
		}
	}
	
	public static void main(String[] args){
		
		double eulerApproximation = EulerSolve.approximate(new EulerSolve.DifferentialEquation() {
			@Override
			public double function(double t, double y) {
				return y;
			}
		}, 1., 0., 2.0, 0.1);
		
		System.out.printf("Approximation for y' = y (e^t): %f%n", eulerApproximation);
	}
}
I had always thought that "OO Interfaces" was the only right way:
class EulerSolve(object):
	class DifferentialEquation(object):
		def function(self, t, y): pass;
	@classmethod
	def approximate(cls, dy, yn, tn, tmax, h):
		yn += h*dy.function(tn, yn);
		if (tn < tmax):
			return cls.approximate(dy, yn, tn+h, tmax, h);
		else:
			return yn;
class dyEqualsY(EulerSolve.DifferentialEquation):
	def function(self, t, y): return y;
def main():
	diffEq = dyEqualsY();
	approximation = EulerSolve.approximate(diffEq, 1., 0, 2., 0.1);
	
	print("Approximation for y' = y (e^t): %f" % approximation);
if __name__ == "__main__":
	main()
I was satisfied with my handiwork (though a little bitter about the lack of PythonDoc support), so I just stopped there, none the wiser about how to program in Python.
A friend of mine took a look at the code and stared at me as if I were crazy. I asked him what was up, but he just kept staring in disbelief. He then wrote two brief lines of code so elegant and concise that my mind was completely blown to pieces...
What did he write?
easy_install euler
# Lee you're a f*cking moron.
import euler