Saturday, 5 January 2013

Python Rocks - So what is Stackless Python?

Python may be one of the most widely learned and used languages today, but it was conceived in the late 1980s, when, unless you had a mainframe, you were almost certainly running your code on some sort of single-CPU computer.

For this reason the original implementation of Python was written with the understanding that it was perfectly sensible to use the same single execution stack that C used - after all, Python is written in C. Despite being on version 2.7/3.3 nowadays, standard Python is still written in C, still uses a single execution stack design, and is commonly known as CPython.

(Image: CPU Zilog Z8. Photo credit: Wikipedia)
The execution stack - or call stack, or just the stack - is like a big spike that you stick messages on in a last-in, first-out way. It's where low-level machine code subroutines stick the current code address before going away to do some jiggery pokery; when a subroutine finishes, it returns by pulling the last address off the stack, and execution continues from there. It was all fairly simple in the days of the Z80 and, as I understand it, that's still essentially how a single CPU - or core - works.
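You can actually watch that stack from Python itself. Here's a minimal sketch (plain CPython, nothing Stackless about it, and the function names are just made up for illustration) that prints the call stack from inside some nested calls using the standard traceback module:

import traceback

def outer():
    middle()

def middle():
    inner()

def inner():
    # Each call above pushed a frame onto the stack.
    # This prints them, most recent call last.
    traceback.print_stack()

outer()

Run it and you'll see outer, middle and inner stacked up in the order they were called - that's the call stack doing its last-in, first-out thing.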

(Image: AMD Athlon X2 Dual-Core Processor 6400+ in AM2 package. Photo credit: Wikipedia)
The problem is that sometime in the early 2000s, dual-core and then multi-core chips started to become increasingly affordable and therefore available. Most of you will be using a multi-core system to read this post, which means your machine is capable of running more than one thing at a time - what is called concurrency.
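If you want to check how many cores your own machine has, a quick sketch using the standard library will tell you (this works on both Python 2.7 and 3.x):

import multiprocessing

# How many cores does this machine report?
# On most modern desktops and laptops this will be 2 or more.
print(multiprocessing.cpu_count())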

This is a bit of a pain for CPython because it only knows how to use a single stack, i.e. a single core, and that's rather a waste of those other cores, which are just itching to make it all run super fast.

So Stackless Python is essentially a redesign of CPython which avoids relying on the C call stack and instead uses something called microthreads to get around the problem. This means four things for you:

  • Concurrent programming is possible.
  • Concurrency can improve on execution time if done properly.
  • You need to learn some new concepts: tasklets and channels.
  • You get to use some new stuff: tasklets and channels.
I'll introduce those properly next time, but here's a small taste to be going on with.
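The sketch below only runs under the Stackless interpreter (not standard CPython), and the pinger/ponger names are just mine for illustration. It creates two tasklets that talk to each other over a channel, then lets the Stackless scheduler run them:

import stackless

def pinger(ch):
    # Blocks here until another tasklet is ready to receive.
    ch.send("hello from a tasklet")

def ponger(ch):
    # Blocks here until another tasklet sends something.
    print(ch.receive())

ch = stackless.channel()
stackless.tasklet(pinger)(ch)   # create and schedule a tasklet
stackless.tasklet(ponger)(ch)
stackless.run()                 # run the scheduler until all tasklets finish

Notice there's no threading module in sight - the tasklets cooperate by blocking on the channel, and the scheduler switches between them.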

There's a very informative interview with the creator of Stackless Python here.

You might like to get ready by reading my post about installing Stackless Python.

Good luck, fellow travellers.
