When I taught Python to my kid and a few other kids who live around my house a couple of years ago, Guido was gracious enough to sign these head-shots for me to give to each kid as a graduation present.
I'm not sure what to make of that. It feels both nice and oddly narcissistic at the same time. I'm not that familiar with Guido though, maybe I just don't understand his style.
As a young programmer, I could not be luckier than this. Right now you can learn virtually any programming language (such as Python) online, so the fact that I'm not old enough to get a college degree doesn't matter.
And the fact that you can actually communicate via email/twitter/g+ with the creators or core developers of those languages, and expect to even get an answer, is pretty awesome as well :)
Very nice to see a blog post with this headline that isn't full of sarcasm or finger-wagging or self praise. It was a good reminder to me that positive messages, simply put, are worth the little time it takes to write them. I need to keep more of my own rants in my drafts folder. :-) Thanks Guido!
Single CPUs don't, but the number of cores does grow. GPUs are also worth considering here.
What does not get faster is programs written for old architectures. 20 years ago the best way to optimize a program was waiting a decade. That won't work today, and to make programs future-proof they have to scale to a lot of threads.
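The "scale to a lot of threads" point can be sketched in Python (the language this thread is about). The chunking scheme and names here are illustrative, not anything from the comment:

```python
# Illustrative sketch: structure work as independent chunks so it can
# spread across workers/cores. (For CPU-bound pure-Python work, swap in
# ProcessPoolExecutor, since the GIL serializes Python bytecode.)
from concurrent.futures import ThreadPoolExecutor

def chunk_sum(chunk):
    # Each call works on one independent slice of the data.
    return sum(chunk)

data = list(range(1_000_000))
chunks = [data[i:i + 100_000] for i in range(0, len(data), 100_000)]

with ThreadPoolExecutor(max_workers=4) as pool:
    total = sum(pool.map(chunk_sum, chunks))

assert total == sum(data)  # same answer as the serial version
```

The design point is the one the comment makes: a program written as one sequential stream can't use the extra cores, while one decomposed into independent units can grow with them.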
Software is slowing down faster than hardware is speeding up. My work desktop takes somewhat over 5 minutes to boot; at 3.4 GHz and 4 cores, 5 minutes equates to almost exactly four trillion operations, which accomplish nothing at all WRT productivity. Of course, most of the time is spent waiting for IO rather than actually calculating anything.
Years ago, four trillion instructions processed would have accomplished something powerful, something business critical. Now all it means is you've booted up.
The end user doesn't care about a waveform inside a box, all they know is computers are slower and less productive than ever.
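The "four trillion" figure above checks out, assuming one operation per core per clock cycle (a rough simplification; real CPUs retire a variable number per cycle):

```python
# Back-of-the-envelope check of the boot-time figure: 4 cores at
# 3.4 GHz for 5 minutes, counting one operation per core per cycle.
cores = 4
clock_hz = 3.4e9
seconds = 5 * 60

cycles = cores * clock_hz * seconds
print(f"{cycles:.2e}")  # about 4.08e12, i.e. roughly four trillion
```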
Software is slowing down faster than hardware is speeding up.
Write your own; that's the only way of staying sane. (It's especially atrocious with vertical market applications, of course. Not only does the SW I used to use in the past suck more and more, but it doesn't even keep adding the features I really need.)
But I remember I read something by Guido van Rossum; I always wanted to find it again and read it to remember the details.
What I remember of it is that Guido was saying something along the lines of... that one can spend a tremendous amount of time editing HTML, crafting it by hand, and making it perfect... but that this would be a waste of time, because most developers nowadays use template engines that generate HTML, or use WYSIWYG tools...
And that we should focus on more interesting problems.
Again, it was something like that. I've tried to find it forever, to get a better read on this view... but I just can't.
Maybe my memory is tricking me; maybe it wasn't Guido.
But anyway, the quote in the article reminded me of this statement... the idea that we should find more interesting problems that we can use computers for.
While Python, as a high-level language, cannot control the hardware the way C and assembly do, I think Guido is urging Pythonistas to learn C as a second language to prepare for the software-hardware interaction world?
Python was my third language; I wish it had been my first. It's just so fun to write and wonderful to look at. C (my first) is also fun to write, but when you're young, not as much as Python is.
I remember having lots of fun with C as a teenager. This was in the early 90s. We wrote games and demos using mode 13h (320x200 1 byte/pixel) in DOS (thus pretty much bare metal).
Those were the days. I think it was the perfect way to get started with programming :-)
You can still play around with computers like that! In fact, just `pacman -S qemu` and you're set! Mode 13h is awesome. I wrote a little quine[1] in x86, using mode 13h.
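For anyone who hasn't touched mode 13h: it exposes a linear framebuffer (conventionally at segment 0xA000) of 320x200 one-byte pixels, so the whole screen fits in a single 64 KiB real-mode segment. A quick sketch of the addressing arithmetic (in Python for readability; on real hardware you'd poke bytes into video memory):

```python
# Mode 13h framebuffer layout: 320x200, one byte per pixel,
# so pixel (x, y) lives at offset y*320 + x from the segment base.
WIDTH, HEIGHT = 320, 200

def pixel_offset(x, y):
    # Row-major addressing: one full row is WIDTH bytes.
    return y * WIDTH + x

assert WIDTH * HEIGHT == 64000       # whole screen fits in one 64 KiB segment
assert pixel_offset(0, 0) == 0       # top-left pixel
assert pixel_offset(319, 199) == 63999  # bottom-right pixel
```

That one-byte-per-pixel linear layout, with no planes or bank switching, is why mode 13h was the beginner-friendly path to DOS graphics.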
This book is dedicated, in respect and admiration, to the spirit that lives in the computer.
"I think that it's extraordinarily important that we in computer science keep fun in computing. When it started out, it was an awful lot of fun. Of course, the paying customers got shafted every now and then, and after a while we began to take their complaints seriously. We began to feel as if we really were responsible for the successful, error-free perfect use of these machines. I don't think we are. I think we're responsible for stretching them, setting them off in new directions, and keeping fun in the house. I hope the field of computer science never loses its sense of fun. Above all, I hope we don't become missionaries. Don't feel as if you're Bible salesmen. The world has too many of those already. What you know about computing other people will learn. Don't feel as if the key to successful computing is only in your hands. What's in your hands, I think and hope, is intelligence: the ability to see the machine as more than when you were first led up to it, that you can make it more."
Alan J. Perlis (April 1, 1922-February 7, 1990)
[1] http://mitpress.mit.edu/sicp/