Coding 2 Learn

Education and Technology Ramblings with a little Politics for good measure.

Computer Games Are a Waste of Time

| Comments

When I was a kid I played a lot of computer games. I once played Betrayal at Krondor for so long that I started hallucinating from sleep deprivation. When I was at Uni I chose to mow down humanoid warthogs in Duke Nukem rather than learn metabolic pathways for amino acid synthesis, and failed an important exam.

When I was a child, I spoke as a child, I understood as a child, I thought as a child: but when I became a man, I put away childish things.

I gave up gaming in the traditional sense years ago. I now recognise the stupidity of playing a computer game for hours on end, hoping to make the value of some variable creep upwards. Thank goodness I don’t do that anymore. These days I’m far more sophisticated.

The Hacker News Game

I’ve been playing The Hacker News game for just under a year. When I first started playing I was killed fairly quickly, which is known in HN circles as being Hellbanned. It’s a neat little mechanic whereby the NPCs introduce large amounts of lag into the game and hide your character from the other players. I’m still not sure what got me Hellbanned, so I just started again.

My second attempt at playing has been a little more successful. I created a new character called ‘coding2learn’, who is a kind of geeky computer science teacher. Since playing this new character I have gained nearly 1000 xp, known as Karma in the game. The concept of The Hacker News Game is to go on a search for websites that would be of interest to programmers and technologists. When you find a website, you submit it to the community, who can then upvote your submission and reward you with xp.

Like most MMORPGs, The Hacker News game allows you to interact with other players. To do this you use comments. Comments can also be upvoted, for additional xp, and if you are a high enough level you can even downvote comments. You’re best off only commenting on things you are quite knowledgeable about. If you want to gain lots of upvotes, the more pedantic you are the better.

The Hacker News game is web-based, although there is talk of a mobile app or optimisation coming soon.

The Twitter Game

I used to play The Facebook game, but I stopped when they kept changing all the game mechanics without notification. Besides, I interact with my friends and family enough irl. Instead I thought I’d give The Twitter Game a chance.

The basic idea of The Twitter Game is to gain followers. Followers are other players who might decide to join your clan, so that they can receive your pronouncements. There is no limit to the number of clans that a player can join, although it is often deemed a measure of success if you keep the ratio of people in your clan to clans you are a member of as high as possible.

You gain followers by making pronouncements. Humorous pronouncements are often best, but you can also make informative and controversial pronouncements in an attempt to gain followers. One of the quirky mechanics of The Twitter Game is that you have to make pronouncements in 140 characters or fewer. Sometimes your pronouncements are spread further by your followers. This enables you to reach a wider audience and therefore increase your clan size.
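For anyone wondering what happens when a pronouncement runs over the limit: you chop it up. Here’s a little sketch, entirely my own invention, of how you might split a long pronouncement into tweet-sized chunks (it assumes no single word is longer than the limit):

```python
def split_pronouncement(text, limit=140):
    """Split text into chunks of at most `limit` characters, breaking on word boundaries.

    Assumes no single word is longer than the limit.
    """
    chunks, current = [], ''
    for word in text.split():
        candidate = (current + ' ' + word).strip()
        if len(candidate) <= limit:
            current = candidate
        else:
            chunks.append(current)
            current = word
    if current:
        chunks.append(current)
    return chunks

# A deliberately long pronouncement, split with a smaller limit for demonstration.
parts = split_pronouncement('word ' * 50, limit=40)
```

Nothing clever going on: greedily pack words until the next one would push the chunk over the limit, then start a new chunk.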

In The Twitter Game I play more or less the same character as in The Hacker News Game. There are a few differences though. ‘coding2learn’ in The Twitter Game tries to use humour a little more and is a little less arrogant (although not much).

I’ve had moderate success in The Twitter Game, gaining over 1000 followers. I don’t really play it enough to become an elite player.

The Twitter Game is web-based, although there are apps for mobile that are quite good.

The Blogger Game

The Blogger Game is my favourite game of all. If you’re reading this you’re playing The Blogger Game right now, and so I should really say ‘thank you’ for the additional xp.

In the Blogger Game you write what are called ‘posts’. It’s a good idea to write posts that, like in The Twitter Game, are either funny or informative. There’s no character limit in The Blogger Game though, and you can write really long posts if you want, although be aware that sometimes your post might be so long that people will comment TL;DR (too long; didn’t read).

Success in The Blogger Game is determined by your scores in something called Google Analytics. Google Analytics is like a personal high-score table that can tell you how well each of your posts has done and how well your blog is doing in general. There are loads of stats to play around with, such as first-time visitors, bounce rate and time on page.

If you want to play The Blogger Game it’s a good idea to decide on a topic to ‘blog’ about and stick with it. Regular posting is quite important, and something I struggle with. To get really high scores you need to be ‘blogging’ about once a week.

What I like about The Blogger Game is its crossover with The Hacker News Game and The Twitter Game. Success in any of them can lead to success in the other two, even though they’re created by completely different studios.

The Blogger Game is available on the web or as a standalone app on mobile, PC, Mac and Linux.


So that’s my gaming life these days. I’m sure we can all agree that it is far preferable to, and more productive than, that waste of time that is CoD or WoW.

Please Stop Sending Me Your Shitty Word Documents

| Comments

Throughout this rant I use the second-person personal pronoun (you) quite a lot. This does not necessarily mean I am speaking to ‘You’ the reader, but rather some other ‘You’ who will probably never read this anyway.

When Microsoft announced Office for iPad I shed a small tear. Excel is an incredibly useful application, without which my managers couldn’t inundate me with graphs, statistics and indecipherable look-ups that reference hidden and protected sheets. PowerPoint allows literally anyone (regardless of their public speaking skills, understanding of image aspect ratios, or ability to use fewer than 15 different fonts on a single slide) to prepare presentations for their audience. These apps becoming available on iOS did not worry me. What upset me, however, was that all of a sudden swathes of iPad users would have the ability to view, edit and – most worryingly of all – create Microsoft Word documents.

Here’s what I have installed on my Mac:

  • Alfred – searching for anything
  • Python – coding anything
  • VLC – watching anything
  • FireFox – browsing anything
  • Homebrew – installing anything
  • Emacs – anything

You’ll notice that Word is not on the list. I have nothing against people who use Word, I am just not one of them. There was a time when, if I wanted to put text on a screen, it was my go-to software, and I thought I was a pretty 1337 hacker because I knew how to do mail merges. I’m not that guy anymore. I don’t tell you what software to install on your computer, and I don’t assume you have the same software installed as me. For this reason I am careful to use non-proprietary file types when sending documents via email. I expect the same courtesy from you, and here’s why…

I don’t have Word installed

When you send me a Word document, you are making some pretty major assumptions, and as Samuel L. Jackson once said in the outstandingly amazing film The Long Kiss Goodnight

“When you make an assumption, you make an ass out of ‘u’ and ‘mption’.”

Firstly you assume that I have Word or some clone of it installed. I know you think the words ‘Computer’, ‘Microsoft’, ‘Windows’ and ‘Office’ are synonymous, but they’re not, and there are plenty of people in the world who use *nix operating systems. By sending me a .docx file you’re forcing me to find a workaround so that I can use your document. What are my options? Well, I could install an Office clone like LibreOffice or Pages. I could use an online service like Google Docs or Zoho. I could even attempt to get Emacs to read the data and make a go of presenting it to me in some recognisable format. Do you see what you’ve done? You’ve made more work for me. You’ve sent me a locked box and asked me to either pay to get a key cut or smash it open with a crowbar.

Plain text should be plain

What happens when I finally manage to open your document? Well, 90% of the time all it contains is text. That’s it. Text. Strings of characters. So why the hell did you send it as a Word document to begin with? Why not just write the text directly into the body of the email? If it’s that important for you to write in Word, then save it as a .txt file. There’s not a computer on the planet that can’t read plain text. (Well, that’s not technically true, as I’m pretty sure my microwave contains a computer, but that’s beside the point.)

Are you really that good a designer?

The only possible reason I can imagine that you had to send me the document in Word format is because you are the world’s finest graphic designer/type-setter. Maybe your choice of fonts, margins, kerning and paragraph indentation are so awe-inspiring, that the very act of viewing the document will have me gouging my eyes out with a spoon, knowing that the gift of sight is no longer of any consequence as I shall never again behold a thing of such beauty. Of course the small flaw in your plan is that I don’t have the Lucida handwriting font installed on my system, and Preview struggles to display Word-Art clearly, so all your efforts are probably in vain.

Tables grrrr!

Sometimes you send me the Word document as a container for other joys, such as tables. I understand that a .csv is ugly to behold, but computers don’t tend to worry about aesthetics too much, so they really are preferable. There are prettier tables available if you’re into that kind of thing. HTML tables are great, easy to parse and render, but Microsoft obviously think they’re the devil’s work and so prefer to use their own method of tabulating data. I don’t know how Microsoft has chosen to represent tables in their .docx files, but I do know that if Linus, Stallman and ESR got together and hacked away for a decade or so, they wouldn’t be able to create a program that could correctly render a sodding table created in Word.
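To see just how little work a plain format demands, here’s a sketch of parsing a .csv in Python using nothing but the standard library (the data itself is made up for the example):

```python
import csv
import io

# A plain-text table: ugly to behold, trivial for a computer to parse.
raw = "name,role\nLinus,kernel\nStallman,GNU\nESR,essays\n"

rows = list(csv.reader(io.StringIO(raw)))
header, body = rows[0], rows[1:]

for row in body:
    print(dict(zip(header, row)))  # e.g. {'name': 'Linus', 'role': 'kernel'}
```

Three lines of actual parsing, no proprietary software required. Try doing that with a table buried in a .docx.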

What’s with the crud

Sometimes the documents you send me contain other interesting elements. You feel the need to augment your text with such things as: little animated gifs of a stick-man who is frustrated with his computer, borders of coloured apples and 3D Word-Art. Now I know you think that such embellishments will bring a smile to my face and ease my reading of your text, but I’m sorry to inform you that you’re wrong. Very wrong. Criminally wrong. You see, without Word installed, I won’t be able to view these quirky little additions to your plain text. Your efforts were in vain. I could additionally argue that if your text was too boring without such witty little quirks, then you might like to consider whether the content is worth reading in the first place.

A heartfelt plea

So please… pretty please… please with bells on top, borders of apples and the word PLEASE written in bright blue Word-Art; the next time you want to send a Word document by email or put one on your website, think about your recipient. Could you use the body of the email or a page on the site? Perhaps you could save the file as a .txt, .rtf or PDF. Just spare a thought for those of us who choose not to use Microsoft Word, and respect our right not to do so.

Oh… and learn to write in sodding Markdown.

Installing Pygame on Mac OS X With Python 3

| Comments

This has been a bugbear of mine for some time now. I like using Python 3.x. I like teaching kids how to use Pygame. I use a Mac. Trying to get all three to play nicely with each other has been impossible for me up to now.

I’ve trawled through web pages and blog posts that recommend all manner of ways in which you can install Pygame on a Mac for Python 3, I’ve tried numerous solutions on StackOverflow, and I’ve even tried angrily shouting at my computer and threatening to throw it out of my classroom window. None of them worked.

Today I finally nailed it, and I have Pygame running. Here’s what I did.

1) Install XCode and command line tools
2) Install Homebrew (ruby -e "$(curl -fsSL https://raw.github.com/Homebrew/homebrew/go/install)")
3) brew install Python3
4) brew install git
5) brew install sdl sdl_image sdl_mixer sdl_ttf portmidi
6) Install XQuartz - http://xquartz.macosforge.org
7) brew tap homebrew/headonly
8) brew install --HEAD smpeg
9) brew install python (needed to install mercurial)
10) brew install mercurial
11) pip3 install hg+http://bitbucket.org/pygame/pygame

And that’s it. If you have any problems yourself, or a better way of doing it, then please let me know in the comments.

note: the smpeg install is failing for me at the moment, so I’ll look into this a little more. Pygame seems to be working without it though.

update

I had some brew doctor issues (around 20!), which might have been due to me trying to install Pygame from source earlier and therefore manually installing all the dependencies, which then conflicted with homebrew.

I deleted everything brew doctor suggested and overwrote all links as suggested. The brew install --HEAD smpeg suddenly worked (although that might have been because I was no longer behind a proxy). I then did a brew unlink jpeg and brew link --overwrite jpeg.

Everything is working perfectly for now. (Crosses fingers, touches wood and searches for a black cat to cross his path.)

What Exactly Are We Teaching Anyway?

| Comments

On Twitter, numerous education blogs and even on Hacker News, there have been more than a few debates of late regarding the education of students in Computer Science/Computing/Coding/IT. In the UK, in particular, the debate has been fuelled by Lottie Dexter’s ‘Year of Code’; a government-backed scheme to encourage everyone to learn a little more Computer Science/Computing/Coding/IT this year.

The usual suspects have all weighed in on this debate.

There are those that consider Computer Science and Computational Thinking the very purpose of a modern education. They argue that without the ability to fully comprehend the Halting problem, no child could ever tie their own shoelaces without entering some sort of bizarre shoelace-tying infinite loop.

Then there are those that argue it is impossible for a student to understand any abstract Computer Science problem. In fact, children are incapable of writing code, finding an on switch, or even managing to sound out the words on a “C” is for “Computer” nursery school flash card. They make the case that we should give up now and all go back to making pretty pictures in Paint.

C is for Computer

The arguments seem, more often than not, to be focused on what name we give to the subject that we are teaching. In reality, of course, the name means very little. During a child’s early years we teach them what we think they need to know and during the latter years we’re subject to the whims of the exam boards and organisations such as Ofqual.

It is for this reason that I propose a radical reform of the name given to Computer Science/Computing/Coding/ICT, which I hope will clear things up once and for all, and prevent any future arguments, back-biting and bullying. From this day forward I intend to teach students “Getting Shit Done With A Computer”.

Getting shit done with a computer is, at the end of the day, what I’d like every kid to be able to do. Regardless of what you call my subject, I’ll always teach students how to get shit done with a computer, as that’s what I think they need to know.

I like my students to be able to recognise when the cable has been removed from the Ethernet port, and understand the reason why they have no network connection. I like my students to understand the basics of a file system and how to navigate it. I like my students to be able to use a spreadsheet, operate a database and write up a project. I like my students to be able to choose the right tools for the right job (The right tool being emacs and the right job being any job, always.). I like my students to be able to knock together little scripts that will recursively grab files out of a nest of directories, or as one enterprising young fellow did the other day, write a script that replaces all your files with this picture of Chuck Norris.
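That file-grabbing script can be knocked together in a few lines of Python with os.walk. This is just an illustrative sketch in my own words (the function name and the arguments are placeholders, not any student’s actual code):

```python
import os
import shutil

def grab_files(root, extension, destination):
    """Recursively copy every file with the given extension out of a nest of directories."""
    os.makedirs(destination, exist_ok=True)
    grabbed = []
    # os.walk visits every directory under root, however deeply nested.
    for dirpath, dirnames, filenames in os.walk(root):
        for name in filenames:
            if name.endswith(extension):
                source = os.path.join(dirpath, name)
                shutil.copy(source, destination)
                grabbed.append(source)
    return grabbed
```

Point it at a directory full of buried homework files and it flattens them all into one place. (The Chuck Norris variant is left as an exercise for the enterprising young fellow.)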

Chuck Norris is the Internet

Along the way I’ll teach them a little Boolean logic, some binary and maybe Big O notation. This isn’t just high-flown theory with no practical use though. I recognise that when you want to get serious shit done with a computer, these are important concepts to have at hand.

When all is said and done, if we could just lose the pathetic tribal mentality that causes some of us to identify with the moniker Teacher of Computer Science, or Teacher of Computing, or Teacher of IT, the students would benefit in the end, and we could all just get some shit done using our computers.

X Days of Christmas

| Comments

Just a quick one from me today.

I woke up this morning with a lesson idea in my head that was also a Python script.

I’ve a few teacher followers, so I thought I’d shove it up here for others to use if they want. You’ll have to forgive my poor coding and poorer use of the English language.

The challenge for the students is to create a program that will produce the lyrics for ‘The X Days of Christmas’.

You can find my solution at the bottom of the post, and here’s a few files – xdays.py, nouns.txt, verbs.txt

The results can be quite amusing – my particular favourites have been “20 kittens a bleeding”, “99 rats a computing” and “11 creators a mating”.

I’ve even had ones that make sense, like “8 pies a baking”.

Here’s an example final verse that I quite enjoyed.

On the 12th day of Christmas my true love sent to me:
12 roads a developing
11 faucets a mating
10 planes a combing
9 worms a solving
8 tents a liing
7 kittens a slaying
6 tubs a handing
5 passengers a scattering
4 bats a utilizing
3 mornings a promoting
2 pollutions a foreseing
and a partridge in a pear tree

from random import randrange

nouns = []
verbs = []
pairings = []

days = int(input('How many days of Christmas are there? ')) + 1

# Naively pluralise each noun (words already ending in 's' are left alone).
with open('nouns.txt', 'r') as file1:
    for line in file1:
        noun = line.rstrip()
        if noun[-1] == 'y':
            if noun[-2] == 'e':
                noun = noun[0:-2] + 'ies'
            else:
                noun = noun[0:-1] + 'ies'
        elif noun[-1] != 's':
            noun = noun + 's'
        nouns.append(noun)

# Naively turn each verb into its present participle (hence gems like 'liing').
with open('verbs.txt', 'r') as file2:
    for line in file2:
        verb = line.rstrip()
        if verb[-1] == 'e':
            verb = verb[0:-1] + 'ing'
        else:
            verb = verb + 'ing'
        verbs.append(verb)

# Pair a random noun with a random verb for each day, e.g. 'roads a developing'.
for i in range(days):
    pairing = nouns.pop(randrange(len(nouns))) + ' a ' + verbs.pop(randrange(len(verbs)))
    pairings.append(pairing)

for day in range(1, days):
    # Pick the right ordinal suffix (11th, 12th and 13th are special cases).
    if day % 100 in (11, 12, 13):
        ending = 'th'
    elif day % 10 == 1:
        ending = 'st'
    elif day % 10 == 2:
        ending = 'nd'
    elif day % 10 == 3:
        ending = 'rd'
    else:
        ending = 'th'
    print('On the', str(day) + ending, 'day of Christmas my true love sent to me:')
    for count in range(day, 1, -1):
        print(count, pairings[count])
    if day == 1:
        print('a partridge in a pear tree')
    else:
        print('and a partridge in a pear tree')
    print('')

How I Rediscovered Experimental Learning and Why It Doesn’t Matter

| Comments

A few months back I received this tweet.

I was a little surprised, but DM’ed them back with my address and then promptly forgot all about it. (I don’t know what the past tense of the abbreviation of Direct Message is, so please feel free to correct me in the comments below.)

About a month later I was at home when the doorbell rang and a delivery driver handed over a parcel for me. I opened it up and was surprised to find three littleBits kits.

Now I’d just (literally that week) started teaching a new subject called Systems & Control, which has a heavy element of electronics involved, so I packed the kits into my car and took them to school the following day.

For those that don’t know, littleBits make electronics kits consisting of snap together magnetic modules. You can use the kits to make a range of electronic circuits – driving motors and buzzers and blinking LEDs.

littleBits

My colleagues and I stood around the opened boxes, picking up the little plastic modules, snapping them together and building an array of little projects. After fifteen minutes or so we all came to the same conclusion. The kits were clever, easy to use, accessible for students but completely impractical for a classroom of thirty secondary school students.

The kits were placed back in their boxes, and left in a cupboard, forgotten about.

Then, due to personal circumstances, my six year old son, Jimi, had to be home schooled by his grandmother. I remembered the kits in my classroom cupboard and took them over to her, suggesting she might like to put together some of the kits with him. This she did, dutifully following the instructions and assembling the kits. She reported back that they had built the circuits, and then the kits came home, to be once again forgotten about, this time in a drawer in Jimi’s bedroom.

On Saturday I was standing in the kitchen, enjoying a steaming cup of coffee and my morning Nicotine Replacement Therapy Lozenge, while perusing Hacker News. Jimi came bounding in.

“Dad, come and look at the machine I’ve made.”

I’ve been caught out by this one before. The “machine” is normally a cardboard box with a cushion inside it and a pencil stabbed through the side. Sometimes it’s supposed to be a rocket, sometimes a train or sometimes an Angry Birds catapult.

“Why don’t you bring it through here?” I asked, reluctant to leave my coffee and laptop.

“It’s a bit delicate,” he said, “but okay.”

He came in a few minutes later, but I wasn’t really paying much attention as he began messing around on the kitchen floor.

“Look Dad.”

I looked over and was stunned by what I saw. Jimi had pulled out the littleBits kits and had assembled a monstrous creation.

I got down on the floor with him and asked him what he had made. He explained the contraption. How you had to turn this thing and push that thing and hold down this part and then this part lights up and this thing makes a noise. He didn’t use a single punctuation mark in his excitable and breathless sentence.

We moved his machine onto the kitchen table and his big sister suddenly made an appearance. Within minutes the two of them were busily clicking together bits of electronics, having fun, amazing themselves, and more importantly learning.

Jimi and his Sister Concentrating Lantern

Jimi figured out that a push-to-make switch needs to come between a power supply and an LED to have an effect. He learned what a variable resistor does in series with a motor. He learned what a piezoelectric sensor does when combined with a buzzer. Obviously he learned none of the terminology, but that wasn’t important.

I’d dismissed the littleBits kits because I have preconceived ideas of what education should be. These misconceptions have been honed by years of operating in a pressured school environment, where results rule above all else, and where you’re required to demonstrate progress in every lesson, term, year and key stage. Watching my son experiment, succeed and learn, all while having fun was sobering.

Unfortunately the lesson I learned is one that I will struggle to apply in my professional life. There just isn’t the time available to allow students to experiment at their own pace. I would love nothing more than to give the students the tools they require, be that in computing or in electronics, and allow them to explore the possibilities available to them, but alas I operate in a system where data rules and demonstrable progress is required.

There are areas in which I can allow students to experiment. Thanks to a particularly keen and talented student, I have become involved with a project called THINKSPACE. This involves giving students a time and place that they can come and begin to develop apps. They work on what they want to, at a pace that suits them and where I am a facilitator and troubleshooter as opposed to a driver of progress. I can spare only an hour a week to THINKSPACE. It deserves two or three hours a day.

I admire littleBits, THINKSPACE and similar initiatives. I think they’re admirable endeavours with the right mindset when it comes to education. I just wish that policy makers within the education system shared their ideals and attitude towards learning, so that we could give all our students the opportunity to truly experiment in the classroom, to succeed and to fail, to have fun and most importantly to teach themselves to learn.

Computing Is Much More Than Coding?

| Comments

Trying to justify that every student in the country should learn computing is quite a tricky endeavour, and it shouldn’t be. We’re all users of technology after all, and we enjoy the advantages and suffer the disadvantages of being so. We should all have a basic understanding of computers, networks, encryption and software. Knowledge of technology can help keep us safe, make us money and help our productivity. Why should every student in the country not be given access to such knowledge? After all, we don’t balk at the fact that every student in the country should study Shakespeare, trigonometry, the causes of The Second World War or how to throw a rugby ball. I probably use trigonometry about once a month, I haven’t thrown a rugby ball in decades, but I use a computer every day.

I studied French for five years at secondary school. I hated it. There’s nothing worse than sitting at a parents’ evening and listening to your mother speak fluent French with your teacher, and knowing, despite not understanding a word they are saying, that he’s detailing your every shortfall in the subject. Do I resent the fact that I was made to study French? No. Of course not. I have the utmost admiration for multi-linguists. I marvel at their ability and I know they perform an essential role in our society. Every student should study a foreign language, because a few of them will find they have a talent for it and go on to greater things. Did attempting to learn French benefit me? No. Not one bit, and that doesn’t matter.

Other subjects have no need to justify their existence in schools. English, Foreign Languages, Maths and Science have the weight of centuries of educational history behind them. As does History. Computing is new. The bastard child of Maths and Electrical Engineering. It’s a subject that’s only been in existence for a few decades. Because of this it has struggled to gain a firm foothold in our schools, yet I would argue that it is a subject that has a far greater impact on our lives and the lives of our children than any of the others.

There are quite enough reasons to make Computing a compulsory subject in schools. We don’t need spurious excuses that confuse the issue. What annoys me about many of the “Computing Apologists” is their overwhelming desire to insist that Computing has benefits outside of the sphere of technology. I’m all for Computing in primary and secondary schools, because I think the ability to code is important and I feel that every student should have the opportunity to learn to program. If we teach a thousand students to code, then maybe we’ll find a hundred that enjoy it, ten that excel at it and one that goes on to revolutionise our society.

When many advocates of compulsory Computing education are asked why it is so important, they often hail “Computational Thinking” as the panacea to all our woes. They state that even outside the field of programming, Computational Thinking is an important life skill. They argue that Computational Thinking can be applied to many problems in the real world and that every student should learn to tackle problems in a “Computational way”. I teach Computing, and to be honest I only have a vague idea of what Computational Thinking is.

I know how to teach programming. I know how to teach students to break down problems and tackle bite-sized chunks with different algorithms. I know how to get students to manipulate large data sets, and to make simplified models of real-world situations. I can teach students all of these things, once I’ve taught them the basics of the programming language we are using, be it Scratch, Python, Javascript or whatever the flavour of the week is.

The idea that Computational thinking is an essential life skill is nonsense.

If I were to shuffle a pack of fifty-two playing cards and hand them to you, then asked you to sort them, you’d do what any sensible person on the planet would do. You’d sort them into suit order first, then into value order. Is this decomposition or just common sense? Did you require lessons in Computational Thinking in order to achieve this task?
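For what it’s worth, that common-sense sort is a one-liner in Python, no lessons in Computational Thinking required (the suit and value orderings here are my own arbitrary choices):

```python
import random

SUITS = ['clubs', 'diamonds', 'hearts', 'spades']
VALUES = list(range(1, 14))  # Ace through King

deck = [(suit, value) for suit in SUITS for value in VALUES]
random.shuffle(deck)

# Sort into suit order first, then value order: exactly what any
# sensible person would do with a shuffled pack.
sorted_deck = sorted(deck, key=lambda card: (SUITS.index(card[0]), card[1]))
```

The tuple key does the "suit first, then value" thinking for you, which rather proves the point.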

My degree is in Biochemistry. I would never argue that Scientific thinking is crucial for everyone. It’s useful, in certain situations, but not essential. When some Daily Mail reader argues that the presence of a minority group in our country is resulting in a broken society, I might apply the Scientific Method to analyse their evidence, find its flaws and then disprove their hypothesis. If they told me they’d eaten a bacon sandwich, I’d probably just believe them. My wife is an English teacher. When she reads a novel she drills down into the layers of meaning and the subtext of the book, to elucidate a truer understanding of the author’s message. When she reads the menu in a local cafe, detailing the contents of their bacon sandwiches, she just takes it at face value.

You can live a successful life without knowledge of the Scientific Method. You can live a successful life without knowledge of Literary Deconstruction. You can live a successful life without knowledge of Computational Thinking.

If I ask you to build me a shed, do you pick up an armful of planks and repeatedly throw them into the air until a shed has been built? Of course not. Maybe you’ll start by building a floor, then some walls and finally a roof. Is this decomposition? Are you “Thinking Computationally”? Maybe you’re actually engaging in abstraction. After all, there’s no such thing as a roof. A roof is just a series of planks of wood, joined together at an angle that is optimal for self-support and the shedding of rain water. Of course, there’s no such thing as a plank of wood. That’s really just bundles of xylem vessels, cut into regular geometric patterns. Of course, there’s no such thing as a xylem vessel. They’re really just arrangements of cells composed of cellulose and… well I’m sure you get my point. We’re all fairly familiar with abstraction. We just might not recognise it for what it is, and we certainly don’t have to be taught Computational Thinking in order to build a shed.

I think one of the major problems is our labelling of the subject.

Computer Science is no more about computers than astronomy is about telescopes

Edsger Dijkstra

Unfortunately, Computer Science has little to do with Science either. Let’s get straight what we are actually teaching here. We’re teaching programming, network infrastructure, databases, communication protocols, markup. We’re teaching these things because these technologies are so ubiquitous and important, that it will benefit everyone to have a little understanding of them. We are not teaching a revolutionary new way of thinking that will have wider benefits to society.

Why should we give every student an opportunity to learn Computing? Some of the students we teach might one day become the next Linus Torvalds or Steve Wozniak. Some of our students might become senior developers, collaborating on amazing new technologies and changing the world for the better. Some of our students might develop the algorithms for more realistic flag fluttering in Call of Duty XIII. Some of our students might become Greggs executives and not give the developers such a hard time when they can’t sort six million customer records according to when they last ordered a bacon sandwich, in real time, in a browser… that’s IE6.

Let’s stop trying to make Computing something it isn’t, and instead be clear as to what we’re teaching and why we’re teaching it. Let’s stop being afraid of the words programming and coding, as if they’ll scare students away. Let’s be honest about Computing, and we’ll see its popularity soar.

A Rant From My Brother

| Comments

My brother is the reason I learned to code. To be honest, he’s probably forgotten more about programming than I’ll ever know, and I’m not exaggerating. His preferred languages are Haskell and OCaml, but he’s recently had to dive into Javascript for a project he’s working on. I received this email from him tonight, and I found it amusing so I thought I’d share it. (Note – he talks about Python a lot as it’s the language I understand the most.)

Javascript is pretty pathetic when it comes to bug-finding. Here’s some Python:

>>> foo = {}
>>> foo["bar"] = 3
>>> foo["baz"]

The dictionary foo doesn’t have a key “baz”, and this is likely a typo. Python sensibly throws an error, and execution will not continue.

In Javascript:

>> var foo = {};
>> foo["bar"] = 3;
>> foo["baz"]

This does not throw any errors, but instead returns undefined. This is not entirely retarded, until we find that Javascript happily coerces undefined to NaN (Not a Number) whenever it appears in arithmetic expressions. Since NaN is a valid floating point number, it can happily propagate through running code. Things go from entirely retarded to completely fucking braindead when we find that Javascript will accept NaN as an argument in most functions:

ctx.fillRect(NaN,NaN,NaN,NaN)

In other words, what started out as a typo which would have Python raise an error at the earliest possible opportunity is silently ignored by Javascript, only to be found if one notices certain rectangles not being drawn. Tracking down such a typo from a bit of missing graphics is going to be a pain in the arse.
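To see how far a single typo can travel, here’s a minimal sketch of the situation he describes (runnable in Node or a browser console; the variable names are mine, purely for illustration):

```javascript
var foo = {};
foo["bar"] = 3;

// A typo: "baz" was never set, so the lookup silently yields undefined
var width = foo["baz"] * 2;   // undefined * 2 coerces to NaN, no error raised

console.log(width);           // NaN
console.log(isNaN(width));    // true — the typo only surfaces downstream
```

The lookup, the arithmetic and any later use of `width` all execute without complaint; nothing points back at the misspelled key.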

Now functions: Javascript has no time for conventions of mathematics, programming, or basic sanity. In Javascript, any function can be passed any number of arguments without raising an error. The concept of arity be damned. Extra arguments in Javascript are ignored. Missing arguments are set to undefined. And, as explained before, undefined will be coerced to NaN in arithmetic expressions to create lots of great bug-full code when you forget the number of arguments required of a function. For further hilarity, undefined can be used as a key to a dictionary. So if you do:

function insert(y,x) {
   dict[x] = y;
   ...
}

and you accidentally call insert(3), you won’t be told, as you would be in Python, that you are missing a required argument. Instead, x gets bound to undefined, and the dictionary will become

{ undefined : 3 }

That’s almost certainly an unexpected behaviour.
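A self-contained version of his example (with the elided function body dropped) shows the resulting object directly — note that the key ends up as the string "undefined", because object keys are strings:

```javascript
var dict = {};

function insert(y, x) {
  dict[x] = y;   // x is undefined when the caller omits it
}

insert(3);       // forgot the key — Javascript raises no error

console.log(dict);               // { undefined: 3 }
console.log(Object.keys(dict));  // [ 'undefined' ]
```

Worse, a second forgetful call such as `insert(5)` will silently overwrite the same `"undefined"` entry.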

The way that function parameters are interpreted leads to this truly bizarre example, which I got from another site:

['10','10','10','10','10'].map(parseInt)

this yields the truly weird

[10,NaN,2,3,4]

The function map is supposed to apply its argument to every value in a list. In sane languages,

[x,x,x,x,x].map(f) 

should give you the list

[f(x),f(x),f(x),f(x),f(x)]

In Javascript, for likely dumbfuck reasons, map takes a function of three arguments. The first argument is bound to the element in the list. The second argument is bound to the index into the list. The third argument is bound to the entire list. This will cause surprise when you don’t know exactly how many arguments the argument to map is expecting (parseInt in this case), but don’t expect a prompt error in case of mistakes, as you would get in Python.

It turns out that, in this case, parseInt takes an optional second argument which is the base in which the first argument is to be interpreted. For unexplored reasons, when the base is 0, the argument is read in base 10. In base 1, NaN is always returned. This explains the first two elements in

[10,NaN,2,3,4]

The third element is “10” in base 2. The fourth element is “10” in base 3. The last element is “10” in base 4.
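For what it’s worth, the usual defence is to wrap parseInt so it only ever receives the element, with an explicit base — a quick sketch:

```javascript
var raw = ['10', '10', '10', '10', '10'];

// map passes (element, index, array); parseInt treats the index as a radix
var broken = raw.map(parseInt);
console.log(broken);   // [ 10, NaN, 2, 3, 4 ]

// Wrapping parseInt so it sees only the element, and pinning the base to 10
var fixed = raw.map(function (s) { return parseInt(s, 10); });
console.log(fixed);    // [ 10, 10, 10, 10, 10 ]
```

The wrapper works precisely because extra arguments are ignored: map still supplies three arguments, but the anonymous function declares (and forwards) only the first.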

Ridiculous.

How We Were Trained to Lower the Drawbridge

| Comments

A few years ago, the parody news network The Onion released a video claiming that Facebook was a massive CIA surveillance project. It was funny at the time. It’s not so funny any more.

Perhaps naively, I believe that Facebook, Google and the other tech giants reluctantly cooperate with the NSA. I believe that they comply with FISA requests because they have to and that they have remained tight-lipped about their cooperation because if they don’t then whistleblowers could expect the same fair, just and proportionate treatment as has been meted out to Chelsea Manning and Edward Snowden. These corporations exist, after all, to make money and handing over vast swathes of user data to spy agencies just isn’t in their financial interests.

I feel, however, that the tech giants have accomplished something far more insidious, and in many ways more detrimental to our privacy than is claimed in the video. They’ve trained us to devalue privacy.

There’s an old saying that actually used to mean something.

An Englishman’s home is his castle.

In the UK at least, it used to be the case that we valued our privacy. What went on between the four walls of our homes was our business and nobody else’s. There were only three people in your life you would ever share your private lives with: your doctor, your priest and your spouse – in that order.

Then along came the Internet. At first it was a place only a select few could publish. You had to have the technical ability to set up a web server and write in HTML, and the World Wide Web was a curious place filled with niche websites created by geeks, academics and hackers. But it didn’t stay like that for long. Facebook, WordPress, Twitter, Google+ all came along and made it easy to share everything.

We’ve been trained to lower the drawbridge, lift the portcullis and let the world into our castles. Social networks reward us every time we publicise our lives, and we eat it up. This is most startlingly apparent amongst Generation Y, for whom sharing their lives with the world is so natural and ingrained, they almost see it as a basic human right. They consider privacy as something archaic and quaint, no longer relevant to the world we live in. They like it when they Google their own name and see images of themselves on the front page. They compete to gain followers on Tumblr, friends on Facebook and mentions on Twitter.

We don’t yet know what the full consequences of the sharing culture will be. When today’s fifteen-year-old students attempt to stand for Parliament in twenty years’ time, and the front pages of the red-tops are plastered with embarrassing Snapchat selfies, will we look at them and decide that they are not fit to represent us in Government, or will we just shrug and acknowledge that ‘everyone used to do that’?

We can see one consequence of our training by the social networks here and now though – apathy. When Snowden’s revelations first hit the Guardian’s front page almost nobody cared. Hacker News was filled with NSA stories, but you’d expect that from a community of technophiles. The BBC seemed to include Snowden stories as an afterthought though, and even then, they focused on the human element of where he was and what he was doing, rather than the surveillance programs he had exposed. Glenn Greenwald promised there would be more to come, and he didn’t disappoint. But the latest revelation, that the NSA and GCHQ consider most of the widely used encryption technologies as a mere hindrance to their dragnet data gathering, has caused barely a ripple in the public consciousness.

Why are we not out on the streets protesting these flagrant invasions of our privacy? Why are we not holding our parliamentary representatives to account, and demanding the end to mass surveillance of innocent citizens? Why are we not doing something… anything?

Generation W didn’t think about privacy, they just had it. Amongst Generation X there are precious few of us who care. Most of Generation Y consider privacy a barrier to their lives. It’s Generation Z where the only hope lies.

As a teacher I grow tired of the government and media constantly passing the buck and demanding that all societal ills need to be cured in schools. Teenage pregnancy rates too high? Teachers can fix that. Young adults can’t manage their finances? Teachers can fix that. Too much apathy at the polling booth? Teachers can fix that. Government, the media and parents constantly abdicate responsibility and throw more into the curriculum in an attempt to fix society. When it comes to teaching about online privacy however, I don’t see who can help Generation Z other than the teachers. The media, the Government and the corporations have no vested interest in a generation that considers privacy important. As for parents, they’re already setting up Facebook accounts for their babies so that we can track their offspring’s progress from cradle to grave.

I’ll start. I’m currently working on a scheme of work about cryptography. There’ll be plenty of Computer Science concepts in there, but I also intend for students to understand the importance of strong cryptography from a societal perspective rather than just a technological perspective. My hope is that it will make them think a little when using digital communication technologies, about exactly who has access to the content they send. I’ll publish it here when I’m done. If you would like to join me in this campaign, then please feel free to share links to resources in the comments section, or on Twitter, and I will endeavour to curate and publish what you send.

The Myth of Mobile Computing

| Comments

note – I use the term General Purpose Computer (GPC) rather than PC as it does not carry connotations of which OS you are running

This is not an anti-Apple rant. I love my iPad. It’s the device I pick up the moment I get out of bed in the morning. While drinking my coffee I check my emails. While on the toilet I check my twitter feed. While having a cigarette I read Hacker News. While taking my dogs for a walk I listen to Today on iPlayer. While at work I project content to students via AppleTV. When I get home I play Plants vs Zombies with my son. While marking students’ work, I’ll watch some film or TV series on Netflix.

To misquote Charlton Heston:

You can take my iPad out of my cold dead hands.

Now go back and read that first paragraph. What do I do with my iPad? I use it to view content and communicate. iPads, and any tablet, are great for this. I do not use my iPad to create anything.

I’ve tried. Honestly I have. I’ve downloaded apps to make videos, code in Python, create presentations and write blog posts. I’ve excitedly shown friends and colleagues how easy it is to knock up a Keynote presentation, or make Khan Academy style tutorials using various apps. I’ve even purchased a stylus and tried using it as a notebook only to discover that my writing is illegible unless my characters are at least three inches high. As time goes on, the apps I’ve downloaded to ‘aid’ productivity get used less and less, and I find myself firing up my MacBook whenever I want to do any serious work. I got sick of finding workarounds to get over the restrictions of the OS; transferring files along a chain of apps or even emailing them to myself. I now have a simple policy for managing my apps. If an update comes through, and I haven’t used the app in more than a month, I delete it. This is what my home screen looks like. These are the only apps I have.

My Only Screen

I read this Guardian article (on my iPad) and I’d like to spend a little bit of time explaining why mobile computing, in its current form, is not going to take over the world.

The statistics that are quoted in the article, along with fancy looking graphs, are all for GPC sales. This is important. They are not quoting GPC usage. If you want some statistics on usage, then check out this screen-shot for the analytics of my blog.

Analytics

When you buy a computer you tend to expect it to last a few years, and you tend to just buy one or maybe two for an entire household. This is not the case with mobile devices. Along with every member of my family over the age of eleven, I am on a carrier-subsidised mobile contract. We upgrade our phones regularly. So just looking at my household, you have a purchasing pattern of maybe one computer every five years and four mobile devices every two years.

I use a third-generation iPad (screw Apple’s ridiculous naming policy). My son uses a first-generation iPad. This is a device that is barely three years old and has already become obsolete. Even typing on the thing is a pain, as the OS has advanced beyond the hardware and it takes what seems like seconds for the key presses to register. The last few apps that I’ve tried to download for him won’t run on the OS, and those that do tend to crash pretty often. I should think that most people who purchased a first-generation iPad, and who can afford to, have already replaced it with a later one.

Is it any wonder that the sales of mobile devices are so high? Mobile devices, in my opinion, are like disposable computers. They’re designed to be that way. You use one for a couple of years, it becomes obsolete, and you buy a new one (all on a nice carrier-subsidised contract so you don’t realise how much money you’re actually spending). I think this is in part due to the manufacturers’ policy and in part due to the rate at which technology is developing. Apple and Samsung are constantly innovating, bringing out better mobile devices each year that consumers want to buy. At the same time, they have no desire to ensure that a two-year-old device is as functional as the day you bought it. We’ve been tricked into accepting this business model in a way we would never accept with our GPCs. A three-year-old computer will still run the latest operating systems and software, so we don’t need to upgrade as often. Most households now already own a GPC, so is it any wonder that sales have flatlined?

Convergence, I think, is important; it’s just that the big manufacturers have got it the wrong way around. They’ve taken the sales of mobile devices and made an assumption – everyone wants mobile and people aren’t buying PCs. Apple and Microsoft are pushing to make our GPCs more like mobile, in the misguided opinion that this is what people want. It’s not. Windows 8 is a prime example of how this tactic has failed: it’s a God-awful OS that nobody wants. I hope Apple have paid attention to this. Imagine if the next iteration of Mac OS stopped you from installing apps from outside the App Store, removed the terminal and locked away your file system. How many of us would upgrade then?

No, the correct style of convergence can be seen in Canonical’s attempt at launching the Ubuntu Edge. The crowdfunded campaign failed, but I think it gained significant enough numbers of pledges to show that this is what many consumers want. We want a GPC that we can carry around in our pockets, SMS our friends with, post to Twitter with and browse the Web with. Then, when we need to do some work, we can plug the thing into a monitor, link up a Bluetooth keyboard and get on with some real productivity.

I’ve seen a worrying number of articles about schools who have bought into the mobile craze. They’re making the assumption that current mobile technologies are the future, and that they need to jump on the bandwagon as quickly as possible. This is of course only part of their reasoning. Any school that’s offering free or subsidised iPads has a competitive advantage over neighbouring schools, not in terms of results, but in terms of bums on uncomfortable plastic seats. Schools are paid per pupil, so getting feet through the door is important, and nothing is more attractive to a student than the ability to use Instagram during their History lessons.

The problem with school iPad programs is that they’re providing the students with a device that will be next to useless by the time they’re half-way through their education. Additionally they’re providing students with a device for viewing content rather than creating content, and education should be about creation. I don’t want my students reading blogs, I want them writing them. I don’t want my students playing games, I want them developing them. I don’t want my students watching videos, I want them making them. I don’t want my students using Snapchat… full stop.

I think that in a few years’ time this will cease to be a topic of conversation. I think we will see convergence of personal computing devices in the right direction, and I’ll own a pocket-sized GPC with a user interface that doesn’t inhibit my ability to create content. We need to keep in mind that mobile is still young, and it should not be assumed that iOS and Android (operating systems that are only six years old) will be the future. And if you’re the Head of a school, please think twice before jumping on the iPad bandwagon; there are better uses of public money.