Give Things Away

One of the smartest things I ever did was to give away the C# Yellow Book. I've had more interest and traffic from the free downloads than I'd ever have got if I tried to make a profit from the text. 

I was talking to one of the musician folks who will be helping make music and sound effects for Three Thing Game on Friday and I was expounding on this theory. My theory is that people really love free stuff, and that if you make a name for yourself as a provider of good, free, stuff then sooner or later someone will want to hire you for a proper job.

The great thing about being young and creative is that you often have more ideas than you know what to do with. So why not put some of the stuff you make out there for free and see what comes back. If you're a programmer you can get a similar effect by taking part in Open Source Projects.  

Failure is not a state, it's an event

I'm not a big fan of trite aphorisms. Today may be the first day of the rest of my life, but putting it that way doesn't really do much for me. But today in a presentation about teaching and learning I saw the phrase "Failure is not a state, it is an event". I didn't get time to jot down the attribution, and the best that the internets can come up with on the matter is this, but I do think that this is a crucial thing to remember.

Never think of yourself as a failure if something you try doesn't work. Just think of yourself as someone who has now got some valuable knowledge along the lines of: "I need to do that differently".

I think you learn more from your failures than your successes. And fear of failure is worse than failure itself. I now allow myself the luxury of trying things that might fail on the grounds that worst case I'll know more than I would by doing nothing. And it is always possible I might succeed.

The Five Knows of Programming

I've been teaching programming for a very long time. I'm still waiting to get properly good at it. In the meantime I'm given to thinking about what it means to learn how to program. I've narrowed it down to five "knows".

  1. Know what the computer does.
  2. Know how to create a program.
  3. Know how to automate a task that you yourself can perform.
  4. Know how to think like a computer.
  5. Know how to structure and manage your solutions.

I've been teaching the First Year course for the last few weeks and I reckon that we are at level 3 at the moment, moving on to level 4. This is a crucial time.

At "Know level three" you can take something you would be able to do yourself and write a program do perform that action. One example we do is deciding whether or not you can see a particular film. If your age is lower than the rating for that film, you can't go in. When you write the program you can imagine yourself selling tickets and deciding whether or not people can come in.

Level 4 is quite different. You have to let go of how you would do a task and try to think how you could make a computer do it. Sorting is a classic example of this. If you gave me 20 numbers to put in descending order I'd be able to do it, but I'd not really be able to tell you how I did it. To write a program to sort 20 numbers you would make it do the task in a way that a human never would (for example bubble sort). This is the hardest part of programming. Up until you hit level 4 you can think you are doing very well. Ifs and loops make sense, as do variables. And you've even written the odd program. And then wham, you suddenly find that you can't do it. And I mean really can't do it. This can be very painful and demoralising.
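
Bubble sort shows what I mean by the "computer's way" of doing things. No human would sort a list by endlessly swapping neighbouring pairs, but a computer is quite happy to. Here's a rough sketch that puts some numbers into descending order:

int[] numbers = { 5, 12, 3, 99, 42, 7 };

for (int pass = 0; pass < numbers.Length - 1; pass++)
{
    for (int i = 0; i < numbers.Length - 1 - pass; i++)
    {
        if (numbers[i] < numbers[i + 1])
        {
            // swap the pair so the larger value moves towards the start
            int temp = numbers[i];
            numbers[i] = numbers[i + 1];
            numbers[i + 1] = temp;
        }
    }
}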

Today I did a tutorial with the students where we explored the transition from level 3 to 4. The best advice I have on this matter is not to stress if the penny doesn't drop first, second or third time. Don't think of it as a technical problem (I must re-read my notes so that I understand arrays better) but think of it as a "way of thinking problem".

Work with what you know a program can do (stash things in arrays, get them back, work through elements, compare values and do things with them etc) and then try to figure out how these abilities can be used to solve the problem.

Consider lots of related problems: find the biggest item in an array, find the smallest in an array, count how many times the value three appears in an array etc etc and notice how fundamental behaviours (working through the elements of the array in sequence) can be used to solve a whole class of problems. Don't worry if your answers seem complicated to you. You get better with practice, and some things are just tricky to do.
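
For instance, the same "work through the elements in sequence" behaviour solves both of the problems above; only the work done inside the loop changes. A sketch, using made-up values:

int[] values = { 3, 7, 1, 3, 9, 3, 4 };

// Find the biggest item by working through the elements in sequence
int biggest = values[0];
for (int i = 1; i < values.Length; i++)
{
    if (values[i] > biggest)
    {
        biggest = values[i];
    }
}

// Count how many times the value three appears, using the same loop pattern
int threeCount = 0;
for (int i = 0; i < values.Length; i++)
{
    if (values[i] == 3)
    {
        threeCount = threeCount + 1;
    }
}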

I learned to program a long time ago, but I still remember the worry of "What if I don't get this bit" every now and then. Your best bet is to start early, find friends to discuss it with and keep the focus on what you are trying to do. And you'll be fine.

Of Broccoli and Stoppers

I don't like broccoli. Never have. Give me a plate of food with some broccoli on it and I'll eat the broccoli first. This is because I like to get rid of the bits I don't like before moving onto the stuff that I do. Note that I don't leave the broccoli. That would be impolite. And a waste of food.

I do this kind of thing in software projects too, as I was telling my project students this week. I reckon that step one of any project is to "Identify the stoppers". Stoppers are the tricksy things that must be made to work otherwise you don't have a working system. It might be storing the data. It might be getting the network to connect. It might just be being able to compile and run a program on the target device. These are the "broccoli" in your project. And you should eat them first.

It's tempting to start with the easy bits and leave the nasty, difficult bits to the end. However, this can lead to problems. You really don't want to be doing the hard bits when you are under time pressure at the end of the project. And you really don't want to find out at the end that one of your "stoppers" is actually impossible. Much more recoverable if you find out at the start.

I've asked my students to identify the stoppers in their projects and report back at the next meeting.

Hull Knowledge Factory Student Talk

At least now I know what happens if I try to use the Panorama feature of my camera to take a picture of the audience. Sorry if you were cut off.

I did a talk for a bunch of Knowledge Factory students today. These are folks who will be joining us at the end of the month as students, but have come along early to spend a few days getting a taste of university life.

The subject of the session was the joys of "Making Stuff" and it was great fun. Thanks for being a lovely audience folks. During the talk I mentioned some bits and bobs and I said I'd post references for anyone who fancies following up on the things I talked about.

Arduino

Arduino is the name of a family of embedded computers of different sizes. These are the kind of computers that you would put inside a device to control what it does. I use them in my wedding lights and other gadgets that I've made. You program them in C using a very easy to use framework that you can download for free from the Arduino web site. There are versions of the framework for Mac and PC. You put a program into the Arduino device and it runs that program each time the power is switched on.

You can buy Arduino branded devices but they are a bit pricey. It is much cheaper to go onto eBay and just search for Arduino. A company called Sintron makes some very nice kits of parts to play with; these start at around 30 pounds. Once you have the kit just search the web and you'll find loads of libraries, sample code and videos to get you started.

If you want some books to read about the platform I'd look for books by Simon Monk. He has written some good Arduino primers, plus a few other fun books. 

Programming 

There is no such thing as the best programming language in the world, but I quite like C#. You can get a free C# book, plus a lot of teaching materials and sample programs, from here.

If you want to learn some Python (and why not, it's great) we have a course of sorts here.

3D Printing

My 3D printer is an Ultimaker. I call her Una and I made her from a kit a few years ago. You can find all my 3D printing posts here.

Blogging 

Blogging is a great way to practise writing and maybe even make a name for yourself. I did a Rather Useful Seminar about blogging. You can find it here.

Give Yourself Time to Fail

Restarted the teaching today. I did a show of hands in the first year lecture to find out how many people had started the coursework, which is due for completion in three weeks or so. I was pleased to find that pretty much everyone had started work on it (or was claiming to have).

It is very important to start your computer science coursework as early as you can. This is because you need to have time to fail. By that I mean that there will be times when your program won't work and you won't be able to work out why. (this still happens to me by the way). In this situation you need time to be able to walk away from the computer and go back to it, which is a problem solving technique that works for me a lot of the time.

The other technique I use is explaining the problem to somebody else, or even just the wall. Half way through the explanation I hit upon the "broken assumption" that is the cause of the problem and can then go and fix it.

This is one of the things that makes computing different from lots of other disciplines. I don't think it is OK to work on the basis that you can dash off an essay overnight, but if you do, at least by the time morning comes you will have a bunch of words that may or may not be the right ones.

Try this technique with a programming assignment and at around 10:30 in the evening you'll find that your program doesn't work and you've no idea why. And you have no time to step away for a while and then come back and fix it. So start early.

Careers and Internships Networking Event

Gathering at the start. Nothing brings folks in like Free Food..

We held a Careers and Internships event last year. It went really well so we thought we'd do it again. So we did.  And it went really well again. We had loads of companies show up and present, and then they manned stands and took business cards (that we had rather thoughtfully provided) from our students. 

Peter gets things going

One thing that surprised and pleased me was the number of companies in the area doing world beating stuff. And one company mentioned the awesome news that Hull was one of the top ten cities singled out in a recent Tech City UK report. You can find the report here.  Skip to pages 45 and 46 for the good stuff. 

Plenty of action at the exhibition again

It was great to see the students and employers engaging again. Many companies had brought Hull University graduates with them as part of their teams, and there was something of a reunion flavour to the event, which was really nice. And, of course, we'll run it again next year.

Ooh. Free pens..

If you're a Hull student, I wrote a little executive summary about the event. Send me an email and I'll let you have a copy. 

Charlotte Talks Industrial Placements

As soon as I found out that Charlotte Godley, one of our students, had landed a placement at Airbus Industries I made a mental note to ask her to do a Rather Useful Seminar on her experience when she returned to the department. Today she came along and gave that seminar.

It was excellent.

Charlotte started with reasons why you should take a placement for a year (it just makes you all round more awesome) and reasons why not (it is hard work, and you might get out of step with chums in your cohort who will graduate just as you come back). Then she spoke about the best way to get a placement. I think her approach really boils down to three words.

Have a plan.

Having a plan means things like finding out about a company and tailoring your CV and accompanying letter to chime with what they do. It means thinking about the kind of questions you might get asked at interview and coming up with some really good questions of your own for the company. It means preparing for careers events and making hit lists of companies to target. But most important, it means giving some thought to what you really want to do in your future.

A placement is a great way to find out if you really want to work in a large company, or write Python programs, or travel the world in a van solving mysteries (my favourite). It is also a great way to learn the ways of work, where suddenly everyone around you is not the same generation as you and everything stops at 5:30 leaving you exhausted but looking for things to occupy yourself with.  

Charlotte gave a very good description of these issues and the fact that there were so many detailed questions at the end of the session was a testament to how well the material had been delivered. She has put her slides up on her blog, and I've asked if she wouldn't mind doing a screencast of the deck, as I'm sure it would be useful to all our students.

We are having a Careers and Internships Networking event in two weeks. Hull students can sign up here and get a set of free business cards that they can pass around. We'll be releasing the list of companies coming along so that you can prepare your "hit list".

Students from any year really should come along. First years can be thinking about internships over summer (we have some developers in the cohort who would be well up for this) and second and onward years can be thinking about taking years out or finding employers.

Programming At Hull Web Site

Over the last couple of days I've spent some time adding content to our WhereWouldYouThink web site. This is a site that has lots of stuff that might be useful to students in general and computer science students in particular.

I'm trying to rationalise some of the support materials that we have for programming and I've attacked the problem in the only way I know how.

I've bought another domain name.....

I've set up a microsite at the address programmingathull.com. This brings together all the stuff that we've put together to help folks learn to program. This includes the wonderful C# Yellow Book and Arduino and Python content. If we make anything else useful we'll put it there too.

WiFi Security on the Radio

You've probably seen the film "The Truman Show". It's about a chap who is unknowingly the centre of a reality TV show. His life is being continuously filmed for a worldwide audience. Everything around him is choreographed so that the viewers can see his reaction to events. There's a memorable sequence in the film where Truman is heading out to work and all the traffic around him is planned like a military operation.

I was reminded of the film this morning when I was trying to drive into the middle of Hull to take part in a radio item about WiFi security. Everything, and I mean everything, seemed to be happening in a manner calculated to make me late. I had a mental vision of someone in a control room speaking into a headset and saying "OK, he's had the slow running train and the reversing bus, now let's set up the trick cyclist and red light sequence......".

I was a bit late, but they managed to shuffle things around and we had what I thought was a good chat. It was in response to a piece in the news about a seven year old girl who had learned how to hack into an unsecured public WiFi system in around 11 minutes. You can read a good description of it here.

The story had been set up by a WiFi security company (who would have guessed). The girl wasn't actually a hacker in the proper sense, more someone who could find a video on YouTube and then copy the instructions in it. Actually I feel rather sorry for her, in that she now has "how to hack wifi" in her Google search record for the rest of her life. Oh well.

But the story did hold important lessons on security. The most important one is probably that folks need to be aware of the dangers that using free "open" wifi brings. By "open" I mean the kind of connection you don't need a username or password to access. When you use these your phone, PC or tablet will frequently give you a warning, and with good cause.

The open nature of these connections means two things. Firstly it means that the data exchanged between the network and your PC is not encrypted, so anyone can see what you are doing. Secondly it means that it is child's play - literally - to make a computer pretend to be the WiFi connection and perform a "man in the middle" attack, reading the contents of each of your messages before passing them on to the network.

So, using an open WiFi connection must be regarded as fraught with risk. If you have to use a username and password to connect things are probably OK. Lots of hotels have a little printer on reception that prints out a set of credentials that you can use for a limited time. These are probably OK. But places where you can just find the site and then connect must be regarded as rather dodgy.

If you really must use an open site (and we've all done it - including myself who has been known to install Windows Phone firmware upgrades in Starbucks the world over) then here are a few tips:

  • Only visit web sites that have https (and the little padlock in the address bar) while you are online. These encrypt the conversation between your computer and the server so that any eavesdropper will get an earful of meaningless chatter.
  • You can use your banking applications quite safely, as these will encrypt the data sent to and from the bank.
  • If you really, really, must log in to use sites that are not https secure, use username and password pairs that are unique to that site. One nasty trick that hackers have up their sleeves is to take credentials that you use in one place and then try them on lots of other ones. If possible you should really have different passwords on every site you visit to stop this from happening.
  • Once you have finished, check your device to see if it has remembered the connection. Lots of phones and tablets keep a handy list of sites so that they can reuse connections if they see them again. This means your phone might try to remake the insecure connection again without you knowing. I'd advise you to delete the connection from the list to stop this from happening.

Networked devices are massively useful and we have built large chunks of our lives around them. But you also need to remember that some of this wonderful technology was not really built for the nasty world that it is being used in and make sure that you limit exposure to these horrid tricks.

Which Computer Science Option Should I choose?

After my First Year lecture today someone asked me an interesting question: "Which Computer Science option should I choose for next year?". Our course is structured so that you can choose your specialism (Computer Science, Software Development, Game Development, Information Systems or Computer Systems Engineering) at the end of year one. I think (although I may be wrong) that the person asking the question was primarily concerned about the best option to choose from an employability point of view.

My answer was that you should pick the one you are most interested in. When you go for a job it is very likely that your prospective employer will not be looking for someone with a particular specialism. They will just be looking for someone who is keen on the field of Computer Science and shows the prospect of being useful.

To me this means a candidate who has done lots of things that aren't on the syllabus and can talk about the things that they have made "just for the fun of it" as well as the good grades they got on their set coursework.

So, do the thing you enjoy, and do stuff that you haven't been told to do, and don't worry quite so much about being a good fit with what you think the employers might want. 

Make Your Documents Work for You

I've spent a chunk of today performing Seed exit vivas. This is where students on our industrial placement module have to come along and explain why they should get 5 out of 5 for Project Management. Or whatnot. We discuss things for 45 minutes or so and finally agree on figures for each marking category. Sometimes the figure goes down, but in a surprising number of cases we end up delivering the happy news that we think that they have undervalued their work. Which is nice.

One thing I like to do is point at pages that have been supplied as part of the thick folder of documents and ask "What's that for?". This can be quite illuminating. For example:

"What's that for?"
"It's the Risk Analysis."
"OK, where did it come from?"
"Well, at the start of the project we wrote down all the risks we could think of, and that's the result."
"Did you ever look at it again?"
"No. Should we?"

.. at which point the conversation goes downhill a little bit. Risks should be identified and then tracked over the project. At regular intervals the document should have been produced and checked over to make sure that nothing has changed, and that none of the risks were becoming critical. 

If you are going to take the trouble to make a document that is part of your development then you are making an investment in your time. It is important that the investment pays off. Documents should "earn their keep".

The Risk Analysis document should be checked and updated at regular intervals to make sure that risks are managed. Minutes of meetings should record who was there, what was said, give people actions and check on actions from earlier meetings. Specifications should be signed off. Test documents should be acted on and then the results of the tests recorded and used to drive future development. I could go on (and in fact I did - quite a bit).

I got the feeling that some of the documents were shoved in "because we thought we had to write them". I also got the feeling that some folk thought that writing all this was a distraction from the proper job, which was creating the solution for the customer. However, this is very, very, important stuff. It can make the difference between success and failure in a project. And doing it right will definitely get you higher grades....

Of Garbage Man and Banjos

A screenshot of my Alien Banjo Attack demo

This year the first year students can create a game as part of their programming coursework. The game is "Alien Banjo Attack" and above you can see a screenshot of my demo version. At the top are various different types of evil banjo and your mission is to defend the earth using nothing more than your accordion which will shoot notes of music to destroy the incoming swarm.  If the banjos reach the bottom of the screen you lose the game. If you crash into a banjo you lose a life. Space Invaders with a banjo feel. What's not to love?

I want there to be loads of objects (notes, banjos etc) on the screen at the same time, which means a lot of instances of the various classes that represent the game objects. There has been some discussion about the best way to handle this, as banjos are destroyed and new ones appear.

I reckon the best way to do this is to make a banjo stateful. The banjo will contain a flag to represent whether or not it is involved in the game. If a banjo is destroyed this flag is set to indicate that it is dead and the banjo plays no further part in the gameplay. When we need a new banjo we just have to find one which is marked as dead and "resurrect" it by moving the banjo to a new position and then changing the flag to bring it back to life.
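
In code it might look something like this. This is only a rough sketch; the class and member names are made up for illustration and are not taken from my demo:

class Banjo
{
    public bool Alive;   // false means this banjo plays no part in the game
    public float X, Y;

    public void Kill()
    {
        Alive = false;
    }

    // Bring a dead banjo back into play at a new position
    public void Resurrect(float newX, float newY)
    {
        X = newX;
        Y = newY;
        Alive = true;
    }
}

class BanjoManager
{
    // When a new banjo is needed, look for one marked as dead and reuse it
    public static Banjo FindDeadBanjo(Banjo[] banjos)
    {
        foreach (Banjo b in banjos)
        {
            if (!b.Alive)
                return b;
        }
        return null;   // every banjo is currently in play
    }
}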

You might think that creating new instances of banjos to replace discarded ones would be another way to achieve this behaviour but I don't like that at all. If we start creating and destroying objects this will make work for the Garbage Collector who will have to come in and tidy up memory every now and then which might slow things down a bit. 

Another way to reuse objects would be to keep a list of "dead" banjos. Each time a banjo is killed we move it out of the "active" list into the list of dead ones. That way, if we need a new banjo we just have to look in the list. This is a bit more complicated than just searching for a banjo that can be resurrected, but it does have the advantage that the game never wastes time working through "dead" display objects, as these are no longer in the active list.  Many operating systems use this technique in what is called a "thread pool" where previously used threads are kept ready for use by processes that might need them.
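
A sketch of that list-based approach, reusing the made-up Banjo class from above, might look like this:

using System.Collections.Generic;

class BanjoPool
{
    List<Banjo> active = new List<Banjo>();
    List<Banjo> dead = new List<Banjo>();

    // Move a destroyed banjo out of the active list and into the dead list
    public void Kill(Banjo b)
    {
        active.Remove(b);
        dead.Add(b);
    }

    // Reuse a dead banjo if one is available, otherwise make a brand new one
    public Banjo Spawn(float x, float y)
    {
        Banjo b;
        if (dead.Count > 0)
        {
            b = dead[dead.Count - 1];
            dead.RemoveAt(dead.Count - 1);
        }
        else
        {
            b = new Banjo();
        }
        b.Resurrect(x, y);
        active.Add(b);
        return b;
    }
}

The game loop then only works through the active list when it updates and draws, which is where the time saving comes from.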

Phil and Stuart talk iPlayer Transcoding

Phil Cluff and Stuart Hicks getting ready for the off

We love it when ex-students come back to the department to tell us what they've been up to. Phil Cluff graduated a few years ago and Stuart Hicks a couple of years after that. Since graduation they've both been working for the BBC on the iPlayer team.

If you're not from the UK you might not appreciate just how wonderful BBC iPlayer is. It lets you consume BBC TV output on any device, in any place. All the output of the BBC channels, plus the content from BBC local TV, is made available shortly after broadcast so you really don't have to worry about missing things. The programmes are available for a number of days after transmission and they are also made available for download. It's awesome. Phil and Stuart told us that 3% of the consumption of BBC output is now via iPlayer and it is growing rapidly. There have been around 28 million downloads of the iPlayer apps across the various platforms, they get around 10 million requests a day and release around 700 hours of new content a week. Amazing stuff.

Of course, getting all that content onto the internet is a tricky job. The original content has got to be converted into the different video formats and sizes that are consumed by the users, and this must happen as soon after broadcast as possible. Nobody wants to have to wait until tomorrow to see today's news. There are huge peaks and troughs in demand and this makes it tricky to manage the computing resource that you need for the task.

A little while back the BBC decided to move their transcoding (as this is called) activities into the cloud. Phil and Stuart described the cloud as "A collection of services that someone else runs for you, which lets you not care about hardware". Which is just what the BBC needed. So they put together a project to build a cloud based "Video Factory" from scratch in 18 months. And they did it. 

Large computing projects are notable for failing or being expensively late. The Video Factory was neither of these things. Phil put this down to the way the design was based on a number of small components, each of which performs one step in the process of getting the video into your iPhone or whatever. The workflow of the video data through the system is managed as a series of messages that are passed from one component to the next. Components take messages out of their input queue, perform their part of the process on the message and pass it on to the next component. If things get busy and queues start to fill up the system can "spin up" extra processing resource in the form of further servers in the cloud which can run components to work on the extra elements.
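
The shape of the idea is something like the toy sketch below. To be clear, this is nothing like the BBC's actual code; the class names and the "work" are invented purely to show the message-in, process, message-out pattern:

using System.Collections.Generic;

class TranscodeMessage
{
    public string ProgrammeId;
    public string Stage;
}

class PipelineComponent
{
    Queue<TranscodeMessage> input;
    Queue<TranscodeMessage> output;

    public PipelineComponent(Queue<TranscodeMessage> input, Queue<TranscodeMessage> output)
    {
        this.input = input;
        this.output = output;
    }

    // Take one message from the input queue, do this component's step on it
    // and pass it on to the next component's queue
    public void ProcessOne()
    {
        if (input.Count == 0)
            return;

        TranscodeMessage message = input.Dequeue();
        message.Stage = "transcoded";   // stand-in for the real work
        output.Enqueue(message);
    }
}

Because each component only talks to queues, you can run as many copies of it as the length of the queue demands, which is what makes the "spin up more servers" trick possible.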

And because each component has very well defined inputs and outputs this makes testing much easier. Phil and Stuart talked about Behaviour Driven Development, which is where you express what you want to achieve in terms of business outcomes. They used a tool called Cucumber which lets you express your requirements in a really easy to understand way, and will also produce a whole framework of tests you can use to prove the solution. This is very, very interesting stuff.

It was great to see Phil and Stuart again. It looks like they are doing very well. It was also really nice to hear them echo a lot of the things that we tell our students during the course.  They had some very useful things to say about the craft of development. You can find the slide deck of their presentation here. I strongly advise you to take a look at the last few slides, there are some lovely references there that you can use to find out more.

Computer Science and the Hour of Code

Sunday's paper had a big section on how important it is to learn to program. I agree. There was a lot of kerfuffle a few weeks ago when Lottie Dexter, a Conservative activist hired to manage a "Year of Code" initiative, admitted that she didn't know how to write software. Lots of people got very cross about that and they got even crosser when she went on to say that you can learn how to teach programming in a day or so. 

Oh well.  I think/hope that what she meant to say was that you can learn to do something useful/fun with a program and tell someone about it in one day. And I suppose that you don't need to be an active practitioner of a field to promote it. Nobody seems to expect the head of British Airways to be a qualified pilot. 

You can watch the item on Newsnight here. For me the most unwholesome aspect of the whole interview is the offhand manner of the interviewer, Jeremy Paxman, who gives the impression that learning how to instruct the machines that control our lives today is somehow beneath him and that's how it should be. Oh well again.

Of course if Lottie Dexter had said "Learning to program is very hard, takes ages, and you need an enormous brain to be able to do it" she would have been in about the same amount of trouble. I reckon I've been learning to program for the last forty years or so, and I wouldn't put myself forward as that much of an expert. But I can get useful things done. And I know (mostly) which things are impossible to do with computers.

For me the important thing about the whole learning to program business is that if you have a go you know what you can and can't do with computers. You get a feel for what is possible and you are motivated to have a go. And maybe you'll like doing so much that you will decide to learn a bit more....

And with that I present "Rob's Things to Do if you want to learn about computers (or indeed science and technology in general)"

  • Have a go. Go to the hour of code site and have a play.
  • Don't expect to learn everything at once. One of my favourite tweets was from someone who said "It seems to have taken me around 20 years longer than I expected to become a good programmer". It won't take you 20 years to learn how to be useful with a computer, but just like a great concert pianist is always finding new things in pieces of music and new ways to express them, programming has a lot of art in it and you are always discovering new ways to do things. But you can't ever say you've learned all there is to know about the subject. The good news though is that you don't have to.
  • Have an aim in mind. Programming is all about making things. You can't just learn it in isolation, you have to be building something. Set out to make a silly game, a clever web page, a moving robot, a musical instrument or anything that you find fun and engaging.  I'm presently trying to make some remote controlled table lights for a wedding. I've learned about a whole new branch of embedded technology and I'm still finding out new things. I'm having a great time. And I might even make some lights by the end of it. 
  • Find friends. Working together is much more fun. You can solve problems by just explaining them to each other. Join a group. Start a group. Look for STEM ambassadors in your area and find out what is going on. 
  • Keep it simple. Get something simple working and add things to it. Don't have a huge, complicated idea and then fail to make it work. 
  • Remember it is supposed to be fun. If you start to fret about whether you can make your thing work then take a step back, change what you do, or go and listen to music or play computer games for a while and then come back and try again.

I've no idea how my life would have turned out if my school hadn't pointed me at a teletype connected to a computer all those years ago. But I'm jolly glad they did. Perhaps you should take a look too. 

What does ?? mean in C#

I sent out a tweet asking folks if they knew what the ?? operator in C# does. Quite a few people did, which was nice. For those of you that haven’t heard of it, here is my explanation.

If you are a C# programmer you will know about references. The way that C# works, a variable can be managed by value or reference. A variable of value type is stored in a location in memory. For example:

int i;

This creates an integer variable called i. When the program runs the system will decide where in memory the variable should live. You can think of computer memory as a whole bunch of numbered locations. Because that is what it is. Perhaps the variable i could live at memory location 5,000. Then when we write the code:

i=99;

The effect of this is to put the value 99 in memory location number 5,000.

So, if I write code like:

j = i;

This would copy the value held in memory location 5,000 into the location where the integer variable j is stored.

This is how value types work. Whenever we do an assignment of a variable managed by value what happens is that the value is copied from one memory location to another.

So, what about values managed by reference? Well, in these there is a level of indirection between the variable and the actual data. Suppose we have a class called Account, which is managed by reference.

Account a = new Account();

This statement makes a new account variable and then sets the variable a to refer to it. The variable a will hold a reference to the new Account. Perhaps the new Account will be located at memory location 10,000 which means that the variable a (which might be stored at location 6,000) will hold the number 10,000 – because that is where the Account instance is stored. The Account class might have a Name property, so I can write code like:

a.Name = "Rob";

When the program runs it will go to location 6,000 (where a is stored) and read the number out of there to find out where the Account is. In this case the variable a holds the number 10,000 and so the program will go to the Account there and set the name.

So if I write code such as:

Account b = a;

This creates a new Account reference called b which refers to the same Account instance as a, in other words it will refer to location 10,000.

So, in the case of the value the information is just there in memory, but for a reference we have to go where the reference refers. With references you can also set them to null:

a = null;

This has the effect of putting a “magic value” in the variable a that indicates it really points nowhere.  It is a way of saying “this reference does not point to any object”.

The null reference is often used in programs to indicate that the thing you asked for could not be found, or hasn’t been made yet, or doesn’t matter.

This “nullability” is so useful that people wanted to be able to make values “null” as well. So they invented one.

int? ageValue;

The addition of the question mark makes an integer variable (ageValue) that can be made null. For example, the bank might store the age of a customer when it is important (when the age is less than 20 say) but once a person reaches a certain age, from then on the age is completely irrelevant to the system. They can mark the ageValue as null to indicate this.

ageValue = null;

Programs can also test for null:

if (customerAge != null)
{
    // Stuff you do if the age matters
}

In XNA you often find nullable value parameters being sent to a method, so that the method can know to ignore them.
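
The kind of thing I mean looks like this. The method below is invented for illustration, not a real XNA signature:

// rotation is a nullable float; passing null means "don't rotate at all"
void DrawSprite(float x, float y, float? rotation)
{
    if (rotation != null)
    {
        // rotate the sprite by rotation.Value before drawing it
    }
    else
    {
        // draw the sprite with no rotation
    }
}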

So, I’ve been writing for what seems like ages, and I’ve still not explained what ?? does.

Well, ?? provides a convenient way to map the null value of a nullable variable onto a specific value:

int actualAge = customerAge ?? -1;

It saves us having to write code that tests for null and sets a value appropriately. The above statement sets the value of actualAge to the value in customerAge unless the value in customerAge is null, in which case it sets it to –1.

if (customerAge == null)
    actualAge = -1;
else
    actualAge = customerAge.Value;

In other words ?? is a short form of the above test.

Kids Can’t Use Computers?

A while back I read this post titled “Kids Can’t Use Computers – And This Is Why It Should Worry You”. It depressed me. Not because I agree with all of the sentiments of the author, but more because I worry that people may take this kind of viewpoint seriously. The underlying tenet of the article (I’ll save you the bother of reading it) is that only a few people can actually use computers, especially kids. And this is apparently a bad thing.

The author brings out a collection of “horror stories” of people who “can’t use computers” for one reason or another, including not knowing that their laptop has a WiFi off switch. It says a lot for the nature of computer folk that a good portion of the debate following the post is an earnest discussion of whether or not laptops should have such a switch. I’ve actually been caught out by this myself, and therefore, by definition, am unable to use a computer. Oh well. Back to the drawing board for me.

The author completely misses the point that it is perfectly reasonable to expect that things should just work. Engineers have been very good at producing products that “just work”. I can remember when starting a car in cold weather was a tricky affair which involved lots of fiddling with the choke and pumping the accelerator pedal. Nowadays you just push a button.

Cars have been around for well over a hundred years, and their fundamental function has not changed over that time. Computers have been around for much less than half of that and we are still discovering new things we can do with them. I think it is safe to say that we have a much better handle on how to make a useable car than we do a useable computer. Which means that people will sometimes have problems getting their systems to do what they want.

The current generation of computer hardware and software is very prone to failure when confronted with the frailty of human nature in all its forms. But people will translate their experiences with their cars to computers. Why should the computer not just work when I press the button?

It is interesting to watch kids use technology, to see them running their fingers over the TV screen and expressing surprise when it doesn’t respond to their touch. To see them start watching a video on an iPad and then become confused when the video doesn’t just continue on the TV when they turn it on. They are going to see these things as omissions that need to be fixed, not evidence that they are too stupid to use those devices. And quite right too.

So what do we do about it? The author suggests a “back to basics” approach so that everyone can learn as much about the low level functions of computers as he is evidently proud of knowing. Then we can all argue about the best version of Linux to run on a mobile phone while our kids throw things together and build the future out of what sticks. Just like it ever was.

It is very important to learn the low level stuff, to have an understanding of the limitations of the computer, what they are really good for, and how you get them to work for you. Learn to program. Absolutely learn to program. But then use that ability to make things and have fun making things. Take ideas (silly and otherwise) and give them life. Spread them round. Build solutions and games that make people happy (or at least smile). Learn about the technology by playing with it. If it turns out that your ultimate interests lie outside the realm of processors and memory then that’s fine, but an understanding of what a computer is good for and how you make it work is always really useful to have.

Never regard your skill with computers as marking you out as in any way special unless you can do everything else as well. I can write code, but I can’t draw for toffee. If an artist comes to me with a computer problem I’ll not call them stupid if I can solve their problem and they can’t. Because I can’t do what they can. 

A proper computer professional is as good with people as he/she is with computers. In fact, bearing in mind that a lot of computing is finding out what people want and then making a happy ending from their wishes, I reckon that good inter-personal skills are more important in computing than they are in just about any other field. Work at these as you would a new programming language. And learn to write well.

For me computing is all about having fun making stuff and engineering happy endings. I’m not looking for a future where everyone is “clever” enough to use a computer. I’m looking for a future where we have enough people around who are able to produce compelling and useful applications for those who want to use computers to make their lives better. In other words, computers should “just work” and we should have folks around who are skilled enough to make this happen (and enjoy doing it).

Writing with Colour at the Guardian Masterclass

Anyone can write, just like anyone can cook. As soon as you move from restaurants and ready meals to getting ingredients and mixing them in pans you can start thinking about getting a white uniform and people shouting “Yes chef!” to you across steam filled kitchens. Moving beyond shopping lists and one line Facebook updates means that you can start pondering putting “writer” on your business card and extracting killer quotes from unresponsive interviewees. Or then again, perhaps not, because of course the really important thing is what everyone else calls you.

If you are the only person that thinks you are the next Jamie Oliver then you might have a hard time getting folks to eat your food.  And while the internet does provide a potential audience of billions, getting them all to come and read your web site will take more than just your idea of deathless prose. This means that you have to do the hard stuff, like practice and learning how to get better.

I’ve never dared call myself a writer; I’m more someone who throws a bunch of words at a blog post every day to see which ones stick. But today I went along to a Guardian Masterclass called “Writing with Colour” to find out a bit more about this writing business. There was actually another reason for going as well: the sessions were being given by writers who I’d long admired from afar, and I liked the idea of admiring them from a bit closer up.

There were about 80 or so of us on the course, which took place in the actual Guardian newspaper building in London. The sessions were all great. If you have a low opinion of journalists and editors then you should go along, just to find out how thoughtful and considered these folks are about what they do.  I’m pretty sure that not all writers are like this, but these were folks who I’d be happy to listen to all day, which is just as well, because that is what we did.

A few of my thoughts from the sessions:

Read what you have written. Out loud. All the writers took evident pleasure in reading what they had put on the page. This is a confidence thing, I reckon, and darned good advice. Sometimes you might like what you hear. If you don’t like it, go back and change it until you do.

Be loyal to your work. This can mean a bit of internal wrangling as you seek permission to print that quote from a reluctant interviewee. It might mean you can’t be a totally nice person all the time. And it might mean dropping that wonderful sequence because it doesn’t add anything to the piece.

Always deliver what you were asked for. Someone asked Lucy Mangan what she did if her four o’clock deadline came along and she hadn’t thought of anything to write about.  Her reply was brilliant. That. Does. Not. Happen.  If you are a proper writer and you are asked to write something that’s what you do. You can wrestle with your inner demons about the content (and you should) after you have pressed the send button, but the important thing is if you are asked for 550 words you should deliver 550, along with a convincing pitch for why you should be allowed another 200 or so.

Always edit, and always cut. The editor is the person who makes things better and tighter, sometimes by cutting out what the writer thought of as the best bits. If we end up losing the traditions of print journalism I reckon the editor is the person we will miss the most. This probably means that writers will have to spend more time editing their own work. So try to do this.

Work at what you write. I was very pleased to find that nobody said that they found writing easy. Everyone said they had to work at it. Interviews take preparation and persistence in writing everything down. Features take research and rewriting. And the work doesn’t stop when the piece is finished; everyone valued re-visiting items and looking at why they wrote what they wrote.

Seek out the colour. Work to find that killer fact, or interesting angle, which will give you a hook to hang your words on or will be quoted in the pub by your readers. If you are very lucky the colour will find you, but mostly you find it in the research you did, or the huge pile of notes that you made.

Last week I sent a jaunty tweet to the organisers saying how I was bringing along some crayons, as the subject was “Writing with colour”. I can imagine the sinking feeling in the stomach of the recipient, along the lines of “We’ve got a right one here….” Sorry about that.

Anyhoo, I found the whole affair really stimulating, and if you want to get tips about improving your writing style, and maybe meet a few heroes, then it is well worth the price of admission. And the lunch was good too.

Think of the Audience

Some time back I wrote a blog post about the most important thing in a project. To save you the trouble of reading it again, I concluded that the biggest risk that any project can run is that you might not be able to make it work.

I’ve been thinking about presentations in a similar light having seen a bunch over the last week at the Imagine Cup. So, what’s the most important thing in a presentation? Is it the script? The demos? Running to time? The jokes?

Actually I reckon it’s none of these things. The most important thing in any presentation is the audience. If you don’t build your presentation with them in mind then it will not go as well as it should.

Thinking about the audience begins at the start, when you worry about whether or not what you are going to say will make sense, is pitched at the appropriate level, and the like. I reckon that the thing an audience likes the best is a story, so presentations that have some kind of narrative flow are going to go well.

During the presentation you should be watching the audience to make sure that what you say is going down well, and don’t be too afraid to change tack. Asking questions to confirm that you are going in the right direction is a good idea too. It builds your confidence and establishes a rapport.

If you are now thinking “Great, now I have to worry about watching the audience as well as everything else…” then I’m sorry about that, but I think it is something to keep in mind. For me the worst presentations are where the presenter just talks at the audience. You should try and make the presentation a conversation as much as you can. With very large numbers this can seem a bit daunting, but remember that an audience of 10,000 people is actually made up of 10,000 individual people. If you think in terms of talking to just one of them, then that will help you manage this.

For me the best presentations I saw last week were those that engaged the audience from the start. So see if you can do the same when you stand up and start talking.

Heading for London

Ian Livingstone playing the game that got us to London. He reckons we should set it in space, and I think he might just be onto something….

To start with, a bit of history. What seems like ages ago David, Simon, Lewis, David and I took part in Global Game Jam Hull. And we built a game. Then Microsoft offered a prize for the best Windows 8 game that came out of the Game Jam and made it into Windows Marketplace. So David and Simon took the game engine, made it marketplace ready and shipped it. And Heartotron won. At the time we weren't sure just what we had won, but part of the prize turned out to be a trip to London to tour a game development studio and meet up with some of the folks who made the game industry what it is. And so we found ourselves on a train at 8:00 in the morning, speeding through the sunshine and looking forward to an interesting day.

Which we got in spades. It was great. First up was a look around Lift London, a shiny new Microsoft game studio with a focus on making games the indie way. No big (or at least huge) teams, flexibility, appropriate technology and total commitment to the product are the order of the day. Lift London also sees incubating fledgling games developers such as Dlala as part of their remit, which is very interesting.

With people on the team who can say things like “..and then we went on to write Banjo-Kazooie…” or “..and then we did Singstar..” alongside folks who have grown up writing and pitching games any way they can I reckon we can look forward to some fun stuff in the future.

We got to look at some work in progress, which was fascinating for me. The transition from ideas to drawings to models on the screen was intriguing. I get very jealous of people who can draw, and loved seeing these people in action, and how they can turn out lovely looking artwork with just a flourish of their Wacom pens. Most impressive.

Then it was time to move on to Modern Jago, a pop up venue in Shoreditch, for a workshop with the gaming legends Ian Livingstone (Vice chairman of Ukie, Games Workshop founder, president at Eidos), Andy Payne (chairman of UKIE and CEO of Mastertronic) and Philip Oliver (TIGA board member and CEO of Blitz Games Studios).

Each had plenty of time to speak and plenty to say. Some things that I took away, in bullet form:

  • There has never been a better time to be writing games. Cheap tools and an easy path to market give you amazing potential.
  • There has never been a more competitive time to be writing games. The statement above is true for everybody.
  • Put your monetisation in at the start. If you are going to sell your game, be up front about that (although recognise that very few people will part with cash to buy it). If you have a pay to play model, put that in the central game mechanic. It is impossible to add it later.
  • Use metrics and feedback. Track downloads, watch for reviews and scores, use telemetry to be able to tell how far people get through the game, how long they play for, and when they give up. Phase your releases so that you get feedback from one part of the world (for some reason Canada and New Zealand are popular for this) before you go global.
  • Look for the “soft” market. A big splash in a small pond with a future has more potential than trying to make an impression in a huge marketplace with scant resources.
  • Get a following. Napoleon reckoned that with 1,000 followers you have an army of your own. He wasn't on Twitter, but you can be. Give opinions and help to people out there and build a following of folks who like you. A crowd who like what you do and want to see what you do next are great to have around. Be loyal to them and they will repay you by supporting what you do.
  • Get a job. You might plan to be a lone gunfighter releasing your fantastic stuff for the world to marvel at, but it is much easier to do this with a roof over your head and a full stomach. One of the things you need to succeed is luck (everyone said this). Napoleon (him again) reckoned that he always preferred his lucky generals to his clever ones. If the luck isn't there, and it may not be, you still need to eat. Make time for your development and go at it full tilt, but there's nothing wrong with having a backup plan.
  • Get some “skin in the game”. This kind of goes against the above but I still feel it is something to think about. If you feel that the stars are aligning and that this is “the one” then feel free to go for it every way you can. Living in a van for six months while you raise funds and build your product base might be the thing you have to do to achieve success. Worst case maybe your boss (see above) will have you back – particularly if you part on good terms.
  • The three most important aspects of a game are playability, playability and playability. But graphics and high production values are a way to distinguish your product and get people in to discover just how good your game is. But this, of course, costs money and time. As does buying a place in the charts that gets you noticed, something else which might be a necessity.
  • Put yourself out there. I was particularly pleased to hear this one, as it chimes with what I’ve thought for years. You need to be able to do the “front of house” stuff. This is really bad news for software developers, who tend not to be the most extrovert folks, but it is a necessary skill. Get yourself in front of people. Practise doing stand-up, meeting and greeting and networking. If you don’t have these skills other people will not compensate for your lack of them. They’ll just find someone else more interesting to talk to. Ian Livingstone himself said that this is one lesson he took a long time to learn. The great thing about computer folks is that they are used to picking up new tools and APIs. Treat this as just another thing you have to learn and get good at.
  • Make a good story. The press is interested in you, but only if you are interesting to them. “I’ve made a fishing game” is not useful to them. But “I’ve made a fishing game that I wrote underwater whilst wrestling a Great White Shark on the Barrier Reef” is. Make sure that you have a good tale to tell. Use your followers (see above) to big you up and help you get noticed.
  • If you are working as a team, set some ground rules (another favourite of mine). Have a plan for what to do if your lead developer gets a “proper” job and stops writing your game engine. Have a protocol and a policy for re-negotiating your arrangements when these events happen.
  • You don't need to be the best, or cleverest. Just be there at the right time, in the right place, with the right thing. Try lots of things, in lots of places as frequently as you can. Don't expect success to happen the first, second or perhaps even the third time. But as soon as you get a sniff of something that seems to be working, follow it, develop it and ride it, and you might be the next big thing.

All in all a fantastic day out. Thanks to Microsoft for setting it up.