The logical (and arithmetic) binary left shift effectively doubles a number by moving its digits one position to the left and inserting a zero in the least significant position. In practice the number of bits in a processor’s arithmetic register is fixed, so individual bits fall off the left side with each shift, and if you place a 1 at the least significant position then its value will double with each shift until it too falls off the left side. You can imagine this happening all at once, or just one bit at a time, though in the latter case you must start moving bits from the left and work towards the right, or you’ll blitz the entire register with copies of the same bit. The rule is [click title to read more…]
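For the curious, the fixed-width behaviour can be sketched in a few lines of Python (the 8-bit register width here is my own illustrative choice, not anything from the post):

```python
# Simulate a left shift inside a fixed-width (8-bit) register:
# each shift doubles the value, and bits that cross the top edge are lost.
WIDTH = 8
MASK = (1 << WIDTH) - 1  # 0b11111111 keeps only the low 8 bits

def shl(x):
    """Logical left shift by one position within the 8-bit register."""
    return (x << 1) & MASK

x = 0b00000001  # a 1 placed at the least significant position
for _ in range(WIDTH):
    print(format(x, "08b"))
    x = shl(x)
# after WIDTH shifts the 1 has fallen off the left side and x is 0
```

Run it and you can watch the 1 march leftwards, doubling each time, until it disappears off the edge of the register.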
What follows is by way of explanation for possible observers of an annual phenomenon.
JB sat in his chair one New Year’s Eve as the minutes ticked closer to another beat of the 1980s, surrounded by family and their “young adult” friends, already well lubricated after a few hours of merriment. If you wanted to laugh until it hurt, this was the house to be in. As distant church bells started to greet the new year, we (for this is my family and I was there) heard a commotion outside in the street, vulgar language from some people who had obviously over-lubricated.
Leaning forward and raising himself from the chair like Old Man Time himself, JB made for the [click title to read more…]
As the kindling of 2017 takes hold, a new concept is slowly rolling across the land. The compound word that best captures it is “post-truth”, which was declared by Oxford Dictionaries to be the “Word of the Year 2016”. It refers to the nascent contemporary period in which self-evident or provable truths are no longer accepted as such by a growing proportion of society.
Why is this? What would make seemingly intelligent people say or do things that are more in keeping with Alice’s “up is down” Wonderland?
Let’s suppose that most intelligent people will accept the following: “if we are informed that A is true and we know that A being true implies that B is true, then [click title to read more…]
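The rule being set up here is classical modus ponens, and a toy truth-table check in Python (entirely my own sketch, not from the post) confirms it never fails:

```python
def implies(a, b):
    """Material implication: 'A implies B' is false only when A is true and B is false."""
    return (not a) or b

# Exhaustively check: whenever both premises hold -- A is true, and
# "A implies B" is true -- the conclusion B is also true.
for A in (True, False):
    for B in (True, False):
        if A and implies(A, B):
            assert B  # modus ponens: from A and A->B, conclude B
```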
Everyone is busy predicting what’s going to happen in 2017. So, while the eternally optimistic are having their annual conflab with the doom-and-gloom tribe, I’ve just had a peek over my shoulder to remind myself of what’s just gone.
January saw both Alan Rickman and Terry Wogan shrug off their mortal coils, while scientists finally completed row 7 of the periodic table of elements (and thus one of my favourite temporary names, ununpentium, is no more). The departure of Irish stars continued in February with Frank Kelly (feck!) and we had another General Election, which was inconclusive and resulted in weeks of negotiations before a government was formed. We also had LIGO’s announcement of the first observation of gravitational waves! [click title to read more…]
Recently, the poet wife of a good friend left this world. She had once asked me what I missed most about my days as a lecturer. Without hesitation I said it was witnessing those brilliant moments when students suddenly understand something complex or see, for the first time, the elegance and beauty in a software algorithm. She could relate to that.
Yesterday, purely by coincidence, I decided to open my most recent reprint of Knuth’s Art of Computer Programming and start from the beginning, as I did many, many years ago. I have no idea why, but if you believe in “those kind of things” you may be stunned by the opening line from Chapter 1 (emphasis mine):