The Deputy Presiding Officer (Christine Grahame): The next item of business is a debate on motion S5M-05733, in the name of John Swinney, on safe, secure and prosperous: achieving a cyber-resilient Scotland.
15:13
... ... ...
15:41
Stewart Stevenson (Banffshire and Buchan Coast) (SNP):
On 9 February 1984, we saw the launch of the first real-time, high-value money transfer system: the clearing house automated payments system, or CHAPS. I was the project manager for the Bank of Scotland, which was the first bank ready to implement. I well remember our excitement later that year when we made our first real-time, irrevocable payment of over £1 billion. By 2011, the system had processed £1 quadrillion of transactions—in other words, a thousand million million pounds, or a 1 followed by 15 zeros.
To secure the transactions, I had to gain permission from the US Department of Defense—and sign my life away—to use what was categorised as weapons-grade encryption and digital signing software. It operated from within a black box that self-destructed if someone attempted to open it to examine its contents. The technology was—and is—as secure as one could possibly imagine, and the objective today should be to ensure that every business and individual is in possession of similarly impenetrable security. We are, but we do not all choose to implement it. My point, however, is that even if we do so, we do not necessarily use it in a way that allows it to be as secure as we might imagine it to be. For the most part, it is not the technology that fails; it is humans who fail.
The motion says that
“citizens ... must be aware of the risks”.
Indeed, in his opening remarks, John Swinney said that this should not be the responsibility of the Government alone. The history of human failure to properly use secure data systems goes back a very long way. Two thousand years ago, slaves had their heads shaved. A message was written on their scalp; the hair grew back; and the slave and the message were sent elsewhere. That was all well and good—until people realised what method was being used. Having a secret method provides no real security, and that remains true today.
Indeed, effective data security systems rely on their having been published and scrutinised to confirm that their methods are sound. However, we need to keep the keys secret and change them frequently. In the 16th century, Mary Queen of Scots used a two-cover system to protect her confidential messages. The first was a secure box with two locks and a key for each—she had one key, while the other was held by the recipient; and no one else had access to either key. Mary put her message in the box, she locked it and then it went to the recipient, who used his key to lock his lock. The box came back to Mary, who unlocked her lock, and went back to the recipient, who unlocked his. It was a secure system for transmitting a message from A to B in the 16th century, because nobody shared the key or had access to it.
The second aspect of the system was encryption of the message inside the box through a letter-substitution system. However, that is where Mary fell down. She thought that the system was totally secure, because transmission was secure, but when the message came out of the box, she forgot that it was now a bit of paper that was available to anyone who might be passing. Queen Elizabeth I picked up one of her messages and was able to unscramble it, and it formed part of the evidence at the trial of Mary Queen of Scots, which led to her execution. Data security is quite important.
Napoleon had le grand chiffre—the great code. Common letters of the alphabet were not always coded in the same way, so that people could not break it by analysing frequency. However, encoders started to use some of the spare codes over and over again, as place names for where the fighting was, in order to save time and effort. Wellington’s code-breaker was a man called George Scovell and, because of the weak way in which that good system was used, he managed to break in. When Wellington got to the battle of Waterloo, he knew what Napoleon’s plans were, and that led to the end of an empire. Again, that was human error.
The Enigma machine, which the Germans thought was unbreakable until 1945, was actually broken by the Poles in 1932. Bletchley Park broke a later, improved version because, every day at 6 am, the Germans sent out an encrypted weather forecast. The fact that it was in the same format and at the same time every day enabled people at Bletchley Park to break what should have been a very secure system—of course, they had to do lots of other good things as well. Once again, there was human error.
Most of us know how to drive a car, but rather fewer of us know how the mechanical bits work or how to fix them when they fail. Most of us also know how to use a computer and perhaps even use the security functions that are provided with it. However, as with a car, if we do not get an expert to service it regularly or to fix it when it fails, disaster will loom. All businesses should have regular security check-ups. They will not be free, but the cost of not doing them will be even higher. It is like insurance; it is a product that a business cannot just buy when it wants it—when its reputation is trashed and its customers have flown, paying a little bit once a year will seem very cheap.
My final example of a security problem is from the modern world. I bought a good-quality second-hand car, as I usually do, and it had all the gadgets, including a Bluetooth connection for my phone. That is good technology, but an unaware previous owner of my car had left his phone’s entire contact list in the car’s memory. Do members realise that they could do that, too? I am a good guy and I deleted it, but suppose the chief executive—
The Deputy Presiding Officer: You are such a good guy that you have to wind up now, intriguing though this is, Mr Stevenson.
Stewart Stevenson: In that case, Presiding Officer, let me caution chief executives and chairmen of companies not to use Bluetooth in their cars unless they know how to delete data from the memory. I am a good guy and I deleted it, but not everybody is as honest and trustworthy as I am.
15:47