Here's something nice for a lazy Sunday. Red Hat's Alan Cox, currently on leave and getting his MBA at Swansea University in Wales, spoke recently at the launch of an advanced technical computing group run by IT Wales, part of Swansea University's computer science department, and they have been kind enough to let us enjoy a video [that's the mpg -- Real Player and DivX here] of his speech. His theme was how to write better software. If you prefer to read about it, here is the introductory article, and, on page two, excerpted highlights from the talk.
Gartner Vice President Victor Wheatman gave a speech recently too, and his message was: don't look to Microsoft to solve all your security problems. He said there are only 500 or so software engineers in the entire world with the skills to figure out software glitches and fix them. I'm not sure whether he means only 500 outside of Microsoft whom a company could hire to help, or whether that is the grand total on the Windows partition side of the world.
There's no shortage of manpower on the free and open source side, for sure. Companies might want to think seriously about the implications of what Gartner is here saying. Once they do, they may decide that they are using the wrong operating system.
Wheatman says the world has been beta testing Microsoft's products for them. By that he apparently means that Microsoft releases software before all the bugs are out, and they use us to find them. That wouldn't be so bad if it were clearly labeled as a beta release, if you didn't have to "upgrade" to the unstable beta unless you wanted to, and if there were a process in place for feedback and for addressing the bugs found. Wait. That'd almost be the open source method of bug fixing. The missing piece is that in FOSS you can fix it yourself, if you know how, and then contribute your fix to the world, so others don't have to figure out what you already fixed.
Of course, that isn't how it works in the Windows world. How it works there is you find out you have a security issue when someone takes over your computer without your awareness and sends spam to the FTC in your name or something. You realize it only when you get a response to your "offer", somebody's spam blocker rejects "your" spam, or you get an email from "yourself" to you offering you Viagra or something else you don't need or want. Do you want your company to be that kind of beta tester? Here's a snip from the Gartner speech:
"'We've all been part of the biggest beta test the world has ever known--Windows. Microsoft will not solve all of the security problems, no matter what the richest man in the world says,' said Gartner vice president Victor Wheatman in a keynote speech at Gartner's IT Security Summit on Monday.
"Wheatman kicked off the conference saying that removing faulty software during operation was costing firms up to 5 percent more than finding flaws during quality assurance tests.
"'One of the problems is that there are maybe only 500 software engineers in the world who can burrow around in that code to find the problem. That's something the industry needs to look at,' he said."
Well, we aren't *all* part of that beta test. I finally got rid of my Windows partition, because I realized I hadn't used it in about 8 months.
Cox addresses the Microsoft beta test phenomenon and explains it:
"Another factor that's led to the current state of affairs is that of canny software companies which shi[p] bad software as quickly as possible, on the basis that once the end user has one piece of software for the job it becomes harder to switch to another one - in that context, Cox considers Microsoft's release of early versions of MS Windows as a very sound economic and business decision. . . .
"You take a WinXP machine, you plug it onto the internet, on average you have 20 minutes before it is infected with something, if it's not behind a firewall. That is considerably less time than you need just to download the updates. These are becoming economic issues, because they're starting to cost businesses all over the world astronomical amounts of money."
If you are a small company with, say, 15 computers, and everyone uses the internet, just think of the mess you are in after 20 minutes. Think of what your company is looking at in the way of costs to fix all the problems, if you even have anyone who can. Firewalls help, of course, and Cox says firewalls need to be on by default. But they are not impregnable, especially in Windows, because you can't easily see what is going on in proprietary software, and even if you take a peek, you are not legally allowed to change anything in Microsoft's software. You have to wait for Microsoft to fix what ails its software.
And not everyone takes the time to configure their firewall properly, sometimes because they don't know how. Anyway, the simple truth is, there's always someone more skilled than you are, unless you are Richard Stallman or Linus or someone like that. So it can happen in any operating system.
But I know from my Windows experience that it happens more readily in that operating system, and that you often don't realize it for a long time, if ever. Proprietary software prefers that you not worry your pretty little head about things; just sit back and let them do everything for you. That makes it convenient to use but hard to control, and when there is a problem, that opacity compounds the problem. You may be oblivious to trouble going on right under your nose, because you can't readily poke around and see what is happening.
But quality control in software isn't just a Windows issue. Cox suggests it's time to recognize that all humans make mistakes, so all software will have bugs, and that it therefore makes sense to use computers for what they are good at: checking our work with validation tools, mathematical models, defensive interfaces, scripted debugging, and rule validation, as well as document trails and statistical analysis, to keep the error count as low as possible. His most interesting point, I thought, was what he has to say about root cause analysis:
"I've got a friend who works on aeroplanes, and he has the wonderful job of, when a piece of an aeroplane falls off, cracks, or something before it was supposed to, they go to him and say 'why did it happen?'. And it's then not a case of saying 'oh, this analysis is wrong', it's saying 'how did this analysis come to be wrong? How did it make this wrong decision? Where else have we made this decision?' People are starting to do this with software. . . .
"All of this sort of analysis then leads back to things like, what tools didn't we use? Are our interfaces wrong? And because you're able to actually start digging in and get data, you can start to understand not only the 'oh, it's failed, I'll fix it', sort of the car mechanic approach to software maintenance, but actually the need [to] do the kinds of things that should be done and which go on elsewhere, where you say 'Why did this fail? Where else have we got this? Where else will it fail? What should I do proactively? How do I change the software component involved so it can't happen again, or so that it blows up on the programmer when they make the mistake, not blows up on the user when they run the software?'"
Here's SANS's Top 20 internet security vulnerabilities:
This SANS Top-20 2004 is actually two Top Ten lists: the ten most commonly exploited vulnerable services in Windows and the ten most commonly exploited vulnerable services in UNIX and Linux. Although there are thousands of security incidents each year affecting these operating systems, the overwhelming majority of successful attacks target one or more of these twenty vulnerable services.
They provide remediation instructions, too, and what better day to work through them than a lazy Sunday afternoon?