They are etched into the conventional wisdom of IT security, but are these 12 articles of faith (to some) actually wise, or are they essentially myths? We've assembled a panel of experts to offer their judgments.
1. There's security in obscurity.
David Lacey, Jericho Forum founder and researcher: Yes, there is. Not everything is known or knowable to an attacker. This uncertainty prevents and deters the vast majority of attacks.
Nick Selby, analyst, The 451 Group: No, there's convenience in security. Say you're trying to keep your kid from discovering the birthday party plans you're making, and you don't want the workaday toil of waiting until he's asleep to discuss them. So around the dinner table, speak German. Now, for protection of ... well, anything, it's just not on. Wherever you hide the front door, it is trivially discovered, so recognize you live in a bad area, get a strong front door with good locks -- and don't hide the key under the garden gnome.
Bruce Schneier, crypto expert, chief security technology officer at BT: All security requires some secrets: a cryptographic key, for example. But good security comes from minimizing and encapsulating those secrets. The more parts of a system you can make public -- the less you have to rely on secrecy or obscurity -- the more secure your system is.
Peter Johnson, global information security architect, Lilly UK: It can slow down the bad guys, but they will find out in the end. It is like closing the front door at home, and hoping nobody will try opening it.
John Pescatore, Gartner analyst: Only true within the bounds of the tried and true concept of 'need to know.' For example, keeping your password obscure is obviously a smart strategy -- only you have a need to know. ... Where this one falls apart is when the assumption is that 'obscurity means security.' This is never true -- and worse, when people design software with this concept in mind, all kinds of bad things happen.
Richard Stiennon, independent analyst: I was thinking about this in terms of Web application firewalls. There are 70 million Web sites but probably only a few thousand Web application firewalls sold so far. Most Web sites are protected by the principle of security through obscurity.
Andrew Yeomans, vice president global information security at an investment bank, and Jericho Forum member: Obscurity buys you time, but doesn't last forever. Obscurity can add an extra barrier, and may deter poorly resourced attacks. But a better-resourced attacker may succeed, and as costs keep dropping, may only need low-cost resources in the future. And once obscurity is lost, security is lost forever, too.
2. Open source software is more secure than closed source.
Yeomans: At least when open source breaks you get to keep the pieces, and might be able to glue them together yourself. Some open source software has been well inspected ('many eyes make bugs shallow') but conversely other open source software is relatively insecure. There's probably little to choose between comparable open and closed source software on pure security grounds. But open source has the advantage that you can do a code review yourself, or pay to have one done, and also that it is possible to fix problems yourself without having to wait for the vendor.
Lacey: They present a different set of risks. Neither is more secure than the other.
Schneier: Secure software is software that has been analyzed by smart security programmers. There are two basic ways to get software analyzed: You can pay people, or you can make the code public and hope they do it for free. Open source software has the potential to be more secure than proprietary software, but making code public doesn't magically make it more secure.
Johnson: At least you know what you're getting [with open source] -- but it requires a different approach to support it, particularly in a regulated environment.
Pescatore: This one is not far off, but still not true. The most secure software is software that is developed with the most attention to security. Most open source development projects do not have much of a secure development life cycle. But I do believe that software developed knowing the source will be open is more secure than software whose design depends on security through obscurity. Developers are less likely to build in Easter eggs, back doors and other stupid things when they know the source will be widely viewed.
3. Regulatory compliance is a good measure of security.
Lacey: Yes, it is. I have always found a direct correlation between the number of controls implemented and the level of incidents and vulnerability.
Selby: (laughter)
Stiennon: Obviously not. You can be extremely secure but not compliant. Just as you can easily be compliant but not secure.
Schneier: Compliance is a good measure of the regulation. If the security regulation is a good one, then compliance improves security. If it's a bad one, then it doesn't.
Yeomans: It's not always a measure of good security. Regulatory compliance will help provide a reasonable base level of security, and may make it easier to justify the budget cost. But it may sometimes lead to good security measures being non-compliant, and compliant measures being more expensive than is justified.
Johnson: There are usually many ways to comply with a regulation -- not all are equally secure. Experience has shown this, and now the regulators are starting to try to specify requirements, which is going to be difficult as they generally do not understand security.
Pescatore: No-brainer, dead wrong. Especially for something like Sarbanes Oxley, which has actually nothing to do with security. What we tell clients is: protect your business, protect your customers and then demonstrate compliance to whatever regimes you are under.
4. There's no way to measure security return on investment.
Lacey: You can assess many benefits accurately based on historical statistics, but not every benefit is measurable and future benefits cannot be guaranteed.
Schneier: There are lots of ways to measure security ROI, all of them flawed. This doesn't mean we should stop trying, however.
Yeomans: ROI makes a lot of sense for a vendor, much less for a purchaser. 'Prevention of a possible loss' isn't a gain, otherwise I'd be rich from not betting on the lottery! Some security investments have a measurable return, such as more customers or lower expenses. For example, the security measures allowing safe online banking and shopping have generated a positive return in those industries. But it quickly became a minimum requirement for doing business, especially for later entrants to the business.
Pescatore: There are plenty of ways to measure security ROI, but there are very few times when doing so makes any business sense. Have you ever seen a CEO ask what the ROI is in having a roof on the building, locks on the doors? The real issue is tying security into business needs -- the business needs determine the ROI.
Stiennon: There is a way today because most organizations have security budgets, so they can measure spending on security and compare it to the cost of improved security.
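One common way to put numbers on the question the panel is debating is the Return on Security Investment (ROSI) calculation built on annualized loss expectancy (ALE). The sketch below uses entirely hypothetical figures -- it is not a formula any of the panelists endorses here, just an illustration of why Schneier calls every such measure "flawed": the rate-of-occurrence inputs are guesses.

```python
def annualized_loss_expectancy(single_loss_expectancy, annual_rate_of_occurrence):
    """ALE = SLE * ARO: expected yearly loss from one threat scenario."""
    return single_loss_expectancy * annual_rate_of_occurrence

def rosi(ale_before, ale_after, annual_cost_of_control):
    """Return on Security Investment: (risk reduction - cost) / cost."""
    risk_reduction = ale_before - ale_after
    return (risk_reduction - annual_cost_of_control) / annual_cost_of_control

# Hypothetical numbers: a breach costing $200,000, expected once every
# 2 years, reduced to once every 10 years by a control costing $30,000/year.
before = annualized_loss_expectancy(200_000, 0.5)   # $100,000/year
after = annualized_loss_expectancy(200_000, 0.1)    # $20,000/year
print(f"ROSI: {rosi(before, after, 30_000):.0%}")   # ~167% return on paper
```

The arithmetic is trivial; the uncertainty lives in the occurrence rates, which is Pescatore's point that business context, not the formula, determines whether the exercise is worth doing.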
5. The Russian cybermafia is to blame for the worst online crime.
Stiennon: The RBN [Russian Business Network] is responsible for some of the most malicious malware and concerted attacks.
Lacey: Depends on what you mean by "worst." It certainly is responsible for a lot.
Yeomans: Traditional fraud committed with a computer beats them.
Schneier: They're certainly to blame for a lot of it, but I don't think we know enough to rank the various criminal organizations from best to worst.
Pescatore: This one isn't far off, but who cares? If your house is robbed because you left the windows open, does it matter where the thief came from? Close the vulnerabilities and you stop all kinds of cybercriminals.
Johnson: I would not like to comment to protect the safety of my family.
6. Antivirus software is essential to prevent malware.
Lacey: Yes, it is. Just try operating without it.
Yeomans: Only on some platforms with some types of user. Some people seem to attract malware, others don't. And desktop systems are more likely to be hit than servers, Windows XP more likely than Unix and Vista. The scale of production of malware variants also makes it even more difficult for pure antivirus systems to keep up. Expect a trend toward white-listing and sandboxing techniques and away from simply looking for known bad stuff.
Schneier: Antivirus software is necessary but not sufficient. I suppose if you have a really secure network, you don't need antivirus software on the hosts. But why take the risk?
Johnson: It keeps the noise down so you can concentrate on the quiet and dangerous malware that the traditional antivirus is likely to miss. It is certainly still a must in the Windows environment, but that is starting to be challenged based on the lower visibility of malware attacks today.
Pescatore: On the desktop, antivirus software is primarily a removal tool, not a prevention tool. In the e-mail flow and in Web security gateways, antimalware is a must.
Stiennon: Not a myth. The myth would be: Configuration management and a behavior-based solution can protect you from malware.
7. Outsourcing security is riskier than staying in-house.
Lacey: Yes, it is. You lose a massive amount of visibility and control.
Schneier: People are risky, whether they get a paycheck signed by you or one signed by the outsourcer. Focus on how those people are hired, how they are trained, how they are monitored, and how they are audited -- not on who signs their paycheck. Often, an outsourcer has more security measures in place than you do.
Johnson: Operationally, it makes little difference; understanding the requirement, setting the expectation, and then monitoring the compliance is the key.
Pescatore: If you need 24/7 coverage, choose a solid managed security service provider, and choose the right services to outsource -- then for three out of four businesses, this myth is dead wrong.
Stiennon: Outsourcers can hire better people and because they see more real bad things, they are better at reacting.
Yeomans: You can't outsource your liabilities. But specialists might beat the local generalist team. It all depends. A well-skilled in-house team will likely beat an outsourcer, but might not be able to provide 24-hour-a-day cover. And if an in-house team doesn't have the skills or time, the outsourced security will be lower risk.
8. Biometrics is the best authentication.
Pescatore: Only in the movies.
Yeomans: So long as you don't mind getting it wrong quite often. False acceptance rates and false reject rates will need to be understood. Biometrics fits some problems well, but not all.
Lacey: Depends what you mean by 'best.' It's the ideal approach but not yet perfected.
Johnson: At least you cannot forget it -- but it is a bit of a problem changing it regularly. As with many solutions, implementation is the key.
Schneier: Like all security systems, biometrics have value but are not a panacea. There are applications where they make great authentication systems, and there are applications where they do not make sense at all.
Selby: Have you ever stood at a door or a laptop swiping your finger like an idiot? Even a New York City MetroCard has a certain cranky rhythm. Now let's roll out some kind of biometric-device lock to all 61,000 of our employees. We're safe now -- Yes, Gretchen, put your eyeball up to the eyecup. No, look straight ahead. Not working? Maybe you're not really Gretchen, 'Gretchen.' Hurry up -- there are 43,600 people trying to get into that bathroom door.
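Yeomans' point about false acceptance and false rejection rates is the core tradeoff in any biometric system, and it can be sketched in a few lines. The match scores below are invented for illustration; real systems tune the threshold against measured score distributions.

```python
def far_frr(impostor_scores, genuine_scores, threshold):
    """False accept rate: fraction of impostors scoring at/above threshold.
       False reject rate: fraction of genuine users scoring below it."""
    far = sum(s >= threshold for s in impostor_scores) / len(impostor_scores)
    frr = sum(s < threshold for s in genuine_scores) / len(genuine_scores)
    return far, frr

# Hypothetical match scores (0-100) from a fictional fingerprint matcher.
impostors = [12, 35, 41, 48, 52, 20, 33, 60, 45, 50]
genuines  = [55, 62, 70, 58, 80, 49, 90, 66, 75, 61]

# Raising the threshold trades false accepts for false rejects:
# there is no setting that drives both to zero.
for t in (45, 55, 65):
    far, frr = far_frr(impostors, genuines, t)
    print(f"threshold {t}: FAR={far:.0%} FRR={frr:.0%}")
```

A strict threshold locks out impostors but also, as Selby jokes, leaves the real Gretchen swiping at the door; a lenient one does the opposite.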
9. Digital certificates identify a Web site.
Stiennon: Good one!
Yeomans: When used by good people and processes. Public-key cryptography is still mathematically sound for verifying a certificate, but it's only as good as the processes for handling the certificates.
Schneier: Digital certificates can identify a Web site but who ever looks?
Lacey: They do if the recipient understands how to use them.
Pescatore: Extended validation SSL certificates do identify a Web site for those of us using new enough browsers to recognize them and who have actually figured out what a green URL bar means -- still less than half the users.
Johnson: But is it the right Web site, and a safe one? How many users know how to use certificates, and even if they do, what about all the advertisements, and other content feeds?
10. Employees can be trained to behave securely and resist social engineering online.
Stiennon: Love it.
Pescatore: This will be true when gambling casinos go out of business because people no longer fall for the illusion that they might actually win something.
Yeomans: Yes, but remember Abraham Lincoln said, "You can fool all of the people some of the time." Education will greatly help people detect many security problems, but there will always be some that get past even experts.
Selby: Porn on the DCI's laptop. That kind of says it all, about employees behaving securely. And resisting social engineering is really, really hard, as most people you'd want to hire are socially disposed to try to be, at the very least, helpful.
Schneier: We're human, and we act as humans do. Social engineering preys on our inherent human-ness. While you can train people to behave better, you will never be able to train them not to be human.
Johnson: The well-known saying comes to mind -- "You can lead a horse to water, but you cannot make it drink." Training in what to do raises the bar, and reduces overall incidents, but training users to think secure should be the goal.
Lacey: You can achieve a substantial improvement but people are not foolproof. See my forthcoming book, Managing the Human Factor in Information Security, due out in January for details of how to do this.
11. Don't worry, the government has a secret cyber-defense capability.
Selby: In the same drawer as its secret economic fix-it plan.
Lacey: It certainly does. How do you think they spy on other nations?
Yeomans: Of course it does. But unless you are in a business that cannot be allowed to fail, don't depend on the government to help you. They will have more important people who need help.
Schneier: If they do, then we really should worry -- because it's not working very well.
Johnson: They do, but it does not extend to defending your privacy.
Pescatore: Well, this is true but the secret strategy is to disconnect from the Internet. The strongest attacks are coming from cyber criminals, not governments or nations. The strongest defenses [that don't involve isolation] are seen in private industry, not government.
12. The longer the key length, the stronger the encryption.
Yeomans: And the greater the chance that the weak point is elsewhere. I've rarely seen key protection that was as strong as the encryption. If you protect a key with an eight-character 'strong' password, it won't matter whether the key is 64 bits or 256 bits. And don't forget the choice of algorithm matters when mentioning key length: 256-bit AES is far stronger than 256-bit RSA.
Lacey: For the same algorithm and key material, that's absolutely correct.
Schneier: A long key length is essential for good security, but it is not sufficient. There are lots of ways to break encryption systems that completely bypass the key length.
Johnson: Yes, but only if the algorithm, the implementation and associated processes are good.
Pescatore: When stronger means harder to brute force, this one is true. However, putting 10 locks on your front door just means your hinges are now the weak point. Brute forcing encryption is almost never the path of least resistance, so increasing key sizes (beyond recommended minimal lengths) is rarely any increase in security.
This story, "Myth or truism? Security experts judge" was originally published by Network World.