Month: February 2009
Enhancing Security and Functionality At The Same Time
Have you ever been sucked into the false debate over how much of the IT budget should go to security? I used to be all the time. Some folks point to a rule of thumb that goes something like “ten percent of the IT budget should be applied to security.” That old-school formula may well be part of the reason we got into the mess we are currently in: it encourages the idea that security can be separated from everything else. By my way of thinking, 100% of the budget goes to security and functionality, and that is the calculus.
Really, security is about ensuring information confidentiality, availability and integrity. And those constructs are totally connected to the functionality of IT. I try whenever possible to use the terms security and functionality in the same context just to underscore that point.
For example, the goal I continually push regarding security in the federal space is not just one dealing with security. I put it this way: “Security and functionality of all federal IT will be increased by two orders of magnitude in the next 24 months.” Putting the goal this way also underscores that it is not security vs. functionality. Both need to increase.
This goal also cries out for the need for metrics in security and functionality. For functionality there are many customer-focused survey methods that can help collect the right metrics. For security, I think one metric stands out above all others: detected unauthorized intrusions. There are many other important metrics for other dimensions of the security problem, but that one is key. So, a goal that expects both security and functionality of federal enterprise IT to improve by two orders of magnitude will expect customer satisfaction survey results to go through the roof, and will expect detected intrusions to drop dramatically. If there were 50,000 detected intrusions in 2008, a two-order-of-magnitude improvement would mean no more than 500 in 2010.
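To make the arithmetic behind that kind of goal concrete, here is a minimal sketch. The baseline and target numbers are hypothetical, used only to illustrate what an order-of-magnitude reduction in detected intrusions means:

```python
def intrusion_target(baseline: int, orders_of_magnitude: int) -> float:
    """Target detected-intrusion count after an N-order-of-magnitude improvement."""
    return baseline / (10 ** orders_of_magnitude)

# Hypothetical baseline: 50,000 detected intrusions in 2008.
baseline = 50_000
print(intrusion_target(baseline, 1))  # one order of magnitude
print(intrusion_target(baseline, 2))  # two orders of magnitude
```

The same function works for any baseline an enterprise actually measures; the point is that the goal implies a concrete, checkable number, not a vague aspiration.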
That is a dramatic goal. What makes me think it is achievable? In part the dramatic action being put in place today in the federal space. And in part by dramatic new technologies and approaches like private clouds and thin client computing and enhanced identity management and authorization methods. But of more importance and more relevance than all of that, in my opinion, is the coordinated action and leadership underway by CIOs and CISOs and the security experts in the federal space today.
As evidence of this incredible positive action I’d like to bring your attention to a release by a Consortium of US Federal Cybersecurity Experts on Consensus Audit Guidelines. Details of this effort are at http://www.sans.org/cag/
The Consensus Audit Guidelines provide the twenty most important controls and metrics for effective cyber defense and continuous FISMA compliance. These controls and metrics include:
Critical Controls Subject to Automated Measurement and Validation:
- Inventory of Authorized and Unauthorized Hardware
- Inventory of Authorized and Unauthorized Software
- Secure Configurations for Hardware and Software on Laptops, Workstations, and Servers
- Secure Configurations of Network Devices Such as Firewalls and Routers
- Boundary Defense
- Maintenance and Analysis of Complete Security Audit Logs
- Application Software Security
- Controlled Use of Administrative Privileges
- Controlled Access Based On Need to Know
- Continuous Vulnerability Testing and Remediation
- Dormant Account Monitoring and Control
- Anti-Malware Defenses
- Limitation and Control of Ports, Protocols and Services
- Wireless Device Control
- Data Leakage Protection

Additional Critical Controls (not directly supported by automated measurement and validation):
- Secure Network Engineering
- Red Team Exercises
- Incident Response Capability
- Data Recovery Capability
- Security Skills Assessment and Training to Fill Gaps
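To give a feel for what “automated measurement” of the first control could look like, here is a rough sketch: diffing a discovered-asset list against the authorized inventory. The hostnames and data sources below are invented for illustration, not taken from the CAG:

```python
# Sketch: flag unauthorized hardware by comparing a network scan result
# against the authorized asset inventory. All names here are hypothetical.
authorized = {"web-01", "db-01", "mail-01"}          # from the asset database
discovered = {"web-01", "db-01", "laptop-rogue-7"}   # from a network scan

unauthorized = discovered - authorized   # on the network but not approved
missing = authorized - discovered        # approved but not seen (possibly offline)

print("Unauthorized devices:", sorted(unauthorized))
print("Unseen authorized devices:", sorted(missing))
```

In a real enterprise the inventory and scan data would come from asset management and discovery tools, but the measurement itself reduces to this kind of set comparison, which is exactly what makes the control automatable.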
The site at http://www.sans.org/cag provides more details on each, including detailed descriptions of the controls, how to implement them, how to measure them, and how to continuously improve them. The site also spells out the fact that this is a work in progress and processes are in place to ensure this great effort remains relevant and maximizes our ability to protect ourselves.
What should CTOs think about this guidance? As for me, I most strongly endorse it. In my mind the appropriate implementation of these controls will reduce unauthorized intrusions in any enterprise.
The deeply respected community leader Alan Paller, director of research at the SANS Institute, said it this way:

“This is the best example of risk-based security I have ever seen. The team that was brought together represents the nation’s most complete understanding of the risk faced by our systems. In the past cybersecurity was driven by people who had no clue of how the attacks are carried out. They created an illusion of security. The CAG will turn that illusion to reality.”
Please give these controls a read, and please help get them into the hands of the security and functionality professionals in your enterprise.
The Future of the Grid: From Telecommunications to Cloud-Based Servers
There was once a time, long long ago, when telecommunications and computing were two different concepts. That was the age when phone company operators manually switched calls and computers like ENIAC were programmed by patches and cables. Since then the two fields have been on a convergence path. The many advances in both fields since the 1940s make for exciting reading for computer and telecom fans, but rather than recount those achievements here I’d rather talk about a more modern achievement of note: the establishment of the Advanced Telecommunications Computing Architecture (ATCA or AdvancedTCA).
ATCA is an open standard that has been around since about 2003. It has been continually enhanced, and today it is perhaps the most broadly accepted standard in the telecom industry, with over 100 companies participating in development and implementation of the specification. Perhaps more important is the adoption of the standard across the telecommunications industry. A review of Wikipedia entries and other open information (like the Intel Embedded and Communications Alliance) indicates the typical “hockey-stick” adoption curve seen with other highly reliable, highly virtuous standards. IDC projects the ATCA market will be about $2.7 billion in size by 2013. I think the global financial crisis and the ongoing wave of acquisitions of smaller communications and equipment providers by larger ones will accelerate this trend even further, as the need for modular, low-cost, highly reliable standards grows even more pressing.
Network equipment providers face two challenges that they are addressing with ATCA: 1) the need to continue to deliver new platforms and applications, and 2) the need to reduce costs and improve productivity. ATCA provides a great opportunity to address both. ATCA standards provide a common platform that delivers lower cost, reduced maintenance, the ability to use third-party boards, and reduced vendor lock-in (more on ATCA capabilities below).
In my opinion, enterprise CTOs should work to accelerate moving the ATCA standard and compliant products into data centers. It results in more compute power per square inch, higher reliability, power savings, cost savings, long-term maintainability, and an upgrade path that does not require forklifts. ATCA is not something that currently scales down to small network devices, but it is something I believe will prove to be perfect for data center server support.
Here is more on ATCA:
– Boards (blades) in an ATCA shelf are hot swappable.
– There is no “bus” for communications in an ATCA shelf. Instead, boards communicate point to point, which is faster and ensures there is no single point of failure as in the bus model.
– Any switching fabric can be used.
– Boards can be processors, switches or specially designed advanced cards, if desired.
– The most advanced shelf management capability ever designed is in the ATCA container. If any sensor reports a problem the shelf manager can take action or report the problem to a system manager. Actions could include turning up a fan, powering off a component, or telling a human that something needs to be replaced before it fails.
– It is designed for very high reliability and very high availability.
– It runs cooler, even with its higher powered processors.
– It supports a healthy multi-vendor, interoperable ecosystem.
– It is based on open standards vice proprietary (locked-in) solutions.
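The shelf-management behavior described above follows a simple escalation pattern: automatic correction first, then component shutdown, then human alerting. Here is a minimal sketch of that pattern; the thresholds, component names, and actions are invented for illustration and are not from the ATCA specification:

```python
# Sketch of the shelf-manager escalation pattern: a sensor reading comes in,
# and the manager picks a response, from routine to human-alerting.
# Thresholds and component names here are hypothetical.
def handle_sensor(component: str, temp_c: float) -> str:
    if temp_c < 60:
        return "ok"
    if temp_c < 75:
        return f"increase fan speed near {component}"
    if temp_c < 90:
        return f"power off {component}"
    return f"alert operator: replace {component} before failure"

for component, temp in [("blade-3", 55.0), ("blade-7", 68.0), ("psu-1", 95.0)]:
    print(component, "->", handle_sensor(component, temp))
```

The real shelf manager works against standardized sensor records and events rather than raw temperatures, but the escalation logic is the essence of why an ATCA shelf can keep running through component trouble.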
Now back to the opening idea of this post. Telecom and data and compute power are not separate things anymore. Each is closely interwoven with the others, and successes in one thrust can make a huge positive difference in capabilities in the other areas. As organizations and users grow more accustomed to the power of cloud computing, they will demand higher and higher levels of reliability and resiliency from their service providers. And as service providers deliver higher levels of reliability and throughput, cloud compute providers will see more and more success, which will place increased requirements on their capacity. In both cases ATCA will provide the agility, resiliency and reliability required, which will drive its adoption further and further into the telecom and data worlds.
So, for CTOs who are concerned with maximum performance with power and space efficiency and a path to future upgrades: accelerate ATCA into your enterprise. How? I just typed the words “atca for the datacenter” into Google and got several links worth diving much deeper into.
The Future of Global IT: It’s Like the Kobayashi Maru
If you are a little rusty on your advanced science literature, theater and movies and don’t recall the story from the fictional Star Trek universe known as the Kobayashi Maru, please take a moment to watch this clip, just to get your mind going:
This part of the story, reportedly written by Gene Roddenberry himself, is about a simulation at Star Fleet Academy. For most students there it is a no-win scenario, as you saw in the clip. For one, however, there was a solution we should all remember. That solution is one I like to use to remind people of what we need to do to help enhance the security and functionality of today’s enterprise IT. More on that later.
Regarding the future of global IT, the NYT ran an article today I’d recommend to any enterprise technologist or security professional titled “Do We Need a New Internet?” The article provides a good summary of how we got into the current mess regarding security of interconnected devices:
The Internet’s original designers never foresaw that the academic and military research network they created would one day bear the burden of carrying all the world’s communications and commerce. There was no one central control point and its designers wanted to make it possible for every network to exchange data with every other network. Little attention was given to security. Since then, there have been immense efforts to bolt on security, to little effect.
It also briefly discusses some of the threats and significant penetrations we have seen, and then introduces the Stanford Clean Slate project. This project seeks to build a new Internet with improved security and capabilities to support a new generation of applications, as well as support mobile users. It is an Internet designed with security in it from day one. From the Clean Slate site:
We believe that the current Internet has significant deficiencies that need to be solved before it can become a unified global communication infrastructure. Further, we believe the Internet’s shortcomings will not be resolved by the conventional incremental and ‘backward-compatible’ style of academic and industrial networking research. The proposed program will focus on unconventional, bold, and long-term research that tries to break the network’s ossification. To this end, the research program can be characterized by two research questions: “With what we know today, if we were to start again with a clean slate, how would we design a global communications infrastructure?”, and “How should the Internet look in 15 years?”
The site provides a good synopsis of research and profiles of the leaders working on the project. I’ve heard of several similar efforts, but none so well formed, in my opinion. The thing I really like about this one is it does not require everything everywhere to be thrown out before transitioning to this new way. There will be changes required, but this is much more evolutionary than other approaches seem to be. It is a great way for us humans to take back control of the technological aspects of our destiny.
Now back to the Kobayashi Maru. How did our hero Captain James T. Kirk win the simulation? He realized that the simulation was a creation of humans and decided that it could be redesigned. He redesigned it to work better for him, and he won. That is the approach we need when it comes to Internet security, and it is an approach I think of when I read about the Stanford Clean Slate project. People like Nick McKeown have realized it is OK for us to decide what our future will be and design it. Thanks, Nick, for that; you remind me of one of our sci-fi heroes.
A Blog I Like: Haft of the Spear
Michael Tanji brings a perspective forged in years of intelligence work and a successful stint protecting information in the financial sector. He is a well published author who focuses on national security issues and is also a thought leader in the computer security domain.
At Haft of the Spear he writes primarily about technology related/enabled national security issues, which includes a heavy dose of information warfare.
Read HOTS at: http://haftofthespear.com/
Next week I write about Nicholas Carr and his Rough Type blog.
Plastic Logic and what could be the ultimate thin client
I’ve written a bit here about new display technologies that are so thin they are disruptive to our current way of work.
In October 2007 I wrote “Enterprise Requirements Come From Hollywood,” where OLED (organic light emitting diode) TVs were discussed. I mentioned the fact that once again Hollywood got it right first, with superthin displays in sci-fi and fantasy movies helping to drive user expectations and requirements. I’ve also written about thin clients, especially the game-changing infrastructure components for thin clients from Sun Microsystems. The servers supporting thin clients provide dramatic positive benefits for any IT enterprise.
And in January 2009 I wrote:
Flexible computers will arrive in production this year for early adopters, and many CTOs will use them in labs to assess applicability for massive deployment in the coming years. These flexible computers are the ultimate thin clients. Backends/servers/architectures developed for the cloud perfectly suit ultra thin, flexible computing devices. For more on this hot topic, start at the site of the Flexible Display Center at ASU.
One company poised to take advantage of the technologies of flexible displays is Plastic Logic. They are a Silicon Valley startup producing a paper-pad-thin device designed for business reading. For now their offerings are focused on the business user, and information can get onto the device either by users sending it there or by content providers pushing it.
The Plastic Logic Reader is officially still in development. It will enter the market later in 2009 via pilots and trials (I hope to get one) and then be commercially available in 2010. Complete feature lists are not available, but it supports a wide range of document types, including: PDF, DOC, DOCX, XLS, XLSX, PPT, PPTX, JPEG, PNG, TEXT, HTML, BMP, RTF, and ePub.
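As a hypothetical sketch (not Plastic Logic’s actual software), an enterprise sync tool that pushes only reader-supported documents to the device might filter by extension like this; the extension mapping for “TEXT” is my assumption:

```python
from pathlib import Path

# File types the reader is said to support, per the list above.
# Mapping "TEXT" to ".txt" is an assumption for this sketch.
SUPPORTED = {".pdf", ".doc", ".docx", ".xls", ".xlsx", ".ppt", ".pptx",
             ".jpeg", ".png", ".txt", ".html", ".bmp", ".rtf", ".epub"}

def readable_on_device(files):
    """Return only the files the reader could display, judged by extension."""
    return [f for f in files if Path(f).suffix.lower() in SUPPORTED]

print(readable_on_device(["report.pdf", "notes.TXT", "archive.zip"]))
```

A real deployment would check content as well as extension, but even this simple filter shows how broad the supported-format list is for a first-generation device.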
Users will hold this reader like they hold a piece of paper and read documents provided via wireless communications. The device weighs ounces not pounds, is thinner than a Macbook Air, and has a battery that lasts days vice hours. For more see this video of Plastic Logic CEO Richard Archuleta from the Fall 2008 Demo conference:
My suggestion to any enterprise-class CTO is to check out their website and find ways to get their capability into your lab and into the hands of your users.
I’d also suggest thinking through how these devices can fit into the rest of your enterprise, and I’d suggest you (actually, I suggest all of us) start formulating our desires for enterprise capabilities on this device. For example, what encryption will be used? How will it do identity management? How will it do access control? How will it work with a Sun Ray environment?
Foreign Spies Make Recession Worse and Steal Part of Our Future
Foreign spies are in our country for many bad reasons. Spies target defense secrets and seek to penetrate the decision-making processes of our government leaders. They also gain unauthorized access to information held by our nation’s corporations. In this time of serious economic crisis that aspect of the threat from foreign spies is particularly troublesome. Spies contribute to the problems we face in the economy.
Today one of the most damaging things spies do is steal the trade secrets and intellectual property of our corporations and research labs. The intellectual property they steal is moved overseas, where other countries (and companies inside those countries) can benefit from the investments we make in research and development. This hurts our economy in many ways. It causes the value of our research and development to be significantly sub-optimized. It hurts the ability of our companies to compete in the global marketplace. It causes more jobs to go overseas. It can threaten the survival of companies, which of course hurts both investors and employees. This is all bad for the economy. And it’s all WRONG! Our country needs to invest enough in our counterintelligence capabilities to find foreign spies and get them out of here.
A particularly insidious threat is one where a country might couple the power of spies inside our borders with cyber attacks and cyber espionage to extract information from companies while at the same time monitoring the response to those attacks. Humans can enable cyber attacks in many ways that make them far more damaging. In fact, the most feared type of data theft is one where a trusted insider moves data. With modern high-capacity thumb drives, large quantities of data can be moved in moments.
I just read an article by an authoritative source on this topic, Michelle Van Cleave. Michelle served as the head of U.S. counterintelligence from July 2003 through March 2006 and was in a position to observe firsthand some of the damage being done by foreign spies. The article outlines examples and gives a firsthand account of some of the challenges we face in this area. It concludes with:
How important is all of this, really? Cynics will scoff and say, “There will always be spies.” But I have read the file drawers full of damage assessments; I have catalogued the enormous losses in lives, treasure and crucial secrets that foreign intelligence work has caused. The memory of what’s in those files — and the thought of the people and the operations still in harm’s way — can keep me awake at night.

So we have to choose. We can handle these threats piecemeal, or we can pull together a strategic program — one team, one plan, one goal — to reduce the overall danger. We can chase individual spies case by case, or we can target the services that send them here. The next devastating spy case is just around the bend. I fear that when it comes, we will all ask ourselves why we didn’t stop it. I suspect I already know the answer.
I recommend this article to all, especially enterprise technologists. If you are a CTO, a CIO, or a CISO it is especially important for you to understand the nature of the threat to your systems and to your intellectual property. If you are a citizen it is important for you to know as well. We must collectively address this challenge to our intellectual property and to our economic recovery.
For more on these topics please see:
http://www.ctovision.com/cyber-war/
and
http://www.ctovision.com/information-warfare/
Intelligence Community Executive Forum and Carahsoft
Carahsoft is a fantastic company in Reston, VA run by the hardest-working, most modest, and most ethical business leader I have ever met. His behind-the-scenes style means he would probably not want me to mention much more about him, but if I have piqued your curiosity about them you can read more here (read the one about their winning Smart CEO magazine’s Future 50 in January 2009, or the Fairfax County Economic Development Authority award for 2009, or award after award after award).
One thing I like about Carahsoft is their desire to help government customers think through hard problems, and their desire to help their extended teammates and partners learn about customers’ hard problems so enterprise solutions can be developed. One of the many ways Carahsoft does that is by hosting venues like the Intelligence Community Executive Forum (ICEF). This periodic venue brings together executives and thought leaders from government and industry to listen to lessons learned, hard problems and successes in creating CONOPs to address mission needs.
I’ll be helping Carahsoft with the next ICEF on 17 Feb 2009. This one will focus on collaborative enterprise solutions like those provided by Adobe. Panels will be held on topics like real-time collaboration, secure information sharing and Integration/web2.0.
Please check out the agenda and register if you can make it. More info is here: http://www.intelligencecommunityexecutiveforum.com/
Unrestricted Warfare Symposium, Sponsored by JHU’s APL and SAIS
For enterprise technologists and national security professionals, and most of all for those who fit both of those descriptions, please check out Johns Hopkins University’s 2009 Unrestricted Warfare Symposium at http://www.jhuapl.edu/urw_symposium. This symposium seeks to advance our understanding of and solutions for some very complex problems related to our nation’s defense. I’ll be speaking on a panel at the conference (on issues of cyber war and cyber defense) and hope to see you there.
The following is from an e-mail from Dr. Ron Luman (Johns Hopkins University Applied Physics Laboratory National Security Analysis Department Head)
National Security Community Colleagues:
This is a reminder that the Johns Hopkins University’s 2009 Unrestricted Warfare Symposium will be held 24-25 March 2009, and I encourage you to register now at http://www.jhuapl.edu/urw_symposium/. The fourth annual symposium is in Laurel, MD at JHU’s Applied Physics Laboratory (APL), and is jointly sponsored by APL and the Paul H. Nitze School of Advanced International Studies (SAIS). Last year more than 300 participants from government, industry, and academia interacted with distinguished speakers and expert panelists who addressed national security issues from three perspectives: strategy, analysis, and technology. In 2009, this uniquely synergistic approach will be applied to the challenge of identifying interagency imperatives and capabilities.
The symposium presentations and panels are organized around four potential unrestricted lines of attack – cyber, resource, economic/financial, and terrorism. We’ll begin each session with a discussion of the potential for such attacks and then expert roundtable panelists will discuss imperatives for interagency action, offering ideas for enhancing interagency capabilities. A fifth session will focus on the role of analysis in identifying and assessing interagency approaches for preventing and combating these types of attacks.
I am particularly pleased that The Honorable James R. Locher, III, Executive Director of the Project for National Security Reform, will open the symposium as our keynote speaker, providing the Project’s timely findings and recommendations for interagency reform. Throughout the two days, featured speakers and distinguished panelists include: Dr. George Akst, MCCDC; Mr. Eric Coulter, OSD(PA&E); Dr. Richard Cooper, Harvard University; Dr. Stephen Flynn, Council on Foreign Relations; Representative Jane Harman; Professor Bruce Hoffman, Georgetown University; Professor Michael Klare, Hampshire College; Dr. Michael Levi, Council on Foreign Relations; Dr. Matthew Levitt, Washington Institute; Dr. Pete Nanos, DTRA; Mr. James Rickards, Omnis, Inc.; Mr. Frank Ruggiero, Department of State; Dr. Khatuna Salukvadze, Georgian Ministry of Foreign Affairs; Mr. Dan Wolf, Cyber Pack Ventures Inc.; and Mr. Bob Work, CSBA, to name a few.
The attached announcement identifies confirmed speakers and other essential information. We encourage dynamic networking, and to facilitate audience participation, we will again be utilizing electronic groupware to collect comments, insights, and questions. The collection of papers and transcripts of discussions will again be published as Proceedings, in both hard copy and electronic form. The 2006 -2008 Proceedings, the current agenda/speakers, and 2009 registration details can be found at the symposium website: http://www.jhuapl.edu/urw_symposium/.
Your experience in national security and defense will contribute unique perspectives and challenging questions to our understanding of Unrestricted Warfare, and I look forward to seeing you next month.
Best regards,
Ron Luman, General Chair
I hope to see you all there.
Symposium Attachment:
URW2009Flyer 4Feb-1.pdf