Thursday, January 29, 2009

Cloud & the Government Session at Cloud Computing Expo

Earlier this week I announced that I will be presenting at SYS-CON's 2nd International Cloud Computing Conference & Expo in New York City, March 30 - April 1, 2009. During the Cloud & the Government Session at Cloud Computing Expo, I will discuss the results of a new survey on cloud computing in the Federal government.

Federal government organizations have typically been viewed as cautious adopters of new technologies. As expressed by now-President Obama's transition team, however, cloud computing has captured the attention of many government IT decision makers. The U.S. Defense Department's Defense Information Systems Agency is, in fact, already fielding a cloud computing infrastructure. In October 2008, customer requirements and concerns were gathered through online surveys and in-depth interviews in order to shed light on current cloud computing offerings targeted at government customers. In February 2009, the survey will be repeated in order to gauge how trends in this market segment have changed.

I look forward to your participation in the new survey and welcome your views and comments on the Obama administration's position on the importance of cloud computing to government transparency.

Tuesday, January 27, 2009

CSC and Terremark target US Government with Cloud Computing

Today's announcement by CSC reinforced the strong wave of cloud computing rolling toward the Federal space. CSC is ranked by Washington Technology magazine as the 9th largest government contractor (by contract dollar value), so its entry practically guarantees a bloody battle over the Federal cloud market.

"Together, CSC and Terremark will offer cloud services that include computing, storage and disaster recovery/continuity of operations that are delivered from a highly secure and reliable environment, located at Terremark's datacenter in Culpeper, Va.

'Terremark and CSC look forward to delivering a portfolio of cloud computing resources that will allow our government customers to implement IT services for mission-critical applications in minutes, not days," said Jamie Dos Santos, president and chief executive officer of Terremark Federal Group. "Instead of buying costly, cumbersome hardware, this offering provides the customer access to a resource pool of processing, storage and networking that can be provisioned on demand, all from Terremark's state-of-the-art facility in Culpeper, ensuring the security and reliability of the customer applications and data.'"

While this offering seems similar to the earlier Apptis/ServerVault FedCloud initiative, I expect that this move will force all the other large Federal systems integrators (FSIs) to quickly launch "cloud computing products" of their own.

While many will see these types of offerings as thinly veiled datacenter outsourcing products, I do appreciate these cautious first steps for what they are and really look forward to the appearance of more aggressive cloud computing offerings in the near future.

Monday, January 26, 2009

Should my agency consider using cloud computing?

This is clearly the question on the minds and lips of every government IT decision maker in town. Why should a government agency even consider cloud computing? In reality, the decision process is no different from any other IT management decision: "Cloud IT" options should be compared with "Traditional IT" approaches. As Frank Gens of IDC alluded to when he framed the cloud opportunity for IT suppliers, agencies have four options when deciding if and how to improve their IT infrastructure.

  • "Traditional IT" products and services to enhance traditional agency services;
  • "Cloud IT" products and services to enhance traditional agency services;
  • "Traditional IT" products and services to create agency-specific cloud services; and
  • "Cloud IT" products and services to create agency-specific cloud services.

In a ZDNet blog post comparing traditional and cloud IT, Dion Hinchcliffe summarized the differences as shown in the table below.

 

Factor              Traditional IT            Cloud IT
Design Approach     Proprietary, customized   Standardized
Economy of Scale    Organizational            Ecosystem
Control             Full                      Partial
Security            Most secure               Secure
Capacity            Limited                   Nearly unlimited
Infrastructure      Dedicated                 Shared

 

Any decision process must take into account the agency's mission goals, the specific agency function being addressed, and the current IT infrastructure. From a mission point of view, the comparison factors listed above can be addressed with six straightforward yes/no questions:

  • Can this function within the agency's mission be accomplished using standard IT components, or are proprietary or customized components necessary?
    - Yes, "Cloud IT" is an option.
    - No, a "Traditional IT" approach may be required.
  • In anticipating future functional requirements, do the investment required and the value obtained from agency-developed technical improvements outweigh the savings and time gained by leveraging technical improvements from an industry ecosystem?
    - Yes, "Traditional IT" may be required.
    - No, "Cloud IT" is an option.
  • In view of agency mission objectives, is full control of all IT resources required to complete this function?
    - Yes, "Traditional IT" may be required.
    - No, "Cloud IT" is an option.
  • Is the level of security afforded by generally accepted commercial practices acceptable for accomplishing this specific agency function?
    - Yes, "Cloud IT" is an option.
    - No, "Traditional IT" may be required.
  • In executing this function during surge or peak situations, would capacity limitations severely affect agency mission accomplishment?
    - Yes, "Cloud IT" should be considered as an option.
    - No, "Traditional IT" may be desirable.
  • In view of agency mission objectives, is a dedicated IT infrastructure required to complete this function?
    - Yes, "Traditional IT" may be required.
    - No, "Cloud IT" is an option.

By weighing each factor and answering these questions, could this framework help frame the discussion of "Traditional IT" versus "Cloud IT"?
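For readers who want to experiment with this checklist, here is a minimal sketch of the six questions as a small Python script. The question wording, the cloud-leaning direction assigned to each answer, and the simple vote tally are my own illustrative framing of the factors above, not an official methodology.

# Minimal, illustrative sketch of the six-question screen described above.
# The simple tally is not an official scoring methodology.

QUESTIONS = [
    # (question, the answer that points toward "Cloud IT")
    ("Can the function be met with standard IT components?", True),
    ("Does in-house development outweigh leveraging an industry ecosystem?", False),
    ("Is full control of all IT resources required?", False),
    ("Is generally accepted commercial security sufficient?", True),
    ("Would capacity limits during surge or peak hurt the mission?", True),
    ("Is a dedicated IT infrastructure required?", False),
]

def screen(answers):
    """answers: six booleans (True = yes), in the order of QUESTIONS."""
    cloud_votes = sum(
        1 for (_, cloud_if_yes), answer in zip(QUESTIONS, answers)
        if answer == cloud_if_yes
    )
    if cloud_votes >= 4:
        return "Cloud IT looks like a viable option"
    if cloud_votes <= 2:
        return "Traditional IT may be required"
    return "Mixed result - weigh each factor against mission goals"

if __name__ == "__main__":
    # Example: standard components, ecosystem leverage, no full-control need,
    # commercial security acceptable, surge-sensitive, no dedicated infrastructure.
    print(screen([True, False, False, True, True, False]))

In practice, of course, each factor should be weighed against mission goals and the existing infrastructure rather than simply counted.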

What do you think?

Thursday, January 22, 2009

Cloud Computing Wargames !!

Wikipedia

 “A wargame is a game that represents a military operation.”
 “Military simulations, also known informally as war games, are simulations in which theories of warfare can be tested and refined without the need for actual hostilities.”

---------------------------------------------------------------------------------------------

For ages, simulations and wargames have been used by the military to prepare for eventual future operations. During the Cold War, countless battles between the red and blue forces were set up, run and reset in preparation for the conflict that thankfully never came. Some contend that these wargames were, in fact, instrumental in preventing a global nuclear holocaust.

As an outgrowth of this apparent success, business war games also came into vogue as a tool to help managers develop and execute business strategies more successfully. In 2005 there was actually an all-day “Battle for Clicks” war game between students from the MIT Sloan School of Management and Harvard Business School. This game, run by Fuld & Co., a Cambridge-based strategic intelligence consulting firm, was the first such competition involving students from these two world-class business schools.

In a unique take on this concept, Booz Allen Hamilton took business wargaming one step further in 2006. As reported by Government Computer News, the CIO Wargame, a BAH creation, combines the basics of craps and Monopoly to simulate how CIOs, chief architects and other program managers make decisions. The game's stated goal is to bring projects into the operation and maintenance phase and earn as many mission value points as possible, while taking steps to reduce the risk of failures and setbacks. The team with the most points after five rounds wins. As in Monopoly, players must make strategic investment decisions about which projects and IT capabilities to bet on; as in craps, the roll of the dice often determines how well a project pays off.
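To make those mechanics concrete, here is a hedged sketch of the dice-and-investment loop in Python. The project names, costs, point values, per-round funding and the "seven or better pays off" threshold are assumptions invented for illustration; they are not the actual rules of the BAH game.

import random

# Toy sketch of the CIO Wargame mechanic described above: Monopoly-style
# investment choices, craps-style dice rolls, mission value points over
# five rounds. All numbers below are illustrative assumptions.

PROJECTS = {                     # name: (cost, mission value if it reaches O&M)
    "Email consolidation": (2, 5),
    "Data center refresh": (4, 9),
    "Cloud pilot": (3, 7),
}

def play(rounds=5, budget=10, seed=None):
    rng = random.Random(seed)
    points = 0
    for _ in range(rounds):
        for _project, (cost, value) in PROJECTS.items():
            if budget < cost:
                continue                                  # cannot afford this bet
            budget -= cost                                # strategic investment decision
            roll = rng.randint(1, 6) + rng.randint(1, 6)  # the dice decide the payoff
            if roll >= 7:                                 # project reaches O&M and pays off
                points += value
            # otherwise the investment is a setback: money spent, no value earned
        budget += 3                                       # assumed per-round funding top-up
    return points

if __name__ == "__main__":
    print("Mission value points after five rounds:", play(seed=42))

Even this toy version captures the tension the game is built around: spend scarce budget on safe, low-value projects, or gamble on high-value ones and accept the risk of setbacks.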

I’m happy to report that the CIO Wargame is now being updated! The new Cloud Computing Wargame (CCW) represents a major evolution of the original concept and will be unveiled at FOSE 2009. I am looking forward to working with the BAH Cloud Computing Team on this exciting project.

More than a game, the CCW applies simulation techniques to model “Traditional IT” and “Cloud Computing” environments and dynamically maps them against internal, community, and cloud-based resources. The simulation represents the real-life situations facing IT management daily, especially in an environment of rapid technological and mission change against a backdrop of resource variability. The CCW is designed for both mission “owners” and senior IT staff who are engaged in the strategic planning and use of information technologies to meet organizational mission and basic business requirements. The game puts the “players” in real-life situations that you can win … and you can lose.

By actively applying modeling and simulation to IT decision making, the Cloud Computing Wargame helps players and organizations understand:
  • The inter-relationships between cloud computing technology and mission requirements.
  • How a long-term cloud computing strategy can develop, evolve, and change.
  • The interactions among the activities that occur within an IT organization to implement and support IT capabilities across different capability matrices.
  • How different strategies maximize mission impact and value.
  • How different roles work with business and IT partners throughout the organization and value chain.

Your comments and suggestions are welcome. I look forward to seeing you at FOSE for this exciting unveiling.

Tuesday, January 20, 2009

President Barack Obama. A New Day for Cloud Computing !!

Yesterday, President Barack Obama's transition team released a new video touting the benefits of cloud computing and government transparency.

"Cloud computing, which allows consumers and institutions to access their files and projects anywhere via the Internet "is an important change for the federal government because it is dramatically cheaper than the old fashioned way of doing computing infrastructure," said team member Andrew McLaughlin, head of global public policy and government affairs for Google, a longtime supporter of cloud services.

As stated by Andrew McLaughlin of the transition's Technology, Innovation and Government Reform team, a shift to cloud computing represents one of the most important transformations that the Federal government will go through over the next 10 years. "The First Cybergenic President of America" is sure to make cloud computing, along with the Blackberry, a major tool for realizing his goal of remaking America. Paul McDougall of InformationWeek stated it rather bluntly today in his blog post, "IT Could Make Or Break Obama Presidency."

Wonder if there will be any related executive orders today :-)

Friday, January 16, 2009

How the Government Tweets

Last September, in "Ambient Awareness. The cloud killer app?" and "The Cloud Wins in Minneapolis at the RNC!", I wrote about how the cloud infrastructure and microblogging could be used tactically to great effect. Although it's clearly still in an early phase, some government entities are now experimenting with Twitter. In the article "Governments use Twitter for Emergency Alerts, Traffic Notices and More," pilot uses include:
  • Los Angeles Fire Department (LAFD) updates its Twitter page with bulletins about structural fires, the number of responding firefighters, and injuries and casualties;
  • Portland (Ore.) Police Department tweets about crime reports and sometimes asks the public for leads in cold cases;
  • The Washington State Department of Transportation (WSDOT) updates its feed with traffic alerts and route changes for ferry vessels; and
  • The U.S. Geological Survey (USGS) tweets trivia questions and hypotheticals.

Even the Environmental Protection Agency has gotten into the act with "Greenversations".
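For the technically curious, here is a minimal sketch of how one of these agency feeds could push an alert programmatically. It assumes the simple basic-auth status-update endpoint Twitter exposed in this era plus the third-party Python "requests" library; the account name, password and alert text are placeholders, and newer versions of the Twitter API require OAuth instead.

import requests  # third-party library, assumed to be installed

# Illustrative sketch only: the 2009-era basic-auth endpoint shown here has
# since been retired in favor of OAuth-based APIs.
TWITTER_UPDATE_URL = "http://twitter.com/statuses/update.json"

def post_alert(username, password, text):
    """Post a short alert (e.g., a fire or traffic bulletin) to an agency feed."""
    response = requests.post(
        TWITTER_UPDATE_URL,
        data={"status": text[:140]},      # respect the 140-character limit
        auth=(username, password),        # placeholder basic-auth credentials
    )
    response.raise_for_status()
    return response.json()

if __name__ == "__main__":
    post_alert("agency_feed", "secret",
               "Structure fire reported at Main & 5th; three companies responding.")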

I guess this is part of the change we voted for :-)

Thursday, January 15, 2009

Bob Gourley on Cloud Computing and NetCentric Operations

Bob Gourley, Crucial Point CTO and former DIA CTO, just posted Cloud Computing and Net Centric Operations on his website, CTOvision. In it he outlines how the OSD and ASD NII strategy to enable net-centric operations has benefited from the advent of cloud computing. Of particular note is his list of key principles for OSD implementation of cloud computing:
  • The importance of mission-focused engineering. This key point is already embodied in the ASD NII Strategic Plan, but is worth restating to keep it at the forefront of all IT discussions in the department.
  • The continual need for security, including data confidentiality, integrity and availability. All DoD computing approaches must be engineered to be in total consonance with DoD guidelines to assure DoD information, information systems and information infrastructure. Cloud Computing, when engineered right, makes dramatic, positive changes to the mission assurance posture of DoD. Cloud computing enables stronger end point security and better data protection. It also enables the use of thin clients and the many security benefits they provide.
  • The need for always instantaneously available backup of data in the cloud. Ensured availability under all circumstances is a key benefit of smart cloud computing approaches.
  • The continual need for open source and open standards. Most cloud infrastructure today is based on open source (Linux, Solaris, MySQL, Glassfish, Hadoop) and this positive trend will help in net centric approaches. According to the IDC Group, open source software (OSS) is "the most significant, all-encompassing and long-term trend that the software industry has seen since the early 1980s." Gartner projects that by 2012, 90 percent of the world's companies will be using open source software. This all indicates open source and open standards should be a key principle for DoD cloud computing and other net centric approaches.
  • The continual need to evaluate both low barrier to entry and low barrier to exit. As approaches to cloud computing are evaluated, too frequently the cost of exiting an approach is not considered, resulting in lock-in to a capability that may soon be inefficient. Cloud computing capabilities should be adopted that do not result in lock-in.
  • The need for open standards. Cloud computing contributions to net centric operations increase interoperability, as the code, APIs and interfaces for cloud computing are secure but are widely published for all participants to interface with. OSD involvement in open source and open standards communities should continue and be accelerated, since increasingly cloud computing open standards are being discussed and designed by open standards bodies like W3C, OASIS, IETF and the Liberty Alliance. Document and other formats used by OSD cloud computing activities will be open and available for all authorized users on all devices.
  • The need to understand the cost of "private clouds". For at least the near term, DoD will remain a provider of "private cloud" capabilities where security dictates ownership levels of control over compute power. This fact means DoD must continually engineer for change and technology insertion, which underscores the need for low barriers to exit in design criteria.
I hope that our decision makers heed his wise counsel.
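As a small illustration of the data confidentiality and integrity principle in that list, here is a hedged sketch of protecting data on the client side before it ever reaches a cloud provider. The choice of the Python "cryptography" package and SHA-256 is mine, made for the example only; DoD systems would naturally use their approved algorithms and key-management processes.

import hashlib
from cryptography.fernet import Fernet  # third-party "cryptography" package, assumed installed

def protect(plaintext: bytes, key: bytes):
    """Encrypt for confidentiality and hash for an integrity check value."""
    digest = hashlib.sha256(plaintext).hexdigest()
    ciphertext = Fernet(key).encrypt(plaintext)
    return ciphertext, digest

def verify(ciphertext: bytes, key: bytes, expected_digest: str) -> bytes:
    """Decrypt and confirm the data was not altered while stored in the cloud."""
    plaintext = Fernet(key).decrypt(ciphertext)
    if hashlib.sha256(plaintext).hexdigest() != expected_digest:
        raise ValueError("integrity check failed")
    return plaintext

if __name__ == "__main__":
    key = Fernet.generate_key()           # in practice, keys come from managed key stores
    blob, digest = protect(b"mission data", key)
    assert verify(blob, key, digest) == b"mission data"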

Wednesday, January 14, 2009

Obama Administration CTO Top Suggestions

Check out the top vote-getters for suggestions to the nation's first CTO!


#5 with 5,835 votes

Open Government Data (APIs, XML, RSS) We can unleash a wave of civic innovation if we open up government data to programmers. The government has a treasure trove of information: legislation, budgets, voter files, campaign finance data, census data, etc. Let's STANDARDIZE, STRUCTURE, and OPEN up this data.

#4 with 6,460 votes

Complete the job on metrication that Ronald Reagan defunded The government has failed to take the lead on completing the task of moving the country completely to the SI metric system. George H.W. Bush tried to do something about it, but gave the bureaucrats an easy out. Failure to follow the same measurement standards as the rest of the world is costing US industry something like $1 trillion per year.

#3 with 8,454 votes

Repeal the Digital Millennium Copyright Act (DMCA) It is evident that the framers of the infamous Digital Millennium Copyright Act intended it to have a transformative effect on the public and legal perception of intellectual property. In the attempt to develop a fully-realized definition of what constitutes infringement, fair use, and the rights of users in the consumption of digital works, the DMCA has had quite the opposite effect by institutionalizing considerable legal ambiguity. The recent spat between the John McCain presidential campaign and YouTube has demonstrated that both practitioners of law and a leading supporter of the DMCA are no closer to understanding the controversial law than the general populace. With so much confusion and abuse surrounding the DMCA, isn't it time we start over and take a fresh approach to intellectual property that doesn't irreconcilably tip the scales in favor of big media?

#2 with 10,105 votes

Ensure our privacy and repeal the Patriot Act. The Patriot Act had many subordinate clauses that strip away our privacy as American citizens. These were shoehorned in as an effort to protect us, while in fact they strip us of certain rights to privacy as citizens. Let's protect our nation while ensuring confidence and privacy for our citizens.

And #1 with 12,957 votes

Ensure the Internet is widely accessible & network neutral
The Internet is one of the most valuable technical resources in America. In order to continue the amazing growth and utility of the Internet, the CTO's policies should:
Improve accessibility in remote and depressed areas.
Maintain a carrier and content neutral network.
Foster a competitive and entrepreneurial business environment.


The only suggestion mentioning "cloud" received just 9 votes:

Establish the "gov cloud" for commodity computing
Why keep separate email servers across Federal government? Why keep data servers in downtown DC, some located in a flood plain? Why not centralize management of, and provide ubiquitous access to: email, portals, collaboration sites, expertise profiles, etc.?
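Circling back to the open-data suggestion at #5: once government data really is published in standard formats, reusing it takes only a few lines of code. Here is a minimal sketch of reading a structured feed; the feed URL below is a placeholder, not a real endpoint.

import urllib.request
import xml.etree.ElementTree as ET

FEED_URL = "https://example.gov/data/legislation.rss"  # placeholder, not a real endpoint

def latest_items(url=FEED_URL, limit=5):
    """Return (title, link) pairs for the newest items in a standard RSS 2.0 feed."""
    with urllib.request.urlopen(url) as response:
        root = ET.fromstring(response.read())
    items = root.findall("./channel/item")[:limit]
    return [(item.findtext("title"), item.findtext("link")) for item in items]

if __name__ == "__main__":
    for title, link in latest_items():
        print(title, "->", link)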

Tuesday, January 13, 2009

2009 Cloud Computing Events

2009 is off to a fast start with the following events on the horizon!

See you there !!

Monday, January 12, 2009

World Summit of Cloud Computing Virtual Site



The Israeli Association of Grid Technologies (IGT) has made the sessions from its recent IGT 2008 World Summit of Cloud Computing available online. Speakers include:



Day 1:

Day 2:

ENJOY !!

Friday, January 9, 2009

1105 Government Information Group does Cloud Computing


Mark your calendars for April 29, 2009! 1105 Government Information Group has announced that there will be a Cloud Computing Conference at the Ronald Reagan Building in Washington, DC.


"Cloud computing has been heralded as one of the top ten technologies > to watch in 2009. From DC government's adoption of Google Apps and Gmail, to DISA's Rapid Access Computing Environment (RACE), cloud computing has begun to permeate many federal, state, and local agencies."


For more information, visit http://www.govcloudconference.com/.

Thursday, January 8, 2009

Sun Acquires Q-Layer

Yesterday, Sun Microsystems announced its acquisition of Q-layer. This Belgium-based company automates the deployment and management of both public and private clouds. In the press release, David Douglas, Sun's SVP of Cloud Computing, said, “Q-layer's technology and expertise will enhance Sun’s offerings, simplifying cloud management and speeding application deployment.”

In last month's cloud computing update, Mr. Douglas outlined Sun's cloud plans so I'm expecting quite a bit from them in the next couple of months. The slides from this chalk talk are available in my slidespace.

Wednesday, January 7, 2009

SOA is Dead; Long Live Services

Blogger: Anne Thomas Manes
Obituary: SOA

"SOA met its demise on January 1, 2009, when it was wiped out by the catastrophic impact of the economic recession. SOA is survived by its offspring: mashups, BPM, SaaS, Cloud Computing, and all other architectural approaches that depend on 'services'."

This obviously sparked a few tweets, with comments from Singapore, Europe and the US. I was also pointed to Five fearless predictions for SOA 2009 (Thanks, Laura!), which made several excellent points, including:

"Vendors will de-emphasize SOA as a distinct “product” offering. There will be less hype about SOA, but that doesn’t mean it will have gone away. New solutions and applications will have service-oriented aspects. Cloud offerings will be built in accordance with SOA principles. There won’t be a lot of start-up vendors pitching SOA solutions, but plenty of start-ups will be offering Web 2.0-type and cloud-based services, which will be underpinned by SOA principles."

The reports of SOA's death have been greatly exaggerated! (My apologies to Mark Twain)

Tuesday, January 6, 2009

2009 - The Year of Cloud Computing!

Yes, everyone is making this bold statement. In his article, David Fredh laid out the reasons quite well:

"The technological hype has started already but the commercial breakthrough will come in 2009. Cloud computing is being driven by providers including Amazon, Google, HP, IBM, Intel and Microsoft. The potential of the hype maybe will be clearer with the fact that Amazon price for one GB in the cloud is only $0.150 at this moment. The next amazing web services are in the Cloud. Already now, we have services like Dropbox (file synchronization) and Mozy (providing unlimited backup for only 5$ a month). More of your data will move out on the internet. Many people already have all their mails in their gmail inbox instead of in their harddrive. The same will probably happen with documents, images and music. It’s convenient, but watch out... who owns your data?"

Al Tompkins actually listed a few of the bold statements:

"Fortune magazine said:

Software-as-a-service companies have long promoted themselves as more capital-efficient alternatives to installed software solutions. Instead of financing a big software purchase and installation, companies can "pay as they go" under the cloud services model.

"The capital crunch of 2009 will put a spotlight on the advantages of cloud computing: less risk, no capital expenditure, predictable operating expenses and fast results," predicted Salesforce's (CEO Marc) Benioff. "I believe that will translate into greater adoption for both cloud computing applications and platforms."

The Software Licensing Blog said:

Demian Entrekin, founder and Chief Technology Officer of Innotas, has written an Op Ed piece for SandHill entitled 10 Predictions for Software as a Service. In it he cites a Gartner study that predicts the $6.4 billion in SaaS sales for 2008 will grow to over $14.8 billion by 2012.

VMBlog said:

"This year cloud computing made the leap from an interesting proposition to a viable option for even the largest of enterprises. In 2009 it becomes mandatory," said Appirio co-founder, Narinder Singh. "Today's economic climate will force enterprises to pick technology winners and losers for their environment in order to cut costs, be more efficient and deliver business-relevant innovation. Cloud computing makes this seemingly impossible task a possibility -– much more so than traditional software. This is why we believe cloud computing will be counter cyclical, with SaaS and PaaS investment accelerating, and traditional software spending declining."

2009 will also be The Year of Security (again)!

Cloud computing will not soar in 2009 unless concerns around information security and privacy are succinctly answered with solutions that are easy to understand and verifiable in their operations. While I clearly feel that such technical solutions are already available, success in 2009 lies in educating those held responsible for operating and managing global information repositories and networks.

For the Federal government community, the onus is squarely on cloud computing vendors and solution providers to show the value of this technology and to prove that the available technical solutions meet and exceed Federal requirements and standards.

Monday, January 5, 2009

Salesforce.com and Google expand their alliance

In a Jan. 3 announcement, Salesforce.com revealed an expansion of its global strategic alliance with Google. By making Force.com available for Google App Engine™, the team has connected two of the industry’s leading cloud computing platforms. This could definitely help expand the number of applications available through cloud-based services. It also bears watching.

In my earlier post on cloud interoperability, I discussed three paths to accomplishing this worthy goal:
  • “Standards Body” Approach
  • “Adopt Proven Technology” Approach
  • “Customer Driven” Approach
This clearly fits in the second bin. While I welcome this expansion of cloud capability, the cloud titans should also keep the greater community in mind as they move forward. That may mean a more open Salesforce.com.

Marc Benioff, Chairman and CEO of Salesforce.com, is quoted as saying, “We have an open vision for cloud computing.” Sounds good, but as the late President Ronald Reagan would put it, "doveriai, no proveriai" (trust, but verify).