GovCloud Network


Monday, August 24, 2015

“Cloud First” Lessons Learned from ViON



In 2011, then United States CIO Vivek Kundra released the US Federal Cloud Computing Strategy [1]. In the executive summary he pointed to cloud computing as a key component of the US Federal Government’s information technology modernization efforts:

“Cloud computing has the potential to play a major part in addressing these inefficiencies and improving government service delivery. The cloud computing model can significantly help agencies grappling with the need to provide highly reliable, innovative services quickly despite resource constraints.”
This “Cloud First” initiative was a complement to the 2010 Federal Data Center Consolidation Initiative (FDCCI), which was created to reverse the historic growth of Federal data centers.

Since then, ViON Corporation has teamed with its US Federal Agency customers to make the promises outlined by Mr. Kundra a reality. In accomplishing this work, they provide industry-leading data center professional services that also include networking, cloud design, and IT infrastructure architecture expertise. In the following interview, Richard Campbell and Keith Greene, two of their leading cloud solution architects, share their experience and lessons learned with the "Cloud Musings" audience.

Kevin:  Rich and Keith, thank you very much for taking the time to participate in this interview.  To get us all on the same page, could you please tell us how your US Federal customers view data center consolidation and cloud computing? Are these initiatives separate activities, or are they part of an overall strategy?


Rich:  Our customers really have differing views on the intended goals of these two initiatives.  Most agencies want to leverage congressionally appropriated funds to help them transition to a more economical IT consumption model that can be funded through their operational budgets.  Data center consolidation efforts, however, are typically a reaction to budget cuts and are based on the need to reduce agency capital expenditures.  Some agencies do link the two efforts, using the benefits of consolidation to modernize while simultaneously reaping the rewards that cloud solutions generally provide.

Keith:  I agree with that observation.  Apart from physical relocation of equipment, customers achieved data center consolidation through virtualization.  Over time, they discovered that the projected savings weren't being realized due to unanticipated increases in costs associated with licensing, support, security, and compliance.  This phenomenon is generally referred to as "VM sprawl".  While virtualization can be an important component of many deployments, cloud resource enablement, orchestration, and management capabilities typically reduce the overall costs of an organization's IT environment. Additionally, controls in the cloud stack continually monitor the provisioning and billing process, which inevitably leads to more efficient use of the environment.

Kevin:  With that said, what challenges do your customers run into when they decide to adopt cloud computing?

Rich:  The biggest challenges we see are application modernization and the enterprise getting a grasp on the dependencies among those applications.

Keith:  IT managers are faced with a major dilemma when they are asked to cut overall IT costs by going to the cloud.  While the cloud does provide a relatively easy way to consume infrastructure services in a "pay-as-you-go" model, many current applications are not "cloud ready".  Most agency applications were written and designed to operate in a client/server architecture.  Many of these deployments used hard-coded IP addresses, which introduced static application dependencies within a static infrastructure environment.  With cloud computing, the infrastructure is dynamic by design.  Applications must therefore be written so that they can deal with the fluidity of virtual cloud environments.  Rewriting legacy applications so that they can work in the cloud can cost more than the infrastructure savings.
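A minimal sketch of the cloud-ready pattern Keith describes, assuming a hypothetical database dependency (the hostnames, environment variable names, and defaults below are illustrative, not drawn from any agency system):

```python
import os

# The anti-pattern: a hard-coded address baked into application code.
#   DB_HOST = "10.1.4.22"   # breaks when the instance is re-provisioned
#
# A cloud-ready alternative resolves the dependency at runtime from
# configuration (an environment variable, DNS name, or service registry),
# so the application tolerates instances moving or being replaced.

def get_db_endpoint(default_host="db.internal.example.com", default_port=5432):
    """Resolve the database endpoint from the environment, falling back
    to a stable DNS name rather than a fixed IP address."""
    host = os.environ.get("DB_HOST", default_host)
    port = int(os.environ.get("DB_PORT", str(default_port)))
    return host, port

host, port = get_db_endpoint()
```

Because the endpoint is looked up at runtime, re-pointing the application at a new instance is a configuration change rather than a code rewrite, which is exactly the fluidity dynamic cloud environments demand.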


Kevin:  So how can ViON help agencies address this issue?

Keith:  Through the experience we've gained, ViON can provide advice and assistance on implementing best practices for cloud transition and data migration.  We can also help an agency understand its application dependencies, which is one of the most critical inputs when selecting appropriate data migration tools.

Kevin:  Do you see any differences between civilian and DoD agencies when you help them in this transition?

Rich:  For the most part they are the same, but FedRAMP, the Federal Risk and Authorization Management Program, has really helped by enabling a government-wide cloud computing security accreditation process.  Although the DoD requires some additional security and protection methods, the FedRAMP baseline can significantly reduce both time and cost.

Kevin:  What would be your advice to an agency that’s developing their cloud transition strategy?

Rich:  Government agencies really need to better align their IT resources with their individual mission requirements and goals.  Effective cloud computing solutions aren't designed and built around a specific technology, but rather around IT services that support an organization's global efforts. Agencies also need to take a holistic approach to their application modernization efforts.  The ability to leverage open, agile solutions will deliver benefits at every level.

Keith:  Agencies must also focus on the change management challenges they are sure to face.  Existing rules and policies need to be modified so that they don't act as an impediment to the mission agility that cloud can provide.  The cloud administrator's ability to move resources around dynamically is sometimes viewed by change advisory boards as operating out of control, which in turn is counterproductive and limiting.

Kevin:  Rich, Keith, thank you both for enlightening us with your experience and insights.

Rich and Keith:  Our pleasure.




[1] https://www.whitehouse.gov/sites/default/files/omb/assets/egov_docs/federal-cloud-computing-strategy.pdf



( This content is being syndicated through multiple channels. The opinions expressed are solely those of the author and do not represent the views of GovCloud Network, GovCloud Network Partners or any other corporation or organization.)
 



Cloud Musings
( Thank you. If you enjoyed this article, get free updates by email or RSS - © Copyright Kevin L. Jackson 2015)



Tuesday, August 18, 2015

Looking for Security Peak Performance?

You can find it at Dell Peak Performance 2015!!!


I'll be there at the Aria Resort and Casino in Las Vegas attending as a social media correspondent with a full access pass and an invitation to meet and interview Dell executives and security technology leaders.


The event is packaged across three focused tracks:

Security Foundation: This track is designed for professionals, including business owners, partner executives and sales professionals, who are new to the Dell SonicWALL solution or who are striving to establish a security practice. Enhance your knowledge and maximize your investment in the Dell SonicWALL security portfolio with our security apprentice breakout sessions. All courses will include strategy, roadmap and solution selling, including competitive positioning in the marketplace at an introductory level. This track has three pre-assigned sessions and seven optional sessions.

Security Professional: This track is designed for security professionals, including business owners, partner executives and sales professionals. Enhance your knowledge and maximize your investment in the Dell SonicWALL security portfolio with these security professional breakout sessions. All Security Professional courses will include strategy, roadmap and solution selling – including competitive positioning in the marketplace. This track has ten pre-assigned sessions.

Security Ninja: This track is designed for Dell Security Certified System Engineers who are responsible for designing, implementing, supporting or managing a security infrastructure with SonicWALL products and services. Dell SonicWALL CSSA/CSSP certifications are a suggested requirement to enter this curriculum. All Security Ninja sessions include technical deep dives, upcoming features and hands-on learning. This track has five optional two-part sessions.


Make a comment or shoot me a message if you'll be there as well. I will also be doing some video interviews and blog posts. I'm looking forward to sharing your stories with my audience.
But remember.......









Friday, August 7, 2015

The Cybersecurity Sprint: Are we safe yet?



UPDATE: NBC News reports U.S. officials have disclosed a hack of the Pentagon’s Joint Staff unclassified email system, which took place on July 25.

Recent unauthorized access to a U.S. government database led to the compromise of information on at least 21.5 million individuals. This massive background investigation data breach also compromised usernames, passwords, mental health records and financial information. Although a security update applied by the Office of Personnel Management (OPM) and the Department of Homeland Security (DHS) in January ended the bulk of the data extraction, U.S. government-wide remediation efforts were extended by launching a 30-day Cybersecurity Sprint.

This action was done to assess and improve the health of all federal assets and networks. Agencies were instructed to immediately patch critical vulnerabilities, review and tightly limit the number of privileged users with access to authorized systems and dramatically accelerate the use of strong authentication, especially for privileged users.

Last month, federal CIO Tony Scott reported significant progress with the sprint, citing results that included:

  • An increase in Federal Civilian agencies' use of strong authentication for privileged and unprivileged users from 42 percent to 72 percent
  • An increase in the use of strong authentication for privileged users from 33 percent to nearly 75 percent
  • The implementation of strong authentication for 95 percent of privileged users in thirteen agencies, more than half of the largest agencies, including the Departments of Transportation, Veterans Affairs, and the Interior



Although I applaud these recent efforts, the Federal government has a very long way to go before anything like "Mission Accomplished" can be claimed.

One major and costly challenge will be in the area of software development. A recent Veracode analysis, State of Software Security, which rated application security using compliance with the Open Web Application Security Project (OWASP) Top 10 vulnerability recommendations as a yardstick, found a "low pass rate" in government applications. This dismal finding was theorized to be the result of:

  • Higher use of scripting languages
  • More prevalent use of older languages which are known to produce more vulnerabilities
  • Low rate of software remediation (e.g. fixing flaws)
The four most important vulnerability categories listed by the study are SQL injection, cross-site scripting (XSS), cryptography issues and command injection. This ranking reflects the pervasiveness and severity of these vulnerabilities, specifically:

  • SQL injection was the application vulnerability most often exploited in web application attacks in a recent 2015 data breach incident
  • Cross-site scripting is overall far more prevalent than any other category
  • OS command injection played a role in 2014’s Shellshock vulnerability, in which a commonly used open source component was exploited in a way that allowed taking over a server to run arbitrary code
Government-developed software has performed poorly, showing the highest prevalence of both SQL injection and cross-site scripting when compared with other industries.
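To make the SQL injection category concrete, here is a minimal, self-contained Python/sqlite3 sketch (the table and data are hypothetical) contrasting the vulnerable string-built query with its parameterized fix:

```python
import sqlite3

# Hypothetical user table for demonstration purposes.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

def find_user_unsafe(name):
    # VULNERABLE: attacker-controlled input is spliced into the SQL text,
    # so name = "' OR '1'='1" rewrites the WHERE clause and returns every row.
    return conn.execute(
        "SELECT * FROM users WHERE name = '%s'" % name).fetchall()

def find_user_safe(name):
    # Parameterized query: the driver passes the value separately from the
    # SQL text, so the injection string is treated as a literal and matches
    # no rows.
    return conn.execute(
        "SELECT * FROM users WHERE name = ?", (name,)).fetchall()

payload = "' OR '1'='1"
```

The fix costs nothing at runtime, which is why the study's "low rate of software remediation" finding is so frustrating: the defense is well known and cheap.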

To improve on the current sad state of government cybersecurity, the Federal Information Security Management Reform Act (FISMA Reform) has been proposed as a new bill. This legislation offers five major initiatives designed to improve the overall security posture of federal networks:

  • Grant DHS authority to operate intrusion detection and prevention software across the .gov domain
  • Authorize DHS to conduct regular risk assessments on federal networks
  • Require DHS to enact defensive countermeasures in the event an intrusion is detected
  • Strengthen and streamline authority Congress gave to DHS last year to issue binding operational directives to federal agencies, especially to respond to substantial cybersecurity threats in emergency situations
  • Mandate annual OMB reports on enforcement of government-wide cybersecurity standards
As citizens we all place a significant amount of trust in our government’s ability to defend and protect society and our way of life. In the modern world this trust extends into our cyber life as well. Although individuals still need to take more responsibility for how they manage and protect their own information, our government needs to look at how they are managing our data and protecting information. National security needs to be a priority to all. 


FY 2015 Q2 (4/15) vs. Cybersecurity Sprint Results (7/29)
(http://www.performance.gov/node/3401/view?view=public#progress-update)

This post was written as part of the Dell Insight Partners program, which provides news and analysis about the evolving world of tech. For more on these topics, visit Dell's thought leadership site Power More. (Dell sponsored this article, but the opinions are my own and don't necessarily represent Dell's positions or strategies.)






Thursday, July 30, 2015

Cloud Computing + Things = "Information Excellence", Not IoT


The Internet of Things (IoT) has quickly become the next "be-all and end-all" in information technology. Touted as the way cloud computing will connect everyday things together, it is also feared as the real-life instantiation of The Terminator's Skynet, where sentient robots team with an omnipresent and all-knowing entity that uses technology to control, and ultimately destroy, all of humanity.

Not there yet

Lucky for the humans among us, the technical capabilities of both cloud computing and IoT are way behind these Orwellian fears. Although the technology is promising, research and technical hurdles still abound. Challenges include:
  • Datasets that span multiple continents and are independently managed by hundreds of suppliers and distributors;
  • Volume and velocity of IoT dataflows exceed the capacity and capability of any single centralized datacenter;
  • Current inability to conduct “Big IoT” data processing across multiple distributed datacenters due to technical issues related to basic service stack for datacenter computing infrastructure, massive data processing models, trusted data management services, data-intensive workflow computing; and
  • Benchmark limitations associated with heterogeneous datacenter application kernels.
Despite these current challenges, the blending of things and cloud computing can deliver real value today through the creation of "Information Excellence". Joe Weinman, author of "Cloudonomics: The Business Value of Cloud Computing", explains this eloquently in his new book, "Digital Disciplines: Attaining Market Leadership via the Cloud, Big Data, Social, Mobile, and the Internet of Things": information excellence extends traditional operational excellence, with its static process design, toward a business model that leverages real-time data to maximize process throughput and minimize process costs.[1]

Using cloud to optimize productivity

Also known as dynamic optimization, “Solving these types of problems requires big data collected in real time from things and people, processed in near real time through an optimal combination of edge

Tuesday, July 14, 2015

Cloud Computing Price-Performance Could Vary By 1000%!




Yes, you read that right. The price-performance of your cloud computing infrastructure could vary by as much as 1000 percent depending on time and location. High levels of variability have actually been seen within the same cloud service provider (CSP) processing the exact same job. This also means the cost to you of processing the exact same job in the cloud could vary just as much.

This surprising result was discovered by a Rice University group, headed by Dr. T. S. Eugene Ng, that has been focusing on cloud computing. Recently the group published its joint work with Purdue University, "Application-Specific Configuration Selection in the Cloud: Impact of Provider Policy and Potential of Systematic Testing", in the IEEE INFOCOM 2015 Conference Proceedings. That paper took a first step toward understanding the impact of cloud service provider policy and tackling the complexity of selecting configurations that can best meet the price and performance requirements of applications. The work led to a collaboration between Rice University and Burstorm, a developer of computer-aided design (CAD) software built specifically to support cloud computing architects.

The Burstorm platform contains a product catalog of over 36,000 products across 900 CSP product sets. Working with Dr. Ng's group, the study looked at seven suppliers across three continents (Asia, North America and Europe), with a total of 266 computer products spread over three locations per vendor, where available. Raw data was collected every day for 15 days. The results were then normalized to reflect a 720-hour monthly pricing model. The final output was a set of price-performance graphs used to examine performance and price variance both between CSPs and across geographic regions.
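As a rough illustration of the study's 720-hour normalization and price-performance metric (all numbers below are hypothetical, not drawn from the report), the calculation might look like:

```python
# Hypothetical sketch: convert an hourly instance rate to a 720-hour
# monthly price, then express benchmark performance per monthly dollar
# so instances can be compared on price-performance.

HOURS_PER_MONTH = 720  # the 720-hour monthly pricing model used by the study

def monthly_price(hourly_rate):
    """Normalize an hourly instance rate to a monthly price."""
    return hourly_rate * HOURS_PER_MONTH

def price_performance(benchmark_score, hourly_rate):
    """Benchmark units delivered per monthly dollar (higher is better)."""
    return benchmark_score / monthly_price(hourly_rate)

# Two hypothetical instances of the same nominal type in different regions:
a = price_performance(benchmark_score=1000, hourly_rate=0.10)
b = price_performance(benchmark_score=450, hourly_rate=0.12)

# Percent difference in price-performance between the two instances:
variation_pct = (a / b - 1) * 100
```

Even this toy example shows how a modest spread in benchmark scores and hourly rates compounds into a large price-performance gap, which is why the study's real measurements varied so widely.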
Analysis of the final output showed a 622 percent variation in performance within the same instance type and a price-performance variance of 1000 percent. Performance of the exact same virtual machine instance can also vary by as much as 60 percent over time. The best performing instance also did not show the best price-performance. Availability and behavior of instances were also very dependent on location, even when the instances were provisioned by the same CSP. Dave Hansen, Vice President and General Manager of sales, marketing and services for Dell Software, sums up the importance of these results:

Dave Hansen, VP and General Manager, Dell
“…[This] report is incredibly valuable. I’ve looked at this problem many times over the years and it is very difficult to make buying decisions on cloud services without this context.”
These results also show that today's enterprise desperately needs active metering and monitoring when procuring cloud-based services. Changes in instance types, pricing, performance over time and availability of services by location highlight the inadequacy of traditional benchmarking philosophies and processes. Another hidden gem in the report is the use of a "performance quota" by some service providers: when a customer reaches this CSP-managed quota, the performance of the relevant instance is reduced, so the same job takes longer and drives up your usage bill. These findings also drive home the need for enterprises to ramp up their due diligence when selecting CSPs. They should