Wednesday, April 27, 2016

The Future of Storage

A few weeks ago I had the pleasure of doing a Blab on advanced storage with Daniel Newman and Eric Vanderburg.  We covered some pretty interesting points on enterprise storage challenges, advanced storage trends and flash.  If you didn't catch it, a replay is now available.
https://marketing.dell.com/storage-blab-storage-insights

( This post was written as part of the Dell Insight Partners program, which provides news and analysis about the evolving world of tech. To learn more about tech news and analysis visit Power More. Dell sponsored this article, but the opinions are my own and don’t necessarily represent Dell’s positions or strategies. )



Cloud Musings
( Thank you. If you enjoyed this article, get free updates by email or RSS - © Copyright Kevin L. Jackson 2016)



Sunday, April 3, 2016

DevOps and Hybrid Infrastructure Synergy


(This post first appeared in IBM's Point B and Beyond)

The definition of DevOps emphasizes collaboration and communication between software developers and other IT professionals while automating the software delivery and infrastructure change process. While agile software development and the use of automated infrastructure configuration tools stand proudly in the DevOps spotlight, little has been said about the actual infrastructure that modern tools such as Puppet and Chef automate.

DevOps in Hybrid IT Environments


Much has been written about Chaos Monkey, a tool that verifies services can tolerate the failure of individual components by randomly killing instances and services within Netflix’s Amazon Web Services (AWS) infrastructure. This process deliberately stresses AWS infrastructure operations as automation scripts reconfigure infrastructure components on the fly. Without taking anything away from the operational excellence this displays, how would an enterprise match this feat across a hybrid IT environment? How would you support the DevOps philosophy across a hybrid IT infrastructure?
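The idea behind Chaos Monkey can be shown in a few lines. The sketch below is purely illustrative (it is not Netflix's actual tool, which terminates real cloud instances): a toy "service" with several instances has one instance randomly killed, and we then confirm the survivors can still answer requests.

```python
import random

# Illustrative chaos-testing sketch (hypothetical names; not Netflix's
# actual Chaos Monkey, which terminates real AWS instances).

class Instance:
    def __init__(self, name):
        self.name = name
        self.alive = True

    def handle(self, request):
        return f"{self.name} handled {request}"

class Service:
    def __init__(self, names):
        self.instances = [Instance(n) for n in names]

    def chaos_round(self, rng):
        """Randomly kill one live instance, mimicking Chaos Monkey."""
        victim = rng.choice([i for i in self.instances if i.alive])
        victim.alive = False
        return victim.name

    def serve(self, request):
        """Route a request to any surviving instance."""
        survivors = [i for i in self.instances if i.alive]
        if not survivors:
            raise RuntimeError("total outage: no instances left")
        return random.choice(survivors).handle(request)

rng = random.Random(42)
svc = Service(["web-1", "web-2", "web-3"])
killed = svc.chaos_round(rng)          # one instance goes down...
response = svc.serve("GET /health")    # ...and the service still responds
```

The point of the exercise is exactly the question posed above: the test only passes because automation (here, trivially, the routing loop) works around the dead instance. Doing the same across on-premises and multiple clouds is a far harder routing and reconfiguration problem.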

The DevOps philosophy embodies the practice of operations and development engineers working together through the entire service life cycle, from design to development to production support. It’s linked closely with agile and lean approaches and abandons the siloed view of the development team being solely focused on building and the operations team being exclusively centered on running an application.

As enterprises adopt both private and public clouds, they typically do not throw away their in-house infrastructure. Although consolidation, outsourcing and IT efficiencies may reduce the number of corporately owned data centers, a hybrid operational environment will still remain. Extending the DevOps philosophy into such an environment requires active management of all an organization’s IT infrastructure, regardless of its source. This active IT management is different from the budget-and-forget management seen in the past and requires the following:
  • Active monitoring and metering of all IT services;
  • Continuous benchmarking and comparisons of similar services; and
  • Viable options for change among pre-vetted and approved IT infrastructure service options (IT supply chain management).
These management functions are delivered by IT service broker enablement, which refers to the integration of platforms that aggregate, customize and/or integrate IT service offerings through a single platform. In transforming the traditional, mostly static infrastructure model into a multisourced IT service supply chain operation, these platforms also deliver financial management and hybrid IT solution design support. They uniquely enable the infrastructure dynamism needed to pursue DevOps across a hybrid IT environment.
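The three management functions above can be made concrete with a small sketch. Everything here is hypothetical (provider names, costs and latency figures are invented for illustration): metered observations of equivalent services feed a continuous benchmark that ranks only the pre-vetted options, which is the supply-chain behavior a broker platform automates.

```python
from dataclasses import dataclass

# Hypothetical sketch of the broker's "continuous benchmarking" function.
# All providers and metrics below are invented for illustration.

@dataclass
class ServiceOption:
    provider: str
    monthly_cost: float      # USD per instance, from metering
    avg_latency_ms: float    # from active monitoring
    approved: bool           # passed IT supply-chain vetting

def rank_options(options, latency_weight=0.5):
    """Rank approved options; lower cost and lower latency both score better."""
    vetted = [o for o in options if o.approved]
    max_cost = max(o.monthly_cost for o in vetted)
    max_lat = max(o.avg_latency_ms for o in vetted)

    def score(o):
        # Normalize each metric to [0, 1] and blend; lower score wins.
        return ((1 - latency_weight) * o.monthly_cost / max_cost
                + latency_weight * o.avg_latency_ms / max_lat)

    return sorted(vetted, key=score)

catalog = [
    ServiceOption("public-cloud-a", 310.0, 42.0, approved=True),
    ServiceOption("private-cloud", 450.0, 18.0, approved=True),
    ServiceOption("hosted-provider", 280.0, 95.0, approved=False),  # not vetted
]
best = rank_options(catalog)[0]
```

Note that the unvetted option never competes, however cheap it is: that is the "pre-vetted and approved" constraint from the list above, and it is what separates broker-managed sourcing from ad hoc shadow IT.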

A DevOps Mindset in the Dynamic World of Cloud


According to Gravitant, hybrid IT is more than just a catalog of public and private IT infrastructure resources. It is a strategic approach that unifies the hardware and software operational components of an end-to-end solution. With this approach, an organization standardizes the delivery of multisourced solutions by doing the following:
  • Leveraging existing tools and resources without disruption;
  • Offering additional, automated choices for users who need speed and agility; and
  • Addressing architecture holistically, with the optimal balance of technology investments — on-premises, off-premises, hosted, private or public.

This concept requires a shift in structure and mindset because the dynamic world of the cloud requires a new organizational structure. The shift in structure helps organizations move from a

Wednesday, March 23, 2016

Are electronic medical records worth it?

The use of Electronic Medical Records (EMR) by medical professionals has increased dramatically. According to HealthIT.gov, 2015 statistics show that 56 percent of all U.S. office-based physicians (MD/DO) have demonstrated meaningful use of electronic health records. The downside of this rapid adoption is that when HIPAA was enacted in 1996, privacy was not a major focus; it took HHS eight years to publish the initial HIPAA Privacy Rule, and several more years to publish the initial security rules, which directed “covered entities” (e.g., providers, hospitals, health insurers) to perform a risk assessment, understand where their vulnerabilities were, and adopt reasonable safeguards to fix them.
Unfortunately, this timeline has made healthcare records easy pickings for cybercriminals. Since 2010, incidents of medical identity theft have doubled, according to a survey conducted by the privacy-focused Ponemon Institute. A second report, by the Identity Theft Resource Center, showed that one-third of all data breaches in the first four months of 2015 occurred in healthcare: 82 incidents in total, exposing over 1.7 million records. Modern Healthcare, in fact, estimated that the medical records of almost one in eight Americans have been compromised. The American Action Forum estimates that the breaches since 2009 have cost the healthcare system $50.6 billion. Data breaches have become so bad that Blue Cross Blue Shield has announced that it will offer its customers identity protection in 2016.
Figure 1: Number of personal data breach incidents by industry over time (Gemalto H1 2015 Breach Level Index report, http://www.gemalto.com/brochures-site/download-site/Documents/Gemalto_H1_2015_BLI_Report.pdf)
According to a report by the medical research firm Kalorama Information, the problem will worsen over time because the $25B electronic medical record industry is predicted to grow at a 7-8 percent clip in the coming year. Much of the growth is spurred by the 2009 Health Information Technology for Economic and Clinical Health (HITECH) Act, which offered financial incentives for using electronic records until 2015 and penalties for not using EMR thereafter.

Is EMR worth the cost in privacy and peace of mind?


The value of the technology has been heralded as improved diagnosis and treatment through better information access and sharing. Researchers, however, have found that the vast majority of providers don’t share electronic patient data outside their own practice. According to a study by the Agency for Healthcare Research and Quality, just 14 percent of providers were sharing data with other providers in 2013. Psychology Today notes that many medical centers’ outpatient

Saturday, March 5, 2016

Finding a Framework for Hybrid Cloud Risk Management



 (Sponsored by IBM. Originally published on Point B and Beyond)

Hybrid cloud is rapidly becoming essential to today’s information technology processes. This is why hybrid cloud risk management has become the keystone of many modern corporate strategies. To manage this shift effectively, leading enterprises are reorganizing how the business side of IT is accomplished. When this reality is coupled with the rising cost of poor cybersecurity, decisions often rise to the board level.

Threats that challenge cloud-based information systems can have adverse effects on organizational operations, organizational assets, employees and partners. Malicious entities can exploit both known and unknown vulnerabilities, compromising the confidentiality, integrity or availability of the corporate information being processed, stored or transmitted by those systems. In this environment, risk management must be viewed as a holistic activity that is fully integrated into every aspect of the business.

Establishing Standards for Hybrid Cloud Risk Management

The National Institute of Standards and Technology (NIST) offers a strong model for hybrid cloud risk management that groups activities into three categories based on the level at which they address risk-related concerns. It divides activities and concerns into:
  • The organization level (tier 1);
  • The mission and business process level (tier 2); and
  • The information system level (tier 3).
Addressing these tiers in reverse order, the NIST Risk Management Framework (RMF) provides a disciplined and structured process for integrating tier 3 enterprise information security with risk management activities. Because mission and business processes govern tier 2, those details generally lie outside the scope of a general treatment. Tier 1, the organizational level, is, however, at the heart of the organizational restructuring needed to manage risk within today’s hybrid IT environments.


Tuesday, February 23, 2016

Cancer, cloud and privacy shield


(Originally published in Dell PowerMore)
For more than 10 years, the rapid rise of cloud computing has enabled an even more rapid application of cloud to genomic medicine. In fact, since the U.S. National Institutes of Health (NIH) lifted its 2007 ban on uploading genomic data into cloud storage, the explosion in cloud use has accelerated. Some of the most impressive accomplishments in this field include:
• The Pan-Cancer Analysis of Whole Genomes (PCAWG) project, which brings together whole-genome sequencing data from the International Cancer Genome Consortium (ICGC) and The Cancer Genome Atlas (TCGA) projects. This resource describes more than 2,000 tumor and matched control samples covering more than 30 cancer types. Academics will use the EMBL-EBI Embassy Cloud and high-performance computing centers at the University of Chicago, the Electronics and Telecommunications Research Institute in Seoul, the University of California, Santa Cruz, the University of Tokyo and the Heidelberg Center for Personalized Oncology at the German Cancer Research Centre (DKFZ) (http://news.embl.de/science/1507-genome-cloud/);
• The 100,000 Genomes Project in the United Kingdom, which includes genomic data from 70,000 people: National Health Service patients with a rare disease, plus their families, and patients with cancer;
• Personalized medicine trials by the Phoenix, Arizona-based Translational Genomics Research Institute (TGen) and the Neuroblastoma and Medulloblastoma Translational Research Consortium, which use high-performance computing in the cloud to sequence tumor samples and build a treatment database that helps determine personalized drug therapies for children. The project is also being expanded to pediatric cancer clinical trials in EMEA, starting with sites in France and Lebanon.
“Cloud Powered Genomics” brings with it:
• Speed. The time it takes to process the required 90 billion data points is reduced from 10 days to 6 hours or less