Monday, September 19, 2016

Understand The Language Of Data: Strata+Hadoop World and TAP


Our world is driven by data.  It may speak in whispers, but it can also scream insight and information to those who understand its language. This is why I’ll be attending Strata+Hadoop World, September 26th to 29th, in New York City.

Even though data can speak many different languages, data scientists act as our interpreters and guides.  They help us survive and thrive in this data-driven world by addressing and taming the many business challenges it presents, including:
  • Selecting an appropriate interpretive language, be it algebraic notation, an adapted programming language or both;
  • Separating the data signal from the data noise;
  • The enablement of data access and data connectivity within the enterprise;
  • Handling the complexity and variety of data, which can include images, videos and abstract representations of both the physical and living world;
  • Integration of the time variable into the data interpretation process;
  • Security and protection of the data; and
  • Collaboration with a strong and innovative technology partner.[1]
That last challenge is actually why I’m eager to learn more about the Trusted Analytics Platform (TAP), open source software optimized for creating cloud-native data analytics applications. This multi-tenant platform contains connectors for data ingestion, multiple distributed data stores, advanced processing engines and collaborative analytics capabilities.  It even includes machine learning, model building and visualization within a multi-language application runtime environment, which lets developers and data scientists work in the languages they know best. At every layer of the platform, performance optimizations maximize analytic operation speed.  Data security enhancements are also embedded, from the silicon up, to protect both the data and its processing.

Instead of starting from scratch and deploying a host of different tools, packages and services, TAP provides an extensible environment that combines many open-source components into a single, integrated platform.  This integrated architecture provides the APIs, services and extensibility to support the needs of data scientists and application developers for varied analytics on virtually any data, of any size, located anywhere. It also provides management tools and services to control and monitor operations from top to bottom.
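
TAP exposes these capabilities through its own APIs and services, but as a rough illustration of the kind of ingest-prepare-model workflow such an integrated platform stitches together, here is a minimal PySpark sketch. This shows only the general pattern, not TAP’s actual API; the data path and column names are hypothetical.

    # Minimal sketch of an ingest-prepare-model workflow of the kind an
    # integrated analytics platform supports. Assumes a running Spark
    # installation; the HDFS path and column names are hypothetical.
    from pyspark.sql import SparkSession
    from pyspark.ml.feature import VectorAssembler
    from pyspark.ml.classification import LogisticRegression

    spark = SparkSession.builder.appName("tap-style-demo").getOrCreate()

    # Ingest: read data from a distributed store.
    df = spark.read.csv("hdfs:///data/sensor_readings.csv",
                        header=True, inferSchema=True)

    # Prepare: combine raw columns into a single feature vector.
    features = VectorAssembler(inputCols=["temperature", "vibration"],
                               outputCol="features").transform(df)

    # Model: fit a simple classifier against a labeled outcome column.
    model = LogisticRegression(labelCol="failure",
                               featuresCol="features").fit(features)
    print(model.coefficients)

    spark.stop()

The point is not the particular algorithm but that ingestion, feature preparation and model fitting all run against the same platform services instead of a hand-assembled toolchain.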

TAP also includes a rich marketplace where tools and services can be easily integrated and provisioned on demand. This marketplace is accessible through a simple, browser-based interface to a purpose-built service catalog. Application developers, data scientists and system operators all have the flexibility to choose the tools and services they need for ingestion, storage or manipulation of data. In addition, system operators can add services to the TAP Marketplace in their own instance of TAP, which saves time by eliminating the need to identify and curate key tools and libraries. All of this is done in a secure, collaborative, high-performance environment. A growing number of organizations support, use and contribute to TAP in order to address use cases such as:
  • Customer behavior analysis using wearable IT systems;
  • Tracking disease progression and treatment;
  • Asset management using RFID data;
  • Equipment failure prediction and optimization using sensor data; and
  • Privacy-preserving genomic analysis using diverse distributed data sets.
Join me in New York next week at Strata+Hadoop World to learn more. To prepare, you can read the TAP documentation and code at https://github.com/trustedanalytics, visit the project’s public Jira at https://trustedanalytics.atlassian.net or contact the team directly at trustedanalytics@gmail.com.




[1] https://dzone.com/articles/challenges-of-bigdata

( This content is being syndicated through multiple channels. The opinions expressed are solely those of the author and do not represent the views of GovCloud Network, GovCloud Network Partners or any other corporation or organization.)







Monday, September 12, 2016

Transformative Training for Hybrid Cloud

Figure 1 - Shawn Bolan, Technical Training Manager, New Horizons of Nebraska (https://www.linkedin.com/in/shawn-bolan-96b8a9103)
In a recent CloudTech article, a multi-cloud, or hybrid cloud, strategy was heralded as “…transformative for businesses, allowing them flexibility to scale offerings, save on hosting solutions, and ultimately offer better solutions to their customers.” The article goes on to cite:
  • A 2016 Dimensional Research survey of more than 650 IT decision-makers, which indicated that 77 percent of businesses are planning to implement multi-cloud architectures in the near future; and
  • A 2015 IDC study that found 86 percent of enterprises predicted they would need a multi-cloud approach to support their solutions within the next two years.

The actual benefits of hybrid cloud include:

  • Improved disaster recovery and geographic presence
  • The ability to use unique cloud-specific services from different providers as they are needed
  • The ability to leverage the public cloud benefits of low cost and virtually unlimited scalability for agile applications
  • Use of a private cloud for red-tape-bound applications or more traditional infrastructure (see the placement sketch below)
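
As a rough illustration of that last point, here is a minimal Python sketch of hybrid workload placement. The policy, workload attributes and names are hypothetical, meant only to show the decision in code form.

    # Minimal sketch of hybrid cloud workload placement, assuming a policy
    # where regulated ("red tape") workloads stay on the private cloud and
    # elastic, low-risk workloads burst to a public provider.
    from dataclasses import dataclass

    @dataclass
    class Workload:
        name: str
        regulated: bool   # subject to compliance constraints
        bursty: bool      # benefits from public cloud elasticity

    def place(workload: Workload) -> str:
        """Return the target environment for a workload."""
        if workload.regulated:
            return "private-cloud"
        if workload.bursty:
            return "public-cloud"
        return "private-cloud"  # default: keep steady-state work in-house

    for w in [Workload("payroll", True, False),
              Workload("web-frontend", False, True)]:
        print(w.name, "->", place(w))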

This may be why Shawn Bolan, Technical Training Manager for the world’s largest independent training vendor, New Horizons, decided to prepare for the new hybrid cloud world by becoming an NCTA CloudMASTER® instructor. With over 300 locations across 70+ countries, New Horizons always wants to help keep its clients ahead of the curve, and the NCTA CloudMASTER® certification is certainly poised to do just that.

Although Shawn already possesses numerous vendor certifications from companies like VMware, Cisco and CompTIA, this “non-vendor-specific” preparation will enable him and his team to deliver the high-level training needed to implement rapid hybrid cloud adoption at the demanding pace of IT. According to Shawn, “Training doesn’t stop once you’ve entered the cloud. It actually enhances your

Monday, September 5, 2016

Surviving the coming "Hackerpocalypse"

Photo credit: Shutterstock
With all the excellent training available on television today, we are all now well prepared to deal with the coming Zombie Apocalypse.  Our failure as a society lies, however, in our misunderstanding of the nature of the cybersecurity challenge. This failure threatens us all, and our survival will depend on society’s ability to deal with the evolving and maturing enterprise cybersecurity challenge.

If you’re completely oblivious to the living dead threat, a zombie apocalypse refers to a widespread (usually global) rise of zombies hostile to human life.  The zombies will engage in a general assault on civilization where victims may become zombies themselves. This causes the outbreak to become an exponentially growing crisis. The spreading phenomenon swamps normal military and law

Sunday, August 28, 2016

From PC Break/Fix to CloudMASTER®


It was late 2011, and Steven Donovan was comfortable working at SHI International Corporation, a growing information technology firm, as a personal computer break/fix technician. His company had been growing quickly, from a $1 million "software-only" regional reseller into a $6 billion global provider of information technology products and services.

At that time, cloud computing was just starting to explode onto the information technology scene. Although Amazon Web Services had been offering its Elastic Compute Cloud (EC2) since 2006, browser-based enterprise applications from companies like Google had only been around since 2009. Steven wanted to somehow elevate himself professionally, so after hearing good things about the National Cloud Technologist Association's CloudMASTER® certification, which was available at

Wednesday, August 17, 2016

Is Data Classification a Bridge Too Far?


Today, data has replaced money as the global currency of trade.

“McKinsey estimates that about 75 percent of the value added by data flows on the Internet accrues to “traditional” industries, especially via increases in global growth, productivity, and employment. Furthermore, the United Nations Conference on Trade and Development (UNCTAD) estimates that about 50 percent of all traded services are enabled by the technology sector, including by cross-border data flows.”

As the global economy has become fully dependent on the transformative nature of electronic data exchange, its participants have also become more protective of data’s inherent value. The rise of this data protectionism is now so acute that it threatens to restrict the flow of data across national borders. Data-residency requirements, widely used to buffer domestic technology providers from international competition, also tend to introduce delays, costs and limitations to the exchange of commerce in nearly every business sector. This impact is widespread because it is also driving:
  • Laws and policies that further limit the international exchange of data;
  • Regulatory guidelines and restrictions that limit the use and scope of data collection; and
  • Data security controls that route and allow access to data based on user role, location and access device.
A direct consequence of these changes is that the entire business enterprise spectrum is now faced with the challenge of how to classify and label this vital commerce component.
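
To make that challenge concrete, here is a minimal Python sketch of label-based access control keyed on user role, location and access device, the three factors noted above. The labels, roles and policy rules are hypothetical illustrations, not a recommended scheme.

    # Minimal sketch of label-based data access control. The policy maps a
    # classification label to allowed roles, allowed countries and whether
    # a managed (corporate) device is required. All entries are hypothetical.
    POLICY = {
        # label:      (allowed roles,         allowed countries, managed device?)
        "public":     ({"any"},               {"any"},           False),
        "internal":   ({"employee", "admin"}, {"any"},           False),
        "restricted": ({"admin"},             {"US"},            True),
    }

    def may_access(label: str, role: str, country: str, managed: bool) -> bool:
        roles, countries, needs_managed = POLICY[label]
        role_ok = "any" in roles or role in roles
        location_ok = "any" in countries or country in countries
        device_ok = managed or not needs_managed
        return role_ok and location_ok and device_ok

    print(may_access("restricted", "admin", "US", True))   # True
    print(may_access("restricted", "admin", "DE", True))   # False: location rule

Even a toy policy like this only works if every piece of data carries an accurate label, which is exactly the classification problem the enterprise now faces.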


Figure 1 - The data lifecycle
The challenges posed here are immense. Not only is there an extremely large amount of data being created every day, but businesses still need to manage and leverage their huge store of old data. This stored wealth is not static, because every bit of data possesses a lifecycle through which it must be monitored, modified, shared, stored and eventually destroyed. The growing adoption and use of cloud computing

Tuesday, August 9, 2016

Vendor Neutral Training: Proven Protection Against Cloud Horror Stories






Cloud computing is now entering adolescence.  With all the early adopters now swimming in the cloud pool with that “I told you so” smugness, fast followers are just barely beating the early majority. The gold rush to cloud is also driving the IT herd to get cloud computing training.  Training vendors, from multi-billion-dollar behemoths to little mom-and-pop shops, are ready to cash in with fast and easy, vendor-specific certifications for just about any cloud service provider.

Although at first glance, all is well with this vision, the industry’s adolescent hubris has started to show some troubling warning signs. The source of the trouble, however, is not with the Cloud Service Provider (CSP). The problems are actually caused by the CSP customers themselves!

Driven by an almost reflexive assumption that the planet’s largest providers are always best, most customers fail to conduct even the most basic CSP adoption due diligence. These same customers also have a very limited appetite for learning foundational cloud computing concepts. These factors have combined to make cloud computing pilot errors typical and CSP transition failures much more common. The broadening use of hybrid cloud solutions and the rapid growth in the sheer number of cloud service provider options have also contributed to this unfortunate trend. Although there is always great value in vendor-specific training, that type of focused investment should be made after enterprise IT professionals have been well grounded in cloud computing fundamentals and well versed in the now-plentiful cloud service provider options. This is why vendor-neutral cloud computing training is so critical to today’s IT professional and, by extension, the modern business enterprise.

As more companies take advantage of cloud service benefits, the need for IT professionals to be skillful in the use and implementation of a wide range of cloud services becomes even more acute. This form of training also serves as a layer of

Wednesday, July 20, 2016

Cognitive Business: When Cloud and Cognitive Computing Merge


Cloud computing has taken over the business world! With almost maniacal focus, everyone from sole proprietors to the board directors of the world’s largest conglomerates sees this new model as a “must do”. This rapid shift is, in fact, accelerating. As Jeff Bertolucci observes in “The Shift to Cloud Services Is Happening Faster Than Expected”:

“According to the sixth annual Uptime Institute Data Center Industry Survey, which examines the big-picture trends shaping IT infrastructure delivery and strategy, the move to cloud services is accelerating. The Uptime Institute’s February 2016 poll of more than 1,000 data center and IT professionals predicts that an even faster shift to the cloud will occur over the next four years, reports ZDNet.” 

Another, maybe even more important, trend actually being driven by cloud computing is the rapid expansion of cognitive computing. In this arena, IBM’s Watson, famously known for defeating Jeopardy game show champions Ken Jennings and Brad Rutter, has quickly established itself as a commercial cognitive computing powerhouse. Contemporary reports of the Jeopardy contest from the New York Times cited this victory as IBM’s “…proof that the company has taken a big step toward a world in which intelligent machines will understand and respond to humans, and perhaps inevitably, replace some of them”. Although we are not yet at the human replacement stage, the merger of cloud and cognitive computing is rocking the business status quo.

Coined “Cognitive Business,” this trend can deliver quantum-level improvement to just about any industry vertical. Examples include:
  • Using highly automated and economical cloud infrastructure to deliver proactive and predictive monitoring and threat interception in cybersecurity;
  • Leveraging cloud computing device independence to enable real-time social media analytics that coordinate delivery of context driven information and commercial offers across multiple marketing channels;

Saturday, July 9, 2016

Government Cloud Achilles Heel: The Network






Cloud computing is rewriting the books on information technology (IT), but inter-cloud networking remains a key operational issue. Layering inherently global cloud services on top of a globally fractured networking infrastructure just doesn’t work. Incompatibilities abound, and enterprise users are forced to use “duct tape and baling wire” to keep their global operations limping along. The continuing gulf between IT professionals and business managers only exacerbates this sad state of affairs. IT professionals, however, bear a greater share of the blame for the current state because we are the ones responsible for providing the operational platform and enabling the new information delivery models that drive modern constituent services and commerce.

The use of cloud has also driven changes in how governments and commercial enterprises approach data security. In the cloud era, organizations can no longer get away with treating all data at some arbitrarily high level of protection.  More than ever, they need to address data protection requirements and controls based on the lifecycle stage of the data. They also need to evaluate the numerous permutations of business function, data user role, location of access, legal or regulatory guidelines and user devices. This is especially important in the public sector where organizations use public funding and operate within a framework of public trust. While this type of analysis could have arguably been seen as overkill when organizations had direct control over the networks they used, taking that view today is tantamount to declaring open season on data for any hacker, identity thief or

Sunday, June 19, 2016

System Integration Morphs To Cloud Service Integration





Cloud Service Brokerage is evolving from an industry footnote into a major system integration play.  The CSB role has become a crucial component of any cloud computing transition because brokers help organizations aggregate multiple cloud services, integrate those services with in-house applications, and customize them to better meet customer needs. CSBs also help by consulting on and recommending the best-fit cloud services for specific business requirements and goals. Cloud brokers may also be granted rights to negotiate with different service providers on behalf of their customers. This transformation is driven by the rapid rise of cloud computing, a market that has grown from under $6B in 2008 and is expected to approach $160B in 2020. The global Cloud Service Brokerage market itself is expected to grow from $5.24 billion in 2015 to $19.16 billion by 2020.
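
For a sense of scale, those brokerage figures imply a compound annual growth rate of roughly 30 percent; a quick back-of-the-envelope check in Python:

    # CAGR implied by the cited CSB market figures:
    # $5.24B in 2015 growing to $19.16B in 2020 (five years of growth).
    start, end, years = 5.24, 19.16, 5
    cagr = (end / start) ** (1 / years) - 1
    print(f"Implied CAGR: {cagr:.1%}")  # about 29.6%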



Since CSBs merge the functions of resellers, systems integrators and independent software vendors (ISVs) into a convenient service delivery model, they deliver solutions by aggregating cloud services sourced from multiple cloud service providers. They can also customize those services to meet

Thursday, June 16, 2016

Networking the Cloud for IoT – Pt 3 Cloud Network Systems Engineering

Dwight Bues & Kevin Jackson

(This is Part 3 of a three-part series that addresses the need for a systems engineering approach to IoT and cloud network design. See Networking the Cloud for IoT - Pt. 1: IoT and the Government and Networking the Cloud for IoT - Pt. 2: Stressing the Cloud.)


The Case For Cloud Network Systems Engineering

IoT networking requirements are vastly different from those supported by today’s cloud network. The processing and transport levels are multiple orders of magnitude higher than ever seen before. More importantly, though, the societal, economic and safety ramifications of making mistakes during this transition are off the scale. This is why systems engineering of the cloud computing network is now an immediate global imperative.

Systems engineering has many different aspects, but it all starts with the question, “What do I really want to know?” This is the beginning of the CONOPS document referenced earlier, which captures User Needs: formal statements of what the user wants from the system. The CONOPS leads to Derived Requirements which, through an iterative process, are analyzed against a Target Architecture. Once a project is underway, methods of Integration are planned in order to provide Validation (did we build the right system?) and Verification (did we build the system right?) of the requirements. Further considerations for SE include how to conduct Peer Reviews of a design (whether systems, hardware or software), studying Defects, and establishing processes to ensure the Quality of the final product and Compliance with Standards.
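
As a minimal illustration of the traceability this process implies, the Python sketch below links a user need from a CONOPS to derived requirements and their verification status. The data model and entries are hypothetical, not a formal SE tool.

    # Minimal sketch of requirements traceability: a CONOPS user need is
    # linked to derived requirements, each carrying a verification flag.
    # Validation is approximated here as "all derived requirements verified".
    from dataclasses import dataclass, field

    @dataclass
    class Requirement:
        req_id: str
        text: str
        verified: bool = False   # "did we build the system right?"

    @dataclass
    class UserNeed:
        need_id: str
        statement: str
        derived: list = field(default_factory=list)

        def validated(self) -> bool:
            # "did we build the right system?" (toy approximation)
            return bool(self.derived) and all(r.verified for r in self.derived)

    need = UserNeed("UN-1", "Operators must see sensor alerts within 2 seconds.")
    need.derived.append(Requirement("DR-1.1", "End-to-end latency <= 2 s", True))
    need.derived.append(Requirement("DR-1.2", "Every alert logged for audit"))
    print("Validated:", need.validated())  # False until DR-1.2 is verified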



While multiple sources indicate that the business world is investing heavily in the IoT, there is no indication that these investments are addressing the question of what society really wants to know in the IoT world. To ensure success, design formality is necessary, lest “IoT” become the latest retired buzzword. Dr. Juran, in Juran on Leadership for Quality, makes the point that quality