Wednesday, December 28, 2016

Olisipo Can Make You A CloudMASTER®

Olisipo Learning in Portugal has a culture that is based on the constant search for new challenges. Recognized by their clients as the “Best HR Supplier”, they have placed more than 450 specialists into some of the country's most attractive and innovative IT projects. As a leading Portuguese IT sourcing partner, they also take the responsibility of managing and developing consultant skills very seriously. Olisipo's diversified and differentiating training experience accelerates professional success and enhances talent. As the only Portuguese IT company listed as a Top 500 Growth Company in Europe, they also contribute directly to the creation of jobs in the European Union. These are just a few of the reasons why Olisipo has prioritized NCTA CloudMASTER® Certification in 2017 for their candidates, consultants and clients.

The current European Cloud Computing Policy is set towards building a Digital Single Market Strategy through the European Cloud Initiative, the European Free Flow of Data Initiative and the emerging issues related to ownership, access, portability of data and switching of cloud service providers. This is why Olisipo sees themselves as more than just a company: they see themselves as a "Sea of Opportunity" for those seeking to participate in the growth of cloud computing in Europe.

To make good on that promise of opportunity, 2017 CloudMASTER® Certification training will be delivered at their Lisbon office on the following dates:
  • January 23-27, 2017 - Cloud Technologies: Managing the core set of cloud technologies.
    • This course is designed for system administrators or prospective system administrators who have one year of experience working with Windows Server or other server platforms and who want to develop their evaluation, selection, and implementation skills for cloud services, including Software as a Service (SaaS) solutions, Platform as a Service (PaaS) solutions, and Infrastructure as a Service (IaaS) solutions that target businesses of all sizes.
  • February 20-24, 2017 - Cloud Operations: Deploying, configuring, and administering Platform as a Service (PaaS) and Infrastructure as a Service (IaaS) solutions.
    • This course is designed for system administrators or cloud technologists who wish to further develop their skills for evaluating, deploying, and administering cloud services. This includes evaluating and selecting Platform as a Service (PaaS) solutions and deploying applications to the cloud, as well as maintaining, securing, and optimizing cloud solutions to achieve the best Total Cost of Ownership (TCO) and Return on Investment (ROI).
  • TBD 2017 - Cloud Architecture: Managing the design and planning the implementation of cloud architectures.
    • This course is designed for system administrators who wish to plan, design, and implement cloud services for their organizations. This includes the ability to understand cloud solution features, capabilities, and components offered by cloud providers at a deep level so as to design cloud and hybrid solutions for application deployment and infrastructure scenarios. Cloud architects must also evaluate and plan for the appropriate compute, network, database, and security components to build a solution that meets the needs of their organization. In addition, they must secure, monitor, and optimize those solutions.
These three CloudMASTER courses guide students through a wide expanse of cloud based technologies. Technologies covered include Microsoft Windows and Azure, Amazon Web Services, VMware, Linux, Google Docs, Drupal, WordPress, OpenStack, Rackspace, Digital Ocean, Chef, and Chef Solo. In comparison with other cloud computing certifications, the CloudMASTER® certification demonstrates real-world knowledge through practical activities and lab exercises, allowing students to learn and showcase a complete portfolio of skills on a wide range of common cloud technologies.

If Lisbon, Portugal is not a convenient location for you, NCTA CloudMASTER® training is also available from many other global locations. For more information please visit the NCTA/Logical Ops registration page.

Olisipo from Olisipo on Vimeo.

This content is being syndicated through multiple channels. The opinions expressed are solely those of the author and do not represent the views of GovCloud Network, GovCloud Network Partners or any other corporation or organization.

Cloud Musings
( Thank you. If you enjoyed this article, get free updates by email or RSS - © Copyright Kevin L. Jackson 2016)

Monday, December 26, 2016

What is Your 2017 Business Strategy?

Photo credit: Shutterstock

End of year predictions in December are as predictable as tomorrow’s sunrise, but a recent video on 2017 Technology Trends helped me really understand how important a digital transformation strategy is to just about any business executive. The CSC Town Hall conversation actually identified some of the specific tasks that today’s corporate executives really need to address in the coming year. To enhance your opportunity for 2017 business success, your action item list should include:

  • Reconfiguring both corporate front and back office operations for digital execution, which is expected to reshape organizational structures, employee compensation models and every business’ partner community;
  • Preparing for intelligent machines delivering advisory services to the enterprise that will drastically increase productivity and business competition;
  • Leveraging the industrial internet, aka the Internet of Things, which will use sensor cross connectivity to improve human safety and machine productivity;
  • Dealing with radically new and culturally driven business innovations that originate from the East Asian “Sinosphere”;
  • Using simplified cloud computing platforms that will drive 80% of all corporate information technology into public cloud platforms by 2020; and
  • Capturing the value proposition of the truly interactive virtual experience and digital interface for enhanced user experience and worker productivity.

To survive these imminent changes, corporations must identify new transformational business opportunities within the context of their specific industry vertical and competitive landscape. This requires much improved collaboration between business and technology leaders around developing an explicit linkage between the cloud computing economic model and the relevant business economic model. This linkage will inevitably include the effective use of virtual IT operational models that include dynamic infrastructure provisioning, infrastructure auto-scaling, application microservices and serverless computing. Preparing for these changes may also require a rethinking of your core business because customer experience design may actually drive your business success. Since quality experiences are based on customer empathy, business analytics, and cognitive technology, a successful business strategy may need to blend all of those capabilities. This also means having an ability to engage with both your customers and employees in meaningful ways, no matter where they happen to be.
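The dynamic provisioning and auto-scaling models mentioned above lend themselves to a minimal sketch. The thresholds and instance limits below are hypothetical, not drawn from any particular provider; real deployments delegate this policy to a cloud provider's auto-scaling service.

```python
# Minimal threshold-based auto-scaling policy (hypothetical thresholds
# and limits, for illustration only).
def scale_decision(cpu_utilization, current_instances,
                   min_instances=2, max_instances=10):
    """Return the target instance count for a simple threshold policy."""
    if cpu_utilization > 0.75 and current_instances < max_instances:
        return current_instances + 1   # scale out under load
    if cpu_utilization < 0.25 and current_instances > min_instances:
        return current_instances - 1   # scale in when idle
    return current_instances           # hold steady

print(scale_decision(0.90, 4))  # → 5
```

The same decision loop, fed by live telemetry and wired to a provider API, is the heart of the infrastructure auto-scaling the paragraph describes.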

Your corporate information technology team itself will also need to deal with an extremely rapid shift from technology operations to robust and automated IT service management. The coming transition to a public cloud dominated IT industry landscape will also demand corporate strategies around:
  • Matching cloud deployment and service model options to organizational risk tolerance;
  • Enhancing organizational expertise on cloud computing through an increased training investment strategy; and
  • Preventing digital operations failures due to an inability to monitor and enforce the necessarily strict IT governance models.

One final thought. Organizational risk tolerance levels may also need some significant recalibration because it only takes one successful “fail-fast” entrepreneur to reshape an entire industry. Chief financial officers and chief risk officers need to take a holistic approach that integrates business risk management and performance management, including compliance where required, as part of the overall business strategy and execution. In addressing this need, external risk management services may be the key to evolving your company from basic compliance and ad-hoc responses, to optimized business risk management, in which the value of risk management far outweighs the costs. 

This post was brought to you by IBM Global Technology Services. For more content like this, visit

Cloud Musings
( Thank you. If you enjoyed this article, get free updates by email or RSS - © Copyright Kevin L. Jackson 2016)

Friday, December 23, 2016

Firebrand Announces 2017 Accelerated CloudMASTER® Dates

Firebrand, the leader in Accelerated Learning, has recently announced its 2017 delivery schedule for their accelerated CloudMASTER® training course. Delivered in partnership with Logical Operations and the NCTA, this unique opportunity to immerse yourself in cloud computing will be offered during the following periods:
  • February 6-14, 2017 
  • March 20-28, 2017
  • May 1-9, 2017
  • June 12-20, 2017
Through this accelerated nine-day CloudMASTER® course, students will learn practical skills on how to manage cloud technologies from a wide range of leading providers. This course delivery model is 40% faster than traditional training and covers cloud technologies from Microsoft Azure, Amazon Web Services, Drupal, VMware, WordPress, Google Docs and Digital Ocean.

You’ll learn how the technologies work in each cloud system, from multiple perspectives. You’ll then move on to learning the specifics of how the cloud systems operate in your business. This will help you deploy complex applications across multiple cloud technologies. You’ll also learn practical cloud architecture skills, helping you plan and implement cloud architecture to achieve your specific business goals.

Through Firebrand’s unique Lecture-Lab-Review technique:
  • Lecture - You will be in a learning environment where there are no time restrictions. Your instructor will not be rushing away for lunch and no one is eager to catch the first train home. Appealing to both auditory and visual learners, instructors use demonstrations and real-world experience to keep the day interesting and engaging.
  • Lab - Many benefit from this form of kinesthetic learning. In our state-of-the-art labs you will be able to practice what you learn. The labs are available 24 hours a day - this really is your chance to fully immerse yourself into the subject.
  • Review - Take this opportunity to ask any questions and seek guidance from your instructor. With most training courses you will wait months before sitting the exam. Not at Firebrand Training. We operate the largest exam centre in the country, and you sit your exams as part of the course. Instructor-led self-test systems ensure you will be fully prepared for the exam.

This course is delivered by official NCTA instructors using official NCTA course materials. Once completed, you’ll be prepared for all three CloudMASTER exams covering topics like:
  • Identifying business benefits of cloud computing
  • Implementing Azure cloud services and virtual machines
  • Implementing Rackspace cloud servers
  • Protecting business continuity
This course is best suited to System Administrators looking to improve all areas of their cloud implementation knowledge. Learning practical skills will help Administrators conduct smoother implementation of cloud technologies, throughout the entire process.

For more information please visit the NCTA/Logical Ops registration page.

This content is being syndicated through multiple channels. The opinions expressed are solely those of the author and do not represent the views of GovCloud Network, GovCloud Network Partners or any other corporation or organization.

Cloud Musings
( Thank you. If you enjoyed this article, get free updates by email or RSS - © Copyright Kevin L. Jackson 2016)

Tuesday, December 20, 2016

TAP Accelerates Artificial Intelligence

Photo credit: Shutterstock

Over the past few years, the use of artificial intelligence has expanded more rapidly than many of us could have imagined. While this may evoke fear and dread in some, these relatively new technology applications are clearly delivering real value to our global society. This value is generally seen in four distinct areas:
  • Efficiency - delivering consistent and low-cost performance by characterizing routine activities with well-defined rules, procedures and criteria;
  • Expertise - augmenting human sensing and decision making with advice and implementation support based on historical analysis;
  • Effectiveness - improving the overall ability of workers and companies by improving coordination and communication across interconnected activities; and
  • Innovation - enhancing human creativity and ideation by identifying alternatives and optimizing recommendations.
One of the key drivers of sustained AI growth is the rapidly increasing availability of data. The broadening global use of the Internet and the connectivity it affords have combined to deliver data in volumes never experienced before. Applications that capitalize on this use and connectivity have also helped society grow from generating approximately 5 zettabytes of unstructured data in 2014 to a projected 40 zettabytes in 2020.
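The data-growth figures above imply a striking compound annual growth rate, which is easy to verify:

```python
# Implied compound annual growth rate (CAGR) for the move from
# roughly 5 zettabytes of unstructured data in 2014 to a projected
# 40 zettabytes in 2020.
start_zb, end_zb, years = 5.0, 40.0, 6
cagr = (end_zb / start_zb) ** (1 / years) - 1
print(f"{cagr:.1%}")  # → 41.4%
```

Sustaining over 40% annual growth for six years is what makes this availability of data such a powerful driver.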

Impressive innovations in big data algorithms have also added fuel to the explosive growth of AI. The most important of these algorithm categories include:
  • Crunchers. These algorithms use small, repetitive steps guided by simple rules to number crunch a complex problem.
  • Guides. These algorithms guide us on how best to navigate a policy, process, or workflow based on historic actions that were successful.
  • Advisors. These algorithms advise us on our best options by providing predictions, rankings, and likelihood-of-success estimates based on historic patterns.
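The "crunchers" category can be pictured with a classic example: Newton's method grinds toward an answer by applying one small, repeated rule. This is an illustrative analogy, not code from any particular AI system.

```python
# A "cruncher": repeatedly apply one simple rule until the answer
# converges. Here the rule "average the guess with n/guess" crunches
# out a square root.
def crunch_sqrt(n, steps=20):
    guess = n
    for _ in range(steps):
        guess = (guess + n / guess) / 2  # the single, simple rule
    return guess

print(crunch_sqrt(2.0))  # ≈ 1.41421356
```

Guides and advisors work the same way at heart: many simple, data-driven steps whose cumulative effect looks intelligent.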

Monday, December 19, 2016

Cognitive on Cloud

Photo credit: Shutterstock

According to the IBM Institute for Business Value, the market will see a rapid adoption of initial cognitive systems. The most likely candidates have already moved beyond descriptive and diagnostic analytics into predictive and routine industry-specific capabilities. Seventy percent of survey respondents are currently using advanced programmatic analytics in three or more departments. In fact, the widespread adoption of cognitive systems and artificial intelligence (AI) across various industries is expected to drive worldwide revenues from nearly US$8.0 billion in 2016 to more than US$47 billion in 2020.

The analyst firm IDC predicts that the banking, retail, healthcare and discrete manufacturing industries will generate more than 50% of all worldwide cognitive/AI revenues in 2016. Banking and retail will each deliver nearly US$1.5 billion, while healthcare and discrete manufacturing will deliver the greatest revenue growth over the 2016-2020 forecast period, with CAGRs of 69.3% and 61.4%, respectively. Education and process manufacturing will also experience significant growth over the forecast period.
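Those forecast numbers are easy to sanity-check: growing from US$8.0 billion in 2016 to US$47 billion in 2020 implies a compound annual growth rate of roughly 56% for the market as a whole.

```python
# Overall market CAGR implied by the forecast cited above.
start_b, end_b, years = 8.0, 47.0, 4   # US$ billions, 2016 -> 2020
cagr = (end_b / start_b) ** (1 / years) - 1
print(f"{cagr:.1%}")  # roughly 56%
```

That overall figure sits comfortably below the 69.3% and 61.4% CAGRs quoted for the fastest-growing verticals, as it should.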

Figure 1- Credit Cognitive Scale Inc.

So what can cognitive computing really do? Three amazing examples of this burgeoning computing model include:

  • DeepMind from Google, which can mirror some of the brain’s short-term memory properties. This computer is built with a neural network capable of interacting with external memory. DeepMind can “remember” using this external memory and use it to understand new information and perform tasks beyond what it was programmed to do. The brain-like abilities of DeepMind mean that analysts can rely on commands and information, which the program can compare with past data queries and respond to without constant oversight.
  • IBM Watson, which has a built-in natural language processor and hypothesis generator that it uses to perform evaluations and accomplish dynamic learning. This system is a lot more advanced than the digital assistants on our smartphones and allows users to ask questions in plain language, which Watson then translates into data language for querying.
  • The Qualcomm Zeroth Cognitive Computing Platform, which relies on visual and auditory cognitive computing to reflect human-like thinking and actions. A device running the platform can recognize objects, read handwriting, identify people and understand the overall context of a setting.

Tuesday, November 29, 2016

Europe: NCTA CloudMASTER® Hotspot

The ongoing digital transformation continues to generate a steady demand for workers with increasingly sophisticated digital skills. This process is multi-dimensional, and workers with these highly specialized skills are very much sought after. The European Union Commission estimates that there could be a shortage of around 800,000 information and communications technology (ICT) specialists in the EU by 2020. Another dimension is the growing need to reskill the existing workforce, especially in light of the Fourth Industrial Revolution and the incorporation of the Internet of Things (IoT) and cyber-physical systems into the industrial production process. The “smart factory” also opens up new possibilities for individualized and efficient customer care and smoother communication with suppliers across supply chain logistics, built on cloud-based platforms and artificial intelligence.

According to the Jacques Delors Institute in Berlin, digital skills in general are now needed in almost all types of work. In this recent contribution to the debate on the European Union, the Institute has targeted the societal changes being driven by these broad changes. To address the European continent-wide impact of digital transformation, this think tank is proposing a Europe-wide strategy to reskill workers for the requirements of connected production.

The core mission of the Jacques Delors Institute is to produce analyses and policy proposals targeting European decision-makers and the wider public. The work of the Jacques Delors Institute is inspired by the action and ideas of Jacques Delors, and organized around three axes:
  • "European Union and citizens", which covers questions of policy, institutions and civil society, focusing in particular on the themes of participatory democracy, European institutions, European political parties and European identity.
  • "Competition, cooperation, solidarity", covering economic, social and regional issues with a specific focus on the European budget, intra-EU solidarity, agriculture, cohesion policy, economic governance, and energy policy.
  • "European external action", bringing together work with an international dimension, including EU-US relations, EU relations with neighbors, and extra-EU regional integration.

These changes are, in essence, combining platform-based communication with cloud computing, improved sensor technology and the application of sophisticated algorithms to large and unstructured pools of data generated by these sensors. This combination makes it possible to link up an almost infinite number of interconnected physical objects. One of the main economic impacts of this is seen

Sunday, November 27, 2016

Smart Manufacturing Is Cloud Computing

As cloud computing simultaneously transforms multiple industries, many have wondered how this trend will affect manufacturing. Often characterized as “staid”, this vertical is not often cited when leading edge technological change is the topic. This view, however, fails to address the revolutionary nexus of cloud computing and the manufacturing industry. Referred to as the Digital Thread and the Digital Twin, these cloud-driven concepts are now driving this vertical’s future.

Digital Thread is a communication framework that connects traditionally siloed elements in manufacturing processes in order to provide an integrated view of an asset throughout the manufacturing lifecycle. Digital thread implementation also requires business processes that help weave data-driven decision management into the manufacturing culture.

A Digital Twin is a virtual representation of a manufacturer’s product used in product design, simulation, monitoring, optimization and servicing. They are created in the same computer-aided design (CAD) and modeling software that designers and engineers use in the early stages of product development. A digital twin is, however, retained for later stages of the product's lifecycle, such as inspection and maintenance.
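The digital twin idea can be reduced to a very small sketch (the asset and sensor names below are invented for illustration): a virtual object mirrors sensor readings from its physical counterpart, so later lifecycle stages such as inspection and maintenance can query the model rather than the machine.

```python
class DigitalTwin:
    """Virtual representation that mirrors a physical asset's state."""

    def __init__(self, asset_id):
        self.asset_id = asset_id
        self.state = {}  # latest reading per sensor

    def ingest(self, sensor, value):
        self.state[sensor] = value  # keep the twin in sync

    def needs_maintenance(self, limits):
        # Flag the asset when any mirrored reading exceeds its limit.
        return any(self.state.get(s, 0.0) > limit for s, limit in limits.items())

twin = DigitalTwin("pump-17")
twin.ingest("vibration_mm_s", 8.2)
print(twin.needs_maintenance({"vibration_mm_s": 7.1}))  # → True
```

In production, of course, the twin lives in the CAD and simulation tooling described above rather than a bare dictionary, but the pattern of mirrored state plus queries against it is the same.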

Figure 1- The smart manufacturing landscape

When successfully combined, these processes can deliver on the promise of Smart Manufacturing, which includes:
  • The ability to receive published data from equipment using secure open standards, analyze and aggregate the data, and trigger process controls back to equipment, systems of record and process workflows across the enterprise and value chain, connected via A2A and B2B open standards;
  • Autonomous and distributed decision support at the device, machine and factory level; and
  • Ubiquitous use of mined information throughout the product value chain, including end-to-end value chain visibility for each product line, connecting the manufacturer to customers and the supplier network.

Tuesday, November 22, 2016

George Youmans, Jr.: The CloudMASTER Fashionista!

So how could a NCTA Certified CloudMASTER accelerate his career in the fashion industry?

To answer that question, you would need to catch up with George Youmans, Jr. He has been with fashion giant Ralph Lauren since October 2012. That was around the time he decided to complete the NCTA CloudMASTER curriculum. After graduation George was first promoted to Senior IT Technician and then to Senior Technologist.

So why would a company like Ralph Lauren even need a cloud computing specialist? 

Ralph Lauren installed interactive window displays at London’s most up-market department store, Harrods. Shoppers can use their smartphones to activate an interactive map that leads them directly to the Fashion Lab, where they can buy all of the items they see on display. If the store is closed, users can still access information about the Ralph Lauren collection from the Harrods website.

Ralph Lauren conducted a technology trial where they embedded RFID tags in clothing that can be detected by the dressing-room mirror. Details about clothing items are displayed on the mirror (several languages are supported), and the system also synchronizes with inventory and point-of-sale systems. The mirror can also mimic the lighting of various environments. Some of its lighting options are white, dusk, club and aquarium. Other lighting options are tailored to the Ralph Lauren brand like "Fifth Avenue Daylight," "East Hampton Sunset" and "Evening at the Polo Bar".
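At its core, the mirror flow described above is a lookup keyed on the RFID read. A toy sketch follows; the tag IDs, catalog entries and prices are all invented, and a real system would query the inventory and point-of-sale services rather than a dictionary.

```python
# Hypothetical catalog keyed by RFID tag ID (illustrative data only).
CATALOG = {"tag-001": {"name": "Polo Shirt", "price": 98.50, "stock": 12}}

def on_tag_read(tag_id):
    """Resolve an RFID read to the text the mirror would display."""
    item = CATALOG.get(tag_id)
    if item is None:
        return "Unknown item"
    return f"{item['name']} - ${item['price']:.2f} ({item['stock']} in stock)"

print(on_tag_read("tag-001"))  # → Polo Shirt - $98.50 (12 in stock)
```

Language selection and lighting presets layer on top of this same read-resolve-display loop.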

The luxury fashion brand has also joined the race to produce fashionable products for the wearable-technology market. Its men-only PoloTech Shirt was designed to read vital signs like heart rate and variability, breathing depth and recovery, intensity of movement, energy output and stress levels.

Sunday, November 20, 2016

Is Cloud Interoperability a Myth?

Photo credit: Shutterstock
As the industry matures, cloud computing will increasingly rely on interoperability in order to grow and deliver more value to industry. Assuming this is a fact, what does it mean when eighteen major OpenStack vendors come together to work through the challenges involved with achieving enterprise interoperability? Events at the OpenStack Summit in Barcelona helped provide a window into the promise of tomorrow's interoperable environment.
In cloud computing, interoperability generally refers to the ability of service models from different cloud service providers to work together. Specifically:
  • Infrastructure-as-a-Service
    • Access mechanism - defines how a service in the cloud may be accessed by users and/or software developers;
    • Virtual resources - service delivery as a complete software stack for installing a virtual machine;
    • Network - addressing and API;
    • Storage - management and organization of storage;
    • Security - authentication, authorization, user accounts and encryption;
    • Service-level agreement - architecture format and monitoring.
  • Platform-as-a-Service
    • The exchange of data and services among different platforms hosted on different cloud infrastructures;
    • Data compatibility among different platforms;
    • Portability between platforms;
    • Data transfer procedures (i.e., packing, copying, instantiating, installing, deployment and customization).
  • Software-as-a-Service
    • Interoperability among applications in the same cloud;
    • Data exchange and operation calls in applications on different cloud computing environments;
    • Software programs distributed across different cloud environments that integrate data and applications in the cloud in a unified way; and
    • Migration of applications from one cloud environment to another.
If this isn’t enough of a challenge, one would also need to specifically address the many embedded and overriding interoperability aspects, including:

  • Technical interoperability - development of standards of communication, transport and representation;
  • Semantic interoperability - the use of different terms to describe similar concepts may cause problems in communication, execution of programs and data transfers;
  • Political/human interoperability - the decision to make resources widely available has implications for organizations, their employees and end-users;
  • Interoperability of communities or societies - there is an increasing need to access information from a wide range of sources and communities; and
  • International interoperability - in international matters there are variations in standards, communication problems, language barriers, differences in communication styles, and a lack of a common basis.
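One common way to contain the portability problems listed above is to code against a provider-neutral interface, so application logic never touches provider-specific calls. The sketch below is illustrative only; the provider classes and method names are invented, not real SDK APIs.

```python
from abc import ABC, abstractmethod

class CloudProvider(ABC):
    """Provider-neutral interface the application codes against."""
    @abstractmethod
    def launch_instance(self, image: str) -> str: ...

class ProviderA(CloudProvider):
    def launch_instance(self, image: str) -> str:
        return f"provider-a:{image}"   # stand-in for a real SDK call

class ProviderB(CloudProvider):
    def launch_instance(self, image: str) -> str:
        return f"provider-b:{image}"

def deploy(provider: CloudProvider, image: str) -> str:
    # Application code depends only on the interface, so switching
    # clouds means swapping the provider object, not rewriting logic.
    return provider.launch_instance(image)

print(deploy(ProviderA(), "ubuntu-16.04"))  # → provider-a:ubuntu-16.04
```

This is essentially the role a common integration layer like OpenStack plays at the infrastructure level: one interface, many implementations.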

As one may imagine, the rapid growth of cloud computing and the global proliferation of service providers have created an intractable many-to-many interoperability quagmire that can never be tamed. Knowing this, the OpenStack Interop Challenge looks toward cultivating success by leveraging the open source cloud technology as a common integration layer. Participants include AT&T, Canonical, Cisco, DreamHost, Deutsche Telekom, Fujitsu, HPE, Huawei, IBM, Intel, Linaro, Mirantis, OSIC, OVH, Rackspace, Red Hat, SUSE and VMware. The goal was to publicly demonstrate how OpenStack delivers on the promise of interoperability across on-premises, public and hybrid cloud deployments.

Although you would expect this strategy to greatly simplify the integration challenge, contrarian views are out there. One of the most vocal is Boris Renski, co-founder of Mirantis and member of the OpenStack board of directors. He believes interoperability does not necessarily start at the IaaS layer, and that applications can be built to be interoperable across different infrastructure platforms. Quoting his OpenStack Summit keynote:

"Even across Mirantis-powered OpenStack clouds like AT&T and the Volkswagen cloud, they are both based on the same distribution, but the underlying reference architectures are dramatically different…Volkswagen can't throw something at AT&T and it will just work."

In this post I’m happy to report, though, that the participating OpenStack cloud vendors were able to announce successful completion of the interoperability challenge. While this success is clearly a baby step on the long and treacherous road to cloud interoperability, it is worth noting because this modest achievement also led to the creation of automated tools for the deployment of applications across a variety of OpenStack environments. The effort also generated significant collateral on cloud computing interoperability best practices and is expected to drive even further interoperability collaboration across the OpenStack community.

This post was brought to you by IBM Global Technology Services. For more content like this, visit Point B and Beyond.

Cloud Musings
( Thank you. If you enjoyed this article, get free updates by email or RSS - © Copyright Kevin L. Jackson 2016)

Wednesday, November 9, 2016

Should Data Centers Think?

As cloud computing becomes the information technology mainstream, data center technology is accelerating at breakneck speed. Concepts like software-defined infrastructure, data center analytics and Nonvolatile Memory Express (NVMe) over Fabrics are changing the very nature of data center management. According to industry research firm IDC, organizations will spend an additional $142.8 billion on infrastructure for both public and private cloud environments in the next three years (2016-2018) to boost efficiency and business agility.

To support this rapidly evolving space, Intel announced a “Cloud for All” initiative last year in order to help businesses get the most out of their cloud infrastructure. Specific goals for this initiative include:
  • Investing in the ecosystem to accelerate enterprise-ready, easy-to-deploy software defined infrastructure (SDI) solutions;
  • Optimizing SDI solutions to deliver highly efficient clouds across a range of workloads by taking full advantage of Intel platform capabilities; and
  • Aligning the industry and engaging the community through open industry standards, solutions and routes to market to accelerate cloud deployment.
As cloud infrastructure management moves toward these new management paradigms, those at the leading edge are exploring how to make data centers think for themselves. Industry leaders like Dr. Brian Womack, Director of Distributed Analytics Solutions in Intel’s Data Center Solutions Group, and Das Kamhout, Senior Principal Engineer at Intel, are learning how to use data, artificial intelligence frameworks and machine learning to create data centers that think for themselves. Two key components of their vision are SNAP and TAP.

SNAP is a powerful open data center telemetry framework. It can be used to easily collect, process, and publish telemetry data at scale. It enables better data center scheduling and workload management through access to underlying telemetry data and platform metrics. The framework greatly improves system administrator control of the intelligent use of data center infrastructure in cloud environments by:
  • Empowering systems to expose a consistent set of telemetry data;
  • Simplifying telemetry ingestion across ubiquitous storage systems;
  • Improving the deployment model, packaging and flexibility for collecting telemetry;
  • Allowing flexible processing of telemetry data on agent (e.g. machine learning); and
  • Providing powerful clustered control of telemetry workflows across small or large clusters.
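The collect-process-publish cycle that a telemetry framework like SNAP manages can be pictured with a tiny stand-in pipeline. The function names and metric namespace below are illustrative only, not the actual SNAP plugin API.

```python
import statistics

def collect():
    # Stand-in for a collector plugin reading platform metrics.
    return [41.0, 43.5, 44.2, 42.8]   # e.g. per-core CPU temperatures

def process(samples):
    # Stand-in for a processor plugin aggregating raw telemetry.
    return {"mean": statistics.mean(samples), "max": max(samples)}

def publish(metrics):
    # Stand-in for a publisher plugin shipping metrics downstream.
    for name, value in metrics.items():
        print(f"node42.cpu.temp.{name} = {value:.1f}")

publish(process(collect()))
```

SNAP's contribution is doing this consistently, at scale, across every node in the data center, so schedulers and analytics platforms see one coherent telemetry stream.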
Trusted Analytics Platform (TAP) makes the SNAP telemetry usable by providing the tools, components and services necessary in the creation of advanced analytics and machine learning solutions. TAP makes these resources accessible in one place for data scientists, application developers and system operators. An open-source software platform optimized for performance and security, TAP simplifies solution development through the delivery of a collaborative and flexible integrated environment.

With TAP, interactive analysis, modeling and algorithmic process flows on any type of raw data, streaming in real time or in batch, are possible using either a GUI or a text-based shell. These models and flows can be used for batch processing or be integrated into applications. TAP includes REST APIs usable by any web-capable language (e.g., Python, Java, PHP, Ruby, JavaScript) over HTTP, as well as a Python API for server-local access. It operates on most data stores and file systems, including cluster federations that can enable data sharing (with security). The integrated operations management tools in TAP allow monitoring and control from top to bottom. In support of trust, TAP security follows layered security and defense-in-depth principles to provide transparent encryption and decryption, as well as fine-grained access authorization based on a variety of authentication mechanisms and assurance levels.
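The REST access pattern described above might look like the following from Python. The host, path and payload are hypothetical; consult the actual TAP API documentation before relying on any of them.

```python
import json
import urllib.request

def build_scoring_request(host, record):
    """Assemble (but do not send) a JSON POST to a hypothetical endpoint."""
    return urllib.request.Request(
        url=f"https://{host}/api/v1/models/score",   # invented path
        data=json.dumps(record).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_scoring_request("tap.example.com", {"sensor": "temp", "value": 44.2})
print(req.get_full_url())  # → https://tap.example.com/api/v1/models/score
```

Because the interface is plain HTTP plus JSON, the same call is reachable from any of the web-capable languages the paragraph lists.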

In combination, SNAP and TAP could make sentient data centers a reality.

Visit Chip Chat to hear more about creating a data center that thinks for itself!

This content is being syndicated through multiple channels. The opinions expressed are solely those of the author and do not represent the views of GovCloud Network, GovCloud Network Partners or any other corporation or organization.

Cloud Musings
( Thank you. If you enjoyed this article, get free updates by email or RSS - © Copyright Kevin L. Jackson 2016)

Monday, October 31, 2016

For Top Cyber Threats, Look in the Mirror

A recent report by Praetorian, a cybersecurity company headquartered in Austin, TX, focused on threats that resulted in data compromise or access to sensitive information. Based on a review of 100 separate internal penetration test engagements spanning 75 unique organizations, the study identified the five most prevalent threats to corporate data, considering only security weaknesses that were used to obtain a full network compromise. The amazing thing about these weaknesses is that the top four are all based on utilizing stolen credentials, and the fifth helps an attacker be more effective in using those stolen credentials. In other words, the enemy is right there in the mirror!
Where are your pain points?

The most prevalent threat is something we’ve all heard of before – Weak Domain User Passwords.  Most corporate environments use Microsoft’s Active Directory to manage employee accounts and access, but Active Directory only requires passwords to meet a minimum length and contain specific character sets. Fully addressing this weakness therefore requires third-party software.
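A third-party password filter typically layers extra checks on top of Active Directory's length-and-character-set policy, such as a longer minimum and a banned-word list. The thresholds and banned substrings below are illustrative only, not any vendor's actual product logic:

```python
import re

# Illustrative banned-word list; real products ship large breach-derived
# dictionaries rather than a handful of obvious strings.
BANNED_SUBSTRINGS = {"password", "welcome", "company", "2016"}

def is_strong(password, min_length=14):
    """Sketch of a stricter policy than Active Directory's defaults."""
    if len(password) < min_length:
        return False
    lowered = password.lower()
    if any(bad in lowered for bad in BANNED_SUBSTRINGS):
        return False
    # Keep AD's three-of-four character class rule, but apply it on top of
    # the stricter length and dictionary checks above.
    classes = [
        re.search(r"[a-z]", password),
        re.search(r"[A-Z]", password),
        re.search(r"\d", password),
        re.search(r"[^A-Za-z0-9]", password),
    ]
    return sum(1 for c in classes if c) >= 3
```

Note that a password like "Winter2016!" satisfies the default length and character-set rules yet falls to a basic dictionary attack, which is exactly the gap the extra checks close.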

The next most common corporate threat is Broadcast Name Resolution Poisoning.  Using this vector, an attacker responds to broadcast name resolution requests (e.g., LLMNR, NetBIOS, mDNS) with its own IP address.  When this happens, the credentials of a user accessing network resources can instead be transmitted to the attacker’s system.

The next big no-no is when system administrators all use the same Local Admin password. If an attacker is able to compromise the LM/NT hash representation of the password, then the attacker can use the hash to authenticate and execute commands on other systems that have the same password.  Using the hash, an attacker doesn’t need the actual password at all!
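The standard mitigation is to give every host a unique local admin password, so a hash stolen from one machine cannot be replayed against another; this is the idea behind tools such as Microsoft LAPS. The sketch below is a toy illustration of that rotation step, with a plain dict standing in for a properly secured password vault:

```python
import secrets
import string

ALPHABET = string.ascii_letters + string.digits + string.punctuation

def rotate_local_admin(hosts, length=20):
    """Return a fresh, cryptographically random password per host.

    Toy illustration only: a real deployment stores these in a secured,
    access-controlled vault and rotates them on a schedule.
    """
    return {
        host: "".join(secrets.choice(ALPHABET) for _ in range(length))
        for host in hosts
    }

vault = rotate_local_admin(["ws-001", "ws-002", "srv-db-01"])
```

Because no two hosts share a password, compromising the LM/NT hash on one workstation no longer grants lateral movement across the fleet.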

Microsoft Windows operating systems have another embedded password weakness.  Believe it or not, the operating system stores domain credentials in cleartext within memory of the Local Security Authority Subsystem Service (LSASS) process.  Although this weakness requires an attacker to have Local Admin or SYSTEM-level access, it ranks high on the threat list.

This last threat enhances all of the others - Insufficient Network Access Controls. Many organizations don’t restrict network access based on business requirements.  This enables unfettered attacker mobility after only a single system on the internal network has been compromised.

These threat vectors, last updated by Praetorian in June 2016, were evaluated as part of a complete corporate network compromise kill chain.  They also highlight the importance of understanding the cybersecurity threat.  Although the mirror is a good place to start improving network security, you must also work to identify all your organization’s security pain points.  With that knowledge you can more effectively enhance your team’s defenses and eventually evolve towards a better understanding of your security threat environment.

If you are serious about protecting your data, download the full report and read about the effective strategies your company can use to protect itself.  If you are a CISO or corporate executive, IBM also provides some excellent information on how to secure the C-suite.  They also provide an interactive tool that can help better analyze your threats, protect your users and save your data from these and many other security challenges.

This post was brought to you by IBM Global Technology Services. For more content like this, visit Point B and Beyond.


Friday, October 28, 2016

Your Choice: Cloud Technician or Digital Transformer

The CompTIA Cloud+ certification validates the skills and expertise of IT practitioners in implementing and maintaining cloud technologies.  This is exactly what it takes to become a good cloud technician.  In the past few years, however, the National Cloud Technologists Association (NCTA) has recognized that evolving market demands have changed cloud computing technology in at least 13 ways:
  1. Variable pricing – Cloud service providers charge different prices at different times based on demand
  2. Pre-emptable machines – Providers are offering a lower price for machines that could be shut down and restarted at a later time without aborting the assigned task
  3. Shift from hardware to algorithms where the hardware is bundled into the software price
  4. Use of reserved instances where the user buys compute power in advance
  5. Buying in bulk where pricing is based on aggregated use even if it is sporadic in nature
  6. Cloud providers offer shared data sources along with commodity hardware
  7. Autoscaling where newer software layers offered by cloud vendors handle infrastructure scaling automatically and billing is done by service request instead of by the machine
  8. Graphic processor units have become available for jobs requiring heavy-duty parallel computation
  9. Much improved analytics that monitor the performance of your systems
  10. Significant increase in the number of options available for various business requirements and loads
  11. “Bare metal” servers that aren’t virtual.
  12. Containers, like Docker, that make deploying software much easier and faster.  The cloud can therefore spin up a new instance with a container-ready version of the OS at the bottom
  13. A growing proliferation of exotic and specialized options, all offering anything you need with the extra phrase “as a service”
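Several of these shifts (variable pricing, pre-emptable machines, reserved instances) come down to the same arithmetic: the lowest hourly rate is not always the cheapest job, because interruptions cost retry time. The rates below are made-up illustration values, not any provider's actual price list:

```python
# Hypothetical $/instance-hour rates for three pricing models.
RATES = {
    "on_demand": 0.10,
    "preemptible": 0.03,   # cheaper, but work may be interrupted
    "reserved": 0.06,      # prepaid commitment
}

def job_cost(instance_hours, model, interruptions=0, retry_hours=1.0):
    """Total cost of a job, charging retry time for each pre-emption."""
    billable_hours = instance_hours + interruptions * retry_hours
    return billable_hours * RATES[model]

# A 100 instance-hour batch job, pre-empted twice on the cheap tier:
on_demand_cost = job_cost(100, "on_demand")                      # 10.00
preemptible_cost = job_cost(100, "preemptible", interruptions=2)  # 3.06
```

Even with two interruptions, the pre-emptable run wins here; the trade-off flips only when a workload cannot tolerate restarts, which is why understanding the business requirement matters as much as reading the price list.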

This means cloud computing isn’t just about technology.  It is about leading organizations through the Digital Transformation era.  This is why the NCTA CloudMASTER® certification was created.
Digital transformation is the profound and accelerating transformation of business activities, processes, competencies and models to fully leverage the changes and opportunities of digital technologies and their impact across society in a strategic and prioritized way. Executives in all industries are using digital advances such as analytics, mobility, social media and smart embedded devices as well as improving their use of traditional technologies such as ERP to change customer relationships, internal processes and value propositions.

Serving as a “Digital Transformer”, an NCTA CloudMASTER®:
  • Helps the organization transform customer experiences through
    • Customer understanding;
    • Top-line growth; and
    • Customer touch points.
  • Optimizes internal processes through
    • Process digitization;
    • Worker enablement; and
    • Performance management.
  • Transforms a company’s core functions and activities through
    • Digital modifications to the business;
    • Creation of new digital businesses; and
    • Digital Globalization.

This means that if you want to have an IT career in five years, you must strive to be a Digital Transformer, not just a cloud technician.  Our society is experiencing a fundamental shift in information technology’s overarching mission, with the support-and-maintain mind-set giving way to a more strategic, software-centric vision for IT.  IT staff of the future need the skills of a businessperson to stay current, as their company's software requirements and the options for satisfying them will be deep, varied, and changing quickly.  The IT department five years from now will also need to keep pace with nearly constant change. CloudMASTER® training and certification consists of three courses with exams:
  • NCTA Cloud Technologies that provides an overview of cloud computing that will help you develop a deep understanding of the models and understand the landscape of technologies used in the cloud and those employed by users of cloud services. You will receive multiple points of view, firsthand experience and a foundation in managing industry-leading cloud services like Amazon Web Services, Drupal, WordPress, Google Docs and DigitalOcean.
  • NCTA Cloud Operations that helps you study the management of cloud operations and addresses the application need for compute power, managing CPU scaling, and meeting both structured and unstructured storage requirements. You will learn how to painlessly deploy fairly complex applications that scale across multiple instances using cloud technologies including Windows Azure, Chef, Chef Solo, Linux and Windows tools.
  • NCTA Cloud Architecture that includes hands-on experience with OpenShift, OpenStack, VMware, Amazon Web Services, Azure and Rackspace, and provides a framework to assess application performance needs while addressing business requirements of Return on Investment (ROI), Total Cost of Ownership (TCO) and Key Performance Indicators (KPIs). Groups will complete a cloud assessment of Fortune 100 firms using public information and make presentations to the client.

The more complex and interconnected cloud environments become, the more a general understanding and knowledge of how it all works together will be valued.  IT staff will no longer be the ones responsible for “managing the plumbing”; they'll be the people who are thinking of new ways to monetize, share, and use corporate data for organizational success.

So which future do you want for you and your family?

