Wednesday, May 31, 2017

Crisis Response Using Cloud Computing

Cloud computing is more than servers and storage. In a crisis situation it can actually be a lifesaver. BlackBerry, in fact, has just become the first cloud-based crisis communication service to receive a Federal Risk and Authorization Management Program (FedRAMP) authorization from the United States Government for its AtHoc Alert and AtHoc Connect services. If you’re not familiar with FedRAMP, it is a US government-wide program that provides a standardized approach to security assessment, authorization, and continuous monitoring for cloud products and services. The BlackBerry certification was sponsored by the US Federal Aviation Administration.

While you may not need a US Government certified solution in an emergency, your organization may really want to consider the benefits of cloud computing for crisis response. From a communications point of view, companies can use cloud-based services to quickly and reliably send secure messages to all members of staff, individual employees, or specific target groups of people. Smartphone location-mapping functions can also be easily installed and used. One advantage of application-based software installed on an employee’s smartphone is that it can be switched off when the employee is in a safe zone, providing a balance between staff privacy and protection. Location data can be invaluable, resulting in better coordination, a more effective response, and faster deployment of resources to those employees deemed to be at risk.
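As a minimal sketch of that privacy-preserving safe-zone toggle, assume safe zones are circular geofences defined by a center and radius (the coordinates and zone below are purely hypothetical):

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    r = 6371000  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def should_report_location(employee_pos, safe_zones):
    """Suppress location reporting while the employee is inside any safe zone."""
    for center_lat, center_lon, radius_m in safe_zones:
        if haversine_m(employee_pos[0], employee_pos[1],
                       center_lat, center_lon) <= radius_m:
            return False  # inside a safe zone: privacy wins, stop reporting
    return True  # outside all safe zones: report for crisis coordination

# Hypothetical office campus safe zone, 500 m radius
zones = [(38.8977, -77.0365, 500.0)]
print(should_report_location((38.8980, -77.0360), zones))  # inside -> False
```

The same check run on the device (rather than the server) keeps raw location data local whenever the employee is somewhere already known to be safe.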

Using the cloud for secure two-way messaging enables simultaneous access to multiple contact paths, including SMS messaging, email, VoIP calls, voice-to-text alerts and app notifications. Cloud-based platforms have an advantage over other crisis communication tools because emergency notifications are not only sent out across all available channels and contact paths, but continue to be sent until an acknowledgement is received from the recipient. Being able to send out notifications and receive responses, all within a few minutes, means businesses can rapidly gain visibility of an incident and react more efficiently to an unfolding situation. Wi-Fi-enabled devices can also keep the communication lines open when more traditional routes are unusable.
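The repeat-until-acknowledged pattern described above can be sketched as a small loop; the channel names, gateway call and acknowledgement store here are hypothetical stand-ins, not any vendor's API:

```python
import time

CHANNELS = ["sms", "email", "voip", "voice_to_text", "app_push"]

def send(channel, recipient, message):
    """Stand-in for a real gateway call (SMS provider, SMTP, VoIP API, ...)."""
    print(f"[{channel}] -> {recipient}: {message}")

def notify_until_acknowledged(recipient, message, acked,
                              max_rounds=3, retry_delay=0.0):
    """Fan the alert out across every contact path, then repeat the full
    sweep until the recipient acknowledges or we give up and escalate."""
    for round_no in range(1, max_rounds + 1):
        for channel in CHANNELS:
            send(channel, recipient, message)
        if acked(recipient):  # poll the acknowledgement store
            return round_no   # how many sweeps it took
        time.sleep(retry_delay)
    return None  # never confirmed: hand off to a human responder

# Example: the recipient acknowledges after the second sweep
state = {"rounds_seen": 0}
def acked(recipient):
    state["rounds_seen"] += 1
    return state["rounds_seen"] >= 2

result = notify_until_acknowledged("alice@example.com",
                                   "Evacuate Building 4", acked)
```

A production system would send channels in parallel and stagger retries, but the core contract is the same: keep broadcasting until a positive acknowledgement arrives.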

While you’re thinking about your corporation’s crisis response plans, don’t forget about the data. Accessing data through cloud-based services can prevent a rescue effort from turning into a recovery operation. Sources for this life-saving resource include:
  • Data exhaust – information passively collected through the use of digital technology
  • Online activity – all types of social activity on the Internet, such as email, social media and search activity
  • Sensing technologies – used mostly to gather information about social behavior and environmental conditions
  • “Small data” – data that is small enough for human comprehension, presented in a volume and format that makes it accessible, informative and actionable
  • Public-related data – census data, birth and death certificates, and other types of personal and socio-economic data
  • Crowd-sourced data – applications that actively involve a wide user base to solicit their knowledge about particular topics or events

Can the cloud be of assistance when you’re in a crisis? A cloud-enabled crisis/incident management service from IBM may be just what you need to protect your business. IBM Resiliency Communications as a Service is a high-availability, cloud-enabled crisis/incident management service that protects your business by engaging the right people at the right time when an event occurs, through automated mission-critical communications. The service also integrates weather alerts powered by The Weather Company into incident management processes to provide the most accurate early warning of developing weather events and enable a proactive response.

This post was brought to you by IBM Global Technology Services. For more content like this, visit ITBizAdvisor.

Cloud Musings
( Thank you. If you enjoyed this article, get free updates by email or RSS - © Copyright Kevin L. Jackson 2017)

Tuesday, May 30, 2017

Cloudy Thinking and Digital Transformation

(Originally posted on the Engility Corporation Blog)

There’s a lot to gain from cloud computing, but success requires a thoughtful and enterprise-focused approach. Cloud computing decouples data and information from the infrastructure on which they reside, a process that is a LOT more involved than dragging some folders from your desktop to a shared drive.
Cloud computing as a mission transformation activity, not a technological one.
As an organization moves from local information hosting to the cloud, one of the most important challenges is addressing cloud computing as a mission transformation activity, not a technological one. Cloud computing isn't a new technology. It's a new way of consuming and provisioning information technology services. Adopting cloud computing means paralleling your mission processes, rethinking the economic models and abstracting your applications from the technology stack silos, which are currently the norm.

Interactions and dependencies between mission applications may be more important than the data or application itself.
One of the first lessons we learned supporting customers was that cloud migration shouldn't be planned as an application-by-application movement to a different hosting environment. Cloud adoption is an application portfolio activity. Interactions and dependencies between mission applications may be more important than the data or applications themselves. That's why upfront screening, analysis and digital infrastructure modeling are so critical. Boeing flew its Dreamliner aircraft designs on a computer before it started to build. Shouldn't we (and our customers) test future IT infrastructure on a computer before moving to the cloud?
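One concrete way to treat the portfolio as a whole is to model inter-application dependencies as a graph and derive a migration order from it. This toy sketch (the portfolio and application names are hypothetical) groups applications into migration "waves" so nothing moves before everything it depends on has moved:

```python
def migration_waves(deps):
    """Group applications into migration waves using Kahn's topological
    layering: an app migrates only after all of its dependencies have."""
    apps = set(deps) | {d for ds in deps.values() for d in ds}
    indegree = {a: 0 for a in apps}
    dependents = {a: [] for a in apps}
    for app, ds in deps.items():
        for d in ds:
            indegree[app] += 1
            dependents[d].append(app)
    waves, ready = [], sorted(a for a in apps if indegree[a] == 0)
    while ready:
        waves.append(ready)
        next_ready = []
        for a in ready:
            for dep in dependents[a]:
                indegree[dep] -= 1
                if indegree[dep] == 0:
                    next_ready.append(dep)
        ready = sorted(next_ready)
    if sum(len(w) for w in waves) != len(apps):
        raise ValueError("Cyclic dependency: refactor before migrating")
    return waves

# Hypothetical portfolio: a web tier depends on an API, which depends on a DB
portfolio = {"web": ["api"], "api": ["db"], "reporting": ["db"]}
print(migration_waves(portfolio))  # [['db'], ['api', 'reporting'], ['web']]
```

The cycle check is the interesting part in practice: circular dependencies discovered during screening are exactly the applications that need rework before any hosting decision is made.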

That is the digital transformation approach we recommend to our customers, and we have now built an entire methodology around it called Cloud ASCEND. We formed an alliance with a few select partners: Cloud Security Alliance, Burstorm, Sequoia and IBM. These companies bring tools, lessons and optimizations available from the commercial sector (the technical operations viewpoint). We blend those offerings with the experience we've gained actually transitioning applications to the cloud and the lessons we've learned in the DoD and intelligence community (the secure mission delivery and performance viewpoint).

We knew the Cloud ASCEND digital transformation methodology couldn’t be some static, one-size-fits-all approach we trot out for every customer challenge. Our methodology constantly evolves because the world is always advancing. This is an important realization that all organizations need to internalize. Cloud computing enables rapid employment of new mission processes. It lets mission owners deploy capabilities that they didn't know existed. Cloud ASCEND is agile because effectively delivering the mission requires an agile methodology. 

It lets mission owners deploy capabilities that they didn't know existed.
Getting ready to migrate to the cloud? Consider a digital transformation strategy that delivers information mobility, operational scalability and mission agility. These are the real benefits that make the process worth the effort. Organizations can apply a digital transformation methodology to determine when and how to get started, allowing them to reduce risk, reduce complexity and migrate with confidence. Cloud ASCEND enables a sort of future-proofing, because digital transformation means thinking today and doing tomorrow.


Wednesday, May 17, 2017

Blockchain Business Innovation

Is there more than bitcoin to blockchain?

Absolutely, because today’s blockchain is opening up a path towards the delivery of trusted online services.

To understand this statement, you need to see blockchain as more than its most famous use case, bitcoin. As a fundamental digital tool, blockchain is a shared, immutable ledger for recording the history of transactions. Used in this fashion, it can enable transactional applications with embedded trust, accountability and transparency attributes. Instead of the Bitcoin blockchain’s reliance on the exchange of cryptocurrencies among anonymous users on a public network, a business blockchain can provide a permissioned network with known and verified identities. With this kind of transactional visibility, all activities within that network are observable and auditable by every network user. This end-to-end visibility, also known as shared ledgering, can also be linked to business rules and business logic that drive and enforce trust, openness and integrity across that business network. Applications built, managed and supported in such an environment can hold a verifiable pedigree, with security built right in, that can:
  • Prevent anyone - even root users and administrators - from taking control of a system;
  • Deny illicit attempts to change data or applications within the network; and
  • Block unauthorized data access by ensuring encryption keys can never be misappropriated.
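A toy hash-chained ledger (plain Python, not any particular blockchain platform, and omitting the consensus and permissioning layers a real business blockchain adds) shows why the tampering described above is detectable by every participant:

```python
import hashlib
import json

def _hash(block):
    """Deterministic SHA-256 over the block's contents."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

class Ledger:
    """Append-only ledger: each entry embeds the previous entry's hash,
    so rewriting history invalidates every later block."""
    def __init__(self):
        genesis = {"index": 0, "data": "genesis", "prev": "0" * 64}
        self.chain = [dict(genesis, hash=_hash(genesis))]

    def append(self, data):
        block = {"index": len(self.chain), "data": data,
                 "prev": self.chain[-1]["hash"]}
        self.chain.append(dict(block, hash=_hash(block)))

    def verify(self):
        for i, block in enumerate(self.chain):
            body = {k: v for k, v in block.items() if k != "hash"}
            if block["hash"] != _hash(body):
                return False  # block contents were altered
            if i and block["prev"] != self.chain[i - 1]["hash"]:
                return False  # chain linkage was broken
        return True

ledger = Ledger()
ledger.append({"from": "supplierA", "to": "oem", "part": "X-100", "qty": 20})
print(ledger.verify())                 # True
ledger.chain[1]["data"]["qty"] = 5000  # an admin tries to rewrite history
print(ledger.verify())                 # False: tampering is detectable
```

Because every user holds the same chain, even a root user cannot quietly change a record: the altered block's hash no longer matches, and every copy of the ledger flags it.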

From an industry vertical point of view, this approach can:
  • Give financial institutions an ability to settle securities in minutes instead of days;
  • Reduce manufacturer product recalls by sharing production logs with original equipment manufacturers (OEMs) and regulators; and
  • Help businesses of all types more closely manage the flow of goods and related payments with greater speed and less risk.
Innovators within just about any industry can build, run and manage their own business blockchain network. And even if the organization isn’t quite ready to do the heavy lifting, it can consume a blockchain service from companies like IBM.

Ready-made frameworks are also available from the Hyperledger Project, an open source collaborative effort created to advance cross-industry blockchain technologies. Available Hyperledger business frameworks include:
  • Sawtooth - a modular platform for building, deploying, and running distributed ledgers that includes a consensus algorithm which targets large distributed validator populations with minimal resource consumption.
  • Iroha - a business blockchain framework designed for incorporation into infrastructural projects that require distributed ledger technology.
  • Fabric - a foundation for developing applications or solutions with a modular architecture that allows components, such as consensus and membership services, to be plug-and-play.
  • Burrow - a permissionable smart contract machine that provides a modular blockchain client with a permissioned smart contract interpreter built in part to the specification of the Ethereum Virtual Machine (EVM).

If your team is looking to innovate and take a leadership position within your industry, business blockchains may be the perfect enhancement for your business-focused applications.



Friday, May 5, 2017

How Quantum Computing with DNA Storage Will Affect Your Health

By Guest Contributor:
Taran Volckhausen, Contributing Editor at Vector

Moore's Law, which states that processing speeds will double every two years as we cram more and more silicon transistors onto chips, has been faltering since the early 2000s, when it started to run up against fundamental limitations presented by the laws of thermodynamics. While the chip industry, with Intel leading the charge, has found ways to sidestep those limitations until now, many are now saying that despite the industry’s best efforts, the stunning gains in processor speeds will not be seen again through the simple application of Moore’s Law. In fact, there is evidence that we are reaching a plateau in the number of transistors that will fit on a single chip. Intel has even suggested that silicon transistors can keep getting smaller for only about five more years.
As a result, Intel has resorted to other practices to improve processing speeds, such as adding multiple processing cores. These new methods are only a temporary solution, however, because programs can benefit from multi-processor systems only up to a certain point.
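The "double every two years" claim is simple compound growth, which makes the projected gap easy to quantify. A one-function sketch:

```python
def projected_transistors(base_count, years, doubling_period=2.0):
    """Transistor count projected by Moore's Law: the count doubles
    once every `doubling_period` years."""
    return base_count * 2 ** (years / doubling_period)

# e.g. 1 million transistors today -> 32 million in 10 years, if the law held
print(projected_transistors(1_000_000, 10))  # 32000000.0
```

It is precisely this exponential expectation, five doublings per decade, that physical limits on transistor size can no longer sustain.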

RIP Moore’s Law: Where do we go from here?

No doubt, the end of Moore’s Law will present headaches for the technology sector in the immediate future. But is the death of Moore’s Law really all bad news? The heightened interest it is stirring in quantum computing and other “supercomputer” technologies gives us reason to suggest otherwise. Quantum computers, for instance, do not rely on traditional bit processors to operate. Instead, they make use of quantum bits, known as “qubits”: two-state quantum-mechanical systems that can process both 1s and 0s at the same time.
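That "both 1s and 0s at once" behavior can be illustrated with a toy state-vector calculation (plain Python, no quantum SDK assumed): a Hadamard gate puts the |0⟩ state into an equal superposition, and squaring the amplitudes gives the probability of measuring each outcome.

```python
import math

# A single qubit as a 2-component state vector over the {|0>, |1>} basis.
ZERO = [1.0, 0.0]  # the |0> state

# Hadamard gate: maps |0> to an equal superposition of |0> and |1>.
H = [[1 / math.sqrt(2),  1 / math.sqrt(2)],
     [1 / math.sqrt(2), -1 / math.sqrt(2)]]

def apply(gate, state):
    """2x2 matrix-vector product: one gate acting on one qubit."""
    return [sum(gate[r][c] * state[c] for c in range(2)) for r in range(2)]

def probabilities(state):
    """Born rule: measurement probabilities are the squared amplitudes."""
    return [amp ** 2 for amp in state]

plus = apply(H, ZERO)       # (|0> + |1>) / sqrt(2)
print(probabilities(plus))  # ~[0.5, 0.5]: "both 1 and 0 at once"
```

A classical simulation like this needs 2^n amplitudes for n qubits, which is exactly why real qubits outscale classical bits so dramatically.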

The advances in processing speeds made possible by quantum computing would make Moore’s Law look like a caveman’s stone tool. For instance, the Google-funded D-Wave quantum supercomputer is able to outperform traditional computers in processing speed by a mind-blowing factor of 100 million. With the advantages offered by “quantum supremacy” easy to comprehend, the race is now on between tech heavyweights such as Google, IBM, Microsoft and Intel to prototype and release the first quantum computer for commercial use. However, due to the “weird” quantum mechanics the technology relies on, there are still real barriers to working with, and storing, the data produced by qubit processing.

Brave new world: Quantum Computing with DNA-based Storage

Basically, the fundamentals of quantum mechanics don’t permit you to store information on the quantum-computing machine itself. While you could convert its data for storage on traditional devices, such as solid-state hard drives, you would need to process a nearly infinite amount of information, which would require an impossible amount of space and energy. There could be a solution, but it requires us to look within. Not in a hippy-dippy “finding yourself” sort of way, but rather at the double-helix code found in humans and almost all other organisms: DNA. For decades, researchers have been toying with using DNA as both a computing and a storage device. Recently, a team of researchers at Columbia University demonstrated a coding strategy that could store 215 petabytes of information in a single gram of DNA. “Performing sentiment analysis on quantum computing and DNA storage topics with the Vector API may uncover robust demand for these technologies in industries such as healthcare,” says Jo Fletcher, Co-Founder.
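The basic idea behind DNA storage is simple to sketch: map binary data onto the four nucleotide bases, two bits per base. The mapping below is purely illustrative; the actual Columbia scheme (DNA Fountain) adds redundancy and screens out strands that are hard to synthesize.

```python
# Illustrative 2-bits-per-base mapping: 00->A, 01->C, 10->G, 11->T.
TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
TO_BITS = {base: bits for bits, base in TO_BASE.items()}

def encode(data: bytes) -> str:
    """Turn bytes into a DNA strand: each byte becomes four bases."""
    bits = "".join(f"{byte:08b}" for byte in data)
    return "".join(TO_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def decode(strand: str) -> bytes:
    """Recover the original bytes from a strand."""
    bits = "".join(TO_BITS[base] for base in strand)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

strand = encode(b"Hi")
print(strand)          # CAGACGGC
print(decode(strand))  # b'Hi'
```

Two bits per base is what gives DNA its extraordinary theoretical density: the information lives in molecules, not in transistors that must keep shrinking.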

What would supercomputers mean for health treatments?

The human body is an incredibly complex organism. While the markets have released many life-saving drugs, many barriers hold us back from realizing their maximum potential. Standard computing isn’t powerful enough to truly predict the ways a drug will react with an individual’s particular genetic composition and unique environmental factors. With quantum computing based on DNA storage, however, you would have the ability to examine pretty much any scenario imaginable, mapping a much more accurate prediction of any given drug’s interaction with a particular person based on their genetics and environment. With quantum computing, medical professionals will be able to open a new chapter in drug prescription outcomes by tailoring each treatment to the exact requirements of each individual.

About Vector

Vector is a natural language processing application that performs information extraction on millions of news stories per day. It provides high value to any quantitative researcher, adding a collaborative-authoring workflow in perfect synergy with the most powerful and unique faceted search in the business. For more information, please visit the Vector website.

About Indexer

Indexer is a tech start-up in the artificial intelligence space and has a focus on computer vision and natural language processing technologies.

This content is being syndicated through multiple channels. The opinions expressed are solely those of the author and do not represent the views of GovCloud Network, GovCloud Network Partners or any other corporation or organization.
