
Cloud Adoption: Strategy vs. Reality

Vladimir Mandic

Chief Technology Officer & Distinguished Engineer Data Protection Cloud, Core Technologies Division, Dell EMC
Vladimir has been driving technical innovation and change within EMC for the past 10 years, first in the area of data protection software and, currently, in cloud technologies. Prior to that, he’s had rich industry experience as a solution integrator and in the service provider space. When not working on technology innovation, he may be difficult to locate due to his passion for world travel.


Myths About Migrating to the Cloud

Myth 1: Cloud Bursting
One of the original highly publicized use cases for the public cloud was bursting. The story made sense: as your demand for compute increased, you would use the public cloud to extend the capacity of your private infrastructure. Like so many good stories, bursting didn’t really happen. In fact, bursting is one of the least common public cloud use cases.
Why did bursting not become more widespread? Enterprises are either keeping applications on-premises in newly designed IaaS private clouds or they are moving them to the public cloud. It’s an OR function, not an AND one. Furthermore, it almost always happens per-application. You evaluate your future application needs and decide where it makes more sense to run the application for those needs. Bursting across environments is just too complex.

Myth 2: Multi-Cloud
Most enterprises have neither a comprehensive hybrid cloud nor an end-to-end multi-cloud strategy that covers their entire IT environment. Frequently there is a general desire for a multi-cloud strategy to minimize the dependency on a single cloud provider. But that strategy turns out, again, to be a per-application choice rather than a centralized plan.
Organizations choose to run some applications in the private cloud and some in different public clouds. Every cloud has very different functionality, interfaces, and cost optimizations. And each time an application developer chooses an environment, it’s because that cloud was the optimal choice for that application. As a result, application mobility becomes a myth; it’s something desired, but very few are willing to settle for the lowest common denominator between different choices just to enable application mobility.
Even if customers wanted to and could move the application, it’s unlikely to happen. Moving large amounts of data between environments is challenging, inefficient, and costly. So, once the choice of a cloud provider is made, the application stays where it is, at least until the next tech refresh cycle when per-application considerations can be re-evaluated.

Cloud Adoption for Legacy Applications
While so much of the focus has been on creating new applications, enterprises are also migrating traditional workloads. So what are the stages of cloud adoption?

  • Step 1: Infrastructure as a Service. Treat the cloud like a typical infrastructure; in other words, think of servers and storage as you currently think of them. Applications are installed on top of the infrastructure. Because everything is relatively generic, the choice of a cloud provider is not too critical.
    But as applications start to move, a new way of thinking evolves; you start looking at the infrastructure as services instead of servers.
  • Step 2: Software as a Service. Some legacy applications are swapped for new ones that run as a service. In this case, you don’t care where your SaaS service runs as long as it’s reliable. The choice of a cloud provider is even less relevant; it’s about choice of the SaaS solution itself.
  • Step 3: Rewrite the Application. Some applications are redesigned to be cloud-native. In some cases, the cloud is an excuse to rewrite decades of old COBOL code that nobody understands. In other cases, features of the cloud enable an application to scale more, run faster, and deliver better services. Not all applications should be rewritten.

The Core Issue: Data. When thinking about moving applications, what’s left is the actual data, and that is where company value truly resides. Some data moves with the application it lives in, but not all data is structured inside an application. And that is the last challenge of cloud adoption: looking at how data services can enable global, timely, and secure access to all data, whether it resides inside an application or outside of it.

The Role of IT
Just what is the role of the central IT organization, and is there a single strategy for IT? Not really.
The word “strategy” comes not from having a single plan that covers all applications, but from a comprehensive evaluation that should be done before choices are made and from having a unified set of services that ensure security, availability, and reliability of all those different environments.

Consider how IT organizations are evolving to become service brokers. For example, sometimes:

  • It makes sense to build a private cloud based on new converged (or hyper-converged) infrastructure.
  • It may make sense to go with a software-defined data center (SDDC), though that is more often the case when IT has to serve unknown external consumers rather than explicit requirements.
  • IT organizations will broker services from public cloud providers such as AWS, Azure, GCE, or VirtuStream.

The alternative is so-called “shadow IT,” where each application team attempts to manage its own applications without understanding the global impact of its choices. In such scenarios, security is typically the first thing to go, and data protection follows closely.

I’ve written before about how, with a move to the public cloud, responsibility for infrastructure availability shifts to the cloud provider. But that does not negate the need for a comprehensive data protection strategy.

You still need to protect your data on-premises or in the cloud from external threats such as ransomware or internally caused data corruption events (as the application is frequently the cause of corruption, not just infrastructure failures), or from the common (and sometimes embarrassing) “threat” of “I deleted the wrong data and I need it back.”

Companies weigh the costs and benefits of any investment. There are places where different types of infrastructure deliver the right answer. For IT to remain relevant, it needs to support different types of environments. IT’s future is in delivering better on-premises services, becoming a service broker, and ensuring that data is securely stored and protected.

Conclusion
The cloud is real and it is part of every IT team’s life. IT can be instrumental in the successful adoption of the cloud, as long as they approach it with calmness and reason—and an open mind. The goal isn’t to design the world’s greatest hybrid cloud architecture. It’s about choice and designing for application services instead of looking at servers and storage separately from the applications. There will be well-designed private clouds and public clouds that are better fits for specific applications. But the applications will dictate what works best for them; they will not accept a least-common denominator hybrid cloud.
In the end, hybrid cloud is not a goal in itself; it is a result of a well-executed strategy for applications and data.

Security or Protection: Which One?


Ransomware
A long time ago I heard an anecdote that the highest level of security certification was given to a system that sat in a secure room and was isolated from a network. Today we live in a connected world, and that creates much bigger surface areas for security threats. As much as IT organizations would like to limit exposure, users expect unlimited access to both business and personal email, to be able to work with attachments, to surf the web, and to interact on social media.

Pandora’s Box has been opened
Although risk cannot be completely eliminated, it can and should be managed. Meanwhile, ransomware has emerged as a top cyber threat to business. The number of attacks and their complexity are unparalleled. These are not simple drive-by threats (such as a random user visiting a site that contains malware); instead, they are custom-designed to bypass an organization’s perimeter security and target specific high-value data sets.

The combination of open access and more advanced threats is something that requires far more attention!

Many organizations derive a false level of confidence from their investment in perimeter security: firewalls, authentication/authorization, antivirus solutions and encryption over-the-wire. When assessing security and protection, however, assume that the perimeter has been breached! The breach point is already beyond antivirus software and firewalls; it is now within authenticated systems where encryption becomes transparent. Do you know what your level of protection is?

How big is the ransomware threat?
• Ransomware headlined FBI, DHS, DOJ, and NSA threat lists in 2016, and it triggered multiple US Senate and Homeland Security inquiries that resulted in formal FBI, DHS, and DOJ responses.
• It’s growing fast: At the end of Q1 2016, 93% of all phishing emails contained encryption ransomware. That’s a 763% increase year over year!

First, let’s look at the infrastructure
If you do control the infrastructure, take advantage of an Isolated Recovery Solution (IRS) for systems such as EMC VMAX and Data Domain. An IRS ensures that (a) you have a replica of your storage for the fastest possible recovery, and (b) replication happens over a link that is air-gapped whenever replication is not occurring. That way, any corruption of primary data can quickly be recovered from an unaffected replica copy.
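The air-gap cycle above can be sketched in a few lines of Python. This is a toy simulation, not the actual VMAX/Data Domain IRS workflow; the class and method names are illustrative only:

```python
from datetime import datetime, timezone

class AirGappedReplica:
    """Toy simulation of an air-gapped replication cycle: the link is
    opened only for the duration of a replication window, then closed,
    leaving an isolated point-in-time copy behind."""

    def __init__(self):
        self.link_open = False
        self.copies = []  # isolated point-in-time copies

    def replicate(self, primary_data):
        self.link_open = True  # open the air-gapped link
        try:
            # snapshot the primary data while the link is up
            self.copies.append({
                "taken_at": datetime.now(timezone.utc),
                "data": dict(primary_data),
            })
        finally:
            self.link_open = False  # close the gap again

    def recover(self):
        """Restore from the latest isolated copy (e.g. after
        ransomware corrupts the primary)."""
        if not self.copies:
            raise RuntimeError("no isolated copy available")
        return dict(self.copies[-1]["data"])

# Usage: take a copy, then "corrupt" the primary and recover.
replica = AirGappedReplica()
primary = {"orders.db": "v1"}
replica.replicate(primary)
primary["orders.db"] = "ENCRYPTED-BY-RANSOMWARE"
restored = replica.recover()
```

The key property is that the isolated copy is unreachable (the link is closed) except during the replication window, so an attacker on the primary systems cannot touch it.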

If you outsource your infrastructure (for example, by using the public cloud), does that mean security is no longer your responsibility? Remember, an IaaS provider takes responsibility for infrastructure availability and resiliency, but not for data validity. That means protecting your data on your core systems is your responsibility.

Regardless of the location or ownership of the infrastructure, you should be asking yourself these questions:
• Does it matter if we (or the provider) have certification XYZ.123 or not?
• In case of compromise, how do we recover data?
• Do we have a clean copy of the data that is isolated?
• How quickly can we recover?

Second-level safety is provided by having a well-designed data center protection strategy, including a backup solution, which provides an additional level of isolation for your data. That data should be secure and immutable, and it should be available for quick recovery to any point in time. Solutions such as EMC NetWorker or Avamar data protection software together with EMC Data Domain protection storage provide this level of protection.

Road to Efficiency, Part 2


As businesses look at cloud adoption, the question is, “What is the cloud good for?” Yes, the cloud can be efficient and elastic, but what is its real use in complex environments?

A different way of looking at the road to the cloud is by considering where your data—both primary and secondary copies—resides.
Cloud as 3rd copy
In a typical data center, your primary file and application data resides in on-premises storage arrays such as XtremIO, VMAX, VNX, or Isilon. Second-level protection is offered via data protection solutions using secondary storage such as EMC Data Domain. As a last step, select data can be tiered to the cloud via products such as CloudBoost or CloudArray, to either private clouds built using ECS or public clouds. That makes data in the cloud a 3rd tier of data. This is a good way to achieve efficiencies for specific use cases such as long-term retention, offsite copies, and data archiving, while keeping all primary processing within the data center.
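The tiering decision itself is usually a simple age-based policy: recent copies stay on secondary storage for fast operational recovery, while older ones move to the cloud tier. A minimal sketch, with an assumed retention window of 30 days (the field names and threshold are illustrative, not any product’s actual policy format):

```python
from datetime import date, timedelta

def select_for_cloud_tier(copies, retention_days=30, today=None):
    """Pick secondary copies old enough to move to the cloud (3rd) tier,
    keeping recent copies on premises for quick operational recovery."""
    today = today or date.today()
    cutoff = today - timedelta(days=retention_days)
    return [c for c in copies if c["created"] <= cutoff]

backups = [
    {"name": "weekly-01", "created": date(2016, 1, 4)},
    {"name": "weekly-02", "created": date(2016, 3, 7)},
    {"name": "daily-99",  "created": date(2016, 3, 28)},
]
to_cloud = select_for_cloud_tier(backups, retention_days=30,
                                 today=date(2016, 3, 30))
# only copies older than the retention window are tiered out
```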

Cloud as 2nd copy
A more direct way of using the cloud is to copy data directly from primary storage to the cloud (that is, storage tiering) or to protect data directly to the cloud. This results in even higher efficiencies; however, it creates a much larger dependency on the cloud for operational recovery, as there is no second copy of the data on premises.

Ideally, at this point we would look at direct-to-cloud tiering and protection with ability to maintain on-premises copies of active data for quick access.
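That combination — cloud as the authoritative second copy, plus a small on-premises cache of active data — can be sketched as a write-through store with eviction. This is a minimal illustration only; the class name and the dict standing in for object storage are assumptions, not a product API:

```python
class CachedCloudBackup:
    """Sketch of direct-to-cloud protection that keeps a small
    on-premises cache of the most recent copies for quick restores.
    A plain dict stands in for cloud object storage."""

    def __init__(self, cache_size=2):
        self.cloud = {}        # authoritative copy, in the cloud
        self.cache = {}        # recent copies kept on premises
        self.cache_order = []  # insertion order, oldest first
        self.cache_size = cache_size

    def backup(self, key, data):
        self.cloud[key] = data   # write-through to the cloud
        self.cache[key] = data   # keep locally for fast access
        self.cache_order.append(key)
        while len(self.cache_order) > self.cache_size:
            evicted = self.cache_order.pop(0)
            self.cache.pop(evicted, None)  # older copies live only in the cloud

    def restore(self, key):
        # prefer the fast local cache; fall back to the cloud copy
        return self.cache.get(key, self.cloud.get(key))

# Usage: three backups with room for two in the local cache.
store = CachedCloudBackup(cache_size=2)
store.backup("mon", "data-mon")
store.backup("tue", "data-tue")
store.backup("wed", "data-wed")  # "mon" is evicted from the local cache
```

Restoring `"wed"` is served locally; restoring `"mon"` falls back to the cloud copy, which is exactly the trade-off described above.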

Cloud as 1st copy
The last step in cloud adoption is where your primary data resides directly in the cloud, either with SaaS applications (such as Office 365, Salesforce, and Google Apps) or with hosted applications running on cloud-based PaaS (such as those developed on Pivotal’s Cloud Foundry platform) or IaaS (such as Amazon AWS or Microsoft Azure). In that case, the need for data protection remains even though responsibility for infrastructure resiliency has moved to the cloud solution provider. To achieve efficiencies, the data protection solution and the resulting data copies also reside in the cloud itself. However, you may still need to export data back on-premises for safety, compliance, or other reasons.

Road to Efficiency, Part 1


In the new IT, there are so many buzzwords, especially around cloud services. Where does the cloud actually fit?
Clouds can be private or public, and they can serve traditional “Platform 2” applications as well as new “Platform 3” applications. So let’s look at cloud services from that perspective.
Of course, some things don’t change regardless of the quadrant of the matrix. We always need to:

  • Protect the data wherever it is
  • Simplify management across environments
  • Get more value out of the data

