Posts Tagged ‘dell technologies’

Tips for Running a Database as a Service

Yoav Eilat

Director of Product Marketing, Dell EMC
Yoav is Director of Product Marketing at Dell EMC, driving the marketing efforts for database and application solutions. He joined the EMC XtremIO team from Oracle, where he spent several years in the applications, middleware and enterprise management product teams. Yoav has an extensive background in enterprise software and data center technologies, and holds a B.Sc. in mathematics and computer science from Tel Aviv University and an MBA from the University of California, Berkeley.

Database as a Service (DBaaS) is becoming another one of those industry buzzwords that can mean almost anything. Obviously it has something to do with running databases in a cloud model. But technology vendors don’t hesitate to apply that term to any product that’s even remotely related to that topic. Database software? Yep, that’s DBaaS. Storage arrays for your database? That’s DBaaS too. A coffee machine? Probably!

For a serious discussion about DBaaS, it’s useful to look at the state of databases today. Data is the foundation on which modern businesses are built, and much of it lives in commonly used databases such as Oracle Database or Microsoft SQL Server. Database sprawl and the resulting explosive growth of database copies represent an enormous challenge for enterprise IT teams. In an IDC survey, 77% of enterprise IT decision makers said they have more than 200 instances of Oracle Database or Microsoft SQL Server in their data centers.

Source: IDC Data Management Survey for EMC, November 2015

In the same survey, more than 80% said they have more than 10 copies of each given production instance, typically for development, testing, data center operations, analytics, data protection or disaster recovery. While database copies are critical for these business activities, database administrators have often been reluctant to expand the number of database copies, due to the hardware, software and administrative costs involved.
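
Putting those two survey thresholds together shows why copy sprawl dominates capacity planning. Here is a back-of-envelope sketch; the instance and copy counts are the survey floors, while the average database size is an illustrative assumption, not a figure from the survey:

```python
# Back-of-envelope sketch of the scale the survey implies. The instance
# and copy counts are the survey floors; the average database size is an
# illustrative assumption, not a figure from the survey.
instances = 200           # production instances (survey floor)
copies_per_instance = 10  # copies per production instance (survey floor)
avg_db_size_tb = 0.5      # assumed average database size, in TB

total_copies = instances * copies_per_instance
total_capacity_tb = total_copies * avg_db_size_tb

print(f"{total_copies:,} database copies")              # 2,000 copies
print(f"~{total_capacity_tb:,.0f} TB just for copies")  # ~1,000 TB
```

Even at these conservative numbers, copies dwarf production capacity by an order of magnitude, which is exactly the cost pressure the survey respondents describe.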

And it’s not just about costs: these databases are typically not standardized, spanning a wide range of versions, patch levels and configurations. This sprawl and lack of standardization make it challenging to manage governance and compliance, and to meet service-level agreements. Inefficient management tools and a lack of visibility into the copy infrastructure can exacerbate these challenges.

So how can databases be made available to critical business activities while keeping costs under control and delivering quick service and time to market? How do you set up an efficient cloud environment that will reduce complexity, ensure data availability and accelerate business processes? Let’s go through a checklist for making sure your DBaaS initiative is a success.  (more…)

Cloud Adoption: Strategy vs. Reality

Vladimir Mandic

Chief Technology Officer & Distinguished Engineer, Data Protection Cloud, Core Technologies Division, Dell EMC
Vladimir has been driving technical innovation and change within EMC for the past 10 years, first in the area of data protection software and, currently, in cloud technologies. Prior to that, he gained rich industry experience as a solution integrator and in the service provider space. When not working on technology innovation, he may be difficult to locate due to his passion for world travel.

Myths About Migrating to the Cloud

Myth 1: Cloud Bursting
One of the original highly publicized use cases for the public cloud was bursting. The story made sense: as your demand for compute increased, you would use the public cloud to increase the capacity of your private infrastructure. Like so many good stories, bursting didn’t really happen. In fact, bursting is one of the least common public cloud use cases.
Why did bursting not become more widespread? Enterprises are either keeping applications on-premises in newly designed IaaS private clouds or they are moving them to the public cloud. It’s an OR function, not an AND one. Furthermore, it almost always happens per-application. You evaluate your future application needs and decide where it makes more sense to run the application for those needs. Bursting across environments is just too complex.

Myth 2: Multi-Cloud
Most enterprises have neither a comprehensive hybrid cloud nor an end-to-end multi-cloud strategy that covers their entire IT environment. Frequently there is a general desire for a multi-cloud strategy to minimize dependency on a single cloud provider. But that strategy again turns out to be a per-application choice rather than a centralized plan.
Organizations choose to run some applications in the private cloud and some in different public clouds. Every cloud has very different functionality, interfaces, and cost optimizations. And each time an application developer chooses an environment, it’s because that cloud was the optimal choice for that application. As a result, application mobility becomes a myth; it’s something desired, but very few are willing to settle for the lowest common denominator across clouds just to enable application mobility.
Even if customers wanted to and could move the application, it’s unlikely to happen. Moving large amounts of data between environments is challenging, inefficient, and costly. So, once the choice of a cloud provider is made, the application stays where it is, at least until the next tech refresh cycle when per-application considerations can be re-evaluated.
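
To put rough numbers on “challenging, inefficient, and costly,” consider a back-of-envelope sketch of a single migration. The dataset size, sustained link speed, utilization, and egress price below are illustrative assumptions, not quotes from any provider:

```python
# Rough sketch of why moving a large dataset between clouds is slow and
# costly. All inputs are assumptions chosen for illustration.
data_tb = 100        # dataset to migrate, in TB
link_gbps = 1.0      # nominal network throughput, in Gbit/s
utilization = 0.7    # realistic fraction of the link you actually get
egress_per_gb = 0.09 # assumed cloud egress price, in $/GB

data_bits = data_tb * 1e12 * 8
seconds = data_bits / (link_gbps * 1e9 * utilization)

print(f"transfer time: ~{seconds / 86400:.1f} days")             # ~13.2 days
print(f"egress cost: ~${data_tb * 1000 * egress_per_gb:,.0f}")   # ~$9,000
```

Weeks of transfer time plus a meaningful egress bill for a single application is why, in practice, data gravity keeps applications where they are until the next refresh cycle.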

Cloud Adoption for Legacy Applications
While so much of the focus has been on creating new applications, enterprises are also migrating traditional workloads. So what are the stages of cloud adoption?

  • Step 1: Infrastructure as a Service. Treat the cloud like a typical infrastructure; in other words, think of servers and storage as you currently think of them. Applications are installed on top of the infrastructure. Because everything is relatively generic, the choice of a cloud provider is not too critical.
    But as applications start to move, a new way of thinking evolves; you start looking at the infrastructure as services instead of servers.
  • Step 2: Software as a Service. Some legacy applications are swapped for new ones that run as a service. In this case, you don’t care where your SaaS service runs as long as it’s reliable. The choice of a cloud provider is even less relevant; what matters is the choice of the SaaS solution itself.
  • Step 3: Rewrite the Application. Some applications are redesigned to be cloud-native. In some cases, the cloud is an excuse to rewrite decades of old COBOL code that nobody understands. In other cases, features of the cloud enable an application to scale more, run faster, and deliver better services. Not all applications should be rewritten.

The Core Issue: Data
When thinking about moving applications, what’s left is the actual data, and that is where company value truly resides. Some data moves with the application it belongs to, but not all data is structured inside an application. And that is the last challenge of cloud adoption: looking at how data services can enable global, timely, and secure access to all data, whether it resides inside an application or outside of it.

The Role of IT
Just what is the role of the central IT organization, and is there a single strategy for IT? Not really.
The word “strategy” comes not from having a single plan that covers all applications, but from a comprehensive evaluation that should be done before choices are made and from having a unified set of services that ensure security, availability, and reliability of all those different environments.

Consider how IT organizations are evolving to become service brokers. For example, sometimes:

  • It makes sense to build a private cloud based on new converged (or hyper-converged) infrastructure.
  • It may make sense to go with a software-defined data center (SDDC), more often when IT has to serve unknown external consumers rather than explicit requirements.
  • IT organizations will broker services from public cloud providers such as AWS, Azure, GCE, or Virtustream.

The alternative is so-called “shadow IT,” where each application team attempts to manage its own applications without understanding the global impact of its choices. In such scenarios, security is typically the first thing to go, and data protection follows closely.

I’ve written before about how, with the move to the public cloud, responsibility for infrastructure availability shifts to the cloud provider. But that does not negate the need for a comprehensive data protection strategy.

You still need to protect your data, whether it lives on-premises or in the cloud: from external threats such as ransomware, from internally caused data corruption (the application is frequently the source of corruption, not just infrastructure failures), and from the common (and sometimes embarrassing) “threat” of “I deleted the wrong data and I need it back.”

Companies weigh the costs and benefits of any investment. There are places where different types of infrastructure deliver the right answer. For IT to remain relevant, it needs to support different types of environments. IT’s future is in delivering better on-premises services, becoming a service broker, and ensuring that data is securely stored and protected.

Conclusion
The cloud is real and it is part of every IT team’s life. IT can be instrumental in the successful adoption of the cloud, as long as they approach it with calmness and reason—and an open mind. The goal isn’t to design the world’s greatest hybrid cloud architecture. It’s about choice and designing for application services instead of looking at servers and storage separately from the applications. There will be well-designed private clouds and public clouds that are better fits for specific applications. But the applications will dictate what works best for them; they will not accept a least-common denominator hybrid cloud.
In the end, hybrid cloud is not a goal in itself; it is a result of a well-executed strategy for applications and data.

Harnessing Simplicity in Midrange Storage Ain’t Easy

Joe Catalanotti

Product Marketing Manager, Dell EMC, Midrange Storage
Joe Catalanotti is a Product Marketing Manager with Dell EMC Midrange Storage focused on Unity all-flash storage products and solutions. Joe has over 25 years of product and channel marketing experience in storage hardware/software, asset management, and CAD/CAM technology. Joe has been instrumental in the development and execution of go-to-market plans, product launches, and other facets of product marketing. Joe holds a BS degree in Industrial Engineering and Management (Sigma Epsilon Rho) from Northeastern University as well as a degree in Architecture from Wentworth Institute of Technology.

Chewing gum is simple. So is turning on a light. And it’s simple enough to forget where you parked. OK, maybe so, but in the sphere of technology, “simple” and “simple to use” are frequently advertised terms that rarely live up to the claims. Representing simplicity in technology is actually complex. There are no shortcuts to producing simplicity, especially in product design, where it’s a difficult principle to do well. Edward Tufte said, “Clutter and confusion are failures of design, not attributes of information.” Many products probably start out with simplicity as a real objective, but not everyone succeeds. Why? Because simplifying complexity is a commitment to avoid gimmicks, work-arounds, and confusion in the design. It’s about being able to build and express simplicity with substance. Simply put, simple is very hard to do.

There’s nothing easy about what midrange storage arrays do for small-to-medium enterprises and their applications. There is, however, a solution: Dell EMC Unity Storage. The difference with Unity is that it seamlessly translates the complexities of storage setup, management, and support into an integrated, powerful, and balanced unified platform with superior simplicity and ease of use. Unity introduces a new level of simplicity and affordability in a family of all-flash arrays targeted at the fastest growing segment of the storage market: small and mid-sized organizations. It was Leonardo da Vinci who said, “Simplicity is the ultimate sophistication.” The sophistication of Unity’s simplicity removes obstacles long confronted by IT generalists, shortening their path to improving processes and practices, delivering IT service management, and pursuing innovation. The design principle of “Keep It Simple, Stupid” (KISS) is harder to accomplish than it sounds. Well, maybe for others, anyway.

You can learn all about the simplicity of Unity in the full blog here.

Purpose-Built Data Protection for Today and Tomorrow

Meredith Soper

Marketing Manager, Dell EMC Data Protection Division
As a former college athlete who never lost her competitive edge, I continuously challenge myself to learn new things and become an expert on others. However, my focus is no longer basketball, but the world of data protection. My MBA and innate passion for technology led me to a career in product marketing at Dell EMC, where I aspire to add some pizazz to the already-exciting world of backup and recovery. Outside of the office, I’m a born and raised Bostonian who has trouble pronouncing her R’s (think “pahk the cah”). I love sports, shopping, and a good glass of red wine. Follow me on Twitter @Meredith_Soper and I promise to #followback!

Buying a car can be challenging. As a buyer, there are certain things you look for and see value in, and those things differ according to your needs. For instance, my ideal car is an SUV with four-wheel drive, a voice-activated GPS system, and a trusted brand behind it. From this list of must-haves you can infer my needs: a vehicle built to withstand New England winters, help with navigation, and a history of reliability. I look for certain features based on my transportation needs. It’s important to note that your list of car requirements is likely different from mine. You may live in a warmer climate where it rarely snows, and therefore prefer an eco-friendly, compact car. Or you may be a NASCAR driver and require a vehicle with a manual transmission and a five-point harness rather than a traditional seatbelt. I can confidently say your ideal vehicle looks slightly, if not completely, different from mine. Our individual buyer personas are based on each of our distinctive needs.

The same goes for many other aspects of life and business, including the way we buy and consume data protection solutions. Each data owner has a distinct set of requirements to meet in order to do their job successfully. For instance, the backup admin, database admin and vAdmin each play an important role in the business’s overall data protection strategy. Each individual may be partially, or even solely, responsible for protecting their data, and therefore has specific requirements for visibility into and control over that protection. And for efficiency and familiarity, each data owner is best empowered through their native tools and interfaces. The 2016 EMC Global Data Protection Index (GDPI) confirms these points, indicating that management of data protection aligns with the environment: the storage team focuses on storage-based protection, the virtualization team on virtual environments, and so forth. And 40% of those surveyed noted that they prefer a collaborative model, giving both IT and app owners self-service visibility and management capabilities. Although each individual works toward the common goal of fully protected and accessible data, the way each manages the data protection environment is different. Each situation is unique, and every role has unique needs.

In addition, as enterprises implement more modern approaches to data protection, new requirements will emerge that introduce new buyers. For instance, while self-service in the data center has delivered many benefits, it has also contributed significantly to copy data sprawl. The proliferation of copy data increases cost and risk, detracting from an enterprise’s ability to invest in next-generation apps and infrastructure. The buyer concerned with controlling copy data is probably the person held accountable for setting SLAs for both protection and storage. They may be an infrastructure or compliance manager, and their requirements differ from those of an application, backup, or storage admin because they have a unique set of needs.
(more…)

Beyond the Buzzword: Modern Data Management

Tyler Stone

Dell EMC eCDM Product Marketing
Tyler is currently a software engineer in the Engineering Leadership Development Program at Dell EMC. He has spent the past three years at EMC across multiple roles in storage, IT, and data protection. Tyler has a love of technology and innovation, and something about data centers has always sparked his sense of wonder and amazement (as he says, maybe it’s the flashing lights). When his head isn’t up in the cloud, he is out travelling, taking pictures, or playing golf. Follow him on Twitter @tyler_stone_

When I think of technology buzzwords, I think of really big, complex concepts that slowly grow into all-encompassing terms to make those concepts easy to reference. Due to the complexity of some technology topics, however, buzzwords often create an opportunity for misconceptions. I’ve had people ask me if I work “for the cloud” when I talk about what I do, which is close – yet so far away.

In an earnest attempt to combat misconceptions, I want to discuss a term near and dear to my heart: modern data management. On the surface, it looks like this phrase implies that we want to manage data… modernly…? Well, that’s not wrong; but there’s a lot more behind this buzz-phrase.

First, we need to understand a broader trend in the industry: self-service. Basically, businesses no longer depend on IT to solve infrastructure problems, thanks to new technologies that enable increased agility. Before self-service, if an application administrator or DBA needed a copy of a production database, they would have to place a request with a storage or backup administrator and wait for it to be fulfilled. Now, application owners and DBAs have easy access to copy creation tools from their native utilities, making it extremely easy to create copies themselves for any desired use case. Self-service is the foundation for modern data management.
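
As a minimal sketch of what that shift looks like in practice, imagine a storage platform that exposes snapshots through a REST API. The endpoint, payload, and function below are hypothetical, invented purely for illustration; they are not a Dell EMC interface:

```python
import requests

# Hypothetical self-service endpoint; not a real Dell EMC API.
STORAGE_API = "https://storage.example.com/api/v1"

def create_copy_self_service(source_db: str, purpose: str) -> str:
    """A DBA creates a space-efficient copy directly from a native tool,
    with no ticket to the storage team. Illustrative only."""
    resp = requests.post(
        f"{STORAGE_API}/snapshots",
        json={"source": source_db, "label": purpose},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["copy_id"]

# Before self-service, this request would sit in a storage team's queue;
# now it completes in seconds (uncomment against a real endpoint):
# copy_id = create_copy_self_service("prod-orders-db", "dev-refresh")
```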

While self-service is necessary to keep up with the demands of agile businesses, it raises some concerns. Specifically, if application owners and DBAs are managing their own copies, how can managers enforce a specific set of rules or policies to ensure compliance with business objectives, such as protection SLAs or retention policies? This trend toward self-service has led to silos of storage managed by decentralized teams, making it difficult for IT managers and protection administrators to monitor their copy ecosystem and ensure service level agreement (SLA) compliance.

Moreover, native utilities have no way to monitor or enforce SLAs or protection policies. Copy creation tools simply relay that the copy was created successfully. Even if application, database, and backup administrators create copies to align with protection SLAs, they can’t be expected to constantly check the environment to ensure that nothing has happened to those copies. Administrators are left to assume that their copies are sitting healthy on storage, even though they could be missing their SLAs.
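
To make the gap concrete, here is a minimal sketch of the check that native utilities never perform: given basic copy metadata, flag assets whose newest copy has aged past an SLA. The field names and the 24-hour SLA are illustrative assumptions:

```python
from datetime import datetime, timedelta

# Illustrative 24-hour SLA: every asset needs a copy newer than this.
SLA_MAX_AGE = timedelta(hours=24)

copies = [
    {"asset": "prod-orders-db", "created": datetime(2017, 3, 6, 2, 0)},
    {"asset": "prod-hr-db",     "created": datetime(2017, 3, 1, 2, 0)},
]

def sla_violations(copies, now):
    """Return the assets whose most recent copy is older than the SLA."""
    newest = {}
    for c in copies:
        asset, created = c["asset"], c["created"]
        if asset not in newest or created > newest[asset]:
            newest[asset] = created
    return [a for a, t in newest.items() if now - t > SLA_MAX_AGE]

print(sla_violations(copies, datetime(2017, 3, 6, 12, 0)))
# ['prod-hr-db'] -- without automated oversight, this miss goes unnoticed
```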

OK, so how does modern data management solve these problems? Well, modern data management provides a comprehensive view of data generated in the modern enterprise to enable SLA-based copy data lifecycle management, thus ensuring enterprise compliance and security. In order to fulfill this mission, modern data management consists of two additional components that build on self-service: intelligent copy oversight and analytics.

Intelligent copy oversight, such as Dell EMC Enterprise Copy Data Management, provides managers with a view of all copies in a data center. This component allows managers to create SLOs and apply them to protectable storage assets. Then, copies of assets will be continually monitored for compliance, and SLA enforcement can be automated to ensure compliance with business objectives. With intelligent copy oversight, managers will have a clear way to view and manage their copy ecosystem. (more…)
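
Continuing the sketch above, that SLA enforcement automation can be pictured as a small loop that pairs an SLO definition with a remediation action. This is purely an illustration of the idea, not the Enterprise Copy Data Management API:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class SLO:
    """Illustrative service level objective for an asset's copies."""
    name: str
    max_copy_age: timedelta  # the newest copy must be younger than this
    min_copies: int          # floor on the number of retained copies

def enforce(asset, copies, slo, now, create_copy):
    """Check one asset against its SLO; remediate if out of compliance."""
    newest = max((c["created"] for c in copies), default=None)
    stale = newest is None or now - newest > slo.max_copy_age
    if stale or len(copies) < slo.min_copies:
        create_copy(asset)  # automated enforcement action

gold = SLO("gold", max_copy_age=timedelta(hours=4), min_copies=7)
enforce("prod-orders-db", [], gold, datetime.now(),
        lambda a: print(f"triggering new protection copy of {a}"))
```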
