Profiles/Reports

Report

Cloud Automation as a Service: Now Ready for Prime Time

As companies continue to develop private clouds and selectively adopt the public cloud for some of their business apps, the benefits of hybrid cloud are becoming increasingly clear. A hybrid cloud architecture brings together private and public clouds into a single platform, delivering not just well-known cloud benefits such as near-infinite scalability and consumption-based, pay-as-you-go services, but also a whole new level of agility and deployment flexibility.

While two-thirds of companies today are running a traditional data center or private cloud architecture, Taneja Group research indicates that more than half expect to move to a hybrid cloud within the next two to three years. This trend extends to business applications as well, which are being adapted to run in the public cloud. True hybrid architectures offer application portability and a compatible cross-cloud runtime environment, effectively enabling workloads to operate seamlessly across public and private clouds and allowing customers to deploy workloads when and where they fit best.

One of the challenges companies face when moving to a hybrid cloud is how best to automate and manage their hybrid cloud infrastructure and applications. Self-service for users and developers is an essential attribute of a hybrid cloud, and the only way to effectively enable self-service capabilities is through cloud automation. Until recently, management and automation have primarily been done using traditional on-premises IT toolsets, extended to accommodate the cloud. But this approach is often a compromise, since toolsets originally designed for an on-premises data center can be less than optimal in a new cloud-based paradigm.

To overcome the limitations of traditional IT infrastructure and application toolsets, many companies are looking to adopt an as-a-service approach to automation and management. This approach has some key advantages, especially in a hybrid cloud world. For example, as companies move to a hybrid cloud, they will need automation solutions that simply and transparently span private and public cloud boundaries and provide a consistent interface across their cloud environments. Cloud automation as-a-service solutions are ideally suited to meet this requirement, since they are not tied to any particular system, cloud or location.
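
To make the idea of a consistent cross-cloud interface concrete, here is a minimal Python sketch of what a provider-agnostic provisioning layer might look like. The CloudProvider classes and provision() call are hypothetical illustrations of the pattern, not any vendor’s actual API.

    from abc import ABC, abstractmethod

    # Hypothetical sketch of a provider-agnostic provisioning layer;
    # none of these classes correspond to a real vendor API.

    class CloudProvider(ABC):
        @abstractmethod
        def create_vm(self, name: str, cpus: int, memory_gb: int) -> str:
            """Provision a VM and return its identifier."""

    class PrivateCloud(CloudProvider):
        def create_vm(self, name, cpus, memory_gb):
            # A real implementation would call the on-premises
            # virtualization platform here.
            return f"private://{name}"

    class PublicCloud(CloudProvider):
        def create_vm(self, name, cpus, memory_gb):
            # A real implementation would call a public cloud SDK here.
            return f"public://{name}"

    def provision(provider: CloudProvider, name: str) -> str:
        # Callers see one consistent interface regardless of where
        # the workload actually lands.
        return provider.create_vm(name, cpus=2, memory_gb=8)

    for target in (PrivateCloud(), PublicCloud()):
        print(provision(target, "web-01"))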

In this paper, we’ll look at the growing market awareness of a services approach to cloud automation, and the factors that are motivating IT buyers to consider adopting an as-a-service approach in place of or in addition to traditional on-premises tools.

We’ll then examine the key characteristics and use cases buyers are looking for in cloud automation services and the benefits they expect to achieve. Finally, we’ll look at a specific example of a vendor-delivered cloud automation as-a-service offering: VMware vRealize Automation Cloud. We will show how such cloud automation services are being used and the advantages they are bringing to a growing number of companies.  

Publish date: 08/30/19
Report

End-to-End NVMe Storage: Is your enterprise ready?

In the past few years, flash technology has transformed the storage market forever. Today, flash-first arrays are the new normal. We believe the new NVMe over Fabrics (NVMe-oF) shared storage protocol, combined with the advent of a broad range of NVMe-based devices including storage class memory (SCM), will prove as disruptive to the external storage market over the next five years as NAND flash technology was in the recent past.

Read this Market Perspective to understand why now is the time to ask your storage vendors important questions about their array architectures, what questions to ask them, and how to interpret the answers. Arrays that efficiently read and write to SAS-based SSDs won’t see much gain from switching their backend to NVMe until the frontend also supports NVMe-oF. This report will explain why, when, and how this shift will take place.  
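
As a back-of-the-envelope illustration of that argument, the Python sketch below sums a hypothetical I/O latency budget. Every figure is an assumption chosen purely for illustration, not a measurement, but together they show why upgrading only the backend moves the total very little while a legacy frontend stack still dominates.

    # Purely illustrative latency budget in microseconds; every value
    # here is an assumption for the sake of the argument, not measured.
    frontend_scsi_stack = 100  # legacy SCSI/FC frontend and controller path
    frontend_nvme_of = 20      # NVMe-oF frontend path
    backend_sas = 25           # SAS backend transport
    backend_nvme = 5           # NVMe backend transport
    media_read = 90            # NAND flash read itself

    legacy = frontend_scsi_stack + backend_sas + media_read
    backend_only = frontend_scsi_stack + backend_nvme + media_read
    end_to_end = frontend_nvme_of + backend_nvme + media_read

    print(f"legacy SAS array:        {legacy} us")       # 215 us
    print(f"NVMe backend only:       {backend_only} us") # 195 us, ~9% better
    print(f"end-to-end NVMe/NVMe-oF: {end_to_end} us")   # 115 us, ~47% better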

Publish date: 08/30/19
Report

HPE RMC 6.0: Extending Beyond Copy Data Management

If you’ve worked in IT, you know that a large percentage of your company’s data has been copied at least once, and often multiple times, to meet the needs of various use cases. Whether it’s backup copies for data protection, archival copies for compliance, or clones for test/dev or analytics, any particular set of data is likely to have spawned one or more copies. While these copies are nearly always made for a good reason, in many organizations they have spiraled out of control, creating a copy data sprawl that is tough for IT to get its arms around, let alone manage. As copies of data have proliferated, so have the pain points of greater storage complexity, footprint and cost. The performance of production databases also suffers as copies are made for secondary applications.

It is these very issues that copy data management (CDM) is designed to address. CDM solutions focus on eliminating unnecessary duplication of production data to reduce storage consumption, generally through the use of data virtualization and data reduction technologies. The results can be compelling. Nearly one-third of the companies that Taneja Group recently surveyed have either adopted CDM solutions or are actively evaluating them, looking to achieve benefits such as reduced storage costs, faster data access, and better data visibility and compliance.
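
A toy model helps make the data-virtualization idea concrete: rather than physically duplicating production data for each use case, a CDM catalog can hand out lightweight virtual copies that share one golden image and store only the blocks each copy changes. The Python sketch below is a deliberately simplified illustration of that copy-on-write pattern, not any vendor’s implementation.

    # Toy copy-on-write model of CDM-style data virtualization: one golden
    # image is shared, and each virtual copy stores only its own changes.

    class GoldenImage:
        def __init__(self, blocks):
            self.blocks = blocks  # block_id -> bytes, stored once

    class VirtualCopy:
        def __init__(self, golden):
            self.golden = golden
            self.overrides = {}  # only modified blocks consume new space

        def read(self, block_id):
            return self.overrides.get(block_id, self.golden.blocks[block_id])

        def write(self, block_id, data):
            self.overrides[block_id] = data  # copy-on-write

    golden = GoldenImage({0: b"prod-0", 1: b"prod-1"})
    test_dev = VirtualCopy(golden)   # clone handed to test/dev
    analytics = VirtualCopy(golden)  # clone handed to analytics

    test_dev.write(1, b"masked")
    print(test_dev.read(1))   # b'masked'  (its private change)
    print(analytics.read(1))  # b'prod-1'  (still the shared block)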

But while first-generation CDM offerings have proven helpful, they are not keeping up with the demands of new technologies and user requirements. In particular, Flash and Cloud bring new data management opportunities and challenges that cannot be addressed by traditional CDM solutions. User needs and expectations for CDM are also expanding, moving beyond just policy-based snapshot management among homogeneous arrays.

As we’ve learned in our research, next-gen CDM must meet a new set of user needs driven by Flash and Cloud innovations, including support for heterogeneous arrays, greater control of copies with less hands-on effort through intelligent, policy-based automation, and coverage of new use cases across the data lifecycle, such as test/dev, reporting and analytics. Customers are also looking for integrated solutions that combine CDM with data protection and other secondary storage functions.

As we’ll see, HPE Recovery Manager Central (RMC) 6.0 provides all these capabilities and more. In fact, we’ll argue that the updated RMC 6.0 offering has helped to make HPE a leader in the data management space, streamlining costs and enriching the experience of HPE customers while still delivering on the backup and recovery features that RMC is well known for.

Publish date: 10/16/18
Free Reports / Report

HPE and Micro Focus Data Protection for Azure Stack

Hybrid cloud is increasingly gaining popularity among enterprise IT buyers, as companies recognize and begin to validate its benefits. With a hybrid cloud, organizations can take advantage of the elasticity and agility of the public cloud, especially for new cloud-native apps, while continuing to run their businesses in the near term on their existing apps on premises. Users gain the choice of deploying new and existing workloads in the public cloud or the data center, wherever it makes the most sense, and the flexibility to migrate them as needed. A hybrid cloud significantly eases the transition to the cloud, enabling organizations to compete in the new cloud-driven world while preserving current IT investments. With these benefits in mind, well over 80% of organizations we recently surveyed are in the process of moving or planning a move to a hybrid cloud infrastructure.

In this brave new world, Microsoft Azure and Azure Stack are increasingly being adopted as the foundation for companies’ hybrid cloud infrastructure. Microsoft Azure is a leading public cloud offering that, based on Taneja Group research, consistently ranks neck and neck with Amazon Web Services in enterprise adoption, with more than 50% of companies using or planning to use Azure within the next two years. Azure Stack enables organizations to deliver Azure services from their own data center. Delivered as an integrated solution on HPE ProLiant servers, Azure Stack allows customers to run Azure-compatible apps on premises as well as use cases that benefit from a hybrid deployment. Together, Azure and Azure Stack provide a natural and relatively frictionless path for Microsoft Windows customers to move to the cloud, along with support for new cloud-native tools and services that allow customers to take full advantage of cloud agility and scalability.

As organizations move critical apps and data to the cloud, data protection quickly becomes a key requirement. But as buyers evaluate solutions, they often find that cloud providers’ built-in backup tools lack the flexibility, breadth of coverage, app awareness and enterprise capabilities they have become accustomed to on premises. As a result, companies look to other vendors—often their on-premises providers—to meet their data protection needs. As we’ll see, Micro Focus Data Protector offers a fully integrated, robust and comprehensive solution for backup and recovery on HPE Azure Stack.

In this piece we’ll further explore the need for data protection in a hybrid cloud environment, and examine the specific backup and recovery approaches that buyers are looking for, as revealed in our recent research. Then we’ll briefly examine what makes Micro Focus Data Protector an ideal solution for protecting an organization’s key information assets in an HPE Azure Stack hybrid cloud setting.

Publish date: 06/18/18
Report

Hedvig Takes Your Storage to Hybrid and Multi-Cloud

With data growth exploding and on-premises IT costs creeping ever higher, an increasing number of organizations are taking a serious look at adopting cloud infrastructure for their business data and applications. Among other things, they are attracted to benefits like near-infinite scalability, greater agility and a pay-as-you-go model for consuming IT resources. These advantages are already driving new infrastructure spending on public and private clouds, which is growing at double-digit rates as spending on traditional, non-cloud IT infrastructure continues to decline.

While most companies we speak with are already developing cloud-native apps in Amazon Web Services (AWS) or Microsoft Azure, a much smaller number have actually deployed traditional back-end business apps in the public cloud. What’s preventing them from taking this next step? As it turns out, one of the biggest hurdles is productively deploying existing data storage in the cloud. Public clouds don’t fully support the range of storage protocols, data services and use cases that companies’ key business apps tend to rely on, making it difficult, and less worthwhile, to move these workloads to the cloud. Some organizations consider reengineering their applications for cloud-native storage, but this is both costly and time consuming, and in fact may not lead to the results they are looking for. Based on recent Taneja Group research, IT buyers want a simple path for lifting and transferring their app data to the cloud, where it can be supported for both primary and secondary use cases. They are also looking to run many workloads flexibly in a hybrid cloud deployment while maintaining the level of data security and governance they enjoy on premises.

In addition to these technical requirements, companies must also weigh potential business costs, such as the risk of getting locked into a single provider. Our research reveals that customers are increasingly concerned about this risk, which is exacerbated by a lack of data mobility among various on-premises and public cloud infrastructures.

Fortunately, the founding team at Hedvig understands these customer needs and set out more than five years ago to address them. The result of their initiative is the Hedvig Distributed Storage Platform (DSP), a unified programmable data fabric that allows customers to simply and securely deploy any type of workload and application data in a hybrid or multi-cloud environment. Based on software-defined technology, Hedvig DSP enables your existing workloads, whether based on block, file or object storage, to take advantage of cloud scalability and agility today, without the expense and delays of a major reengineering effort. With Hedvig, IT teams can automatically and dynamically provision storage assets using just software on standard x86 servers, whether in your own private cloud or a public cloud IaaS environment. Hedvig enables your workloads to move freely between different public and private cloud environments, avoiding lock-in and allowing you to choose the cloud best suited for each application and use case. Hedvig can meet your primary storage needs, but also supports tier-2 storage, so you can back up your data on the same platform.
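
To picture what provisioning storage “using just software” looks like, here is a hypothetical Python sketch of a policy-driven provisioning call. The VirtualDisk fields and provision_disk() function are invented for illustration; they are not Hedvig’s actual API.

    from dataclasses import dataclass

    # Hypothetical illustration of policy-driven, software-defined
    # provisioning; this is not Hedvig's actual API.

    @dataclass
    class VirtualDisk:
        name: str
        size_gb: int
        protocol: str  # "block", "file" or "object"
        replicas: int  # copies placed across nodes or clouds
        dedup: bool

    def provision_disk(disk: VirtualDisk) -> str:
        # A real platform would place the replicas on x86 nodes in
        # private and/or public clouds according to this policy.
        return f"{disk.protocol}://{disk.name} ({disk.replicas} replicas)"

    primary = VirtualDisk("oracle-data", 2048, "block", replicas=3, dedup=True)
    backup = VirtualDisk("backup-tier", 8192, "object", replicas=2, dedup=True)

    for disk in (primary, backup):
        print(provision_disk(disk))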

In this piece, we’ll learn more about what IT professionals are looking for in cloud storage solutions, based on our research findings. We’ll then focus specifically on Hedvig storage for hybrid and multi-cloud environments to help you decide whether and how their solutions can meet your primary and secondary storage needs.

Publish date: 03/26/18
Report

Emerging Market Report on Multi-Cloud Primary Storage

Public cloud utilization continues to grow at a phenomenal rate. Infrastructure spending on the public and private cloud is growing at double-digit rates, while spending on traditional, non-cloud IT infrastructure continues to decline and within a few short years will represent less than 50% of the entire infrastructure market. AWS alone, as the current gorilla of the public cloud market, continues to grow at over 40% year over year and now has an annualized run rate of around $15B. Microsoft boasts similar revenue numbers when its Office 365 SaaS offerings are included. This trend is not surprising and has been widely predicted for several years. The surprising element now is how strong the momentum toward public cloud adoption has become, and the question is where the long-term equilibrium point will be between public clouds and on-premises infrastructure.

AWS was a pioneer in public cloud storage services when it introduced S3 (Simple Storage Service) over ten years ago. The approach of public cloud vendors has been to offer storage services at cut-rate pricing in what can be called the “Hotel California” strategy – once they have your data, it can “never leave.” Once a company’s data is in their cloud infrastructure, they then offer a wide variety of higher-priced services to complement access to that data. Global content distribution, data analytics, and a wide variety of individual compute capabilities are just a few examples of the services offered. Recently, we have been hearing increased grumbling from customers who are very concerned about losing the option to change vendors and the resulting reduction in competition.
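
A rough calculation shows why that concern has teeth. The figures below are purely illustrative assumptions, not any provider’s published pricing, but they give a feel for the switching cost that data egress charges create.

    # Illustrative switching-cost arithmetic; the egress rate is an
    # assumed figure, not any provider's published price.
    egress_per_gb = 0.05  # USD per GB, assumed for illustration
    data_tb = 500         # data held in the provider's cloud

    cost = data_tb * 1024 * egress_per_gb
    print(f"Moving {data_tb} TB out at ${egress_per_gb:.2f}/GB costs ${cost:,.0f}")
    # -> Moving 500 TB out at $0.05/GB costs $25,600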

In response, IT professionals are beginning to consider multi-cloud approaches to primary storage, to gain the scalability and agility benefits of the cloud without the penalty of lock-in. This is a fresh, emerging and innovative space, one that promises to open up cloud storage to a range of new customers and use cases.

To gather data and develop insights regarding plans for public and multi-public cloud use, Taneja Group conducted two primary research studies in the summer of 2017. In each case, we surveyed 350+ IT decision makers and practitioners around the globe, representing a wide range of industries and business sizes, to understand their current and planned use cases and deployments of applications to the public cloud.

Specifically, we wanted to understand the need for an emerging set of storage products we call multi-cloud primary storage. These products provide their data services across more than one cloud simultaneously.
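
To make “data services across more than one cloud simultaneously” concrete, the Python sketch below shows one simple way such a product could behave: each write to a logical volume is mirrored synchronously to backends in two clouds, so either cloud can serve the data. The CloudBackend interface is a hypothetical illustration, not any specific product’s design.

    # Hypothetical sketch of a multi-cloud primary storage write path:
    # one logical volume whose writes are mirrored to multiple clouds.

    class CloudBackend:
        def __init__(self, name):
            self.name = name
            self.store = {}

        def put(self, key, data):
            self.store[key] = data

    class MultiCloudVolume:
        def __init__(self, backends):
            self.backends = backends

        def write(self, key, data):
            # Mirroring synchronously keeps every cloud able to serve
            # the data, which is what removes single-vendor lock-in.
            for backend in self.backends:
                backend.put(key, data)

    volume = MultiCloudVolume([CloudBackend("cloud-a"), CloudBackend("cloud-b")])
    volume.write("block-42", b"payload")
    print([b.store["block-42"] for b in volume.backends])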

Publish date: 10/31/17