Items Tagged: Dedupe
The Greening of the Data Center Technology in Depth
As industry analysts and consultants, the Taneja Group has a single driving mission: to deliver hype-free, accurate, and responsible information about important issues impacting the server and storage industry. This is why we have turned our attention to the green data center, which we believe badly needs insight and clarity in the midst of marketing smokescreens and competing claims. This Technology in Depth represents Taneja Group's take on this important issue. We present the background of the energy crisis, explore important data center trends, and tell you how we believe the industry should respond to a real problem and a real opportunity.
It’s no question that data deduplication is one of the hottest technologies in data storage today. But even as organizations are benefiting from deduplication technology, some are beginning to worry about the proliferation of secondary disks. Could global data deduplication deliver more efficiency across large amounts of data?
StorSimple – Storing data the right way
In this Taneja Group Technology Validation exercise, we put an innovative new array to the test to see just what was possible when a vendor stretches their wings and takes on much more than another version of storage done the same old way.
The vendor in this case is StorSimple, and what they propose is that a single storage system can do everything, do it more cost effectively, and do it with almost zero footprint in the data center. While that may sound like an unbelievable set of claims on the surface, what StorSimple has done is integrate cloud storage over the Internet behind a primary storage array for a truly unique set of capabilities. By coupling the two together, StorSimple unlocks near-infinite storage capacity, a small form factor, utter simplicity, and the tools to automatically protect data in place, forever. We anticipate this will hit home with a lot of customers: no more migration, no more backup, and built-in DR finally make storage seem like a technology that matches the needs of today’s business.
Join us in this Technology Validation report as we see if StorSimple’s promises hold true, and just what those promises look poised to do for today’s data center.
Rapidly growing data volumes and the difficulty of archiving and accessing them are the challenges at the heart of data protection and big data management. San Jose-based data protection specialist Quantum believes three new products it released this week are the answer.
HP used its Discover event in Vienna to both broaden and deepen its core storage portfolio, strengthening its file and deduplication offerings to compete better with EMC and NetApp.
The HP Storage Portfolio – Building the Foundation for the Virtualized Infrastructure
Over the past couple of years, HP has executed an impressive number of storage acquisitions, and is systematically innovating around each of three key technologies – HP 3PAR, P4000, and its deduplicating StoreOnce. Perhaps nowhere are the synergies more apparent than in the virtual infrastructure. In this Opinion, we’ll turn a critical eye toward these synergies, and render the Taneja Group perspective on whether HP is on the right path.
It may seem like deduplication is old hat for enterprises pushing toward virtual and cloud environments. But while the technology may be firmly established, deployment and configuration issues remain very much unsettled.
With hard disk drive (HDD) prices rising and some models tough to find, there are steps to take to reduce your dependence on hard drives while gaining other benefits along the way.
FalconStor's VTL could already flexibly dedupe at the concurrent and post-process positions. Now VTL 7.5 has added inline capabilities to serve all three deduplication choices.
SEPATON Cuts Cost and Complexity of Backing up Oracle & SQL Data by Becoming the First Vendor to Deliver Deduplication for Multi-streamed, Multiplexed Databases
EMC Data Domain Extended Retention Software: Meeting Needs for Long-Term Retention of Backup Data on EMC Data Domain Systems
EMC extends EMC Data Domain systems with DD Extended Retention software to fulfill short-term and long-term backup storage needs by combining multiple tiers in a single appliance. This software addresses many of the challenges organizations are struggling with as they balance backup retention requirements with long-term recoverability.
Astute Networks Inc. today said it will deliver its next-generation flash storage systems designed to speed performance of storage for virtual machines and virtual desktops.
Mike Matchett takes a closer look at the future of data storage technology in 2016 based on research from the Taneja Group.
- Premiered: 01/06/16
- Author: Mike Matchett
- Published: TechTarget: Search Storage
Virtual Instruments WorkloadCentral: Free Cloud-Based Resource for Understanding Workload Behavior
Virtual Instruments, the company created by the combination of the original Virtual Instruments and Load DynamiX, recently made available a free cloud-based service and community called WorkloadCentral. The service is designed to help storage professionals understand workload behavior and improve their knowledge of storage performance. Most will find valuable insights into storage performance through simple use of this free service. Those who want a deeper understanding of workload behavior over time, need to evaluate different storage products to determine which one is right for their specific application environment, or want to optimize their storage configurations for maximum efficiency can buy additional Load DynamiX Enterprise products from the company.
The intent with WorkloadCentral is to create a web-based community that can share information about a variety of application workloads, perform workload analysis and create workload simulations. In an industry where workload sharing has been almost absent, this service will be well received by storage developers and IT users alike.
Read on to understand where WorkloadCentral fits into the overall application and storage performance spectrum...
Adding small amounts of flash as cache or dedicated storage is certainly a good way to accelerate a key application or two, but enterprises are increasingly adopting shared all-flash arrays to increase performance for every primary workload in the data center.
- Premiered: 06/23/16
- Author: Mike Matchett
- Published: Enterprise Storage Forum
HPE StoreOnce Boldly Goes Where No Deduplication Has Gone Before
Deduplication is a foundational technology for efficient backup and recovery. Vendors may argue over product features – where to dedupe, how much capacity is saved, how fast backups run – but everyone knows how central dedupe is to backup success.
However, serious pressures are forcing changes to the backup infrastructure and dedupe technologies. Explosive data growth is changing the whole market landscape as IT struggles with bloated backup windows, higher storage expenses, and increased management overhead. These pain points are driving real progress: replacing backup silos with expanded data protection platforms. These comprehensive systems back up data from multiple sources to distributed storage targets, with single-console management for increased control.
Dedupe is a critical factor in this scenario, but not in its conventional form as a point solution. Traditional dedupe is suited to backup silos. Moving deduped data outside the system requires rehydrating, which impacts performance and capacity between the data center, ROBO, DR sites and the cloud. Dedupe must expand its feature set in order to serve next generation backup platforms.
A few vendors have introduced new dedupe technologies but most of them are still tied to specific physical backup storage systems and appliances. Of course there is nothing wrong with leveraging hardware and software to increase sales, but storage system-specific dedupe means that data must rehydrate whenever it moves beyond the system. This leaves the business with all the performance and capacity disadvantages the infrastructure had before.
Federating dedupe across systems goes a long way toward solving that problem. HPE StoreOnce extends consistent dedupe across the infrastructure. Only HPE gives customers the deployment flexibility to implement the same deduplication technology in four places: target appliance, backup/media server, application source and virtual machine. This enables data to move freely between physical and virtual platforms and source and target machines without the need to rehydrate.
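The mechanics behind this argument can be illustrated with a toy sketch (not HPE StoreOnce code; all names and the fixed-size chunking are illustrative assumptions). Each store keeps one copy of every unique block, indexed by its content hash, and a file is just a recipe of hashes. Reading a file means "rehydrating" it from the block index; replicating between two stores that share the same dedupe scheme means sending only the blocks the target lacks, with no rehydration in between:

```python
import hashlib

class DedupeStore:
    """Toy fixed-size-block dedupe store: one copy per unique block."""
    BLOCK = 4096

    def __init__(self):
        self.blocks = {}  # fingerprint -> block bytes
        self.files = {}   # file name -> ordered list of fingerprints

    def write(self, name, data):
        recipe = []
        for i in range(0, len(data), self.BLOCK):
            chunk = data[i:i + self.BLOCK]
            fp = hashlib.sha256(chunk).hexdigest()
            self.blocks.setdefault(fp, chunk)  # store only unseen blocks
            recipe.append(fp)
        self.files[name] = recipe

    def read(self, name):
        # "Rehydration": reassemble the full stream from unique blocks.
        return b"".join(self.blocks[fp] for fp in self.files[name])

    def replicate_to(self, other, name):
        # Federated-style copy: both sides speak the same dedupe scheme,
        # so only blocks the target does not already hold cross the wire.
        sent = 0
        for fp in self.files[name]:
            if fp not in other.blocks:
                other.blocks[fp] = self.blocks[fp]
                sent += 1
        other.files[name] = list(self.files[name])
        return sent

site_a, site_b = DedupeStore(), DedupeStore()
data = b"x" * 4096 * 3 + b"y" * 4096      # 4 blocks, 2 unique
site_a.write("backup1", data)
site_a.write("backup2", data)              # stores zero new blocks
sent = site_a.replicate_to(site_b, "backup1")  # only 2 unique blocks move
```

If the two systems used different, incompatible dedupe formats, `site_a` would instead have to `read()` (rehydrate) the full 16 KB stream and ship all of it – which is exactly the performance and capacity penalty the paragraph above describes.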
This paper will describe the challenges of data protection in the face of huge data growth, why dedupe is critical to meeting the challenges and how HPE is achieving the vision of federated dedupe with StoreOnce.
The long-awaited HPE-SimpliVity deal cost HPE $650 million for the hyper-converged pioneer. The buy gives HPE an installed base, as well as data reduction and protection features.
- Premiered: 01/18/17
- Author: Taneja Group
- Published: TechTarget: Search Converged Infrastructure