IBM Storage — Consistency, Resiliency and Performance Across the Enterprise, and Why That’s More Important Than Ever

Over the course of my lifetime, the concept of “enterprise computing” has evolved from monolithic mainframe systems, to departmental minicomputers, to client/server architectures, to integrated distributed systems – followed by virtualization, private clouds, public clouds, and hybrid clouds built on containers and microservices.  All along this journey, enterprises have had to integrate and blend these technologies in order to maintain existing computing environments while pushing the envelope with new ones.  Throughout this evolution, information technology (IT) executives have also had to maintain and manage the entire range of infrastructure, integrating disparate storage, server and networking components from a host of different vendors while maintaining performance and security service levels (especially in this era of threatening and disruptive cyberattacks).

To handle this monumental and constantly evolving integration/management/performance/security task, IT executives have needed the assistance of various hardware and software vendor partners whose products help to streamline systems integration while ensuring service level commitments.  IBM’s storage portfolio is an excellent example of this cooperative integration effort.

IBM’s storage portfolio has evolved to address these integration and management challenges by providing integrated management and data services across the multi-vendor, multi-cloud “universe” with solutions that enhance resiliency, performance and scalability.  The latest set of October 2021 announcements includes Microsoft Azure support for IBM Spectrum Virtualize and IBM Safeguarded Copy; enhanced cloud and container support for the IBM Spectrum Protect Suite; high-performance object ingest for IBM Spectrum Scale; a higher-capacity FlashCore Module for the IBM Elastic Storage System; and Turbonomic integration providing FlashSystem monitoring and recommendations based on application needs.

This report examines IBM’s storage portfolio in greater detail.

Background
The COVID pandemic has increased business and individual reliance on digital data.  It has necessitated the adoption of new work and social patterns that require high-performance remote data access; the ability to easily and efficiently store and share data across private, hybrid and public cloud environments at scale; stronger authorization/authentication security measures; and greater protection of information systems from cyberattacks.

With IT budgets tight and major public cloud vendors beefing up security in response to increased remote operation requirements, public clouds such as AWS and Microsoft Azure have become increasingly popular. In fact, according to Synergy Research Group, as of Q1 2021 these two vendors account for over 50% of public cloud market share, with AWS at 32% and Microsoft Azure at 20%. Furthermore, according to Gartner, Inc., worldwide end-user spending on public cloud services is forecast to grow 23.1% in 2021 to total $332.3 billion, up from $270 billion in 2020. Gartner attributes this growth to the pandemic as well as to emerging technologies such as containerization, virtualization and edge computing.

Clabby Analytics reported on this trend in a November 2020 report entitled With the Market Shift Toward Containers, Will IBM’s Storage Portfolio Remain “Simple?”, in which we stated: “IBM faces a new challenge in storage. One of the company’s cornerstone strategic initiatives is to make it possible for its customers to build device/vendor-transparent hybrid cloud environments. To do this, the company needs to move its customers from traditional virtualization schemes and public/private clouds to more open, vendor-agnostic, containerized hybrid cloud environments that utilize container-native storage to break down siloes of data and enable transparency across multiple cloud architectures – both public and private.”  Containers have made moving workloads to the public cloud much easier and more cost-effective because of their portability, efficiency, and ability to share resources and scale horizontally.

IBM October 2021 Storage Announcements – A Closer Look
IBM Spectrum Virtualize for Public Cloud on Microsoft Azure
IBM Spectrum Virtualize supports over 500 IBM and non-IBM storage systems, providing a consistent management interface and a common set of storage software capabilities across a broad and varied range of storage. Organizations can use these same supported storage platforms to take advantage of public cloud services, enabling new functionality including workload migration, data resiliency and cloud-based disaster recovery (DR). IBM Spectrum Virtualize data reduction can also provide savings of up to 65% in cloud storage use.  By extending virtualization to the leading public clouds, AWS and Microsoft Azure, customers gain consistent data operations, data services, data protection and security across the entire IT estate – legacy, private cloud, hybrid cloud, IBM Cloud, AWS and Microsoft Azure.  In addition, IBM Storage Insights offers cross-platform monitoring.
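To put that data-reduction claim in concrete terms, the small sketch below works through the arithmetic. Only the 65% maximum comes from IBM’s statement; the 500TB data set and the 30% “modest” figure are assumptions chosen purely for illustration, and actual results depend entirely on how compressible and deduplicable a given workload is.

```python
# Illustrative arithmetic only: how data reduction changes the cloud capacity
# that must actually be provisioned. The 65% ceiling is IBM's stated maximum;
# the data-set size and the 30% "modest" case are assumptions for illustration.

def provisioned_capacity_tb(logical_tb: float, reduction_ratio: float) -> float:
    """Capacity that must be provisioned after data reduction is applied."""
    return logical_tb * (1.0 - reduction_ratio)

logical_data_tb = 500.0                                        # hypothetical data set
best_case = provisioned_capacity_tb(logical_data_tb, 0.65)     # up to 65% savings
modest_case = provisioned_capacity_tb(logical_data_tb, 0.30)   # conservative assumption

print(f"Best case (65% reduction): {best_case:.0f} TB provisioned")
print(f"Modest case (30% reduction): {modest_case:.0f} TB provisioned")
```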

IBM Safeguarded Copy on Microsoft Azure
Businesses’ data protection needs today encompass cyber resiliency and data resiliency rather than just backup and disaster recovery.  Data protection now means minimizing the frequency and impact of cyber and ransomware attacks.  In addition, organizations are using DevOps and AI in ways that require the ability to reuse applications and data. IBM Safeguarded Copy provides the ability to protect data before, during and after a data breach. Already supported on IBM DS8900, IBM FlashSystem, SAN Volume Controller (SVC) and Spectrum Virtualize, this announcement adds support for Microsoft Azure. IBM Safeguarded Copy automatically creates immutable copies of data that can be used for rapid recovery following a cyberattack.  The support on Azure can be used to protect Azure data or, together with remote mirroring, to protect on-premises data as well. During an attack, IBM QRadar and IBM Guardium provide insight into possible attacks. And post-attack, IBM Safeguarded Copy ensures the business is up and running again with the least disruption.

IBM Safeguarded Copy covers the entire spectrum of data protection needs with the automatic creation of regular backup copies, immutable copies of production data, and logical “air-gap” isolation that prevents modification during a cyberattack or user error. In addition, data can be quickly restored to the source volume or to a new volume, and copies can be kept for forensic analysis. This functionality spans on-premises environments and the cloud, allowing replication of data to Microsoft Azure, for example. In this way, backups are protected and potential threats can be isolated.
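To illustrate the concept of safeguarded copies (and only the concept – this is not IBM’s implementation or API, and all names below are invented), the short Python sketch models point-in-time copies that cannot be removed before their retention period expires, which is the essence of the logical “air gap” described above.

```python
# Conceptual sketch of safeguarded, retention-protected copies.
# Not IBM's API; classes and fields are invented for illustration.
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import List

@dataclass(frozen=True)            # frozen: the copy record itself is immutable
class SafeguardedCopy:
    source_volume: str
    created_at: datetime
    retention: timedelta

    def expired(self, now: datetime) -> bool:
        return now >= self.created_at + self.retention

class CopySchedule:
    """Creates immutable copies on a schedule and prunes only expired ones."""
    def __init__(self, retention_days: int):
        self.retention = timedelta(days=retention_days)
        self.copies: List[SafeguardedCopy] = []

    def take_copy(self, volume: str, now: datetime) -> SafeguardedCopy:
        copy = SafeguardedCopy(volume, now, self.retention)
        self.copies.append(copy)
        return copy

    def prune(self, now: datetime) -> None:
        # Copies still inside their retention window can never be removed here --
        # the logical "air gap" that protects them during an attack.
        self.copies = [c for c in self.copies if not c.expired(now)]
```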

IBM Spectrum Protect Suite Enhancements
IBM Spectrum Protect adds support for multi-site replication with concurrent replication to two sites. Using policies to set backup retention, data can be backed up to cloud object storage (including IBM Cloud, AWS S3, Microsoft Azure, Google Cloud and validated S3 providers) and to archival cloud storage (including IBM COS Public Archive, AWS Glacier and AWS Glacier Deep Archive) for long-term data retention at a much lower cost. In addition, critical data stored in OpenShift and Kubernetes environments is protected with IBM Spectrum Protect Plus.
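As a generic illustration of what tiering backup data to archival cloud object storage looks like, the sketch below uses the standard AWS boto3 SDK to write an object with an archival storage class. The bucket name, key and backup file are hypothetical, and this is not how Spectrum Protect implements its cloud storage pools internally; it simply shows the lower-cost archival tier the announcement targets.

```python
# Generic S3/boto3 illustration of writing a backup object to an archival tier.
# Bucket, key and file name are placeholders; not Spectrum Protect's mechanism.
import boto3

s3 = boto3.client("s3")  # credentials come from the usual AWS config/env chain

with open("db_backup_2021-10-01.tar.gz", "rb") as backup:
    s3.put_object(
        Bucket="example-backup-bucket",
        Key="long-term/db_backup_2021-10-01.tar.gz",
        Body=backup,
        StorageClass="DEEP_ARCHIVE",   # long-term retention at much lower cost
    )
```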

IBM Spectrum Scale Enhancements
IBM Spectrum Scale has been enhanced to support a high-performance object (S3) interface, helping to overcome the challenges associated with data that is spread across private and public clouds, on-premises systems and the edge, with many data sources and types. IBM Spectrum Scale connects to data and applications wherever they reside, and in a consistent manner, so performance, efficiency and security are optimized. This “single source of truth” eliminates data siloes and ownership issues by creating a global cache with no need to copy data or move it around. Multiple interfaces can access the same data, including big data, high-performance file, object, network-attached and container-based access, providing faster AI results and faster ingest from the edge.
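To give a sense of what consuming an S3-compatible interface such as this might look like from an application’s perspective, the sketch below uses the standard boto3 client against a placeholder endpoint. The endpoint URL, bucket name and credentials are invented for illustration and are not real IBM values.

```python
# Minimal sketch of reading data through an S3-compatible object interface.
# Endpoint URL, bucket and credentials below are placeholders only.
import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="https://scale-s3.example.internal",  # hypothetical object endpoint
    aws_access_key_id="EXAMPLE_KEY",
    aws_secret_access_key="EXAMPLE_SECRET",
)

# Data landed through file interfaces would appear here as objects, so an
# analytics or edge-ingest pipeline can read it directly without copying.
for obj in s3.list_objects_v2(Bucket="sensor-ingest").get("Contents", []):
    print(obj["Key"], obj["Size"])
```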

FlashCore Enhancements
With the IBM Elastic Storage System 3200, new 38TB IBM FlashCore Modules can store twice the data in the same physical space while using less power and delivering 11% CapEx savings per TB. Looking further ahead, this could allow up to 8YB (yottabytes) of capacity per cluster.

IBM FlashSystem and Turbonomic Integration
Turbonomic integration for FlashSystem provides monitoring and application resource management for compute, network and storage, with monitoring and recommendations for arrays, controllers and volumes. Metrics gathered include storage capacity, actively provisioned storage, IOPS and latency. Based on these metrics, recommendations can be made to resize volumes or provision a new controller. Future enhancements will provide additional recommendations and the ability to examine applications by type and then automate the appropriate changes.
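As a conceptual illustration only (this is not Turbonomic’s analytics, and the thresholds and data structures are invented), the sketch below shows how simple rules over the metrics listed above – capacity used, IOPS and latency – could generate the kinds of resize and provisioning recommendations described.

```python
# Conceptual sketch of rule-based recommendations over volume metrics.
# Not Turbonomic's algorithm; thresholds and structures are invented.
from dataclasses import dataclass

@dataclass
class VolumeMetrics:
    name: str
    capacity_used_pct: float   # percent of provisioned capacity in use
    iops: float
    latency_ms: float

def recommend(v: VolumeMetrics) -> str:
    if v.capacity_used_pct > 85:
        return f"Resize volume {v.name}: capacity {v.capacity_used_pct:.0f}% used"
    if v.latency_ms > 5 and v.iops > 10_000:
        return f"Consider moving {v.name} to a faster pool or a new controller"
    return f"No action for {v.name}"

print(recommend(VolumeMetrics("vol-app01", 91.0, 4_200, 1.8)))
print(recommend(VolumeMetrics("vol-db02", 60.0, 15_000, 7.5)))
```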

Conclusion
IBM has integrated emerging technologies such as containers, virtualization and hybrid cloud by introducing solutions such as Red Hat OpenShift Container Storage and by embracing the market’s move to public cloud with its support for leading cloud providers AWS and, now, Microsoft Azure. This latest set of storage announcements reinforces that strategy by improving common management, consistent data access and data sharing, and data resiliency across multi-vendor public, private and on-premises environments – eliminating data siloes and gaps in data security, and enabling IT specialists to focus on higher-level issues.