Software Defined – Part 2

March 28th, 2017

Steve Adil
Disk Products Manager

IBM’s Watson™ is becoming ubiquitous in IBM’s marketing message. From its purchase of the Weather Channel (one way to establish your street cred in cloud!) to AI-enabled medical diagnoses to helping cities intelligently manage traffic patterns, Watson demonstrates IBM’s leadership in applying information technology to solving problems. This blog is the second part of a two-part series exploring the Software Defined infrastructure and the IBM Storage Division’s adoption of this architecture. Part 1 focused on how a Software Defined infrastructure offers greater management simplification, lower cost, and superior responsiveness. Part 2 focuses on how IBM uses cognitive analysis to determine how well the storage infrastructure is performing, and how IBM enables effective policies for storage management automation.

In Part 1, we highlighted the advantages of a Software Defined infrastructure.

To get there with the most efficient and effective design, IBM® Spectrum Control™ Storage Insights combines IBM’s analytics leadership and rich history of storage management expertise with a cloud-based delivery model, enabling Mainline’s customers to:

• Accurately identify and categorize storage assets
• Monitor capacity and performance from the storage consumers’ perspective—including application, department and server views
• Increase capacity forecasting precision using historical growth metrics (see the sketch after this list)
• Reclaim unused storage to delay future purchases and improve utilization
• Optimize data placement based on historical usage patterns
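To make the forecasting bullet concrete, here is a minimal sketch of trend-based capacity planning, assuming monthly used-capacity samples in terabytes. The sample figures, the 85% alert level, and the least-squares fit are illustrative assumptions, not the actual algorithm inside Storage Insights.

    # Minimal sketch: estimate when a pool hits a capacity threshold
    # by fitting a linear trend to historical monthly usage.
    # All figures below are hypothetical sample data, not real metrics.

    def linear_fit(xs, ys):
        # Ordinary least-squares fit; returns (slope, intercept).
        n = len(xs)
        mean_x = sum(xs) / n
        mean_y = sum(ys) / n
        slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
                 / sum((x - mean_x) ** 2 for x in xs))
        return slope, mean_y - slope * mean_x

    # Hypothetical history: used TB at the end of each month.
    months = [0, 1, 2, 3, 4, 5]
    used_tb = [40.0, 42.5, 44.8, 47.9, 50.1, 53.0]

    growth_per_month, _ = linear_fit(months, used_tb)

    pool_capacity_tb = 80.0                   # assumed pool size
    warning_level = 0.85 * pool_capacity_tb   # alert at 85% full

    months_left = (warning_level - used_tb[-1]) / growth_per_month
    print(f"Growth: {growth_per_month:.1f} TB/month; "
          f"about {months_left:.1f} months until 85% full.")

Run against this sample history, the pool grows roughly 2.6 TB per month and crosses the 85% line in about six months — the kind of signal that lets purchases be planned rather than guessed at.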

The fully realized Software Defined infrastructure uses the information gathered through tools like Storage Insights to determine the most effective methodology for storage management. Without this information, you are “flying blind.” Guess wrong about how your storage assets are utilized, and you will face either poorly utilized assets and wasted money, or overutilized assets and processing instability. Get it right, and you will dramatically reduce hardware costs while improving reliability.
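As a simple illustration of that trade-off, the sketch below flags pools that fall outside a target utilization band. The band and the pool figures are assumptions chosen for illustration, not recommended thresholds.

    # Flag storage pools outside a target utilization band.
    # Under-utilization wastes money; over-utilization risks
    # instability. Band and pool figures are hypothetical.

    LOW, HIGH = 0.40, 0.80    # assumed acceptable utilization band

    pools = {                 # pool name: (used TB, capacity TB)
        "prod-flash": (34.0, 40.0),
        "dev-hybrid": (12.0, 60.0),
        "backup-nl":  (45.0, 70.0),
    }

    for name, (used, cap) in pools.items():
        util = used / cap
        if util > HIGH:
            verdict = "over-utilized: add capacity or migrate data"
        elif util < LOW:
            verdict = "under-utilized: reclaim or consolidate"
        else:
            verdict = "within target band"
        print(f"{name}: {util:.0%} used -> {verdict}")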

Storage management policies determine, first, where data lands when it is created and, second, where it resides over its lifecycle. The policies governing data creation take into account what is dictated by the applicable Service Level Agreement (SLA); the SLA may, for example, require that the data achieve certain levels of performance and reliability. The lifecycle aspect determines what type of storage media hosts the data as usage patterns change over time, and also when the data is ultimately deleted. Higher levels of function react to usage patterns as well, dictating where the data resides geographically. For example, tiering software facilitates the movement of data: the most frequently used (hot) data can be kept on a fast medium (for example, the A9000 all-flash array), while historical images can be kept on an inexpensive medium (for example, tape through IBM Spectrum Archive™).
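The lifecycle rules described above can be pictured as a simple placement function that maps access recency to a tier. The tier names and age thresholds below are hypothetical stand-ins echoing the hot-flash/cold-tape example, not IBM’s actual policy language.

    # Minimal sketch of a lifecycle placement rule: route data to
    # a tier by days since last access. Tier names and thresholds
    # are hypothetical.

    from datetime import date

    def choose_tier(last_access: date, today: date) -> str:
        age_days = (today - last_access).days
        if age_days <= 30:
            return "flash"         # hot data, e.g. all-flash array
        if age_days <= 365:
            return "disk"          # warm data on lower-cost disk
        return "tape-archive"      # cold historical images on tape

    # A dataset last touched over a year ago lands on tape.
    print(choose_tier(date(2016, 2, 22), date(2017, 3, 28)))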

Mainline offers customers a comprehensive review and inventory of their current environment as a first step towards a comprehensive Software Defined infrastructure.

Please contact your Mainline Account Executive directly, or click here to contact us with any questions.
