Intelligent Workload Management

With the emergence of new technologies and new business models, the world of IT has changed. As a result, and to keep up with the demands of today's enterprises, it has become clear that IT leaders must adopt new principles of computing.

  • First, the risks and challenges of computing across multiple environments must be controlled.
  • Second, users should have unimpeded access to the full computing services they need to do their jobs correctly.
  • Third, computing should be secure, compliant and portable.

As IT leaders, we don't exist just to build new data centers. We exist to provide the IT services that enable our businesses—whether it's building airplanes or cars, providing health insurance or banking services, or educating our future leaders. It's our job to help those businesses deliver results to their customers while minimizing cost, complexity and risk.

It's All about Balance

The bottom line is this: The challenges the market is facing today are becoming increasingly complex. And now, more than ever before, it's all about balancing the flexibility users need with the control IT and the organization demands.

New Technologies Mean New Flexibility

The IT market is changing quickly. And that's probably a good thing. Because in this connected, "always-on" world you must be able to handle spikes in demand and downtime in the most efficient way possible—and maintaining a large, resource-draining data center or server farm just isn't the way to go. Luckily, in today's ever-evolving environment, you can create a new infrastructure that uses your current physical environment, as well as virtualization and the cloud.

These new technologies bring users the flexibility they need - and in some cases demand. The cloud, for example, promises limitless scalability on demand. But that flexibility comes at a price. Once you embrace the cloud, workloads that once ran on machines you could see in your data center now run somewhere "out there." And not just that. Sometimes new technologies run amok. A user wants his new smartphone to work with his productivity applications, so a new virtual machine is added. The CxO buys a new iPad and wants you to make sure that it's always running properly. Or members of the sales team need to share documents, so they independently create a file-sharing site in the cloud. This happens week after week—and IT administrators eventually lose track and control of what's happening in their environments.

In the second quarter of 2010, worldwide mobile device sales grew by almost 14 percent: a total of 325 million devices sold in that quarter alone. There are currently 250 active vendors in cloud computing, Novell being one of them.

Regulations Require More Control

This explosion of new technologies and devices poses a problem because, frankly, today's IT managers need to exercise ever stricter control over IT systems and resources. They need both operational control and risk control. To stay ahead of increasingly sophisticated security threats, they need to know who is looking at what data, where and when they are looking at it, and whether they should have access to it in the first place. And to remain in compliance with continually proliferating regulatory requirements, they need to create detailed audit trails, and these trails need to cut across every environment.

The 2010 Verizon Data Breach Investigations Report found that:

  • 48% of breaches were caused by insiders (an increase of 26%)
  • 98% were the work of criminals outside of the victim organization
  • 48% involved privilege misuse (an increase of 26%)
  • 96% of breaches were avoidable through simple or intermediate controls

But it doesn't stop there. With the constant change and introduction of new devices by end users, operational controls are also required. There is increasing pressure to keep the systems up and running - even devices that aren't owned by the organization. This includes, but is not limited to, patching, service desk, performance, and up-time. Needless to say, today's environment is nearly impossible to control.

IT spends most of its time on break/fix tasks instead of more strategic projects.

Physical, Virtual and Cloud Utilization

While everyone is talking about cloud computing and virtualization, the reality is that both technologies are still in their infancy. Especially cloud computing.

According to leading analysts, less than 2 percent of enterprise workloads will run in the public cloud in 2010.

Gartner and IDC predict that less than 16 percent of all computing will be virtual in 2010.

But by 2015, these numbers will have risen to 20 percent and 45 percent, respectively. So no matter what platform you're currently using, you will need a way to manage all three sets of computing resources—physical, virtual and the cloud—in a secure and compliant manner for at least the next decade or so.

The key takeaway here is that you will have a mixture of physical, virtual and cloud for a really long time. In fact, we like to say that you don't "move" to the cloud, you "add" it.

Intelligent Workload Management

So how do you balance flexibility and control and how do you manage all of your resources and capacity across a heterogeneous environment? The answer is intelligent workload management.

The Evolution of the Workload

Before we can understand what intelligent workload management is, we need to understand the core concept of a workload. While workloads have been around as long as IT, the definition of a workload has evolved. The modern workload is an integrated stack of application, middleware and operating system that accomplishes a computing task. This modern workload is the building block of the next-generation IT infrastructure. Today that workload is integrated, but in the future it will need to be not only portable across physical, virtual and cloud environments, but also platform-agnostic.
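The integrated stack described above can be sketched as a simple data structure. This is only an illustration of the concept: the class and field names are made up for the example and are not part of any product or API.

```python
from dataclasses import dataclass

@dataclass
class Workload:
    """A modern workload: an integrated stack of application,
    middleware and operating system that accomplishes a computing task."""
    application: str
    middleware: str
    operating_system: str
    environment: str  # "physical", "virtual" or "cloud"

    def migrate(self, target: str) -> None:
        # A portable workload moves between environments without any
        # change to the stack itself; only its placement differs.
        self.environment = target

# The same stack can run in the data center today and the cloud tomorrow.
inventory_db = Workload("Inventory DB", "JDBC", "Linux", "physical")
inventory_db.migrate("cloud")
print(inventory_db.environment)  # prints "cloud"
```

The point of the sketch is that portability lives in the `migrate` step: the application, middleware and operating system travel together as one unit.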

Another way to look at a workload is a software appliance. IDC projects that there will be more than 1 million software appliances built by 2012. Most of these workloads will be built for IT infrastructure.

Workloads Deliver a Business Service

But no end user cares about a workload. They care only about getting their job done - the use of a business service provided by IT. In the same way, you don't care how electricity is delivered to your house or what type of wires carry it. You just want to flip the light switch and have the lights come on.

While a single workload could be a service, in some cases multiple workloads come together to deliver a business service. For example, a user needs to see an inventory report from SAP. In this case, three different workloads come together to deliver the SAP business service, and it's more than likely that each is running in a different environment. The database workload is running on physical systems in the data center, the application server workload is running in a private cloud, and the presentation server workload is running in a public cloud.
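The SAP example above can be pictured as a simple mapping from workloads to the environments where they run. The names below are hypothetical, chosen to mirror the example:

```python
# One business service composed of three workloads, each running
# in a different environment (illustrative names only).
sap_service = {
    "database": "physical data center",
    "application server": "private cloud",
    "presentation server": "public cloud",
}

# IT must manage and secure every environment the service touches.
environments = sorted(set(sap_service.values()))
print(len(environments))  # prints 3
```

Even this trivial mapping shows the management problem: one business service, invisible to the user as a single thing, forces IT to operate across three distinct environments at once.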

This gives the IT department the flexibility to leverage their entire capacity - physical, virtual and cloud - while maintaining the control they need in delivering the service users require. Ultimately, as I said earlier, the user doesn't care where the workloads are running ... he simply wants to get his job done! But managing and securing all these workloads in three different environments is a headache for the IT department that needs to be solved - fast.

The Customer Challenge: Manage a Siloed Infrastructure

The reality is that most organizations manage a siloed infrastructure. Most IT shops have large investments in internal data centers, and even if they add virtualization and cloud computing to the mix, they are not planning to shut down those physical data centers any time soon. Companies will continue to run many of their core business services in their internal data centers, often on a Linux platform. The problem is that each silo needs IT service management tools to manage those workloads, and business service management tools to provide the executive dashboards that map IT services to business objectives. And it goes without saying that all of this must be done in a way that is both secure and compliant.

What will the data center add next? The internal (or private) cloud. The internal cloud essentially delivers on the promise of virtualization: better resource utilization and service delivery through the abstraction of workloads from physical IT infrastructure.

The problem is, the CIO now has two silos of internal resources to manage—his new flexible internal cloud and his legacy systems. Both are still behind the firewall and under his complete control, but they have two very different IT architectures. What's needed now are new management tools that can move workloads between the two pools, and a way to measure performance—in fact, guarantee performance—while meeting service commitments to end users. And of course, all this needs to be done within the tight security and compliance frameworks required by today's business.

But we're not done... and here is where it gets interesting. With the maturation of the public cloud the CIO now has a third capacity option—the external (or public) cloud—on which to run his core IT services. He needs to figure out which workloads to run on physical, which on his internal cloud, and which on that external cloud. And equally important, he needs to figure out which data sets can go outside his firewall.

So now our CIO needs tools to securely move and manage workloads inside and outside his firewall, across three different technology architectures, and of course make it seamless to the end user, who frankly doesn't care what IT resources are delivering his business services, as long as the service just works.

It's no wonder IT infrastructure and services are 60% of the overall IT spend and user satisfaction is declining. There are now three silos with completely different tools, people and processes to manage them!