With the launch of Nlyte 7, Nlyte’s CEO, Doug Sabella, was interviewed by The Huffington Post to discuss key considerations for managing assets in the data center and to provide insight into how the new Nlyte 7 platform can maximize the financial benefits of optimized utilization of power, space, and assets.
The discussion kicked off with reference to an anti-data-center series The New York Times published last fall. The series, “Power, Pollution and the Internet,” essentially called out data centers as bad for the environment and the cause of “brown clouds.”
At Nlyte, we believe the problem is not that data centers are inherently bad for the environment, but that most rely on old technology and dated or analog inventory-tracking methods, and that many companies over-rely on Power Usage Effectiveness (PUE) as their main energy metric. Relying on PUE alone can inadvertently cause companies to use more energy and increase data center costs, because this measurement does not factor in all of the layers that now exist in the data center.
To really maximize efficiencies in the data center, companies must be able to assess inefficiencies across all physical, virtual and IT logical layers.
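To make the PUE caveat concrete, here is a minimal sketch, using entirely hypothetical numbers, of how a genuinely good decision (decommissioning idle servers) can lower total energy use while making PUE look worse, because PUE only compares facility overhead to IT load:

```python
def pue(total_facility_kwh, it_equipment_kwh):
    """Power Usage Effectiveness: total facility energy / IT equipment energy."""
    return total_facility_kwh / it_equipment_kwh

# Hypothetical facility, in arbitrary annualized energy units:
before = pue(1600, 1000)  # 1.60 -- 1,000 units of IT load plus 600 of overhead
# After decommissioning idle servers, IT load falls to 800, but cooling and
# power-distribution overhead only falls to 520 (overhead doesn't scale linearly):
after = pue(1320, 800)    # 1.65 -- total energy dropped, yet PUE got worse
```

A shop judged on PUE alone would be penalized for this change, even though it cut total consumption by 280 units (about 17%), which is exactly why efficiency must be assessed across all layers rather than with one facility-level ratio.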
The Nlyte software platform addresses each of the aforementioned layers, enabling customers to plan, track, and maintain all of the mission-critical aspects of a data center. This includes management of all resources (power, cooling, network, and more) combined with the three ‘S’s of space, servers, and storage.
Nlyte recently rolled out Nlyte 7, the industry’s first data center infrastructure management (DCIM) solution to seamlessly integrate with all of an organization’s business IT management fabric. To better align with our customers’ business interests, we now offer the following features:
- Central Data Repository – Provides contextual relationships between all enterprise data center attributes, giving IT comprehensive views and information on their data center.
- Business Intelligence – Nlyte’s industry-unique BI engine provides dashboard and reporting information for trending and what-if scenario planning, with a rich set of included dashboards and reports plus the flexibility for the user to define and create their own.
- Physical and Logical Row Viewing – Nlyte users can now define and build logical row views of physical and logical equipment for detailed analysis of multiple cabinets side by side. User selections can be based on business requirements, whether by geography, line of business, or any other parameter that might be needed. This feature also provides multiple cabinet views of grouped pods or user-defined areas.
- Cabinet Device Overlay Reports – Extends resource visualization from floor plans to rack elevations, identifying potential hot spots, resource consumption, equipment types, organizational ownership, etc.
- Data Model Integrity – Data models stay accurate through automated reconciliation, using Nlyte Reconciliation in conjunction with BMC’s ADDM or other discovery products.
As a general rule of thumb, the more connected your DCIM solution is to your existing IT management frameworks, the more strategic your DCIM deployment will be. The more connected the solution is, the larger the population of users will be. The larger the population is, the more financial impact DCIM will have in your organization.
We see DCIM customers all the time who start their research and investigation of available DCIM solutions in a very hands-on, tactical mode. DCIM comes in all shapes and sizes, and in fact includes everything from sensors and power monitoring to full-fledged lifecycle management suites like Nlyte. Time and time again over the course of DCIM discussions with our experts, end-users begin to see the much bigger opportunity. Their thinking quickly shifts toward attaining more strategic value, across wider audiences, which is ultimately realized through deeper integration with their existing processes and ITSM management structures.
DCIM must not be yet another island of features. Many of the DCIM industry’s early adopters started their journey with DCIM tools that were little more than enhanced drawing solutions, where visual fidelity reigned supreme. In fact, many of the currently available DCIM solutions, and even the latest open-source OpenDCIM projects, are basically enhanced spreadsheets with drawing built in. That works great for documenting devices, but still misses the BIG opportunity. Step away from the trees and you might see it too.
Keep in mind that the industry has produced a number of tremendous discipline-related data center management frameworks; ITIL and COBIT are two good examples. These frameworks are voluminous, and as such have not been widely adopted, given the practical/tactical/reactive nature of running IT shops. This is changing!
In 2013, we see a fundamental shift in discipline and accountability. Everyone wants to look forward rather than backward, and few are trying to protect their previous ways. ITIL-like approaches (anything that enables discipline and accountability) are becoming much more interesting in this new climate. That’s where DCIM thrives. In this new climate, DCIM solutions must complement and integrate with existing management apps.
DCIM (when done right) forms the critical enhancement to ITSM. What do I mean by ‘Done Right’? It’s when DCIM is deeply integrated with ITSM and becomes part of the critical path for change management. How will you know you’ve been successful? The number of users will grow into the dozens or hundreds, accuracy will increase, and timeframes for labor-intensive operations will shrink.
Adopters of Nlyte’s DCIM suite realize just how much BUSINESS can be done when you extend ITSM with DCIM. DCIM simply gives you a broader picture on which to base decisions. It directly supports migration and accuracy goals, and it supports audit and fiscal management planning. It’s really all about leveraging investments and taking a longer view of asset lifecycle management, with a keen eye on the financial aspects of managing all this change.
That’s what we do at Nlyte…
I smile when I hear about all the new industry ‘revelations’ and the expressions of hot new ‘strategic directions’ revolving around the idea that DCIM and ITSM should be connected. I have to ask: where have you been? Finally folks are realizing that DCIM is not a technician’s toolset or drawing package, but instead a powerful and critically important extension to the asset and services management that you already know and love. Sure, the best DCIM suites can draw pretty pictures and give you CAD-like capabilities to put racks on floor plans and assets in racks. Frankly, that’s just a basis for everything else. It’s the 3D model that MUST exist for everything else to make sense.
Well, the really amazing part is that Nlyte recognized this years ago! We saw that DCIM, when deployed by itself, creates yet another island of features and appears very tactical. Nlyte’s view has always been that DCIM is strategic, a logical next step in getting your operations and asset financials under control, so we have worked with all of the major ITSM suppliers to bi-directionally connect what we do to what they do. Over the years we created a series of solid out-of-the-box connectors for change management, CMDB, virtualization, and instrumentation across many of the most prevalent data center management frameworks, including BMC, HP, VMware, etc.
This natural linkage between DCIM and ITSM should not be surprising to anyone, as the most successful DCIM deployments will be an integral part of your best practices. It just makes sense that Nlyte is tightly connectable to BMC’s Remedy and Atrium, or to HP’s CMDB. It should be of no surprise that Nlyte provides a real-time connector for VMware so that the visibility of any asset extends from Physical to Logical to Virtual. It just makes sense, and frankly would be odd if we DID NOT include such core out of the box integration capabilities.
Nlyte has been connecting our DCIM suite to ITSM frameworks for years. Our sales professionals build their entire value proposition around these linkages to ITSM and can show exactly how the magic happens. Think strategic, and then talk with Nlyte about how we can deliver our industry-defining DCIM suite with these off-the-shelf ITSM connectors today!
If you’re anything like me, whether it is your work email or your personal email account, you are constantly receiving emails, reading or scanning most of them, and then simply leaving them in your inbox. To stay. Never to be deleted. You might put them into a mail folder and call that “managing your email.” In my case, my personal Yahoo! account is free, and I’m nowhere near my space limit there. My work account is limited, and since my IT group doesn’t support archiving, I’ve been unable to delete enough email to fall below my 600 MB limit, and have just been granted 300 MB more space. But that’s email – pretty low impact.
So what’s the corollary to a data center, you ask? Well, think about it: your IT and business groups are constantly submitting requests for new applications, and your finance department is counting on depreciated assets to fall off the books so the Capex is in place for new infrastructure to be rolled in. You’re so busy walking the floor or checking your spreadsheets or in-house solution (or worse, an inexpensive DCIM solution with no controls) to keep up with demand that you don’t effectively clean out your inbox: decommissioned servers stay put and become what we in the industry call “zombie” servers. They are dead but not properly buried, so they still consume energy (in the case of real zombies, human brains; in the case of servers, about 60% of their peak power draw), not to mention critical space and network connections. So you have trouble placing the new assets AND still have the old ones in place AND you’re consuming precious data center resources, which, believe you me, are quite a bit more costly than expanding an email account!
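A back-of-the-envelope sketch makes the zombie cost tangible. All figures here are hypothetical assumptions except the roughly-60%-of-peak draw cited above:

```python
peak_watts = 500          # assumed peak power draw of one server
zombie_fraction = 0.60    # a dead-but-powered server still draws ~60% of peak
price_per_kwh = 0.10      # assumed utility rate, in $/kWh
hours_per_year = 24 * 365

annual_kwh = peak_watts * zombie_fraction / 1000 * hours_per_year  # 2,628 kWh
annual_cost = annual_kwh * price_per_kwh                           # ~$263

print(f"One zombie server: {annual_kwh:.0f} kWh/year, ${annual_cost:.2f}/year")
```

Multiply that by dozens or hundreds of undead machines, and then remember the number still excludes the cooling overhead, rack space, and network ports each one strands.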
But wait, there’s more (bad news). If you treat your data center like an inbox, you just receive things whenever other people want to send them, unaware of when the next one will arrive. So you might get flooded with requests, with no rhyme or reason, and as a data center manager be unprepared, and thus seemingly unresponsive, to such an influx. There’s no SLA for an email response (instantaneous?), but for a server installation there are huge repercussions that a business is constantly exposed to. Will that application have available hardware? Do I have power/space/cooling for that hardware? When can I count on the business savings of that new app? Am I leveraging the full depreciation cycles of my hardware, or are things just sitting on my receiving dock, unbeknownst to me?
So you can clearly see that there are significant items at stake in your data center, quite a bit more serious than those you face with your email inbox. What process exists to decommission items on time? What process exists for me to receive something, with full notifications? How can I let someone know when I can receive and act on something? So then, ask yourself: why is your organization still treating your data center like an email inbox?
Take a deep breath (and two if you live in the State of Washington) and have a look at David Chernicoff’s recent article about the BIG new data center in Washington. Did I say BIG? Yes, it’s big, really BIG. In fact, as we have come to find out, it’s actually FIVE TIMES the size that it needed to be, and it’s a lease-to-own deal that the residents of Washington will continue to enjoy for years to come. (I think that “enjoyment” has recently been quantified at $5 per person each year.) How fun is that? I am sure Washington taxpayers are thrilled to be paying for a data center facility that sits mostly empty.
The point of my note today is that forecasting should not be just a check-box line item for IT. Forecasting has real impacts and sets the stage for ALL other potential goodness to come. IT professionals shouldn’t just create their forecast to get a gold star; they should LIVE their forecast. They should FEEL the impacts of the choices that they make. Ideally, IT professionals will begin to pay attention and see that the opportunity for forecasting to guide REAL decision making has changed dramatically over the years. VERY mature tools to support bet-your-business lifecycle forecasting for the data center are now readily available. Along with those tools come a myriad of metrics that can help identify resource usage and availability, plus highly refined capacity planning and optimization solutions. There is also a solid understanding of how the physical, logical, and virtual worlds are inter-related. These same IT professionals should realize that all they have to do is RAISE THEIR HAND to start the business forecasting journey.
In the end, we ALL have the ability to start thinking about the cost to do work. That’s really the bottom line. At a macro level it’s easy: take all of your costs (hardware, software, and resources) and then divide by whatever metric makes sense for you. For instance (in an over-simplified model), the US Internal Revenue Service (IRS) could simply divide its whole IT budget by the number of tax returns. This yields a “cost to process each return” for a given IT approach in a given fiscal year. Compare this year over year and the IRS could easily see whether it is getting better or worse against its IT efficiency goals. Compare this to the forecasted change in population and the changes to technology, and you’d have a pretty good starting point for your forecasting model. The resulting forecast for data center capacity would be directly related, and poor forecasting would show up in dramatic fashion: havoc, delays, higher costs to the taxpayers, etc.
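As a sketch of that over-simplified model (every figure below is invented for illustration, not an actual IRS number):

```python
def cost_per_unit(total_it_budget, units_processed):
    """Macro-level efficiency metric: annual IT spend divided by units of work done."""
    return total_it_budget / units_processed

# Invented figures for two consecutive fiscal years:
fy_prior = cost_per_unit(2_400_000_000, 140_000_000)    # dollars per return
fy_current = cost_per_unit(2_300_000_000, 146_000_000)

if fy_current < fy_prior:
    print("Cost per return fell: IT efficiency improved year over year")
```

The same one-line metric works for any shop; the hard part, and the point of forecasting, is projecting the denominator (demand) forward so the numerator (capacity spend) lands in the right place.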
As I said above, the solutions to plan the future of the data center are within reach for those who choose to use them. The goal is to have just the right amount of capacity online at every point in time. Why build a data center that you’ll ‘grow into,’ like they did in the Pacific Northwest? Think of the waste. With all of the options available (In-House, Co-Lo, Modular, and Cloud), you already have the opportunity to add capacity in a highly granular fashion at attractive ‘per-unit’ pricing. Bruce Taylor at Uptime talks about this Converged Hybrid Digital model all the time. It’s real, and it all starts with the Forecast: not as a NOUN, but as a VERB!