Exciting Time in the World of DCIM

On September 9, 2014, in Nlyte Guest Blog, by Sev Onyshkevych

It’s an exciting time in the world of DCIM. As the market matures, DCIM is getting more attention, gaining momentum, and its role in optimizing mission-critical facility operations is being understood more clearly. We are also seeing the “hype” around DCIM fading, as companies concentrate more on the fundamental benefits of the solution.

It’s important to note that DCIM software is a category – not a single application – just as “desktop software” encompasses many different applications. As evidence, the leading analysts in this space, 451 Group and Gartner, no longer analyze DCIM as a single, monolithic solution, but as a number of inter-related modules.

Hence, DCIM should be considered a category from which a solution can be tailored to the needs of the end user; a user can implement the specific pieces of the software that apply to their specific needs. When planning to implement multiple applications, it is important for data center operators to understand how the software pieces must inter-relate – this is critical for data center IT and Facility management considering a DCIM implementation. Many of us will recall how, more than a decade ago, the IT side moved away from running independent IT management systems in “silos” and toward a Service Oriented Architecture (SOA), where each module could speak to other modules in a controlled, predefined fashion. Some of the very IT systems that drove the adoption of SOA – trouble ticketing systems, CMDBs, and workflow management – are the same applications with which today’s organizations are trying to integrate their various DCIM tools.

While there are numerous functional modules within the broad category of DCIM, the foundation of DCIM rests atop two fundamental modules:  DCIM Monitoring and IT Asset Management.

Either of these two basic modules on its own – or, better, the two working together – provides the fundamental layer of data and offers direct, tangible benefits. They can also enable a number of DCIM “applications,” including Capacity Planning, Computational Fluid Dynamics (CFD), Dynamic/Adaptive Cooling and IT/Server Control systems, to name just a few. The applications turn the data into recommendations or direct actions.
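To make that relationship concrete, here is a minimal, purely illustrative Python sketch (the class and field names are hypothetical, not any vendor’s API) of how an IT Asset Management module and a DCIM Monitoring module can feed a capacity-planning “application” that turns their combined data into a recommendation:

    # Hypothetical sketch: two DCIM modules expose data through small, predefined
    # interfaces, and a capacity-planning "application" combines them.
    from dataclasses import dataclass

    @dataclass
    class Asset:
        asset_id: str
        rack: str

    class AssetManagement:
        """IT Asset Management module: the inventory of record."""
        def __init__(self, assets):
            self._assets = list(assets)
        def assets_in_rack(self, rack):
            return [a for a in self._assets if a.rack == rack]

    class Monitoring:
        """DCIM Monitoring module: measured power per asset (illustrative data)."""
        def __init__(self, readings):
            self._readings = readings  # asset_id -> measured watts
        def measured_watts(self, asset_id):
            return self._readings.get(asset_id, 0.0)

    def rack_power_headroom(rack, budget_watts, assets, monitoring):
        """Capacity-planning 'application': turns both modules' data into an answer."""
        used = sum(monitoring.measured_watts(a.asset_id) for a in assets.assets_in_rack(rack))
        return budget_watts - used

    inventory = AssetManagement([Asset("srv-01", "R1"), Asset("srv-02", "R1")])
    power = Monitoring({"srv-01": 320.0, "srv-02": 290.0})
    print(rack_power_headroom("R1", 1000.0, inventory, power))  # 390.0 watts left in rack R1

The point of the sketch is simply that each module exposes its data through a small, predefined interface, and the application layer consumes both rather than owning the data itself.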

The 451 Group’s recent report estimates that the DCIM Monitoring market is 67 times bigger than the CFD market, and that IT Asset Management is 40 times bigger than CFD and growing slightly faster. Data center operators are now focusing on getting the basics right and implementing these two areas first, rather than starting with the various “applications” and trying to implement them without the underlying flow of information about what is in the data center and how it is performing over time.

This evolution in the market’s understanding of DCIM has driven DCIM vendors to specialize and focus on being “best of breed” in one or more areas of DCIM, rather than trying superficially to deliver every single module of DCIM (the “Swiss Army Knife” approach). For a successful integration and deployment of DCIM, mission-critical IT and Facilities teams should focus on vendors that concentrate on DCIM Monitoring and Capacity Planning, as well as “Big Data” analytics that can help translate all the collected data into insight and actionable recommendations.

We should note that DCIM can be extended even further, and integrated with IT Services Management (ITSM) applications into something 451 calls “DCSO” (Data Center Service Optimization).

Notwithstanding the hype this area gets, DCIM is 8.5 times bigger than DCSO and growing at a faster pace – and, of course, in order to implement DCSO, one must have implemented the basics of DCIM first.

We can now confidently declare that DCIM is truly coming of age.

DCIM can be Done Right!

In a recent article published over on Processor.com entitled “DCIM Done Right,” I was thrilled to see both Gartner and Forrester chiming in on the fascinating world of DCIM and providing some insight into how end-users could start planning their own implementations, along with a healthy dose of expectation setting. Both David Cappuccio of Gartner and Richard Fichera of Forrester have been shepherding the DCIM segment for years, and in this article they share their opinions on the value of DCIM today, some specific things to consider, and where end-users might start their journey.

A quick summary of their collective points with some context as covered in the publication:

  • Determine your goals. Ask everyone who could be connected with DCIM now and in the future and capture each of their needs. This will identify WHY you are buying DCIM, and will be critically important when conducting your evaluation process.
  • Inventorying your assets is essential. The DCIM model must be seeded somehow, and there are various means available to do so, ranging from automated to manual. Although this may appear daunting at first, it is actually very straightforward and the place to start each deployment. Technologies that will likely be leveraged include barcode scanning, electronic import, CMDBs, etc. (a minimal import sketch appears after this list).
  • A well-implemented DCIM system becomes the system of record, the source of truth, and is usually more respected than older G/L-based (general ledger) asset management since it includes more physical detail. Treat this seriously and get it right, since DCIM will be the cornerstone of your structure going forward.
  • Optimizing Power and Cooling can be thought of as low-hanging DCIM fruit and may yield 20% in savings due to energy and thermal management efforts alone… but that is just the beginning of the value of DCIM as there is a whole slew of non-trivial IT-process oriented savings.
  • Space management is one of the major opportunities with DCIM. Optimal placement of devices is a key value, and understanding your consumption of space (the square feet) as a trend is a critical capacity-planning benefit of DCIM. (Bonus: see Cappuccio’s report Doc ID #235289 for a discussion of DCSE, his proposed Data Center Space Efficiency metric.)
  • No DCIM offering will do everything. Regardless of what you hear, you will need to fill in the gaps for every offering by adding other solutions to your plan. Consider vendors that have demonstrated their willingness to work with other players in the DCIM space.
  • DCIM is much more valuable when it is connected to ITSM systems (like change management and CMDBs). A key observation when deploying DCIM is that many times the existing processes are flawed, so modeling and executing them in a new DCIM suite will just propagate the inefficiency. DCIM initiatives therefore become the catalyst to re-think your best practices and assure they are modern and fit the current business’s needs.
  • DCIM implementations will require startup services and you should discuss these requirements up front with your selected vendor to properly set everyone’s expectations. Services should be about deployment, not about new features that are missing.
  • Start with just the set of new capabilities that you and your team can handle. Major DCIM offerings can be built up in capability over time. Purchase what you need from a reputable vendor that is committed to progressing their solution and has demonstrated this maturation in the past.
  • Train your people. Get them comfortable with each new component to the point that these new capabilities become part of their daily routines. Then add analytics and reporting deliverables that are valuable to each of them uniquely.
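As a small illustration of the “electronic import” route mentioned in the inventory bullet above, the following Python sketch seeds a simple asset model from a CSV export; the file name and column headings are hypothetical, and a real deployment would reconcile the result against its CMDB and barcode scans:

    import csv

    def load_assets(path):
        """Read an exported inventory file into simple asset records."""
        assets = []
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                assets.append({
                    "serial": row["serial_number"],
                    "model": row["model"],
                    "location": row["rack_location"],  # e.g. "DC1/Row3/R12/U24"
                })
        return assets

    # inventory = load_assets("asset_export.csv")
    # print(f"{len(inventory)} assets ready to load into the DCIM model")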

These are basic rules of thumb and set some very pragmatic expectations. I applaud Dave and Richard’s comments and common sense. The year 2014 has already proven to be the year many forward-looking major organizations began their DCIM journey….

 


(Note: this blog is excerpted from the full article on Data Center Knowledge: http://bit.ly/1kBsPAf)

I recently had the opportunity not only to exhibit at the Gartner Infrastructure and Operations show, but to attend several sessions in an effort to soak up the latest and greatest in the industry. I was pleasantly surprised to hear that Gartner was saying a lot of what we, at Nlyte, have been observing. Notably:

IT services are built on assets and processes – In order to build an IT service portfolio, you need a catalog of IT services; those services are built upon processes and assets, and it is upon them that value is finally created.

Figure 1: Nlyte’s interpretation of a presentation by Debra Curtis and Suzanne Adnams of Gartner.

Change is a collaborative process – Professor Eddie Obeng of the Henley School of Business gave the opening keynote, “Transforming with Confidence.” I was thinking about how his animated presentation could apply to someone trying to bring DCIM into their organization but meeting resistance because their audience isn’t familiar with DCIM. Professor Obeng indicated that in order to see change, you need to reduce the level of fear, and data can help reduce that fear.

Business value trumps all – Jeff Brooks, the event co-host, made this point in his session, “Tell the Story with Business Value Dashboards.” It came up repeatedly: infrastructure and operations teams need to convey what 99.5 percent uptime means to the business, and therefore to the executive team.

Business value has a specific “speak” – Business value metrics such as transactions per hour, capacity utilization against plan, and unplanned downtime can be greatly affected by the things the infrastructure and operations team focuses on: Connectivity, Security and Compliance, Service Support and Continuity, as well as Hardware and Software – all things that DCIM helps manage.

More expensive assets have lower total cost of ownership – Jay Pultz’s session showed that two-thirds of the cost increase is due to the staff required to maintain a “cheaper” server.


The New IT model: A Seller & A Buyer

On August 5, 2014, in View from the Top, by Mark Harris
Delivery of IT Services is now a Seller/Buyer game

The IT world has been turned on its head in the past few years. In years past, corporate users simply wanted what seemed like a never-ending stream of capabilities, usually only partially defined, and IT wanted nothing more than to satisfy as many of those needs as humanly possible, as soon as it could get to them. Corporate users were entirely captive to their corporate IT organizations, so everyone just made the best of the cards they were dealt.

Times have changed and we are transitioning to a buyer/seller dynamic. Your IT organization is now the SELLER of IT “products,” and your users are now BUYERS. Luckily, your buyers are still giving their IT organizations the first crack at solving their needs. They are asking them to come up with a plan for the IT services they need, commit to its cost, timeframes and support levels, and then deliver it as agreed. As in any seller/buyer relationship, the seller must be able to meet the needs of the buyers.

Make no mistake, this is NOT a simple re-labelling exercise of the old relationship users have with their IT organization. That relationship is gone. This is different, and the IT organization’s very long-term existence is at risk. A handful of years ago, Gartner predicted that 20% of businesses would own no IT assets by 2012. BYOD, cloud, SaaS and virtualization were all cited as trends contributing to this decrease. Whether you agree with the numbers or the timing is mostly irrelevant; it is directionally sound.

This is the time for IT organizations to put their business hats on and think about delivering services as if they were an external vendor trying to sell their wares to new customers. What they’ll find is that, as a seller or supplier of products, they have the same needs to innovate, engineer, position, market and support their products, albeit with a ‘slight’ advantage of having a historically captive audience. No longer entirely captive, that audience still has somewhat of a preference to at least try to shop for their IT products internally first. How much of a preference? According to Ellen Messmer at PricewaterhouseCoopers, 30% of all IT spending is happening outside of the IT organization. Other analyst estimates put the figure between 15% and 35%, but ALL agree that the number is BIG and it is growing! How can this be? Well, it turns out that if a seller doesn’t offer the right products, available in the right timeframes and at the right cost, customers will go elsewhere. That’s why ONE-THIRD of IT spending is happening as a “Shadow IT” process today!

So what does this mean for IT organizations that wish to stay in business? It means that today is a great time to re-evaluate your core approaches to service/product delivery: the products you choose to deliver, the cost of doing so, and how agile your infrastructure is in allowing rapid turn-up of new applications. Your “buyers” are willing to pay a certain price for each and every need they have, and will hopefully find one of the IT organization’s offerings to be a fit. The buyers will require delivery in a timely manner (their timeframe, by the way). In fact, they may be willing to pay a hint more and take delivery a bit slower to allow internal IT to be the seller, but only a tiny bit. Ultimately, they will vote with their dollars.

So that brings me to DCIM. Once an IT organization determines which products it wants to deliver as its core business, it must think about the COST to deliver those products, the PRICE it will charge its customers, and the speed at which it can deliver each. Whether these offerings are EMAIL services or STORAGE services or COMPUTING services, everything is a “product” that must have a higher VALUE to your customer than the price you will be charging them, must cost less to deliver than the amount you charge, and must be deliverable in the timeframes the buyers set.

So there are two questions that should be at the top of YOUR mind at this moment. The first is: “How can I determine the price I wish to charge my buyers if I don’t know what my true COSTS are to deliver those products?” The key is that we are not talking about approximations or best guesses here. It’s not just about estimating simple costs like it was in the old days. This is not Monopoly money; IT spending is real money, and it is already being spent externally a third of the time due to the lack of alignment between buyers’ needs and sellers’ capabilities! So an IT organization must be very precise when building its cost models, and these must be all-inclusive to be a valuable business planning tool. Sellers (remember, that is the IT organization) that wish to stay in business can’t just guess what their cost is and then sell their products above that guess. They must KNOW what their actual costs are. Some of the basic costs are easy to recognize and add up; these usually consist of the costs of hardware and software licenses. Just a little bit harder to see are the cost of power for running the active gear and the cost of management and administrators. But there are many ‘hidden’ costs that simply don’t get included in the traditional cost analysis, and in this new model of IT they can’t go unaccounted for any longer: the cost of the building, cooling and water; costs of financing and depreciation; warranty costs, support costs, the cost of everything! All of it needs to be accounted for before you are able to determine what your ‘selling’ price should be.
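As a rough illustration of that all-inclusive cost model, the sketch below adds up both the obvious and the “hidden” monthly line items and derives a per-unit cost and selling price; every figure is a made-up placeholder, not real data:

    # Illustrative only: the point is that the 'hidden' line items are added up,
    # not guessed, before a selling price is set.
    monthly_costs = {
        "hardware_depreciation": 4000.0,   # obvious costs
        "software_licenses":     2500.0,
        "power":                 1200.0,
        "admin_staff":           6000.0,
        "building_space":        1800.0,   # 'hidden' costs
        "cooling_and_water":      900.0,
        "financing":              400.0,
        "warranty_and_support":   700.0,
    }
    total_cost = sum(monthly_costs.values())
    units_delivered = 500                  # e.g. mailboxes, VMs, or TB of storage
    cost_per_unit = total_cost / units_delivered
    price_per_unit = cost_per_unit * 1.15  # margin is a policy decision, not a guess
    print(f"cost/unit = {cost_per_unit:.2f}, proposed price/unit = {price_per_unit:.2f}")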

The second question is: “How can I manage my data center capacities more proactively, to allow new applications to be implemented faster?” This is all about building a data center structure that is more agile – a structure that can be modified at a minute’s notice. Adding or removing devices should be simple. Understanding the cascading effects and upstream/downstream relationships should be matter-of-fact. The business use of each and every device should be very clear and well defined.

That is where DCIM comes in. A comprehensive DCIM suite allows you to understand exactly what is in production (right down to the patch cable color) as well as quantify all of the costs involved in delivering IT services (product offerings). A solid DCIM solution allows you to create a financial model that supports your catalog of IT products. Each IT service/product that you wish to sell can be quantified to determine the true cost of delivering it to each customer and the timeframes involved. The DCIM suite also enables you to plan and manage technology refreshes of aging equipment, which affects costs as well. DCIM enables the IT organization to shorten delivery times. DCIM really is a fundamental business planning tool for those IT professionals looking forward, rather than backward.

And what happens if you don’t implement DCIM and find your costs are different than you originally guessed, or your turn-around times are too long? Buyers go elsewhere – a third of them, to be exact. Buyers want the right products, at the right price and in a reasonable timeframe. In the “New IT” model, buyers have the ability to go elsewhere for these IT products, and IT organizations must adapt to delivering their offerings in a competitive market according to the buyers’ rules. DCIM is the enabler to do so.


Connector This, Connector That

On July 29, 2014, in View from the Top, by Matt Bushell
Nlyte DCIM has off-the-shelf integrations (connectors) for industry leading CMDBs

If you’ve been following Nlyte at all over the past year, you will have noticed several announcements regarding Connectors to our DCIM suite. You might ask, “What’s all the fuss?” or, “What’s wrong with your suite that you need all these Connectors?” or, “Yet another connector?” or, “Aren’t all connectors the same?” So allow us to explain. The answer to the last question is a resounding “no, not all connectors are the same.” Part of the appeal of Nlyte’s march on connectordom is that our connectors are flexible and don’t require re-testing and re-programming every time software is updated – all that’s required is perhaps some reconfiguration of settings, and that’s it.

To answer the other questions, let’s examine the roots of Data Center Infrastructure Management itself. According to Wikipedia, data center infrastructure management (DCIM) is a category of solutions created to extend the traditional data center management function to include all of the physical assets and resources found in the Facilities and IT domains, and DCIM deployments over time will integrate information technology (IT) and facility management disciplines. This goes back to the rationale behind Nlyte’s recent wave of connector rollouts: the fuss is about including, or connecting to, the IT domain – and not just any part of IT, but the parts that have the most to do with a data center’s physical infrastructure, and from the industry’s leading products.

Let’s start with configuration management databases (CMDBs). They host information on configuration items (CIs) or assets.  Again, citing our good friend Wikipedia:

Its contents are intended to hold a collection of IT assets that are commonly referred to as Configuration Items (CIs), as well as descriptive relationships between such assets. When populated, the repository becomes a means of understanding how critical assets such as information systems are composed, what their upstream sources or dependencies are, and what their downstream targets are.

Any DCIM system not connected to an organization’s CMDBs is asking that organization to silo its information. Instead, a connected CMDB-DCIM deployment allows each system to enrich the other.
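As a rough, vendor-neutral illustration of that mutual enrichment, the sketch below matches DCIM assets and CMDB CIs by serial number, flags records that only one side knows about, and copies across the fields each side is authoritative for. The field names are hypothetical; a real connector would use each product’s actual API:

    def reconcile(dcim_assets, cmdb_cis):
        """Match records by serial number and let each side enrich the other."""
        dcim = {a["serial"]: a for a in dcim_assets}
        cmdb = {c["serial"]: c for c in cmdb_cis}
        only_in_dcim = sorted(dcim.keys() - cmdb.keys())
        only_in_cmdb = sorted(cmdb.keys() - dcim.keys())
        for serial in dcim.keys() & cmdb.keys():
            cmdb[serial]["rack_location"] = dcim[serial]["location"]    # DCIM owns physical detail
            dcim[serial]["business_service"] = cmdb[serial]["service"]  # CMDB owns service mapping
        return only_in_dcim, only_in_cmdb

    missing_from_cmdb, missing_from_dcim = reconcile(
        [{"serial": "SN100", "location": "R12/U24"}],
        [{"serial": "SN100", "service": "Payroll"}, {"serial": "SN200", "service": "CRM"}],
    )
    print(missing_from_cmdb, missing_from_dcim)  # [] ['SN200']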

Another key piece of ITSM software is Change Management systems. Reviewing its Wikipedia entry shines some light on the subject:

The objective of change management…is to ensure that standardized methods and procedures are used for efficient and prompt handling of all changes to control IT infrastructure, in order to minimize the number and impact of any related incidents upon service.

Sounds scarily similar to a large part of what DCIM is supposed to do, right? Manage moves, adds and changes within the data center? Again, by not connecting, syncing and coordinating with change management systems, you run the obvious risk of siloing your IT team from your Data Center team and becoming uncoordinated, with the potential for great loss (too great to cover here).

And I haven’t even considered virtualization hypervisors – without that connection, you may not know where your VMs are running, and on which machines, should there be a power, space or cooling issue.
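As a tiny illustration of why that hypervisor view matters, the sketch below joins a hypothetical hypervisor’s VM-to-host mapping with DCIM’s host-to-rack placement to answer “which VMs are exposed if this rack has a cooling problem?” All names and data are invented:

    vm_to_host = {"vm-web-01": "esx-host-3", "vm-db-01": "esx-host-3", "vm-app-02": "esx-host-9"}
    host_to_rack = {"esx-host-3": "R7", "esx-host-9": "R2"}   # placement from the DCIM asset model

    def vms_in_rack(rack):
        """List the VMs whose hosts sit in the given rack."""
        return [vm for vm, host in vm_to_host.items() if host_to_rack.get(host) == rack]

    print(vms_in_rack("R7"))  # ['vm-web-01', 'vm-db-01'] would be affected by a problem in R7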

So stay tuned, the team at Nlyte is dedicated to building the bridge between DCIM and ITSM!

Nlyte offers Connectors for these products:

CMDB/Discovery – for bi-directional configuration item (CI) / asset information sharing and reconciliation:

Change Management – for workflow process management and communication:

Virtualization Hypervisors – for simplifying the management of physical and virtualized resources:

Sensors – tightly integrate asset location and/or environmental performance information:

Power and Rack Planning:


