In a World Where Data Is Exponential...

Archie Hendryx




2011 Predictions: Virtual Instruments Will Overcome the Virtual Stall

2011 will still primarily be a year of experimentation with private clouds

No Wasteful IT Spending in 2011
One key outcome of the "Great Recession," which has thankfully come to a close, is that IT resource and asset efficiency will be highly scrutinized. Wasteful IT spending simply won't be tolerated. The theme of "doing more with less" will continue into 2011 and beyond. This is why virtualization has been so successful: it improves IT efficiency and ROI in very meaningful ways. My predictions for 2011 reflect this new reality. So here's my perspective, drawn from speaking with dozens of large enterprise IT organizations over the last year.

The Gap Between "as marketed and sold" and "as installed" Will Widen
Although both the major IT vendors and the smaller start-ups are preaching the benefits of new technologies, the ability and resources that mainstream enterprise IT organizations have to adopt them are extremely limited. Two cases in point that I see constantly. First, the major SAN switch and HBA vendors are heavily marketing 8 Gbit SAN devices, and some are even talking about 16 Gbit infrastructure. The reality is that most IT organizations are still at 4 Gbit, many are still at 2 Gbit, and most are barely using the available bandwidth of their existing infrastructure. The second example is FCoE. There has been massive hype about FCoE from certain vendors, but the business case to migrate typical enterprise applications to FCoE just doesn't yet hold water. Perhaps in 2012 or 2013; today's IT organizations have enough trouble keeping their existing infrastructure up 24x7 and performing adequately. Hence, vendors will keep marketing the future while installing proven solutions with proven, near-term ROI.

Private Cloud Adoption for Business-critical Applications Remains Gated by the Ability to Monitor and Analyze the Cloud Infrastructure
2011 will still primarily be a year of experimentation with private clouds.  Most enterprise IT staff time will be spent on designing and architecting private clouds as opposed to actually deploying them in production.  Without the ability to monitor the infrastructure to guarantee availability and performance SLAs in the cloud, the risk of moving business-critical applications to a private or public cloud is simply too high.

The Majority of Companies Will Experience "Virtual Stall" for Enterprise Applications
Today, about 30% of servers in a typical data center are virtualized. The low-hanging-fruit applications (file servers, web servers, mail, and test & dev) have already been virtualized in most companies. Getting beyond 30% without experiencing "virtual stall" requires virtualizing the I/O-intensive, business-critical applications such as order processing, financial transactions, ERP, and CRM. These applications are usually built on Oracle, SAP, and DB2. To adapt a famous quote from former president Bill Clinton's campaign, "It's the I/O, stupid." Without a comprehensive understanding of how the I/O path (server -> HBA -> edge switch -> core switch -> storage array) affects the performance of virtualized applications, the realized benefits from, and adoption of, virtualization will simply hit a wall. For example, SCSI reservation storms caused by vMotion transfers and queue depth metrics will soon be terms every VMware admin needs to understand.
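To make the queue-depth point concrete, here is a minimal sketch (not a Virtual Instruments or VMware API; all hop names, counts, and limits are hypothetical) of the kind of check an admin might run along the I/O path: any hop whose outstanding I/O count approaches its configured queue depth is a likely latency bottleneck for virtualized applications.

```python
# Hypothetical snapshot of the I/O path: (hop, outstanding I/Os, configured queue depth).
PATH = [
    ("server HBA",  28, 32),
    ("edge switch", 10, 64),
    ("core switch", 12, 64),
    ("array port",  60, 64),
]

def saturated_hops(path, threshold=0.8):
    """Return (hop, utilization) for hops whose queue is more than `threshold` full."""
    return [(hop, qlen / depth) for hop, qlen, depth in path
            if qlen / depth > threshold]

for hop, util in saturated_hops(PATH):
    print(f"{hop}: queue {util:.0%} full -- likely latency bottleneck")
```

In this sample the server HBA and the array port would be flagged, while the switches, with plenty of queue headroom, would not.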

Infrastructure Deployment Models Will Change Rapidly
Although stored data continues to grow at rates exceeding 50% per year, I/O has been an afterthought for most IT organizations implementing virtualization or cloud initiatives. IT staff have been adding more and more storage and deploying larger, lower-cost drives, de-duplication technology, and thin provisioning to combat the data explosion and the rampant over-provisioning that every IT organization has experienced. Unfortunately, this is necessary but not sufficient to hit the CIO's goals for the performance, utilization, and availability of the virtualized infrastructure. Tools that understand the infrastructure and correlate performance and response-time data with the deployed assets will become essential to deploying virtualized applications in a "high-density" infrastructure. The relationship between capacity and performance is becoming ever more intertwined. You can't add or consolidate server/SAN/storage capacity without a detailed understanding of how those infrastructure changes affect application response times.

All of the above predictions have their basis in getting more out of, and reducing the risk of, broadly deployed virtualization investments. Virtual Instruments is the only company directly addressing these challenges, with its VirtualWisdom and SANInsight virtual infrastructure optimization solutions. By providing deep, real-time visibility into the virtualized infrastructure, they allow performance and availability SLAs to be ensured while maximizing the utilization of existing assets.
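The claim that capacity and performance are intertwined can be sketched numerically. The snippet below (hypothetical sample data, not taken from any real array) computes the Pearson correlation between storage utilization and application response time; a strongly positive value is exactly the pattern that makes capacity changes impossible to plan without performance data.

```python
def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical weekly samples: array utilization (%) vs. application response time (ms).
utilization = [55, 62, 70, 78, 85, 91]
response_ms = [4.1, 4.4, 5.2, 6.8, 9.5, 14.0]

r = pearson(utilization, response_ms)
print(f"utilization vs. response time: r = {r:.2f}")
```

Here response time climbs sharply as the array fills, so the correlation comes out strongly positive; with independent series it would sit near zero.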

•   •   •

Article taken from: Virtual-Strategy Magazine. By Skip Bacon. http://www.virtual-strategy.com/2011/01/04/2011-prediction-virtual-instruments?page=0,0

More Stories By Archie Hendryx

SAN, NAS, Back Up / Recovery & Virtualisation Specialist.