In a World Where Data Is Exponential...

Archie Hendryx




Virtual Instruments CEO John W. Thompson Talks to InformationWeek

Virtual Instruments' Thompson New Kid On Virtualization Block

Taken from an interview conducted by Charles Babcock, InformationWeek

Oct. 14, 2010 
URL: http://www.informationweek.com/story/showArticle.jhtml?articleID=227701234

www.virtualinstruments.com

For more than 40 years, John Thompson, former CEO of Symantec, has watched fundamental changes sweep the computer industry in recurring cycles.

"Client server was an inflection point. We've reached another inflection point in the march of democratization (in use of computer power)," he said during a recent visit to InformationWeek in San Francisco. By that, of course, he means cloud computing, a sweeping change that delivers more power to more end users than conceived possible before.

Thompson, an IBM executive for 28 years, has a keen eye for such changes. He was an executive in software development and marketing at IBM, ending up as head of IBM Americas, before being appointed CEO and chairman of Symantec. He took the job because he realized client/server computing had made PC computing ubiquitous in business, which in turn set loose changes that would yield new fortunes.

Those changes generated a vast new industry devoted to producing anti-virus software, malware detection and various screening mechanisms that would try to re-establish the secure perimeter once offered by the IBM mainframe. Symantec grew to be one of the two dominant players in that field, with a stock value that increased 500% from 1999 to 2005. In 2006 Thompson was listed by Forbes as the eighth best-paid executive in the technology industry. He is still chairman of Symantec.

Now he believes that virtualization and cloud computing are taking root and that this disruptive change will unleash even more opportunities downstream from where the now heavily virtualized servers are chugging away. The cloud is a huge disruption, and with big disruptions come big opportunities for small companies, such as his current undertaking, Virtual Instruments. Thompson once again is chairman and CEO.

The field of server virtualization is already crowded with everyone from little VKernel and Veeam up to Oracle. We are still early, however, in the process of virtualizing applications. Gartner said that at the end of 2009 only 16-17% of the data center was virtualized. InformationWeek Analytics expects the pace of virtualization to accelerate over the next two years.

Virtualization, Thompson says, introduces a new layer of abstraction, and to some extent both the application and its performance disappear into that layer. Yes, it's possible to discover and count virtual machines as they run, but there's a larger problem of understanding what's wrong if they're not thriving the way they used to on their own unfettered server. It was a horribly wasteful approach, this allocation of one application per server, but at least it gave you a view of what was going on.

The fundamental question for the virtualized data center, he says, is: "How do I keep track of these assets? That market is worth $10-$15 billion." Virtual Instruments has staked a claim on one slice of it, albeit a potentially big slice: offering a view of storage I/O on the SAN on an application-by-application basis. It's addressing one of the hidden factors in running more and more virtualized applications. While TCP/IP and Ethernet networks are an open book, the Fibre Channel storage network has thus far yielded mainly lump-sum statistics. We know it's working. We don't know how well for particular applications. It's difficult to get an end-user point of view of how storage I/O is affecting a particular application. It's easier to get a storage system-wide view.
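To make that distinction concrete, here is a minimal, purely illustrative Python sketch. The record format, application names and latency figures are hypothetical and are not Virtual Instruments' data model; the point is simply how a per-application breakdown reveals a problem that a single SAN-wide average hides.

```python
# Illustrative sketch only: a system-wide "lump sum" latency figure versus a
# per-application view. All data and field names here are hypothetical.
from collections import defaultdict
from statistics import mean

# Hypothetical I/O completion records captured off the Fibre Channel fabric.
io_records = [
    {"app": "billing-db",   "latency_ms": 4.2},
    {"app": "billing-db",   "latency_ms": 38.7},  # one application is struggling
    {"app": "web-frontend", "latency_ms": 1.1},
    {"app": "web-frontend", "latency_ms": 1.3},
    {"app": "reporting",    "latency_ms": 2.0},
]

# System-wide view: one number, and the problem is largely invisible.
print(f"SAN-wide average latency: {mean(r['latency_ms'] for r in io_records):.1f} ms")

# Per-application view: the slow application stands out immediately.
by_app = defaultdict(list)
for r in io_records:
    by_app[r["app"]].append(r["latency_ms"])
for app, samples in by_app.items():
    print(f"{app:13s} average latency: {mean(samples):.1f} ms")
```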

So VMware's vCenter statistics will tell you your virtualized application has plenty of CPU and memory, but it has no idea what's going on with the Fibre Channel SAN. If your application slowdown is because of a SAN device far downstream from the virtual server, good luck. It's an opaque network to the virtual machine management software.

"The vast majority of issues are I/O related. We look at the server, the SAN and the I/O device itself and get a round trip view of the performance of any I/O request," Thompson noted. That means specific application requests can be tracked and evaluated and dealt with if something's wrong.

"Typically an outage is not a single event. There will be a number of small events that precede it. You can prevent an outage to a mission critical application," he says, if you can see the cascade of small events alerting you to the failure about to come.

"SAN networks were never intended to be as big and complex as they've become today. The volume of data they generate has the potential of blowing out the virtual engine" that‘s trying to manage the storage system.

There are different partial solutions available. EMC has ControlCenter. HDS offers Tuning Manager; HP, Storage Essentials. A newcomer, Akorri, has BalancePoint.

But Thompson says Virtual Instruments is the inheritor of 30 patents governing SAN monitoring and measuring. They're the result of engineers building tools at fibre optic subsystem specialist Finisar, a company that helped formulate the standard for Fibre Channel. It needed tools to measure and monitor the SAN network.

In June 2008, the Network Tools unit of Finisar became Virtual Instruments, a private-equity-backed company. Thompson, with his experience in picking out the right position on an industry inflection point, became its head. I'll enjoy watching how far this promising Silicon Valley start-up can go. I expect it will come to play a key role in many virtualized data center environments.

More Stories By Archie Hendryx

SAN, NAS, Backup/Recovery & Virtualisation Specialist.