Last September, Frost & Sullivan came up with a new way to measure the “collaborativeness” of visual collaboration technology. It looks to me like a plug for Magor Telecollaboration dressed in pseudo-scientific language; but it does suggest a quantitative way of analyzing whether a collaborative product is worth its return on investment (ROI).
How to measure your product’s Cv
Collaboration goes beyond a simple exchange of information; rather, it’s a series of communication exchanges that moves all involved parties towards a common goal. To measure a tool’s ability to facilitate collaboration, Frost and Sullivan created a unit of measurement called velocity of collaboration, or Cv, defined as richness x accessibility.
Factor #1: Collaboration richness measures the number of features that allow collaboration to happen, things like being able to see nonverbal cues or the ability to pass and share documents. Theoretically speaking, the “‘richest’ solution would enable collaboration between unlimited numbers of remote participants, seamlessly supporting the exchange of their expertise and knowledge to reach the objective.” Frost and Sullivan identify 7 basic properties (see the scoring sketch right after this list):
- chat–instant messaging
- voice (audio)–phone, cell phone, audioconferencing
- presentation–the ability to share a prepared PPT, more or less
- information access–being able to access any locally stored information and to show and share this information in real time (I’m assuming this means file transfer and desktop sharing capabilities. Since we don’t usually paw through someone else’s briefcase to get info during a meeting, I don’t believe that that’s what they meant for the electronic environment either.)
- application sharing–includes the ability to hand over control to another party
- interactive video–it’s gotta be 2-way
- immersion–so real you feel like you can touch them
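
Frost and Sullivan don’t publish the exact rubric for turning these seven properties into a richness number, but the most natural reading is a checklist score: a point for each property the tool supports, with partial credit where support is only partial. Here’s a minimal Python sketch of that interpretation; the property names, weights, and example numbers are my assumptions, not theirs.

```python
# Hypothetical richness scorer: one point per property a tool supports,
# partial credit allowed. The seven properties are Frost & Sullivan's;
# the scoring rule itself is my guess.
RICHNESS_PROPERTIES = [
    "chat",
    "voice",
    "presentation",
    "information_access",
    "application_sharing",
    "interactive_video",
    "immersion",
]

def richness_score(support):
    """Sum per-property support, each value from 0.0 (absent) to 1.0 (fully there)."""
    return sum(support.get(prop, 0.0) for prop in RICHNESS_PROPERTIES)

# Example: a typical web conferencing tool, everything but immersion,
# with partial credit for video (illustrative numbers, not from the paper).
print(richness_score({
    "chat": 1.0,
    "voice": 1.0,
    "presentation": 1.0,
    "information_access": 1.0,
    "application_sharing": 1.0,
    "interactive_video": 0.5,
}))  # 5.5
```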
Factor #2: Accessibility measures 2 things:
- speed–how quickly a person can get to the tool, which may be instantly with the push of a button (like a phone) or several days after making room reservations (like a teleconference suite)
- ease–how available a tool is, meaning everyone has one and can use it (like a phone), or there is only one available and it has to be shared with people across several locations (like that teleconference suite)
I should mention that firewall issues or IT policies limiting a tool’s use give it lower accessibility, which is why Frost and Sullivan give chat a lower accessibility score than the phone.
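
Putting the two factors together, Cv is just richness times accessibility, with accessibility being speed plus ease. Here’s a quick sketch of that arithmetic; the function name and the phone example numbers are mine, not Frost and Sullivan’s.

```python
def collaboration_velocity(richness, speed, ease):
    """Cv = richness x accessibility, where accessibility = speed + ease.
    This only encodes the arithmetic; how you score each input is a judgment call."""
    return richness * (speed + ease)

# Illustrative example: a plain phone is voice-only (low richness) but instantly
# available to everyone (high speed and ease). Numbers are made up for illustration.
print(collaboration_velocity(richness=1.0, speed=5.0, ease=5.0))  # 10.0
```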
My Beef
My problem with Cv is that it only measures visual collaboration tools and doesn’t really account for the impact of non-visual or asynchronous ones, like e-mail and the mess of tools out there like Chatter, Jive, Yammer, Huddle, Basecamp…. I suppose they’re not really comparable, but that’s what I’d really like to know: how to measure the effectiveness of synchronous AND asynchronous features, both individually and in conjunction with other features.
I also feel like the accessibility measurement is rigged mostly to show up the weaknesses of video room conferencing, which is the only tool here that requires leaving your desk. Moreover, their measurement of accessibility seems like hand waving to me. I mean, how do you assign an accurate number to accessibility if you lump together everything that contributes to it? What if a tool is easy to use, but takes up a lot of bandwidth? Or what if a tool is a pain to use, but everyone has it? And how does the ability to join and to leave a meeting affect accessibility? How does the number of people who can be on the call at the same time affect it?
How Does VSee Measure Up?
In spite of my complaints, VSee actually does pretty well on the Cv scale. On the richness scale, I’d say it gets a 5-6 (chat, voice, presentation, information access, interactive video, and application sharing). On the accessibility scale, I’ll use the numbers they assigned to chat, which are 3 for speed + 3 for ease.
VSee’s Cv = richness x accessibility = 5.5 x (3 + 3) = 33
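
Or, as a quick check of that arithmetic in code (the 5.5 splits the difference on my own richness call, and the 3 + 3 borrows the accessibility scores Frost and Sullivan assigned to chat):

```python
richness = 5.5           # somewhere between 5 and 6 of the seven properties
accessibility = 3 + 3    # speed + ease, borrowing the scores they gave chat
print(richness * accessibility)  # 33.0
```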
Below is a graph from Frost and Sullivan’s paper comparing the Cv‘s of various standard collaboration tools. As you can see, VSee (Cv = 33) kicks everyone’s butt except for “telecollaboration” (Cv = 48). The next closest thing is the standard “webconferencing” tool at Cv = 24.
Related articles:
Brian Cotton talks about Velocity of Collaboration
GigaOm Future of Webconferencing: Strategies and Tools for Virtual Meetings
Follow us on Twitter (@VSee) and Like us on Facebook to hear about the latest from VSee!