Overview of Videoconferencing desktop software

Introduction

Low cost and convenience are two of the most common reasons for using desktop videoconferencing software. For schools with a tight budget, it is often the only option for videoconferencing, as many products can be installed for free, or cost tens of pounds, instead of the thousands of pounds needed for even an entry-level, room-based videoconferencing appliance. Most of the products evaluated in this series are aimed at a single user with headphones and a webcam, although it is possible to project the desktop and use these products for group or class sessions. While this is not recommended, desktop software can be scaled up to this kind of use, and with appropriate webcams and speaker/microphones the cost of a low-quality but usable system (including the Personal Computer (PC)) can be kept down to hundreds of pounds.

In further and higher education, and other areas of the public sector, the drivers for desktop videoconferencing are similar to those in schools, despite the greater prevalence of studio and room-based systems. These drivers are cost, convenience and reduced time and travel budgets, as well as the aim of reducing the organisation's carbon footprint. In these sectors, what is required is a visual experience that feels more like a meeting than a telephone call, without having to leave the office or classroom. There is likely to be more ad hoc use when staff do not have to leave their workstations: it is often inconvenient to check whether a room is available, book it, and cross the building or campus to get to the videoconferencing facility. These desktop products can provide the required experience at increasingly high levels of audio and video quality. An important feature is the ability to share the desktop and work together on a document or project; as well as audio and video, this data-sharing functionality is also tested as part of these evaluations.

Desktop applications are already in use, and common examples include:

  • Researchers in different locations collaborating on a project;

  • International students interviewing for university placements on the other side of the world;

  • Administrative and management meetings;

  • Keeping rural schools open by clustering them and sharing skills;

  • Schools accessing live video content from around the world;

  • Remote tutors teaching lessons for which there is no local teacher (e.g. Law or Latin);

  • Ph.D. vivas;

  • Remote court appearances.

Recognising that these applications are already in daily use in education and the wider public sector, and that use is likely to increase, it is hoped that these evaluations will explain some of the options and therefore help the decision-making process when an individual, group or organisation wants to deploy a desktop videoconferencing solution. While technical details have been included for the benefit of those who are interested, it is hoped that the evaluations are easily understood and of use to anyone wishing to make an informed decision.

The Video Technology Advisory Service (VTAS) has previously evaluated Skype 4.0 and Mirial Softphone 7.0. These were evaluated in a slightly different way to the next group of products, to reflect changing resources available, and also the constant improvements in equipment and internet access speeds which rendered some of the previous tests obsolete. The test process described here was used to test three more products: VCON Vpoint HD, Visimeet, and PolyCom Telepresence m100. The presence of a product indicates only that the product has been evaluated by the VTAS product evaluation team, and should not be taken as a recommendation. Similarly, the absence of a popular product from the evaluations should not be seen to reflect negatively on that product.

Videoconferencing standards

Standards exist to allow interoperability between different products produced by different companies. Some standards are de facto, emerging through universality and ubiquity (think of the format of office documents, for example). Others are developed by technical experts and defined in standards documents; some are also known as 'recommendations' or 'protocols'. If the software that you buy supports a standard, then it should be able to interoperate with other people's software that supports the same standard.

The main standards for videoconferencing are H.323[1] and the Session Initiation Protocol (SIP). These apply to studio-style videoconferencing equipment, room-based systems and desktop PC based systems. It is worth noting that any software installed that doesn't support these standards will usually either only interoperate with other installations of the same software, or will need a gateway (usually an appliance on the network) in order to work with standards-based equipment.

These evaluations include software (such as Skype) that does not work with any other desktop videoconferencing software (without a bridge or bridging service) and therefore restricts communication to users who are running the same software. There are also evaluations of products that support the videoconferencing standards and therefore work with other standards-based equipment (which may be desktop or room based). Currently, only those that support the H.323 standard (or have a means of bridging to H.323 equipment) will interoperate with the JANET Videoconferencing Service. Where a product does support H.323, or can bridge to H.323, it has been tested with the Janet system. SIP capabilities have sometimes been informally tested, but no formal testing of SIP has been done.
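As a rough illustration of this logic (not drawn from any of the products tested), the short Python sketch below models how two endpoints might connect: directly if they share a standard, via a gateway or bridge if one is available, and not at all otherwise. The protocol sets used in the example calls are invented for illustration.

    # Illustrative sketch only: models the interoperability reasoning described
    # above (shared standard -> direct call; otherwise a gateway/bridge is
    # needed). The protocol sets used in the example calls are invented.

    STANDARDS = {"H.323", "SIP"}

    def interop(caller_protocols, callee_protocols, gateway_protocols=frozenset()):
        """Describe how two endpoints might connect, if at all."""
        shared = set(caller_protocols) & set(callee_protocols) & STANDARDS
        if shared:
            return f"direct call using {sorted(shared)[0]}"
        # A proprietary-only product can still reach standards-based equipment
        # if a gateway or bridging service translates between them.
        if set(gateway_protocols) & set(callee_protocols):
            return "call via gateway/bridge"
        return "no interoperability (same software needed at both ends)"

    # A proprietary-only client calling an H.323 room system:
    print(interop({"proprietary"}, {"H.323"}))              # no interoperability
    print(interop({"proprietary"}, {"H.323"}, {"H.323"}))   # call via gateway/bridge
    print(interop({"H.323", "SIP"}, {"H.323"}))             # direct call using H.323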

Test process

Test environment

The testing process used to produce the VTAS evaluations of desktop videoconferencing software was standardised before the evaluations took place. The testing uses a pair of Dell™ OptiPlex™ 360 series PCs, purchased in the spring of 2009. These are well-specified machines capable of running Microsoft® Windows Vista® and Microsoft Windows 7, chosen to represent typical desktop machines. These PCs have Intel® Core™ 2 Duo E7400 processors running at 2.80 GHz, with 4GB of RAM and 150GB NTFS hard drives. They are connected to high-specification webcams, Logitech® Webcam Pro 9000s.

These PCs (one at each end of the test) are not the latest and greatest, but represent an average PC. The Dell OptiPlex has been widely deployed at educational institutions in the UK, so it is a reasonably common machine with a common ‘average’ specification. Both PCs have been upgraded to Windows 7 (Professional edition).

Plantronics DSP 400 foldable USB headphones with microphones attached were used for all the tests.

All the equipment is deliberately average in specification, and is by no means of the highest possible quality. This is to emulate what is really in use: better equipment might produce better results than those found by the testers, and worse or cheaper equipment might produce poorer ones. It is also important to remember that the product tested was the release of software current at the time of testing; subsequent software upgrades may well affect future performance and features.

Tests are conducted between each pair of PCs at two different bandwidths:

  • Unlimited Bandwidth: one PC is connected directly to the JANET network and the other is on a university Local Area Network (LAN). Where there is no rate-limiting enabled on the network, the default bandwidth used in these tests is 1920kbps, depending on the capabilities of the software and whether it allows the user to set a bandwidth maximum (in practice, the bandwidth available is limited by the physical capacity of the connection and the amount of other traffic on the university network).

  • 384kbps: this emulates what may be achievable as an upload speed on a good Asymmetric Digital Subscriber Line (ADSL) home/office broadband connection from a typical commercial Internet provider. The rate-limiting is introduced in both directions within the university's LAN. Where possible, the bandwidth settings in the software are adjusted accordingly (a rough bandwidth-budget sketch follows this list).
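To put the two test bandwidths in context, the sketch below works through a rough call-bandwidth budget in Python. The 64kbps audio figure (typical of G.711 or G.722) and the 10% allowance for packet overhead are assumptions chosen for illustration; real products negotiate these values themselves.

    # Rough illustration of how a call's bandwidth divides up at the two test
    # rates. The audio bitrate and overhead fraction are assumptions for
    # illustration; actual products negotiate their own values.

    def video_budget(call_kbps, audio_kbps=64, overhead_fraction=0.10):
        """Estimate the bitrate left for video after audio and packet overhead."""
        overhead = call_kbps * overhead_fraction
        return max(call_kbps - audio_kbps - overhead, 0)

    for rate in (384, 1920):
        print(f"{rate} kbps call -> roughly {video_budget(rate):.0f} kbps for video")

    # 384 kbps call -> roughly 282 kbps for video
    # 1920 kbps call -> roughly 1664 kbps for video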

It is worth noting that the tests are not carried out in a laboratory or test-bed environment. All tests are done on a live network, at different times of the day and at different times of the academic year, so if the same tests were repeated at other times the results might differ. This does not invalidate the test results in the author's view: the tests are all carried out over a period of weeks, and in real life the software is likely to be used on a public network and will be affected by moment-to-moment variations in the usage of that network. The testing therefore emulates a real usage experience on the Janet network, rather than artificially creating a totally consistent and predictable laboratory test.

What is tested

The following informal and formal tests and evaluations are performed on each item of software:

Acquiring and installing the software

Usually, the software is acquired either by download from the web, or by the purchase of a CD-ROM (or licence, if applicable). Any setup or installation documentation is followed in order to install the software, and there then follows a short period during which the evaluators familiarise themselves with the user interface and capabilities of the software. A number of informal calls are made during this process. Notes are made on how easy or difficult it is to install, register (if necessary) and configure the software, and how long this process takes. The extent and nature of any documentation, help and/or support is recorded. The minimum and/or recommended specification to run the software is also described.

User interface

During these initial, informal calls, notes are made on the ease of use and accessibility of the user interface: how easy it is to store numbers and make a call, how easy it is to configure the equipment to allow it to interoperate with Janet videoconferencing (if this is possible), and how much control the user has over different settings are all considered. Testers consider how difficult or easy it is to make a connection, both to a PC running the same software (on and off campus), and to Janet videoconferencing service equipment.

Connectivity/interoperability

A series of tests is undertaken to assess interoperability with other products (for standards-based software). It is also noted whether any changes need to be made to the PC firewall or the corporate firewall in order to allow calls to take place.
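As a minimal reference for the kind of firewall changes that can be needed, the sketch below lists the well-known signalling ports for the two main standards. These port numbers are standard, but the media (RTP) streams use dynamic UDP port ranges that vary from product to product, so they are not listed here.

    # Well-known signalling ports relevant to standards-based videoconferencing.
    # Media (RTP/RTCP) travels over dynamic UDP port ranges that differ by
    # product and are often configurable, so no single range is shown.

    SIGNALLING_PORTS = {
        "H.323 call signalling (H.225)": ("TCP", 1720),
        "H.323 RAS (gatekeeper)": ("UDP", 1719),
        "SIP": ("UDP/TCP", 5060),
        "SIP over TLS": ("TCP", 5061),
    }

    for name, (protocol, port) in SIGNALLING_PORTS.items():
        print(f"{name}: {protocol} {port}")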

Audio tests

Tests are conducted on the quality of audio, echo cancellation and the effects of 'double-talk' (both parties speaking at the same time). These tests are repeated at the two different bandwidths. Each test is done both in a ‘talking heads’ videoconference and in a videoconference with more movement and data sharing.

Video tests

It should be noted that all formal testing is done with the software in full screen mode. Because of the resolution of the image sent (especially at low bandwidth), this may not always be the optimal screen (image) size for viewing a crisp and clear picture. If the image sent is a small one and it is enlarged at the receiving end to fill the screen, it may well appear more blocky and blurred than if it were viewed in a smaller window. There is often a trade-off between image size and image resolution; however, in order to keep the tests consistent and like-for-like, all the tests are carried out with the software in full screen mode. In practice it is recommended that users try the different image sizes available, to find what suits them and their environment best.
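As a rough indication of why a small transmitted image can look worse at full screen, the Python snippet below works out how many display pixels each transmitted pixel has to cover when a CIF (352 x 288) image is stretched to a 1920 x 1080 display. The resolutions are examples only, not what any particular product sends.

    # Illustration of the size/resolution trade-off: each pixel of a small
    # transmitted image must cover several screen pixels when shown full
    # screen, which is what makes the enlarged picture look blocky or blurred.

    def stretch_factor(sent, display):
        sent_w, sent_h = sent
        display_w, display_h = display
        return display_w / sent_w, display_h / sent_h

    sent = (352, 288)        # CIF, a common low-bandwidth image size
    display = (1920, 1080)   # a typical full-screen desktop display

    fx, fy = stretch_factor(sent, display)
    print(f"Each sent pixel covers roughly {fx:.1f} x {fy:.1f} display pixels")
    # -> roughly 5.5 x 3.8 display pixels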

The video tests carried out can be described as follows:

  • A ‘talking heads’ scenario, with audio;

  • Data sharing with a medium view, increased movement and audio.

The framing of each view is not scientifically controlled but, as a guide, the two views tested look something like the following:

Figure 1: Talking heads scenario

Subjective video impairments tested

During video testing, the evaluators assess a number of different factors for each test. The criteria considered are as follows:

Lip synchronisation: Testers talk to each other to score the accuracy of synchronisation of sound and vision. In some software this is adjustable. In these cases, tests are made after making any synchronisation adjustments.

Block distortion (tiling): All video transported across the Internet is first captured in a digital format and then compressed to reduce the amount of data that needs to be sent across the network. There are various encoding methods for doing this; one method is to encode the picture as many blocks and micro-blocks of colour. Sometimes, rather than looking smooth, the way the picture is decoded and reconstructed from these blocks is apparent to the viewer. This measurement rates the presence of blockiness – i.e. the degree to which the micro-blocks that make up the decoded picture are apparent.

Figure 2: A blocky image
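The toy Python example below (using NumPy, purely for illustration; real codecs are far more sophisticated) shows the block-based idea behind this impairment: an image is processed in small blocks, and when each block is represented very coarsely the block boundaries become visible as tiling.

    import numpy as np

    # Toy illustration of block-based coding: split an image into 8x8 blocks
    # and represent each block very coarsely (here, by its average value).
    # The visible block boundaries that result are the 'tiling' described above.

    def blockify(image, block=8):
        height, width = image.shape
        out = image.astype(float)   # astype returns a copy to work on
        for y in range(0, height, block):
            for x in range(0, width, block):
                tile = out[y:y + block, x:x + block]
                tile[:] = tile.mean()      # crush each block to a single value
        return out

    # A smooth synthetic gradient becomes visibly 'tiled' after blockification.
    gradient = np.linspace(0, 255, 64).reshape(1, 64).repeat(64, axis=0)
    blocky = blockify(gradient)
    print("Distinct values before:", len(np.unique(gradient)))   # 64
    print("Distinct values after: ", len(np.unique(blocky)))     # 8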

Blurring: This refers to the amount of reduced edge sharpness and spatial detail in the picture. A blurry picture is watery and ill-defined. 

Figure 3: A blurry image

Colour errors: This is where some, or all, of the colours in the picture do not appear true and correct compared with the colours of the scene being transmitted during the call.

Jerkiness: This is the distortion of smooth motion, where an object moves from A to B, not smoothly (as it should) but in one jump. In its most extreme form this causes the whole picture to freeze.

Object persistence: This is where a part of the picture has lagging images from previous frames as faded or outline images. This can cause a blurry looking image (as an arm is waved, for example), or can lead to parts of the picture becoming frozen (an arm stuck in mid-wave, while every other part of the picture moves on, for example).

Scale of impairments:

For the above video tests, each criterion regarding picture quality is scored on the following basis:

1: Imperceptible

2: Slightly perceptible

3: Noticeable

4: Constant

5: Disruptive

Following independent scoring, results are compared and the testers check that there are no wild anomalies between their perceptions of the quality of the call during each test.
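A minimal sketch of this kind of cross-check is given below; the two-point disagreement threshold is an assumption for illustration, not a figure taken from the evaluation process.

    # Minimal sketch of comparing two testers' impairment scores (1-5 scale)
    # and flagging any large disagreement for discussion. The threshold is an
    # assumption for illustration only.

    CRITERIA = ["lip sync", "block distortion", "blurring",
                "colour errors", "jerkiness", "object persistence"]

    def flag_anomalies(scores_a, scores_b, threshold=2):
        """Return the criteria where the two testers' scores differ widely."""
        return [c for c in CRITERIA if abs(scores_a[c] - scores_b[c]) >= threshold]

    tester_a = {"lip sync": 2, "block distortion": 3, "blurring": 2,
                "colour errors": 1, "jerkiness": 2, "object persistence": 1}
    tester_b = {"lip sync": 2, "block distortion": 5, "blurring": 2,
                "colour errors": 1, "jerkiness": 3, "object persistence": 1}

    print(flag_anomalies(tester_a, tester_b))   # ['block distortion']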

Data sharing tests

As well as the audio and video tests, informal data-sharing tests are carried out at each of the bandwidths (assuming this facility is available). H.239 is the standard for data-sharing within H.323, and various forms are available: sharing part of the screen, sharing the entire desktop, or sharing a particular window or application. For the purpose of these tests, data sharing is defined as displaying a picture, presentation, desktop or running program. It does not mean shared editing or multiple simultaneous interactive access to the same file.

Data sharing is attempted during a video call on both sets of PCs and at each of the bandwidths. This helps to ascertain whether or not the act of sharing data affects the quality of the audio or video during the call.

For the tests described above that involve data sharing, a presentation that changes slides every two seconds is shared while the audio and video are evaluated. In order to test something more demanding, a YouTube video is also shared, although this is not marked as part of the evaluation score.

The evaluations

During all the informal and formal tests, both testers note their impressions of the software: its usability, performance, any unique or impressive features, glaring omissions or faults, etc. These are included as part of the evaluations now available on the VTAS web site.

VTAS Desktop Videoconferencing Evaluations - Feature and Protocol Comparison Tables

Table 5.1:  Communication, video and audio standards

This table lists the protocols and standards supported by the software versions tested by VTAS evaluators. Some of the features are dependent on the PC hardware used meeting the minimum requirements.

The standards compared for each product are grouped as follows:

  • Communication standards/methods supported: H.323, SIP, H.239, FECC, or proprietary/undocumented (P)

  • Video encoding standards/methods: H.261, H.263, H.264, or proprietary/undocumented (P)

  • Audio encoding standards/methods: G.723.1, G.711A, AAC-LD, SIREN, G.722

  • Image formats supported: CIF, 4CIF, VGA, 720p, 1080p

Products covered: Skype 4.0; Mirial Softphone 7.0; VCON Vpoint HD Basic; PolyCom m100 [2]; IOCOM Visimeet [3].

FECC = Far End Camera Control; R = Receive only; P = Proprietary or undocumented

Table 5.2:  Features

This table lists the software features that were available in the versions tested by VTAS evaluators.

The features compared for each product are: full screen mode; contact list; presence awareness (P.A.)[4]; speed dials; call log; H.239 data sharing; proprietary data sharing; call recording; multi-way calls; in-call statistics; DTMF[5] tones; and Janet videoconferencing compatibility.

Products covered: Skype 4.0; Mirial Softphone 7.0 (proprietary data sharing takes the form of video sharing, and 3-way calls can be hosted); VCON Vpoint HD Basic; PolyCom m100; IOCOM Visimeet [6].

[1] International Telecommunication Union, Telecommunication Standardization Sector (ITU-T), Recommendation H.323: Packet-based multimedia communications systems

[2] Screen sharing only

[3] Testing on JANET still in progress

[4] P.A. – Presence awareness (knowing if contacts are busy or available)

[5] Dual-Tone Multi-Frequency signalling - this allows tones to be generated that can permit control of remote devices

[6] Does not quite fill the screen.
