Monday, 31 March 2014

Working 9-5, what a way to measure social behaviour outside the lab: Introducing Sociometric Sensors.

At the University of Lincoln, we have recently taken delivery of several Sociometric sensors!

Until very recently it was impossible to record social signalling in natural settings with a high level of precision over time. Human observers have been employed in the past, but their observations are inherently subjective, expensive to collect and often inaccurate. However, advances in electronic measurement sensors, reductions in battery size and developments in computational data analysis have made it possible to measure what was previously deemed invisible. As a result, there is a growing body of evidence suggesting that social signalling plays a significant role in everyday persuasion and decision making, with applications extending to the analysis and redesign of organisational networks.

A Sociometric sensor (Figure 1) is a small electronic device worn around the neck. It measures a variety of individual and interpersonal behaviours during a social interaction by way of four sensors: a microphone records speech; an accelerometer measures the degree and direction of nonverbal movement (Figure 2); a Bluetooth transmitter measures the proximity of sensors; and an infrared transmitter detects when two sensors (and therefore participants) are facing one another. Once activated, the sensors record these attributes on an ongoing basis at a minimum of once per second (depending on the attribute and the sensor settings). These recordings are time synchronised, so it is possible to compare behaviour across sensors, facilitating the analysis of social interactions over time (Figures 3 & 4). This can even be extended to include Social Network Analysis (a related Facebook example can be found here).
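To give a flavour of what can be done with time-synchronised recordings, here is a minimal sketch (in Python) of the kind of simple turn-taking analysis mentioned in Figure 3. It is purely illustrative and makes simplifying assumptions: each participant's speech is reduced to a once-per-second on/off signal, and a "turn" is counted whenever one participant holds the floor alone for a run of consecutive seconds. The function name and data format are hypothetical, not part of the Sociometric software.

```python
from itertools import groupby

def speaking_turns(speech):
    """Count speaking turns per participant.

    speech: dict mapping participant id -> list of booleans,
    one entry per second, True when that participant is speaking.
    (Hypothetical format: real sensor exports are richer than this.)
    """
    n_seconds = len(next(iter(speech.values())))
    # For each second, identify the sole active speaker;
    # silence and overlapping speech are recorded as None here.
    timeline = []
    for t in range(n_seconds):
        active = [p for p, s in speech.items() if s[t]]
        timeline.append(active[0] if len(active) == 1 else None)
    # Collapse consecutive seconds by the same speaker into single turns.
    turns = {}
    for speaker, _ in groupby(timeline):
        if speaker is not None:
            turns[speaker] = turns.get(speaker, 0) + 1
    return turns
```

For example, if participant A speaks for two seconds, B for two, then A again for one, A has taken two turns and B one. A real analysis would also need to handle overlaps, back-channels ("mm-hm") and brief pauses within a turn, which this sketch deliberately ignores.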

Figure 1: A Sociometric sensor

Figure 2: Body movement as measured by a Sociometric sensor across a typical Friday

Figure 3: Illustrating differing levels of social interaction between group members - this was calculated using a simple turn-taking analysis algorithm

Figure 4: Speech participation over time between group members (stacked)