Introduction
In a Tech. Brief post in October 2010 – Sensor Delivery Platforms – we briefly covered several methods of deploying sensors in an area under surveillance, and mentioned head/body-mounted cameras as one option. In this post we discuss a specific need of tactical response teams: hands-free sensor data and augmented reality inputs during tactical operations.
Functions of a Hands-Free Display System
The main functional requirements of tactical response teams, with respect to a hands-free display system, are as follows:
- In-situ Sensor Data
The hands-free display system should capture sensor data and make it available to all team members. Typically, a helmet- or body-mounted camera (optical or thermal) and a microphone provide the in-situ sensor data, allowing the team to act on real-time information.
- Display of Additional Sensor Data
A tactical response team also requires sensor data beyond what the team itself generates, and the hands-free display system is expected to deliver such data. For example, in an outdoor operation, the team can receive the feed from a UAV flying overhead.
- Augmented Reality Information
Beyond sensor data, whether generated by the team or received from another source, the tactical response team will also look for additional information about the situation at hand. Such Augmented Reality (AR) information provides a deeper understanding of the situation. A scenario involving an assault on a building, for example, offers the opportunity to locate sensor data on a layout of the building and send this augmented view to the team. More sophisticated interpretations can be derived, depending on the data available.
- Integration with Command & Control System
The hands-free display system has to integrate with a back-end Command & Control (C&C) application that records the data from in-situ sensors, provides data from other sensors, supplies interpretations and AR information for the situation, and allows discussions and decisions to flow in both directions, all without impairing the fighting capability of the tactical assault team. A rough sketch of the kind of message such an exchange might carry appears after this list.
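As a rough illustration of that last point, the sketch below shows one hypothetical message format a team member's device might exchange with a C&C back-end: an in-situ sensor reading bundled with AR annotations placed on a building layout. The class names, fields, and JSON encoding are assumptions made for illustration, not a description of any particular product.

```python
# Illustrative sketch only: a hypothetical message format for exchanging
# in-situ sensor data and AR annotations with a C&C back-end.
from dataclasses import dataclass, field, asdict
from typing import List
import json
import time


@dataclass
class SensorReading:
    sensor_id: str      # e.g. helmet camera, microphone, trace detector
    sensor_type: str    # "optical", "thermal", "audio", ...
    timestamp: float    # capture time, seconds since epoch
    payload_ref: str    # reference to the actual frame/clip in the stream


@dataclass
class ARAnnotation:
    label: str          # e.g. "entry point", "suspect last seen"
    floor: int          # floor of the building layout
    x: float            # position on the layout, metres from origin
    y: float


@dataclass
class TeamMessage:
    team_member: str
    reading: SensorReading
    annotations: List[ARAnnotation] = field(default_factory=list)

    def to_json(self) -> str:
        """Serialise for transmission to/from the C&C application."""
        return json.dumps(asdict(self))


if __name__ == "__main__":
    msg = TeamMessage(
        team_member="alpha-2",
        reading=SensorReading("helmet-cam-1", "thermal", time.time(),
                              "stream://alpha-2/cam1"),
        annotations=[ARAnnotation("entry point", floor=2, x=12.5, y=4.0)],
    )
    print(msg.to_json())
```

Running the example prints a single JSON message of the kind a C&C application could log and redistribute to other team members.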
Components of a Hands-Free Display System
The components of a Hands-Free Display System are the sensors, the display, the local computing device, the transmission sub-system, and the back-end C&C system.
- Sensors:
The sensors used by tactical response teams range from traditional video and audio sensors, in a compact form factor, to more esoteric types such as trace detectors and pressure sensors. The sensors can either be worn or positioned a short distance away from the user. Communication between the sensors and the system uses the Zigbee or Bluetooth standards; a sketch of this sensor link appears after this list.
- Display:
Traditionally, hands-free displays were head/helmet-mounted devices that occluded the vision of the eye in front of which they were positioned. With advances in display technology, however, hands-free displays are available today in arm-mounted and non-occluding head/helmet-mounted forms.
Today’s head/helmet-mounted displays project a virtual image of 15-21 inches (38.1-53.3 cm) in diagonal, in 24-bit full colour, without obstructing the user’s vision. The display image is typically focused at infinity for ease of viewing.
- Computing Device:
The Hands-Free Display System’s computing device uses a powerful DSP (Digital Signal Processor) engine for capturing and transmitting the video feed, as well as for post-capture image processing. It runs an embedded OS, such as an embedded flavour of Linux or MS Windows, and is optimized for low power consumption. The device is carried by the user, usually in a pocket or as part of the headgear, close to the display and the earpiece-cum-microphone, with which it communicates over Bluetooth.
- Transmission Network:
The transmission of data between the tactical assault team and the C&C centre can run over one of many wireless solutions: Wi-Fi, proprietary Wi-Fi, or other OFDM/COFDM implementations. A sketch of this relay path appears after this list.
- Command & Control:
Please refer to the Tech. Brief post of May 2010 – C4ISR for Homeland Security – for further details on C&C solutions for homeland security.
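Returning to the sensor link mentioned above: the short sketch below packs and unpacks a compact sensor frame of the kind that might travel over a Zigbee- or Bluetooth-class link between a worn sensor and the computing device. The frame layout (sensor ID, type, timestamp, reading) is an illustrative assumption, not a published sensor profile.

```python
# Illustrative sketch only: packing/unpacking a compact sensor frame as it
# might arrive over a Zigbee- or Bluetooth-style link. The frame layout is
# an assumption, not a published sensor profile.
import struct

# Hypothetical frame: 1-byte sensor ID, 1-byte sensor type, 4-byte timestamp,
# 2-byte reading (e.g. a pressure or trace-detector value), big-endian.
FRAME_FORMAT = ">BBIH"
FRAME_SIZE = struct.calcsize(FRAME_FORMAT)   # 8 bytes


def pack_frame(sensor_id: int, sensor_type: int, timestamp: int, value: int) -> bytes:
    """Build a frame on the sensor side before it is sent over the radio link."""
    return struct.pack(FRAME_FORMAT, sensor_id, sensor_type, timestamp, value)


def unpack_frame(frame: bytes) -> dict:
    """Decode a received frame on the wearable computing device."""
    sensor_id, sensor_type, timestamp, value = struct.unpack(FRAME_FORMAT, frame)
    return {"sensor_id": sensor_id, "type": sensor_type,
            "timestamp": timestamp, "value": value}


if __name__ == "__main__":
    raw = pack_frame(sensor_id=3, sensor_type=1, timestamp=1_700_000_000, value=512)
    print(unpack_frame(raw))
```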
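And for the transmission network: a minimal sketch, under the assumption of a plain UDP-over-Wi-Fi link, of how the computing device might relay telemetry towards the C&C centre. The host address, port, and payload fields are placeholders; a fielded system would add encryption, acknowledgement, and bandwidth management.

```python
# Illustrative sketch only: relaying captured sensor data from the wearable
# computing device to the C&C centre over an IP link (UDP over Wi-Fi here).
# Host, port, and payload fields are placeholder assumptions.
import json
import socket
import time

CC_HOST = "192.0.2.10"   # placeholder C&C address (TEST-NET range)
CC_PORT = 5005           # placeholder port


def send_telemetry(sock: socket.socket, sensor_id: str, value: float) -> None:
    """Send one JSON-encoded telemetry datagram towards the C&C back-end."""
    datagram = json.dumps({
        "sensor_id": sensor_id,
        "timestamp": time.time(),
        "value": value,
    }).encode("utf-8")
    sock.sendto(datagram, (CC_HOST, CC_PORT))


if __name__ == "__main__":
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    # Example: report a few thermal-camera hotspot readings, paced one per second.
    for reading in (36.5, 41.2, 58.9):
        send_telemetry(sock, "helmet-thermal-1", reading)
        time.sleep(1.0)
    sock.close()
```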
Conclusion
Hands-Free Display Systems add another dimension to the efficacy of tactical assault teams. The ability to capture in-situ sensor data, fuse it with data from other sensors, augment it with other situational information, and leverage this composite picture to better plan and execute the operation is on the wish list of any field commander.
Mistral Solutions’ C4ISR solutions, including its MC3S (Mobile Command, Control, & Communications System), can integrate Hands-Free Display Systems from third-party vendors and serve as the back-end C&C engine for these systems.