CogPit is a platform for Augmented Cognition (AugCog) research developed by BMH. The AugCog project is a DARPA effort in its fourth year of a five-year program. Development of CogPit started in mid-March 2005.
As computational interfaces have become more prevalent in society and increasingly complex in the volume and type of information they present, researchers have investigated novel ways to detect cognitive bottlenecks and devise strategies to aid users and improve their performance. Augmented cognition research includes the study of methods for addressing these bottlenecks via technologies that assess the user's cognitive status in real time. In the early 1960s, researchers speculated that electrical signals emanating from the human brain, in the form of electroencephalographic (EEG) recordings, could serve as indicators of specific events in human cognitive processing. The following outlines significant progress from those speculations.
As the following pictures show, the BMH/NAVAIR CogPit effort has culminated in a complete closed-loop simulation that uses cognitive brain activity to trigger pilot aiding during periods of high workload.
The simulation is very robust, with complete flight instrumentation, a Radar Warning Receiver (RWR) to detect surface-to-air missiles (SAMs), a chaff implementation to defeat SAM radars, and a Tactical Situation Display to show threats and route waypoints. The most impressive component, the Targeting Pod, can locate, track, and destroy targets of opportunity.
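The closed loop described above can be sketched in a few lines: a real-time cognitive gauge drives an on/off pilot-aiding decision. The 0–1 workload index and the two thresholds below are illustrative assumptions for this sketch, not CogPit's actual classifier output or trigger logic.

```python
def update_aiding(workload_index, aiding_on,
                  on_threshold=0.8, off_threshold=0.6):
    """Decide whether pilot aiding should be active this cycle.

    workload_index -- hypothetical 0-1 cognitive workload gauge
                      (CogPit's real gauge may be scaled differently).
    Hysteresis (separate on/off thresholds) keeps the aiding from
    flickering when the index hovers near a single cutoff.
    """
    if not aiding_on and workload_index >= on_threshold:
        return True   # workload climbed high: engage aiding
    if aiding_on and workload_index <= off_threshold:
        return False  # workload dropped well below: disengage
    return aiding_on  # otherwise hold the current state
```

A decision function of this shape would be called once per simulation cycle with the latest gauge value from the cognitive-state server.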
Knowledge acquisition was performed in the initial stages of the project to understand the many separate components of the system. From the findings, it was quickly determined that the project should use open-source software for the flight simulation. After extended analysis, FlightGear was chosen for the flight model, Delta3D and OpenSceneGraph for image generation, and JSAF for scenario generation and "ghosted" simulation-object weapons. The marriage of these technologies required several complex integrations with the Agile FOM Interface (AFI). Essentially, all of the HLA communications for the Out-the-Window view, the Targeting Pod (camera), the Tactical Situation Display, the Radar Warning Receiver, the Cogserver client, and other components had to be designed.
Below is a diagram that helps explain the interaction of these technologies.
The following image shows the targeting pod in action. This system is where most of the Delta3D work occurred.
Hardware integration used four Dell Inspiron 9300 laptops running Fedora Core 3 Linux, two 15-inch ELO touch panels (one VGA, one DVI), two 12-inch ELO touch panels, Thrustmaster Cougar HOTAS controls, and a Hitachi CPX 445 projector. Some modifications to the Linux drivers were required to run the touch-screen panels, and a UDP interface plus a limited HLA interface were built for the flight controls.
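The UDP side of the flight-control link can be illustrated with a short sketch. The packet layout below (four little-endian 32-bit floats for pitch, roll, yaw, and throttle) is a hypothetical example chosen for illustration, not the actual wire format used in CogPit.

```python
import socket
import struct

# Hypothetical packet layout: four little-endian 32-bit floats
# (pitch, roll, yaw, throttle).  The real CogPit format may differ.
PACKET_FORMAT = "<4f"

def pack_controls(pitch, roll, yaw, throttle):
    """Serialize one flight-control sample into a UDP payload."""
    return struct.pack(PACKET_FORMAT, pitch, roll, yaw, throttle)

def unpack_controls(payload):
    """Deserialize a payload back into (pitch, roll, yaw, throttle)."""
    return struct.unpack(PACKET_FORMAT, payload)

def send_controls(sock, addr, pitch, roll, yaw, throttle):
    """Fire-and-forget send.  UDP suits high-rate control samples:
    a fresh packet is worth more than a retransmitted stale one."""
    sock.sendto(pack_controls(pitch, roll, yaw, throttle), addr)
```

On the receiving end, the flight model would simply `recvfrom` the socket each frame and apply the most recent sample, discarding any backlog.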
Aside from the intricate integration of the AFI, there were other notable BMH efforts on this project:
- Creation of a JSAF GUI to transfer and identify waypoints over HLA for presentation in the simulator
- Interpretation of FlightGear and JSAF HLA attributes and parameters, including:
  - Flight model kinematics
  - Weapon kinematics
  - RWR detections
- Complete Targeting Pod concept and integration: the ability to cue to a waypoint, slew, track stationary targets, designate, and receive weapon-release cues
- Rapid prototyping and development of cockpit user interface displays using DiSTI's GL Studio
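The kinematics attributes listed above are typically consumed between HLA updates by dead reckoning: extrapolating each entity's position from its last reported state so displays stay smooth. The first-order sketch below is illustrative only, and the bearing helper assumes a flat x/y plane; it is not CogPit's exact algorithm.

```python
import math

def dead_reckon(position, velocity, dt):
    """First-order dead reckoning: extrapolate an entity's position
    from its last reported (x, y, z) position and velocity until the
    next HLA attribute update arrives."""
    return tuple(p + v * dt for p, v in zip(position, velocity))

def threat_bearing(ownship, threat):
    """Angle in degrees, counterclockwise from the +x axis, from
    ownship to a threat -- the kind of relative geometry an RWR
    display needs.  Flat x/y plane assumed for illustration."""
    dx = threat[0] - ownship[0]
    dy = threat[1] - ownship[1]
    return math.degrees(math.atan2(dy, dx)) % 360.0
```

Between updates the simulator would call `dead_reckon` each frame with the elapsed time since the last received attribute set, then snap to the true state when the next update arrives.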
NAVAIR will run trials on eight subjects and present the data to DARPA. BMH will continue to provide NAVAIR with updates to facilitate future experimentation.
A more in-depth overview of augmented cognition is available on the Augmented Cognition website.