2017 International Conference on Mobile Brain-Body Imaging and the Neuroscience of Art, Innovation and Creativity.

From the 10th to the 13th of September 2017, Andreas Wulff-Jensen and Luis Emilio Bruni attended the 2017 International Conference on Mobile Brain-Body Imaging and the Neuroscience of Art, Innovation and Creativity.

The conference was about bridging and investigating the gap between art, neuroscience and technology, an endeavor of great interest to the lab. Many new connections and acquaintances were made during the conference, all of which can be found here.

During the conference Andreas took part in the Brain on Art hackathon, which aimed to create BCI prototypes for artistic expression. Andreas and his team (see the picture below) worked on the challenge of creating a prototype that uses brainwaves to create paintings.

Andreas and his hackathon team, from left to right: Luis Mercado, Andreas Wulff-Jensen, Adam Lopez.
The prototype segments EEG signals into different frequency bands and associates them with different properties in the application (see the sketch after the list below):

– The amplitude of the Delta signal (0.5-3Hz) was associated with the green color channel.
– The amplitude of the Theta signal (3-7Hz) was associated with the X coordinate on the screen.
– The amplitude of the Alpha signal (7-13Hz) was associated with the red color channel.
– The amplitude of the Beta signal (13-24Hz) was associated with the blue color channel.
– Lastly, the amplitude of the Gamma signal (24-42Hz) was associated with the Y coordinate on the screen.
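As a rough illustration of these rules, the sketch below maps normalized band amplitudes to an RGB color and a screen position. It is not the team's actual implementation; the function name, the 0-1 normalization of the amplitudes and the screen dimensions are assumptions made for the example.

# A minimal sketch (an assumption, not the team's implementation) of the
# band-to-paint mapping described above. Band amplitudes are assumed to be
# pre-computed and normalized to the 0-1 range.

def bands_to_stroke(delta, theta, alpha, beta, gamma,
                    screen_width=1280, screen_height=720):
    """Map normalized EEG band amplitudes to an RGB color and a screen position."""
    color = (
        int(alpha * 255),   # red channel   <- Alpha (7-13 Hz)
        int(delta * 255),   # green channel <- Delta (0.5-3 Hz)
        int(beta * 255),    # blue channel  <- Beta (13-24 Hz)
    )
    x = int(theta * screen_width)    # X position <- Theta (3-7 Hz)
    y = int(gamma * screen_height)   # Y position <- Gamma (24-42 Hz)
    return color, (x, y)

# Example: mid-range amplitudes give a grey-ish stroke near the screen center.
print(bands_to_stroke(0.5, 0.5, 0.5, 0.5, 0.5))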

The color associations were based on a preliminary ERP test in which the subject saw 30 blue squares, 30 red squares and 30 green squares. This short test was run on two persons. The preliminary results showed higher alpha amplitudes for red than for green and blue, higher beta amplitudes for blue than for green and red, and higher delta amplitudes for green than for blue and red.
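The sketch below outlines how such a comparison of band power across color conditions could be computed. The sampling rate, the band edges and the randomly generated epochs are placeholders for illustration, not data or code from the actual test.

# A rough sketch of the kind of band-power comparison behind the color
# associations: mean power per frequency band for each color condition.
import numpy as np
from scipy.signal import welch

FS = 256  # assumed sampling rate in Hz
BANDS = {"delta": (0.5, 3), "theta": (3, 7), "alpha": (7, 13),
         "beta": (13, 24), "gamma": (24, 42)}

def band_power(epochs, low, high, fs=FS):
    """Mean power in [low, high) Hz across epochs (epochs: n_epochs x n_samples)."""
    freqs, psd = welch(epochs, fs=fs, nperseg=fs, axis=-1)
    mask = (freqs >= low) & (freqs < high)
    return psd[:, mask].mean()

# Random data stands in for the 30 one-color epochs per condition.
rng = np.random.default_rng(0)
conditions = {c: rng.standard_normal((30, 2 * FS)) for c in ("red", "green", "blue")}
for band, (lo, hi) in BANDS.items():
    powers = {c: round(band_power(ep, lo, hi), 4) for c, ep in conditions.items()}
    print(band, powers)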

Theta and Gamma were arbitrarily mapped to the X and Y positions on the screen.
With these rules the following application was created.

With this application Andreas and his team won the award for second-best artistic prototype and a prize of 300 euros.

IEEE Brain Awards Certificate for the Second Best Artistic Prototype at Brain on Art Hackathon


Augmented Cognition Lab

The Augmented Cognition Lab is a research group, gathered around a lab facility at Aalborg University Copenhagen, dedicated to the study of perception, cognition, affective states and aesthetic experience in relation to multimodal media and cognitive technologies (including immersive and representational interactive displays, music communication, and multimodal digital applications).  The lab is well equipped technologically and methodologically to investigate such processes when experiencing and interacting with complex stimuli and narrative content. The lab is directed by associate professors Luis Emilio Bruni and Sofia Dahl.

Through interdisciplinary research the Augmented Cognition Lab intends to:

  • Monitor user experience by developing rigorous methodologies to improve the integration of first-person assessments of subjective experience with third person observational and bio-behavioral data obtained by a variety of methods.
  • Integrate psychophysiological and augmented cognition techniques, BCI and sensor technology with immersive-interactive applications as interfacing or monitoring devices.
  • Prospect and promote sustainable and novel applications of digital cognitive technologies in education, health, art, culture, ecology and particularly assistive technologies for people with disabilities or special needs.
  • Study and assess the cognitive limits, the effects and the cultural implications of technological acceleration in digital culture with focus on sustainability.

Research areas have included:

  • Affect and emotional responses to multimodal media (e.g. aesthetic judgements and emotional responses to music or visual art; assessing surprise and suspense with EEG).
  • Psychophysiology and media cognition (e.g. evoked brain potentials in interaction with an audio-visual game; monitoring heart rate in interaction with immersive environments).
  • Sustainable and socially responsible cognitive technologies in education, culture, and assistive systems (e.g. interactive narrative for audiologist-children relations; eye-tracking patterns of students learning programming; cognitive load and attention in relation to digital media).
  • Embodied cognition (e.g. body movement and psychophysiological responses to musical rhythm).
  • Integration of biophysiological measurements with qualitative subjective methods (e.g. integration of EEG and eye-tracking with ethnographic approaches in art perception).