11 minute read

EmoEngine: Platform for Affective Computing

A project where I created a full synchronization back-end server that can handle any number of sensor devices and connect them to any type of application, including an FPS game. In this project I also reverse engineered a psychophysiological recording device that was not meant for real-time use.

Showcases skills: Java, server programming, device drivers, reverse engineering, signal fusion.
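
To give a feel for the architecture, here is a minimal Java sketch of the fan-out pattern such a synchronization back end is built around. It is only an illustration of the general idea, not the actual EmoEngine code, and all class and method names here are made up for the example.

```java
import java.util.List;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.CopyOnWriteArrayList;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.function.Consumer;

/**
 * Sketch of a sensor-fusion hub: device-driver threads push timestamped samples in,
 * and the hub fans them out to any number of subscribed applications (a game, a logger, ...).
 * Hypothetical names; not the actual EmoEngine implementation.
 */
public class SensorHub {

    /** One timestamped sample from a named device channel. */
    public record Sample(String device, String channel, long timestampMs, double value) {}

    private final BlockingQueue<Sample> queue = new LinkedBlockingQueue<>();
    private final List<Consumer<Sample>> subscribers = new CopyOnWriteArrayList<>();

    /** Called by device-driver threads (e.g. a heart-rate belt or an EDA sensor). */
    public void publish(Sample sample) {
        queue.add(sample);
    }

    /** Called by application clients that want the fused, timestamped stream. */
    public void subscribe(Consumer<Sample> subscriber) {
        subscribers.add(subscriber);
    }

    /** A single dispatcher loop keeps samples in arrival order for every subscriber. */
    public void run() throws InterruptedException {
        while (!Thread.currentThread().isInterrupted()) {
            Sample sample = queue.take();
            for (Consumer<Sample> subscriber : subscribers) {
                subscriber.accept(sample);
            }
        }
    }
}
```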

Trisolde: Biocybernetic Opera

Beginnings

I’ve been obsessed with computers from a very early age. I got my first computer, a Commodore VIC-20, at the age of 5, and the first words I learned to read and write were PRINT and GOTO as I painstakingly copied, character by character, the source code of games into the BASIC interpreter of the VIC-20 (back then you could buy games as books).

The BBS, MUD etc.

Senior Server Programmer

From scratch in C: data structures and memory recycling; sysadmin, customers, SQL, etc.

Musical Background

Scientists, even data scientists who are working with empirical data and statistical methods, need creativity. One way I express mine is through music…

Teaching

I’ve lectured on research methodologies and experimental design at the graduate level both in Finland (University of Helsinki) and in Estonia (Tallinn University). I’ve also planned and organized a whole seminar focusing on doing user studies in HCI.

I’ve supervised students at all levels from bachelor’s to PhD, both in Finland and Estonia. I’ve supervised students from all over the world, from Japan to Turkey and from Spain to Ukraine.

I designed and ran a semester-long course on physiological and affective computing where students with various backgrounds were, by the end, able to create working real-time adaptive applications, ranging from BCI-enabled VR to an interactive training tool for public speaking that adapted the size of the audience in a virtual auditorium based on the speaker’s arousal. One of the students in the group was also working as a school teacher and collected a set of movie clips of her students moving in and out of the classroom to create a montage machine.

I also ran a full-semester AI/ML course. In addition to giving a full series of lectures, I ran a set of labs where the students were divided into small groups and asked to pick a Kaggle challenge and do their best to solve it. With two TAs (whom I more or less personally recruited, including my Japanese Master’s student and a friend of his who, I believe, was studying at another technological university), we shepherded the dozen student groups through these complex data science competitions.

I took part in organizing multiple workshops on different topics. At the XX we had several design conferences (vlad). These included week-long workshops on enactive cinema at Politecnico di Milano, as well as at ITMO in St. Petersburg.

Academic

I acted as a program chair for short papers at NordiCHI 2020, organizing the efforts of ACs in reviewing hundreds of submissions in themes of x. I’ve also acted as an AC for the main HCI conference, CHI. In addition, I’ve reviewed papers for a number of journals and conferences.

I’ve been part of multiple large-scale EU research projects (FUGA, CEEDs, Mindsee, tikka, …) as well as a nationally funded one (emokeitai). I spent 12 fruitful years at the top Finnish research organization HIIT, half of it at the Helsinki University of Technology, which was later integrated into Aalto University, and half at the largest and most prestigious university in Finland, the University of Helsinki, where I also did my Bachelor’s, Master’s, and PhD.

I’ve published nearly 50 articles on topics around physiological and affective computing, ubiquitous interaction, machine learning, …

Eye-trackers

I’ve worked with eye-trackers from all the main manufacturers. With Tobii we implemented x with their floor-standing eye-tracker x. In an interesting study I did in collaboration with x (who also did his thesis on the topic), we simultaneously tracked two people interacting in a video call while playing x poker, with an SMI tracker on one side and a Tobii on the other.

The affordable Mirametrix eye-tracker was a workhorse of multiple projects due to its relatively low price (we could give it to students) as well as its open API. Examples include:

I also spent a lot of time with the Tobii mobile eye-tracker and convinced our university to collaborate with iMotions to integrate it with physiological sensors. I have also worked with the SMI glasses.

The small new eyet.

Physiological recording devices

The first device I started my psychophysiology work with (back in 2006) was the venerable Varioport mobile recording device, with which I’ve developed a strong love/hate relationship over the years. The Varioport was meant to be used with an internal memory card and only had real-time output for very basic monitoring (an MS-DOS based interface! That is, a command-prompt-style interface, for those who don’t remember the times before Windows). To make use of the Varioport I had to reverse engineer the serial port protocol the device used to communicate with this DOS-based interface, which was no easy feat, as the protocol was a confusing mix of ASCII text, hexadecimal numbers, and binary fields. In the end I was able to write my own “device driver” for the Varioport and integrate it into my own sensor-fusion, real-time application-serving framework, which I called EmoEngine. At this point I moved away from pure C (as I was mostly working in Windows) and wrote it all in Java.
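
To give a flavour of what that kind of protocol work looks like in practice, here is a small Java sketch of a frame parser for a hypothetical mixed-format protocol. The field layout below is purely illustrative, not the actual Varioport format: an ASCII channel id, a hexadecimal length field, and big-endian binary samples, all peeled off the same byte stream.

```java
import java.io.DataInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;

/** Minimal sketch of a parser for a hypothetical mixed ASCII/hex/binary frame format. */
public class FrameParser {

    /** One parsed frame: a channel id and its raw sample values. */
    public record Frame(String channelId, int[] samples) {}

    /**
     * Reads one frame from the stream. Assumed (illustrative) layout:
     *   4 ASCII chars  - channel id, e.g. "EDA1"
     *   4 ASCII chars  - sample count as a hex string, e.g. "0010"
     *   N x 2 bytes    - big-endian 16-bit samples
     */
    public static Frame readFrame(InputStream in) throws IOException {
        DataInputStream data = new DataInputStream(in);

        byte[] header = new byte[8];
        data.readFully(header);

        String channelId = new String(header, 0, 4, StandardCharsets.US_ASCII);
        int count = Integer.parseInt(new String(header, 4, 4, StandardCharsets.US_ASCII), 16);

        int[] samples = new int[count];
        for (int i = 0; i < count; i++) {
            samples[i] = data.readUnsignedShort(); // big-endian 16-bit value
        }
        return new Frame(channelId, samples);
    }
}
```

The real driver also had to resynchronize after corrupted frames and juggle channels with different sampling rates, but the basic pattern of reading ASCII, hex, and binary fields off one stream is the same.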

In the end I integrated a wide range of devices into the EmoEngine, perhaps best illustrated by a project where I set up a whole cloud-based mobile biosignal sonification system using Nokia Maemo phones running Linux, plus what passed for a cloud service back then, before the cloud was even a thing. It worked smoothly: I could, in real time, choose to sonify, in different ways ranging from simple beeps to synthesized music, the heart rate or any other physiological signal of a colleague who might be located on another continent. This was quite revolutionary back in 2009 when it was built.
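
As an illustration of the simplest end of that sonification spectrum, here is a toy Java example (not the original Maemo/EmoEngine code) that maps incoming heart-rate values to MIDI pitches using the standard javax.sound.midi synthesizer, so a rising heart rate is heard as rising tones. The value range and the mapping are assumptions made for the example.

```java
import javax.sound.midi.MidiChannel;
import javax.sound.midi.MidiSystem;
import javax.sound.midi.Synthesizer;

/** Toy sonification: map heart-rate values (beats per minute) to MIDI pitches. */
public class HeartRateSonifier {

    public static void main(String[] args) throws Exception {
        // Stand-in for a real-time stream of heart-rate values from a remote sensor.
        int[] heartRates = {62, 64, 70, 78, 90, 104, 96, 80, 68};

        Synthesizer synth = MidiSystem.getSynthesizer();
        synth.open();
        MidiChannel channel = synth.getChannels()[0];

        for (int bpm : heartRates) {
            // Map an assumed 40-180 bpm range onto roughly two octaves above middle C.
            int note = 60 + (int) Math.round((bpm - 40) / 140.0 * 24);
            channel.noteOn(note, 80);   // start the tone
            Thread.sleep(400);          // hold it briefly
            channel.noteOff(note);
        }
        synth.close();
    }
}
```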

I’ve also worked with a wide range of “laboratory” equipment such as BioPac, X, Y, and Z. However, my main expertise has been in utilizing all sorts of wearable devices that may or may not have been meant for such use. In one national project we collaborated with the Finnish company Polar, and I got access to a number of their ECG chest bands, which I used in multiple demos and prototypes.

The first signal I worked with was skin conductance, sometimes called electrodermal activity (EDA) and sometimes galvanic skin response (GSR). I measured it with the Varioport device, but also with BioPac and X. I’ve used different wrist-worn sensors, including the Empatica and the custom wearable C. Moreover, I spent time exploring a ring…

eeg headset dancers

I have also worked a lot with Arduino and Raspberry Pi based setups. In several projects I used the XXX, including the biocybernetic opera Trisolde, where we measured the skin conductance of multiple audience members and used it in real time to modulate the music. It was also one of the devices we used extensively in our workshops and courses, where students invented all kinds of exciting applications. Another workhorse device that I’ve helped to integrate into dozens of real-time adaptive applications is the Muse headset (back before it was monetized…). For student work I also explored the Emotiv and NeuroSky headsets. I was even able to work with a custom headset combining simultaneous recording of EEG and fNIRS signals (check web link)
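
The control logic in that kind of setup can be surprisingly small. The sketch below shows the general pattern in Java (illustrative only, not the Trisolde production code): exponentially smooth the raw skin-conductance samples, normalize them against the range observed so far, and expose a 0–1 value that can drive a musical or visual parameter.

```java
/**
 * Sketch of the control side of a biocybernetic loop: smooth raw skin-conductance
 * samples and turn them into a 0..1 value that could drive, say, a musical parameter.
 * The constants and names are illustrative, not taken from any production code.
 */
public class ArousalController {

    private double smoothed = Double.NaN; // exponentially smoothed signal level
    private double min = Double.MAX_VALUE;
    private double max = -Double.MAX_VALUE;
    private final double alpha;           // smoothing factor, 0 < alpha <= 1

    public ArousalController(double alpha) {
        this.alpha = alpha;
    }

    /** Feed one raw sample (e.g. in microsiemens); returns a control value in [0, 1]. */
    public double update(double rawSample) {
        smoothed = Double.isNaN(smoothed)
                ? rawSample
                : alpha * rawSample + (1 - alpha) * smoothed;

        // Track the observed range so the output adapts to each person and session.
        min = Math.min(min, smoothed);
        max = Math.max(max, smoothed);
        if (max - min < 1e-9) {
            return 0.0; // not enough variation seen yet
        }
        return (smoothed - min) / (max - min);
    }
}
```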

EEG

Muse, g.tec, NeuroSky, Emotiv


kinect blobo

Classical statistics

I did the statistical analysis and reporting in all the studies I was part of, and learned from the best how, in practice, to analyze multivariable, hierarchical time-series data that had just about every complication a dataset can statistically have. The data were very rarely i.i.d., rarely had homogeneous variance, and rarely fulfilled any of the other usual assumptions that basic ANOVAs and the like require.

As a computer scientist with a mathematical background I approached statistics more methodically: I never just memorized the endless list of ANOVAs, ANCOVAs, and other such approaches, but systematically studied how the underlying linear models work. This allowed me to pick the most suitable model using the more generic methods, such as linear mixed models in SPSS or lmer and its kin in R. In my Master’s thesis I made a thorough catalogue of how these methods are linked and when each is needed.
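
To make the “underlying linear model” point concrete: the ANOVAs, ANCOVAs, and mixed models mentioned above are all special cases of the same textbook form (written here in standard notation, not quoted from the thesis):

$$ y = X\beta + Zu + \varepsilon, \qquad u \sim \mathcal{N}(0, G), \qquad \varepsilon \sim \mathcal{N}(0, R) $$

Here Xβ collects the fixed effects, Zu the random effects (for example per-participant intercepts in a repeated-measures design), and the covariance structures G and R are what separate a plain ANOVA from the hierarchical, repeated-measures models that real psychophysiological data require.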

Programming Background

Novice guitarists memorize chord patterns for how to play an A minor or a C7 chord. More advanced guitarists understand enough music theory to be able (even unconsciously) to list the notes needed for any theoretically possible chord and to find them on the fretboard. Similarly, only novice programmers fixate on a list of programming languages to learn, while the more experienced ones know that in the end there are certain programming paradigms, such as OOP and functional programming, and much of the difference between languages is just syntactical. Of course languages have pros and cons, but in the end a professional developer can pick up any language in a reasonably short time. That is why I feel a bit reluctant to classify myself as a C or Python programmer. As a prime example, I was once “borrowed” by a research department to help implement a complex psychophysiological recording setup and integrate it into a computer game (which acted as the stimulus). Only when sitting down in their lab did I figure out that the game in question was written in Turbo Pascal, and I had never worked with Pascal. It took me a few hours to study the Pascal syntax, and I was able to successfully complete the project and become a “professional Pascal developer”.

Of course, each language has its quirks, and working for a long time with a specific language makes you faster and more efficient, so I’ll describe here my experience with various languages.

Technically the first language I worked with was BASIC, on the VIC-20 and Commodore 64 systems I used to copy games from source code listings before I could even read and write a human language. However, as I was only 5, this “programming” was not much more than copying.

I truly started to program for real during my teenage years, when I became an implementor of a mid-sized MUD (multi-user dungeon) in the mid-90s and spent a couple of years writing C code to enhance this multi-user game engine. I even managed to install an early Linux system on my 486/Pentium boxes to develop locally, which was not a simple task back in ’95. As the head implementor I also organized and coordinated a team of a dozen developers around the globe. This was before tools like Git(Hub), and even before user-friendly monolithic version control systems like Subversion.

After starting my university studies I was instantly hired into a startup company where I could utilize my C server programming skills in an industrial setting and at industrial scale.

I have also worked with C++ when necessary. In one interesting project we wrote a Firefox plugin in C++ that used the torrent protocol to load video files in an intelligent order, so that you could start watching a movie before it was fully downloaded. As part of my university studies I also explored more niche topics and implemented, for example, a template-based generic data structure that let me nest data structures recursively (red-black trees inside the nodes of red-black trees, indefinitely).
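
The original was a C++ template exercise, but the same recursive-nesting idea can be sketched in Java generics. This is an illustrative analogue, not the original code: each node of an ordered tree holds another tree of the same kind.

```java
import java.util.TreeMap;

/**
 * Illustrative Java analogue of nesting ordered trees inside ordered trees:
 * each node of the outer red-black tree (TreeMap in the JDK) holds another
 * NestedTree, which can itself hold further NestedTrees, indefinitely.
 */
public class NestedTree<K extends Comparable<K>> {

    // TreeMap is backed by a red-black tree in the JDK.
    private final TreeMap<K, NestedTree<K>> children = new TreeMap<>();

    /** Gets the subtree stored under the given key, creating it if needed. */
    public NestedTree<K> child(K key) {
        return children.computeIfAbsent(key, k -> new NestedTree<>());
    }

    /** Counts all nodes reachable from this one, including itself. */
    public int depthFirstCount() {
        int count = 1;
        for (NestedTree<K> child : children.values()) {
            count += child.depthFirstCount();
        }
        return count;
    }

    public static void main(String[] args) {
        NestedTree<String> root = new NestedTree<>();
        // Build a tree inside a tree inside a tree.
        root.child("sensors").child("eda").child("wrist");
        root.child("sensors").child("ecg");
        System.out.println(root.depthFirstCount()); // 5 nodes including the root
    }
}
```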

After starting work as a researcher at HIIT, I adopted Java as my main tool and used it to implement my EmoEngine.

Python.

In addition, I’ve done quite a lot of work with JavaScript (and Node.js) in various projects. I also took a university course on Clojure that got me really excited about purely functional programming, and for a while I was really enthusiastic about it. I took another university course on Scala because I had heard it could combine functional programming and OOP, as well as being a big-data language. Since then I’ve used Scala every now and then in Databricks and the like.

Teamwork

ds comm, mud team, student groups,

Random list of projects: to be written out

  • guided the Romanian student to a high-quality publication / demo (cited in Nature)

  • Kinect projects
  • haptics and tangible
  • solid understanding of theoretical CS: Turing machines worked out with pencil, mathematical proofs of NP-hard graph algorithms in an exam setting with nothing but a pencil
  • all the causality-related topics: courses, code samples, ideas, etc.
  • dataspaces / data meshes, federated learning, privacy, governance, etc.

Photoframe

I participated in a month-long “workshop” in Amsterdam where, together with a small team, we developed from scratch an interesting prototype we called xxx. The system did x. link videos.

Fridge

link video

relaworld

video

Work stuff

bathb etc.

rep

After working for years in physiological computing I noticed that the field was in dire need of a systematic way of classifying the different layers of analysis that are always involved when going from raw psychophysiological signals to working applications.

As part of a large consortium of academics working in the domain, we wrote a massive and definitive guide on using physiology in HCI (over 160 pages!). That paper already included a detailed analysis and description of the lowest level of analysis: what the raw psychophysiological signals are and what kinds of features can be derived from them. I continued to expand this thinking into a full layered framework that I presented in Berlin (x). In addition, I led a group of students in a half-year agile development project to build a working web repository for physiological computing. In my PhD I then went into rigorous detail on what I had by then started calling the “dynamics of the biocybernetic loop”, with special attention to when and where machine learning approaches are appropriate and when custom, hand-made logic is more suitable.
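
As a tiny illustration of what that lowest layer means in code, here is a Java sketch that derives two textbook features (mean heart rate and SDNN) from a window of inter-beat intervals. The feature choice and the numbers are illustrative and not taken from the guide or the repository.

```java
import java.util.List;

/**
 * Illustrative "lowest layer" feature extraction: derive simple heart-rate features
 * from a window of inter-beat intervals (IBIs, in milliseconds).
 */
public class HeartFeatures {

    /** Mean heart rate in beats per minute over the window. */
    public static double meanHeartRate(List<Double> ibisMs) {
        double meanIbi = ibisMs.stream().mapToDouble(Double::doubleValue).average().orElse(Double.NaN);
        return 60_000.0 / meanIbi;
    }

    /** SDNN: standard deviation of the inter-beat intervals, a basic HRV feature. */
    public static double sdnn(List<Double> ibisMs) {
        double mean = ibisMs.stream().mapToDouble(Double::doubleValue).average().orElse(Double.NaN);
        double variance = ibisMs.stream()
                .mapToDouble(ibi -> (ibi - mean) * (ibi - mean))
                .average().orElse(Double.NaN);
        return Math.sqrt(variance);
    }

    public static void main(String[] args) {
        List<Double> window = List.of(820.0, 810.0, 790.0, 805.0, 830.0);
        System.out.printf("HR %.1f bpm, SDNN %.1f ms%n", meanHeartRate(window), sdnn(window));
    }
}
```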


Pascal game?

nice paper!!

mud/bbs etc

Full backend (PostgreSQL, SMSC …)

Fridge

Bathbomb?

sod etc

relaworld

listen to yourself and others

cloud before cloud

Phase synch?

repository / 5 layer

competence team