Finally, September came. Happily, it was time to go back to university!
This is the fourth and final year of my bachelor's degree in Sound and Music for Games. After spending my summer thinking about my dissertation subject and reading books about sound design, I finally found a topic that satisfies my curiosity, expresses my personality, and might help my professional development.
First, I wanted to focus my question on the impact of the frequencies in music and sound on emotions. But back at university, I presented this idea to my lecturer Kenny McAlpine, who pointed out that frequency is not the only factor in music that affects our emotions. He was definitely right! Many other parameters can shape our perception of music and our emotions: rhythm, frequency, musical structure, tempo, melody, voices, the cultural background of listeners, amplitude, harmony, and certainly many more. He also advised me to read The Music Instinct by Philip Ball, which I am already halfway through.
I was a bit afraid of making the subject too broad and losing myself in it. But actually, it seems like a good idea that might enrich my perspective and views on music. It would also be beneficial to learn how to compose music in a way that has a stronger emotional impact, whether we are listening to it or playing a game, therefore making the experience more immersive.
As I truly enjoyed working within a team, I didn't want to work on my own during this final year. I was fortunate enough to find Sara and Elonora, two other Arts students at Abertay. Our project will be based on Scandinavians during the Viking Age. For now, we don't know which part of Scandinavia, which colony, or which piece of history or sagas we will use. This should be defined quite soon, as all of us are researching the subject; we shall talk about it in more depth next week. While researching, I found an interesting thesis and some articles about music in the Viking Age. From the information I could gather, it seems that Icelandic music is the closest to what Vikings would have played. I definitely need to dive deeper in that direction.
Thus, for our project we were considering two options: a cinematic or a game. Each has its own challenges: for the cinematic, we would need to find a character artist, as it would be odd not to have any characters in it; for the game, we would need a programmer or someone who knows how Blueprints work in Unreal. This will also be discussed next week.
If we end up doing a cinematic, and if I have enough time, I would like to create a small interactive art installation featuring the music I composed for the project. After much research, I found that it is possible to couple Ableton Live with a Raspberry Pi and sensors, and to use p5.js alongside them.
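As a first experiment in that direction, here is a minimal sketch of how a sensor reading on the Raspberry Pi could be sent to Ableton Live over the network as an OSC message. Everything here is an assumption on my part: it supposes a Max for Live device (or similar) is listening for OSC on port 9000, and the address `/sensor/light` and the hard-coded value are placeholders, not part of any existing setup. It only uses Python's standard library:

```python
import socket
import struct

def osc_pad(data: bytes) -> bytes:
    """Null-terminate and pad to a multiple of 4 bytes, as the OSC format requires."""
    data += b"\x00"
    return data + b"\x00" * (-len(data) % 4)

def encode_osc(address: str, value: float) -> bytes:
    """Encode a single-float OSC message: padded address, ',f' typetag, big-endian float."""
    return osc_pad(address.encode()) + osc_pad(b",f") + struct.pack(">f", value)

def send_sensor_value(value: float, host: str = "127.0.0.1", port: int = 9000) -> None:
    """Send one reading over UDP to whatever is listening for OSC (e.g. a Max for Live patch)."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.sendto(encode_osc("/sensor/light", value), (host, port))
    sock.close()

# On the Pi, this value would come from a real sensor; here it is hard-coded.
send_sensor_value(0.5)
```

On the Ableton side, a Max for Live patch could receive these messages and map them to a device parameter, which is one plausible way to let sensor data drive the music.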
Let's go back to the honours question for the dissertation. At the moment, what would it be?
The impact of music on emotion in a game (and maybe interactive art installation) context.
Create music for a narrative game.
Within an installation built with a Raspberry Pi, Ableton Live, p5.js, and sensors: the sound and music would be driven by data (weather, pollution in cities, luminosity, temperature, etc.) when nobody is interacting with it. People could also influence the installation by pushing buttons, turning knobs, pulling ropes, or approaching sensors (such as motion detectors or cameras).
Read and learn from books and articles about music and psychoacoustics.
Learn how audio works in Unreal.
It looks like the year will be quite exciting. I am now curious to see how this will evolve!