David Tocknell

Additive Synthesis VS The World

Updated: Dec 1, 2019

Before I say or do anything else, let me first apologise in advance. This entry is going to be a long one, and may feel like some bizarre kind of blog baseball game. Please bear with me though, because additive synthesis is really just the beginning of something huge that could really change the way we approach the arts in stage 5.

Hold on tight, folks!

What is Additive Synthesis?


So, let’s take a moment to talk about additive synthesis. “Additive synthesis” is a rather complex-sounding term that I’m sure would make most music teachers scratch their heads in confusion. As usual, technologists and academics have found a rather complex term to describe a rather simple idea, so let’s break it down. “Synthesis” refers to the product of several different components or elements brought together. “Additive” simply means we are including, or “adding”, more things to this mixture. I realise that sounds like common sense, but it is important to understand. Basically, when we are talking about additive synthesis, we are adding different elements together to create a new, mixed product.


What are these elements we are adding together? Sound waves.


Additive synthesis starts with a single wave. This wave can be any of the simple shapes (sine, triangle, square, saw), at any frequency (which in the music classroom we would describe as pitch) and at any amplitude (volume, or dynamic if we’re linking it to musical terminology). To create new sounds, we can change these values, add new waves, or affect the simple wave in new and different ways. I won’t go any further into additive synthesis as a concept here, but if you do want to explore the idea more, there will be some links at the end which will set you off on your journey.

Different types of waveforms
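To make this idea concrete, here is a minimal sketch in Python (using numpy and scipy; the function and file names are my own, purely for illustration) that builds a richer tone by adding sine-wave partials on top of a fundamental frequency - the core move of additive synthesis.

```python
import numpy as np
from scipy.io import wavfile

SAMPLE_RATE = 44100      # samples per second
DURATION = 2.0           # seconds
FUNDAMENTAL = 220.0      # Hz - the "pitch" of the note (A3)

t = np.linspace(0.0, DURATION, int(SAMPLE_RATE * DURATION), endpoint=False)

def additive_tone(fundamental, num_partials):
    """Add sine waves (partials) together to build a more complex tone.

    Partial n sits at n times the fundamental frequency with amplitude 1/n,
    so the more partials you add, the brighter and buzzier the sound gets.
    """
    wave = np.zeros_like(t)
    for n in range(1, num_partials + 1):
        wave += (1.0 / n) * np.sin(2.0 * np.pi * n * fundamental * t)
    return wave / np.max(np.abs(wave))   # normalise to -1..1 for playback

# One partial is just a plain sine wave; twenty partials approach a saw wave.
wavfile.write("sine.wav", SAMPLE_RATE, additive_tone(FUNDAMENTAL, 1).astype(np.float32))
wavfile.write("saw_ish.wav", SAMPLE_RATE, additive_tone(FUNDAMENTAL, 20).astype(np.float32))
```

Listening to the two files back to back is a quick way to hear what “adding more things into the mixture” actually does to the sound.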

Additive Synthesis in the classroom


I am just going to take a moment to discuss my experiences with additive synthesis, which will hopefully shed some light on how to bring these ideas into the classroom. I was first introduced to the idea of additive synthesis while undertaking my Bachelor of Music (Composition) degree. When we approached it in composition, it was very theoretical. It made sense, sure, but I didn’t understand how to USE it. To help us understand how to use it, we were instructed to experiment. Start with a wave, then change the parameters. Add filters. Affect the wave in different ways.


And I did.


I was able to make a few different sounds. It was interesting but not particularly useful. In the time it would take me to create something close to what I wanted using the concepts I had learnt, I could have found an online library, paid however much it cost to download, and used the closest preset I could find. What that meant was that my knowledge of the concept was entirely theoretical: I had no idea of its practical applications, or how any of the filters or parameters I could change actually affected the sound.


Korg's LittleBits Kit
We used the LittleBits Synth Kit created by Korg to experiment with physical synthesizers. It is a straightforward way of building and understanding synthesizers.

I would now like to take you to the discussion of synthesizers in this current course. First, we put together a simple PHYSICAL synthesizer. The physical part is important, because if you got the order of the parts wrong, either no sound would come out or the different parts of your synthesizer would not work as intended. Furthermore, you can easily and visually add and remove different effects on the sound, starting from the bare minimum - power, oscillator and speaker - and progressively adding more effects. Such an experience teaches students in a practical, hands-on manner what a synthesizer is and how all the different parts work.
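As a rough digital parallel to the physical kit (a sketch of my own, not the LittleBits software), you can think of each module as a small function, with the patch order becoming the order in which the signal is passed from one function to the next:

```python
import numpy as np

SAMPLE_RATE = 44100

def oscillator(freq, duration, shape="square"):
    """The oscillator module: generate a raw waveform at a given pitch."""
    t = np.linspace(0.0, duration, int(SAMPLE_RATE * duration), endpoint=False)
    if shape == "square":
        return np.sign(np.sin(2.0 * np.pi * freq * t))
    return np.sin(2.0 * np.pi * freq * t)

def amplifier(signal, gain):
    """The volume module: scale the amplitude of whatever comes in."""
    return gain * signal

def low_pass(signal, alpha=0.05):
    """A very simple one-pole low-pass filter module (0 < alpha <= 1).

    Smaller alpha smooths away more of the high frequencies, so the harsh
    square wave becomes rounder and mellower.
    """
    out = np.zeros_like(signal)
    for i in range(1, len(signal)):
        out[i] = out[i - 1] + alpha * (signal[i] - out[i - 1])
    return out

# Bare minimum "patch": power -> oscillator -> speaker
raw = oscillator(110.0, 1.0)

# Progressively add modules, just as you would clip more Bits into the chain
patched = low_pass(amplifier(raw, 0.5))
```

Each function stands in for one physical module, so adding or removing a call is the digital equivalent of clipping a Bit in or out of the chain.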


James then showed us a digital synthesizer. This is the part that made everything click for me, and suddenly I saw WHY and HOW you would use additive synthesis. James started with a preset and, one by one, removed the effects. In front of our eyes, a complex, interesting synth was broken down to a simple sine wave, and our eyes were opened to all of the possibilities this way of approaching sound offers. Additive synthesis is amazing because it gives you nearly limitless options to explore once you understand how it works. Each student can come up with their own unique sound, sparking new creative ideas. Students can become sculptors of sound, building both musical and theoretical knowledge and experience with sound, and thus deepening their understanding of the musical and even natural world around them.


Let’s Start Digging Deeper


Now we have additive synthesis in our toolbox as a composition and performance tool. But where can we find existing examples in the wider musical world? Why, film and game music, of course! If we consider film composers such as Hans Zimmer (Dunkirk, Interstellar, Batman V Superman etc.), it is nearly impossible to analyse this music in any meaningful way without understanding synthesizers. Nor is it possible for students to recreate sounds inspired by existing pieces when they are creating their own. The same can be said of music for video games, such as Mirror’s Edge (PEGI16+/M), Portal 2 (PG) or Terraria (PG).


Of course, games and films do not exclusively use synthesizers to create the music and sounds present in the final products. This allows for comparison between sound approaches. Why did the composer use a synthesizer instead of an orchestra? What effect does the use of this particular synthesizer have on the scene? How would you describe the sound of the synthesizer? Can you describe the different components of the synthesizer? Or perhaps even, how does the synthesizer interact with the analog sounds present? Some of those questions are of course harder than others, but they give an idea of how we can approach synthesizers in a music class.


Cross-Curricular, You Say?


So we know what additive synthesis is, we know why we might want to use it, and we know some types of media we can show our students as examples of how this approach is used in the real world. But it gets even better. Let’s take a sneak peek at the syllabus…


You can see in the cross-curriculum content section of the syllabus that by exploring game music we can easily cover at least one of the competencies: ICT. But it is not difficult to cover more.


Let’s consider games for a moment. What are the elements of a game? Well, there is obviously the soundtrack and sound effects, otherwise we wouldn’t be talking about them in a musical context. Next, we have graphics to consider, so let’s have a chat with the visual arts, visual design and/or photographic and digital media department. Oh, and games need code and logic in order to work, so let’s bring along the IST department as well. And games often need voice acting, for characters and the like, so I’m sure there is a place for the drama department too…


And suddenly, you have a project that crosses the boundaries of the arts. This would definitely not be something you would do in stage 4; in fact, I would only do this toward the end of stage 5. I am also well aware that implementing a project such as this in a school could be very difficult. But there are many benefits to this kind of project. Just some of the benefits off the top of my head include:


  • Preparing students for HSC projects. All of the stage 6 equivalents of the above subjects require the student to complete a major work. Why not give them the experience of creating a major work BEFORE year 12? That way, when students are asked to create something for their HSC, they know how to approach such a task and have developed the necessary skills such as project management. Speaking of project management…

  • Project management. Students will have to learn how to manage their time and resources in order to complete such a project. This will benefit them immensely when it comes to their final years in high school as they try to juggle the completion of their major works and their study.

  • Communication skills and industry simulation. Each student in the group will have to bring their own specialisation to the team. Students will have to learn to talk about their specialisation in a way those not as trained can understand, and learn how to fit their knowledge into that of a greater whole. This reflects how the real world works, giving these students a leg up when it comes to working on their own side projects, or entering into an industry setting.

  • Confidence. At the end of the project, students will have something they can present and be proud of. This can be great for students who are not as confident in their abilities, or those who are having trouble with self-belief. Students can also use the final product as something to show prospective employers alongside major work/s if they decide to follow the creative path through HSC and beyond.

As you can see, a simple yet complex-sounding idea such as additive synthesis can lead us to consider new ways of approaching content and learning in our classrooms and schools.


Oh, and students will have learnt a thing or two about physics along the way. YEAH, SCIENCE!


Useful Links:


If you are looking for the Australian age ratings for video games, television and film, use the Australian Government's Australian Classification site.


If you want to find out more about waveforms and additive synthesis, look through the following:


