Nowadays companies want sound logos, and if they don't they should.
So I decided to make one for my friend's company Kaszazz (formerly Crafty Kids).
I did a small amount of research, during which I found this handy little YouTube video that looks in depth at the Intel Inside logo:
Here's what I've come up with so far.
kaszazz_alogo.mp3 (76 kB)
4.12.07
audio logo
0 comments copywrite 11:28 am
i highly recommend....
downloading radiohead in rainbows
it's free... well, if you're as cheap as me
0 comments copywrite 11:27 am
28.11.07
(dead air space?)
the cathode ray tube suite
This film object was premiered at last week's Ear Poke concert, the EMU's end-of-year shebang.
0 comments copywrite 1:48 pm
27.11.07
24.11.07
telstra sucks...
...and i'd rather be on a beach with an acoustic guitar. but back to telstra for just a moment:
i have now got the internet @ home, and let me tell you my experience with my new ISP:
-terrible service-
after being sold on the "30 minute rapid transfer" deal, i waited for a day before i let concern for my new ISP boil over. i contacted Telstra and was -instead- promised that i would be waiting through a downtime of four to five working days. not even eighteen days later - yes, 18 DAYS L8ER - had i got the net. no no. by then i had (potentially) failed at blogging my music technology projects. now i have the net again, twenty-four days later. that's right, some 34,560 minutes!!! as compared with the original deal of 30. excuse my math, but that is 1,152 times longer. during this i fought with many IT helpers @ telstra, and all they could provide were lies about what they were going to do for me. lies, eg-
T: tomorrow your modem will arrive in the mail.
J: ok
receives modem a week later
what kind of service is that? how is the service at telstra??? terrible! terrible service.
-worst website ever-
what can i say about bigpond dot com? don't go there unless you crave frustration. the search bar *does not work*. the ads are 100% unblockable, 100% unavoidable and 100% liberal propaganda. the web design is some of THE worst that i have ever seen. ever.
-what age we live in-
after all this trial & tribulation, all this stress (right when i didn't need it) my broadband finally gets connected on monday... so you ask, how is it? what are the speeds like?
average and worse. really nothing special, and i have had unexpected cut-off times. although i think i'm ready for anything inconvenient from telstra right now.
what an age we live in... i just wish i were a part of it!
end of rant, i just needed to make the time to let you all know:
Telstra,
Thank you for nothing.
1 comments copywrite 1:29 am
19.11.07
weimer arena
Weimer Arena Session.mp3 (1.1 MB)
Making sound assets for my own spin on the open-source game "Open Arena" was at first a little daunting, but a task I enjoyed thoroughly. In some ways I really feel like it put my "computer skills" (whatever that means?) to the test, particularly in the area of audio design. It's certainly something I would never have thought of doing, which is a little strange, because it's a task that Trent Reznor was involved with in making the original Quake. The fact that Trent Reznor was involved in the early making of this, now open-source, version of the game in a strange kind of way made me feel a part of a collaboration with him (in quite a 21st Century sense of course). Nevertheless - "Trent Reznor," "me" and "collaboration" - equals :)
A version of my version of an open-source game, as well as other students' versions, will be available somewhere, somehow, at some point. We are still figuring this out, because the edited game files are obviously quite large - too large to be posted on a free file server.
0 comments copywrite 6:51 pm
a tribute to syd barrett - behind the scenes
This is the elusive score for my electroacoustic piece 'a tribute to syd barrett', which I was unfortunately unable to digitise - scan, that is - until today. The different coloured shapes represent the three ambiance layers of the piece, which I improvised over in a pseudo-Alva Noto & Ryuichi Sakamoto vein. The shapes were drawn using the golden ratio: 1 to 0.5(√5 + 1). This is similar to how I used the line object in my program:
Here is a screen-shot of the OSX version of my application 'barrett grain', which was built using Max/MSP. Send me an email or comment if you would like a copy of the program.
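If you want to check the ratio for yourself, it only takes a couple of lines of Python (just the arithmetic - nothing to do with the Max patch):

```python
import math

# The golden ratio as written in the score notes: 1 : 0.5 * (sqrt(5) + 1)
phi = 0.5 * (math.sqrt(5) + 1)

# The classic property: phi satisfies phi^2 = phi + 1, which is why
# subdividing a line at 1/phi leaves a segment in the same ratio.
print(round(phi, 6))              # ~1.618034
print(round(phi * phi - phi, 6))  # ~1.0
```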
0 comments copywrite 6:34 pm
16.11.07
offline
In case anyone reading this is wondering why this blog has died down somewhat lately, it is because I am in ISP limbo. I foolishly signed up for Telstra's 30-minute rapid transfer about 25,920 minutes ago, but in hindsight I should have just stayed with my old ISP, iinet.
More on this subject and my Music Technology projects when I get the internet at home again. Currently I'm at the library, reading Crusader by Sara Douglass.
Sneez
0 comments copywrite 12:30 pm
15.11.07
the crazy diamond
a tribute to syd barrett
jake morris
4’01
“A Tribute to Syd Barrett” is a quiet instrumental track, one that a small child might fall asleep to. It explores a dark, ultramodern world through the combination of various ambiences. Mixed together, these textures paint the otherworldly soundscape in which the listener is subtly immersed. Eventually a small voice emerges from this world, in the form of a poorly recorded upright piano. The strange sound of this piano plays gently through the dissonant lines.
The inspiration for this piece is again the 2005 collaboration of Alva Noto & Ryuichi Sakamoto, 1979's The Wall by Pink Floyd, and finally the life of Syd Barrett.
The overall form of the work matches the life of Syd Barrett. The early rise and fall of tension in the electronic part represents Syd Barrett’s time with The Pink Floyd Sound. However, there is no real intention for the listener to recognise such features of the piece – it is just something I happened to be thinking about when writing it.
5 comments copywrite 4:01 pm
25.10.07
improvisationz
Playing together is not something that comes immediately to people, and forum certainly attested to this. All the same, the improvisation sounded pretty cool; I think it could have improved if people had taken cues from others more often. As Stephen mentioned, we ‘wanted to avoid sounding like 15 solo performers playing at the same time’, and this was almost getting there.
Vinny would have to be ‘the’ crafty veteran 'round EMU when it comes to improvisatory art, and he certainly kept this up during forum by running a video camera on the screen it was displaying to, so as to create a spontaneous visualisation using the camera’s ‘auto-contrast’. I was quite impressed by what Vinny brought to the group.
I noticed John Delany may have been one of the few performers adhering to the ‘prose score’ called “play” provided by Stephen. While this asked for short bursts of sound (from two to five seconds), most people just let their oscillators run wild.
I played alongside Luke, and ended up merging my instrument with Luke's, using his piezo microphones to pick up the sounds coming from mine. Playing a duet on Luke's instrument actually allowed us to craft the sound a little more into the aesthetic of the score Whittington provided - although to say there is an aesthetic to that score could be contentious.
.sources.
Whittington, Stephen 18.10.07, "Music Technology Forum," EMU, Adelaide University.
0 comments copywrite 11:05 am
24.10.07
+ spatialisation
.cc - week eleven.
Excuse the brevity, but it's rather late. Here's the Granular Synth with a Quad panner.
stereo pan.mp3 (580 kB) | jam.gransynth~updaaate.zip (20 kB)
I spent quite some time searching for some kind of UI object that I could use to control the panner. The dial object falls short because it does not spin a full 360 degrees. The main one I played with was the lcd object, as it appears in the cartopol help patch, but I just could not get it going. However, I would like to continue this search, particularly in reference to the major project.
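For anyone who doesn't read Max patches, the maths behind an equal-power quad panner can be sketched in a few lines of Python. This is only the general idea - the speaker angles and the cosine taper are my assumptions here, not a port of my patch:

```python
import math

def quad_pan(angle_deg):
    """Equal-power panning across four speakers placed at 45, 135,
    225 and 315 degrees around the listener. Returns one gain per
    speaker. A sketch of the maths only, not the Max/MSP patch."""
    gains = []
    for speaker in (45.0, 135.0, 225.0, 315.0):
        # shortest angular distance from source to speaker, in [0, 180]
        d = abs((angle_deg - speaker + 180.0) % 360.0 - 180.0)
        # cosine taper: unity at the speaker, zero 90 degrees away;
        # adjacent gains satisfy cos^2 + sin^2 = 1 (constant power)
        gains.append(math.cos(math.radians(d)) if d < 90.0 else 0.0)
    return gains

print([round(g, 3) for g in quad_pan(45.0)])  # [1.0, 0.0, 0.0, 0.0]
```

Panning a source at 90 degrees (halfway between two speakers) splits it 0.707/0.707, so the total power stays constant as it sweeps around.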
.sources.
Haines, Christian 18.10.07, "Spatialisation," EMU, Adelaide University.
1 comments copywrite 12:26 am
17.10.07
+ delay line
.cc - week ten.
Having found my Granular Synth patch fairly useful, I decided to use this DSP delay task to implement a delay effect in it. Here is my delay line sub-patch...
At the bottom of this picture you can see the delay line, which is a fairly simple delay - feeding the delayed signal back to create an overdriven tone.
jam.gransynth~3.zip (16 kB) | gransynthdelay.mp3 (476 kB)
I created this short excerpt using the sample I made of jumping into water for my Open Arena project. In this example you can mainly hear me toggling the "dry / wet" slider; as the signal becomes 'wetter', you can hear the feedback increase, and it sounds less like someone going to the toilet.
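The guts of a feedback delay line are simple enough to sketch in Python (the idea only - not a port of the Max sub-patch):

```python
def delay_with_feedback(signal, delay_samples, feedback, wet=0.5):
    """Minimal delay line with feedback, sketching the idea behind the
    sub-patch described above (assumed behaviour, not a port).
    Pushing feedback toward 1.0 piles energy up in the loop, which is
    where the overdriven tone comes from."""
    buf = [0.0] * delay_samples   # circular delay buffer
    out = []
    pos = 0
    for x in signal:
        delayed = buf[pos]
        # feed the delayed signal back into the buffer along with the input
        buf[pos] = x + delayed * feedback
        pos = (pos + 1) % delay_samples
        # dry/wet mix, like the slider in the patch
        out.append((1.0 - wet) * x + wet * delayed)
    return out
```

Running an impulse through it gives echoes at multiples of the delay time, each scaled by the feedback amount.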
.sources.
Haines, Christian 11.10.07, "Processing - Delay," EMU, Adelaide University.
0 comments copywrite 1:12 pm
15.10.07
amby
.aa - week ten.
To start with I created the ambient backdrop for the level "OA_RPG3DM2" of my open-source Quake game. As far as I can tell, there is actually only one sound file being looped as the ambient background for this level, though I can't find the file yet. It sounds like the one file is just being mixed differently when the player moves inside and outside, rather than changing sound files. Not being able to be sure about the specific file yet, I just created a looping 30-second ambient background for the overall level. I started with one sample that comes with Reason, in the "Hardcore" drum samples for the ReDrum. I plugged this sample, which I called "wink," into my Granular Synth patch, and eventually came up with this:
level ambience.mp3 (356 kB)
In certain parts of my level a loud machine ambiance becomes very prominent - this sound file I was able to locate in the Open Arena package. The original "drone6.wav" (36 kB) sound was actually pretty good to begin with, so I tried to stick fairly close to it. This kind of engine rumble reminded me of my Bus environment project from last year, so I quickly fired up my old project patch in Plogue and came up with this asset:
drone6 2.mp3 (90 kB)
The final ambiance I worked on was for the small underwater section. Again I was able to find the original sound files in the package - three files: getting into the water, underwater, and getting out of the water (or onset, during and offset). Each of these sounds is enclosed in a compressed folder below. To create these sounds I grabbed a droplet sound from soundsnap.com, then processed it in my Granular Synth patch. I also used Peak to reverse the offset sound and hack around with it in general:
original water.zip (132 kB) | jake water.zip (384 kB)
.sources.
Haines, Christian 9.10.07, "Ambience (1)," EMU, Adelaide University.
0 comments copywrite 2:38 pm
9.10.07
reflection
It's always refreshing to find somebody making “new music” that is still able to hold a human conversation – to laugh and so forth. Equally, it's great to hear an avant-garde musician who states that their way is ‘a’ way, not necessarily ‘the’ way.
Stuart Favilla and Joanne Cannon were an insightful sort, and while their music did not really do it for me, their attitude certainly did. That is to say I respect them – Australians who are pushing the boundaries, and ones who are finding out what the future holds. More of these people are needed!
This session was actually a ‘forum’ in the true meaning of the word – everyone seemed to want to put their two cents in and throw their ideas around. To be honest, it made me miss the days when this is what forum class was all about. I found the discussions intellectually stimulating then, and whenever I spoke, the pinch of pressure that I’d better have something important to say arose. It really helped one not only shape their own opinion, but speak to an audience and ‘forum’ their ideas. But alas, nowadays ‘forum’ is just another set of instructions, like the rest.
Having digressed (and whinged) considerably, I guess I’ll add now that my own visions of the instruments of the 21st Century are quite different. I think Ross Bencina was closer to the mark with his interactive glove; I think that we’ll see instruments that map out every parameter, every twitch of the body. Interactive environments are in their own way an instrument, and I think that this is one of the new avenues for musicians.
.sources.
Favilla, Stuart. Cannon, Joanne 4.10.07, "Bent Leather Band," EMU, Adelaide University.
3 comments copywrite 7:56 pm
+ fast fourier transform
.cc - week nine.
Finding this week's tutorials quite difficult, I went to a bit of extra effort trying to work out FFT. This led me to Nathan Wolek's Windowmaker patch, which has all of the typical FFT window types premade.
Implanting this into my own Granular Synth patch took much longer than it should have, but anyway, now I have a Ubumenu allowing the user to select the FFT window type (hamming, triangle and so on). Double-clicking on the (unhidden) buffer shows the user a graphical representation of the window:
The aural manipulation which my pFFT patch offers is like the 'noise gate' shown in the tutorial patch.
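For the curious, the window shapes themselves are just simple formulas. Here's a Python sketch of two of them - my own stand-in for illustration, not Wolek's actual patch:

```python
import math

def make_window(kind, n):
    """Build an analysis window of length n. The hamming and triangle
    formulas below are the textbook definitions; window names here are
    just illustrative, not Windowmaker's actual menu entries."""
    if kind == "hamming":
        # raised cosine on a 0.08 pedestal: 0.08 at the edges, 1.0 mid
        return [0.54 - 0.46 * math.cos(2 * math.pi * i / (n - 1))
                for i in range(n)]
    if kind == "triangle":
        # linear ramp up to 1.0 at the centre, back down to 0.0
        return [1.0 - abs(2.0 * i / (n - 1) - 1.0) for i in range(n)]
    raise ValueError(kind)
```

Multiplying each FFT analysis frame by one of these tapers the frame edges, which is what keeps the resynthesis from clicking at frame boundaries.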
.sources.
Haines, Christian 4.10.07, "Processing FFT," EMU, Adelaide University.
0 comments copywrite 7:21 pm
7.10.07
Open Arena
.aa - week nine.
One day I was backing up an old cassette tape to digital, and I spent some time fiddling around with the tape player and amplifier before I had them going into my laptop at a useful level. In doing this I made a few scratchy recordings before I was able to get just tape, and from one of these recordings comes the basis of my UI mouse-over button sound effect for Open Arena. The first step toward getting the sound how I wanted it (in my mind) was doing some EQing – removing that hiss that always seems to come from tapes. Here I used my “favourite” EQ, particularly for this kind of digital sound design exercise, which is just the AU Graphic EQ (in Plogue).
One thing I particularly like about starting in Plogue is that you can easily loop the file, and then you are free to do whatever you want to it in real time – and if that means stacking five of the EQs after each other, so be it. Here is how the Plogue patch ended up looking to create the final sound.
Finally, I created a walking sound, which I am not particularly happy with. I have already loaded it onto fileden.com so I am not going to bother changing it, but basically from here I think it will need to be pitched down at the least. I created it using white noise, my Granular Synth patch, Plogue's EQ, compression and also a cool feature of Peak called "Amplitude Fit..."
jake_aa_wk9.zip (74 kB)
.sources.
Haines, Christian 7.10.07, "Assets (1)," EMU. Adelaide University.
0 comments copywrite 6:18 pm
24.9.07
+ MIDI control
.cc - week eight.
Had some fun adding some controllers to the granular synth patch. Basically, I had the idea of creating pseudo-MIDI controllers using only the keyboard and mouse. The way I have implemented these is to use the mouse to control the grain density. While the caps lock key is switched on, the value rolls forwards and backwards depending on how the mouse has been moved. While the shift key is held down, the mouse acts more like a pitch-bend wheel, moving back and forth, then snapping back to its previous value once the shift key is released.
The grain density is the parameter of the synth patch that I have chosen to modify; however, the controller patch is encapsulated in a sub-patch and can easily be added to other parameters. Here's where the magic happens:
Get it? You can see it used in the synth patch itself, which is shown below.
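The snap-back behaviour of the shift-key 'pitch bend' boils down to remembering the resting value while the key is held. In Python-ish terms (the logic only - the real thing is a Max sub-patch, and the 0-127 range is just the MIDI convention I'm assuming):

```python
class BendController:
    """Sketch of the 'shift key as pitch-bend wheel' behaviour:
    while shift is held, mouse movement offsets the value; on release
    it snaps back to where it was. Not the actual Max patch."""

    def __init__(self, value=64):
        self.value = value
        self._held_from = None  # resting value while shift is down

    def shift_down(self):
        self._held_from = self.value

    def mouse_move(self, delta):
        # only bend while shift is held; clamp to MIDI-style 0..127
        if self._held_from is not None:
            self.value = max(0, min(127, self.value + delta))

    def shift_up(self):
        # snap back, like letting go of a pitch-bend wheel
        if self._held_from is not None:
            self.value = self._held_from
            self._held_from = None
```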
The mp3 result that I have put together this week is a sort of sound design extract that I imagine accompanying space ships flying around the cinema. Say, like this:
Ignore the clicks. Still ironing out the bugs.
.sources.
Haines, Christian 13.09.07, "MIDI and MSP," EMU, Adelaide University.
Star Wars IV intro on YouTube
0 comments copywrite 11:13 am
18.9.07
active noise reduction
Today I was 'ploguing around', trying to test out the phenomenon of wave cancellation. I really wanted to find out whether 'noise cancellation' can occur binaurally. Unfortunately, I couldn't get the cancellation working through headphones, nor even speakers. I could not get phase cancellation happening in the physical world - the only way the signals cancelled out was if I ran them through the same channel inside Bidule. Thinking that perhaps the latency created by using the phase-invert unary operator was offsetting the waveform enough to ruin the cancellation effect, I recorded a sine wave, then inverted it in Peak. I placed both these sound files in a 'dual mono' sound document, but still to no avail. It seemed, however, that there was some reduction in the sound, like maybe the fundamental had been lost/changed.
Any thoughts?
Maybe I need an anechoic chamber for this.
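For what it's worth, the cancellation does hold sample-for-sample in the digital domain, and even a one-sample timing offset is enough to let energy through - which is probably part of what the physical-world version runs into (a Python sketch of the experiment, not the Bidule setup):

```python
import math

SR = 44100    # sample rate
FREQ = 440.0  # test tone

sine = [math.sin(2 * math.pi * FREQ * i / SR) for i in range(SR // 10)]
inverted = [-s for s in sine]

# Summed in the digital domain, the two cancel exactly...
mixed = [a + b for a, b in zip(sine, inverted)]
print(max(abs(m) for m in mixed))  # 0.0

# ...but delay the inverted copy by a single sample and a residue
# survives - and rooms, speakers and ears add far worse misalignment.
shifted = [0.0] + inverted[:-1]
residue = max(abs(a + b) for a, b in zip(sine, shifted))
```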
1 comments copywrite 4:46 pm
14.9.07
.forum - week five.
Tying up loose ends before the mid-semester break - here's the week five blog that never got posted. To kick off this week's electrobitz bonanza, Ben and I adventured to Toys R' Us. Alas, we again bought a toy that did not work!
Anyway, we ended up just using the phone toy again. This circuit was soldered together, as you can see here:
This circuit was built with a potentiometer to control pitch.
video evidence
.sources.
Haines, Christian; Tomczak, Sebastian 23.07.07, "Physical Computing (1)," EMU, Adelaide University.
2 comments copywrite 12:52 pm
12.9.07
.making peace with satan.
.forum - instrument proposal.
name :
additive / granular synthesis guitar
statement :
The basic idea is to use an electric guitar as an input device into a computer - more specifically, the music coding environment Max/MSP. Firstly, I would like to be able to play the guitar quite naturally, but manipulate the timbral content of the sound via a Max/MSP interface (additive synthesis). Also, I would like to incorporate the concept of granular synthesis, a method oftentimes suited to making complicated sonic textures - drones which can float gently amid improvisatory performances. It would be desirable to shape the aesthetic of these textures firstly through the 'grained' samples from the guitar, but primarily then through the Max/MSP interface.
sketch :
0 comments copywrite 11:25 pm
physical computing II
.forum - week seven.
After a trip to Toys R' Us and a second dud toy later, Ben and I were left using the 'opewator' telephone toy again this week. Our first operation was to set it up with both the Arduino and breadboard, so that we could control the power through the computer and Max/MSP. After doing this we decided to add a 'metro' object to the patch, turning the power on and off every 50 milliseconds. Having also added a potentiometer to control the pitch of the circuit, the parameters of the device reminded me somewhat of a crude granular synthesizer.
After completing this, the next circuit on the agenda was called something like "the generative moley decepticon". I was pretty excited by the title, and also by the fact that all of the legs of the chip were being used by 33kΩ resistors. But alas, the circuit did not do anything for some reason or another, and I am sure it was not because any of the resistors' legs were touching! At this point Seb, Ben and I had to part ways, so I suppose you win some and you lose some.
video evidence
.sources.
Haines, Christian; Tomczak, Sebastian 6.09.07, "Physical Computing (2)," EMU, Adelaide University.
0 comments copywrite 10:29 pm
physical computing
.forum - week six.
This week we got introduced to 'physical computing' via the Arduino board. I am extremely pleased by this new topic; physical computing is something that has more or less been at the back of my mind, on the tip of my tongue, for a while now, and now we are getting into it. Having said that, I do find that the 'step-by-step' tasks dampen the exciting prospects of the creative outlet that physical computing provides. However, and I guess as always, you need to learn to walk before you can run.
These introductory exercises had Ben and me creating square waves pretty quickly. Initially we played the oscillator with the Ptah Max/MSP patch, then through just a potentiometer. All of this is shown in the video evidence on this page.
.sources.
Haines, Christian; Tomczak, Sebastian 30.07.07, "Physical Computing (1)," EMU, Adelaide University.
3 comments copywrite 2:07 pm
i invented the granular synth not Ben nor Xenakis
.cc - week seven.
I worked together with Ben Probert to create this patch. Currently he is creating a new interface for it, so keep an eye out on his blog.
As usual I have spent all of my time making the patch and left none to simply using it - however, I can see how this granular synthesis concept could have great outcomes. If you happen to download the patch, clicking on the load button will prompt you to open the sample (noise) twice. Just select the same sample both times. I tried to get around this, but loading the file separately for both the buffer~ and sfinfo~ was a necessary evil. If there is a way to load it into both through a single pop-up window, please let me know.
version 1.0 (as of 11.09.07)
I created the mp3 example using version 1.0, however the improved 1.1 contains an envelope feature (inside the poly~ object) which removes the clicks. Also, there is a CPU usage display, a new colour scheme and "no more frills" inside the patch.
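For anyone wondering what the patch actually does: granular synthesis boils down to overlapping short, enveloped slices of a sample. A bare-bones Python sketch of the concept (not the Max patch - the parameters and Hann envelope are my assumptions):

```python
import math
import random

def granulate(sample, grain_len, n_grains, hop):
    """Bare-bones granular synthesis sketch: copy short grains from
    random positions in the source into an output stream, spaced
    `hop` samples apart. A Hann envelope on each grain removes the
    clicks (the same job as the envelope inside the poly~ object)."""
    out = [0.0] * (n_grains * hop + grain_len)
    for g in range(n_grains):
        # pick a random read position (source must be > grain_len long)
        start = random.randrange(0, len(sample) - grain_len)
        for i in range(grain_len):
            # Hann window: zero at both grain edges, one in the middle
            env = 0.5 - 0.5 * math.cos(2 * math.pi * i / (grain_len - 1))
            out[g * hop + i] += sample[start + i] * env
    return out
```

With hop smaller than grain_len the grains overlap into a continuous texture; with hop larger, you get a sparse, stuttering cloud.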
.sources.
Everything.
Haines, Christian 6.09.07, "Sampling (2)," EMU, Adelaide University.
Nothing.
2 comments copywrite 12:30 am
5.9.07
buff man
This week I've put the time in and struggled quite a bit with the task. After playing around with the buffer~ record~ index~ count~ play~ groove~ wave~ and selector~ objects, the whole "buffer" concept really was not all that clear - the different things that these objects (must) offer are not all that obvious. As far as I can tell, incorporating these objects into making your own objects is a 'finicky' task at the least. In what I have uploaded above I think I have at least achieved something; my idea was to make an object which could record from your DSP input, and then play back (and 'scrub') that soundfile at different speeds.
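The playback-at-different-speeds idea itself is simple: read the recorded buffer with a fractional index and interpolate between neighbouring samples. A Python sketch - my understanding of what groove~-style playback does, not the actual object:

```python
def play_at_speed(buf, speed):
    """Variable-speed playback by stepping a fractional read position
    through a recorded buffer with linear interpolation. speed=1.0 is
    normal, 0.5 is half speed (octave down), 2.0 is double speed."""
    out = []
    pos = 0.0
    while pos < len(buf) - 1:
        i = int(pos)
        frac = pos - i
        # linear interpolation between adjacent samples
        out.append(buf[i] * (1.0 - frac) + buf[i + 1] * frac)
        pos += speed
    return out
```

Scrubbing is the same mechanism with `pos` driven by a slider instead of a constant increment.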
.sources.
Haines, Christian 30.08.07, "Sampling (1)," EMU, Adelaide University.
0 comments copywrite 2:10 pm
club and mace?
.cc - week five.
This week our given task was to make an FM synthesis object with all the usual features (user interface, polyphonic version, help files and so on). In doing the task, however, I got a little side-tracked and ended up just playing around with all the sounds that you can make using FM. Here's the result:
This track was taken straight from Max/MSP - it has had no post-production. I'm not trying to make a point of this; I just need more time for such things.
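For reference, all those sounds come out of one very small formula. FM in its textbook (Chowning) form, sketched in Python - the formula the patch is built around, not the patch itself:

```python
import math

def fm_tone(carrier_hz, mod_hz, index, seconds=0.5, sr=44100):
    """Simple two-oscillator FM: y(t) = sin(2*pi*fc*t + I*sin(2*pi*fm*t)).
    The modulation index I controls how many sidebands appear, i.e. how
    bright/metallic the tone is; index=0 gives a plain sine."""
    n = int(seconds * sr)
    return [math.sin(2 * math.pi * carrier_hz * t / sr
                     + index * math.sin(2 * math.pi * mod_hz * t / sr))
            for t in range(n)]
```

Sweeping `index` over time is what gives FM those evolving bell and brass timbres - one parameter, a huge range of sounds.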
.sources.
Haines, Christian 23.08.07, "Vibrato, FM and Waveshaping," EMU, Adelaide University.
0 comments copywrite 1:00 pm
3.9.07
machine gun design
.aa - week six.
This week I had a go at the 'cycle event' tutorial using FMod and created a machine gun sound effect loop. Here's a screenshot:
machinegun.mp3 (208 kB)
In this audio example I let the loop (built from a sample originally taken from freesound) run for a few seconds before hitting the 'key off' button.
.sources.
FMOD 2007, "FMod Designer," accessed 30.08.07.
Haines, Christian 30.08.07, "Sound Asset Design," EMU, Adelaide University.
0 comments copywrite 2:52 pm
27.8.07
sound design with fmod
.aa - week five.
FMod seems like a fairly useful program for sound designing a video game, even if its user interface is a little confusing to get used to. I used Plogue Bidule to knock together a few example game sound assets - in this case a waterfall, wind and a teleport device (like the one in Halo 2). The examples I grabbed from FMod actually sound a bit less realistic than what I originally created in Plogue, because the parameters of the sounds have been randomised. In hindsight I do not think randomising the pitch of the waterfall, nor the wind, was such a great idea, but at least I can tick that box for the exercise. Here's the 'event editor' for the teleport sound and an mp3.
Next up are the two environmental sounds I created. These are deliberately lower in volume.
waterfall.mp3 (208 kB) | wind.mp3 (208 kB)
Getting over the initial frustration of learning how a user interface works is a common task nowadays, not least for geeks like me. It's fair to say that FMod is one of the more frustrating of these, however I think once you get down to the nuts and bolts it is pretty awesome.
.sources.
FMOD 2007, "FMod Designer," accessed 28.08.07.
Haines, Christian 21.08.07, "Audio Engines Analysis," EMU, Adelaide University.
0 comments copywrite 12:46 pm
22.8.07
circuit bender
.forum - week four.
Oft partner-in-crime Ben Probert purchased the car toy shown in the video below. I really do not wish to start on the "back in my day" rant at age 19, but, as if they make kids' toys that yell 'You're a dead man!' What the...
I made a quick sketch of the circuit on my laptop, which is shown below. As it shows, there were no obvious resistors that Ben and I could get our hands on, which made our bending task a little laborious.
Here are a few of our outcomes. Firstly we got the recording to speed up gradually, which was quite comical. Then we got this weird droning tone going, which comprises the next two videos. Videos of Ben and I destroying the car are shown on Ben's blog.
...Well, all this YouTubing should kill off my internet quota by the mid-semester break, so I guess after that point we won't have to blog anymore :)
.sources.
Haines, Christian 16.08.07, "Circuit Bending (I)," EMU, Adelaide University.
2 comments copywrite 1:19 pm
additive synthesis
.cc - week four.
i. Additive Synthesis object
jam.addsynth.zip (8 KB) | addsynth.mp3 (488 KB)
.sources.
Haines, Christian 16.08.07, "RM, AM and Additive Synthesis," EMU, Adelaide University.
0 comments copywrite 12:45 pm
20.8.07
halo engine
.aa - week four.
Both Halo and Halo 2 use a similar GDE (Game Development Environment) created by Bungie Studios, with “creative oversight” (financial support, I’d suggest) of Microsoft. The major development between these Halo installments is in fact in a middleware upgrade – the widely used physics engine: Havok. In line with the new features this development offers, the audio team decided to develop entirely new audio for Halo 2. Martin O’Donnell, the chief sound designer and also composer for the Halo franchise said this on game sound: “I believe that there are three equally important stages for game audio: producing content, implementing content and mixing content” (O’Donnell, 2002). This ‘three stage’ system reminds me of the "Audio for Games: planning, process and production" publication by Alexander Brandon that we have been looking at in Audio Arts.
The sound effects are implemented through various tables in an extensive database. Even the music for the game uses its own database system, which allows the mood, theme and tempo of the game-play to dictate the audio. Halo is certainly not the first game to utilise such an audio system, but it's definitely one of the better and most successful implementations of this somewhat lofty idea of what music for games could be. In fact this is true of the audio throughout the game: in the final stage of O’Donnell’s process (‘production’) he has made the entire soundscape of Halo 2 a seamless integration of VOs, SFX and music, in which the world of Halo truly comes to life.
Game Development Environment (Halo Engine)
Advantages: Well established, bug issues ironed-out, upgradeable (eg: Havok physics), proven to work, cross-platform (using C/C++ and developed for PC and various consoles).
Disadvantages: Fundamentally out of date (for example, unable to display at 1080p resolution).
Game Audio Environment (Halo 2 Audio Engine)
Advantages: Many technological advantages, 5.1 channel audio, integrates with the physics engine (Havok) to control DSP effects parameters (for example reverberation size), programmed specifically for this title.
Here is a link to an interesting interview with the man himself, Martin O'Donnell. It's quite insightful, despite not saying one bad word against Halo (heresy).
.sources.
Haines, Christian 28.08.07, "Game Engine Overview," EMU, Adelaide University.
0 comments copywrite 3:01 pm
15.8.07
plogue bidule II - oscillate this
.forum - week four.
Dual Potentiometers
Light-Sensitive Potentiometer
DJ Probeboy hacking up Tool
On the topic of electro-bitz and hacking, I found this interesting tutorial on how to turn a DVD player into a light saber. Freakin' laser-beam!
Laser Flashlight Hack! - video powered by Metacafe
.sources.
Haines, Christian 9.08.07, "Modular Electronics," EMU, Adelaide University.
2 comments copywrite 11:22 am
14.8.07
polyphony and instancing
.cc - week three.
This week we updated our cycle~ and phasor~ objects. They are now polyphonic.
i. Polyphonic Cycle
Thankfully this week's subject matter was a little bit easier to get my head around. So I hope you like it.
Is it really week four already?! What the hell happened? Where is the time going?
.sources.
Grosse, Darwin 2006, "The Poly Papers (1)," Cycling '74 online.
Haines, Christian 9.08.07, "Polyphony & Instancing," EMU, Adelaide University.
0 comments copywrite 5:58 pm
13.8.07
halo 2 online
.aa - week three. Halo 2 assets.xls (excel doc) | Halo 2 assets.jpg (image)
Halo 2 has been played online several million times. The audio component to this online sensation is vital to its success, and has been observed as such by many ‘game award’ ceremonies. As shown in this video, Halo 2 multiplayer is intensely fast paced, where players must use absolutely all information feed to them from the game to their own advantage.
One aspect of the audio that I do not mention in this asset list is the player-to-player communication provided by the Xbox Live headset (pictured). This type of communication is widespread now in almost all online console gaming and all online PC gaming. Everyone who plays online games is affected by this development in multiplayer gaming. It makes for a much more personalised gaming experience, and I see it very much as a new avenue of globalisation. We have all heard of online dating; well, this development in game audio has made the online game environment just as suitable a place to find new love as the pub. And yes, that’s a little unnerving.
.sources.
Haines, Christian 7.08.07, "Process & Planning," EMU, Adelaide University.
1 comments copywrite 1:58 pm
7.8.07
fun with piezo
.forum - week two.
Learning to solder was going great, until on the last weld I burnt my finger. ƒΩçK!!
From there I had a bit of a play with a piezo mic (or speaker if you catch them on a saturday night). The great thing about these is you can throw them about like a cheap whore - just like you'd normally do to university microphones. Haha that's a joke just for you Christian (you read these right?)
Anyway, I tried a number of different ways of using the piezo as a sort of 'contact' mic to try and catch sound vibrations out of water. Toward this goal, I filled my lunchbox with water and put a layer of glad-wrap over the top. I tried swishing the water around but it just did not achieve the sound I was after. Then I had the idea of putting my iPod headphones onto the glad-wrap – to see how travelling through water would alter the sound of .. Parkway Drive, as the case was.
I must confess this is simply a reenactment of the way it went; I actually recorded the setup much further back from my laptop, so as to avoid spill from the headphones into the laptop mic. After accidentally getting my headphones, the piezo and my iPod wet, I figured I would just mess around with the little mic to see what sounds (nay, noise art!) I could come up with. The result was entitled i need take a piezo bad (584 kB)
.sources.
Haines, Christian some time last week i rekon it was "Welding and fondling the electro-bits," EMew! Aderaide University... You know the one on North Terrace. NO~ not the the cool looking one next to Fowlers, the one further up where all the ugly kids get off the bus
0 comments copywrite 10:14 pm
6.8.07
musick
i have been working on some new tracks. this one sits in the minimal glitch electronica vein again.
slow.mp3 (4.2 MB)
it started off being called 'slow', then 'not quite asleep', then 'dress formal for bed', and now it's just called 'slow' again. ahh, whatever... perhaps i need to throw a cricket ball at my keyboard and see what comes out (a la Autechre).
0 comments copywrite 3:03 pm
signal switching and routing
.cc - week two.
i. sine function generator
jam.cycle.zip
ii. sawtooth function generator
jam.phasor.zip
We added these sound creation objects to our MSP libraries this week, as well as adding a mute function to our previous objects. You can find these updated objects in my archived library below.
jam.msp.library.zip (16 kB)
The major challenge I faced this week was understanding the concept of the 'phase' inlet on the phasor~ and cycle~ objects. I understand the fundamental concept of phase, but what I cannot figure out is how sending a number greater than one or less than negative one affects the sound, if that is the range of these functions..?
Aside from that it was another fairly straightforward week of MSP, with my main challenge being to read through the tutorials. Reading a Max/MSP tutorial is an exercise that I likened to eating a raw Weet-bix.
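As far as I can tell — and this is my own guess, sketched in Python rather than Max — the phase inlet is simply wrapped modulo 1.0, so values outside the 0–1 range land back inside it and sound identical to their wrapped equivalents:

```python
import math

def phasor_samples(freq, n, sample_rate=44100.0, phase=0.0):
    """A phasor~-style ramp from 0 to 1. The starting phase is wrapped
    modulo 1.0, so (if my guess is right) phase=1.3 and phase=0.3 give
    identical output -- values outside 0..1 just land back inside."""
    p = phase % 1.0
    out = []
    for _ in range(n):
        out.append(p)
        p = (p + freq / sample_rate) % 1.0
    return out

def cycle_samples(freq, n, sample_rate=44100.0, phase=0.0):
    """A cycle~-style tone: read a cosine at the wrapped phasor position."""
    return [math.cos(2 * math.pi * p)
            for p in phasor_samples(freq, n, sample_rate, phase)]
```

If that is right, then a phase of 1.3 is just 0.3 again, which would explain why out-of-range numbers don't seem to change anything.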
.sources.
Haines, Christian 31.07.07, "Signal Switching and Routing," EMU, Adelaide University.
0 comments copywrite 2:13 pm
3.8.07
final fantasy VII
.aa - week two.
Final Fantasy VII came out in 1997 for the PlayStation, and I was stoked.
There are absolutely no VO (voice-over) recordings in the game whatsoever. Basically the story is so long and convoluted, and contains so many characters, that composer Nobuo Uematsu decided the limitations of using recorded audio were too great. Interestingly this was not the only limitation he faced: he decided to use only the PlayStation's internal sound chip, and created a huge score of MIDI tracks for the game. He did this not only to free up the disc to utilise the power of the PlayStation's visual hardware, but also so that the music would never stop to load – much unlike the visual element. In this way, FF7 is certainly an example of a game in which the audio was a primary factor in its success. Given that the game comprises three discs, it was a truly colossal game, with hours of typically frustrating game-play, movies and (best of all) music.
Aesthetically the game opens in the industrial world of Midgar (shown below), with the sounds of steam blow-offs and chrome metal smashes. Most of the sounds conform to this feeling that a dark, cold metal city sits in the night sky. This kind of image is often painted by industrial music too, making use of sharp ‘nasty’ MIDI sounds; this is probably what draws me to FF7 the most.
The interface is navigated with two sounds – one as you move the cursor over different buttons and one when you select a button. Environmental sounds normally come intertwined with the music – for the most part the game is set in industrial cities, and the music matches this. Also, I recall a part of the game where the characters walk through a windy desert, for which the music contains a swishy white-noise sound. As well as the environment, the music often incorporates the mood of the story. A simple example of such mood-music would be when the characters enter battle or face a boss, and also when the player wins a battle. In this last case, a triumphant marching-band sort of theme is played, and a ‘chimey’ sound effect represents the money being collected from the bad guy.
.sans analysis.
These recordings are pretty average quality, but hopefully you get the idea.
UI SFX
In this example you hear me move the cursor three times then make a selection. These are the interface sounds used throughout the game (main menu, in-game and so on).
FMV music
This is the audio that accompanies the opening cut-scene, in which I think you get a good idea of the industrial feel of the music. After around 5 seconds you hear the ‘verby’ footsteps of Aeris Gainsborough walking down a chrome metal street in Midgar. Towards the end, after 31 seconds, you hear the huge chords which accompany the main FF7 logo (see above) coming onscreen.
Battle music
This sample begins with a swishy blow-off sound which, as any FF7 player will know, occurs when a player goes into battle. The feel of increased tension in the music suits the in-game action. You can hear the same UI sounds as before after about 11 seconds (selecting to attack), then at the 12-second mark you hear the main character Cloud Strife jump over the enemy, attack him with his sword, and finally you hear the enemy die. This last sound effect makes a sort of ‘a-o-waaa’ noise. Love it. At around the 25th second, you hear the music change, playing big strong chords, signifying the battle has been won. After this the music changes again to a more mellow feel, as the player chooses what loot to collect from the defeated enemies.
Critical Strike
Many games make use of ‘critical strikes’, whereby the attacker has a chance to do around double the damage they would normally do on a melee attack. After the critical strike sound effect, this sample also contains the enemy dying (a-o-waaa).
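The mechanic itself is simple enough to sketch in a few lines of Python — the numbers and function name here are illustrative only, not FF7's actual formula (which I gather also factors in character stats):

```python
import random

def melee_damage(base, crit_chance=0.1, crit_multiplier=2.0, rng=random.random):
    """Roll for a critical strike: with probability crit_chance the hit
    does crit_multiplier times the base damage. Returns (damage, was_crit).
    Numbers are illustrative only -- not FF7's actual formula."""
    if rng() < crit_chance:
        return base * crit_multiplier, True
    return base, False
```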
.sources.
Brandon, Alexander 2005, "Audio for Games: planning, process and production," New Riders Games, Berkeley, Calif.
Haines, Christian 31.07.07, "Game Audio Analysis," EMU, Adelaide University.
0 comments copywrite 10:25 am
30.7.07
2 + 2 = 5
.cc - week one.
This semester we are creating our own MSP (Max Signal Processing) libraries, with the aim of pooling them all together for the final project later on. To begin with we were asked to make the following two objects:
i. Pop-less DAC (Digital to Analogue Converter)
ii. Volume fader
.thoughts.
For the most part the original dac~ object is better than my jam.dac~ object. While my object may not pop to begin with, it instead has a small volume ramp. Having said that, I have not really encountered a problem with the original dac~ popping. On top of this, dac~ has other features like the 'stopwindow' message, as well as an endless addition of channels and messages which arrange those channels. If you double-click on the dac~ object it opens the Max "DSP Status" window, which has heaps of options/preferences for Digital Signal Processing in Max. So while I learned some stuff about MSP making jam.dac~, it's not going to take over from dac~ just yet.
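The click-avoidance idea behind jam.dac~ translates outside Max too. Here is a minimal Python sketch, assuming (as in my patch) a short linear gain ramp at switch-on — the function name is made up for illustration:

```python
def ramped_gate(samples, ramp_len=64):
    """Open the 'DAC' with a short linear gain ramp instead of an instant
    jump from 0 to full volume -- the instant jump is what produces the
    audible pop, since it puts a discontinuity in the output signal."""
    return [s * min(1.0, i / ramp_len) for i, s in enumerate(samples)]
```

At 44.1 kHz a 64-sample ramp is under 2 ms, which is short enough to be inaudible as a fade but long enough to kill the click.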
.issues.
The only way I know of incorporating these objects into other patches is by using the 'bpatcher' object or by creating a 'patcher' and copying the whole object's patch into that window. Having said that, clicking on 'help' for each of my custom objects will only open the 'thispatcher.help' window. Also, I was unable to set the 'Ramp Time' for the jam.fadeamp~ object as an argument (it can only be inputted through an inlet).
It is going to be a semester of number crunching. I can smell the blood, sweat and tears already.
.sources.
Haines, Christian 26.07.2007 "Music & Sound Processing." EMU, Adelaide University.
0 comments copywrite 2:35 pm
victorian synth
.forum - week one.
victorian synth on youtube
Making a ‘Victorian Synth’ in Music Tech forum was a fun and simple introduction to the realm of instrument construction. As far as I could tell, the two main ways to alter the sound of a speaker connected to a 9V battery were, firstly, to alter the signal flow between the battery and speaker (like running the connection point over a metal file) or, secondly, to alter the output capability of the speaker itself (like burying the speaker in a bowl of rocks). Using the second of these possibilities, Seb used a paper clip to generate a pitch-sweeping pseudo-oscillator tone. For me this was the most impressive use of the victorian synth.
Mitchell Whitelaw’s keynote presentation at ACMC last year featured a video in which a small green dot wobbled around and expanded into different shapes. I was really impressed by this video. He went on to explain how it was created: by connecting a weird mishmash of electronics into a video input. I am hoping that this ‘Electronics, Instruments and Improvisation’ course touches on other media (particularly visual) at some point.
This kind of experimentation with electronic componentry can obviously have breathtaking results; having said that, I do not think I will ever need to play a victorian synth again. If we were to compare what we were doing with the victorian synth to the tele-visual medium – that is, if we were outputting through a screen rather than a speaker – the equivalent would have been drawing on the screen and putting objects on it, as we were doing to the speaker. In the visual medium one would tend not to do this; it seems a little stupid, and I am happy working from behind the output device. The possibilities of what one can create and send to a speaker are greater, and preferable to what one can create by altering the vibration of that speaker.
This course intends for each student to create an instrument, which we then use for improvisation. I have an idea for my own interface/instrument, which springs from this dilemma: oftentimes I am sitting at my computer tapping a beat on the desk, then trying to write this into a program like Sibelius or Reason. While both extensive experience and years of aural classes have diminished this problem for me, I still feel that an interface which cuts out the middle man between my desk, my brain and the computer should exist (perhaps one does?). So basically what I am envisaging is an electronic drum kit that is played with your fingers and is primarily (but not exclusively) a tool for composition. As I have said already, the avenues toward visual output are also attractive. Just an idea.
.sources.
Haines, Christian. Tomczak, Seb. Whittington, Stephen. 26.07.07 "Introduction to Electronics, Instruments and Improvisation," EMU, Adelaide University.
0 comments copywrite 11:51 am
27.7.07
game audio
.aa - week one.
Bubble Bobble was released in 1986, before I was born, but was popular enough to make it onto plenty of subsequent game systems. Beginning as an arcade game, it's now available for PS2 and Xbox 360.
It incorporates two-player cooperative play really well (the second player can join at any time). It was one of the first games to have multiple endings; finishing the 100 levels in single-player mode will only warp the player back to a previous level. Finishing the game with two players will successfully rescue Bub and Bob's girlfriends. Giggity. However, this is not the "true" ending; for that, one must enter the code obtained from rescuing the girlfriends on the starting screen to unlock 'super' bubble bobble. In this mode the enemies are randomised, making it very difficult for the two players to complete the game (again) and clock the true ending! Some of the baddies...
Stoner
Baron von Blubba
In the game there is this repetitive song that loops over and over, until the single melody of the song is so ingrained in your head that all you can think about is shooting bubbles out of little dragons' mouths and bopping along to that silly tune. Apart from that, the only sound effects are the sound of the dragons' jump and the sound of a bubble popping. Aesthetically, the music blends in nicely with the sound effects and the feel of the whole game generally - bubbly and silly.
There is a remake of the game available for OSX here
.sources.
Haines, Christian 24.07.07, "Introduction to Game Audio," EMU, Adelaide University.
http://en.wikipedia.org/wiki/Bubble_bobble
accessed 27.07.07
0 comments copywrite 10:34 am
26.6.07
saxophone quartet
In the previous week I recorded a saxophone quartet. The quartet includes:
Bianca Pittman - Baritone Saxophone
Hamish Buckley - Tenor Saxophone
Martin Cheney - Alto Saxophone
Kristy Williamson - Soprano Saxophone
1 comments copywrite 4:18 pm
extravehicular
9'19"
This piece is built over a thick ambient drone, which is composed from a drum sample being played extremely slowly – at a speed that oscillates back and forth, playing it in reverse too. The main instrumental accompaniment to this is a “probability” drum machine. This purpose-written program uses programmed percentages to determine the likelihood of playing each drum. The resulting drum score is haphazard, lending itself to the industrial samples used. Together the drone and drum parts offer a very ‘digitally’ imbued version of electronica. Against this an acoustic piano is juxtaposed, to provide a small and distorted element of human melody – distorted in the way the piano is produced and cut up to add unnatural elements to the instrument's distinctive sound. Aesthetically, the piece sits somewhere around the industrial sound of Nine Inch Nails and the minimal glitch sound of the Alva Noto and Ryuichi Sakamoto collaboration. The piece is also greatly inspired by the “Odyssey” series by Arthur C. Clarke; this piece too is a snapshot of a human operating in space. The droning ambience would here be emblematic of the machinery in such a scene, while the piano would be representative of the human.
1 comments copywrite 4:13 pm
11.6.07
snerf snerf
.cc - week twelve.
"I want this to go from this beat to that beat over this amount of time, with this curve, which is shaped according to this equation." (Booth, 2004)
This quote from Autechre was my primary inspiration for this semester's Max patch. I have called it a 'probability drum machine'; it basically uses user-defined percentages to determine whether a drum should be played on a certain beat. For example, a user can specify a 100% chance of a kick on beat one, and a 56% chance of a kick on beat 'two-and'. This creates an exciting departure from what is commonly found in dance music – instead of the common 'hit the crash every four bars', there could be a 25% chance of hitting the crash every bar. Another nice thing about the prototype was the ability to control the balance between randomisation and specified input – it was easy to set up a train of kick on one and snare on three, but have somewhat more random, and realistically intuitive, hi-hat hits over the top. I also want to be able to morph from one beat to another over time (a la Autechre), create simple panning and volume automations, and simply turn the thing on 'auto'.
The program itself will work best in collaboration with "real" instruments, so for the piece I make I intend to accompany the laptop with the piano. In the piece I am drawing inspiration from Autechre, some Nine Inch Nails industrialisation, as well as the electronica minimalism produced from the collaboration of Alva Noto and Ryuichi Sakamoto.
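The core idea of the patch can be sketched in a few lines of Python — a hedged illustration only, with made-up probabilities (the actual prototype is a Max patch, not this code):

```python
import random

# Chance (in percent) of each drum firing on each of eight half-beats.
# e.g. the kick is certain on beat one and 56% likely on 'two-and'.
pattern = {
    "kick":  [100, 0, 0, 56, 0, 0, 0, 0],
    "snare": [0, 0, 100, 0, 0, 0, 100, 0],
    "hat":   [70, 70, 70, 70, 70, 70, 70, 70],
}

def render_bar(pattern, rng=random.random):
    """For each step of the bar, roll the dice per drum and return the
    list of drums that should fire on that step."""
    n_steps = len(next(iter(pattern.values())))
    bar = []
    for step in range(n_steps):
        hits = [drum for drum, probs in pattern.items()
                if rng() * 100 < probs[step]]
        bar.append(hits)
    return bar
```

Setting a lane to all 100s or all 0s recovers a fully specified beat, which is the "balance between randomisation and specified input" mentioned above; morphing between two patterns would just mean interpolating the percentages over time.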
.sources.
Haines, Christian 31.05.07, “Miscellaneous Topic,” Electronic Music Unit, University of Adelaide.
0 comments copywrite 1:39 pm
6.6.07
sit on the spiral
.forum – week twelve.
There is an interesting polarity between improvised and composed music. Like all good polarities, I am interested (for some strange reason) in comparing it foremost to left- and right-wing politics. As such, improvised music I would compare straight away to communism, while composed music would be more indicative of dictatorial fascism. Now, what the hell does that have to do with anything? Well…
Polarity. It is a scheme which seems to control everything; darkness versus light, good versus evil, classical versus jazz. No matter what your philosophical viewpoint is, surely it sits somewhere on a scale, a polarity which caters for all. Perhaps the most intriguing polarity to which I have thus far been introduced is the one that Professor Mark Carroll is oftentimes found spouting. That is, the polarity between Apollo and Dionysus, the head and the heart and between intellect and the senses.
So how does this sit in the war between improvisation and composition? Composed music requires a certain degree of intellectualism, while improvised music could be considered more impulsive and sensual. Stephen Whittington seemed to make mention of composition appearing to ‘win out’ in western culture. I feel that this is just one battle of polarities that continues to define the life of music, to this day. While certain entities can sit entirely on one side of the spectrum, the spectrum continues to exist with many wonderful exponents.
In the current day – post-modern, as I later found out – I find it quite amazing that people can still cling so stringently to their particular position on the spectrum. It is obviously still accepted; I mean, there are still people making money with dance music and even ‘old-school’ metal is making a return. But to be honest, at the moment I am listening to Gui Boratto’s new techno wonder “Chromophobia,” I have Stravinsky’s “Pétrouchka” in my car’s CD player, I am learning “Laid to Rest” by Lamb of God on guitar and one of Clementi’s Rondos on piano. I like dipping my toes into all aspects of the argument, and am just genuinely interested in doing way too much. In other words, right now I am sitting a bit like white noise on the spectrum.
Although, it’s entirely possible that I am just a dirty commie.
.sources.
Whittington, Stephen and Harris, David 01.06.07, “Composition versus Improvisation,” Electronic Music Unit, Adelaide University.
3 comments copywrite 12:47 pm
4.6.07
mastering superman
.aa – week twelve.
This week we continued the mastering exercises, taking original (un-mastered) recordings to make our own using digital plug-ins. We also had the Edensound Mastered versions to compare our own efforts to. I used the filterbank EQ, Aphex Aural Exciter, MC2000M4 compression, Maxim Limiter and the power dither.
superman (un-mastered) | superman (my version) | superman (Edensound)
The main difference between my mastering and the Edensound is that the bass is stronger in my version, while the Edensound sounds a little more ‘chimey’ and ‘verby’ around the vocals and acoustic. They are both loud; the Edensound is both a little louder and a little warmer (less crunchy) in pushing the gain. This is the difference between digital and analogue equipment. I noticed also that while I grew to prefer my EQ and compression settings, they only worked particularly well with the speakers I was monitoring with; when I played the recordings through my laptop speakers the Edensound version was again better.
.sources.
Grice, David 29.05.07 "Mastering (2)." Electronic Music Unit, University of Adelaide.
0 comments copywrite 2:15 pm