
15.11.06

eclipses
by jake morris & deanna djuric
image by ben probert

there were some issues with the gain levels going over 0 in the recording bidule (which unfortunately doesn't display this) and so this recording is a bit shabby. if you feel up to ignoring those issues, then click the picture and have a listen.

8.11.06

here's the music technology (sound design) minor project. this is the stereo bounce of a work made for 5.1 surround, but it gives you a bit of an idea of what i came up with, so enjoy!

26.10.06

Unlike last week, where the ‘mojo’ was perfect for improvisation, this week the group had a little more difficulty making a decent sound. Nevertheless, the members (who attended) should be congratulated: we did manage to achieve a good sound at several points in the jam session. It may not have fallen together easily, but we were still able to come up with something, so well done crew. One song in particular was awesome; in this my drum loops did not change too much, but I just tweaked them gradually to sound cleaner and more complex as the song went on, to the point where I really did not want them to stop. Ever.

As I have done every week, I wanted to bring something new to the session. However, having not had time to make any other loops, I instead decided to get a copy of the freely available, under-65 MB demo of Live 6 to change things up a bit. My verdict? Wow. Live is an easy-to-use program with endless possibilities, and it is becoming more and more refined at doing what you want it to. I truly would recommend playing with Live to any aspiring musician.

By the time I had fired up Live 6 for the first time, my group had already begun rocking out. As I really threw myself in at the deep end with this software, I probably ruined the chance of creating a decent drum sound for 15 minutes at most, but by the time everyone was starting to settle and listen to each other I was able to work my way around the nuances of the new Live well enough to create a decent beat. I know that once I have Live 6 worked out as closely as I have Live 3, I am going to be a multibillion-dollar rockstar skilled loop artist.

This session we spent together will probably be the last before the end of year concert, where the EMU guys and gals all let their hair down for one more chance to dance the night away. Actually I was being sarcastic, but who knows; I remember being at my school formal thinking ‘why are they all dancing!? NOBODY told me there was dancing!’ Assuming I do not have to dance, I am actually rather looking forward to this concert. So long as the members of our group who never turn up do the same for the concert, I am convinced that we can put on some decent music.

So loyal readers of weimerhead: the untold tales of the ninja robot samurai from outer space, this post sadly brings the year to a close. I myself will henceforth be busily finishing assignments, so drink a beer for me and I will see you all at TOOL - BIG DAY OUT 2007.

25.10.06


23.10.06

Last Thursday my group had much more success in the improvisation session. We all seemed to fit together much better, and for a change the hour we spent jamming was actually a lot of fun. We set up a little bit differently than we usually do, with everyone in a circle. This allowed everyone to make eye contact with anyone else, and on the whole communication was increased greatly compared to other weeks. Another factor aiding the level of communication was the fact that Ben walked around and was able to suggest to the group upcoming ideas that we could consider. An example of this: in one particular song Ben wanted to slowly decrease the tempo. Obviously if I did this ‘mid-jam,’ without everyone knowing my intentions, it would have had quite disastrous effects, but Ben was able to say to me- ‘gradually slow it down, everyone knows it's coming.’ Dragos and Albert were absent, and our bass player started to play more rhythmically than melodically, which left melodic construction in the capable hands of Dave and Matt. I was busy concentrating on the rhythmic side of things, but from what I heard Matt and Dave seemed to share the weight pretty well.

In the rhythm section Ben and I were at last able to share the sound of the drums a little more freely. For a change this week I started with very cut-back drum loops, usually with only hi-hats to play around with for a while. This allowed Ben much more freedom with his own improvised beat over the top. This format would continue for a few minutes and, depending on what Ben was doing, usually I would start to put in a snare on the same beat as Ben. By this time Ben would be starting to fatigue a bit, so I might then take over the drums for a while. While Ben and I were doing all this, I noticed that our bass line was very in tune with what we were doing and seemed able to lock in and even adapt with the changes in the drum section. In the rhythm section at least, it was one of the first times I have really felt like I am able to feed off the ideas of the other group members and that they are able to do the same for me.

In Music Technology’s final hour Luke Harrold came in to the EMU to give the students an idea of the goings-on at NIME 06. He presented a video of a performance by Adachi Tomomi in which he gives a demo of a couple of his creations- the carrot flute and an infrared music shirt. It was pretty obscure really, but I must admit that Mr. Tomomi put on an exciting show. The way he began each performance was to casually sing old-style Japanese chants, and then abruptly bring in the futuristic sounds and movements of his instrument. This made him look quite comfortable with the equipment, making the performances feel almost like a glimpse into the future.

Lastly I will add that I am quite sick at the moment, and you may not see too much of me in this last week of the term. Feeling very out of it, but I am doing my best to get everything done one step at a time. Update completed!

Reference:

Harrold, Luke. “NIME 06.” Lecture presented in EMU Space, 5th Floor, Schulz Building, University of Adelaide, 19/10/2006.

17.10.06

Disorganised is the way I would describe last week's improvisation session. This was perhaps seeded in the late email I received (on the Thursday morning) that established this forum slot would indeed be used to play with your group. As a result I did not have my laptop with me, and could not perform the way I normally do. However, the email is not all that can be blamed; there are certain members of my group who are still deciding which method they want to use to create sound for the effort- even a week after the performance?!?! This disorganisation led to an atrocious sound and, on my part, a headache.

With the loss of my beat-box laptop, I decided to instead play the electronic drum kit, using the little drum skill that I have salvaged from drum lessons I took late last millennium. For some time at least I tried to keep my playing simple, just so that everyone might stay in some sort of time. However the amplification of the kit was not placed near me, and my playing was dwarfed by the G5, which basically surfed different samples of drums and who knows what else while everyone attempted to play. When I say surfed, I do not denote any sort of Ableton Live-esque real-time seamless integration of samples; I mean someone opening sound files whenever they like while people attempt to create music. Here I make an example of the sound source responsible for giving me a headache, but might I add this was not the only contributor apparently ‘in their own little world.’ Sometimes I feel like the few in the group who do in fact try to collaborate with others, and ‘do their homework’ so to speak, have pretty much given up. And who would blame them? Who would want to play in a three-piece band with a few soloing anomalies, when everyone could just solo?

If this week is to be the same kind of session of playing with your group (with no email as yet to confirm or deny), I hope that things can be improved via the following:
- less sound
- stable, repetitive rhythm section
- synthesizers (if played during that particular week / jam) placed on a quieter and more ambient (less tonal) setting
- levels checked, then maintained
- no mish-mash of people jamming and others working out which synth / electro kit patch, drum sample etc. to use
- everyone commence the ’jam’ at around the same time
- decided scale / mode
- one main melodic centre / voice at a time

16.10.06

Audio Arts - Spectral Water: week 10.


9.10.06

audio arts - week 9 - tone wheel


week nine - creative computing

Click on the image below to hear Will & Luke using Open Sound Control to manipulate a Plogue patch running on my computer, which I'm controlling with the JV-30 keyboard. The music I'm playing is 'Right In Two' by Tool. Luke is manipulating the harmonics with the menu shown, while similarly Will is using the flanger shown. We call it 'Will & Luke slamming my bar' - a gay time was had by all.


8.10.06

On Thursday DJ TR!P spoke to the forum class about his understanding and application of improvisation. What I found interesting about his presentation was that his situation is a little ‘closer to home’ for me, compared to Dr. Chandrakant Sardeshmukh’s presentation for example. His public speaking ability was not that great, but that did not get in the way of him ultimately being able to get his message across, and I feel that those who paid careful attention to what he had to say were rewarded. He spoke extensively about collaboration, and gave me a few ideas to try when it came to performing with him.

Being an Adelaide-based musician, I was really interested to get an understanding of TR!P’s career and how everything is working out for him and music around my city. I still have not decided whether I would be content with a career such as his, but he certainly has given me plenty to think about. There is an appeal in working in bars and clubs, having a job like that, but at the same time, I cannot see there being too much money in it. Perhaps work like this would be good to do while you’re still young, in your 20s or around that mark. However if you ever wanted to start a family or business or buy a car or anything like that, well, I think you might have to reconsider your line of work. Plenty of thinking to be done.

Accompanying TR!P was a CD turntable, something I have heard about but never seen in the flesh before. If I ever were to consider using a turntable (it is very possible) I would definitely go for one of these- simply because you can burn a CD with your own material on it, rather than having to recycle old music as with a vinyl turntable. The rest of TR!P’s arsenal: a sampler, a fader/mixing desk, a Game Boy and of course a laptop running Windows. He, like so many of his kin, used Fruity Loops on his laptop.

The actual performance went really well from what I could tell. The audience seemed very pleased with what we produced- this is the main thing I guess. The only thing that I am a little concerned with is that people might not have been able to distinguish between what was coming from my laptop and TR!P’s. Apart from that it was a reasonably deep performance and a nice environment for everyone to jam in. As always, Dave’s guitar lines were tasteful while still being over-the-top awesome- his ability to improvise on his solo instrument was surely freer than mine or probably anyone else’s. I certainly envy his skill with improvisation. Nevertheless the rest of the group had their moments, and it pulled together well.

One other thing that I thought about during the performance was how TR!P and I had to agree on a tempo before each song. We verbally told each other the tempo then manually set it on our computers. However, if he were using a Mac, and if Live and Fruity Loops supported OSC (OpenSoundControl), we perhaps would not have to discuss the tempo, but rather just have them synced. Also we could have then changed the tempo if we so desired and still been on exactly the same page. All this in the future I guess.
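
Just to sketch what I mean (this is my own illustration, not anything TR!P or I actually ran): an OSC tempo message is really just an address and a value sent over UDP, so two laptops on the same network could share one. A minimal Python sketch, assuming the python-osc package and a made-up /tempo address that the software on the other laptop would somehow have to be listening for:

    # Hypothetical example only - the address, IP and port are made up.
    from pythonosc.udp_client import SimpleUDPClient

    client = SimpleUDPClient("192.168.0.2", 9000)   # the other performer's laptop
    client.send_message("/tempo", 120.0)            # send the agreed tempo in BPM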

Bibliography

Hopprich, Tyson. “Improvisation.” Workshop presented at EMU Space, 5th Floor, Schulz Building, University of Adelaide, 5/10/2006.

18.9.06

The moment I entered the EMU space last Thursday I was helplessly enchanted by the wonderful sound of Dr. Chandrakant Sardeshmukh’s Sitar (like a guitar except you sit to play it). This man’s wonderful instrument really was a testament to the fact that there are just so many different types of sound out there. At one stage, soon after I first walked in, I think the doctor looked at me and smiled; I must have been sitting there grinning like a schoolgirl. Just such a truly beautiful sound.

His playing ability was obviously very advanced, and I would probably have gained a lot more from the workshop if I were able to understand more of what he said. Much was mentioned about Indian music and form, and unfortunately I really did not understand much of it! However, it is quite refreshing to hear people from other countries speak English, especially to hear words that you are familiar with used in contexts that you are not.

Toward the end of the workshop it once again came time for the guest to rock it out with one of the improvisation groups. During this concert I was particularly impressed with the performance of Vinny. He played a sort of whiney ambient sound from his laptop and then went ahead and played the tabla. Perhaps his own Indian heritage helped him understand the playing style of the doctor, or perhaps his own skill with improvisation allowed him to perform quite comfortably and effectively. Most likely both of these things. The remainder of the group contained Tyrell, Luke, Seb, Daniel and Poppy. Tyrell was quite good; he was able to listen to the others and know when to enter with his old-school synthesizer. Even though he used a tone that sounded like it would suit an Atari video game, he did well to fit in with the natural sounds of the Sitar and Tabla. Luke and Seb were again able to choose when to play and when not to. However, I think if given the chance to perform with the group again they might have added a lot more of their dirty tones and not acted as such slaves to the Tabla and Sitar. Daniel played some weird noise-maker thing, and among all the sound I am not too sure which sounds he made and which he did not, but he certainly did not seem to detract from the overall aesthetic. Finally Poppy used her voice to produce both long droning howls and short, percussive shrieks. I thought that at one stage she might have broken into one of her infamous ‘spoken word’ recitals and started preaching anti-immigration propaganda, but alas, she chose to stick safe with ‘non-English.’

Bibliography
Sardeshmukh, Chandrakant. “Improvisation Technique.” Workshop presented in EMU Space, 5th floor Schulz building, University of Adelaide. 14/09/2006

the Soundscape of Pitcairn Avenue

BUS (Distant)
1. 200 metres
2. 30 dB
3. Distant, but somewhat distinct
4. Lo-fi, technological
5. Isolated
6. No Reverb

CAR
1. 40 metres, passing at 15 metres
2. 62dB
3. Indistinct, except when hitting a ditch (crescendos, decrescendos)
4. Lo-fi, technological
5. Periodic repetition
6. Short reverb

DOOR SHUTTING
1. 20 metres
2. 68dB
3. Heard distinctly
4. Hi-fi, human
5. Short, percussive
6. Long natural reverb

BIRDS
1. 15-50 metres
2. 35-55dB
3. Distinct
4. Hi-fi natural
5. Periodic repetition
6. No reverb

Ambient Murmur
1. 250+ metres
2. 25dB
3. Indistinct
4. Hi-fi, natural
5. Unending
6. No reverb


This Thursday’s forum contained an extremely intriguing presentation from saxophonist Derek Pascoe. He talked primarily about his experience with improvisation, taking a purely personal perspective. Rather than taking the common ‘this is about what I do and why everything I do is great’ angle, he simply deconstructed his improvisational technique and how he developed it from the ground up. I really enjoyed the way he was able to meld completely abstract concepts like spirituality with practical ideas to consider in an improvisational situation. He talked in detail about techniques he has practised daily to become prolific in improvisation. His devotion to improvisation, through such practices, has given him a personal and unique view into the art. For every concept mentioned he was able to explain how he came to arrive at that idea. This made the forum very interesting, particularly as he did not simply seem to constantly quote ideas from other musicians, but rather discussed his solitary approach to thinking about improvisation technique.

As a part of the presentation Derek Pascoe performed with one of the improvisation project’s groups. There were four songs performed, which by and large improved as the afternoon progressed. The first song involved audience participation; however, everyone was too nervous and really the whole thing was not worth mentioning. However, during the next three songs there were certain moments that were really quite fascinating. In the third song, the performers were instructed to follow and mimic each other’s noise. This unfolded with quite interesting results, with the bass, guitar and saxophone at one point sounding quite trance-like. The last performance involved the performers having conversations. To get the ball rolling here Derek Pascoe played ‘Smoke on the Water’ and whatever that song is that you hear on the trumpet at the race track. John responded, but unlike Derek Pascoe, who extended and lengthened each note, he embellished the line in a guitar shred. In that respect I think that John was what Derek Pascoe titled “the villain” towards the saxophone part. All considered, it was a very interesting couple of hours, and it has certainly changed, in some way, how I listen to music since.

Bibliography


Pascoe, Derek. “Improvisation Technique." Workshop presented in EMU Space, 5th floor Schulz building, University of Adelaide, 07/09/2006.

12.9.06

6.9.06


click on the picture for sound

here instead of using a MIDI keyboard for MIDI input I have used the note generator (or two of them, simulating the left and right hands, I hope). It sounds like the ocean, does it not?

4.9.06

This week’s improvisation session sounded possibly worse than last week’s. The addition of two tone-exploring synthesizers had much to do with this, I feel. Every week in the improvisation session I seem to pick up some way to improve my drum loops, usually in response to criticisms made by my colleagues. Again this week Dave was the catalyst here, suggesting that listening to the same two-bar loop for the hour can get a little tedious (no surprises here). I have decided, then, that maybe it would be a good idea to create longer samples, so that they take longer to loop over and are not as grating. Also Dave suggested maybe adding ‘fills’ to the performance, reminding me of an idea I formed weeks ago- to set the drum loops to group A (within Live) and a continuously playing (but muted) fill to group B. Hence, whenever I feel the time is appropriate, I pan across to group B and play a fill. Thankfully, the issues I had last week have been overcome; that is, no longer are people complaining that the drums cut out completely, leaving them exposed as the tentative improvisers they mostly are. I achieved this by implementing the idea I specified last week.

This week’s presentations included Tyrell, Poppi, Albert and Josh. I particularly enjoyed Poppi’s presentation. Her movies were very intriguing and well suited to the music, as was her solemn ‘spoken word.’ All of these films were cut short for the sake of saving time, but now I would really like to see more. This might sound a little strange, but the movies were very typical of a uni student, if that makes any sense. Probably not. Tyrell talked a bit about the music he has been making for video games. I was pretty interested to hear something composed by Tyrell, but unfortunately he mainly played us music he created for a children’s video game, so the emotional depth was severely shunned in place of… well… childhood cheesiness. Despite the fact Josh seemed about as excited about his musique concrete piece as I am about going to work in 21 minutes, I thought the piece was really very good. Interesting too that when I went up and talked about my piece I spouted some elaborate story about a mythological hero, and the best that Josh could say is that ‘he tried to use sounds that sound metallic’, yet the final results are reasonably similar. But hey, if it were the final sound that counted, names like Schoenberg and Boulez might not sound like more than fancy small cars or import wines (and I am going to hell).

im worked too hard to count the music (final)
the final soundtrack for the sd1.mov with plogue FM synthesis and other variations.
the video is available in a previous post.

30.8.06


click on the picture for sound

28.8.06

Stupid Task

click on the picture for sound

27.8.06

So during this week’s improvisation session I began using Live to play with a drum sample. Yes, no plural here as I have only made one so far. One of the more promising sections of our free-for-all jam was when I used the Three-Band EQ to remove everything but the hi-hat from the sample. This both allowed for and seemed to inspire the other improvisers to create appropriate music (remembering our group’s motto: to make music and none of this ambient stuff). However, when I tried using the EQ to solo other parts of the drum beat, such as the kick and snare, the EQ didn’t seem to work as well. This has led me toward the next step in my development of this project- that is, to make individual sound files for the kick, snare, hi-hat and whatever else. These files can be easily collated in a single ‘scene’ within Live, and that way can have their own effects added to them. Having specific effects added to each part particularly appeals to me due to one main complaint my group made of my sound during the jam- inconsistency of rhythm. It makes sense: I hold a key role in the rhythmic grounding of the project. This failed me during points of the jam when I would dump a new effect such as ‘BeatBox’ onto the drum loop and it would immediately ruin the rhythmic grounding which the rest of the group relied upon. However, if I were able to drag such an effect onto just the hi-hat for example, the total annihilation of this sample would not have such a disastrous effect as completely ruining the whole drum sample would. Hence, the rest of the group would not have to stop while I quickly switch off/adjust the parameters of the plug-in.

Other than this, I am slowly learning which plug-ins sound cool, provide something interesting to improvise with, have bugs, and which individual parameters are best to map to the X/Y pane that Live provides. As the weeks progress I intend to narrow in on the particular rhythmic grounding I will offer the rest of the group, who I think will have more apparent avenues for improvisation than me, through the use of melody and so on.

Live itself is far from the ideal piece of software, but it is nonetheless really intuitive, easy to use and fun. This use of a vector-driven interface is more than just the ‘gimmick’ it seemed to be when I first looked at the application months ago. Rather, it provides a clever flow to the interface, in a way I find difficult to explain. However, take the use of the F11 shortcut, which allows the application to use the full width of your display or monitor, and the way the program resizes in an instant, which I think must owe something to the use of vector graphics. The way the whole interface flows around and is easily customisable through minimal clicks is important for its onstage applications. Having now used the program in a ‘jamming’ situation, I really commend Ableton on the way they have made this program simple and smooth. Never in this ‘tense’ situation did I feel at all lost or confused in an interface that, in all honesty, I had only used about five times. And that, my friends, is Live.
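
As a side note (my own doodle, nothing to do with Live itself): the reason the EQ trick half-works is that the kick, snare and hats sit in roughly separate frequency bands, so a crude three-way filter split already gets you most of the way there. A rough Python sketch of that idea, assuming numpy and scipy are installed and a hypothetical mono file called drum_loop.wav:

    import numpy as np
    from scipy.io import wavfile
    from scipy.signal import butter, sosfilt

    rate, loop = wavfile.read("drum_loop.wav")   # hypothetical mono drum loop
    loop = loop.astype(np.float64)

    def band(signal, sr, low=None, high=None):
        # Crude 4th-order Butterworth split: lowpass, bandpass or highpass.
        nyq = sr / 2.0
        if low and high:
            sos = butter(4, [low / nyq, high / nyq], btype="band", output="sos")
        elif high:
            sos = butter(4, high / nyq, btype="low", output="sos")
        else:
            sos = butter(4, low / nyq, btype="high", output="sos")
        return sosfilt(sos, signal)

    kick_ish  = band(loop, rate, high=150)            # low band: mostly kick
    snare_ish = band(loop, rate, low=150, high=2500)  # mid band: mostly snare
    hats_ish  = band(loop, rate, low=2500)            # high band: mostly hats

Of course the real fix, as I said above, is just to bounce separate kick/snare/hat files in the first place and skip the filtering entirely.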

Apart from the fact that during my presentation the word ‘um’ seemed to have manifested throughout my vocabulary beyond all sensibility, I think it went okay. Ben commented that at the end of every sentence the pitch of my voice went up a little, or something, and I certainly appreciate any such criticisms because I understand how important presentations like this are in the metropolis of Academia. My great respect for all the music technology students and staff, along with my dream to create great music, made me quite a nervous little camper during my presentation of ‘new surroundings’. Anyway, from the feedback I have received, people did not pick up on this too much, but if you feel otherwise then please let me know via the comment button below.

The other presenters included William, Ben and Timothy Gabbusch. As with everything he is involved with, I could not help but laugh when it came to Ben’s speech. His presentation style was confident, light and humorous, while his piece ‘Vocalikov’ was really very good. William’s piece was equally very cool, with a particularly rhythmic ‘glitchy’, ‘buzzy’ sound that caught my attention and is stuck in my mind. Timothy’s piece contained a lot of awesome sounds and was quite good, but lacked a feeling of form and structure. I feel this probably owes to his use of analogue techniques in its composition.

References

Whittington, Stephen. "Music Technology Workshop." Lecture presented at the Electronic Music Unit, University of Adelaide, South Australia, 24/08/2006.

21.8.06

G’day fellow soldiers in the eternal struggle of the Blog.

My group has continued making baby steps towards the improvisation project that we will perform one day not so soon. This week I brought in my laptop and played a few drum loops I composed in Sibelius and rendered in Reason 3. Both Dave and Ben seemed to take a liking to them; might I quote Dave who remarked, “shall we go with that for the rhythmic basis” (or something similar). This was fairly encouraging, especially because I know the drum loops would be heaps better if I had spent more than twenty seconds on them. Apart from this part of the session, we continued listening to people moan about the technological disabilities of studio 5 and to Dave’s (awesome) guitar playing. Adrian’s computer bit the dust so we may have to wait some time before hearing anything from him and SuperCollider. Dragos was away, so I am still pretty uncertain about what he is doing. Ben seems pretty enthusiastic about his vocal/effects contribution, and the other day I was listening to “Intergalactic” by the Beastie Boys and thought of how this may offer him some inspiration (‘another dimension, another dimension, new galaxy’). We also welcomed a new member to the group, Albert. I was less than excited by the thought of adding yet another instrument to the melange of sound we have at the moment, but going by his usual attendance, I do not think I have too much to worry about (needless to say, I will be glad to be proved wrong here).

Next up, a few more students presented their work in the new Tech (“now like Jazz and Classical students”) Forum. This commenced with Henry Reid, who played ‘Lucky’, a song written about his grandfather’s life. After the initial excitement of how this form of songwriting has recently produced favourites of my own like Tool’s “10 000 days part 2 (Wings for Marie)”, I was a little bit disappointed with the final product. This dissatisfaction mainly stemmed from one instrument which played throughout in what I can only call a lame synth tone, because the rest of it was pretty good I think. Daniel played a recording by his band ‘Enemy Of?’ and I have to say it was pretty good. I intend to record a heavy metal band later this year, and I am pretty intimidated by their two-guitar, bass, drums, vocal line-up. Daniel mentioned how he used a program other than ProTools to record the drums. After this I was left thinking- there are other programs than ProTools: WHAT THE HELL?!?! Anyway, one day if I am feeling social enough, I would like to talk to him about this program and what his thoughts are on recording heavy music. One thing I really want to learn to do is make good drum recordings. More than anything else. Getting back to Daniel’s piece, called “When you say the heaven you mean the gates of hell”, I was overall quite impressed with the song, but the recording did not have the edge (not surprisingly) of a professional recording. But why? I feel like we have all the equipment there to do it. Maybe I will figure it out one day if I stop playing video games…

15.8.06

It is a little disappointing that we will no longer have any of David Harris’s tripped-out offerings to look forward to on a Thursday afternoon. This class will be replaced with the improvisation project. My personal experience with group work at uni thus far has led me to try to avoid it where possible. Nevertheless, going by last week’s workshops, things may well be different- the Music Tech’ers seem an enthusiastic bunch. My randomly selected group contains four members from my year: David, Ben, Matt and myself. The other members are Dragos Nastasie and Adrian Reid, who are second and third years respectively. So far not much in the way of themes has surfaced, while many ideas have been thrown around. Ambience has been affronted quite thoroughly; I was hoping this might turn into some kind of right-wing new-music crusade, but so far we have only decided that chords will be used. But what chords?

I must commend the rest of my group for their effort in bringing instruments along for last week’s session. It sure inspired me to want to be ready with something for this Thursday. For this, I am thinking of composing a few drum loops in Sibelius, making them sound cool in Reason and finally improvising with them using Ableton’s Live. This program is the muse of Creative Computing at the status quo. Another option I was considering was to learn some SuperCollider. Adrian mentioned that he is thinking of doing something with it, so it might be a good opportunity to get into it. I know the JITLib interface is tailored for live applications. I only know about || much SuperCollider though, so I might just be barking up the wrong tree. I will talk to Adrian about it.

My experience with ‘playing music’ is far longer than my experience with improvisation. This certainly is not something I boast proudly, but as they say, sometimes that’s the way the cookie crumbles. I have to say that I will be glad to discover myself as an improviser.

10.8.06

creative computing - soundtrack for sd1.mov

"im worked too hard to count the music so i only dance to the video anymore"
translated by weimer in under an hour.

31.7.06

Cubase Video Setup

In creative computing we have finally got our heads out of the ProTools gutter and found that other software exists. Namely, this was Steinberg’s highly documented sequencer Cubase. Interestingly, we focused on using Cubase to manage video. Since the ACMC I have been quite curious about new media, particularly video, but unfortunately these are things that cannot be used towards assessments. However, I do not think I should let this hold me back, and I am hopefully going to use my knowledge of Macromedia’s Flash to make a video for my next Musique Concrete project piece. I have also decided that it’s not going to be as strictly Musique Concrete sounding either. We are not cutting up tape anymore, so why pretend?

Anywho, here are a couple of snapshots of my experience with importing video into Cubase and getting a few markers laid out. I found Cubase very easy to use, much easier than learning ProTools. Whether this is because it is similar to ProTools, or due to a better interface: who knows?


Using Cubase: Less painful than you may think.


Time Markers: Mark out your project files.


I got the ReWire bus running pretty easily; Christian kindly showed me how to host and lock Reason with ProTools last semester. For pictures, see one of my fellow first-year colleagues' blogs.

The readings we were provided in Audio Arts this week were pretty interesting. It was quite intriguing to hear that Christian has worked as a graphic designer, as this is not only something that interests me, but also what my brother does for a living. The reason I posted this picture here is that as I was reading it, the way it describes sorting through a complex problem reminded me a lot of fourth-generation (event-driven) programming languages like Visual Basic. IF THEN ELSE statements and so on. Interesting too that it refers to breaking things down into a set of smaller functions, and that this is exactly what code does. Or maybe it's boring and I'm just a loser.

I’m having a fair bit of trouble tearing myself away from Blizzard’s World of Warcraft at the moment. I feel like a 6-year-old and it’s definitely detracting from my studies. But then again, I feel like a 6-year-old!


It’s 8pm and DJ Shadow just started playing at Thebby… and I’m here updating my blog :_(

Bibliography

Haines, Christian. “Cubase Video Setup.” Class presented at the University of Adelaide, Schulz Building, EMU, level 4 (Audio Lab), 27/07/2006.

Martin Armiger stopped by the EMU last Thursday to give us all insight into his wonderful life as a film composer. It was great to see him talking about what inspires him in what he does, even if I didn’t personally get excited by the film “The Sea Hawk” and whatever the other one was. His ability to speak to the audience was quite notable; the audience was indeed larger than normal for a Thursday afternoon at the EMU, with new faces showing up for Armiger’s presentation. I would perhaps have preferred if he had talked a little more about work that he has done; I only realised this when someone asked me what films Armiger has worked on- I could only respond with ‘uh, dunno!’ Nevertheless, seeing Armiger get all animated during ‘The Sea Hawk’ scene with strings playing arpeggios representing wind, and similarly with the torture scene from ‘Reservoir Dogs’, was good value. Of all the lines of work that music technology students can consider, he certainly did not turn me away from film music, which was something presenters had been getting good at last semester. I thought during his speech he might touch on multichannel sound, as this is available to the film (& DVD) music world. In fact, when he commenced his speech with the ten-dollar question- “What’s wrong with film music?”- I thought he might go on to say something like ‘most film music is produced for stereo while film composers can use 5.1’… or something. Oh well. Seems nobody really cares about multichannel except computer-nerd music composers and sound installation artists. Maybe. I really do not know much about the stuff, but it just seems like no one gets as excited by more speakers as I do.

Mr. Whittington gave us EMUers a few pointers in scoring, analysing and generally commentating on our music, during the hour David Harris normally fills with Pink Floyd. I am glad someone talked a bit about these things, because they are a part of all music technology major assignments and something I felt pretty unsure about and ended up not caring about last semester. His analogy of the cut paper to a musical score was very understandable, and I am starting to see that he is good at what he does and why he is the head of my faculty.

Bibliography
Armiger, Martin. "What's wrong with film music?" Presentation made at the University of Adelaide, Schulz Building, EMU (level 5), EMU Space, 28/07/2006.

Whittington, Stephen. "Commentating Music." Presentation made at the University of Adelaide, Schulz Building, EMU (level 5), EMU Space, 28/07/2006.

30.7.06

Audio Arts: Sound Design
An analysis of sd1.mov:

12.6.06

Here I am updating again, which begs the question-
Just where the hell did the past seven days go?

Well, I spent a bit of time making the blog look pretty for the jyotisha project. Today I spent a few hours getting a Flash movie (.swf) to play within the header. If you think HTML is an idiolect then try using CSS. If anyone reading this is interested in getting involved or has any ideas for jyotisha, then send me an email and I'll make you a part of the community blog so you can add your flavour!

In audio arts, Christian drew pictures detailing how to pan on a width and height plane. Quite interesting stuff indeed; cannot say I have ever thought about sound in that way before. Christian’s certainly trying to cram one thing into our heads: forget about your other futile senses, open your f_cking ears. More and more nowadays, I find myself listening; standing at the bus stop listening to the sound cars make as they cut the air on the way toward me. The pitch a car makes is higher as it approaches because the sound waves are squashed closer together, then sinks to a lower sound once the car has passed, since the waves are spread further apart. Sound is an interesting being.
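
For my own notes (this bit is me, not Christian): that squashing and stretching is the Doppler effect, and the textbook formula for it is

    f_{\text{heard}} = f_{\text{car}} \cdot \frac{v_{\text{sound}}}{v_{\text{sound}} \mp v_{\text{car}}}

with the minus sign while the car approaches (higher pitch) and the plus sign once it has passed (lower pitch). A car doing 20 m/s, for example, shifts a 1000 Hz whine up to about 1000 * 343/(343 - 20) ≈ 1062 Hz on the way in, and down to about 945 Hz on the way out.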

The “3D Mixing” article by Paul White has had quite an effect on the status quo of the ‘rainforest’ recording. I used his idea of using a lo-pass filter to move a sound to the background, and panned some of these narrowly off centre. I think his point on using contrast in a mix is quite a valid one: “the brain doesn’t deal in absolutes, but instead prefers to compare one thing directly with another… for something to sound upfront in a mix, something else needs to sound further away” (Paul White, page 3).

Creative Computing’s latest chapter included discussion of the ‘gate’ effect plug-in in ProTools. This allows you to trigger sounds on one track from another track’s amplitude waveform. I had previously heard this style of plug-in in music I had listened to, and wondered what was happening. Namely, this was on Radiohead’s “Lucky” from “OK Computer.” Guess now I know; I might try using beach waves as the amplitude source for something else.
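
To get the idea straight in my own head (just my sketch, not the actual ProTools plug-in): the key track's short-term level opens and closes a gate on the target track. A rough Python version, assuming numpy and two hypothetical mono arrays called waves and synth at the same sample rate:

    import numpy as np

    def sidechain_gate(target, key, sr=44100, window_s=0.01, threshold=0.05):
        # Mute 'target' except where the short-term RMS of 'key' exceeds the threshold.
        window = max(1, int(sr * window_s))
        padded = np.concatenate([np.zeros(window - 1), key.astype(np.float64) ** 2])
        rms = np.sqrt(np.convolve(padded, np.ones(window) / window, mode="valid"))
        gate = (rms > threshold).astype(np.float64)
        n = min(len(target), len(gate))
        return target[:n] * gate[:n]

    # e.g. gated = sidechain_gate(synth, waves)  # the beach waves 'play' the synth

A real gate would also ramp the gain up and down rather than switching it, otherwise you get clicks, but the principle is the same.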

David presented a few more of Floyd’s ‘tripped out’ offerings in the latest forum. My pick of the week was Steve Reich’s Electric Counterpoint, mvt. 1- Fast (1987). I noticed there was a bit of call and response happening in the panning of the work; this has certainly given me a couple of ideas to try out. The use of tension and release was another entertaining facet of the work; the feeling of it was really quite tangible and enjoyable.

Lastly, the day of salvation came on Saturday when I got a MacBook Pro. Anyway, time to unwind for a while, so I will try not to update for the next few weeks during the mid-semester break. Until next time, peace out home boys and girls.

References

Haines, Christian. "Audio Arts – Mixdown Basics." Practical Class presented in the Audio Lab, 4th floor, Schulz Building, University of Adelaide, 6th June, 2006.

Haines, Christian. "Creative Computing – Tape Techniques pt 2." Lecture presented in the Audio Lab, 4th floor, Schulz Building, University of Adelaide, 8th June, 2006.

Harris, David. "Music Technology Listening Workshop – Steve Reich, Tristram Carey and Pink Floyd." Lecture presented at the Electronic Music Unit, EMU space, University of Adelaide, 8th June, 2006.

White, Paul. “3D Mixing.” Sound on Sound, November 1994. SOS Publications.

4.6.06

In creative computing this week we examined the application of musique concrete type techniques in digital audio editors. One of the techniques that I found particularly fascinating was the ‘palindrome.’ Although this is something I’ve definitely used before, sometimes you need to hear someone (Christian) talk about something, and give it a name, before you consciously start applying it. As such, the introduction of my creative computing project now contains plenty of palindromes and gives me the exact sonic effect of mechanical workings that I was after. I also found the discussion of ‘white noise’ quite intriguing, giving the sound designer the opportunity to sculpt something out of an extremely thick block of PCM data. A snare drum, made through the use of a fade-out, is one of the more obvious of such possibilities.
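
Just to convince myself the snare trick is as simple as it sounds, here is a quick Python doodle of it (my own, not from the class), assuming numpy and scipy are installed; it also tacks on the 'palindrome' for good measure:

    import numpy as np
    from scipy.io import wavfile

    sr = 44100
    length = int(0.25 * sr)                            # a quarter-second burst
    noise = np.random.uniform(-1.0, 1.0, length)       # white noise: every sample random
    envelope = np.exp(-np.linspace(0.0, 8.0, length))  # fast exponential fade-out
    snare = noise * envelope                           # the fade-out sculpts the noise into a snare-ish hit
    palindrome = np.concatenate([snare, snare[::-1]])  # the 'palindrome': forwards then reversed
    wavfile.write("noise_snare.wav", sr, (palindrome * 32767).astype(np.int16))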

I’ve long anticipated the audio arts class in which we record the drum kit. After accessing the recordings of the drums after the class, I found it a little difficult to differentiate between which positions of the ambience-dedicated Neumann U87 were which. Of the different positions which Christian sampled, there is a variation in the amount of bass and mid/treble which the microphone manages to pick up, but apart from that observation I am still pretty clueless about the apparent "spots" in the EMU space. Can you tell me where the EMU-spot is?

Personally, I did not have the best week at workshop. First off we had “Wrath of Angry Gools” by Gutbucket. With many abstract timings, it reminded me of a lame version of Tool or The Mars Volta, whose timings seem to fit their melodies rather than just being weird timings for the sake of being weird timings. Following this piece we heard Bach’s “Ricercar” from the Musical Offering, which was clearly a masterpiece. I often wonder how much more profoundly this music would fall on my ears if I were hearing it in the 18th century and was not so used to such textural and timbral variety in music. It’s this aspect that makes the extremely complicated music a little dull. He’s still a genius though - sorry Bach if you're reading this.

Further into the forum, second-year student Vinny Bhagat consumed the class with an emotional performance. In a bit of a Sigur Ros or perhaps Mum vein, he soloed on the piano while electric drones wailed from his laptop running Reason. It was a fantastic performance; congratulations to him.

Later, Patrick spoke about a concert being held in an observatory later this year – asking if anyone was interested in taking part. I was quick to let him know that I share his interest in space, and would like to perform at a concert where everything is pitch black. Sounds freegin’ cool! A friend of mine, who plays the bass, has taken interest also, so I’m thinking I will compose something for bass, keyboard and umm… laptop.

For all the LSDj addicts, here's the X-Files fugue.

Well, it’s come to that time of the uni year: the first semester is over. It has gone so damn fast, I’m starting to get envious of the C4 students: I can feel my time running out already. It’s been great (and somewhat relieving) meeting and getting to know people who share my passion for music making. You, and the mass of work we do, make every day exciting.

References

Haines, Christian. "Audio Arts – Recording Drum Kits." Practical Class presented in the Electronic Music Unit, Studio 1 and EMU Space, 5th floor, Schulz Building, University of Adelaide, 30th May (2006).

Haines, Christian. "Creative Computing – Music Concrete Type Techniques" Lecture presented at the Audio Lab, 4th floor, Schulz Building, University of Adelaide, 1st June (2006).

Harris, David. "Music Technology Workshop – Gutbucket, Bach, Toby Twining, Arnold Dreyblatt, Yoshihide and Stravinsky." Workshop class in the Electronic Music Unit, EMU space, 5th Floor, Schulz Building, University of Adelaide, 1st June (2006).

28.5.06

In this week’s audio arts class we were given the opportunity to record a classical singer- Jodie Mills. My group used Studio One and the EMU space, as well as a Neumann U87. Over the mic we placed a ‘popper-stopper’, which reduces the amount of sacrilegious ‘pops’, ‘pahs’ and otherwise plosive breath sounds. Walter Cronkite (left) agrees.
I had a chat to Peter about popper-stoppers - he mentioned that you can make one out of old (and new) stockings. Seems like a great way to save a few hundred dollars when you’re setting up your own studio.

The recording itself turned out reasonably well- below is an example of both an uncompressed recording, and one with slight compression of 4:1 (the default of ProTools’ BombFactory plug-in). I was also keen to try recording with the Neumann U87 configured to an omni-directional polar pattern, as well as cardioid as in the recordings below. Unfortunately, my group members did not share that enthusiasm and/or time did not really permit. Jodie Mills (opera singer):
uncompressed.mp3
compressed.mp3

She was a little bit more ‘warmed up’ for the compressed take; however, the technological/recording aspect is better in the uncompressed take. Moral: opera singers do not need any effects. Except in this slightly NIN, slightly Mills, very oscillation-based song I call the city of Zion?
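
For my own reference (my working, not something from the class), a 4:1 ratio just means that anything poking above the threshold T gets squeezed to a quarter of its overshoot:

    L_{\text{out}} = T + \frac{L_{\text{in}} - T}{4} \quad \text{for } L_{\text{in}} > T

so a peak that hits 8 dB over the threshold comes out only 2 dB over it, which is why the compressed take sounds that little bit tamer.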

The visibly ill Christian made it to Creative Computing to continue our look at ProTools. He examined the seemingly endless options for bussing and general signal-flow possibilities. It seems that with all the audio software I have been delving into of late, the signal flow and routing functions are absolutely fundamental to have a handle on. This includes applications like Plogue, Reason, Cubase and ProTools. I suppose this aspect of the software is derived from the reality of routing for real-life, old-school synthesizers, and now the studio in general. I anticipate that the future will make this routing feature of audio software more of an automated function. Why? People want to make music, not plot signal paths. I’m finding it a little tricky to work out Reason’s interface.

In the latest instalment of Forum, it was great to find out a little more about the man behind the legend- Stephen Whittington. I particularly enjoyed the part of forum in which he outlined the attitudes behind the song “X Is Dead”. Put simply, I think he was taking a swipe at Pierre Boulez and his essay “Schoenberg Is Dead.” I guess there’s an anarchist in all of us.


The Tyndall Assembly concert held last Thursday was a really enjoyable local show. It began with Clapping Music, a 1972 work by Steve Reich, which was a pretty funky way to kick off the evening; however, I much prefer this piece when it’s performed by more like twenty people. Next mcgherkin played his piece ‘Saladdin’ on a couple of pseudo-bongos, which were miked to speakers via some delay effects. Personally, I think it would have sounded much better with the delay pumped higher. I think this would have bridged the live bongo sounds with the electronic sounds (from his PowerBook) a lot more seamlessly. Then again, I have always been a hound dog for delay. It was an epic performance, with Patrick clearly in a meditative state; from time to time he would turn to gain some inspiration from the visual slide show and video (filmed/edited by Nick Rusk). The slide show contained a fair bit of propaganda, something I have always been interested in; take a look at this WWII anti-Nazi piece (below).

For those of you who have been playing along at home with the rainforest recording project, this is the latest effort, with the percussion part recorded and with better overall mixing.
Configurations:
Neumann U87s in mid-side for piano/piano effects.
Rode NT4 (X/Y configuration) for cymbals.
Yamaha MX-“nugget”-204 for hard percussion (woodblocks, cabassa, shaker, tambourine, vibraslap)


Bibliography

Bitoun, Chris. “Rainforest” composed in 2006. Recorded by Jake Morris, courtesy of EMU.

Haines, Christian. "Audio Arts – Opera Singing Recording". Practical Class presented in the Electronic Music Unit, EMU space, 5th floor (Schulz Building), University of Adelaide (30/05/2006).

Haines, Christian. "Creative Computing - ProTools". Practical Class presented in the Electronic Music Unit, EMU space, 5th floor (Schulz Building), University of Adelaide (01/06/2006).

21.5.06

Not a real lot to report on this week. Friday I spent in the studio with composition student Chris Bitoun. I say it every week, but each time I go into the studio I end up with improved results- something which is making producing an increasingly enjoyable experience. For the first time this week, we tried using over-dubbing to avoid bleed between the microphones. This was an idea Christian gave me during audio arts class, in which we mainly focused on vocal recording. Using over-dubbing techniques gave me a much greater feeling of control over the recording/production of each instrument in Rainforest. Next Friday, we are going back into the studio to add violin and some percussion to the track, which at the moment only has two piano parts, as well as manual piano effects, recorded. Chris and I are spending a fair bit of time with this song, firstly because it’s a composition which he needs to hand up with an audio recording, and secondly because I’ve decided to use it for my major audio arts project. However, above the bureaucracy, it’s a sparkling composition which deserves the attention it gets, and might I add that it’s quite an honour to work with Chris. Now that we are getting to know each other much better, we have started to work much more effectively in the studio… which makes me think… These local bands who get their demos recorded and made in two or three days- the relationship between producer and band surely is not strong enough to make decent music together. Luckily I’m not stuck in this situation – I can basically spend as long in EMU as I want. Anyway, here’s Rainforest (listen to the cool rain effects):



Christian was unable to make it to creative computing on Thursday, which was the first time such a thing had happened to me whilst at uni. I’ve actually often wondered what happens in this situation. Unlike high school, it’s not an opportunity to take the piss out of some sorry relief teacher. Ben, I’m sure, can relate to this; lately we’ve been enjoying a bit of "undergraduate humour" (Haines, Christian).

In workshop we again listened to a range of musicians including Mr. Bungle, My Bloody Valentine and Stockhausen. Stockhausen’s work ‘Hymnen’ (1966/67) primarily contained the instrumentation of a short-wave radio. The result was bizarre and fantastic. The idea of a sporadically tuned radio representing national anthems (the meaning of the word ‘Hymnen’) seems quite appropriate, especially for a time when electronic music technologies, like the radio, were still a reasonably new idea. The work certainly has given me a bit of inspiration for my creative computing project.

Coming Thursday (the 27th) is mcgherkin’s performance at The Tyndall Assembly. Follow the tyndall link for more information about the show and hopefully I’ll see you there!

Bibliography

Bitoun, Chris. “Rainforest” composed in 2006. Recorded by Jake Morris, courtesy of EMU.

Haines, Christian. "Audio Arts – Vocal Recording". Practical Class presented in the Electronic Music Unit, EMU space, 5th floor (Schulz Building), University of Adelaide (18/05/2006).

Harris, David. "Music Technology Workshop – Mr Bungle, Stockhausen, My Bloody Valentine". Workshop class presented in the Electronic Music Unit, EMU space, 5th floor (Schulz Building), University of Adelaide (18/05/2006).

15.5.06

In the continuing study of recording a variety of instruments, this week in audio arts we got around to the electric guitar. I think Christian tends to favour dynamic microphones for this, so when I headed into the studio to record the electric for myself, I used a Shure SM58 and a Beta 57. Each of these mics I tested on/off axis, and I also placed a Neumann U-89i around 80cm back from the speaker. I did this because I wanted to know how condenser mics with larger diaphragms handle the amplifier, and also to explore the benefits and disadvantages of recording general room ambience with the electric. Click on the following to have a listen.

SM 58 on axis:


Beta 57 on axis:


I increased the gain slightly (on the mixer) for the Beta 57 'on-axis' take, so I apologise: this is not an entirely consistent experiment with the mics. In the Beta 57 take there is clearly a richer transient response, while there is more low-frequency power in the SM58 take. It's hard to say whether that has anything to do with the Beta 57's frequency range extending about 1000 Hz higher, whether it has something to do with the change in gain levels, or maybe some religion that no one knows about, but I think that overall the Beta 57 did a better job (on axis) for this style of whiney, monophonic, clean guitar line. When I soloed the U-89i in ProTools, I expected perhaps a little more reverb, but this wasn't particularly the case (being in the dead room), and really the sound was not as dissimilar from the SM58 and Beta 57 as I had expected. Something I need to look into now is recording with distortion/overdrive, so any comments on this would be appreciated.
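On the gain issue: one thing I could do after the fact is level-match the two takes by RMS before comparing them, so the gain change is at least taken out of the equation. This is just a sketch with placeholder arrays standing in for the real takes (none of these names come from the actual session):

    import numpy as np

    def rms_db(signal):
        # RMS level relative to full scale (1.0), expressed in dB
        return 20 * np.log10(np.sqrt(np.mean(np.square(signal))))

    # Placeholder signals standing in for the two takes
    sm58_take = np.random.uniform(-0.3, 0.3, 44100)
    beta57_take = np.random.uniform(-0.4, 0.4, 44100)

    offset_db = rms_db(beta57_take) - rms_db(sm58_take)
    scale = 10 ** (-offset_db / 20)
    print("Beta 57 take is %.1f dB hotter; multiply it by %.3f to match." % (offset_db, scale))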

Later in the week I continued recording my 'studies in composition' group, using a few new ideas. For one, I used both of the sound screens in EMU, placed between the piano and the string section, to try to reduce bleed between the mics. For the string section itself, I used a Neumann KM-84i each for the cello and violin, and a Neumann U-89i for the string section in general, which picked up a little more ambience than the smaller KM-84s. I used the Yamaha MZ-204, which I've nicknamed "nugget", to record woodblocks, and finally I recorded a suspended cymbal and tambourine with Rode NT5s. Also, with the help of a valiant and determined Luke, I got the 'talkback' system to work. Ultimately, this all gave me a much better recording quality than last week's feeble attempts, and I feel like I'm ready to do the Audio Arts assignment next Friday. Here is Zaen Lee's Reef:


In our look at ProTools in Creative Computing we continued to review the main functions of the program, uncovering various shortcuts along the way. Christian showed us the function of each cursor tool, with the time compression/expansion trim being one of the more interesting of these. Another was the 'random draw' pencil tool, which apparently comes into its own for drawing automation that keeps plug-in effects evolving. In the brief chance I had to make a NIN remix, I experimented with most of these tools:


The airing of 'Shine On You Crazy Diamond' in workshop was a sheer delight; it appears on what I feel is one of Pink Floyd's better albums, 'Wish You Were Here'. I heard songs like this for the first time only a few years ago, and I remember being shocked when I realised how profoundly they, and Pink Floyd in general, have influenced music. The use of the synthesizer is extremely tasteful compared with most of Pink Floyd's contemporaries, and is so reminiscent of bands like Air or Ladytron today. The lead guitar lines in 'Shine On You Crazy Diamond' have been at the heart of rock ever since; just think of Tool. The chordal harmonies of the rhythm guitar, as well as the simple melodies, in the album's title ballad 'Wish You Were Here' remind me so much of Radiohead's 'The Bends' and 'OK Computer' era, as well as their followers like Coldplay. If The Beatles can be considered the pioneers of chordal structure as we know and love it, then I think Pink Floyd are certainly their equivalent for musical texture.

Bibliography

Lee, Zaen. “Reef”. Composed in 2006. Recorded by Jake Morris, courtesy of EMU.

Haines, Christian. "Audio Arts – Recording Electric Guitar." Practical Class presented in the Electronic Music Unit, Studio 2 and Dead Room, 5th floor, Schulz Building, University of Adelaide, 9th May (2006).

Haines, Christian. "Creative Computing – Digidesign's ProTools." Lecture presented at the Audio Lab, 4th floor, Schulz Building, University of Adelaide, 11th May (2006).

Harris, David. "Music Technology Workshop – Pink Floyd." Workshop class in the Electronic Music Unit, EMU space, 5th Floor, Schulz Building, University of Adelaide, 11th May (2006).

7.5.06

We've started examining microphones and their application in this week's Audio Arts class, so I finally don't have to feel like I'm completely guessing when it comes to recording. In this class we considered the difference in the way higher and lower frequency sounds propagate and how this affects microphone positioning. I considered this idea last Friday when I recorded my group for the Studies in Composition class. During this recording I picked up a fair bit of new stuff about recording in general; for instance, dynamic microphones tend to pick up all the sounds in the room, particularly the Shure SM58s. Also, it wears you out. I was fortunate in that Tyrell happened to be passing through the EMU during this recording session and he stopped to have a chat. He explained in some detail the Mid-Side technique, which here uses a pair of Neumann U87 mics to record a grand piano. This is something I'm eager to try tomorrow afternoon, when I continue recording this group. Here's a 'work in progress' recording of composition student Chris Bitoun's Rainforest, and my composition Outback (a recording ruined by the stomping of Aboriginal tribal dancers on Schulz level 6):

*note: the levels are really low in this rainforest take, so you might wanna turn your speakers up.
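While it's fresh in my head, here's my understanding of how the Mid-Side pair gets folded back into stereo: the forward-facing mic is the 'mid', the sideways figure-8 is the 'side', and left/right come out of their sum and difference. Just a sketch with placeholder arrays, not the actual session files:

    import numpy as np

    def ms_decode(mid, side, width=1.0):
        # Sum and difference recover left/right; 'width' scales how much side signal is used
        left = mid + width * side
        right = mid - width * side
        return left, right

    # Placeholder signals standing in for the two U87 recordings
    mid = np.random.uniform(-0.5, 0.5, 44100)
    side = np.random.uniform(-0.2, 0.2, 44100)
    left, right = ms_decode(mid, side, width=0.8)

The nice part, as Tyrell described it, is that you can decide how wide the piano sounds after the fact just by changing how much side signal you fold in.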


On Creative Computing's side of the coin, we continued our look at ProTools. I learnt plenty of new things in this class, all of them fundamentals of the software, like:

  • Automation modes ('write', 'latch', etc.) for panning, volume, muting, effects and other plug-in parameters.
  • Use of the 'option' key to audition an audio file, to drag and duplicate an audio region, and to apply an edit to all tracks at once (i.e. changing track size or switching between waveform/volume view).
  • Creating a new 'playlist' on a track. Basically it lays a new take on top of the old one, switching the old one off, and you can then toggle between them (even on other tracks).

In the brief chance I got to test out all of these ideas, I created this mix of the Nine Inch Nails track (click on the logo). I used SoundHack, Peak and ProTools for this, but I would like to add that I was really only focusing on testing out what I've learnt in class, rather than on the sonic result:

In the latest forum the music technology students were given a very interesting insight into the work of two honours students, Darrin Curtis and Jasmine Ward. I find the work of both these students fascinating and somewhat baffling. My criticism of their work, however, concerns its real-world application. I really fail to understand how the work of a music technology student is meant to have an effect on pollution problems, nor do I understand how it relates to complicated biological models. I know I don't know anywhere near enough about 'what these people do' to make any of these judgements. But I wouldn't have minded seeing at least one honours student mention that, in fact, they would also like to 'make great music.'

Bibliography

Bitoun, Chris. "Rainforest." Composed in 2006.

Morris, Jake. "Outback." Composed/Arranged in 2006.

Haines, Christian. "Audio Arts - Acoustic Guitar Recording." Practical Class presented in the EMU (5th floor of the Schulz Building), University of Adelaide, 02 May (2006).

Haines, Christian. "Creative Computing - Digidesign's ProTools." Practical Class presented in the Audio Lab (Schulz 4.07), University of Adelaide, 04 April (2006).

Curtis, Darrin. "Biological applications in Music Technology." Presentation presented in the Audio Lab (Schulz 4.07), University of Adelaide, 04 April (2006).

Ward, Jasmine. "Music Technology Vs. Pollution." Presentation presented in the Audio Lab (Schulz 4.07), University of Adelaide, 04 April (2006).

28.4.06

Yesterday's Creative Computing was my first comprehensive examination of Digidesign's ProTools. It's designed for sequencing audio files, doing fade-ins/outs and other editing, and these files can be brought in from all the other sound-editing software I've been using. We discussed interface design and its direct relation to creative output; the future of software design will surely allow for increasing customisation of the interface, letting the user adapt it to his/her creative goals. To my amusement, the ProTools session we are using to play around with is Nine Inch Nails' 'Only', from their latest album With Teeth. Pictured here is their mainstay and musical genius, Trent Reznor (looking very emo):

With ProTools there's a lot to get your head around; hell, it took me half an hour to figure out that I need to plug my headphones into the M-Box, not the G5, if I want to hear anything. I spent most of that time reading the ProTools Reference Guide, so my time wasn't totally wasted. Other things I figured out in ProTools include using manual tempo, the function of the 'slip' and other edit modes (toggled with keys F1 - 4), and showing/hiding the various rulers above tracks. I'm sure I learnt plenty more too, but alas, it's all a bit subliminal within the 21st Century brain.

I've been toying around with Spear lately, seeing what happens when you do what; basically, I'm finding it often leaves you in an underwater sound world. Take the following two examples:

The truly digitalised glissando:


So basically I have used the lasso tool to slice the audio diagonally downward. The possible sonic results of such an edit really excited me when I first delved into Spear. I mean, think about where each of these overtones begins playing, in time, and how rhythmically complicated that would look in notation. In the last small section, I used the 'time region selection' tool to select the last part and transpose it down a considerable amount. Click on this image to hear the way the sound 'splashes' down (gliss.) then is dunked underwater (transposed). Next:

Transposes up and up, then time stretches:

This one is slightly more complicated and uses different techniques, yet, alas, it again achieves watery sounds. The audio highlighted in red is repeated (twice before and after the highlighted section), each time receiving a frequency transposition of around double the previous. The final section of audio is time stretched, with a little artfully chopped off the bottom. Click, listen to bubbles. Basically, it's slightly disappointing that the sonic output stays in the watery 'sound world' even once you run entirely different processes on the audio.
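To get my head around what these edits actually do to the data, here's a little sketch of the partial-based idea Spear works with, as I understand it. This isn't Spear's code and the partial is made up: each partial is a list of (time, frequency, amplitude) breakpoints, transposition multiplies the frequencies, and time stretching scales the times.

    import numpy as np

    SR = 44100

    def synth_partial(breakpoints, sr=SR):
        # Interpolate the frequency/amplitude envelopes and integrate frequency to get phase
        times = np.array([b[0] for b in breakpoints])
        freqs = np.array([b[1] for b in breakpoints])
        amps = np.array([b[2] for b in breakpoints])
        t = np.arange(0.0, times[-1], 1.0 / sr)
        freq_env = np.interp(t, times, freqs)
        amp_env = np.interp(t, times, amps)
        phase = 2 * np.pi * np.cumsum(freq_env) / sr
        return amp_env * np.sin(phase)

    def transpose(partial, factor):
        # Multiply every breakpoint frequency (2.0 = up an octave)
        return [(t, f * factor, a) for (t, f, a) in partial]

    def time_stretch(partial, factor):
        # Scale every breakpoint time (2.0 = twice as long, pitch unchanged)
        return [(t * factor, f, a) for (t, f, a) in partial]

    # A made-up partial: two seconds gliding from 300 Hz down to 200 Hz
    partial = [(0.0, 300.0, 0.4), (2.0, 200.0, 0.4)]
    original = synth_partial(partial)
    octave_up = synth_partial(transpose(partial, 2.0))
    doubled_length = synth_partial(time_stretch(partial, 2.0))

Spear obviously does this across hundreds of partials at once, which I suspect is where the underwater quality comes from once they stop lining up with each other.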

Workshop presentation was again filled with delightful music; particularly impressive were pieces by Iannis Xenakis and Philip Glass. After reading about and having heard much of Xenakis, I was thrilled with his 'Voyage absolu des Unari vers Andromède' (1989). I'm so glad the speakers were pumped for this; at plenty of points the piece aroused the emotion of fear in me. I don't know why. Apparently he composed it entirely using graphs. Well, nothing scares me more than mathematics. No, no, that can't be it. Philip Glass's 'Rubric' (1982) was equally impressive, with mainly arpeggios in a juxtaposition of sextuplets and quintuplets, all shaping the 'rubric' for me. Such rhythms look like:
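And in plain numbers rather than notation, here's where the attacks of a sextuplet and a quintuplet fall within a single beat, assuming the two subdivisions run against each other (my reading of the figure, not something spelled out in class):

    beat = 1.0
    sextuplet_onsets = [i * beat / 6 for i in range(6)]
    quintuplet_onsets = [i * beat / 5 for i in range(5)]
    print(["%.3f" % t for t in sextuplet_onsets])   # 0.000, 0.167, 0.333, 0.500, 0.667, 0.833
    print(["%.3f" % t for t in quintuplet_onsets])  # 0.000, 0.200, 0.400, 0.600, 0.800

Only the downbeat lines up, which is what gives the figure its churning feel.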


Bibliography

Haines, Christian. "Creative Computing - Digidesign's ProTools." Practical Class presented in the Audio Lab (Schulz 4.07), University of Adelaide, 27 April (2006).

Harris, David. "Music Technology Workshop - Xenakis, Glass and Morris." Workshop presented in the Electronic Music Unit, University of Adelaide, 27 April (2006).

7.4.06

This final week's episode of Audio Arts was spent discussing procedures for sending signal around each of the spaces available on level 5. In this first term we've thoroughly developed an understanding of signal flow/routing in the EMU. I'm hoping that soon we will be combining this with a greater study of microphone recording (with sound sources other than a radio). Currently I've just been referring to the microphone page on the EMU site; however, I'm sure there's plenty more to it. Acoustics is considered 'the science of sound' by Martin Russ in his book Sound Synthesis and Sampling. As such, this branch of physics is commonly associated with a lot of complex formulae. In such books I often find myself knee-deep in perplexing acoustic theory, all of which is worth considering when using microphones. Personally, I'm a little more au fait with the science of computers than with, say, physics, but that's the nature of studying, I guess.
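That said, one of the simpler formulae from all that theory, and one that clearly matters for microphone placement, is wavelength = speed of sound / frequency. A quick calculation, taking 343 m/s as the usual figure for air at around 20 degrees:

    SPEED_OF_SOUND = 343.0  # metres per second in air at roughly 20 degrees C

    for freq in (50, 100, 1000, 10000):
        wavelength = SPEED_OF_SOUND / freq
        print("%5d Hz -> wavelength %.2f m" % (freq, wavelength))

A 50 Hz wave is nearly seven metres long while 10 kHz is about 3.4 cm, which goes some way to explaining why low and high frequencies behave so differently around microphones and rooms.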

In Creative Computing we continued powering through audio software, the latest addition being Tom Erbe's 'SoundHack'. Once again I was impressed with the application, which, alas for my PC, is another OS X patron. All the more reason to get a PowerBook. And just while I'm plugging Macintosh: the pace and power the G5s in Schulz 4.07 offer, computing extremely complex processes on audio in a matter of moments, is truly inspirational. I never thought I'd say it, but now I know what U2's 'Beautiful Day' Bono was onto when he made those Apple commercials.

It's my feeling that Varèse has Hitler, and then the seismic ideological shift of humanity in the post-war years, to thank for the fact that his name appears on much more than his gravestone. Despite this, the man is undoubtedly a dedicated pioneer of music, and his 'audio palette' is actually quite thoughtful, even if it's a little off the wall (certainly not a negative): "I can't give you any structural insights or academic suppositions about how his music works… His music is completely unique." (Frank Zappa, 1967). Varèse's ability to more seamlessly integrate the typical 'sound world' of an orchestral ensemble with other sound worlds is quite astounding for its time. The use of the Ondes Martenot in his Ecuatorial (1934) is one example of this.

Later in the workshop, Harris yet again made me feel archetypal of a newer generation when he played an audibly more modern piece, Wings of Nike (Barry Truax, 1987). I know the main reason I got into this piece more than others Harris has shown us is that it benefited from more advanced 'music technology' (recording, electronic instruments, etc.) than any of the other, older pieces we've analysed in workshop. Made a year prior to my birth, Wings of Nike had an intense atmospheric environment, created by what sounds like a plane engine idling. This basis for the work was met with really only one other sound source: high-frequency 'speckly' resonances which to my mind sounded like a swarm of cyborg mosquitoes. I'm hoping that this was Truax's metaphor for Nike.

Bibliography

Frank Zappa quote taken from:
Fei, James. Stereo Review (page 62). G. Ricordi, 1971.

Russ, Martin. Sound Synthesis and Sampling. Focal Press, 1996.

Apple (2006) ‘Apple - PowerBook - G4’, at: http://www.apple.com/powerbook/ (accessed: 7 April, 2006)

Electronic Music Unit (Adelaide University) (2006) ‘Electronic Music Unit - Microphones’, at: http://www.emu.adelaide.edu.au/resources/guides/hardware/microphones.html (accessed: 7 April, 2006)

Haines, Christian. "Audio Arts - Routing the EMU." Practical Class presented in the EMU (5th floor of the Schulz Building), University of Adelaide, 04 April (2006).

Haines, Christian. "Creative Computing." Practical Class presented in the Audio Lab (Schulz 4.07), University of Adelaide, 06 April (2006).

31.3.06

In Audio Arts we finally stepped into the realm of Studio One. Its capability to produce audio in 5.1 channels is something I'm quite excited to explore, despite Christian mentioning that it's a little-used facility. Once I progress from such a novice level of recording, this will certainly be something I want to investigate. Other advantages of the studio include the C24 mixer, which seems quite extensive and superior to the 01V, particularly for recording multiple instruments. And I realise everyone goes on about it, but the fact that a monkey seemingly works inside it, synchronising the fader levels with Pro Tools all day, is just awesome. This coming Monday I'll find out much more about the studio during the three-hour session I've booked in there.

During Creative Computing we continued to cover visual representations of sound. Moving on from pulse code modulation and Peak LE, we went on to look at spectral sound representation and Spear. It seems an invaluable tool for timbre manipulation, and I can't believe it's only a 1.5 Mb download. It must be running a lot of clever analysis code under the hood, and it seems perfect for making ambient music. As Christian described it, it's something of a 'paint palette' for creativity with sound. Each week Creative Computing makes me want an Apple of my own: not only are some sound applications unavailable on Windows, but file quality and compatibility aren't always the best when working across both OSs. For this reason, I've started looking into perhaps getting an old, high-end G4 laptop. There can be no understating how much better this would be than an external hard-disk drive, but unfortunately it will take me a bit longer to save up for.

Lately I've been reading a lot about 'modernist' composers, such as John Cage, so in Forum class it was pleasing to finally put the name to a noise. The more I learn about and listen to these unique composers, the more I'm getting into it. For example, I noted that the first piece, 'Music for Carillon' (1954), consisted solely of 'cluster chords' played on some kind of chimey organ. (Click on the image at the bottom of this post to hear a 'cluster chord'.) In his book New Directions in Music, David Cope outlines the origins of this technique: "[Around 1911] the cluster chord came into existence. These both represent some of the first uses of traditional noise as an acceptable musical element and, as such, involve a greater philosophical meaning to contemporary music than many have suggested." Perhaps the thing I most enjoy about these modernist composers is not their music (I didn't especially enjoy 'Music for Carillon'), but just how innovative their thinking and methods were. Although I won't be rushing out to hear more of Cage, 'sound-mass' as an idea is somewhat stirring. For the same reason, it's now my feeling that the more sounds you can have an appreciation for, the more compositional tools you have at your disposal.
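For my own curiosity, here's a quick way to hear what a chromatic cluster chord is: every semitone in a range sounded at once. A rough sketch only, and nothing to do with Cage's actual carillon writing:

    import numpy as np

    SR = 44100

    def midi_to_hz(note):
        # Equal temperament, A4 (MIDI 69) = 440 Hz
        return 440.0 * 2 ** ((note - 69) / 12)

    t = np.arange(0, 3.0, 1.0 / SR)
    # Every chromatic note from C4 (MIDI 60) up to C5 (MIDI 72), all at once
    cluster = sum(np.sin(2 * np.pi * midi_to_hz(note) * t) for note in range(60, 73))
    cluster = cluster / np.max(np.abs(cluster))  # normalise so it doesn't clip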

Sound-Mass:

Bibliography

Cope, David. New Directions in Music. Dubuque, Iowa, 1971.

Harris, David . "Music Technology Workshop - John Cage." Workshop presented in the Electronic Music Unit, University of Adelaide, 30 March (2006).

Haines, Christian. "Audio Arts - Studio 1." Practical Class presented in the EMU (5th floor of the Schulz Building), University of Adelaide, 28th March 2006.

Haines, Christian. "Creative Computing - Peak LE." Practical Class presented in the Audio Lab (Schulz 4.07), University of Adelaide, 28th March 2006.

26.3.06

David's latest workshop instalment included pieces by Jack Vees, Ingram Marshall, Michael Gordon and, most intriguingly, himself. To commence he played 'Surf Music II' by Vees, which vividly reminded me of last week's piece. The instrumentation of the works is somewhat analogous, both containing bowed electric guitars (in Vees's case, a single electric bass). The piece primarily explored a juxtaposition of very low and high frequencies. The deep bowed notes of the bass created a sort of bedding to listen to, allowing overtones to bounce in and out infrequently. In some ways this reminded me of a point made by Mark in Foundations of Music History, how the 'tenor' voice in early times derived its name from the Latin tenere, meaning 'to hold' (referring to the way the lower voice persistently held notes for longer lengths). Despite the fact that Vees, or any other modernistic composer, would deny any parallel between their music and anything pre-1950 (let alone 10th century), I think it's safe to say that the human ear prefers held lower frequencies as a bedding with higher frequencies sprinkled more lightly over the top, or at least that this has been embedded into us thanks to the stylistic choices of musicians from the 9th century onwards. One thing I've particularly enjoyed in these ambient works David has presented us with is the great lengths the pieces run to, establishing dispositions that sound alike throughout the work (repeating similar samples over and over), yet nothing about the works is rhythmically repetitive. So in one light the pieces sound similar throughout, yet nothing in particular ever really repeats. This is, I think, what allows minimalist composers to establish such unique, colourful sound environments, which personally are exciting toys for the imagination. I like.
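Just to put some numbers on those overtones: assuming the bass was sitting around a standard low E (roughly 41 Hz; I don't actually know what tuning Vees uses), the partials bouncing in and out would sit on the harmonic series:

    fundamental = 41.2  # Hz, roughly a bass guitar's low E

    for n in range(1, 9):
        print("harmonic %d: %.1f Hz" % (n, n * fundamental))

The higher up the series you go, the quieter and more fragile the partials get, which is probably why they only surface now and then over the held fundamental.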

Not unlike Surf Music II, which he cited as an inspiration, 'piano piece', the first composition David shared with the class, was another investigation of the harmonics created by low notes. From where I sat, a metre from the lid of the grand piano, the rapid low notes of David's piece created a wealth of different overtone situations. During this performance I was able to tune in really intensely to the web of harmonics created by the rapid low notes, and I was very excited by this. I have both David's compositional purpose and the quality of sound generated by that particular grand piano to thank for that. I didn't get into his other pieces as much, especially the work for two violins. Personally, I find the sound produced by violins a little aggravating. This, combined with a composition that explores notes and intervals outside the conventional 12-semitone scale, ultimately created a sound which was just too irritating for me to get into. Nevertheless, my respect for both his taste in music and his unique style of composition seems to grow every Thursday.

Bibliography

Harris, David. "Music Technology Workshop." Workshop presented in the EMU (Schulz Level 5), University of Adelaide, 16 March (2006).

Carroll, Mark. "Foundations of Music History - Lecture." Lecture presented in Hughes Lecture Theatre, University of Adelaide, 17 March (2006).

19.3.06

[Image: diagram of the three patch-bay connection types]
This week in Audio Arts we focused on routing signal through patch-bays. The picture above shows the three main types of connection: normalised, semi-normalised and thru. On a normalised point, the default top-to-bottom connection is broken as soon as a plug is inserted into either row; on a semi-normalised point, plugging into the top row only taps the signal, and the default connection breaks only when you plug into the bottom row; a thru point has no default connection at all. This class gave me an appreciation of how pertinent routing is to the organisation and efficiency of a studio. For somebody working in a studio, having a clear image of the way signal travels around patch bays, mixers and other devices is a vital element of being able to use the technology to its fullest potential. The organisation and efficiency of a sound technician seem to rely somewhat on being familiar with the equipment and the way it has been implemented. Again highlighting the importance of concise organisation, in Creative Computing we discussed metadata.
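Back on the patch bays: as a memory aid, here's a toy model of the three behaviours (my own summary, not any real console's spec), keyed on whether plugging into the top or bottom row of a patch point breaks the default connection.

    # True = inserting a plug in that row breaks the default (normalled) connection
    PATCH_POINT_TYPES = {
        "normalised":      {"top_breaks_normal": True,  "bottom_breaks_normal": True},
        "semi-normalised": {"top_breaks_normal": False, "bottom_breaks_normal": True},   # top row only taps the signal
        "thru":            {"top_breaks_normal": None,  "bottom_breaks_normal": None},   # no default connection to break
    }

    for name, behaviour in PATCH_POINT_TYPES.items():
        print(name, behaviour)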

In forum class I was somewhat enthralled by Glenn Branca's 'Symphony No. 3' (Gloria). After David mentioned that Branca composed the piece for six electric guitars (among other instruments), which were to be bowed with loose horsehair, I was intrigued. Despite reminding me of the soundtrack to a 1980s science-fiction film, I found the mantra extremely intense, and it vividly played with my imagination. With such a constant, thick array of sounds, I found that while I listened to Gloria it was impossible to take in everything at once, and at times some 'ringey' tones became more or less annoying as you focused on different noises. This ultimately gave me the idea that all music, and noise in general, is a unique experience for each listener. The spectrum of hums created by the array of guitars combined to sound like the engine of a flying saucer. The frequent chime of bells added another eerie quality to the piece, which I thought helped it sound unearthly. The other obvious catalyst of this was the exploration of overtones, which progressively gave my mind's eye an image of a large flying saucer gearing up for space flight. Ultimately, the plethora of harmonics built up, propelling the piece towards its climax. The abundance of dissonance in the piece, which could be felt in chords held for extremely long periods, established that this was not 'pleasant music' but rather an idea of a 'soundscape'. Works that aim not simply to please the ear, instead existing under a different objective, always seem to leave more of an impression on me, mainly because I think they're more reflective of life than of pleasure.

Bibliography

Haines, Christian. "Audio Arts - Patchbay Routing." Practical class in the EMU (5th floor), Schulz Building, University of Adelaide, 07 March (2006).

Haines, Christian. "Creative Computing - Audio File Formats." Practical class presented in the Audio Lab (Schulz 4.07), University of Adelaide, 09 March (2006).

Harris, David. "Music Technology Workshop." Workshop presented in the EMU (Schulz Level 5), University of Adelaide, 09 March (2006).