objective c - Offline rendering with The Amazing Audio Engine
I have also posted this question on The Amazing Audio Engine forum.
Hi everyone, I am new to The Amazing Audio Engine and to iOS development in general, and I have been trying to figure out how to get the BPM of a track.
So far I have found two threads about offline rendering on the forum:
- http://forum.theamazingaudioengine.com/discussion/comment/1743/#comment_1743
- http://forum.theamazingaudioengine.com/discussion/comment/649#comment_649
As far as I know, the AEAudioControllerRenderMainOutput function is correctly implemented in this fork.
I am trying to offline render a track and then use the beat detection algorithm described here (in JavaScript) and implemented here.
So far I am loading that fork and using Swift (I am part of the Make School Summer Academy at the moment, which teaches Swift).
When playing a track, this code works for me (no offline rendering!):
audioController = AEAudioController(audioDescription: AEAudioController.nonInterleavedFloatStereoAudioDescription())
let file = NSBundle.mainBundle().URLForResource("track", withExtension: "m4a")
let channel: AnyObject! = AEAudioFilePlayer.audioFilePlayerWithURL(file, audioController: audioController, error: nil)
let receiver = AEBlockAudioReceiver { (source, time, frames, audioBufferList) -> Void in
    let leftSamples = UnsafeMutablePointer<Float>(audioBufferList[0].mBuffers.mData)
    // Advance the buffer by sizeof(Float) * 512
    let rightSamples = UnsafeMutablePointer<Float>(audioBufferList[0].mBuffers.mData) + 512
    println("leftSamples: \(leftSamples) rightSamples: \(rightSamples)")
}
audioController.addChannels([channel])
audioController.addOutputReceiver(receiver)
audioController.start(nil)
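From what I understand, the receiver block hands me non-interleaved Float32 buffers, so for the beat detection I would have to accumulate the samples somewhere before analysing them. This is only a rough sketch of what I mean; the collectedSamples array, the collectingReceiver name, and the assumption that frames valid samples arrive per callback are mine, not part of TAAE:

var collectedSamples = [Float]()

let collectingReceiver = AEBlockAudioReceiver { (source, time, frames, audioBufferList) -> Void in
    // Left channel of the non-interleaved float buffer list
    let samples = UnsafeMutablePointer<Float>(audioBufferList[0].mBuffers.mData)
    for i in 0..<Int(frames) {
        // NOTE: appending on the audio thread like this is only for experimenting, not for production
        collectedSamples.append(samples[i])
    }
}
audioController.addOutputReceiver(collectingReceiver)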
Trying offline rendering
This is the code I am trying to run while using this fork:
audioController = AEAudioController(audioDescription: AEAudioController.nonInterleaved16BitStereoAudioDescription())
let file = NSBundle.mainBundle().URLForResource("track", withExtension: "mp3")
let channel: AnyObject! = AEAudioFilePlayer.audioFilePlayerWithURL(file, audioController: audioController, error: nil)
audioController.addChannels([channel])
audioController.start(nil)
audioController.stop()

var t = AudioTimeStamp()
let bufferLength: UInt32 = 4096
var buffer = AEAllocateAndInitAudioBufferList(audioController.audioDescription, Int32(bufferLength))
AEAudioControllerRenderMainOutput(audioController, t, bufferLength, buffer)
var renderDuration: NSTimeInterval = channel.duration
var sampleRate: Float64 = audioController.audioDescription.mSampleRate
var lengthInFrames: UInt32 = UInt32(renderDuration * sampleRate)
var songBuffer: [Float64]
t.mFlags = UInt32(kAudioTimeStampSampleTimeValid)
var frequencyAnalyzer = FrequencyAnalyzer()

println("renderDuration \(renderDuration)")

var outIsOpen = Boolean()
AUGraphClose(audioController.audioGraph)
AUGraphIsOpen(audioController.audioGraph, &outIsOpen)
println("AUGraphIsOpen: \(outIsOpen)")

for (var i: UInt32 = 0; i < lengthInFrames; i += bufferLength) {
    AEAudioControllerRenderMainOutput(audioController, t, bufferLength, buffer)
    t.mSampleTime += Float64(bufferLength)
    println(t.mSampleTime)
    let leftSamples = UnsafeMutablePointer<Int16>(buffer[0].mBuffers.mData)
    let rightSamples = UnsafeMutablePointer<Int16>(buffer[0].mBuffers.mData) + 512
    println("leftSamples: \(leftSamples.memory) rightSamples: \(rightSamples.memory)")
}

AEFreeAudioBufferList(buffer)
AUGraphOpen(audioController.audioGraph)
audioController.start(nil)
audioController.stop()
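While debugging this I also started checking the OSStatus values returned by the AUGraph calls, because I kept missing failures in the console. The small check function below is just my own helper, not something from TAAE:

func check(status: OSStatus, label: String) {
    // 0 is noErr; anything else means the call failed
    if status != 0 {
        println("\(label) failed with OSStatus \(status)")
    }
}

// Example usage around the graph calls above
check(AUGraphClose(audioController.audioGraph), "AUGraphClose")
check(AUGraphIsOpen(audioController.audioGraph, &outIsOpen), "AUGraphIsOpen")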
Offline rendering is not working for me at the moment. The second example does not work; it gives me a lot of mixed errors that I don't understand.
A common one happens inside the channelAudioProducer function, on this line:
// Tell mixer/mixer's converter unit to render into audio
status = AudioUnitRender(group->converterUnit ? group->converterUnit : group->mixerAudioUnit,
                         arg->ioActionFlags,
                         &arg->originalTimeStamp,
                         0,
                         *frames,
                         audio);
It gives me EXC_BAD_ACCESS (code=EXC_I386_GPFLT). Among the other errors, this one is the most common.
I am sorry, I am a total noob in this field and there is a lot of stuff I don't understand. Should I use nonInterleaved16BitStereoAudioDescription or nonInterleavedFloatStereoAudioDescription? And how should I read mData?
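To make the second question concrete, this is how I am currently trying to read the left channel from one rendered buffer when using the 16-bit description. The loop bound of bufferLength frames is my own assumption about how many valid samples the render call fills in:

// My current attempt at reading one rendered buffer of non-interleaved 16-bit samples
let leftSamples = UnsafeMutablePointer<Int16>(buffer[0].mBuffers.mData)
var left = [Int16]()
for i in 0..<Int(bufferLength) {
    left.append(leftSamples[i])
}
println("read \(left.count) left samples, first one: \(left[0])")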
I would love some help on this since I am kind of lost at the moment. Please, when you answer, try to explain as much as you can, as I am new to this stuff.
Note: posting code in Objective-C is fine if you don't know Swift.