I created the music in my create method like this:
music_background = Gdx.audio.newMusic(Gdx.files.internal("background_music.mp3"));
music_background.setLooping(true);
The problem is that it's not playing in a loop.
I also tried without the loop and instead registered a setOnCompletionListener, but it also doesn't play. When I tried to reload the file like this:
music_background = Gdx.audio.newMusic(Gdx.files.internal("background_music.mp3"));
Inside the listener it worked, but only once.
I think the problem is that when it's done playing, the file disposes itself...
How can I play music in a loop? What am I doing wrong?
You are doing it correctly, but MP3s are not good for looping; use OGG instead. MP3s add a short silence at the start, which OGG and WAV don't have.
Here is my code that works perfectly:
menuMusic = Gdx.audio.newMusic(Gdx.files.internal("data/sounds/music_menu.ogg"));
menuMusic.setLooping(true);
menuMusic.play();
If you have all your files in MP3, just download Audacity, import your MP3s, edit away the blank audio, and export as OGG.
Related
I would like to add a short outro video to a list of videos or add a "watermark" that covers the entire video during the last 5 seconds. How can I achieve this? I added ffmpeg in the title but if there is another free way of doing it in bulk then I am all ears. My OS is Windows 10.
Thanks!
I'm finishing a script that playblasts several pieces of animation into different movie files. It works fine, but I'd like it NOT to open all the video files once it is done playblasting them.
I can't find that option even inside Maya's playblast options itself... Any light on that?
You need to set the viewer flag to false. Are you doing the playblast via code? If so, check this doc.
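For reference, a minimal sketch of that call in Python with maya.cmds; viewer=False is the relevant flag, and the output path, format, and frame range here are just placeholder assumptions:
from maya import cmds

# viewer=False keeps Maya from opening the finished movie in a player
cmds.playblast(
    filename="movies/shot_010",  # hypothetical output path
    format="qt",
    viewer=False,
    showOrnaments=False,
    percent=100,
    startTime=1,
    endTime=120,
)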
I have a list of files (videos and images) that I would like to show on the screen using GStreamer 1.0, meaning iterating over the elements (file paths) in the list and "playing" them sequentially in the C application, with "delays" in between. I tried different examples which partly work, but I cannot put the whole picture together to implement it.
So what is the conceptual solution for this? Should I use one "dynamic" pipeline or two (one for images, because I think an imagefreeze before videoconvert is necessary there, and one for videos)? And how can I use decodebin to detect the format of the media automatically? decodebin works from the command line, but in the C application it fails with errors like "no video decoder found for 'jpeg'".
Try to make a universal pipeline (or two: one for videos and one for images), i.e. you feed any file from your list to its input and get video or an image as output. This pipeline should work from gst-launch first. After that, try to implement the pipeline in C code, or post your pipeline here.
My way:
Take a file from the list. If it is an image, create an image decode pipeline; if it is a video, create a video decode pipeline. Play it, delete the pipeline, delay, and go to the next file.
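As a rough illustration of that loop, here is a sketch using the GStreamer Python bindings (the calls map one-to-one onto the C API). The file list, sinks, and delays are assumptions, and videos are handed to playbin instead of a hand-built decodebin graph so that audio streams are handled too:
import time
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

Gst.init(None)

# hypothetical input list: (path, is_image)
items = [("clip1.mp4", False), ("photo1.jpg", True)]

for path, is_image in items:
    if is_image:
        # imagefreeze repeats the decoded frame so the still image stays on screen
        desc = f"filesrc location={path} ! decodebin ! imagefreeze ! videoconvert ! autovideosink"
    else:
        # playbin builds the demux/decode/sink graph itself, including audio
        desc = f"playbin uri={Gst.filename_to_uri(path)}"
    pipeline = Gst.parse_launch(desc)
    pipeline.set_state(Gst.State.PLAYING)
    bus = pipeline.get_bus()
    if is_image:
        time.sleep(5)   # show the still image for a fixed time
    else:
        # block until the video finishes (EOS) or fails
        bus.timed_pop_filtered(Gst.CLOCK_TIME_NONE,
                               Gst.MessageType.EOS | Gst.MessageType.ERROR)
    pipeline.set_state(Gst.State.NULL)
    time.sleep(2)       # the "delay" before the next file
Note that with imagefreeze the image pipeline never reaches EOS (the frame is repeated forever), so the sketch simply tears it down after a fixed time instead of waiting on the bus.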
I am using the GPUImage framework to record multiple videos one after another in close intervals, with various filters enabled in real time, using GPUImageVideoCamera and GPUImageMovieWriter.
When I record a video, it starts with a jerk (a freeze of about half a second) and ends with a jerk as well. I know the reason behind this is the statement in which I pass the movieWriter object to the VideoCamera's audioEncodingTarget.
So in my case, when I record multiple videos one after another (with different GPUImageMovieWriter objects), the video preview view freezes at the start and end of each recording.
If I remove the audio encoding target statement, things improve significantly, but of course I don't get the audio.
Currently I am using an AVAudioRecorder while recording to save the audio tracks, but I believe this is not an ideal workaround.
Is there any way to solve this problem?
-- I looked at Apple's RosyWriter example; their app works almost the same way, but smoothly, at an almost constant 30 fps. I tried to use the RosyWriter code (after removing the code that adds the purple effect) to save the required videos while showing GPUImageVideoCamera's filtered view to the user, but in vain. When applied unmodified, the RosyWriter code just records two videos and the rest fail. I also tried to pass GPUImageVideoCamera's capture session into the RosyWriter code, but I only get videos with black frames and no audio.
Please help with how I can record GPUImage-filtered videos with audio, without this jerkiness. Thanks in advance.
I faced the same issue and here is my workaround.
As you pointed out, this problem happens because the setAudioEncodingTarget method internally calls addAudioInputsAndOutputs to add the audio input/output to the capture session.
To avoid this issue, I created a justSetAudioEncodingTarget method for the VideoCamera, as below
(in GPUImageVideoCamera.m):
// set the encoding target without touching the capture session
- (void)justSetAudioEncodingTarget:(GPUImageMovieWriter *)newValue {
    if (newValue == nil) {
        return;
    }
    addedAudioInputsDueToEncodingTarget = YES;
    [super setAudioEncodingTarget:newValue];
}
The following steps are my scenario, and I verified that it works smoothly:
1. Call the VideoCamera's addAudioInputsAndOutputs right after the VideoCamera is created (that is, not right before starting the recording :)).
2. Set the MovieWriter on the VideoCamera with the justSetAudioEncodingTarget method I made above.
I need to:
play a sound
loop
continue to play in background
Everything works fine with AVAudioPlayer, but now I have to use AVPlayer in order to also play songs from the iPod library. And there is a big problem: AVPlayer has no numberOfLoops property like AVAudioPlayer has.
OK, you can register with NSNotificationCenter for AVPlayerItemDidPlayToEndTimeNotification (or use the addBoundaryTimeObserverForTimes method with times equal to the song duration) to be notified when a song has played to the end.
But this works only if the app is in the foreground; in the background the song continues to play, but stops at the end.
The notification is also received in the background (checked with an NSLog), but the selector's code:
AVPlayerItem *playerItem = [notification object];
[playerItem seekToTime:kCMTimeZero];
...has no effect; the song doesn't loop.
Any suggestions?
The problem you have is that the second the player does a seekToTime, your app looks like it has finished playing audio, and so it gets thrown on the bonfire (suspended). :)
I spent days creating this workaround:
Add "App plays audio" to "Required background mode" in info.plist
Create another AVplayer that plays a one second mp3 file at 0.00000001f rate (silentMp3Player.rate=0.0000001f) which starts when your app enters background and stops when it re-enters foreground.
Try to add this code:
// playback category + active session lets the audio keep running in the background
try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayback)
try AVAudioSession.sharedInstance().setActive(true)