Windows Phone 7.1 play recorded PCM/WAV audio - silverlight

I'm working on a WP7.1 app that records audio and plays it back. I'm using a MediaElement for playback. The MediaElement works fine for playing MP4 files (actually M4A files renamed) downloaded from the server. However, when I try to play a recorded file, with or without the WAV RIFF header (PCM in both cases), it fails with error code 3001, for which I cannot find a definition anywhere.
Can anyone point me to sample code for playing recorded audio in WP7.1 that does not use the SoundEffect class? I don't want to use SoundEffect because it's meant for short audio clips.
This is how I load the audio file:
using (IsolatedStorageFile storage = IsolatedStorageFile.GetUserStoreForApplication())
{
    using (Stream stream = storage.OpenFile(audioSourceUri.ToString(), FileMode.Open))
    {
        m_mediaElement.SetSource(stream);
    }
}

The playback code looks good, so the issue has to be in the code that stores the file. By the way, error 3001 means AG_E_INVALID_FILE_FORMAT.

I just realized that the "Average bytes per second" RIFF header value was wrong: I was using the wrong Bits per Sample value in the calculation. It should have been 16, since the microphone records 16-bit PCM.
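For reference, here is a minimal sketch of the RIFF header arithmetic that was wrong. The byte-rate field must be derived from the actual recording format; the 16 kHz mono figures below are an assumption (the real app should read them from the Microphone API), and the class name is hypothetical:

```java
// Sketch of the two derived fields in a WAV "fmt " chunk.
// ByteRate = SampleRate * NumChannels * BitsPerSample / 8
// BlockAlign = NumChannels * BitsPerSample / 8
public class WavHeader {
    static int byteRate(int sampleRate, int channels, int bitsPerSample) {
        return sampleRate * channels * (bitsPerSample / 8);
    }

    static int blockAlign(int channels, int bitsPerSample) {
        return channels * (bitsPerSample / 8);
    }

    public static void main(String[] args) {
        // Assumed format: 16 kHz, mono, 16-bit PCM
        System.out.println(byteRate(16000, 1, 16)); // 32000
        System.out.println(blockAlign(1, 16));      // 2
    }
}
```

Writing a byte rate computed from the wrong bits-per-sample (e.g. 8 instead of 16) halves this value, which is enough for the decoder to reject the file as AG_E_INVALID_FILE_FORMAT.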


Is it possible to record audio in iOS with Codename One?

My app features a button to record audio (and another to play it back when the recording is over). I send the recorded audio files to a server. On Android the file is recorded as .amr (MIME type audio/amr) and can be played back.
On iOS however the file can neither be played back on the device (iPhone 4 or 4S) nor on a computer. ffmpeg -i reports
[mov,mp4,m4a,3gp,3g2,mj2 @ 0x2fac120] moov atom not found
9gMjOnnmsj9JJZR3.m4a: Invalid data found when processing input
Please note that VLC cannot play it either.
I use the .m4a extension because the Voice recorder app uses it (along with the AAC codec).
Here is the code I use (mostly based on https://github.com/codenameone/CodenameOne/blob/master/Ports/iOSPort/src/com/codename1/impl/ios/IOSImplementation.java#L2768-L2794 ) :
audioMemoPath = ParametresGeneraux.getWRITABLE_DIR() + "AudioMemo-"
        + System.currentTimeMillis()
        + (Display.getInstance().getPlatformName().equals("and")
                ? ".amr"
                : ".m4a");
audioMemoMimeType = MediaManager.getAvailableRecordingMimeTypes()[0];
audioMemoRecorder = MediaManager.createMediaRecorder(audioMemoPath, audioMemoMimeType);
// If the audio permission has not been granted,
// audioMemoRecorder will be null
if (audioMemoRecorder != null) {
    audioMemoRecorder.play();
    boolean b = Dialog.show("Recording", "", "Save", "Cancel");
    audioMemoRecorder.pause();
    audioMemoRecorder.cleanup();
    ...
}
Moreover, if I display the available MIME types on iOS, it yields "audio/amr", which I doubt, since all the posts I could find say AMR is not supported on iOS. Looking at the source, it appears audio/amr is the default MIME type because it is always returned:
/**
 * Gets the available recording MimeTypes
 */
public String[] getAvailableRecordingMimeTypes(){
    return new String[]{"audio/amr"};
}
So my question: is it possible to record audio on iOS, and if so, how can it be done?
Any help appreciated.
Have you looked at the Capture class? That seems to be more straightforward.
https://www.codenameone.com/javadoc/index.html
Ok, I got it working by reimplementing some methods of the MediaManager, namely getAvailableRecordingMimeTypes() and also createMediaRecorder(), to prevent it from using its own getAvailableRecordingMimeTypes() method.
Here is the code for getAvailableRecordingMimeTypes():
/**
 * Normally the method returns amr even for iOS. So we override the original
 * method to return aac on iOS.
 * @return the supported recording MIME types for the current platform
 */
public static String[] getAvailableRecordingMimeTypes() {
    if (Display.getInstance().getPlatformName().equals("ios")) {
        return new String[]{"audio/aac"};
    } else {
        return new String[]{"audio/amr"};
    }
}
createMediaRecorder() is left as is (copied without changes).
Now it is possible to record audio in iOS and play it back in both iOS and Android!
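The platform branch above boils down to a pure function that can be tested in isolation. A sketch (the class name is hypothetical; the MIME-type strings and the "and"/"ios" platform names come from the code above):

```java
// Mirrors the reimplemented getAvailableRecordingMimeTypes():
// AAC on iOS, AMR on every other platform.
public class RecordingMime {
    static String[] availableRecordingMimeTypes(String platformName) {
        if ("ios".equals(platformName)) {
            return new String[]{"audio/aac"};
        }
        return new String[]{"audio/amr"};
    }

    public static void main(String[] args) {
        System.out.println(availableRecordingMimeTypes("ios")[0]); // audio/aac
        System.out.println(availableRecordingMimeTypes("and")[0]); // audio/amr
    }
}
```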

Is it possible to convert a Windows bitmap with an Alpha channel to a MagickImage?

Using Magick.NET, is it possible to convert (in memory) a Windows bitmap with an alpha channel to a MagickImage? When I try the following, it fails with the .NET pixel formats Format32bppPArgb and Format32bppArgb, but works fine with Format24bppRgb. The error message is "no decode delegate for this image format `XWD'".
bmp = New System.Drawing.Bitmap(400, 300, PixelFormat.Format32bppPARgb)
img = New MagickImage(bmp)
You are getting this exception due to a bug in ImageMagick. It reads the stream and tries to determine the format, and incorrectly decides that the format is XWD instead of BMP. I submitted a patch to the Git repository of ImageMagick to fix this. Your code will work in Magick.NET 7.0.0.0018, which has not been released at the time of writing.

Why won't any other video file format work except .mov?

I have an app on the App Store which plays a selection of videos. Currently all of the videos are in the .mov file format, but this makes the app rather large, so I'm trying to use a different file format to reduce the overall size of the app.
I am trying to use the MP4 format, as it reduces the size of each video by more than half, but when I do, the app crashes when I try to play the video, with the following error message:
Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '*** -[NSURL initFileURLWithPath:]: nil string parameter'
I have used the following code for each video in my implementation file and changed the file name and type to match the new video so I don't understand why there should be a problem with the file path.
- (IBAction)playDaresWins:(id)sender {
    NSURL *url = [NSURL fileURLWithPath:[[NSBundle mainBundle]
        pathForResource:@"DaresWins" ofType:@"mov"]];

    _moviePlayer = [[MPMoviePlayerController alloc] initWithContentURL:url];

    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(moviePlayBackDidFinish:)
                                                 name:MPMoviePlayerPlaybackDidFinishNotification
                                               object:_moviePlayer];

    _moviePlayer.controlStyle = MPMovieControlStyleNone;
    _moviePlayer.shouldAutoplay = YES;
    [self.view addSubview:_moviePlayer.view];
    [_moviePlayer setFullscreen:YES animated:NO];
}
Am I missing something?
.mov isn't a video format or codec; it's a container. The developer documentation provides a list of supported video codecs, bit rates, and resolutions (link here; I won't post them here as they can change from OS version to OS version).
However, I don't think that's the problem, because it looks as if you're getting an exception when you're creating the NSURL, not when you're playing the video. That suggests that the path you're providing for your video doesn't exist. Are you sure you a) have the right filename, b) have the right extension (perhaps it's MP4 instead of MOV), and c) have added the movie file to your project correctly?

Silverlight Plug-in Crashes in Video Conference

We have developed an application for video conferencing using Silverlight.
It works properly for 15 to 19 minutes, then the video stops and the Silverlight plug-in crashes.
For video encoding we are using the JPEG encoder; a single image from the CaptureSource gets encoded and sent on each tick of a timer.
I also tried to use Silversuite, but a popup message appears saying Silversuite has expired.
Is there a proper solution for the encoding, the timer, or the plug-in?
Thanks.
We extended the crash-free period from 15 minutes to about 1 to 1.5 hours by flushing the memory stream and decreasing the receiving buffer size.

Synchronizing audio and video

I need to display streaming video using MediaElement in a Windows Phone application.
I'm getting from a web service a stream that contains frames in H264 format and raw AAC bytes (strange, but ffmpeg can parse it only with the -f ac3 parameter).
If I try to play only one of the streams (audio OR video), it plays fine. But I have issues when I try both.
For example, if I report the video samples without timestamps and report the audio with timestamps, my video plays 3x-5x faster than it should.
MediaStreamSample msSamp = new MediaStreamSample(
    _videoDesc,
    vStream,
    0,
    vStream.Length,
    0,
    _emptySampleDict);

ReportGetSampleCompleted(msSamp);
From my web service I get DTS and PTS values for video and audio frames in the following format:
120665029179960
but when I set them on the samples, my audio stream plays too slowly and with delays.
The timebase is 90 kHz.
So, could someone tell me how I can resolve this? Maybe I should calculate different timestamps for the samples? If so, please show me the way.
Thanks.
Okay, I solved it.
So, what I need to do to sync A/V:
Calculate the right timestamp for each video and audio frame using the framerate.
For example, for video I have 90 kHz, for audio 48 kHz, and 25 frames per second, so my frame increments will be:
_videoFrameTime = (int)TimeSpan.FromSeconds((double)0.9 / 25).Ticks;
_audioFrameTime = (int)TimeSpan.FromSeconds((double)0.48 / 25).Ticks;
And now we should add these values for each sample:
private void GetAudioSample()
{
    ...
    /* Getting sample from buffer */
    MediaStreamSample msSamp = new MediaStreamSample(
        _audioDesc,
        audioStream,
        0,
        audioStream.Length,
        _currentAudioTimeStamp,
        _emptySampleDict);

    _currentAudioTimeStamp += _audioFrameTime;

    ReportGetSampleCompleted(msSamp);
}
For getting a video frame the method is the same, but with a _videoFrameTime increment instead.
Hope this will be helpful for someone.
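As a sanity check on increments like these, the nominal frame durations can be computed directly in the 100-ns ticks that MediaStreamSample timestamps use. A sketch under stated assumptions (25 fps video; AAC at 48 kHz with the usual 1024 samples per frame, which is an assumption about this particular stream):

```java
// Per-frame timestamp increments expressed in 100-ns ticks
// (the unit used by TimeSpan.Ticks / MediaStreamSample timestamps).
public class FrameTiming {
    static final long TICKS_PER_SECOND = 10_000_000L; // one tick = 100 ns

    // Duration of one video frame at a given framerate.
    static long videoFrameTicks(int fps) {
        return TICKS_PER_SECOND / fps;
    }

    // Duration of one audio frame holding samplesPerFrame samples.
    static long audioFrameTicks(int samplesPerFrame, int sampleRate) {
        return TICKS_PER_SECOND * samplesPerFrame / sampleRate;
    }

    public static void main(String[] args) {
        System.out.println(videoFrameTicks(25));          // 400000 ticks = 40 ms
        System.out.println(audioFrameTicks(1024, 48000)); // 213333 ticks ~ 21.3 ms
    }
}
```

If the stream also delivers 90 kHz PTS/DTS values, dividing them by 9 converts them to the same 100-ns tick unit (90,000 units/s vs 10,000,000 ticks/s).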
Roman.
