GetSampleAsync does not fire in MediaStreamSource on WP7 device - silverlight

I've been developing an audio app for Windows Phone 7 and up to this point have been using the WP7 emulator. The app uses a custom MediaStreamSource class to stream audio to a MediaElement.
On a real device, the custom MediaStreamSource completely fails. After calling MediaElement.Play(), the MediaStreamSource's GetSampleAsync method never gets called. It works just fine in the emulator.
I've started the app in the debugger (running on the device) and no exceptions get thrown anywhere.
I'm wondering whether my stream source is using a sample rate, bits per sample, or channel count that is not supported. I cannot find any documentation on which values are supported; however, I find it hard to believe that my settings (44,100 Hz, 16 bits/sample, 2 channels) are not.
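For reference, a hedged sketch of how that format is typically declared to the pipeline. A PCM stream is described by a WAVEFORMATEX structure serialized little-endian as a hex string; the helper name below is illustrative, not a framework API:

```csharp
using System;
using System.Text;

static string WaveFormatExToHex(int sampleRate, short bitsPerSample, short channels)
{
    short blockAlign = (short)(channels * (bitsPerSample / 8));
    int avgBytesPerSec = sampleRate * blockAlign;

    var hex = new StringBuilder();
    Action<byte[]> append = bytes =>
    {
        foreach (byte b in bytes) hex.Append(b.ToString("X2"));
    };

    append(BitConverter.GetBytes((short)1));       // wFormatTag: WAVE_FORMAT_PCM
    append(BitConverter.GetBytes(channels));       // nChannels
    append(BitConverter.GetBytes(sampleRate));     // nSamplesPerSec
    append(BitConverter.GetBytes(avgBytesPerSec)); // nAvgBytesPerSec
    append(BitConverter.GetBytes(blockAlign));     // nBlockAlign
    append(BitConverter.GetBytes(bitsPerSample));  // wBitsPerSample
    append(BitConverter.GetBytes((short)0));       // cbSize (0 for plain PCM)
    return hex.ToString();
}
```

The resulting string is what would be assigned to `MediaStreamAttributeKeys.CodecPrivateData` in `OpenMediaAsync` before calling `ReportOpenMediaCompleted`. For 44,100 Hz / 16-bit / stereo it produces "0100020044AC000010B10200040010000000".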
Thoughts?

The answer is that the Zune software interferes with the phone's media capabilities. The app will work on the device if you disconnect the device from the computer, or if you connect using the WPConnect tool: http://blogs.msdn.com/b/jaimer/archive/2010/11/03/tips-for-debugging-wp7-media-apps-with-wpconnect.aspx

Related

Read BLE beacons from a React PWA

I'm designing a PWA for customer support.
One of its features requires positioning the user's phone precisely (within a 5-meter margin of error).
I'm thinking of using beacons, as their range and precision suit my needs.
The thing is: how can I read beacons from a React.js (no native code) PWA? Of course, triangulation would require reading several beacons.
Sorry, there is no practical way to build a web app that scans for Bluetooth beacons, because browsers provide no bindings for raw Bluetooth scanning.
Google Chrome does support BLE interaction from JavaScript through the Web Bluetooth API. However, it only supports discovering and connecting to GATT services, not the arbitrary BLE scanning needed to find beacons. This means you cannot detect iBeacon-, Eddystone-, or AltBeacon-compatible devices.
You might be able to discover a custom BLE device that simulates beacon behavior through a connectable GATT service. But even if you did, it would only work in Chrome, not in Safari, Edge, or Firefox, because Web Bluetooth is not supported on those platforms.
Even if you got this working in Chrome with a custom beacon, Bluetooth beacons only provide very rough distance estimates. Triangulation only works decently at very close range, 3 meters or less. More practical indoor positioning techniques with beacons use RSSI fingerprinting, not triangulation; and again, even Chrome with Web Bluetooth does not support the arbitrary scanning needed for that.
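The GATT-only limitation shows in the shape of the API itself. A hedged sketch (Chrome-only; the function name is mine, and 'battery_service' is just an example of a standard GATT service name):

```javascript
// The only discovery primitive Web Bluetooth offers is requestDevice(),
// which filters on advertised GATT services. There is no call that returns
// raw advertisement payloads, which is what iBeacon/Eddystone/AltBeacon
// detection would require.
async function findConnectableDevice() {
  // Feature-detect: navigator.bluetooth exists only in supporting browsers.
  if (typeof navigator === 'undefined' || !navigator.bluetooth) {
    return null;
  }
  // Must be triggered by a user gesture; opens Chrome's device chooser.
  const device = await navigator.bluetooth.requestDevice({
    filters: [{ services: ['battery_service'] }], // a GATT service, not a beacon frame
  });
  return device.name || device.id;
}

findConnectableDevice().then((result) => {
  console.log(result === null ? 'Web Bluetooth not available here' : `Found: ${result}`);
});
```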

WPF MediaKit Audio Issues

I am currently using the WPF MediaKit in an application I am working on, and I am having an issue with audio. On some computers playback would not work at all. It was confusing because it seemed to work sometimes and not others. Then I noticed it was failing to open the media because of an audio issue: if no audio devices are connected, it cannot open the audio stream and fails completely. This is a problem; we can't tell people they have to connect speakers for the application to function. Is there a way I can tell it to just ignore the audio in that case and still play the video? Here is the error I am getting.
System.Runtime.InteropServices.COMException (0x80040256): Cannot play back the audio stream: no audio hardware is available, or the hardware is not responding.
at DirectShowLib.DsError.ThrowExceptionForHR(Int32 hr)
at WPFMediaKit.DirectShow.MediaPlayers.MediaPlayerBase.AddFilterByDevice(IGraphBuilder graphBuilder, DsDevice device)
at WPFMediaKit.DirectShow.MediaPlayers.MediaPlayerBase.AddFilterByName(IGraphBuilder graphBuilder, Guid deviceCategory, String friendlyName)
at WPFMediaKit.DirectShow.MediaPlayers.MediaUriPlayer.InsertAudioRenderer(String audioDeviceName)
at WPFMediaKit.DirectShow.MediaPlayers.MediaUriPlayer.OpenSource()
0x80040256 is VFW_E_NO_AUDIO_HARDWARE, "Cannot play back the audio stream: no audio hardware is available, or the hardware is not responding.", which means the system has no audio output device to play to.
WPFMediaKit, however, assumes you have one and attempts to create an audio renderer unconditionally. That is, you have to edit the WPFMediaKit code to work around it.
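A hedged sketch of the kind of local patch this implies, inside WPFMediaKit's MediaUriPlayer.OpenSource() path: treat the "no audio hardware" HRESULT as non-fatal so the graph can still render video. InsertAudioRenderer mirrors the stack trace above; the wrapper name and exact placement are my assumptions, not WPFMediaKit API:

```csharp
using System.Runtime.InteropServices;

// VFW_E_NO_AUDIO_HARDWARE from the DirectShow error table.
private const int VFW_E_NO_AUDIO_HARDWARE = unchecked((int)0x80040256);

private void TryInsertAudioRenderer(string audioDeviceName)
{
    try
    {
        // The existing WPFMediaKit call that throws when no device exists.
        InsertAudioRenderer(audioDeviceName);
    }
    catch (COMException ex)
    {
        if (ex.ErrorCode != VFW_E_NO_AUDIO_HARDWARE)
            throw; // only swallow the "no audio device" case
        // No audio output device: skip the audio renderer, keep the video.
    }
}
```

The design choice here is to swallow only that one HRESULT; any other COM failure still surfaces, so genuine graph-building errors are not hidden.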

Streaming, Silverlight Media Framework, not playing on device

I am trying to stream video content to a windows phone.
I am using the following code.
"player" is the Silverlight Media Player used here.
PlaylistItem item = new PlaylistItem();
item.DeliveryMethod = Microsoft.SilverlightMediaFramework.Plugins.Primitives.DeliveryMethods.AdaptiveStreaming;
item.MediaSource = new Uri("http://playready.directtaps.net/smoothstreaming/SSWSS720H264/SuperSpeedway_720.ism/Manifest");
item.VideoHeight = player.Height;
item.VideoWidth = player.Width;
player.Playlist.Add(item);
player.Play();
I am able to play it in the emulator, but on the device I don't see anything.
Can anyone tell me where I am going wrong?
I sometimes get this log in the debug output window.
A first chance exception of type 'System.InvalidOperationException' occurred in Microsoft.Web.Media.SmoothStreaming.dll
Are you using the latest version of the Silverlight Media Framework from CodePlex? Could it be a bug in the version you are using that the latest release corrects? Otherwise, it is hard to investigate what could be wrong with the network connectivity on the device versus the emulator.
BTW, what device are you using?
It was a bandwidth issue! My PC was on a fast internet connection, so it was able to play the stream.
My device was connected to a Wi-Fi hub that was out of range at some points.
When I took my device near the hub, the stream played.

Audio conference between WPF Applications

I have two WPF applications that communicate using a couple of duplex WCF services. I also need to enable audio communication between them. I've been looking for a solution for a while now, but couldn't find a good one.
What I've tried is audio streaming using Microsoft Expression Encoder on the "server" side (the one that feeds the audio) and playing it on the "client" using VLC .NET. It works, at least for streaming a song, but it is a big resource eater. Initial buffering also takes a long time, and so does stopping the stream.
What other options do I have? I want a clear, lightweight audio conversation between the apps, kind of like Skype. Is this possible? Thanks
EDIT: I found NAudio and it looks like a good audio library; I managed to stream my microphone quite easily. However, I have a big problem: I can hear the voice clearly on the client, but it echoes indefinitely. Plus, there's an annoying background sound (could this be caused by the processor?), and after a while a very high, loud sound plays on the receiving end. All I can do is stop the whole transmission. I have no idea what's causing these problems. I use the 'SpeexChatCodec' as in the NetworkChat example provided (sampling rate: 8000, 2 channels). Any suggestions? Thanks
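For context, a minimal hedged sketch of the capture-and-play path under NAudio, assuming its WaveInEvent, BufferedWaveProvider, and WaveOutEvent types. The local loopback is for illustration only; in a real app the buffer would be fed from bytes received over the WCF channel. An indefinite echo usually means each side's speakers are feeding back into its own microphone, so a headset (or echo cancellation) is needed when testing with open speakers. Note also that Speex is normally used with mono speech, so 2 channels at 8,000 Hz may itself be a mismatch worth checking:

```csharp
using NAudio.Wave;

// The format must match the sender's exactly, or playback will be distorted.
var format = new WaveFormat(8000, 16, 1);
var incoming = new BufferedWaveProvider(format);

var waveIn = new WaveInEvent { WaveFormat = format };
waveIn.DataAvailable += (s, e) =>
{
    // Real app: send e.Buffer over the network instead of looping back here.
    incoming.AddSamples(e.Buffer, 0, e.BytesRecorded);
};

var waveOut = new WaveOutEvent();
waveOut.Init(incoming);

waveIn.StartRecording();
waveOut.Play();
```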
It would be a lot of work to write a library supporting that from scratch. If you can spend $150 on this, I would suggest purchasing a library like the iConf .NET Video Conferencing SDK from AvSpeed.

Difference in CPU & memory utilization while using VLC Mozilla plugin and VLC player for playback of RTSP streams

For one of our ongoing projects we were planning to use a multimedia framework like VLC or GStreamer to capture and play back / render H.264-encoded RTSP streams. To that end we have been observing the performance (CPU and memory utilization) of VLC using two demo applications we built. The first demo uses the Mozilla VLC plugin, with which we embedded up to four H.264-encoded RTSP streams in a single HTML web page; the other demo simply invokes the VLC player and plays a single H.264-encoded RTSP stream.
I was surprised to observe the following results (tests were conducted on Ubuntu 11.04):
Demo 1 (Mozilla VLC plugin, 4 parallel streams)
CPU utilization: 16%
Memory utilization: ~61 MB
Demo 2 (VLC player, 1 stream)
CPU utilization: 16%
Memory utilization: ~17 MB
My question is: why is the CPU utilization no higher for the Mozilla VLC plugin even though it is decoding four streams instead of one?
Regards,
Saurabh Gandhi
I'm also using the VLC Mozilla plugin for my project, and I have problems with H.264 streams. The only way I could handle such a stream was to use --ffmpeg-hw (for VA-API), which, because of Xlib, works only in the standalone VLC app (see the --no-xlib flag in vlcplugin_base.cpp). So I removed that flag and added XInitThreads(), and it works now, but far from the performance level you reported; besides, the --no-xlib flag was there for a reason (removing it might lead to unwanted behavior).
So the main question is how you arrived at those results, and whether you could share your configuration flags with me and the rest of us.
The system I'm using has a 4-core CPU and NVIDIA ION graphics. The CPU cores stay at a moderate level, but the stream doesn't play smoothly in fullscreen. If the same streams are run in cvlc, they play perfectly. The ffmpeg-hw flag is used in both cases without any warning messages (VA-API initializes successfully).
If you have hardware acceleration of some sort, then the CPU only takes care of routing the data.
