I am currently using the WPF MediaKit in an application I am working on, and I am having an issue with audio. On some computers playback would not work at all. It was confusing because it seemed to work sometimes and not others. Then I noticed it was failing to open the media because of an audio issue: if there are no audio devices connected, it cannot open the audio stream and fails completely. This is a problem; we can't tell people they have to connect speakers for the application to function. Is there a way I can tell it to just ignore the audio in this case and still play the video? Here is the error I am getting.
System.Runtime.InteropServices.COMException (0x80040256): Cannot play back the audio stream: no audio hardware is available, or the hardware is not responding.
at DirectShowLib.DsError.ThrowExceptionForHR(Int32 hr)
at WPFMediaKit.DirectShow.MediaPlayers.MediaPlayerBase.AddFilterByDevice(IGraphBuilder graphBuilder, DsDevice device)
at WPFMediaKit.DirectShow.MediaPlayers.MediaPlayerBase.AddFilterByName(IGraphBuilder graphBuilder, Guid deviceCategory, String friendlyName)
at WPFMediaKit.DirectShow.MediaPlayers.MediaUriPlayer.InsertAudioRenderer(String audioDeviceName)
at WPFMediaKit.DirectShow.MediaPlayers.MediaUriPlayer.OpenSource()
0x80040256 is VFW_E_NO_AUDIO_HARDWARE ("Cannot play back the audio stream: no audio hardware is available, or the hardware is not responding."), which means the system has no audio output device to play to.
WPFMediaKit, however, assumes you have one and attempts to create an audio renderer unconditionally. That means you have to edit the WPFMediaKit code to work around it.
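For example, MediaUriPlayer.InsertAudioRenderer could be patched to swallow that specific HRESULT and continue without audio. This is a sketch, not the library's actual code: the field name m_graph and the exact method body are assumptions based on the stack trace, so adjust to match the source you have.

```csharp
// Hypothetical patch to WPFMediaKit's MediaUriPlayer.InsertAudioRenderer.
// The field name m_graph is an assumption; AddFilterByName's signature is
// taken from the stack trace above.
private void InsertAudioRenderer(string audioDeviceName)
{
    if (m_graph == null)
        return;

    try
    {
        AddFilterByName(m_graph, FilterCategory.AudioRendererCategory, audioDeviceName);
    }
    catch (COMException ex)
    {
        const int VFW_E_NO_AUDIO_HARDWARE = unchecked((int)0x80040256);
        if (ex.ErrorCode != VFW_E_NO_AUDIO_HARDWARE)
            throw;
        // No audio device present: skip the audio renderer entirely so the
        // filter graph can still be built and the video still renders.
    }
}
```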
I made a music player in wpf using cscore. Now, I want to add a feature so I can stream the output in real-time (like a radio) to another instance of the music player over internet. I could see how to stream the data later, but first I need to know how to get the bytes of the audio output. I'm asking for help because I'm lost, I've done some research and found nothing but how to stream the desktop audio. That's not a solution, because I want to listen to the same music with some friends while hanging out on Discord, so if I stream the desktop audio, they will listen to themselves besides the music. Any help will be welcome. Thanks in advance!
I have not used cscore; I mainly use NAudio, a similar library that facilitates getting audio to and from the sound card. So I will try to answer in a way that allows you to find what you are looking for in cscore.
In your player code you will be pulling data from the audio file. In NAudio this is done with an audio file reader; I think it is called WaveFileReader in cscore. This reader translates the audio file into a stream of audio samples in the form of byte arrays, and those byte arrays are then used to feed the WASAPI output so the audio plays on the sound card.
The ideal place to hook in your streaming system is between those two steps. Rather than just passing the audio samples to the sound card, take a copy of the byte array containing the samples; it is this data you will need to stream to your friends.
From there you will need to look at compressing the audio and at streaming protocols like RTP; all of this can be done in C#. The difficulty will be, as it always is in audio, keeping your data stream in pace with the sound card: every time WasapiOut asks for more samples you need to have them ready, otherwise the audio will be choppy.
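To make the "tap" idea concrete, here is a minimal NAudio sketch (the cscore equivalents will differ in name). The onSamples callback and the sendQueue in the usage comment are placeholders for whatever network code you write:

```csharp
using System;
using NAudio.Wave;

// Wraps any IWaveProvider and hands a copy of every buffer of samples to a
// callback before passing them on to the sound card.
class TapWaveProvider : IWaveProvider
{
    private readonly IWaveProvider source;
    private readonly Action<byte[]> onSamples; // your streaming hook

    public TapWaveProvider(IWaveProvider source, Action<byte[]> onSamples)
    {
        this.source = source;
        this.onSamples = onSamples;
    }

    public WaveFormat WaveFormat => source.WaveFormat;

    public int Read(byte[] buffer, int offset, int count)
    {
        int read = source.Read(buffer, offset, count);
        if (read > 0)
        {
            // Copy the samples so the playback buffer can be reused safely.
            var copy = new byte[read];
            Array.Copy(buffer, offset, copy, 0, read);
            onSamples(copy); // enqueue for compression/network send
        }
        return read;
    }
}

// Usage sketch:
//   var reader = new AudioFileReader("song.mp3");
//   var tap = new TapWaveProvider(reader, bytes => sendQueue.Add(bytes));
//   var output = new WasapiOut();
//   output.Init(tap);
//   output.Play();
```

Playback is unaffected because the tap only copies the buffer; the real work is making sure the consumer of sendQueue keeps up.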
I hope this points you in the right direction. Others with cscore experience may have code examples to assist you more directly.
I am working on a WPF application that displays RTSP video streams. Currently, the application handles communication with two types of devices that use RTSP: cameras and archivers (pretty much DVR). Over the lifetime of the app the streams may and usually will be closed multiple times, so we need to make sure they are not cluttering the memory and network when we close them.
We had a go with the MediaElement. We needed to install LAV Filters for ME to display RTSP streams at all. We could see the video from the cameras, but the stream wasn’t released until we stopped the video, invoked Close() on the MediaElement and set its Source to null. The video seemed to be released, but we still decided to check memory usage with a profiler. We created a loop in which we initialized a new MediaElement (local reference), played an RTSP stream and closed it after establishing the connection. Over half an hour of running the test we witnessed a steady increase in memory consumption, and as a result we lost 20 MB of memory to all the MediaElements we created. The reason is still unknown to us (timers being bound to the dispatcher?), but after searching the Internet we accepted that there is a problem with MediaElement itself.
We decided this is negligible for our use case (no one is going to create MediaElements with that frequency). Unfortunately, MediaElement was not releasing the archiver streams when using the same approach: after we got rid of the MediaElement for an archiver stream, the archiver’s server still reported the connection as open.
We analyzed the packets with Wireshark. Cameras and archivers use the same version of the RTSP protocol, but when we close the connection on the camera the RTCP and data packets stop coming, which is not the case with the archivers.
We decided to abandon ME altogether and switch to the VLC Player. When we hit stop on the VLC player the connections are nicely closed, but VLC has a bug causing rebuffering of streams at the beginning of any connection. It’s a known issue: https://trac.videolan.org/vlc/ticket/9087. Rebuffering is not consistent. Sometimes it happens twice, sometimes three times. We tried playing around with buffer settings for VLC (network-buffer, live-buffer… you name it) but nothing helped.
There are several questions we are looking for answers to:
Why is ME keeping the connection alive for archivers but not for cameras? Is the archiver not handling the RTSP termination packet correctly?
Which component is responsible for keeping the connection open on the client side, and how could we work around it (terminate the stream)?
How can we prevent VLC from rebuffering the stream after the connection is established?
Did you have any success with streaming multiple RTSP streams without performance/memory issues in your application? What components did you use?
Side note: we also played with MediaPlayerHQ, which behaves nicely unless we kill the process; if we do that, the stream remains open for a couple of minutes.
I would appreciate any hints and suggestions!
Check out https://net7mma.codeplex.com; it is excellent on memory and CPU. It has been tested with 1000 clients connected, and the end users never experienced any additional delays. VLC achieves RTCP synchronization with my library, so buffering should happen only once, if at all. The library should also help you decode the video/audio using Media Foundation. It also supports getting media information from container files, and it will support their playback through the included RTSP server or writing them to alternate container formats.
The code is completely managed, written in C#, and released under the Apache license.
I am trying to stream video content to a windows phone.
I am using the following code.
"player" is the Silverlight Media Player used here.
PlaylistItem item = new PlaylistItem();
item.DeliveryMethod = Microsoft.SilverlightMediaFramework.Plugins.Primitives.DeliveryMethods.AdaptiveStreaming;
item.MediaSource = new Uri("http://playready.directtaps.net/smoothstreaming/SSWSS720H264/SuperSpeedway_720.ism/Manifest");
item.VideoHeight = strmPlayer.Height;
item.VideoWidth = strmPlayer.Width;
player.Playlist.Add(item);
player.Play();
I am able to play it in the emulator, but on the device I don't see anything.
Can anyone tell me where I am going wrong?
I sometimes get this log in the debug output window.
A first chance exception of type 'System.InvalidOperationException' occurred in Microsoft.Web.Media.SmoothStreaming.dll
Are you using the latest version of the Silverlight Media Framework as available from CodePlex? Could it be a bug in the version you are using that the latest release corrects? Otherwise, it is hard to investigate what could be wrong in the network connectivity on the device versus the emulator.
BTW, what device are you using?
It was a bandwidth issue! My PC was using a fast internet connection, so it was able to play the stream.
My device was connected to a Wi-Fi hub, which was out of range at some points.
When I took my device near the hub, the stream played.
I have 2 WPF applications that communicate using a couple of duplex WCF services. I need to enable audio communication also between them. I've been looking for a solution for a while now, but couldn't find a good one.
What I've tried is audio streaming using Microsoft Expression Encoder on the "server" side (the one that feeds the audio) and playing it on the "client" using VLC .NET. It works, at least for streaming a song, but it's a big resource eater. The initial buffering also takes a long time, and so does stopping the stream.
What other options do I have? I want a clear, lightweight audio conversation between the apps, kinda like Skype. Is this possible? Thanks
EDIT: I found NAudio and it looks like a good audio library; I managed to stream my microphone quite easily. However, I have a big problem: I can hear the voice clearly on the client, but it echoes indefinitely. Plus, there's an annoying background sound (could this be caused by the processor?), and after a while a very high, loud sound is played on the receiving end. All I can do is stop the whole transmission. I have no idea what's causing these problems. I use the SpeexChatCodec as in the NetworkChat example provided (sampling rate: 8000, 2 channels). Any suggestions? Thanks
It would be a lot of work to write a library that supports that from scratch... if you can spend $150 on this, I would suggest purchasing a library like the iConf .NET Video Conferencing SDK from AvSpeed...
I've been developing an audio app for Windows Phone 7 and up to this point have been using the WP7 emulator. The app uses a custom MediaStreamSource class to stream audio to a MediaElement.
On a real device, the custom MediaStreamSource completely fails. After calling MediaElement.Play(), the MediaStreamSource's GetSampleAsync method never gets called. It works just fine in the emulator.
I've started the app in the debugger (running on the device) and no exceptions get thrown anywhere.
I'm wondering if maybe my stream source is using a sample rate, bits per sample, or channel count that is not supported? I cannot find any documentation on what values are supported - however I find it hard to believe that my settings are not supported (44,100 Hz, 16 bits/sample, 2 channels).
Thoughts?
The answer is that the Zune software interferes with the phone's media capabilities. The app will work on the device if you disconnect the device from the computer, or by using the WPConnect tool: http://blogs.msdn.com/b/jaimer/archive/2010/11/03/tips-for-debugging-wp7-media-apps-with-wpconnect.aspx