I have started audio streaming from my Android phone, and I am able to play it on my laptop using the PulseAudio loopback module. But I want to play it using GStreamer instead. How would this be done?
How do you start the stream on the phone? And how do you play it on the laptop (loopback from what source)?
A general starting point for GStreamer is gst-launch playbin2 uri="...". The first part of the URI is the protocol (e.g. rtsp:// or http://), then your phone's IP address, then the location of the stream (if needed).
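For example, assuming the phone app serves the audio over plain HTTP on port 8080 (the address, port, and path here are placeholders; substitute whatever your streaming app reports):

gst-launch playbin2 uri="http://192.168.1.100:8080/audio.wav"

If the app serves RTSP instead, only the protocol prefix and path change.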
I'd like to know how to capture audio using a dummy sound card driver. I'm trying to implement the steps below:
1. Play audio on Ubuntu, but route it through a dummy sound card driver so the audio stream can be captured.
2. Send the captured audio to Windows over the network.
3. Actually play the audio on Windows.
What you need is to activate the ALSA snd-aloop module, which provides a full-duplex virtual loopback soundcard. Please have a look at the following links for instructions on activation and example usage:
https://github.com/TheSalarKhan/Linux-Audio-Loopback-Device
https://sysplay.in/blog/linux/2019/06/playing-with-alsa-loopback-devices/
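Activation is essentially a one-liner (a sketch; to make it persistent across reboots, add the module to /etc/modules or a modprobe config file):

sudo modprobe snd-aloop
aplay -l    # the new "Loopback" card and its subdevices should show up here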
A couple of important points to consider:
The subdevices are linked in pairs: whatever you play on hw:n,0,m comes out on hw:n,1,m (see the example in the first link, and the sketch after this list)
The first application to open one of the subdevices forces the second application to use the same set of parameters: sample rate, format, and number of channels. For example, suppose the recording application opens a capture stream on hw:2,1,0 in stereo/44100/S16_LE format; the playback application on hw:2,0,0 will then be forced to use the same stereo/44100/S16_LE format
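As a concrete sketch of the pairing (this assumes the loopback card registered as card 2; check the card number on your system with aplay -l):

aplay -D hw:2,0,0 test.wav                                # play into one end of the pair
arecord -D hw:2,1,0 -f S16_LE -r 44100 -c 2 captured.wav  # capture the same audio from the other end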
Hope this helps
I am trying to stream video content to a Windows Phone.
I am using the following code.
"player" is the Silverlight Media Player used here.
PlaylistItem item = new PlaylistItem();
// Smooth Streaming (adaptive) delivery over HTTP
item.DeliveryMethod = Microsoft.SilverlightMediaFramework.Plugins.Primitives.DeliveryMethods.AdaptiveStreaming;
// Microsoft's public Smooth Streaming test manifest
item.MediaSource = new Uri("http://playready.directtaps.net/smoothstreaming/SSWSS720H264/SuperSpeedway_720.ism/Manifest");
// size the video to the hosting player control
item.VideoHeight = strmPlayer.Height;
item.VideoWidth = strmPlayer.Width;
player.Playlist.Add(item);
player.Play();
I am able to play it in the emulator, but on the device I don't see anything.
Can anyone tell me where I am going wrong?
I sometimes get this log in the debug output window:
A first chance exception of type 'System.InvalidOperationException' occurred in Microsoft.Web.Media.SmoothStreaming.dll
Are you using the latest version of the Silverlight Media Framework, as available from CodePlex? Could it be a bug in the version you are using that the latest release corrects? Otherwise, it is hard to investigate what could be wrong with the network connectivity on the device versus the emulator.
BTW, what device are you using?
It was a bandwidth issue! My PC was on a fast internet connection, so it was able to play the stream.
My device was connected to the Wi-Fi hub, which was out of range at some points.
When I took my device near the hub, the stream played.
I need to play content from a video camera in a web page. For now I'm trying to use a Silverlight player (which seems to work fine with publishing points from WMS). The problem is that the camera uses the RTSP protocol, and the player doesn't seem to understand it...
So I was thinking of using WMS to create a publishing point for the stream that comes from the camera, but I couldn't set up the publishing point to accept the RTSP source; it gives me an error saying "Invalid or corrupt data was encountered".
Is there any way to use RTSP as a content source? If there is, maybe you can point out some details that I should be careful about?
The camera works with "rtsp://192.168.1.22/profile4/media.smp" (tested in VLC player)
Thank you
Well, I received an answer from the camera manufacturer's support team:
"WMS does NOT support RTSP streams. Please use VLC player or QuickTime Player."
I don't know if they are actually right...
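In the meantime, since VLC can already play the stream, one possible workaround is to let VLC re-serve it over HTTP in a Windows-friendly container. A rough sketch (the codec, bitrate, port, and output path here are my assumptions, not anything the manufacturer suggested):

vlc rtsp://192.168.1.22/profile4/media.smp --sout "#transcode{vcodec=WMV2,acodec=wma2,ab=128}:std{access=http,mux=asf,dst=:8080/stream.wmv}"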
I've been developing an audio app for Windows Phone 7 and up to this point have been using the WP7 emulator. The app uses a custom MediaStreamSource class to stream audio to a MediaElement.
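For reference, the wiring is essentially this (the class and variable names below are placeholders, not my actual ones):

// custom MediaStreamSource subclass (placeholder name)
var source = new AudioStreamSource();

// SetSource triggers OpenMediaAsync on the source; once the source calls
// ReportOpenMediaCompleted, the runtime should start requesting samples via GetSampleAsync
mediaElement.SetSource(source);
mediaElement.Play();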
On a real device, the custom MediaStreamSource completely fails. After calling MediaElement.Play(), the MediaStreamSource's GetSampleAsync method never gets called. It works just fine in the emulator.
I've started the app in the debugger (running on the device) and no exceptions get thrown anywhere.
I'm wondering whether my stream source is using a sample rate, bit depth, or channel count that is not supported. I cannot find any documentation on which values are supported; however, I find it hard to believe that my settings are unsupported (44,100 Hz, 16 bits/sample, 2 channels).
Thoughts?
The answer is that the Zune software interferes with the phone's media capabilities. The app will work on the device if you disconnect it from the computer, or if you use the WPConnect tool: http://blogs.msdn.com/b/jaimer/archive/2010/11/03/tips-for-debugging-wp7-media-apps-with-wpconnect.aspx
I am researching a project in which I need to play back a multi-track audio source (>30 mono channels). The audio on all channels needs to start simultaneously and be sustained for hours of playback.
What is the best audio API to use for this? WDM and ASIO have come up in my searches. I will be using a MOTU PCI audio interface to get this many channels; the channels show up as normal audio channels on the host PC.
ASIO is definitely the way to go about this. It will keep everything properly in sync, with low latency, and is the de facto industry-standard way to do it. Any pro audio interface supports ASIO, and for interfaces that don't, there is a wrapper (e.g. ASIO4ALL) that is capable of syncing multiple devices.
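If you end up driving the interface from C#, one way to sketch this is with the NAudio library's ASIO support (NAudio is my suggestion, not something the question requires; the driver name and file below are placeholders):

using System;
using NAudio.Wave;

// list the ASIO drivers installed on the machine
foreach (var name in AsioOut.GetDriverNames())
    Console.WriteLine(name);

// open the interface's ASIO driver by name (placeholder; pick one from the list above)
using (var asioOut = new AsioOut("MOTU PCI ASIO"))
{
    // a single multi-channel file keeps all channels sample-locked;
    // AudioFileReader exposes it as one wave provider
    var reader = new AudioFileReader("multitrack.wav");
    asioOut.Init(reader);  // configure the driver for the stream's format
    asioOut.Play();        // every channel starts on the same driver callback
    Console.ReadLine();    // keep the process alive while audio plays
}

Feeding all channels from one provider is what guarantees the simultaneous start; the driver pulls every channel in a single buffer callback.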