Using pipes to channel file I/O to another process - C

I've just started learning/using pipes and was wondering how to route the file output of an application into a pipe so that another application can use it.
To be exact, I want to pipe ffmpeg's output (transcoded video data) into my application. If I create a named pipe like /tmp/out.mp4 and give it to ffmpeg as the output filename, ffmpeg will try to create that file itself, probably overwriting my pipe (or something like that). How do I deal with this kind of situation?
Is there any general way to divert the file I/O of an application transparently?
(I am trying to write a video streaming server, just for learning and fun, which transcodes formats like AVI into a streaming-friendly format like MPEG-4 during streaming. I found ffmpeg to be too slow for this purpose; it was taking about 2 seconds to transcode 1 second of video. Is that a problem with my setup/PC, or is ffmpeg known for sluggishness?)
PS: I am writing this in C, by the way.

ffmpeg can be persuaded to output to a pipe:
ffmpeg -i whatever.avi -f mp4 -
The "-" tells it to output to stdout instead of to a file, and the "-f" tells it what output the output file should be.
You could redirect that to a named pipe, of course, but calling it with popen and reading the output as a stream directly seems like the way to go to me.
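A minimal sketch of the popen approach (assuming ffmpeg is on the PATH, and whatever.avi is a placeholder input; the mp4 muxer normally wants a seekable output, so fragmenting movflags are added for pipe output):

#include <stdio.h>

int main(void) {
    /* Run ffmpeg and read its transcoded output through a pipe.
       popen() returns a FILE* connected to ffmpeg's stdout; ffmpeg's
       own log messages go to stderr, so they don't mix into the data. */
    FILE *fp = popen("ffmpeg -i whatever.avi -f mp4 "
                     "-movflags frag_keyframe+empty_moov -", "r");
    if (fp == NULL) {
        perror("popen");
        return 1;
    }

    unsigned char buf[4096];
    size_t n;
    while ((n = fread(buf, 1, sizeof buf, fp)) > 0) {
        /* hand the transcoded data to the streaming code here */
    }

    pclose(fp);
    return 0;
}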

ffmpeg can also read from stdin and write to stdout like this:
ffmpeg -i pipe:0 -f wav pipe:1
where 0 and 1 are the standard POSIX file descriptors.
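If you need both directions at once from C, here is a sketch using pipe(), fork() and dup2() (error handling kept minimal; the ffmpeg arguments are just the wav example above):

#include <sys/types.h>
#include <unistd.h>

/* Spawn ffmpeg with both ends piped: write source data to *in_fd,
   read converted data from *out_fd. */
int spawn_ffmpeg(int *in_fd, int *out_fd) {
    int inpipe[2], outpipe[2];
    if (pipe(inpipe) == -1 || pipe(outpipe) == -1)
        return -1;

    pid_t pid = fork();
    if (pid == -1)
        return -1;

    if (pid == 0) {                       /* child: becomes ffmpeg */
        dup2(inpipe[0], STDIN_FILENO);    /* ffmpeg reads pipe:0 */
        dup2(outpipe[1], STDOUT_FILENO);  /* ffmpeg writes pipe:1 */
        close(inpipe[0]);  close(inpipe[1]);
        close(outpipe[0]); close(outpipe[1]);
        execlp("ffmpeg", "ffmpeg", "-i", "pipe:0",
               "-f", "wav", "pipe:1", (char *)0);
        _exit(127);                       /* only reached if exec failed */
    }

    close(inpipe[0]);                     /* parent keeps the other ends */
    close(outpipe[1]);
    *in_fd  = inpipe[1];
    *out_fd = outpipe[0];
    return 0;
}

Note that if the parent writes and reads in the same thread, both pipes can fill up and deadlock; real code usually uses a second thread, poll(), or non-blocking I/O, and should also reap the child with waitpid().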

Related

Output FFmpeg rawvideo to stdin with lib

I am writing a C program using FFmpeg's libav, and I am trying to write rawvideo packets (from a file or device) to stdout so that they can be picked up by another program running on my system.
Any suggestions on how to do this?
For example, can I treat stdout like a filename and just use avformat_alloc_output_context2 and avio_open as I would when writing to a file?
Is there some code reference which deals with this scenario?
Thanks.
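For what it's worth, libavformat's pipe: protocol does let you treat stdout ("pipe:1") like a filename. A minimal, untested sketch (the rawvideo muxer is just an example):

#include <libavformat/avformat.h>

/* Open an output context on stdout ("pipe:1") instead of a file.
   There is no filename extension to guess from, so the format is
   forced by name. */
static AVFormatContext *open_stdout_output(void)
{
    AVFormatContext *ofmt_ctx = NULL;

    if (avformat_alloc_output_context2(&ofmt_ctx, NULL,
                                       "rawvideo", "pipe:1") < 0)
        return NULL;

    if (!(ofmt_ctx->oformat->flags & AVFMT_NOFILE) &&
        avio_open(&ofmt_ctx->pb, "pipe:1", AVIO_FLAG_WRITE) < 0) {
        avformat_free_context(ofmt_ctx);
        return NULL;
    }

    /* From here, add streams, then call avformat_write_header(),
       av_interleaved_write_frame() and av_write_trailer() as usual. */
    return ofmt_ctx;
}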

Feeding FFmpeg from a buffer in C code - raw H264 to MP4 wrapping

I have a raw H.264 stream coming in from cameras with a custom API; the data gets handed to a callback function in my C code.
I need to wrap this as MP4. I'm using ffmpeg to do this now, but only after the H.264 elementary-stream file has been written and closed, so it is very time consuming on a BeagleBone-like processor.
I have been trying to write this data to a named pipe and feed that to ffmpeg, but I cannot get it to work. Maybe I'm not opening/closing the pipes properly (it hangs), or not specifying the piping properly for ffmpeg.
Is it possible to feed the buffered data more directly to ffmpeg?
Or, how do I set up the named pipe to work properly?
First I'm opening the FIFO like this:
g_fifoname = "/tmp/fifocam1.h264";
mkfifo(g_fifoname, 0666);                      // make the fifo
fd_fifo[ch+brd*2] = open(g_fifoname, O_RDWR);  // O_RDWR so open() doesn't block waiting for a reader
Then I'm calling ffmpeg like this (at the moment, anyway; I've been trying many things):
char* execargs[] = {PATH_TO_FFMPEG, "-re", "-y", "-framerate", "30",
                    "-f", "h264", "-video_size", "1920x1080",
                    "-i", g_fifoname, "-c:v", "copy", "-an",
                    pathname, (char*)0};
I probably got the ffmpeg call wrong. Argh.
I open the FIFO first, then start ffmpeg.
When streaming is stopped I close the FIFOs, then close the ffmpeg output file.
ffmpeg is so powerful and so frustrating to wrangle.
Thanks, all.
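One ordering that tends to avoid the hang (a sketch with placeholder names, not the poster's exact setup): start ffmpeg first so a reader exists, then open the FIFO for writing, which blocks until ffmpeg has connected:

#include <fcntl.h>
#include <sys/stat.h>
#include <sys/types.h>
#include <unistd.h>

/* Create the FIFO, start ffmpeg reading from it, then open it
   for writing. */
int start_pipe_to_ffmpeg(const char *fifoname, const char *outname)
{
    mkfifo(fifoname, 0666);

    pid_t pid = fork();
    if (pid == 0) {
        execlp("ffmpeg", "ffmpeg", "-y", "-f", "h264",
               "-framerate", "30", "-i", fifoname,
               "-c:v", "copy", "-an", outname, (char *)0);
        _exit(127);
    }

    /* Blocks until ffmpeg opens the read end; write() raw H.264
       NAL units to the returned fd from the camera callback. */
    return open(fifoname, O_WRONLY);
}

When streaming stops, closing the write end delivers EOF to ffmpeg, which then writes the MP4 trailer and exits.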

Redirection to multiple ttys in C

I see that I can use freopen to redirect stdout from the console to another tty. I am trying to redirect the same output to multiple terminals, including the console (the console being where the program is running). What is the best way to do it?
TIA
You didn't specify what platform you're using, but assuming you can find the file path of the tty you'd like to redirect to, you can call freopen on the stdout stream. However, that would close the original stream, which doesn't sound like your desired behavior; a stream can only point to one file.
The easiest C solution is probably a wrapper around printf that writes to each of your specified files. You might be able to do something with threading, but that's likely to complicate things.
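A minimal sketch of such a wrapper (the tty path is a placeholder; note that a va_list is consumed by vfprintf, so it must be restarted for each stream):

#include <stdarg.h>
#include <stdio.h>

static FILE *outs[8];   /* the console plus each tty we fopen() */
static int nouts;

static void multi_printf(const char *fmt, ...)
{
    for (int i = 0; i < nouts; i++) {
        va_list ap;
        va_start(ap, fmt);           /* restart the list per stream */
        vfprintf(outs[i], fmt, ap);
        va_end(ap);
    }
}

int main(void)
{
    outs[nouts++] = stdout;              /* the console itself */
    FILE *t = fopen("/dev/pts/3", "w");  /* placeholder tty path */
    if (t) outs[nouts++] = t;
    multi_printf("value = %d\n", 42);
    return 0;
}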
If you're on a *nix system, I suggest using tee, which is made for writing to stdout and to secondary files.
There is no really easy way to do this with something like freopen; you need a wrapper that takes the output and writes it to each tty individually.
For example, there is the tee program, which multiplexes its input to stdout and to a number of files. You could create a pipe in C that is connected to tee /dev/ttyX /dev/ttyY .... Then you can replace stdout with the pipe's file descriptor and you will get the desired behaviour.
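A sketch of that idea with popen (the tty paths are placeholders for whatever terminals you want to reach):

#include <stdio.h>
#include <unistd.h>

int main(void)
{
    /* tee copies its stdin to its stdout (our console) and to
       each file argument, here two other terminals. */
    FILE *tee = popen("tee /dev/pts/3 /dev/pts/4", "w");
    if (tee == NULL)
        return 1;

    dup2(fileno(tee), STDOUT_FILENO);  /* stdout now feeds tee */

    printf("hello on every terminal\n");
    fflush(stdout);
    close(STDOUT_FILENO);              /* let tee see EOF ... */
    pclose(tee);                       /* ... so this can return */
    return 0;
}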

How to dump raw RTSP stream to file?

Is it possible to dump a raw RTSP stream to file and then later decode the file to something playable?
Currently I'm using FFmpeg to receive and decode the stream, saving it to an mp4 file. This works perfectly, but is CPU intensive, and will severely limit the number of RTSP streams I can receive simultaneously on my server.
I would like to save the stream to file without decoding it, and delay the decoding part to when the file needs to be opened.
Is this possible?
I have tried VLC, which is even more CPU intensive than FFmpeg. I've also looked at this question, where the answer says dumping RTSP to file is not useful, and this question, where a comment below the question says "Raw RTSP content is not well suited for save and replay...", which seems to indicate that there is no way.
EDIT
Here is the command I'm using for FFmpeg:
ffmpeg -i rtsp://#192.168.241.1:62159 -r 15 C:/DB_Videos/2013-04-30 17_18_34.703.mp4
If you are re-encoding in your ffmpeg command line, that may be why it is CPU intensive. You simply need to copy the streams into the container. Since I do not have your command line, I cannot suggest a specific improvement here; all I can say is that your acodec and vcodec should be set to copy.
EDIT: Now that I've seen your command line, and given that you have already tried this, the following is for the benefit of others who come across the same question. The command:
ffmpeg -i rtsp://#192.168.241.1:62156 -acodec copy -vcodec copy c:/abc.mp4
will not transcode; it will dump the stream into an mp4 file for you. Of course, this assumes the streamed contents are compatible with mp4 (which in all probability they are).
With this command I got poor image quality:
ffmpeg -i rtsp://192.168.XXX.XXX:554/live.sdp -vcodec copy -acodec copy -f mp4 -y MyVideoFFmpeg.mp4
With this one I got good image quality, almost without delay:
ffmpeg -i rtsp://192.168.XXX.XXX:554/live.sdp -b 900k -vcodec copy -r 60 -y MyVdeoFFmpeg.avi
You can use mplayer.
mencoder -nocache -rtsp-stream-over-tcp rtsp://192.168.XXX.XXX/test.sdp -oac copy -ovc copy -o test.avi
The "copy" codec is just a dumb copy of the stream. Mencoder adds a header and stuff you probably want.
In the mplayer source file "stream/stream_rtsp.c" is a prebuffer_size setting of 640k and no option to change the size other then recompile. The result is that writing the stream is always delayed, which can be annoying for things like cameras, but besides this, you get an output file, and can play it back most places without a problem.

Writing output of an application as a sound file

I am using espeak on BSD to output text as sound. My problem is that I want to save it as an .mp3, but I am having little luck. I tried piping the output to tee, but I guess that only works with stdout, not with audio that is just being played.
Any ideas? My last resort would be recompiling my own version of espeak that lets me save to a file instead of playing the sound.
You can write it as a WAV file and then convert it with ffmpeg:
espeak "HelloWorld" -w <file>.wav
Or pipe to ffmpeg
espeak "HelloWorld" --stdout | ffmpeg -i pipe:0 output.mp3
From the documentation:
-w <wave file>
Writes the speech output to a file in WAV format, rather than speaking it.
--stdout
Writes the speech output to stdout as it is produced, rather than speaking it. The data starts with a WAV file header which indicates the sample rate and format of the data. The length field is set to zero because the length of the data is unknown when the header is produced.
It looks like both of those options produce WAV output, but you can easily convert that with another program like ffmpeg.
