I am following this audio encode example from the ffmpeg documentation: https://www.ffmpeg.org/doxygen/0.6/api-example_8c-source.html
But instead of an .mp2 file I want to decode a .wav file.
So I changed these lines in my program:
codec = avcodec_find_encoder(AV_CODEC_ID_WAVPACK);
if (!codec) {
    fprintf(stderr, "codec not found\n");
    exit(1);
}
But I always enter the if branch, so it seems the codec cannot be found.
I have installed wavpack and built libav with --enable-libwavpack.
Your question is unclear: you're talking about .wav decoding and wavpack encoding. I assume you mean you want to take a .wav file and encode it to wavpack/.wv (as opposed to thinking that .wav is related to wavpack in any way; it isn't).
If that's the case, then the reason it's not working is that you're using a libav build without wavpack support. You may have installed wavpack, but libav didn't find it when compiling, so it has no support for it. You could consider using ffmpeg, which has a built-in wavpack encoder that does not depend on libwavpack, or figure out how to compile libav with libwavpack support. For that you probably need to install the developer packages if you're using pre-built binaries from your Linux distribution, or alternatively tell libav where the headers and libraries are located using --extra-cflags= and --extra-libs= when running configure.
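As a quick sanity check, a minimal sketch like the one below (assuming a libavcodec recent enough to have avcodec_find_encoder_by_name() and the version macros; on older releases the registration call is still mandatory) will show which wavpack encoders your build actually exposes:

    #include <stdio.h>
    #include <libavcodec/avcodec.h>

    int main(void)
    {
    #if LIBAVCODEC_VERSION_MAJOR < 58
        avcodec_register_all();   /* needed before lookups on older libavcodec/libav */
    #endif

        /* "wavpack" is the encoder built into ffmpeg itself,
         * "libwavpack" is the wrapper around the external libwavpack library */
        const AVCodec *native  = avcodec_find_encoder_by_name("wavpack");
        const AVCodec *wrapper = avcodec_find_encoder_by_name("libwavpack");

        printf("native wavpack encoder: %s\n", native  ? "available" : "missing");
        printf("libwavpack wrapper:     %s\n", wrapper ? "available" : "missing");
        return 0;
    }

If both come back as missing, no change to your program will help; the library build itself has to gain wavpack support first.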
Related
I'm trying to program with ffmpeg for the first time, on Ubuntu. The nomenclature seems a bit messy, but ffmpeg is the same thing as libavformat, libavcodec, and so on, right?
I can compile and run a program which calls avformat_open_input on an mp3 file, apparently successfully, and now I am trying to get some information out of it.
For a start I want to call avcodec_get_name on the audio_codec_id member of AVFormatContext but no function of that name appears in any headers in /usr/include/libav{codec,format,util}.
I checked out the latest ffmpeg from GitHub and diffed avcodec.h against my installed version. The files are clearly much the same but, among other differences, avcodec_get_name(AVCodecID) has been replaced with avcodec_get_class(void) although there doesn't seem to be any correspondence between the two.
Confusingly, this documentation omits avcodec_get_class while this documentation omits avcodec_get_name. Looking at the urls, the former ought to be the latest version, corresponding to what I checked out of GitHub and the latter ought to be outdated, but the reverse seems to be true.
Can someone please tell me
what API documentation should I be using for a) the ubuntu libraries and b) the GitHub latest?
if I build against the latest source, will I have to statically link in order to distribute it?
if I want to dynamically link and build against the provided Ubuntu headers, how do I get the name of a codec given its enum, if avcodec_get_name isn't available?
If you run ffmpeg on your Ubuntu box, you should see the versions of the libraries installed with the package. If the package is sane, it will be a long-term release of ffmpeg (e.g. 1.1.x). You can then locate the documentation for that release by e.g. checking out the corresponding version from the ffmpeg repo. You may also find the documentation included with the tarball here.
Note that I'm not sure whether the ffmpeg package included with your version of Ubuntu is the real ffmpeg or the so-called libav fork of ffmpeg. If you don't know about this, now would be a good time to learn. If you don't have real ffmpeg, all bets are off, unfortunately: the libav people could have removed avcodec_get_name() and there's nothing you can do about it. Also note that I just tried compiling my ffmpeg-based application with a very recent (less than two weeks old) git checkout of ffmpeg and I get no deprecation warnings for my use of avcodec_get_name(). This suggests to me that calling avcodec_get_name() is valid practice with current ffmpeg.
No. You can build shared and distribute the .so dynamic link libraries with your app. Then you only have to distribute the ffmpeg source if you modify it and you don't have to distribute your source. More info. I am not a lawyer, and this is not legal advice.
I believe this is already answered in 1.
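For what it's worth, with a libavcodec that does export avcodec_get_name(), printing the codec of every stream takes only a few lines. This is a rough sketch assuming one of the older releases discussed here (per-stream codec contexts and av_register_all(); current ffmpeg uses codecpar->codec_id and no longer needs the registration call):

    #include <stdio.h>
    #include <libavformat/avformat.h>
    #include <libavcodec/avcodec.h>

    int main(int argc, char **argv)
    {
        AVFormatContext *fmt = NULL;
        unsigned i;

        if (argc < 2) {
            fprintf(stderr, "usage: %s <mediafile>\n", argv[0]);
            return 1;
        }

        av_register_all();   /* required on the older releases discussed here */

        if (avformat_open_input(&fmt, argv[1], NULL, NULL) < 0 ||
            avformat_find_stream_info(fmt, NULL) < 0) {
            fprintf(stderr, "could not open %s\n", argv[1]);
            return 1;
        }

        for (i = 0; i < fmt->nb_streams; i++) {
            /* codecpar->codec_id on current ffmpeg */
            enum AVCodecID id = fmt->streams[i]->codec->codec_id;
            printf("stream %u: %s\n", i, avcodec_get_name(id));
        }

        avformat_close_input(&fmt);
        return 0;
    }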
I am a new user of ffmpeg. FFmpeg has good documentation on using it from the command line, but I am looking for some C API code.
I want to make a piece of software in C that captures the video stream from a webcam and gives me that stream in a raw format, which I will encode with a codec later.
I have visited the link below, but it only covers command-line use, not the libraries provided by ffmpeg:
http://ffmpeg.org/trac/ffmpeg/wiki/How%20to%20capture%20a%20webcam%20input
I also visited this link, which gave me a good idea of how to use libavcodec, but no other tutorial is available:
ffmpeg C API documentation/tutorial
Could someone please help me find C API examples for capturing a video stream from a webcam using ffmpeg's libraries? Thanks in advance.
You are basically repeating the question you are referring to. FFmpeg is basically the name of both the library and the ready-to-use command-line tool. Its back end is a set of libraries: libavcodec, libavformat, libswscale, etc.
There is no comprehensive documentation for these libraries; instead there are samples, a mailing list, and other sparse resources. The libraries are also open source, and all of this is quite usable once you get the pieces together. Specific questions on ffmpeg are asked and answered on Stack Overflow as well.
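As a rough starting point, grabbing raw packets from a webcam goes through libavdevice. The device name /dev/video0 and the video4linux2 input format below are assumptions for a typical Linux setup (on Windows you would use the dshow input instead), and a couple of calls differ between older and newer releases, as noted in the comments:

    #include <stdio.h>
    #include <libavdevice/avdevice.h>
    #include <libavformat/avformat.h>

    int main(void)
    {
        AVFormatContext *fmt = NULL;
        AVInputFormat *ifmt;   /* const AVInputFormat * on current ffmpeg */
        AVPacket pkt;
        int i;

        av_register_all();        /* not needed on current ffmpeg */
        avdevice_register_all();  /* registers the capture devices (v4l2, dshow, ...) */

        ifmt = av_find_input_format("video4linux2");
        if (!ifmt || avformat_open_input(&fmt, "/dev/video0", ifmt, NULL) < 0) {
            fprintf(stderr, "could not open webcam\n");
            return 1;
        }

        /* read a handful of raw packets from the camera */
        for (i = 0; i < 25; i++) {
            if (av_read_frame(fmt, &pkt) < 0)
                break;
            printf("packet %d: %d bytes from stream %d\n", i, pkt.size, pkt.stream_index);
            av_packet_unref(&pkt);   /* av_free_packet() on older versions */
        }

        avformat_close_input(&fmt);
        return 0;
    }

From there you can feed the packets (or decoded frames) into whatever encoder you choose later.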
Is there a library for creating zip files (the zip file format, not gzip or any other compression format) on the fly (so I can start sending the file while it is still compressing), for very large files (4 GB and above)?
The compression ratio does not matter much (mostly media files).
The library has to have a C interface and work on Debian and OS X.
libarchive supports any format you want, on the fly and even in-memory files.
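A minimal sketch of writing a zip archive with libarchive could look like the following; it writes to a file for brevity, but for on-the-fly streaming you would pass your own write callback to archive_write_open() instead of using archive_write_open_filename():

    #include <stdio.h>
    #include <string.h>
    #include <archive.h>
    #include <archive_entry.h>

    int main(void)
    {
        const char *data = "hello from libarchive\n";
        struct archive *a = archive_write_new();
        struct archive_entry *entry;

        archive_write_set_format_zip(a);

        /* for streaming, replace this with archive_write_open() + callbacks */
        if (archive_write_open_filename(a, "out.zip") != ARCHIVE_OK) {
            fprintf(stderr, "%s\n", archive_error_string(a));
            return 1;
        }

        entry = archive_entry_new();
        archive_entry_set_pathname(entry, "hello.txt");
        archive_entry_set_size(entry, strlen(data));
        archive_entry_set_filetype(entry, AE_IFREG);
        archive_entry_set_perm(entry, 0644);

        archive_write_header(a, entry);
        archive_write_data(a, data, strlen(data));
        archive_entry_free(entry);

        archive_write_close(a);
        archive_write_free(a);
        return 0;
    }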
zlib supports compressing by chunks. You should be able to start sending a small chunk right after compressing it, while the library is still compressing the next chunk (see this example).
(Unfortunately, the zip file table is stored at the end of the zip file, so the archive will be unusable until it is complete on the receiver side.)
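To illustrate the chunked approach, here is a rough sketch along the lines of zlib's own zpipe example. Note that it produces a plain zlib stream, not a zip container; for an actual .zip you would still have to emit the local file headers and central directory yourself (using raw deflate via deflateInit2() with negative windowBits):

    #include <stdio.h>
    #include <string.h>
    #include <zlib.h>

    #define CHUNK 16384

    /* Compress from 'in' to 'out' one CHUNK at a time; each compressed chunk
     * could be sent over the network as soon as it is produced. */
    static int deflate_stream(FILE *in, FILE *out)
    {
        unsigned char inbuf[CHUNK], outbuf[CHUNK];
        z_stream strm;
        int flush, ret;

        memset(&strm, 0, sizeof(strm));
        if (deflateInit(&strm, Z_DEFAULT_COMPRESSION) != Z_OK)
            return -1;

        do {
            strm.avail_in = (uInt)fread(inbuf, 1, CHUNK, in);
            strm.next_in  = inbuf;
            flush = feof(in) ? Z_FINISH : Z_NO_FLUSH;

            do {
                strm.avail_out = CHUNK;
                strm.next_out  = outbuf;
                ret = deflate(&strm, flush);
                /* a real sender would transmit this chunk immediately */
                fwrite(outbuf, 1, CHUNK - strm.avail_out, out);
            } while (strm.avail_out == 0);
        } while (flush != Z_FINISH);

        deflateEnd(&strm);
        return ret == Z_STREAM_END ? 0 : -1;
    }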
While this question is old and already answered, I will note a new potential solution for those who find it.
I needed something very similar: a portable and very small library that creates ZIP archives in a streaming fashion in C. Not finding anything that fit the bill, I created one that uses zlib, available here:
https://github.com/CTrabant/fdzipstream
That code only depends on zlib and essentially provides a simple interface for creating ZIP archives. Most importantly (for me), the output can be streamed to a pipe, a socket, or whatever else, because the output stream does not need to be seekable. The code is very small: a single source file and a header file. It works on OS X and Linux and probably elsewhere. Hope it helps someone beyond just me...
How can I use the zlib library to decompress a PNG file? I need to read a PNG file in C, compiled with gcc.
Why not use libpng? The PNG file format is fairly simple, but there are many different possible variations and encoding methods, and it can be fairly tedious to ensure you cover all of the cases. Something like libpng handles all the conversion work for you automatically.
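For example, with libpng's simplified API (available since libpng 1.6), reading an arbitrary PNG into an 8-bit RGBA buffer takes only a few calls; this is a minimal sketch without thorough error handling:

    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>
    #include <png.h>   /* simplified libpng API, libpng 1.6 or later */

    int main(int argc, char **argv)
    {
        png_image image;
        png_bytep buffer;

        if (argc < 2) {
            fprintf(stderr, "usage: %s <file.png>\n", argv[0]);
            return 1;
        }

        memset(&image, 0, sizeof image);
        image.version = PNG_IMAGE_VERSION;

        if (!png_image_begin_read_from_file(&image, argv[1])) {
            fprintf(stderr, "read failed: %s\n", image.message);
            return 1;
        }

        image.format = PNG_FORMAT_RGBA;   /* let libpng convert everything to 8-bit RGBA */
        buffer = malloc(PNG_IMAGE_SIZE(image));

        if (!buffer || !png_image_finish_read(&image, NULL, buffer, 0, NULL)) {
            fprintf(stderr, "decode failed\n");
            return 1;
        }

        printf("decoded %u x %u pixels\n", image.width, image.height);
        free(buffer);
        return 0;
    }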
I once wrote a basic Java library for reading/writing PNG files: http://code.google.com/p/pngj/
It does not support paletted images, but apart from that [update: it supports all PNG variants now] it's fairly complete and simple, and the code has no external dependencies (it only uses the standard JSE API, which includes zip decompression). The code is available, and I guess you could port it to C without much effort.
If this is a homework assignment and you really are restricted to only the standard C library, you need to be looking at the official PNG file format specification: http://www.w3.org/TR/PNG/. However, are you sure you really need to decode the PNG file yourself? If all you need to do is display it somehow, you're headed down the wrong path.
It will be rather complex and time-consuming to write a decoder for any general PNG file, but not too bad for simple ones. In fact, because the PNG format allows pieces of the file to be compressed, doing it with only the standard C library would require you to implement zlib/DEFLATE decompression yourself (a reasonable homework assignment for a mid-level undergraduate course, but my guess is that you would have spent a lot of time discussing compression algorithms before this was assigned to you).
However, it isn't terribly difficult if you restrict yourself to non-compressed, non-interlaced PNG files. I wrote a decoder once in Python that handled only the easy cases in a couple of hours, so I'm sure it'll be doable in C.
You should probably read up on how binary file formats work and use a hex editor instead of a text editor to look at the files. Generally you should use libpng to handle PNG files, as stated earlier, but if you want to decode them yourself you have a lot of reading to do.
I recommend reading this http://www.libpng.org/pub/png/book/chapter13.html
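If you do go the do-it-yourself route, a sensible first step is just walking the chunk structure. Here is a rough sketch of the container layer only; actually decoding the pixels additionally means inflating the concatenated IDAT data with zlib and reversing the per-scanline filters described in the spec:

    #include <stdio.h>
    #include <stdint.h>
    #include <string.h>

    int main(int argc, char **argv)
    {
        static const unsigned char sig[8] = {137, 'P', 'N', 'G', '\r', '\n', 26, '\n'};
        unsigned char buf[8], hdr[8];
        FILE *f;

        if (argc < 2 || !(f = fopen(argv[1], "rb"))) {
            fprintf(stderr, "usage: %s <file.png>\n", argv[0]);
            return 1;
        }

        /* every PNG starts with the same 8-byte signature */
        if (fread(buf, 1, 8, f) != 8 || memcmp(buf, sig, 8) != 0) {
            fprintf(stderr, "not a PNG file\n");
            return 1;
        }

        /* each chunk: 4-byte big-endian length, 4-byte type, data, 4-byte CRC */
        while (fread(hdr, 1, 8, f) == 8) {
            uint32_t len = ((uint32_t)hdr[0] << 24) | ((uint32_t)hdr[1] << 16) |
                           ((uint32_t)hdr[2] << 8)  |  (uint32_t)hdr[3];
            printf("chunk %.4s, %u bytes\n", (const char *)(hdr + 4), (unsigned)len);
            if (memcmp(hdr + 4, "IEND", 4) == 0)
                break;
            fseek(f, (long)len + 4, SEEK_CUR);   /* skip chunk data + CRC */
        }
        fclose(f);
        return 0;
    }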
I have a video decrypter library that can decode an obsolete video format and returns video frames as BMP and audio as WAV through callback functions. Now I need to encode a new video in some standard format under Windows.
What might be the standard way to do this? I am using Win32/C. I guess Windows has its own encoding libraries and drivers, so I don't need to use FFMPEG. Any pointer to example code, or at least to a library to look at, would be greatly helpful.
Edit: I accept that FFMPEG is the easiest way to do it.
On Windows, you have two native choices.
The old Windows Multimedia library, which is too ancient to seriously consider, but which does have the Video Compression Manager.
And then there's DirectShow.
It's certainly doable through DirectShow, but you better enjoy COM programming and the concepts of Filters, Graphs, and Monikers (oh my). See this tutorial for a crash course:
Recompressing an AVI File
And the MSDN documentation.
A simpler approach is indeed to use a library like FFMPEG or VLC.
To save yourself heartache, I echo Frank's suggestion of just using FFMPEG. Executing a separate FFMPEG process with the correct arguments will by far be the easiest way to achieve your encoding goals.
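As an illustration of that approach, here is a hypothetical sketch that starts ffmpeg as a child process on Windows and feeds it raw frames over a pipe. The frame size, pixel format, frame rate, and output name are made-up placeholders; substitute whatever your decoder callbacks actually deliver:

    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    #define WIDTH  640
    #define HEIGHT 480

    int main(void)
    {
        /* ffmpeg reads raw BGR frames from stdin ("-i -") and encodes them */
        const char *cmd =
            "ffmpeg -y -f rawvideo -pix_fmt bgr24 -s 640x480 -r 25 -i - "
            "-c:v mpeg4 output.avi";

        FILE *pipe = _popen(cmd, "wb");   /* binary-mode write pipe (Windows CRT) */
        static unsigned char frame[WIDTH * HEIGHT * 3];
        int i;

        if (!pipe) {
            fprintf(stderr, "could not start ffmpeg\n");
            return 1;
        }

        for (i = 0; i < 125; i++) {       /* five seconds of dummy frames at 25 fps */
            /* in the real program, copy the pixels handed to your BMP callback
             * into 'frame' here (mind the bottom-up row order of BMPs) */
            memset(frame, i, sizeof frame);
            fwrite(frame, 1, sizeof frame, pipe);
        }

        _pclose(pipe);
        return 0;
    }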
Your other options include:
libavcodec - the central library used in FFMPEG. Be warned that there don't appear to be many Windows binaries of libavcodec available, so you'd probably have to compile your own, which, at minimum, would require a Cygwin or MinGW setup.
ffdshow-tryouts - A video codec library implemented as a DirectShow filter based on libavcodec. They do seem to have an API for manipulating it, but it's a .NET library.
I would suggest looking at the VirtualDub source code. It's a well known encoder that uses VFW. You may be able to get some ideas from that software.