I have a problem with one thing. I searched for how to show an image from a URL with Cairo or GTK, but I didn't find anything.
If this is not possible, maybe I could decompose the image into a structure using another
library and then load it.
Any ideas on how to do this?
GTK+ does not implement a HTTP protocol client, which would be needed in order to get the actual image bits from a URL.
You might want to look into using a dedicated library for that task, such as libcurl.
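As a minimal sketch of that combination, you could use libcurl to fetch the bytes and a GdkPixbufLoader to decode them into something a GtkImage can show. The URL handling below is a placeholder and error checking is omitted:

    #include <curl/curl.h>
    #include <gtk/gtk.h>

    /* libcurl write callback: feed each downloaded chunk into the pixbuf loader. */
    static size_t feed_loader(void *buf, size_t size, size_t nmemb, void *userdata)
    {
        GdkPixbufLoader *loader = GDK_PIXBUF_LOADER(userdata);
        gdk_pixbuf_loader_write(loader, (const guchar *)buf, size * nmemb, NULL);
        return size * nmemb;
    }

    /* Download an image from a URL and return a GtkImage widget showing it.
       Assumes curl_global_init() has been called once at startup. */
    static GtkWidget *image_from_url(const char *url)
    {
        GdkPixbufLoader *loader = gdk_pixbuf_loader_new();
        CURL *curl = curl_easy_init();

        curl_easy_setopt(curl, CURLOPT_URL, url);
        curl_easy_setopt(curl, CURLOPT_FOLLOWLOCATION, 1L);
        curl_easy_setopt(curl, CURLOPT_WRITEFUNCTION, feed_loader);
        curl_easy_setopt(curl, CURLOPT_WRITEDATA, loader);
        curl_easy_perform(curl);            /* blocking download */
        curl_easy_cleanup(curl);

        gdk_pixbuf_loader_close(loader, NULL);
        return gtk_image_new_from_pixbuf(gdk_pixbuf_loader_get_pixbuf(loader));
    }

Note that the download here blocks, so in a real GTK application you would run it outside the main loop (or use an asynchronous HTTP library such as libsoup) and create the widget once the data has arrived.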
This is a very simple task that I had a hard time trying to do. My goal is pretty simple: send an MP4 file from my server to my client, and while it's buffering and downloading I want to already be playing it. That means I need to play a video.mp4 file while it is still being written, and I need it to display on some platform that I can control, like wxPython or WPF/IronPython. Naturally, no such platform will let me play a file that is open for writing.
I have tried to implement an HTTP server (although totally unnecessary for my case, as I am writing an application-based server-client app) that accepts Range requests. When I run the server and load the URL in Chrome it all works perfectly and I can seek, and buffering is great, but when I load it from a WPF MediaElement it fails to play the video at some point (I can't really tell why, as there is no documentation, API reference, or tutorials for this). I am really desperate.
I even thought about playing a video from a buffer and then just changing the buffer's content, but it doesn't seem like this possibility exists.
I am really stuck at this and I would love to get some suggestions. Please note that I am not a professional in this so I would appreciate if you could explain this to me in simple terms.
Thanks!
Not possible with a plain MP4: a standard MP4 generally cannot be played while it is still being written, because its index (the moov atom) is normally only finalized once the file is complete. MP4 is not the correct container for your application; you must use something like HLS, DASH, or fragmented MP4.
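For example, if you control how the file is produced, ffmpeg can remux an ordinary MP4 into a fragmented one (the file names here are just placeholders):

    ffmpeg -i input.mp4 -c copy -movflags frag_keyframe+empty_moov fragmented.mp4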
Now, I have a simple project that converts a color image into a black-and-white image using CUDA-C.
But I have a problem with importing/loading a bitmap image into the program. I don't know how to import it.
So...
Does CUDA-C have a specific function for importing/loading a bitmap image?
If yes, what is it and how do I use it?
If not, how do you import/load a bitmap image?
Thank you.
There's really nothing that is CUDA-specific about loading a bitmap image into an application.
If you have a preferred method for loading a bitmap image into an application, you should be able to use it with a CUDA app. You will obviously be loading the image into the host application space first. After that, if you want to transfer it to the device, you can use any of the standard methods for transferring data to the device to accomplish this.
CUDA (i.e. the runtime API) doesn't have any specific functions for importing/loading a bitmap image.
There are many ways to load an image. If you are already using OpenGL or DirectX, then you will want to use a method associated with one of those APIs, and then use the appropriate interop API within CUDA to manipulate the object.
If you want to import a bitmap image directly into a CUDA program without using a graphics API, take a look at the CUDA samples, as a number of them do this and provide helper functions that you may want to re-use.
For example, the dct8x8 sample provides a file called BmpUtil.cpp which contains a number of useful bitmap import/handling routines, and the dct8x8 app (dct8x8.cu) shows how these may be used directly in a CUDA app.
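To give a rough sketch of how the pieces fit together: load the bitmap on the host, copy it to the device with the usual runtime calls, then run your color-to-grayscale kernel. The loadBMP helper below is hypothetical and stands in for whatever loader you choose (e.g. routines adapted from BmpUtil.cpp); error checking is omitted.

    #include <cuda_runtime.h>
    #include <stdlib.h>

    /* Hypothetical host-side loader: returns a packed RGB byte buffer. */
    unsigned char *loadBMP(const char *path, int *width, int *height);

    __global__ void toGray(const unsigned char *rgb, unsigned char *gray, int n)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n)
            gray[i] = (unsigned char)(0.299f * rgb[3 * i] +
                                      0.587f * rgb[3 * i + 1] +
                                      0.114f * rgb[3 * i + 2]);
    }

    int main(void)
    {
        int w, h;
        unsigned char *h_rgb = loadBMP("input.bmp", &w, &h);  /* host side first */
        int n = w * h;

        unsigned char *d_rgb, *d_gray;
        cudaMalloc((void **)&d_rgb, 3 * n);
        cudaMalloc((void **)&d_gray, n);
        cudaMemcpy(d_rgb, h_rgb, 3 * n, cudaMemcpyHostToDevice);  /* standard transfer */

        toGray<<<(n + 255) / 256, 256>>>(d_rgb, d_gray, n);

        unsigned char *h_gray = (unsigned char *)malloc(n);
        cudaMemcpy(h_gray, d_gray, n, cudaMemcpyDeviceToHost);
        /* ... write h_gray back out as a grayscale bitmap ... */

        cudaFree(d_rgb);
        cudaFree(d_gray);
        free(h_gray);
        return 0;
    }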
I'd like to implement the live view function using EDSDK. I have used EdsGetPointer to get a pointer to the memory address of the memory stream. Now I want to display the streamed image on the PC.
I have read that some people use Visual C++ APIs such as ATL or CImage, which can display the streamed image just by being passed the pointer to the memory stream as a parameter; the function retrieves the streamed images by itself. I am thinking of using OpenCV to display the streamed images, as I don't have Visual C++ installed on my computer. Is there any function in OpenCV that I can use to display streaming images? Or is there any other alternative I can use to deal with streaming images from EDSDK?
You can pack the data into an IplImage and show it using cvShowImage in a loop: http://opencv.willowgarage.com/documentation/user_interface.html The downside is that you're tied into the OpenCV event loop.
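A rough sketch of that approach, assuming the buffer already contains uncompressed 24-bit BGR pixels (if the SDK hands you JPEG data you would need to decode it first, e.g. with cvDecodeImage); get_liveview_frame is a placeholder for your EdsGetPointer code:

    #include <opencv/cv.h>
    #include <opencv/highgui.h>

    /* Placeholder: however you obtain the current frame bytes from the EDSDK. */
    unsigned char *get_liveview_frame(int *width, int *height);

    void show_stream(void)
    {
        cvNamedWindow("liveview", CV_WINDOW_AUTOSIZE);
        for (;;) {
            int w, h;
            unsigned char *buf = get_liveview_frame(&w, &h);

            /* Wrap the existing buffer in an IplImage header (no pixel copy). */
            IplImage *frame = cvCreateImageHeader(cvSize(w, h), IPL_DEPTH_8U, 3);
            cvSetData(frame, buf, w * 3);

            cvShowImage("liveview", frame);
            cvReleaseImageHeader(&frame);

            if (cvWaitKey(10) == 27)   /* Esc quits; cvWaitKey also pumps GUI events */
                break;
        }
        cvDestroyWindow("liveview");
    }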
There are alternatives. In the past I've used OpenGL to paint an image as a texture so that I could manage the viewport, draw on top of it, etc. You can get a simple and flexible working GUI pretty quickly using GLUT. A benefit to that is that whatever OpenGL code you write will be portable to any other UI library you use as long as that library has an OpenGL canvas widget. What I always do is Camera->IplImage->OpenGL Texture->wxWidgets glCanvas. I still use OpenCV for the actual image processing, etc. It's totally cross-platform and doesn't require the pay version of VC++.
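The IplImage-to-texture step itself is only a few GL calls. A sketch, assuming a 24-bit BGR IplImage and an already-created GL context (GLUT or a wxWidgets glCanvas):

    #include <GL/gl.h>
    #include <opencv/cv.h>

    /* Upload an IplImage into an OpenGL texture. Pass 0 the first time and
       reuse the returned texture id for subsequent frames.
       GL_BGR needs OpenGL 1.2+ (older Windows headers call it GL_BGR_EXT). */
    GLuint upload_frame(const IplImage *img, GLuint tex)
    {
        if (tex == 0)
            glGenTextures(1, &tex);
        glBindTexture(GL_TEXTURE_2D, tex);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, img->width, img->height, 0,
                     GL_BGR, GL_UNSIGNED_BYTE, img->imageData);
        return tex;
    }

You would then draw a textured quad covering the viewport in the canvas's paint/display callback.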
Do you want it for live view? If not, you can save your streamed image on the host using
Error = EdsCreateFileStream(dirItemInfo.szFileName, EDSDK.EdsFileCreateDisposition.CreateAlways, EDSDK.EdsAccess.ReadWrite, out stream);
then you can load it
IplImage *inImg = cvLoadImage("photo2.jpg");
and then process the image in OpenCV.
I know people have asked this before, but I see no answer, nor people even commenting about it.
So, I'm trying to do SHOUTcast streaming on WP7; has anyone done it? I know I have to use MediaStreamSource with my MediaElement, but how exactly can I skip that header from SHOUTcast and just get the stream and use it in a MediaStreamSource? Is there any app that has done it? Does anyone actually have some working example code?
There is a really good SHOUTcast player called streamything (http://www.streamything.com/page/en/default.html). Unfortunately it is neither open source nor freeware, but it shows that there is definitely a way to do it.
You need to set up a mechanism to get the stream of data passed to the application continuously. Here is a possible implementation. In order to be able to receive the stream directly (so that the application won't be treated as a web browser), you have to call the URL with a semicolon at the end. For example: http://00.00.00.00:8000/;
I have some C code that parses a file and generates another file of processed data. I now need to post these files to a website on a web server. I guess there is a way to do an HTTP POST, but I have never done this in C (using GCC on Ubuntu). Does anyone know how to do this? I need a starting point, as I have no clue how to do this in C. I also need to be able to authenticate with the website.
libcurl is probably a good place to start.
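A minimal libcurl sketch of a POST with basic authentication, uploading the processed file as multipart/form-data (the URL, form field name, file name, and credentials are placeholders; real code should check the return codes, and the curl_mime API needs libcurl 7.56 or newer, with curl_formadd being the older equivalent):

    #include <curl/curl.h>

    int post_file(void)
    {
        CURL *curl;
        curl_mime *form;
        curl_mimepart *part;
        CURLcode res;

        curl_global_init(CURL_GLOBAL_DEFAULT);
        curl = curl_easy_init();
        if (!curl)
            return 1;

        /* Build a multipart/form-data body containing the processed file. */
        form = curl_mime_init(curl);
        part = curl_mime_addpart(form);
        curl_mime_name(part, "datafile");              /* form field name */
        curl_mime_filedata(part, "processed.dat");     /* file to upload */

        curl_easy_setopt(curl, CURLOPT_URL, "https://example.com/upload");
        curl_easy_setopt(curl, CURLOPT_MIMEPOST, form);
        /* HTTP basic auth; digest, cookies, or token headers work similarly. */
        curl_easy_setopt(curl, CURLOPT_USERPWD, "user:password");

        res = curl_easy_perform(curl);

        curl_mime_free(form);
        curl_easy_cleanup(curl);
        curl_global_cleanup();
        return res == CURLE_OK ? 0 : 1;
    }

On Ubuntu this builds with gcc post.c -lcurl after installing the libcurl development package.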
I think Hank Gay's suggestion of using a library to handle the details is the best one, but if you want to "do it yourself", you need to open a socket to the web server and then send your data in the HTTP POST format which is described here. Authentication can mean a variety of different things, so you need to be more specific.
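If you do go the raw-socket route, the bytes you write look roughly like this (the host, path, credentials, and body are placeholders; the Authorization value is the base64 of "user:password", each header line ends in CRLF, and Content-Length must match the body's byte count exactly):

    POST /upload HTTP/1.1
    Host: example.com
    Authorization: Basic dXNlcjpwYXNzd29yZA==
    Content-Type: application/x-www-form-urlencoded
    Content-Length: 21
    Connection: close

    name=value&data=hello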
Unfortunately, all three of those jobs involve a fair bit of complexity, so you need to break the question down into stages and come back and ask about each bit separately.