Handling a large file upload with "io.Pipe" on the server side - google-app-engine

As part of my learning process I recently started a new challenge: building a photomosaic web app in Go. Since I plan to host it on App Engine, I split it into two services: one to handle image uploads and the other to handle image processing. What I want to accomplish is to start processing the image as soon as the first bytes are received. While doing my research I came across this gist: cryptix/client.go. I would like to use that approach to read the request into an io.Pipe and stream it on the fly to my image processing service, but none of my googling helped: all I can find is a Go client sending files, whereas I want a Go server receiving files.
NOTE:
The two services are communicating via HTTP.
I am using the REST pattern (no HTML form; testing with Postman).
Concrete examples are most welcome.
Please don't frown; this is my first Stack Overflow question.

I'm somewhat reluctant to answer this, as I'm not sure what the exact problem is or what you have tried.
But this is really just a streaming HTTP read.
An http.Request holds the request body as an io.ReadCloser, which means anything that accepts an io.Reader can stream from it directly.
What you need to do is create a function that processes your data.
func process(r io.Reader) {
    // do something with r, e.g. decode the image as the bytes stream in
}
Then you can read it from the HTTP handler:
func uploadHandler(w http.ResponseWriter, r *http.Request) {
    process(r.Body)
}
Without knowing your exact question or what you need help with, I cannot elaborate further.
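To connect this to the io.Pipe part of the question: on the upload service, you can forward the incoming bytes to the processing service while they are still arriving. Below is a minimal sketch, not a definitive implementation; the form field name, file name and processing-service URL are placeholder assumptions, and the stream is wrapped in a multipart body the same way the cryptix gist does on the client.

package main

import (
    "io"
    "log"
    "mime/multipart"
    "net/http"
)

// forwardHandler receives the raw upload and streams it to the processing
// service as it arrives. io.Pipe connects the multipart writer (fed by the
// incoming request body) to the outgoing request body, so nothing is
// buffered in full.
func forwardHandler(w http.ResponseWriter, r *http.Request) {
    pr, pw := io.Pipe()
    mw := multipart.NewWriter(pw)

    go func() {
        // Writer side: copy the incoming body into a multipart part.
        part, err := mw.CreateFormFile("image", "upload.jpg") // placeholder names
        if err != nil {
            pw.CloseWithError(err)
            return
        }
        if _, err := io.Copy(part, r.Body); err != nil {
            pw.CloseWithError(err)
            return
        }
        pw.CloseWithError(mw.Close())
    }()

    // Reader side: http.Post reads from the pipe as the goroutine writes to it.
    resp, err := http.Post("http://processing-service/process", mw.FormDataContentType(), pr) // placeholder URL
    if err != nil {
        http.Error(w, err.Error(), http.StatusBadGateway)
        return
    }
    defer resp.Body.Close()
    w.WriteHeader(resp.StatusCode)
}

func main() {
    http.HandleFunc("/upload", forwardHandler)
    log.Fatal(http.ListenAndServe(":8080", nil))
}

If the processing service accepts a raw body rather than a multipart form, the pipe is not even needed: you can pass r.Body straight to http.Post and the request streams just the same.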

Related

Sanity Check: Is it possible to proxy between an HLS(m3u8) video stream and an angularjs app (ui)?

I need to create a Spring Boot WebFlux REST web service to act as a proxy between an AngularJS app that shows a video stream and an endpoint at dacast.com that delivers m3u8 playlist-based content.
At this time, there is a video component in the angular app that takes the following uri and presents the content to the user. I plan to create a reactive webflux rest service, but am at a loss as to how to implement this proxy. There are a lot of posts online about viewing the HLS feed in HTML, but nothing about how to proxy between the stream and a consumer of it.
https://dcunilive11-lh.akamaihd.net/i/dlive_1#xxxxxx/master.m3u8
I believe that I need to download the master.m3u8 file, which will contain https endpoints that I can download as a Flux stream and pass along to the angular app. Does this make sense? I'd appreciate your help and tips...
Thanks,
Mike
The m3u8 file is a text file which, as you say, contains some info about the video and links to the media streams.
The simplest way for the angular app to play the video would be just to provide the link to the original m3u8 file to it directly, but I am guessing it can't reach that link for some reason in your use case.
Assuming this is correct, it sounds like your web service just needs to act as a proxy for the m3u8 file link and the media streams.
There are some instructions in the online Spring documentation for this - e.g.: https://cloud.spring.io/spring-cloud-gateway/1.0.x/multi/multi__building_a_gateway_using_spring_mvc.html
One thing that may be causing some confusion is that the HLS media streams are actually transferred between the client and the server as a series of client requests and server responses, i.e. similar to regular HTTP request/responses. They are not constantly streaming, i.e. not something you would need a websocket to read.
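The Spring documentation above covers the Java side; purely to illustrate how little the proxy itself has to do (fetch the playlist or segment from upstream and hand the response back), here is a minimal sketch in Go, using the upstream host from the question as an assumed example:

package main

import (
    "log"
    "net/http"
    "net/http/httputil"
    "net/url"
)

func main() {
    // Assumed upstream host, taken from the playlist URL in the question.
    upstream, err := url.Parse("https://dcunilive11-lh.akamaihd.net")
    if err != nil {
        log.Fatal(err)
    }
    // Every request for the m3u8 playlist or a media segment is forwarded
    // upstream and the response streamed back; plain request/response,
    // no websocket or push channel involved.
    proxy := httputil.NewSingleHostReverseProxy(upstream)
    http.Handle("/", proxy)
    log.Fatal(http.ListenAndServe(":8080", nil))
}

The one wrinkle is that the playlist may contain absolute URLs, in which case the proxy would also need to rewrite them so that segment requests come back through it.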

Google Smart Home: Fullfill action.devices.commands.GetCameraStream asynchronously

I am implementing Google Smart Home actions for my device. The device is a camera with the action.devices.traits.CameraStream trait. I want to know the best way to respond to the action.devices.commands.GetCameraStream command asynchronously.
Currently, once my server receives this command, it needs to notify the device and wait for the device to start streaming; only then can the server respond to Google with the cameraStreamAccessUrl. This is not ideal because the server is blocked, and exactly how it knows the device has started streaming is a bit tricky. I am wondering if there is a better way to achieve this, for example having the server respond immediately with some sort of deferred response and letting the device tell Google what the cameraStreamAccessUrl is.
Is this possible? Thanks for your help!
It sounds like you're trying to find something like follow-up responses to asynchronously notify Google that the stream has started. Unfortunately, CameraStream does not currently support follow-up responses, but you could file a feature request on the public tracker.

Angularjs 1 - one request, multiple responses

I have a page with multiple widgets, each receiving data from a different query in the backend. Doing a request for each will hit the limit the browser puts on the number of parallel connections and will serialize some of them. On the other hand, doing one request that returns one response means it will be as slow as the slowest query (I have no a priori knowledge of which query will be slowest).
So I want to create one request such that the backend runs the queries in parallel and writes each result as soon as it is ready, and the frontend handles each result as it arrives. At the HTTP level I believe it can be just one body with several JSON documents, or maybe a multipart response.
Is there an AngularJS extension that handles the frontend side of things? Ideally something that works well with whatever can be done in the Java backend (I haven't started investigating my options there).
I have another suggestion to solve your problem, but I am not sure you would be able to implement such a thing, as from your question it is not very clear what you can or cannot do.
You could implement WebSockets: the server would be able to notify the front-end when each piece of data has been fetched, or it could send the data over the WebSocket right away.
In the first variant, you would send a request to the server to fetch all the data for your dashboard. Once a piece of data is available, you could make a request for that particular piece; given that the data was fetched a couple of seconds earlier, it could be cached on the server and the response would be fast.
The second variant seems the more reasonable one. You would make an HTTP/WebSocket request to the server and wait for the data to arrive over the WebSocket (see the sketch below).
I believe this would be the most robust and efficient way to implement what you are asking for.
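A rough server-side sketch of that second variant, assuming the gorilla/websocket package and made-up widget names (the sleep stands in for the real queries):

package main

import (
    "fmt"
    "log"
    "net/http"
    "sync"
    "time"

    "github.com/gorilla/websocket"
)

var upgrader = websocket.Upgrader{} // default options

// dashboard upgrades the request to a WebSocket, runs the widget queries in
// parallel, and pushes each result to the client the moment it is ready.
func dashboard(w http.ResponseWriter, r *http.Request) {
    conn, err := upgrader.Upgrade(w, r, nil)
    if err != nil {
        return
    }
    defer conn.Close()

    var mu sync.Mutex // serialize writes to the connection
    var wg sync.WaitGroup
    for _, widget := range []string{"sales", "traffic", "errors"} { // hypothetical widgets
        wg.Add(1)
        go func(name string) {
            defer wg.Done()
            time.Sleep(100 * time.Millisecond) // stand-in for the real query
            msg := map[string]string{"widget": name, "data": fmt.Sprintf("result for %s", name)}

            mu.Lock()
            defer mu.Unlock()
            if err := conn.WriteJSON(msg); err != nil {
                log.Println(err)
            }
        }(widget)
    }
    wg.Wait()
}

func main() {
    http.HandleFunc("/dashboard", dashboard)
    log.Fatal(http.ListenAndServe(":8080", nil))
}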
https://github.com/dfltr/jQuery-MXHR
This plugin allows you to parse a response that contains several parts (multipart) by providing a callback for each part. This can be used in all our frontends to support responses carrying multiple pieces of data (one per widget) in a single request. The server side receives one request and uses Servlet 3 async support (or whatever the equivalent is in other languages) to ‘park’ it, firing off the queries in parallel and writing each result to the response as its query returns (with the right multipart boundary).
Another example can be found here: https://github.com/anentropic/stream.
While neither of these may be directly compatible with AngularJS, the code does not seem complex to port.
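To make the server side of that concrete, here is a minimal sketch of the "park the request and write each part as it finishes" idea, written in Go rather than a Servlet container, with made-up widget queries standing in for the real ones:

package main

import (
    "fmt"
    "log"
    "mime/multipart"
    "net/http"
    "net/textproto"
    "sync"
    "time"
)

// widgetsHandler runs the widget queries in parallel and writes each result
// to the response as its own multipart part the moment it is ready, flushing
// after every part so the client can parse incrementally.
func widgetsHandler(w http.ResponseWriter, r *http.Request) {
    mw := multipart.NewWriter(w)
    w.Header().Set("Content-Type", "multipart/mixed; boundary="+mw.Boundary())
    flusher, _ := w.(http.Flusher)

    var mu sync.Mutex // multipart.Writer is not safe for concurrent use
    var wg sync.WaitGroup
    for _, q := range []string{"sales", "traffic", "errors"} { // made-up widget queries
        wg.Add(1)
        go func(name string) {
            defer wg.Done()
            time.Sleep(100 * time.Millisecond) // stand-in for the real query
            result := fmt.Sprintf(`{"widget":%q,"data":"..."}`, name)

            mu.Lock()
            defer mu.Unlock()
            part, err := mw.CreatePart(textproto.MIMEHeader{"Content-Type": {"application/json"}})
            if err != nil {
                return
            }
            fmt.Fprint(part, result)
            if flusher != nil {
                flusher.Flush()
            }
        }(q)
    }
    wg.Wait()
    mw.Close()
}

func main() {
    http.HandleFunc("/widgets", widgetsHandler)
    log.Fatal(http.ListenAndServe(":8080", nil))
}

On the frontend, a plugin like jQuery-MXHR (linked above) splits the stream on the boundary and hands each part to a callback as it arrives.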

Creating a channel for webRTC video chat

I've been following the HTML5 Rocks WebRTC guide and I have the JavaScript set up as described; however, the guide is not clear on how to receive a channelToken, roomKey, and user ID. The guide says,
"Note that values used in the JavaScript, such as the room variable and
the token used by openChannel(), are provided by the Google App Engine
app itself: take a look at the index.html template in the repository
to see what values are added."
Unfortunately the link provided is no good, and I'm left with very little information regarding the most essential step in this process. The guide isn't clear about whether or not Google App Engine is a necessary component, and I don't see why it should be. I have searched the web in an attempt to find a more useful source, but I was unsuccessful. I also took a look at the WebRTC demo (https://apprtc.appspot[dot]com), but that too was no help, seeing that the channel information is generated server-side. I feel like I should just be able to make a simple HTTP request to some Google server and run from there. Any information regarding my problem would be much appreciated.
Apologies: the code for this example has been moved to here.
(Been meaning to update the article, but haven't had a chance...)
The apprtc.appspot example uses the Channel API on App Engine for signaling, but there are lots of other ways to do this. Signaling mechanisms are not defined by the WebRTC spec. (Note that signaling, which is accomplished via a signaling service, is the exchange of network and media metadata in order to set up a WebRTC 'call': the actual data is communicated directly between peers.)
We ran a codelab at Google I/O which describes from start to finish how to build a video chat application that uses Socket.IO on Node.js for signaling (it's very simple!). You might want to try that instead.

How can a server communiate with two clients at once (JavaScript, HTML, PHP)?

I got an assignment to do, and for it I can use any web technology such as HTML, JavaScript, PHP, etc. I'm sorry to say that I haven't studied any of these technologies, so I took a few tutorials and skimmed through them searching for answers.
I found solutions to many problems, but one remains unsolved. It is this:
I want two clients to communicate through a server for this assignment. One sends a message, the server processes it and forwards it to the other.
None of the PHP tutorials showed me any way of doing this. All of them covered communication between a single client and the server.
Please help. Show me a way to do this. Thanks.
Currently, without resorting to cutting-edge (and possibly hacky/unreliable) techniques, your PHP server cannot initiate communication with a page you've already loaded into a web browser. This is a result of the way the HTTP protocol works.
One way to solve this would be polling on the "receiving" end for data, something like a publish-subscribe pattern.
One way to do this would be (see the sketch after these steps):
One client sends data to the server using an HTTP request (XHR aka AJAX) specifying the target for this data (the other client).
The server stores this data in a persistent storage (local file, database, etc).
The second client periodically sends a request to the server asking if there's any new data for it to consume. This can be done using setInterval and XHR in JavaScript.
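The question is about PHP, but the store-and-poll flow itself is language-agnostic. Purely as an illustration, here is a rough sketch in Go, with an in-memory map standing in for the persistent storage and made-up endpoint names:

package main

import (
    "log"
    "net/http"
    "sync"
)

// The map below is a stand-in for the "persistent storage" step above
// (a file or database in a real deployment).
var (
    mu    sync.Mutex
    store = map[string][]string{} // recipient -> pending messages
)

// send is called by the first client: /send?to=<recipient>&msg=<text>
func send(w http.ResponseWriter, r *http.Request) {
    to, msg := r.FormValue("to"), r.FormValue("msg")
    mu.Lock()
    store[to] = append(store[to], msg)
    mu.Unlock()
    w.WriteHeader(http.StatusNoContent)
}

// poll is called periodically by the second client: /poll?me=<recipient>
// It returns and clears any messages queued for that client.
func poll(w http.ResponseWriter, r *http.Request) {
    me := r.FormValue("me")
    mu.Lock()
    msgs := store[me]
    delete(store, me)
    mu.Unlock()
    for _, m := range msgs {
        w.Write([]byte(m + "\n"))
    }
}

func main() {
    http.HandleFunc("/send", send)
    http.HandleFunc("/poll", poll)
    log.Fatal(http.ListenAndServe(":8080", nil))
}

The second client would call the poll endpoint from setInterval via XHR, exactly as in the last step above.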
I would suggest you take a look at:
http://en.wikipedia.org/wiki/Publish/subscribe
And also, for a cutting edge way to do this, check out Socket.IO:
http://socket.io
You might want to Google for "php chat server." Building a chat server is a simple way to get started.
http://net.tutsplus.com/tutorials/javascript-ajax/how-to-create-a-simple-web-based-chat-application/
http://code.jenseng.com/jenChat/
