angular $sce trustAsResourceUrl from node - angularjs

I'm trying to make a streaming service where I stream the content of a file (in this case a video) into a video element.
For this purpose I have downloaded and installed Videogular and am now trying to set it up, but I'm not sure how to do it.
According to the Videogular documentation, loading a video requires a syntax like this:
sources: [
    {src: $sce.trustAsResourceUrl(myMp4Resource), type: "video/mp4"}
]
Which is fine when you want to load the content without streaming.
But say, for instance, you have a Node server running at port 8105 and the file you wish to fetch has an id of 1; then the result might look something like this:
sources: [
    {src: $sce.trustAsResourceUrl('http://localhost:8105/loadvideo/1'), type: "video/mp4"}
]
However, when I tried this, it just told me that the resource is not an actual resource.
My question is: how do you stream video content this way (preferably with Videogular), and does anyone know of examples where people have made this work?
Server-side code
Okay, so my initial idea (and I know this differs from the code above) was to create a route that took a path:
router.route('/retrieveFile')
    .post(function (request, response) {
        // Resolve the requested file relative to the app and stream it back
        var path = '../' + request.body.data;
        var file = fs.createReadStream(path);
        file.pipe(response);
    });
And then piped the output of the file into the response.
I then wanted to use this route to stream the file.

If you have video files on your hard drive and you want to serve them all by their filenames, you should just use Express's static middleware to serve them like any other resource.
You can add a path prefix '/videos' to differentiate them from regular resources.
app.use('/videos', express.static('videos'));
Then a video file ./videos/myvid.mp4 would be available as http://localhost:8000/videos/myvid.mp4
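For context, the app.use line above assumes an existing Express app; a minimal self-contained sketch might look like this (the port matches the example URL above, everything else is illustrative):
// server.js - minimal sketch, assuming Express is installed
var express = require('express');
var app = express();

// Serve everything under ./videos at the /videos path prefix
app.use('/videos', express.static('videos'));

app.listen(8000, function () {
    console.log('Serving videos on http://localhost:8000/videos');
});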

To have the file download as a file, you need to set the appropriate headers before piping.
To load the file you'd put this code in your router; and where you're using post, unless you have a strong reason, I'd just use get or all.
You might also want to be able to end the transmission if the client decides to disconnect mid-stream.
Alternatively, you might want to go with res.download instead of streams, in which case the appropriate headers and interruptions are handled automatically.
So the whole code might look like this:
router.route('/path/to/video.mp4')
    .all(function (req, res) {
        // Tell the browser which filename this resource should have
        res.header('Content-Disposition', 'inline; filename="video.mp4"');
        var stream = fs.createReadStream('./resources/video.mp4');
        stream.pipe(res);
        // Clean up the read stream when the response ends (e.g. the client disconnected)
        require('on-finished')(res, function () { stream.destroy(); });
        // or simply use res.download instead, as sketched after this block
    });
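If you would rather let Express handle it, here is a minimal sketch of the res.download variant mentioned above (note that res.download takes a file path, not the file's contents):
router.route('/path/to/video.mp4')
    .all(function (req, res) {
        // Express sets the headers and handles interrupted transfers itself
        res.download('./resources/video.mp4', 'video.mp4');
    });
Note that res.download sends the file as an attachment, so browsers will typically offer to save it rather than play it inline.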
Then you can use http://localhost:8000/path/to/video.mp4 to load the video directly in your browser; it will play it if it can, or simply offer to download it. Or you can use this URL in your Videogular sources:
sources: [ {src: $sce.trustAsResourceUrl('http://localhost:8000/path/to/video.mp4'), type: "video/mp4"} ]
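As an aside, seeking in a video element typically relies on HTTP Range requests. express.static answers those automatically, while the plain createReadStream pipe above always sends the whole file. A minimal hedged sketch of handling Range requests by hand, reusing the ./resources/video.mp4 path from above:
router.route('/path/to/video.mp4')
    .get(function (req, res) {
        var path = './resources/video.mp4';
        var stat = fs.statSync(path);
        var range = req.headers.range;
        if (!range) {
            // No Range header: send the whole file
            res.writeHead(200, {'Content-Length': stat.size, 'Content-Type': 'video/mp4'});
            return fs.createReadStream(path).pipe(res);
        }
        // Range looks like "bytes=32324-" or "bytes=0-1023"
        var parts = range.replace(/bytes=/, '').split('-');
        var start = parseInt(parts[0], 10);
        var end = parts[1] ? parseInt(parts[1], 10) : stat.size - 1;
        res.writeHead(206, {
            'Content-Range': 'bytes ' + start + '-' + end + '/' + stat.size,
            'Accept-Ranges': 'bytes',
            'Content-Length': end - start + 1,
            'Content-Type': 'video/mp4'
        });
        fs.createReadStream(path, {start: start, end: end}).pipe(res);
    });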

Related

pdf.js and protected files not otherwise viewable

I am using the PDF.js library to display PDF files within my site (using pdf_viewer.js to display documents on-screen), but the PDF files I am displaying are confidential and I need to be able to show them within the site while blocking non-authorized members of the public from viewing the same files just by typing in their URLs and seeing them show up right in their browser.
I tried to add the Deny from all line in my .htaccess file, but that of course also blocked the viewer from showing the docs, so that seems to be a no-go. Clearly anyone could simply look at the inspector and see the PDF file that is being read by the viewer, so a direct URL is not going to be secure in any way.
I did read about PDF.js being able to read binary data, but I have no knowledge of how I might read a PDF in my own file system and prep it for use by the library, even if that means it is all a bit slower in loading to get the file contents and prep it on the fly.
Does anyone have a solution that allows PDF.js to work without revealing the source PDF URL, or to otherwise read the file using local file calls?
Okay, after some testing, the solution is very easy:
Get the PDF data using an Ajax-called function that can figure out what actual file is to be viewed.
In that PHP file...
Read the file into memory, using fopen and fread as normal.
Convert it to base64 using base64_encode.
Pass that string back to the calling JavaScript.
In the original calling function, use the following to convert the string to a Uint8Array and then pass that to the PDF.js library...
// The function that turns the base64 string into a Uint8Array...
function base64ToUint8Array(base64) {
    var raw = atob(base64);
    var uint8Array = new Uint8Array(raw.length);
    for (var i = 0; i < raw.length; i++) {
        uint8Array[i] = raw.charCodeAt(i);
    }
    return uint8Array;
}
// The guts that get the file data, call the above function to convert it, and then call PDF.js to display it
$.ajax({
    type: "GET",
    data: {file: <a file id or whatever distinguishes this PDF>},
    url: 'getFilePDFdata.php', // the PHP file that reads the data and returns it encoded
    success: function (base64Data) {
        var pdfData = base64ToUint8Array(base64Data);
        // Loading document.
        PDFJS.getDocument(pdfData).then(function (pdfDocument) {
            // Document loaded, specifying document for the viewer and
            // the (optional) linkService.
            pdfViewer.setDocument(pdfDocument);
            pdfLinkService.setDocument(pdfDocument, null);
        });
    }
});

convert cordova files path to File object

I am trying to build a simple photo upload app on Ionic (Cordova). I am using the cordovaImagePicker plugin to have the user select images from the mobile device. This plugin returns an array of paths on the device.
For handling the upload part I am using jquery-file-upload (mostly because that is what I used for the browser version, and I am doing all kinds of processing for which I already have the code). The problem, however, is that jquery-file-upload expects to work with an input element <input type="file">, which creates a JavaScript File object containing all kinds of metadata.
So in order to get the cordovaImagePicker to work with jquery-file-upload, I figure I have to convert the filepath to a File object. Below I am using the cordova file plugin to achieve this:
$cordovaImagePicker.getPictures($scope.pickOptions).then(function (filelist) {
    $.each(filelist, function (index, filepath) {
        $window.resolveLocalFileSystemURL(filepath, function (fileEntry) {
            fileEntry.file(function (file) {
                var reader = new FileReader();
                reader.onloadend = function (e) {
                    fileObj = new File([this.result], "filename.jpg", {type: "image/jpeg"});
                    // send filelist from cordovaImagePicker to jquery-fileupload as if through file input
                    $('#fileupload').fileupload('send', {files: fileObj});
                };
                reader.readAsArrayBuffer(file);
            }, function (e) { $scope.errorHandler(e); });
        }, function (e) { $scope.errorHandler(e); });
    });
}, function (error) {
    // error getting photos
    console.log('Error selecting images through $cordovaImagePicker');
});
So first of all this is not really working correctly; apparently I am doing something wrong, since, for example, the type attribute ends up being another object that contains the type attribute with the correct values (and other such weird issues). I would be happy if someone could point out what I am doing wrong.
Surely there must be something (a Cordova plugin?) that I am not aware of that does this conversion for me (including, for example, adding a thumbnail)? Alternatively, maybe there is something that can easily make jquery-file-upload work with file paths? I couldn't find anything so far...
However, it feels like I am trying too hard to force a connection between two components that were just not built to work together (File objects vs. file paths), and maybe I should just rewrite the processing and use the Cordova file transfer plugin instead?
I ended up rewriting the uploader with cordova-file-transfer, which works like a charm. I wasted more time trying to work around it than it took to just rewrite it from scratch.
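For anyone curious, here is a minimal hedged sketch of what that rewrite can look like with ngCordova's $cordovaFileTransfer wrapper; the upload URL and options below are illustrative assumptions, and filepath stands for one of the paths returned by $cordovaImagePicker:
// Minimal sketch, assuming cordova-plugin-file-transfer and ngCordova are installed
var uploadUrl = 'https://example.com/upload'; // hypothetical endpoint
var options = {
    fileKey: 'file',                                          // form field name expected by the server
    fileName: filepath.substr(filepath.lastIndexOf('/') + 1), // keep the original file name
    mimeType: 'image/jpeg',
    chunkedMode: true
};

$cordovaFileTransfer.upload(uploadUrl, filepath, options)
    .then(function (result) {
        console.log('Upload complete', result);
    }, function (err) {
        console.log('Upload failed', err);
    }, function (progress) {
        // progress notifications arrive here while the file is being sent
    });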

ng-flow - no destination directory option

I am using ng-flow in my application and it works pretty well. Currently, the destination directory for the files being uploaded is set in my web.config and used within my webapi controller method.
What I want to do is allow the user to specify the destination, rather than it come from config. However, looking at the docs, I don't see an option that I can add to the below appconfig for this:
function appConfig(flowFactoryProvider) {
    flowFactoryProvider.defaults = {
        target: 'api/upload',
        permanentErrors: [404, 500, 501],
        maxChunkRetries: 1,
        chunkRetryInterval: 5000,
        simultaneousUploads: 4
    };
    flowFactoryProvider.on('catchAll', function (event) {
        console.log('catchAll', arguments);
    });
}
Am I missing something, or do I need to handle this myself?
You should maintain a clear separation between your front end and your back end. That is why you are not able to set it in flow.js or any other framework.
What you can do is let the user specify it in a form field and leave the handling to the server side. This way it is also possible to filter on allowed locations; you don't want your client to add files to your system directories.
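For what it's worth, one way to pass the user's choice along with the upload is flow.js's query option, which attaches extra parameters to each upload request. The destination field below is a hypothetical example; the server-side webapi controller still has to read it and check it against the allowed locations:
function appConfig(flowFactoryProvider) {
    flowFactoryProvider.defaults = {
        target: 'api/upload',
        permanentErrors: [404, 500, 501],
        maxChunkRetries: 1,
        chunkRetryInterval: 5000,
        simultaneousUploads: 4,
        // Extra parameters sent with every upload request.
        // 'destination' is a hypothetical value taken from a form field;
        // the server must validate it against a whitelist of allowed folders.
        query: { destination: 'user/chosen/folder' }
    };
}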

Google Apps Scripts - How to replace a file?

I'm trying to replace a PDF file in a Google Drive Folder using a script. Since GAS does not provide a method for adding revisions (versions), I'm trying to replace the content of the file, but all I get is a blank PDF.
I can't use the DriveApp.File class since our Admin has disabled the new API, so I have to use DocsList.File instead.
Input:
OldFile.pdf (8 pages)
NewFile.pdf (20 pages)
Output expected:
OldFile.pdf with the same content as NewFile.pdf
Real Output:
OldFile.pdf with 20 empty pages.
Process:
var old = DocsList.getFileById("####");
var newFile = DocsList.getFileById("####"); // 'new' is a reserved word, so use a different name
old.replace(newFile.getContentAsString());
Any ideas, please?
Thanks a lot in advance.
PS.: I also tried calling old.clear() first, but I'd say the problem lies in the getContentAsString method.
The Advanced Drive Service can be used to replace the content of an existing PDF file in Google Drive. This answer also includes an example of how to update a PDF file in a shared Drive.
function overwriteFile(blobOfNewContent, currentFileID) {
    var currentFile;
    currentFile = DriveApp.getFileById(currentFileID);
    if (currentFile) { // If there is a truthy value for the current file
        Drive.Files.update({
            title: currentFile.getName(),
            mimeType: currentFile.getMimeType()
        }, currentFile.getId(), blobOfNewContent);
    }
}
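A hedged usage sketch, where the new content comes from another Drive file (both file IDs are placeholders):
// Replace OldFile.pdf's content with NewFile.pdf's content
function replaceOldWithNew() {
    var newBlob = DriveApp.getFileById('NEW_FILE_ID').getBlob();
    overwriteFile(newBlob, 'OLD_FILE_ID');
}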
References
https://developers.google.com/apps-script/advanced/drive
https://developers.google.com/drive/api/v3/reference/files/update
An example of use with a shared Drive:
Drive.Files.update({
    title: currentFile.getName(),
    mimeType: currentFile.getMimeType()
}, currentFile.getId(), blobOfNewContent, {supportsTeamDrives: true});
Try to get it as a Blob data type instead.

How to use JSON without json file?

I need to use dynamically generated JSON with Ext.data.TreeStore.
With this component there is a proxy config, and it needs a path to a JSON file.
My problem is, I can't write a JSON file in my application.
I would like to know if I can generate JSON dynamically and pass it to the url config in the proxy.
For example:
var trStore = Ext.create('Ext.data.TreeStore', {
    ... // config
    proxy: {
        type: 'ajax',
        url: { id: 'id0', task: 'task0', value: 'val0', ..... }
    }
});
My URL is not a file URL; it is JSON generated by my own method!
How can I build JSON to use with a TreeStore, without making a file?
I hope you understand my problem :)
Thanks a lot for your help!
Your example looks like you want to pass static "inline data" to the TreeStore.
As far as I can see this is not possible with a bare TreeStore, since it does not have a data config option as the "normal" Store has. However, it is possible with a Treepanel.
You can pass your inline data to the TreeStore using the root config option of the Treepanel (not the TreeStore). It works in a very similar manner as the data config option of a "normal" Store:
Ext.create('Ext.tree.Panel', {
    root: { id: 'id0', task: 'task0', value: 'val0', children: [...], ... }
    // ...
});
There are two caveats related to this:
The beta3 docs say root is boolean; that's wrong.
Because of a bug in beta3, you cannot use this together with rootVisible: false.
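For illustration, a slightly fuller hedged sketch of the inline root approach, using the default text/leaf node fields (all node values are made up):
Ext.create('Ext.tree.Panel', {
    title: 'Inline tree data',
    width: 300,
    rootVisible: true, // see the beta3 caveat above
    root: {
        text: 'Root',
        expanded: true,
        children: [
            { text: 'task0', leaf: true },
            { text: 'task1', expanded: true, children: [
                { text: 'subtask', leaf: true }
            ]}
        ]
    },
    renderTo: Ext.getBody()
});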
Remember that a "json file" is really just a text string, so you can generate that with PHP or your preferred server software.
For the url in the proxy, simply put in the url you use to run that function. Eg in my web app I have http://example.org/controller/getTree?output=json
This runs the getTree() function on my controller, and the function knows to return json.
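For reference, a hedged sketch of the kind of JSON such a URL might return to an ajax-proxy TreeStore, using the default text/leaf node fields (custom fields such as id, task and value would need a matching model):
[
    { "text": "task0", "leaf": true },
    { "text": "task1", "expanded": false, "children": [
        { "text": "subtask", "leaf": true }
    ]}
]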
