How to sync files from the local filesystem for upload using AngularJS

Basically, what I mean is: say I upload a file, and then I make some changes to it locally, for example a .txt file. The uploaded copy should automatically sync itself with the original file on the local system, without my having to upload it again. This would happen either as soon as there is a change or after a fixed interval of time. Is this possible using JavaScript (AngularJS)?
I guess we would need the path of the file we want to sync and would have to read the file repeatedly at a set interval. But browsers are not allowed to access the full path, so I am not sure how to proceed. Any suggestions would be helpful.
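Browsers sandbox file access, so a page cannot poll an arbitrary path. What it can do is keep the File reference obtained from an <input type="file"> element and try to re-read it on a timer. Below is a minimal sketch of that approach; the element id, upload endpoint, and 5-second interval are illustrative, not from the question.

    // Minimal sketch: re-read a user-selected file on a fixed interval
    // and upload its latest contents. Ids/endpoint are placeholders.
    angular.module('syncApp', []).controller('SyncCtrl', function ($http, $interval) {
      var selectedFile = null;

      // Plain DOM listener; AngularJS has no built-in file-input binding.
      document.getElementById('fileInput').addEventListener('change', function (e) {
        selectedFile = e.target.files[0];
      });

      $interval(function () {
        if (!selectedFile) { return; }
        var reader = new FileReader();
        reader.onload = function () {
          $http.post('/upload', reader.result); // hypothetical endpoint
        };
        reader.readAsText(selectedFile);
      }, 5000); // fixed interval: every 5 seconds
    });

Be aware that whether a re-read picks up on-disk changes is browser-dependent: some browsers invalidate the File reference as soon as the underlying file is modified, in which case the only option is to ask the user to re-select the file (or, in Chromium-based browsers, to use the File System Access API, which hands out handles that can be re-read).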

Related

How to avoid AutoCAD DWG file being renamed and recreated during save?

I edit DWG files on my virtual drive, created with the Cloud Filter API. When I edit a DWG file using AutoCAD, the file is renamed and then recreated during the save: watching the file system, I can see the original file being renamed away and a new file created in its place.
I tried several hydration policies (Full, Progressive, and Partial) and several hydration policy modifiers (DehydrationAllowed, ValidationRequired, and None) on the sync root, but with no success.
I know that it is possible to avoid the file recreation: at least, the file is not recreated with SharePoint/OneDrive, so AutoCAD somehow detects OneDrive correctly.
How do I prevent the file from being recreated during the save operation?
If I understand correctly, you want to prevent the rename of the current drawing to a ".bak" file and the creation of a new ".dwg" file when you save.
The simple answer is to set the AutoCAD system variable ISAVEBAK to 0, as described on the ISAVEBAK System Variable documentation page.
However, as that page warns, with ISAVEBAK set to 0 a glitch during the save process can cost you the entire drawing file, since no backup copy is kept.
Be aware that this rename/create sequence is AutoCAD's standard save behavior. Unless specifically overridden, it happens on every save of every drawing.
Lastly, note that this setting is saved in the registry. Once set, it applies to the save of every drawing from that point forward, unless changed back, for example via an AutoLISP routine (such as one that runs when AutoCAD starts).

Can one ever access a file stored in Heroku's ephemeral file system via the browser?

I've been writing an app to deploy on Heroku. So far everything has been working great. However, there's one last step I haven't been able to get working: I need to generate a CSV file on the fly from the DB and let the user download it.
On my local machine I can simply write the file to a folder under the web app root, e.g. /output, and then redirect my browser to http://localhost:4000/output/1.csv to get the file.
However, the same code fails again and again on Heroku no matter how I modify it. It either complains about not being able to write the file or about not being able to redirect the browser to the correct path.
Even if I manually run heroku run bash, create an /output folder in the project root, and create a file there, when I point my browser at it (e.g. https://myapp.herokuapp.com/output/1.csv), it simply says "Page not found".
Is such a thing simply impossible on Heroku after all? I thought that since we are free to create files on the ephemeral file system, we should be able to access them from the browser as well, but things seem to be more complicated than that.
I'm using the Phoenix framework and I already added
plug Plug.Static,
at: "/output", from: Path.expand("output/"), gzip: false
to my endpoint.ex. It works on localhost, but apparently not on Heroku?
I'll turn to Amazon S3 if what I'm trying to do is indeed impossible. However, I'd like to avoid S3 as much as possible, since this should be a really simple task and I don't want another set of credentials and extra complexity to manage. Is there any other way to achieve this without writing the file to the file system first and then redirecting the user to it?
I know it doesn't strictly answer your question, but if you don't mind generating the CSV every time it is requested, you can use Controller.send_download/3 to serve an arbitrary payload as a download (in your case, the contents of the CSV).
Naturally, you could also store the generated CSVs somewhere (like the database, or even ETS) and generate them "lazily".
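Two notes may help here. First, heroku run bash starts a one-off dyno with its own ephemeral filesystem, so a file created there never exists on the web dyno that serves your requests, which explains the 404. Second, the idea behind send_download/3 is to build the CSV in memory and stream it in the HTTP response, so nothing is ever written to the dyno's disk. Since that fix is Phoenix-specific, here is the same pattern sketched in plain Node.js for illustration; the route and the hard-coded rows are hypothetical stand-ins for the DB query.

    // Serve a generated CSV straight from memory; nothing touches the disk.
    const http = require('http');

    function buildCsv() {
      // Stand-in data; a real app would query the database here.
      const rows = [['id', 'name'], [1, 'alice'], [2, 'bob']];
      return rows.map(function (r) { return r.join(','); }).join('\n');
    }

    http.createServer(function (req, res) {
      if (req.url === '/output/1.csv') { // hypothetical route
        res.writeHead(200, {
          'Content-Type': 'text/csv',
          'Content-Disposition': 'attachment; filename="1.csv"',
        });
        res.end(buildCsv());
      } else {
        res.writeHead(404);
        res.end('Page not found');
      }
    }).listen(4000);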

Monitoring for changes in folder without continuously running

This question has been asked here several times. Many programs, like Dropbox, use some form of file system API to keep track of changes within a monitored folder as they happen.
As far as I understand, however, this requires some daemon to be online at all times, waiting for callbacks from the file system API. Yet I can shut Dropbox down, update files and folders, and when I launch it again it still knows what changes I made to my folder. How is this possible? Does it exhaustively search the whole tree for updates?
The short answer is YES.
Let's use Google Drive as an example, since its local database is not encrypted, and it's easy to see what's going on.
Basically it keeps a snapshot of the Google Drive folder.
You can browse the snapshot.db (typically under %USERPROFILE%\AppData\Local\Google\Drive\user_default) using DB Browser for SQLite.
A sample from my computer shows that it tracks (among other things):
Last write time (looks like Unix time).
Checksum.
Size, in bytes.
Whenever Google Drive starts up, it queries all the files and folders under your "Google Drive" folder (you can watch this with Procmon) and compares what it finds against the snapshot.
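Here is a minimal sketch of that snapshot-and-rescan technique in Node.js: walk the tree, record each file's modification time and size, and diff the result against the snapshot saved on the previous run. The snapshot.json file and the use of mtime+size are simplifications (a real client, as shown above, also keeps a content checksum), not Google Drive's actual format.

    // Snapshot-based change detection: rescan on startup and diff
    // against the snapshot from the previous run.
    const fs = require('fs');
    const path = require('path');

    function scan(dir, snapshot = {}) {
      for (const name of fs.readdirSync(dir)) {
        const full = path.join(dir, name);
        const stat = fs.statSync(full);
        if (stat.isDirectory()) {
          scan(full, snapshot);
        } else {
          snapshot[full] = { mtimeMs: stat.mtimeMs, size: stat.size };
        }
      }
      return snapshot;
    }

    function diff(oldSnap, newSnap) {
      const changes = [];
      for (const [file, meta] of Object.entries(newSnap)) {
        const prev = oldSnap[file];
        if (!prev) changes.push({ file, change: 'added' });
        else if (prev.mtimeMs !== meta.mtimeMs || prev.size !== meta.size)
          changes.push({ file, change: 'modified' });
      }
      for (const file of Object.keys(oldSnap))
        if (!(file in newSnap)) changes.push({ file, change: 'deleted' });
      return changes;
    }

    // Usage: load the previous snapshot, rescan, report, persist.
    const snapFile = 'snapshot.json'; // stand-in for Drive's snapshot.db
    const oldSnap = fs.existsSync(snapFile)
      ? JSON.parse(fs.readFileSync(snapFile, 'utf8'))
      : {};
    const newSnap = scan(process.argv[2] || '.');
    console.log(diff(oldSnap, newSnap));
    fs.writeFileSync(snapFile, JSON.stringify(newSnap));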
Note that changes can also sync down from the server.
There are also NTFS Change Journals, though I don't think Dropbox or Google Drive use them:
To avoid these disadvantages, the NTFS file system maintains an update sequence number (USN) change journal. When any change is made to a file or directory in a volume, the USN change journal for that volume is updated with a description of the change and the name of the file or directory.

Grails file upload

Hey. I need to upload some files (images/pdf/pp) to my SQLS database and afterwards download them again. I'm not sure what the best solution is: store them as bytes, or store them as files (not sure if that's possible). Later I need to data-bind multiple domain classes together with that file upload.
Any help would be very much appreciated,
JM
Whether to save files in the file system or in the DB is a general question that has been asked here several times.
Check this: Store images (jpg, gif, png) in filesystem or DB?
I recommend saving the files in the file system and storing just the path in the DB.
(If you want to work with Google App Engine, though, you have to save the file as a byte array in the DB, since saving files in the file system is not possible on Google App Engine.)
To upload files with Grails, check this: http://www.grails.org/Controllers+-+File+Uploads
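The "file on disk, path in the DB" pattern is framework-agnostic; the Grails upload mechanics are covered by the link above, so here is a rough sketch of just the storage side in Node.js. The dbInsert callback and the upload directory are hypothetical placeholders for your actual database layer and configuration.

    // Store the uploaded bytes on disk and persist only the path.
    const fs = require('fs');
    const path = require('path');
    const crypto = require('crypto');

    const UPLOAD_DIR = '/var/uploads'; // placeholder; keep it outside the web root

    function storeUpload(originalName, contents, dbInsert) {
      // A random name avoids collisions and path traversal via user input.
      const safeName = crypto.randomBytes(16).toString('hex') + path.extname(originalName);
      const fullPath = path.join(UPLOAD_DIR, safeName);
      fs.writeFileSync(fullPath, contents);
      // The DB row holds the metadata and the path; the bytes stay on disk.
      return dbInsert({ originalName, path: fullPath, size: contents.length });
    }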

Make a folder like Dropbox that connects with a remote location

How can I make a folder that does things? Dropbox clearly knows when a file is put into its folder, and that file gets synced. How can I make a folder that does the same, so that the files I put in it go to my FTP server?
I'm trying to do this on a Mac (and Dropbox certainly works fine on a Mac).
I believe what you are looking for is a way to monitor when files are changed; then you can simply upload the changed file via FTP, as you mentioned. If that is the case, the answer is to hook into the Windows folder and file events. Here is a good article on how to do so:
http://www.codeproject.com/KB/files/MonitorFolderActivity.aspx
The code needed to FTP a file can be found here:
http://msdn.microsoft.com/en-us/library/ms229715.aspx
All of this assumes you will be using C#. If you use a different language, you will need to perform the same basic actions in the same basic manner; only the syntax will differ (see the Node.js sketch below).
To get started, this is all you need: watch the folder for changes to any of the files, and when you see a change, upload the changed file via FTP (if that is your chosen transport) to the remote location. Of course, other clients would need to do the opposite: subscribe to events on your server telling them to download the latest versions of the changed files. Finally, you would apply your own business logic for things like how often uploads should happen, whether to log changes, whether to do file versioning, and so on.
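Since the asker is on a Mac, where the .NET FileSystemWatcher isn't available, here is the same watch-then-upload loop sketched in Node.js. It assumes the third-party basic-ftp package, and the folder path and FTP credentials are placeholders.

    // Watch a folder and push changed files to an FTP server.
    // Requires the third-party "basic-ftp" package: npm install basic-ftp
    const fs = require('fs');
    const path = require('path');
    const ftp = require('basic-ftp');

    const WATCH_DIR = '/Users/me/SyncFolder'; // placeholder path

    fs.watch(WATCH_DIR, async function (eventType, filename) {
      if (!filename) return; // filename can be null on some platforms
      const fullPath = path.join(WATCH_DIR, filename);
      if (!fs.existsSync(fullPath)) return; // file was deleted or renamed away

      const client = new ftp.Client();
      try {
        await client.access({
          host: 'ftp.example.com', // placeholder credentials
          user: 'user',
          password: 'secret',
        });
        await client.uploadFrom(fullPath, filename);
        console.log('uploaded ' + filename + ' (' + eventType + ')');
      } catch (err) {
        console.error('upload failed:', err);
      } finally {
        client.close();
      }
    });

In practice you would also debounce the events (editors often fire several change events per save) and reuse one FTP connection instead of reconnecting for every file.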
Another solution (Windows only, .NET) would be to run a client that monitors the folder with FileSystemWatcher and, when its change event fires, performs whatever action is required to sync with the FTP server.
