Azure Logic App OneDrive trigger not working for large files (>50 MB) - azure-logic-apps

I have an Azure Logic App whose trigger is a OneDrive for Business connector watching for files created in a OneDrive folder. Normally this works fine. However, for large files (narrowed down: a 52,397,814-byte file works, a 52,590,945-byte file doesn't) the trigger never fires and shows as "skipped" in the trigger history.
Has anyone seen anything similar?
Any suggestions on how to proceed?
Any suggestions as to a better place to ask about this?
My current plan is to switch to using a zipped file... but I'm unhappy that there's an unknown upper limit above which file creates are ignored.
Thanks!!!

This is a known limitation:
The When a file is created or When a file is modified triggers will skip every file bigger than 50 MB.
Depending on your requirements, you either need to find another way of signalling that a file has been created, run the Logic App on a schedule and check whether new files have appeared (sketched below), or change the approach altogether.
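For the scheduled-polling route, here is a minimal Python sketch against the Microsoft Graph API; a plain folder listing is not subject to the 50 MB trigger limit. The folder path is hypothetical, token acquisition is a placeholder, and paging/error handling are omitted. Inside a Logic App itself the equivalent would be a Recurrence trigger plus a list-files action.

import requests

GRAPH = "https://graph.microsoft.com/v1.0"
TOKEN = "<access token from MSAL or similar>"   # placeholder, not real auth
FOLDER = "Documents/Inbound"                    # hypothetical folder path

def list_folder(path):
    # List the driveItems in a OneDrive folder via Microsoft Graph.
    url = f"{GRAPH}/me/drive/root:/{path}:/children"
    resp = requests.get(url, headers={"Authorization": f"Bearer {TOKEN}"})
    resp.raise_for_status()
    return resp.json()["value"]

def find_new_files(path, seen_ids):
    # A plain listing sees every file, regardless of size.
    new_items = [i for i in list_folder(path)
                 if "file" in i and i["id"] not in seen_ids]
    for item in new_items:
        print(f"new file: {item['name']} ({item['size']} bytes)")
        seen_ids.add(item["id"])
    return new_items

processed = set()   # persist this between runs (file, database, ...) in practice
find_new_files(FOLDER, processed)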

Related

InDesign real-time package for collaboration

I manage a team of designers working in InDesign.
When we work on a project, it often happens that one designer has to work on another's project. We work with Dropbox for Business.
But when we take over another designer's work, there are often missing links and fonts.
Is there a plugin, or a way to develop one, that would, whenever we create a new indd file (or save an existing one):
Automatically create a "Links" folder and a "Document fonts" folder alongside the indd file
Systematically add any new link or font to the corresponding folder?
To simplify: every action on a font or a link would build a kind of "InDesign Package" in real time.
If this is not feasible, do you have any other solutions to meet this need?
I don't know of a specific script or plugin that does this.
However, it should be possible to write a script with an event handler for the beforeClose event that runs certain script commands every time a user closes a document (or even every time a user adds, changes, or deletes a link). At that point the script could run copyLink commands on all the images and fonts (?), placing them all in the folders next to the document.
The whole script could be made a startup script, so it becomes active any time any user runs InDesign.
(I'm actually not sure whether fonts can be copied so easily. The worst-case scenario would be that the script needs to run a packaging command to gather the fonts somewhere, copy them over to where you need them, and then delete the rest of the temporary package.)
Did you consider Creative Cloud Libraries? They are meant to allow sharing assets within a team. Apart from that, your users would need the same access to the file system (a common drive letter for the network path, for example).
Another solution would be to use a DAM system, so users would link files from the DAM.
Alternatively, you could certainly consider a script, as mdomino suggested.

Monitoring for changes in folder without continuously running

This question has been asked several times. Many programs like Dropbox use some form of file system API interaction to instantaneously keep track of changes that take place within a monitored folder.
As far as I understand, however, this requires some daemon to be online at all times, waiting for callbacks from the file system API. Yet I can shut Dropbox down, update files and folders, and when I launch it again it still knows what changes I made to my folder. How is this possible? Does it exhaustively search the whole tree for updates?
Short answer is YES.
Let's use Google Drive as an example, since its local database is not encrypted, and it's easy to see what's going on.
Basically it keeps a snapshot of the Google Drive folder.
You can browse the snapshot.db (typically under %USERPROFILE%\AppData\Local\Google\Drive\user_default) using DB Browser for SQLite.
Here's a sample from my computer; you can see that it tracks (among other things):
Last write time (looks like Unix time).
Checksum.
Size, in bytes.
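If you'd rather inspect the snapshot programmatically than through DB Browser, a few lines of Python will dump it. Table and column names vary between Drive client versions, so this sketch hard-codes nothing and just prints whatever schema and sample rows it finds:

import os
import sqlite3

db = os.path.expandvars(
    r"%USERPROFILE%\AppData\Local\Google\Drive\user_default\snapshot.db")

con = sqlite3.connect(f"file:{db}?mode=ro", uri=True)   # open read-only
for (table,) in con.execute(
        "SELECT name FROM sqlite_master WHERE type = 'table'"):
    cols = [c[1] for c in con.execute(f"PRAGMA table_info({table})")]
    print(table, cols)
    for row in con.execute(f"SELECT * FROM {table} LIMIT 3"):
        print("  ", row)
con.close()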
Whenever Google Drive starts up, it queries all the files and folders under your "Google Drive" folder (you can see this using Procmon).
Note that changes can also sync down from the server.
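That startup query is essentially a full rescan: walk the tree, record cheap metadata per file, and diff the result against the snapshot saved by the previous run. A minimal sketch of the idea (the folder name is hypothetical; real clients also fall back to checksums when size and mtime are inconclusive):

import json
import os

def scan(root):
    # Record [size, mtime] for every file under root.
    snap = {}
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            st = os.stat(path)
            snap[path] = [st.st_size, st.st_mtime]
    return snap

def diff(old, new):
    created = [p for p in new if p not in old]
    deleted = [p for p in old if p not in new]
    changed = [p for p in new if p in old and new[p] != old[p]]
    return created, deleted, changed

root = os.path.expanduser("~/SyncedFolder")   # hypothetical monitored folder
try:
    with open("snapshot.json") as f:          # snapshot from the previous run
        old = json.load(f)
except FileNotFoundError:
    old = {}
new = scan(root)
print(diff(old, new))
with open("snapshot.json", "w") as f:
    json.dump(new, f)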
There are also NTFS Change Journals, but I don't think Dropbox or Google Drive use them:
To avoid these disadvantages, the NTFS file system maintains an update sequence number (USN) change journal. When any change is made to a file or directory in a volume, the USN change journal for that volume is updated with a description of the change and the name of the file or directory.
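For contrast, the always-online callback approach the question describes looks roughly like this with the third-party watchdog package (pip install watchdog), which wraps the native change-notification APIs (ReadDirectoryChangesW, inotify, FSEvents). Events arrive only while the process is running, which is exactly why a sync client still needs the startup rescan above:

import time
from watchdog.events import FileSystemEventHandler
from watchdog.observers import Observer

class LogChanges(FileSystemEventHandler):
    # Called from the observer thread whenever the OS reports a change.
    def on_created(self, event):
        print("created:", event.src_path)
    def on_modified(self, event):
        print("modified:", event.src_path)
    def on_deleted(self, event):
        print("deleted:", event.src_path)

observer = Observer()
observer.schedule(LogChanges(), path=".", recursive=True)
observer.start()                # callbacks fire only while this daemon runs
try:
    while True:
        time.sleep(1)
finally:
    observer.stop()
    observer.join()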

Neo4j and big log files

I'm trying to use Neo4j in my app, but I have a problem with its big log files. Are they necessary, or is there some way to reduce their number and size?
At the moment I see files like:
nioneo_logical.log.v0
nioneo_logical.log.v1
nioneo_logical.log.v2
etc.
and they are ~26 MB each (over 50% of the neo4j folder).
These files are created whenever the logical logs are rotated.
You can configure rules for them in the server properties file.
See details here: http://docs.neo4j.org/chunked/stable/configuration-logical-logs.html
You can safely remove them (but only the *.v* files) if your database is shut down and in a clean state. Don't remove them while the database is running, because they could be needed for recovery after a crash.
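For the server versions that the (now archived) manual linked above covers, retention is controlled by the keep_logical_logs setting in conf/neo4j.properties. The exact syntax depends on your release, so treat this as a sketch and verify it against the docs for your version:

# conf/neo4j.properties (1.9/2.x-era servers)
keep_logical_logs=100M size
# other accepted forms: true, false, "7 days", "100k txs"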

How to delete files during installation with InstallShield 2010

I'm including some backup files from my database in my installation package, so I can restore it and have a complete running database in one easy step, and it all works fine. But I'd like to delete the database backup files as a step in the installation process, right after the database is restored.
Anyone know how?
You can add some InstallScript to do the job as a custom action. There is a function DeleteFile that does that.
Look at the link to the example at the bottom.
As a complement to Booberry's answer, I suggest you make this a deferred custom action, because you will be making actual changes to the system. Once you do this, you won't be able to access your properties, so you will have to pass the file path via CustomActionData.
Besides this, I also suggest you reconsider the approach of restoring from a database backup: if in the future your application is deployed to international markets, you will run into collation issues (collation may vary depending on the country), causing your customers (and yourself) a headache.
Hope it helps.
The DeleteFile function will not remove read-only, hidden, or system files. You have to remove the read-only attribute first, then call DeleteFile.
If you are running this as a custom action, choose deferred execution in system context mode.

Is there a known good way to keep multiple servers logging for CakePHP

CakePHP stores everything under the /app/tmp/logs folder, so if you have multiple servers, you have to check each server's logs folder to see what is happening.
Is there any solution I can use with CakePHP to centralize logging in one place, with the log files saved and rotated on a daily basis?
Cake allows you to set a parameter in the Controller::log() function.
http://book.cakephp.org/view/159/Using-the-log-function
Basically, when you have an error:
$this->log('some message describing the error', 'allserverslog');
// the second param can also be LOG_ERROR or LOG_DEBUG, two predefined constants that identify the default log files
Some quick research shows that a clean method would be to redefine the TMP constant (by default define('TMP', APP.'tmp'.DS)) in /app/webroot/index.php to point the whole temp directory somewhere else. This is not a good solution if the folder is supposed to be shared, though, since different apps may step on each other's feet with their temp files.
The only apparent way to point only the log directory someplace else seems to be to edit /cake/config/paths.php.
If your goal is only to make it easy to skim through log files of different apps quickly, you could simply put a bunch of symlinks to those logs into one directory.
Or, the other way around, you can make each /app/tmp/logs folder a symlink to some shared folder. I'm not sure I'd recommend that, though; having different apps write to the same log may get confusing, since you may not always be sure which app a message came from.
