Azure Logic Apps large file support

A video update from the Azure Logic Apps team suggested that large file support (for files exceeding the 100 MB limit) was due for release back in August 2017.
https://youtu.be/DSPNHLOVu_A?t=1514
But I haven't seen it mentioned in the documentation for the connectors.
How do I know which connectors support large files? And how do I make use of it? I'd guess it's different from the usual approach of putting the payload in the body of a message (as that's limited to 100 MB).
EDIT:
I struggled to find the release notes at first, but saw that they are actually in the Azure portal now rather than on a blog (which is quite cool).
Here's a deep link: https://ema.hosting.portal.azure.net/ema/1.30101.1.594429775.180105-1338/Html/iframereleasenotes.html?locale=en&trustedAuthority=https://portal.azure.com
I couldn't see it mentioned.
Also this user voice ticket hasn't been closed yet:
https://feedback.azure.com/forums/287593-logic-apps/suggestions/17229566-increase-ftp-connector-limit-above-50mb
Either it hasn't been documented as released (or is just difficult to find), or perhaps, although it was mentioned in the video, it didn't get released in the end?

Judging by this video:
https://youtu.be/qBD_RswoaPg?t=631
it sounds like it has shipped for the Blob Storage to FTP connector and combinations thereof.
But it didn't mention the HTTP connector (which I'd need in order to copy the file down first).
It's not mentioned in the release notes as far as I could tell.
Interestingly, however, in the settings of the HTTP connector there is a chunked transfer option.
But the address it linked to didn't lead to a page discussing chunking, and enabling it doesn't allow me to exceed the 100 MB limit.
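For anyone else digging into this, the chunking toggle appears to correspond to a runtimeConfiguration block on the HTTP action in the workflow definition's code view. Below is a minimal sketch of that fragment; I've written it as a Python dict mirroring the JSON (since I can't reproduce the screenshot here), and the action name and URI are placeholders:

```python
# Sketch of the HTTP action fragment that the chunking setting appears to map to.
# In the Logic Apps code view this is JSON inside the workflow definition;
# it is reproduced here as an equivalent Python dict purely for illustration.
http_action_fragment = {
    "HTTP_Get_Large_File": {                      # hypothetical action name
        "type": "Http",
        "inputs": {
            "method": "GET",
            "uri": "https://example.com/large-file"  # placeholder URI
        },
        "runtimeConfiguration": {
            "contentTransfer": {
                "transferMode": "Chunked"         # what the toggle switches on
            }
        }
    }
}
```

My understanding is that for downloads the remote endpoint also has to honour HTTP Range requests before chunking kicks in, which may be related to why enabling the toggle alone doesn't get me past the 100 MB limit.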
EDIT:
Discussing this in the MSDN forums, there's a suggestion that the HTTP connector not working for files above 100 MB could be a bug: https://social.msdn.microsoft.com/Forums/en-US/741529c7-a5ad-44e0-8839-497fe8548dee/chunked-transfer-for-http-action-not-working?forum=azurelogicapps

Related

What will happen to parse4cn1 when they shut down parse.com?

As some of us have noted, parse.com will be shut down in January 2017. In the current version of parse4cn1, all requests go to https://api.parse.com, using the constant ParseConstants.API_ENDPOINT. Will it be possible to supply a custom URL here, pointing to a different Parse Server? If not, can I build parse4cn1 from source myself? Or should I be looking for an alternative to Parse?
See these related questions:
Will Parse4cn1 still work after Parse server retirement?
parse4cn1 has some issue when working with Node JS and parse-server open source
I now have some time available and intend to make the change you proposed within the coming month or so, and to make a release that is compatible with the open-source Parse Server. If that's too late for you, consider making the changes yourself and contributing back via a GitHub pull request. The Contributing section of the parse4cn1 repo provides useful tips.
Regarding looking at Parse alternatives, that's a decision you'll have to make yourself based on your needs and timeline. My gut feeling is that the Open Source Parse Server will mature and grow in features in the months leading up to the official retirement of Parse.com.

A plea for a basic Notebook example getting data into and out of Google Cloud Datalab

I have started to try to use Google Cloud Datalab. While I understand it is a beta product, I find the docs very frustrating, to say the least.
The questions here, the lack of responses, and the lack of new revisions to the docs over the several months the project has been available make me wonder whether there is any commitment to the product.
A good start would be a notebook that shows data ingestion from external sources into both the Datastore system and the BigQuery system. That is a common use case. I'd like to use my own data; it would be great to have a notebook that ingests it. It seems that should be doable without huge effort, and it would get me (and others) out of this mess of trying to link the various terse docs from various products and workspaces up and working together.
In addition, a better explanation of the GitHub connection process would help (see my prior question).
For BigQuery, see here: https://github.com/GoogleCloudPlatform/datalab/blob/master/content/datalab/tutorials/BigQuery/Importing%20and%20Exporting%20Data.ipynb
For GCS, see here: https://github.com/GoogleCloudPlatform/datalab/blob/master/content/datalab/tutorials/Storage/Storage%20Commands.ipynb
Those are the only two storage options currently supported in Datalab (which, in any event, should not be used for large-scale data transfers; these are for small-scale transfers that can fit in memory on the Datalab VM).
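If it helps, here is a rough sketch of those same two small-scale paths in plain Python, using the google-cloud client libraries as an alternative to the notebook magics the tutorials use; the dataset, table, bucket, and file names are placeholders:

```python
# Minimal sketch: small-scale ingestion into BigQuery and Cloud Storage from a
# notebook. Assumes the google-cloud-bigquery and google-cloud-storage packages
# and default credentials are available; all names below are placeholders.
from google.cloud import bigquery, storage

# --- BigQuery: load a local CSV into a table, then read a few rows back ---
bq_client = bigquery.Client()
job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    autodetect=True,                      # infer the schema from the CSV
)
with open("local_data.csv", "rb") as f:
    bq_client.load_table_from_file(
        f, "my_dataset.my_table", job_config=job_config
    ).result()                            # wait for the load job to finish

for row in bq_client.query(
        "SELECT * FROM `my_dataset.my_table` LIMIT 10").result():
    print(dict(row.items()))

# --- Cloud Storage: upload and download a small file ---
gcs_client = storage.Client()
bucket = gcs_client.bucket("my-bucket")
bucket.blob("incoming/local_data.csv").upload_from_filename("local_data.csv")
bucket.blob("incoming/local_data.csv").download_to_filename("copy_of_data.csv")
```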
For Git support, see https://github.com/GoogleCloudPlatform/datalab/blob/master/content/datalab/intro/Using%20Datalab%20-%20Managing%20Notebooks%20with%20Git.ipynb. Note that this has nothing to do with GitHub, however.
As for the low level of activity recently, that is because we have been heads down getting ready for GCP Next (which happens this coming week). Once that is over we should be able to migrate a number of new features over to Datalab and get a new public release out soon.
Datalab isn't running on your local machine; just the presentation part is in your browser. So if you mean the browser client machine, that wouldn't be a good solution - you'd be moving data from the local machine to the VM that is running the Datalab Python code (and this VM has limited storage space), and then moving it again to the real destination. Instead, you should use the Cloud Console or (preferably) the gcloud command line on your local machine for this.

Publish one product to multiple sites

Is there a way to have one product definition and have it publish to multiple sites? I am looking for this ability specifically in DNN or Umbraco, either with free or paid extensions. I did install both platforms, played with the free extensions, and looked for any extension offering such functionality, but did not find one. Any links or pointers are highly appreciated!
I had looked for this info in many places before reaching out to the expert pool here, hoping to get some hints.
In Umbraco there is the built-in /base extension (http://our.umbraco.org/wiki/reference/umbraco-base), which enables you to access product data that is maintained in Umbraco from other websites. Base is REST-ish, so the implementation is well documented - you can access the data as XML or JSON (Returning Json instead of XML with Umbraco Base).
Also, as the implementation is REST-ish, the other websites that consume the content maintained in the core site could be written in anything that can consume a REST feed, e.g. HTML and JavaScript.
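As a rough illustration of how small the consuming side can be, here is a sketch in Python; the /base path and method name below are made up and depend entirely on how the extension is configured in restExtensions.config on the core site:

```python
# Sketch of another site consuming product data exposed through Umbraco's /base
# extension. The URL, library alias, and method name are hypothetical; they
# depend on how /base is configured on the authoring site.
import requests

PRODUCTS_URL = "http://products.example.com/base/ProductService/GetProducts.aspx"

response = requests.get(PRODUCTS_URL, timeout=10)
response.raise_for_status()

# /base can return XML or JSON depending on how it is set up; JSON assumed here.
for product in response.json():
    print(product.get("name"), product.get("price"))
```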
It's not 100% clear to me what setup you're after, but if you're looking to set up a traditional Authoring/Delivery configuration, one of the few paid offerings Umbraco has is called Courier. It's a very reasonably priced (~$135 USD / 99 EUR) deployment manager that handles syncing content between two sites, i.e., an Authoring and a Delivery server.
It's a very smart tool that manages content, configuration, and dependencies. It's neat and also supports a great open-source project!
If you're looking to set up something more like a centralized product database that is used by many sites, amelvin's pointer to /base is a good one. It has a nice API where you can also set up your own web service (beyond the built-in web service functionality!).
If you need this centralized product data to notify the other sites to update their caches, I encourage you to look into the 'distributedCall' functionality.
There's a bit of documentation on distributed calls in this load-balancing tutorial that may help understand the concept a bit better.
...Hope this helps get you pointed in the right direction.

Google Cloud Endpoints stability?

I am using this link to build a simple chat application using GCM, and I found this great feature "Google Cloud Endpoints" which makes things easier. But I am afraid to depend on it as I noticed it is still experimental. Can I trust it or should I use Java Servlets instead?
It is true that the tag 'experimental' is a bit scary. If you are concerned, you could consider holding back a bit until Google I/O 2013, which is in the middle of May. They often make announcements and introduce new technologies there.
They first announced Endpoints at last year's Google I/O (in July), and if there are any significant changes pending for Endpoints they would likely announce them at this year's.
If you do start using Endpoints, just for Android and without user authentication, I don't think it would be too hard to revert to using a Servlet instead if you had to (e.g. due to a change in terms that was off-putting). The user authentication stuff would be harder to replace, IMO.
As far as I have used Google Cloud Endpoints they work perfectly. Furthermore many interesting features are already implemented, such as integration with Google Eclipse Plugin and testing through the Google APIs Explorer, even in localhost, using the Development Server.
I understand they're still experimental, maybe because they're a new technology that hasn't been thoroughly tested yet and is subject to updates. Anyway, I've not found significant bugs so far, and you should be able to reuse your endpoints with the successive versions that will exist. It doesn't seem to be something that will disappear in the near future...
This is an older question, but for future reference I want to say that my short experience was not so pleasant.
I tried "Mobile Backend App". In the beginning, everything worked fine, but after a few days (without changing anything) I received:
GoogleJsonResponseException 404 Not Found
I saw other posts on Stack Overflow and managed to solve it by creating another project. I changed the code and it still worked. But then I had problems again: I played a bit with the two projects, redeployed, and changed the settings (tips found in other posts), and it worked. Now it is no longer working, no matter what I do.
I hope that the problem is specific to this project, but nevertheless it is frustrating.

appengine - ftp alternatives

I have an appengine app and I need to receive files from Third Parties.
The best option for me would be to receive the files via FTP, but I have read that it is not possible, at least as of a year ago.
Is it still not possible? How else could I receive the files?
This is very important to my project, in fact it is indispensable.
Thx a lot!!!!
You need to use the Blobstore.
Edit: To post to the Blobstore in Java, the code fragment in this SO question should work (it was for Android; elsewhere, use e.g. Apache HttpClient). The URL to post to must have been created with createUploadUrl. The simplest way to communicate it to the source server might be a GET URL, e.g. "/makeupload", which is text/plain and contains only the URL to POST to. To prevent unauthorized uploads, you can require a password either in the POST, or already in the GET (e.g. as a query parameter).
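For completeness, here is a rough sketch of that flow on the Python runtime (the handler paths and the password check are illustrative only, not a drop-in implementation):

```python
# Sketch of the flow described above on the legacy Python runtime:
# GET /makeupload returns a one-off Blobstore upload URL as text/plain,
# and the third party then POSTs its file to that URL.
import webapp2
from google.appengine.ext import blobstore
from google.appengine.ext.webapp import blobstore_handlers

UPLOAD_PASSWORD = "shared-secret"  # illustrative; agree on something stronger


class MakeUploadHandler(webapp2.RequestHandler):
    def get(self):
        # Cheap guard against unauthorized uploads, as suggested above.
        if self.request.get("password") != UPLOAD_PASSWORD:
            self.abort(403)
        self.response.headers["Content-Type"] = "text/plain"
        self.response.write(blobstore.create_upload_url("/upload"))


class UploadHandler(blobstore_handlers.BlobstoreUploadHandler):
    def post(self):
        # get_uploads() returns the BlobInfo records for the uploaded files.
        blob_info = self.get_uploads("file")[0]
        self.response.write("stored as %s" % blob_info.key())


app = webapp2.WSGIApplication([
    ("/makeupload", MakeUploadHandler),
    ("/upload", UploadHandler),
])
```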
The answer depends a lot on the size range of your imports. For small files the URL Fetch API will be sufficient.
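For the small-file case, that can be as little as the following sketch (the URL is a placeholder):

```python
# Sketch: pulling a small file (well within the URL Fetch response limits)
# from a third party. The URL is a placeholder.
from google.appengine.api import urlfetch

result = urlfetch.fetch(
    "https://thirdparty.example.com/exports/data.csv", deadline=60)
if result.status_code == 200:
    payload = result.content  # raw bytes of the downloaded file
```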
I myself tend to import large CSV files ranging from 70–800 MB, in which case the legacy Blobstore and HTTP POST don't cut it. GAE cannot handle HTTP requests >32 MB directly, nor can you upload static files >32 MB for manual import.
Traditionally, I've used a *nix relay for downloading the data files, splitting them into well-formed JSON segments, and then submitting maybe 10-30 K HTTP POST requests back to GAE. This used to be the only viable workaround, and for >1 GB files it might still be the preferred method due to scaling performance (complex import procedures are easily distributed across hundreds of F1 instances).
Luckily, as of April 9 this year (SDK 1.7.7), importing large files directly into GAE isn't much of a problem any longer. Outbound sockets are generally available to all billing-enabled apps, and consequently you can easily solve the "large files" issue by opening an FTP connection and downloading.
Sockets API Overview (Python): https://developers.google.com/appengine/docs/python/sockets/
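With outbound sockets enabled, the standard ftplib module works as usual; a minimal sketch (host, credentials, and filename are placeholders):

```python
# Sketch: downloading a file over FTP from a billing-enabled GAE app via the
# outbound Sockets API. Host, credentials, and filename are placeholders.
import ftplib
import io

def fetch_via_ftp(host, user, password, remote_name):
    buf = io.BytesIO()
    ftp = ftplib.FTP(host)
    try:
        ftp.login(user, password)
        ftp.retrbinary("RETR " + remote_name, buf.write)
    finally:
        ftp.quit()
    return buf.getvalue()

data = fetch_via_ftp("ftp.example.com", "user", "secret", "export.csv")
```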
