Authentication/HTTP headers support in the trigger.io forge.file module?

In the official trigger.io docs there seems to be no provision for custom HTTP headers in the forge.file module. I need this so I can download files that sit behind an HTTP authentication scheme. This seems like an easy thing to add, if support is not already there.
Are there any workarounds? Any chance of a quick fix in the next update? I know I could use forge.request instead, but I'd like to keep a local copy (saveURL).
Thanks

Unfortunately the file module just uses simple "download url" methods rather than a full HTTP request library, which makes adding support for custom headers a fairly big task.
I've added a task to our backlog for this, but I don't have a timeframe for it being added.
Currently on iOS you can do basic auth by using URLs of the form http://user:password@url.com, in case that helps.
To avoid the problem altogether, maybe you can configure your server differently, or put a proxy server in front that allows you to pass the authentication details as GET parameters?
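If you go the proxy route, here is a minimal PHP sketch of what such a proxy could look like (the host, parameter names and use of Basic auth are assumptions for illustration, not part of forge.file); the app would then point forge.file.saveURL at the proxy URL:
<?php
// proxy.php -- hypothetical proxy: fetches a protected file upstream and streams it back,
// so the device can download it without sending custom headers itself.
// Example call: /proxy.php?file=report.pdf&user=alice&pass=secret
$file = basename(isset($_GET['file']) ? $_GET['file'] : ''); // sanitise/whitelist as needed
$user = isset($_GET['user']) ? $_GET['user'] : '';
$pass = isset($_GET['pass']) ? $_GET['pass'] : '';

$ch = curl_init('https://protected.example.com/files/' . rawurlencode($file));
curl_setopt_array($ch, array(
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_HTTPAUTH       => CURLAUTH_BASIC,
    CURLOPT_USERPWD        => $user . ':' . $pass, // forwards the credentials as Basic auth
));
$body   = curl_exec($ch);
$status = curl_getinfo($ch, CURLINFO_HTTP_CODE);
curl_close($ch);

if ($status !== 200) {
    http_response_code(502);
    exit('Upstream returned ' . $status);
}
header('Content-Type: application/octet-stream');
echo $body;
Bear in mind that credentials in query strings end up in server logs, so you'd want HTTPS and some extra care with this approach.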

Related

Websocket with Codename One

Will we be able to use the same websocket code for JS too, or is there any special code needed depending on the platform?
Also, will we be able to extend the URLImage.createToStorage() method to load from our own websocket-based backend rather than from a URL? And will it be possible to make it work seamlessly on all devices?
It is possible to use websockets for the connection from a JavaScript build; see this discussion. Notice that if you need to connect to a server other than the origin, you will need to proxy your request to keep the same-origin behavior.
You can't override the create methods of URLImage as they are all static. You can, however, download the files through a websocket and open them using the EncodedImage.create method that accepts a byte array.

Scrape a website for basic data using AngularJS (Facebook-like link sharing module)

I am trying to build a Facebook-like "link sharing" module, i.e. when anyone writes a link while creating a new post, it automatically shows some basic data from that website, like Facebook does...
I tried simple scraping using $http.get, and it only works if I install a CORS extension in Google Chrome, so the main issue I am facing with this approach is making it work without any plugin...
I also tried adding headers in the config, but still no luck.
$httpProvider.defaults.headers.common = {};
$httpProvider.defaults.headers.post = {};
$httpProvider.defaults.headers.put = {};
$httpProvider.defaults.headers.patch = {};
$httpProvider.defaults.useXDomain = true;
delete $httpProvider.defaults.headers.common['X-Requested-With'];
Please share the best approach for building this feature, or tell me if there is any way I can solve the CORS issue.
Thanks
Zeshan
This is not possible from the browser. The same-origin policy and CORS exist for a reason: to STOP you from accessing HTTP resources on other domains without those domains explicitly allowing you to.
Again: this is not possible due to security restrictions imposed by browsers.
The only way you can accomplish this, and the way Facebook does it, is to move those cross-domain requests to a server, where there are no cross-domain restrictions.
So $http.post('/some-script-on-my-server') where that script does the actual HTTP request for the remote page, scrapes the necessary information and returns it back to the browser.
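As a rough PHP sketch of what that server-side script could look like (the endpoint name and the scraped fields are just placeholders):
<?php
// scrape.php -- hypothetical endpoint the Angular app POSTs a URL to; the cross-domain
// request happens server-side, where there is no CORS restriction.
$input = json_decode(file_get_contents('php://input'), true); // $http.post sends JSON by default
$url   = isset($input['url']) ? $input['url'] : '';

if (!filter_var($url, FILTER_VALIDATE_URL)) {
    http_response_code(400);
    exit(json_encode(array('error' => 'invalid url')));
}

$html = @file_get_contents($url); // fetch the remote page
$doc  = new DOMDocument();
@$doc->loadHTML((string) $html);  // suppress warnings from real-world markup
$xpath = new DOMXPath($doc);

// Pull out the kind of metadata a link preview needs.
$data = array(
    'title'       => $xpath->evaluate('string(//title)'),
    'description' => $xpath->evaluate('string(//meta[@name="description"]/@content)'),
    'image'       => $xpath->evaluate('string(//meta[@property="og:image"]/@content)'),
);

header('Content-Type: application/json');
echo json_encode($data);
On the Angular side you would then do something like $http.post('/scrape.php', {url: link}) and bind the returned title/description/image to your post preview.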
There is a workaround that gives you a browser-only JavaScript solution, without configuring anything on the server (maybe useful in some particular situations), while "avoiding" CORS.
You could use YQL (Yahoo Query Language). You only have to play in their console a little with the URL you need to scrape, then use the query they provide in your website as the URL for your request.
For example (extracted from YQL website):
select * from html where url='http://finance.yahoo.com/q?s=yhoo' and xpath='//div[@id="yfi_headlines"]/div[2]/ul/li/a'
This gets the headlines from Yahoo Finance, and you also get the query URL that you can use in your request:
https://query.yahooapis.com/v1/public/yql?q=select%20*%20from%20html%20where%20url%3D'http%3A%2F%2Ffinance.yahoo.com%2Fq%3Fs%3Dyhoo'%20and%20xpath%3D'%2F%2Fdiv%5B%40id%3D%22yfi_headlines%22%5D%2Fdiv%5B2%5D%2Ful%2Fli%2Fa'&format=json&diagnostics=true&callback=
They have other examples and how to integrate it in their documentation.
You don't need to configure anything on the server side, but of course every request has to go through Yahoo's servers, which isn't at all optimal; performance is directly affected...
As said, in some particular situations (dev, tests, etc.) this can be useful, and it is always interesting to give it a try.

Possible to access Piwik getVisitorLog through HTTP API?

I'm building a reporting tool. Ideally I want to avoid going through the web server logs myself and instead use (some of) the power of Piwik.
The stuff I get from the visitor log would be a good start, this is at http://example.com/piwik/index.php?module=CoreHome&action=index&idSite=1&period=day&date=yesterday#/module=Live&action=getVisitorLog&idSite=1&period=day&date=yesterday
Unfortunately I can't find a getVisitorLog action in the HTTP API docs at
http://developer.piwik.org/api-reference/reporting-api#Actions (and it's not an undocumented feature either; method=Actions.getVisitorLog gives me:
The method 'getVisitorLog' does not exist or is not available in the module '\Piwik\Plugins\Actions\API'.)
Is there another way to get to this? Or should I write a plugin for Piwik?
Apparently it is possible through the Live plugin API:
http://developer.piwik.org/api-reference/reporting-api#Live
This works as desired:
http://example.com/piwik/index.php?module=API&method=Live.getLastVisitsDetails&format=JSON&idSite=1&period=day&date=2015-07-21&expanded=1&token_auth=XXXXX&filter_limit=100
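For completeness, here is a small PHP sketch of consuming that same endpoint from a reporting tool (the host, site id, date and token below are placeholders for your own setup):
<?php
// Fetch the last visit details from the Piwik reporting API and decode the JSON.
$params = http_build_query(array(
    'module'       => 'API',
    'method'       => 'Live.getLastVisitsDetails',
    'format'       => 'JSON',
    'idSite'       => 1,
    'period'       => 'day',
    'date'         => 'yesterday',
    'expanded'     => 1,
    'filter_limit' => 100,
    'token_auth'   => 'XXXXX',
));

$json   = file_get_contents('http://example.com/piwik/index.php?' . $params);
$visits = json_decode($json, true);

foreach ($visits as $visit) {
    // Each entry holds the visitor details plus an actionDetails array with the pages
    // of that visit (see the Live API reference for the exact structure).
    echo $visit['visitIp'], ' - ', count($visit['actionDetails']), " actions\n";
}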

Mobile application backend

I'm currently developing a mobile application that will fetch data from the server on request (page load) or when a notification is received (e.g. GCM).
Currently I'm starting to think about how to build the backend for that app.
I thought about using PHP to handle the HTTP requests to my database (MySQL) and to return the response as JSON. As I see it there are many ways to implement such a server, and I would like to hear thoughts about my ideas for implementations:
1. Create a single PHP page that will receive an Enum/Query, execute it and send the results.
2. Create a PHP page for every query that needs to be made.
Which of my implementations should I use? If none, please suggest another. Thank you.
P.S. This server will only be used as a fetcher for SQL and push notifications. If you have any suggestions or past experience about how to do this (framework, language, anything that comes to mind) I'd be happy to learn.
You can use the PHP REST Data Services framework: https://github.com/chaturadilan/PHP-Data-Services
I am also looking for information about how to power a web and mobile application that has to get and save data on the server.
I've been working with a PHP framework, the Yii Framework, and I know that this framework, and others, give you the possibility to create an API/web service.
APIs can be SOAP or REST; you should read about the differences between the two to see which is best for mobile. I think the main and most important one is that for SOAP you need a SOAP client library on the device you are connecting from, whereas for REST you just make an HTTP request to the URL.
I have built a SOAP API with Yii; it is quite easy, and I have used it to communicate between two websites, to get and put data in the same database.
As for your question about using one file or multiple files for every request: in the case of SOAP built on Yii, you normally define all the functions available to the API on the server side in a single file (a controller), and to connect to that web service you end up doing:
$client = new SoapClient("url/of/webservice");
$result = $client->methodName($param1, $param2 /* , ... */);
So basically what you get is that from your client, you can run any method defined on the server side with the parameters that you wish.
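For reference, the server side in Yii 1.1 is roughly a single controller whose public methods are exposed through the built-in web-service action (a sketch based on the Yii guide linked below; the class and method names are just examples):
<?php
// Sketch of a Yii 1.1 SOAP web service: one controller exposes all API methods.
class ApiController extends CController
{
    public function actions()
    {
        return array(
            // WSDL generation and SOAP handling are provided by Yii's CWebServiceAction.
            'service' => array('class' => 'CWebServiceAction'),
        );
    }

    /**
     * @param string $username
     * @param string $password
     * @return string the requested data encoded as JSON
     * @soap
     */
    public function methodName($username, $password)
    {
        // Validate the credentials, query your models, return plain data.
        return json_encode(array('example' => 'value'));
    }
}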
Assuming that you are used to programming PHP in the "classic" way, I suggest you start learning a framework. There are many reasons to do it, but in the end it is because the code turns out cleaner and more stable; for example:
You shouldn't be writing queries by hand (sometimes you will), but you can use the framework's models to handle data validation and storage in the database.
Here are some links:
http://www.larryullman.com/series/learning-the-yii-framework/
http://www.yiiframework.com/doc/guide/1.1/en/topics.webservice
http://www.yiiframework.com/wiki/175/how-to-create-a-rest-api/
As I said, I am also looking to learn how to better power a mobile application; I know this can be achieved with an API, but I don't know if that is the only way.
"Create a single PHP page that will receive an Enum/Query, execute and send the results."
I created a single PHP file named api.php that does exactly this; the project is named PHP-CRUD-API and it is quite popular.
It should take care of the boring part of the job, and it provides a sort of framework to get started.
Talking about frameworks: you can integrate the script in Laravel, Symfony or SlimPHP to name a few.
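If you would rather hand-roll option 1 than pull in a library, a minimal sketch of such a single endpoint could look like this (the action names, queries and credentials are made up; whitelisting the actions is the important part):
<?php
// api.php -- hypothetical single endpoint: the client sends an action name, the server
// maps it to a whitelisted prepared statement and returns the rows as JSON.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass', array(
    PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
));

// Never execute raw SQL coming from the client; map named actions to fixed queries.
$queries = array(
    'latest_posts' => 'SELECT id, title, created_at FROM posts ORDER BY created_at DESC LIMIT 20',
    'user_profile' => 'SELECT id, name, email FROM users WHERE id = :id',
);

$action = isset($_GET['action']) ? $_GET['action'] : '';
if (!isset($queries[$action])) {
    http_response_code(400);
    exit(json_encode(array('error' => 'unknown action')));
}

$stmt = $pdo->prepare($queries[$action]);
$stmt->execute($action === 'user_profile'
    ? array(':id' => isset($_GET['id']) ? (int) $_GET['id'] : 0)
    : array());

header('Content-Type: application/json');
echo json_encode($stmt->fetchAll(PDO::FETCH_ASSOC));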

Using a subdomain to identify a client

I'm working on building a Silverlight application where we want a client to be able to hit a URL like:
http://{client}.domain.com/
and log in, where the {client} part is their business name. So, for example, Google's would be:
http://google.domain.com/
What I was wondering is whether anyone has been able, in Silverlight, to use this subdomain model to make decisions on the call to the web server, so that you can switch to a specific database to run a query. Unfortunately this is quite necessary for the project, as we are trying to make it easy for our clients' employees to get their company-specific information from our software.
Wouldn't it work to put the service on a specific subdomain itself, such as wcf.example.com, and then set up a cross-domain policy file on the service to allow access to it?
As long as that works, you could just load the Silverlight app on the proper subdomain and then pass that subdomain to your service and let it do its thing.
Some examples of this below:
Silverlight Cross Domain Services
Silverlight Cross Domain Policy Helpers
On the server side you can check the HTTP 1.1 Host header to see how the user came to your server and do the necessary customization based on that.
I think you cannot do this with Silverlight alone; I know you cannot do it without problems with JavaScript, Ajax etc. That is because a subdomain is - for security reasons - treated differently from a sub-page by browsers.
What about the following idea: insert a rewrite rule into your web server software. So if http://google.domain.com is called, the web server itself rewrites the URL to something like http://www.domain.com/google/ (or better: http://www.domain.com/customers/google/). Would that help?
Georgi:
That would help if it were static, but alas, it's all going to be dynamic. My hope was to have a single deployment of the application, and to use the http://google.domain.com/ idea to switch to the correct database for the user. I recall doing this once when we built an ASP.NET website, using the domain context to figure out what skin to use, etc.
Ates: Can you explain more about what you are saying? It sounds like you are close to what I am trying to come up with. Have you seen a tutorial for this?
The only other way I have come up with to make this work is to have a metabase so that, when the user logs in, it switches them to the appropriate database as required... I was also thinking that telling client X to hit
http://ClientX.domain.com/ would have been sweeter than telling them to hit http://www.domain.com/ and log in. Having them hit their own name, and seeing it personalized for them right from the login screen, would have been much more appealing for the client base.
@Richard B: No, I can't think of any such tutorial that I've seen before. I'll try to be more verbose.
The server-side approach in more detail:
Direct *.example.com to the same IP in your DNS settings.
The backend app that handles login checks the Host HTTP header (e.g. the "HTTP_HOST" server variable in some platforms). That would contain the exact subdomain.example.com that the client used for reaching your server. Extract the subdomain part and continue...
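In PHP, for instance (the poster is on ASP.NET/WCF, but the idea is identical there), the extraction is only a couple of lines; the base domain below is a placeholder:
<?php
// Sketch: derive the client/tenant name from the Host header on the server.
// Works because *.example.com all point at the same application (see the DNS step above).
$host = isset($_SERVER['HTTP_HOST']) ? strtolower($_SERVER['HTTP_HOST']) : '';
$host   = preg_replace('/:\d+$/', '', $host);              // drop an optional port
$client = preg_replace('/\.example\.com$/', '', $host);    // strip the fixed base domain

if ($client === $host || $client === 'www' || $client === '') {
    $client = 'default'; // no tenant subdomain: fall back to a default database/skin
}
// Use $client to pick the connection string, theme, etc.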
There can also be a client-side-only approach. I don't know much about Silverlight but I'm assuming that you should be able to interface Silverlight with JavaScript. You could read document.location with JavaScript and pass it to your Silverlight applet, whereon further data fetching etc. logic would rely on the subdomain that was passed in by JavaScript.
@Ates:
That is what we did when we wrote the ASP.NET system... we pointed a slew of *.example.com hosts at the web server and handled it using the HTTP headers. The hold-up comes when dealing with WCF pushing the info between the client and the server... it can only exist on one domain...
So, for example, when you have {client}.example.com and {sandbox}.example.com, the WCF service can't be registered to both. It also cannot be registered to just *.example.com or example.com, so that's where the catch-22 comes in. Everything else I already know how to handle.
I recall a method by which an application can "spoof" another domain name in certain instances. I take it that in this case I would need such a configuration? Much to research yet, I believe.
