Do you really need a server for an AngularJS app?

I want to deploy my AngularJS app, which accesses RESTful web services, onto AWS, and I am wondering if I really need a server to serve my AngularJS files.
I could serve them as static files or use something like NodeJS, but do I really need one?
What are the advantages/disadvantages of using a server in this scenario?

If your app is small and only talks to an API, it's really not a problem.
But if you want to log in via other services where you have, for example, a public and a secret token, it's better to work with a server that keeps and caches this data for your users (maybe that's what your AWS setup is doing).
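To make that concrete, here is a minimal sketch of such a server in Node/Express; the third-party URL, route and token names are placeholders, not a real API:

    // Minimal sketch: the secret token stays on the server, and the AngularJS
    // app only ever talks to /api/profile. The third-party URL and token names
    // below are invented for illustration.
    const express = require('express');
    const https = require('https');

    const app = express();
    const PUBLIC_TOKEN = process.env.PUBLIC_TOKEN;   // safe to expose
    const SECRET_TOKEN = process.env.SECRET_TOKEN;   // never sent to the browser

    app.get('/api/profile', (req, res) => {
      // Proxy the call to the third-party service, adding the secret server-side
      const url = 'https://third-party.example.com/v1/profile'
                + '?public=' + PUBLIC_TOKEN + '&secret=' + SECRET_TOKEN;
      https.get(url, (upstream) => {
        let body = '';
        upstream.on('data', (chunk) => { body += chunk; });
        upstream.on('end', () => res.json(JSON.parse(body)));
      }).on('error', () => res.status(502).end());
    });

    app.listen(3000);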

If you want to access RESTful web services from AWS, you need to put your AngularJS files on a server.
The server will give access to the resources when the request comes over the http protocol; a request made from the file protocol (i.e. opening index.html straight from disk) will be denied.
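Serving the files over http can be as simple as a static file server. Here is a minimal sketch with Node/Express (the "app" folder name is a placeholder for wherever your AngularJS files live); nginx, Apache or S3 would do the same job:

    // Minimal sketch: serve the AngularJS files over http so index.html is
    // loaded from http://localhost:8080/ instead of file://
    const express = require('express');
    const path = require('path');

    const app = express();
    app.use(express.static(path.join(__dirname, 'app')));

    app.listen(8080, () => {
      console.log('AngularJS app available at http://localhost:8080/');
    });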

Related

How to access a database from a hybrid app without crossing domains?

I am developing an app with PhoneGap (Cordova) + Framework7 and I need to connect to a database. The issue is that it's a hybrid app, which means the www files are local and the app creates an internal server, so if you try to use AJAX to run a PHP file it crosses domains, since it will try to reach my web server while running its own server. What can I do?
(I know Cordova has a utility named WebSQL that connects to SQLite, but my database is MySQL, and I think it can only connect to a local db)
(You can't move php to be local because Cordova can't run php files, also it's probably not very secure)
My suggestion is to use Ajax to access your server (to run the PHP file). You can whitelist your server URL in the frontend configuration:
check the Content-Security-Policy meta tag and its connect-src directive in the frontend and add your server URL there. Then you will be able to send Ajax requests to your server.
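As an illustration (the https://example.com origin and the PHP path are placeholders for your own server), the meta tag and the Ajax call look roughly like this:

    // In index.html, the Content-Security-Policy meta tag must list the API
    // origin in connect-src, roughly like:
    //
    //   <meta http-equiv="Content-Security-Policy"
    //         content="default-src 'self'; connect-src 'self' https://example.com">
    //
    // With that in place, a plain Ajax request from the hybrid app can reach
    // the PHP endpoint on your web server:
    var xhr = new XMLHttpRequest();
    xhr.open('GET', 'https://example.com/api/get-data.php'); // placeholder URL
    xhr.onload = function () {
      if (xhr.status === 200) {
        var rows = JSON.parse(xhr.responseText); // whatever the PHP script echoes
        console.log('Rows from MySQL:', rows);
      }
    };
    xhr.onerror = function () {
      console.error('Request failed (check the CSP and the CORS headers sent by PHP)');
    };
    xhr.send();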
Hope this helps.

How to prevent JSON data from being tampered with in a REST request?

The following is the architecture of my Web application.
Web UI (AngularJS) running on nginx
Back-end data access layer (Java app) running on a GlassFish app server
My question is, how can I prevent a valid user from tampering with or manipulating the REST service JSON request using some proxy tool?
One thing that I thought of was to encrypt the JSON, but this would still expose the public key and the source code showing how to encrypt it, since it's done with client-side scripting. Is there a better way of doing a secured JSON request?
P.S.: I'm not talking about a "man in the middle" attack, and this is not related to session hijacking. This is about a valid session user tampering with the POST request using tampering tools.
You can't.
Anything that runs on the client side is exposed, and almost everything there can be tampered with.
So your best bet is to have strong server-side validation before you process any data from the client.
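As a rough illustration of that last point (written here in Node/Express style purely for brevity; the same idea applies to the Java backend, and the route, fields and lookups are invented), the server re-checks everything it cares about instead of trusting the JSON as sent:

    // Sketch of server-side validation: never trust prices, roles or IDs coming
    // from the client; re-derive or re-check them on the server.
    // Assumes a session middleware is configured; findProductById is a
    // hypothetical database lookup.
    const express = require('express');
    const app = express();
    app.use(express.json());

    app.post('/api/orders', (req, res) => {
      const { productId, quantity } = req.body;

      // 1. Validate types and ranges of the incoming fields
      if (!Number.isInteger(quantity) || quantity < 1 || quantity > 100) {
        return res.status(400).json({ error: 'invalid quantity' });
      }

      // 2. Ignore any client-supplied price; look it up server-side
      const product = findProductById(productId);
      if (!product) {
        return res.status(400).json({ error: 'unknown product' });
      }

      // 3. Authorize using the server-side session, not fields in the JSON body
      if (!req.session || !req.session.userId) {
        return res.status(401).json({ error: 'not authenticated' });
      }

      const total = product.price * quantity; // computed on the server
      res.json({ ok: true, total: total });
    });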

Save data from the client side without a backend

I have a very simple webpage created in Angular, with no backend. Now I would like to store some very simple user-statistics data in some way, without involving a backend: a file, a database or some other thing that I can access from the client side.
I had a look at MongoDB, which looks very cool; I can access it via a REST API, which is perfect. The only problem is that the API is hosted on https, which my domain is not. That means I cannot connect to the API because of a CORS error, and I would like to avoid buying an SSL certificate.
So do I have any options here? Storing data from the client side without a backend and without SSL?
Thanks!
The solution was actually to go with MongoDB, and add a free SSL certificate from https://letsencrypt.org/
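For completeness, the client-side call is then just an ordinary AngularJS $http request to the hosted API over https; the URL, API key and document shape below are placeholders, not the real MongoDB REST endpoint:

    // Sketch: sending a simple statistics record straight from the AngularJS
    // client to a hosted REST API over https. Everything in the URL is a
    // placeholder for whatever the hosted service actually expects.
    angular.module('statsApp', [])
      .controller('StatsCtrl', ['$http', function ($http) {
        var stat = { page: 'home', visitedAt: new Date().toISOString() };

        $http.post('https://api.example.com/databases/mydb/collections/visits?apiKey=PLACEHOLDER', stat)
          .then(function () {
            console.log('Statistic stored');
          })
          .catch(function (err) {
            console.error('Could not store statistic', err);
          });
      }]);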

How to deploy angular app that completely relies on external API to retrieve and store data?

For an Angular app that completely relies on an external API to retrieve and store data, is NodeJS necessary for deployment? What are the other possible methods of deployment? Currently, I use it for local development and plan on using it in combination with Nginx for production. However, NodeJS is not doing anything except serving index.html, so should I remove NodeJS altogether and simply use Nginx alone?
One solution is to host it with any ordinary web host; they can all host HTML-only sites, and this option is pretty inexpensive. Hosts like HostGator, web.com, etc. will let you upload the site via FTP.
A second choice is to host it on your own web server (Nginx), but this is probably the most costly. You can run a server in any cloud service (on Amazon that would be EC2, for instance) and serve your files from there. This is probably not a good option for you; the only reason to use this type of solution is if you need the server to run code, for example if you were using Node to talk to a database.
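If you do go the Nginx route, a static Angular build doesn't need Node at all; a minimal server block looks roughly like this (the domain and root path are placeholders):

    # Minimal sketch of an nginx server block for a static AngularJS build.
    # /var/www/my-app is a placeholder for wherever the built files live.
    server {
        listen 80;
        server_name example.com;

        root /var/www/my-app;
        index index.html;

        # Send unknown paths back to index.html so Angular's client-side
        # routing can handle them; API calls go straight to the external API.
        location / {
            try_files $uri $uri/ /index.html;
        }
    }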
A 'pro' option is to put the files in S3 on AWS and host the site that way; it is pretty inexpensive.
Here are the AWS docs explaining how to host a static website on S3:
http://docs.aws.amazon.com/AmazonS3/latest/dev/WebsiteHosting.html
http://docs.aws.amazon.com/AmazonS3/latest/dev/website-hosting-custom-domain-walkthrough.html

Use Symfony to authenticate users for external service

I've been googling the entire afternoon and I'm still not able to figure out the best way to implement the following:
We have built a web app in AngularJS that interacts with a REST API built using Symfony. The app allows users to register, log in and do stuff. Now, these users need to upload very big files (>60GB) into their personal folders. A separate VM has been set up for this purpose (the data server), located in the same VLAN as the frontend, the backend and the MySQL db serving the data. The upload will be done using either HTTP (the jQuery File Upload plugin) or an FTP client.
I'd like the users to authenticate on the data server (both via FTP and HTTP) using the credentials they already have for the app. For the FTP case, I'll use PureFTP as the FTP server, which can validate user/pass directly against MySQL. As far as I know, this is the most convenient solution, but criticism is accepted.
For the HTTP upload, we could proceed in a similar way: POST user/pass, validate against DB and return true/false. Since all the communication will happen within the VLAN, security issues are less problematic. Nonetheless, I believe much more sophisticated solutions have already been developed.
My first thought was to build an OAuth server on Symfony and then authenticate the uploader (and future services) with their respective clients. Is this the right approach, or is it an overly complicated solution?
Alternatively, a service on the data server could validate the user's credentials sent by the client against the REST API, receive a JWT and create a new session for that particular client to list and update files in a particular folder. I'm not sure how to build this middleware though: do I need another Symfony instance, or will a simple PHP script do the trick?
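For concreteness, the flow I have in mind looks roughly like the sketch below, written in Node/Express style only to illustrate it (the real service would probably be PHP; the /api/login_check path, field names and session handling are all invented for illustration):

    // Rough sketch of the data-server middleware: forward the credentials to
    // the existing Symfony REST API, keep the returned JWT in a server-side
    // session, and require that session for later file operations.
    // Requires Node 18+ for the built-in fetch.
    const express = require('express');
    const session = require('express-session');

    const app = express();
    app.use(express.json());
    app.use(session({ secret: 'change-me', resave: false, saveUninitialized: false }));

    app.post('/upload-login', async (req, res) => {
      // 1. Validate the credentials against the existing REST API
      const apiRes = await fetch('https://backend.internal/api/login_check', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({ username: req.body.username, password: req.body.password })
      });
      if (!apiRes.ok) {
        return res.status(401).json({ error: 'invalid credentials' });
      }

      // 2. Keep the JWT in a session on the data server
      const { token } = await apiRes.json();
      req.session.jwt = token;
      req.session.username = req.body.username;
      res.json({ ok: true });
    });

    // 3. Later requests (list/update files) are only allowed with a session
    app.post('/upload', (req, res) => {
      if (!req.session.jwt) {
        return res.status(401).end();
      }
      // ...store the uploaded chunk under the user's personal folder...
      res.json({ ok: true });
    });

    app.listen(8443);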
Please do not hesitate to share any thought you have on this. Any point of view will be much appreciated.
Thanks a lot
