I would like to buy a mobile service, but I don't understand what Microsoft means by "API" and by "Unit" in its pricing.
Are Units the number of people who have the app on their mobile device?
Here is the calculator: http://azure.microsoft.com/en-us/pricing/calculator/?scenario=mobile
In my case I have a company application that will probably be distributed through our company hub to about 20 people.
We have also developed a desktop application that accesses the Azure database directly, like a normal network connection to a SQL Server.
The database contains about thirty tables, and the mobile service allows access (insert, update, delete) to each table.
I don't understand whether I need a Basic account, a Premium account, or whether a free account is enough for now.
How can I choose the best configuration for my mobile app?
Can I try Azure for free, without any risk of being charged? And if the free trial isn't enough, can I upgrade and pay for what I need?
Azure Mobile Services' tiers are based on API usage. Any call into your Mobile Service (insert, update, delete, read, login, custom API) counts as an API call, so you need an idea of how many API calls you will make in a month to estimate what Mobile Services would cost you. As an example, say each of your 20 users makes 10,000 API calls a day (probably an extreme estimate): 20 users * 10,000 API calls * 30 days in a month = 6,000,000 API calls. For that, you would need either 4 Basic units or 1 Standard unit. Units are purchased when you need additional API calls, with each Basic unit covering 1.5M calls and each Standard unit 15M calls per month.
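To make that arithmetic easy to repeat with your own numbers, here is a small back-of-the-envelope script; the unit sizes are the ones quoted above, and the per-user call volume is just an assumed figure:

    import math

    # Assumed usage profile -- replace with your own estimates.
    users = 20
    calls_per_user_per_day = 10_000
    days_per_month = 30

    monthly_calls = users * calls_per_user_per_day * days_per_month  # 6,000,000

    # Capacity per unit per month, from the tier figures above.
    BASIC_UNIT_CALLS = 1_500_000
    STANDARD_UNIT_CALLS = 15_000_000

    print(f"{monthly_calls:,} calls/month -> "
          f"{math.ceil(monthly_calls / BASIC_UNIT_CALLS)} Basic unit(s) or "
          f"{math.ceil(monthly_calls / STANDARD_UNIT_CALLS)} Standard unit(s)")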
Mobile Services has a free tier (500k API calls across the whole subscription, not per individual service), so you can try it out free of charge. Additionally, if you haven't used it already, you get a free 20 MB database that you can use as the datastore for your Mobile Service. Staying in Free mode once your app is in production isn't advised, but you should have no issues trying it out and then scaling up (scaling is very simple from the Scaling tab in your Mobile Service) once you're ready.
I have spent 3 days researching this problem and cannot find a solution or similar use case that shows how to solve the problem, so any pointers would be greatly appreciated.
I am creating a web app that uses Google Cloud Storage and BigQuery. A user registers on the web app and can then upload data to Cloud Storage and BigQuery. Two users could be from the same company and should therefore be able to view the same data: e.g. Jack and Jill work for Company A, and if Jack uploads a massive dataset via this app, Jill should also be able to view it later.
Another scenario: I have two completely separate clients with users on this web app. If users from Company A upload data, users from Company B should not be able to view Company A's data, and vice versa. But users from the same company should be able to view the data within their company.
Currently, I have an app that works for a single company. It has a React front-end that uses Firebase for authentication. Once the user is logged in, they can use the app, which sends API calls to a Flask back-end; the back-end does some error and authentication checking and then fires off an API call to GCP. This uses a service account whose key is loaded as an environment variable in the environment where the Flask app runs.
However, if Company B wants to use the app now, both Company A and Company B will be able to see each other's data and visualize it through the app. In addition, they will be sharing a project (I would like to change this so that each client has their own project, to allocate billing more easily).
I ultimately want to get this app onto Kubernetes and ensure that each company is independent of the others; however, I do not want separate URLs for every company using the app. I also want to abstract GCP away from the client. I would prefer to authenticate a user based on their login credentials and then give them access to their GCP project (via my front-end) accordingly.
I thought about having a separate service key for each client and storing the service key info in Firebase, using the respective key for API calls, but I'm not sure this is best practice. It is, however, the only strategy I can think of.
If anyone could provide some help or guidance it would be very much appreciated. This is my first GCP project and I have not been able to find any answers on GCP, SO, Google Groups, Slack or Medium.
Thanks,
TJ
First of all, welcome to GCP! It's an awesome platform, very powerful and flexible. But not magic.
Indeed, the use case you describe is specific to your business logic. GCP provides tools for securing access for users and VMs (through service accounts), but not per customer. Here you have to implement your own custom authentication and authorization logic, with a database (I don't recommend BigQuery for a website, the latency is too high) that lists the users, the company each one works for, the blobs that belong to each company, and so on.
Nothing is magic, and your use case is specific to you.
If you want to discuss which components to use and how to start, no problem. Leave a comment.
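As a rough illustration of that custom authorization layer, here is a minimal sketch assuming a Flask back-end, Firebase Authentication, and a Firestore "users" collection that maps each uid to a company_id; the bucket name and collection layout are hypothetical:

    import firebase_admin
    from firebase_admin import auth, firestore
    from flask import Flask, request, abort, jsonify
    from google.cloud import storage

    firebase_admin.initialize_app()   # uses the service account configured in the environment
    db = firestore.client()
    gcs = storage.Client()
    app = Flask(__name__)

    def company_for_request():
        """Verify the Firebase ID token and return the caller's company_id."""
        token = request.headers.get("Authorization", "").replace("Bearer ", "")
        try:
            decoded = auth.verify_id_token(token)
        except Exception:
            abort(401)
        user_doc = db.collection("users").document(decoded["uid"]).get()
        if not user_doc.exists:
            abort(403)
        return user_doc.get("company_id")

    @app.route("/datasets")
    def list_datasets():
        company_id = company_for_request()
        # Each company's uploads live under their own prefix, so users from
        # Company B can never list Company A's objects.
        blobs = gcs.list_blobs("my-app-uploads", prefix=f"{company_id}/")
        return jsonify([b.name for b in blobs])

The same company_id lookup can drive which BigQuery dataset a request may query, or which per-company GCP project your back-end talks to.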
Let me start with a bit of background: I'm helping a non-profit organization that would like to have a browser-based application that is backed by Salesforce, but has very specific requirements.
I see Salesforce has a REST API that we can call, so we can develop a standalone application to serve the web pages they want and use the REST API to call Salesforce when needed.
I'm wondering if there is a way to host a web application directly on Salesforce; this way we don't have to have a separate application server. Any recommendations or pointers to documentation/open source products are greatly appreciated.
Yes, you can create services that will allow your app to hit Salesforce.
Depending on the type of application, yes, you can host it on Salesforce using the Salesforce Sites feature. You can also develop and host your app on Heroku, which is owned by Salesforce and can sync data to and from Salesforce using Heroku Connect, or you can build and host it on another service like AWS and connect via the REST API. You just need to investigate and choose the option that best fits your use case. One thing to be aware of is that there are API limits (the number of calls you can make to Salesforce in a rolling 24-hour period). Depending on the needs of the app, check whether those limits will be an issue; an app that makes constant calls to Salesforce could hit them. There are things you can do to get around that, like caching.
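If you go the external-app route, the REST calls themselves are straightforward; here is a rough sketch with a naive time-based cache to stay under the rolling 24-hour API limit (the instance URL, API version, and token handling are placeholders):

    import time
    import requests

    INSTANCE_URL = "https://yourInstance.my.salesforce.com"   # placeholder
    ACCESS_TOKEN = "..."      # obtain via OAuth; omitted here
    CACHE_TTL = 300           # seconds to reuse a result instead of calling Salesforce again

    _cache = {}  # SOQL string -> (timestamp, result)

    def soql_query(soql):
        """Run a SOQL query, reusing a cached result while it is still fresh."""
        now = time.time()
        if soql in _cache and now - _cache[soql][0] < CACHE_TTL:
            return _cache[soql][1]
        resp = requests.get(
            f"{INSTANCE_URL}/services/data/v57.0/query",
            headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
            params={"q": soql},
        )
        resp.raise_for_status()
        result = resp.json()
        _cache[soql] = (now, result)
        return result

    contacts = soql_query("SELECT Id, Name FROM Contact LIMIT 10")["records"]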
Yes, both Force.com Sites and Site.com features allow you to host webpages on the Force.com Platform. The markup is stored in Visualforce Pages and can use Apex to access records in the Database. I have migrated multiple websites (including our company's www.mkpartners.com) to Force.com using Force.com Sites.
One thing to keep in mind is that you are limited to 500,000 views per month and the rendering of a page with images that are also stored on the platform will incur a single view for the page and a single view for each image. If you already have a very popular website, I wouldn't migrate. If you're a small business or nonprofit, then it should be fine.
Another thing to keep in mind is that dynamic functionality based on records in the database will not work during maintenance windows. There is the ability to upload a static version of your website to be rendered during these windows though.
I have a MySQL database set up and a mobile app that should be able to read from and write to the database.
The data being stored will be posts, comments, votes, etc.
Eventually, I might create a website that uses the same database.
My question is, do I need some sort of middleman (RESTful) service running, or can I just connect straight to the MySQL database from the client code (either the mobile app or the website)?
Introducing a REST API in the middle would be beneficial in a number of ways (a minimal sketch follows these points):
Improves generalization and reuse (the REST API can be used by both the mobile client and the web client, so there's no need to do the same work twice).
Business logic can be maintained centrally (if there's a logic change or a bug fix, there's no need to correct it in two places).
It can easily be exposed to any other app/client that needs the set of operations the API provides.
Extending and maintaining the app becomes much simpler and takes minimal effort.
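A minimal sketch of such a middle layer, assuming Flask and the mysql-connector-python driver (the table and column names are made up for illustration):

    from flask import Flask, request, jsonify
    import mysql.connector

    app = Flask(__name__)

    def get_db():
        # Connection details are placeholders.
        return mysql.connector.connect(
            host="db.example.com", user="app", password="secret", database="forum"
        )

    @app.route("/posts", methods=["GET"])
    def list_posts():
        db = get_db()
        cur = db.cursor(dictionary=True)
        cur.execute("SELECT id, title, body, votes FROM posts ORDER BY id DESC")
        rows = cur.fetchall()
        db.close()
        return jsonify(rows)

    @app.route("/posts", methods=["POST"])
    def create_post():
        payload = request.get_json()
        db = get_db()
        cur = db.cursor()
        cur.execute(
            "INSERT INTO posts (title, body) VALUES (%s, %s)",
            (payload["title"], payload["body"]),
        )
        db.commit()
        db.close()
        return jsonify({"id": cur.lastrowid}), 201

Both the mobile app and the website call these endpoints over HTTPS instead of opening database connections themselves, which also keeps the MySQL credentials off the clients.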
Especially with the mobile application, where you have much less control over updates, it seems better to use some middleware to connect to your database.
Say, for instance, your website needs a little change in the database that would break an active version of the mobile application. A web service could catch that for you.
What about a new version of your mobile app that needs a change? Again, a web service can handle that for you.
This is all about cutting dependencies and keeping the complete ecosystem adaptable.
Whether this is REST or any other type of web service is a completely different discussion.
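To illustrate the point about catching breaking changes, a versioned endpoint can keep old mobile clients working after a schema change; this is a hypothetical sketch in the same Flask style as above, not a prescription:

    from flask import Flask, jsonify

    app = Flask(__name__)

    def fetch_post(post_id):
        # Stand-in for the real database lookup; imagine the column was
        # renamed from `body` to `content` in the new schema.
        return {"id": post_id, "content": "hello"}

    @app.route("/v1/posts/<int:post_id>")
    def get_post_v1(post_id):
        # Old mobile clients still expect `body`, so translate here and they keep working.
        post = fetch_post(post_id)
        return jsonify({"id": post["id"], "body": post["content"]})

    @app.route("/v2/posts/<int:post_id>")
    def get_post_v2(post_id):
        # New clients (e.g. the website) use the new field name directly.
        return jsonify(fetch_post(post_id))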
As some background, my company is currently using an Apple Calendar server, some Exchange servers and a Google Apps subscription to provide calendaring for different parts of the organization. I've been tasked with providing free/busy access across these services while we try to take at least one of the services out of the equation.
I've attempted to use Google Interop, but it does not work with Exchange 2013 due to Microsoft eliminating Exchange Public Folder Databases in that release. I've also set up an IIS WebDAV server to attempt to share calendars, but this has shortcomings as well because only one person is able to moderate the calendar, and f/b data can't be queried in the Apple Calendar app - you have to subscribe to a separate f/b calendar.
Are there any suggestions as to how I should proceed?
If you can write your own connectors for each service (i.e. using proprietary APIs), you could then expose that information through a custom CalDAV service.
For example, you could use http://milton.io (Java) or http://sabre.io/ (PHP); both allow pulling data from arbitrary data sources.
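The connector idea can be as small as one adapter per backend feeding a common free/busy aggregator that the CalDAV layer serves; a rough Python sketch of the shape (the backend classes are placeholders, not real client code):

    from dataclasses import dataclass
    from datetime import datetime
    from typing import List, Protocol

    @dataclass
    class BusyBlock:
        start: datetime
        end: datetime

    class FreeBusyConnector(Protocol):
        def busy(self, email: str, start: datetime, end: datetime) -> List[BusyBlock]: ...

    class ExchangeConnector:
        def busy(self, email, start, end):
            # Placeholder: would query Exchange (e.g. via EWS availability calls) here.
            return []

    class GoogleConnector:
        def busy(self, email, start, end):
            # Placeholder: would query the Google Calendar free/busy endpoint here.
            return []

    def aggregate_busy(connectors, email, start, end):
        """Merge busy blocks from every backend into one list for the CalDAV layer."""
        blocks = []
        for c in connectors:
            blocks.extend(c.busy(email, start, end))
        return sorted(blocks, key=lambda b: b.start)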
I've been reviewing the Windows Azure platform for some time and can't find the answer to one very important question.
If I deploy my application in the cloud, how will it be reached from different places worldwide?
For example, if I have a web application with a database and want it to be accessible to users in the UK, US, China, etc., can I be sure that any user in the world will get almost the same request processing time?
I think of it this way.
1. User sends request (navigates in browser to my web site)
2. The request gets into the cloud at the nearest location (the MS data center closest to the user?)
3. It is processed by an instance of my web application (in the nearest location), with a request to my centralized DB, which may be far away, but the SQL request goes over the MS internal network, which I believe should be very fast.
4. The response is sent to the user.
Please let me know if I'm wrong.
Thanks.
Unless you take steps to run your application in different data centers around the world, it will typically run in a single data center. So if you run in, for example, North Central US (Chicago), then a user in Shanghai, China would connect over the Internet links across the Pacific and then hit your servers in Chicago. This is similar to the process for a traditional web server; however, you don't need to maintain the web server, there's astonishingly good fault tolerance, and a blazingly fast connection into the Chicago data center. There is a content delivery network (CDN) in Azure, but currently it's only used for blob storage. So if you are distributing images and videos from Azure, they will end up cached closer to the user, but the Azure CDN doesn't help with the HTML pages from your web roles.
Note: at this time, data transfers in and out of Azure in Asia are three times the price of Azure data transfers elsewhere.
ref: http://www.microsoft.com/windowsazure/pricing/
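If you want a feel for what the single-region setup means for users in different places, a simple timing probe run from each location gives a rough idea (the URL is a placeholder for your deployment):

    import time
    import requests

    URL = "https://myapp.cloudapp.net/"   # placeholder for your deployed site

    samples = []
    for _ in range(5):
        t0 = time.perf_counter()
        requests.get(URL, timeout=10)
        samples.append((time.perf_counter() - t0) * 1000)

    print(f"median round trip: {sorted(samples)[len(samples) // 2]:.0f} ms")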
I got an answer on a different forum. The answer is no.