Windows Azure Worldwide availability - request

I've been reviewing the Windows Azure platform for some time, and I can't find an answer to one very important question.
If I deploy my application in the cloud, how will it be reached from different places worldwide?
For example, if I have a web application with a database and want it to be accessible to users in the UK, US, China, etc., can I be sure that every user in the world will see roughly the same request processing time?
I think of it this way.
1. The user sends a request (navigates in a browser to my web site).
2. The request enters the cloud at the nearest location (the MS data center closest to the user?).
3. It is processed by an instance of my web application there (with a request to my centralized DB, which can be far away, but the SQL request goes over MS's internal network, which I believe should be very fast).
4. The response is sent back to the user.
Please let me know if I'm wrong.
Thanks.

Unless you take steps to run your application in different data centers around the world, it will typically run in a single data center. So if you, for example, run in North Central US (Chicago), then a user in Shanghai, China would connect over Internet links across the Pacific and then hit your servers in Chicago. This is similar to the process for a traditional web server. However, you don't need to maintain the web server, there's astonishingly good fault tolerance, and a blazingly fast connection into the Chicago data center.

There is a content delivery network (CDN) in Azure, but currently it's only used for blob storage. So if you are distributing images and videos from Azure, they will end up cached closer to the user, but the Azure CDN doesn't help with the HTML pages from your web roles.
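To make the blob-versus-CDN distinction concrete, here is a rough sketch of what the two URLs look like (the storage account and CDN endpoint names are made up; only the URL shapes matter):

```typescript
// Hypothetical storage account "myaccount" and CDN endpoint "az1234", for illustration.
// Served straight from blob storage, every request goes back to the single data center:
const originUrl = "http://myaccount.blob.core.windows.net/media/intro.mp4";

// Served through a CDN endpoint mapped to that account, the same blob gets cached at
// edge nodes closer to the user. Only blob content benefits; the HTML from your web
// roles still comes from the one data center:
const cdnUrl = "http://az1234.vo.msecnd.net/media/intro.mp4";
```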

Note: at this time, data transfers in and out of Azure in Asia cost three times as much as they do elsewhere.
ref: http://www.microsoft.com/windowsazure/pricing/

I got an answer on a different forum. The answer is no.

Related

Enable an asp.net core web application to work without internet and with internet?

I have been developing an ASP.NET Core web application and have published it in production mode (to an online server). Users access it with a specific domain name, log in, and do data entry from three different countries.
But the problem is that sometimes, in one specific country, there is no internet access. My client wants this application to work both online and offline: if there is no internet access, the local branch must still be able to do data entry, and when the internet connection is restored the data should be sent to the online server database.
What is the best way to achieve this goal?
Please share your view or add a good forum link below.
Rationally, it is not possible for you to access a web app without internet; web apps are meant for network usage. However, I believe there is a workaround for such requirements. You can create a clone of your database for the branch that has no internet access and perform all transactions on the local machine; when the connection comes back online, you can replicate the data from the local SQL Server into the online server database.
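As a lighter-weight variant of that store-and-forward idea, the queuing can also happen in the browser itself rather than in a local SQL Server clone: keep entries locally while offline and replay them when connectivity returns. A rough sketch (the /api/entries endpoint and the entry shape are invented for illustration):

```typescript
// Browser-side sketch of "queue locally, push when the connection returns".
type Entry = { id: string; payload: unknown; createdAt: string };

const QUEUE_KEY = "pendingEntries";

function loadQueue(): Entry[] {
  return JSON.parse(localStorage.getItem(QUEUE_KEY) ?? "[]");
}

function saveQueue(queue: Entry[]): void {
  localStorage.setItem(QUEUE_KEY, JSON.stringify(queue));
}

// Called by the data-entry form: try the server first, fall back to the local queue.
async function submitEntry(entry: Entry): Promise<void> {
  if (!navigator.onLine) {
    saveQueue([...loadQueue(), entry]);
    return;
  }
  try {
    await fetch("/api/entries", {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify(entry),
    });
  } catch {
    saveQueue([...loadQueue(), entry]); // network failed mid-request
  }
}

// When connectivity returns, replay whatever piled up while offline.
window.addEventListener("online", async () => {
  const pending = loadQueue();
  saveQueue([]);
  for (const entry of pending) {
    await submitEntry(entry);
  }
});
```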
And then there is something called Progressive Web Apps (PWAs), which give you the capabilities below:
Reliable - Load instantly and never show the downasaur, even in uncertain network conditions.
Fast - Respond quickly to user interactions with silky smooth animations and no janky scrolling.
Engaging - Feel like a natural app on the device, with an immersive user experience.
For more on what Progressive Web Applications are, Google has more to discuss here.
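The "Reliable" part of that list usually comes from a service worker that caches the app shell, so the data-entry pages still open offline. A minimal sketch (file names and the cache name are placeholders, not anything your app already has):

```typescript
// sw.ts - cache the app shell at install time, serve it cache-first afterwards.
const CACHE = "app-shell-v1";
const SHELL = ["/", "/index.html", "/app.css", "/app.js"];

self.addEventListener("install", (event: any) => {
  event.waitUntil(caches.open(CACHE).then((cache) => cache.addAll(SHELL)));
});

self.addEventListener("fetch", (event: any) => {
  // Answer from the cache when possible, fall back to the network when online.
  event.respondWith(
    caches.match(event.request).then((hit) => hit ?? fetch(event.request))
  );
});
```

The page then registers it once at startup:

```typescript
if ("serviceWorker" in navigator) {
  navigator.serviceWorker.register("/sw.js");
}
```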

How do I authenticate users of a web-app to access GCP data relevant only to them?

I have spent 3 days researching this problem and cannot find a solution or similar use case that shows how to solve the problem, so any pointers would be greatly appreciated.
I am creating a web app that uses Google Cloud Storage and BigQuery. A user registers on the web app and can then upload data to Cloud Storage and BigQuery. Two users could be from the same company and therefore should be able to view the same data - i.e. Jack and Jill work for company A, and if Jack uploads a massive dataset via this app, Jill should also be able to view it later.
Another scenario: I have two completely separate clients whose users use this web app. If users from Company A upload data, users from Company B should not be able to view Company A's data, and vice versa. But users from the same company should be able to view the data within their company.
Currently, I have an app that works for a single company. This has a React front-end that uses Firebase for authentication. Once the user is logged in, they can use the app which sends off API calls to a Flask back-end that does some error checking and authentication checking and then fires off an API call to GCP. This uses a service account and the key is loaded as an environment variable in the environment in which the Flask app is running.
However, if Company B wants to use the app now, Company A and Company B will be able to see each other's data and visualize it through the app. In addition, they will be sharing a project (I would like to change this so that each client has their own project, which makes allocating billing easier).
I ultimately want to get this app onto Kubernetes and ensure that each company is independent of the others; however, I do not want to have separate URLs for every company using the app. Also, I want to abstract GCP away from the client. I would prefer to authenticate a user based on their login credentials and then give them access to their GCP project (via my front-end) accordingly.
I thought about perhaps having separate service keys for each client, storing the service key info in Firebase, and using the respective keys for API calls, but I'm not sure this is best practice. It is, however, the only strategy I can think of.
If anyone could provide some help or guidance it would be very much appreciated. This is my first GCP project and have not been able to find any answers on GCP, SO, Google Groups, Slack or Medium.
Thanks,
TJ
First of all, welcome to GCP! It's an awesome platform, very powerful and flexible. But it's not magic.
Indeed, the use case that you describe is specific to your business logic. GCP provides tools for securing access for users and VMs (through service accounts), but not for your customers. Here you have to implement your own custom authorization logic, with a database (I don't recommend BigQuery for a website, the latency is too high) that lists the users, the companies they work for, the blobs of each company...
Nothing is magic, and your use case is specific to you.
If you want to discuss further which components to use and how to start, no problem. Leave a comment.
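If it helps to see the shape of that custom authorization layer, here is a minimal sketch. It uses Node/Express with firebase-admin rather than the Flask backend from the question, and the users collection, bucket name, and per-company object prefix are all assumptions rather than anything GCP gives you out of the box:

```typescript
import express from "express";
import admin from "firebase-admin";
import { Storage } from "@google-cloud/storage";

admin.initializeApp(); // uses the service account available in the environment
const storage = new Storage();
const app = express();

// Middleware: resolve the Firebase user, then look up which company they belong to.
async function withCompany(req: any, res: any, next: any) {
  try {
    const idToken = (req.headers.authorization ?? "").replace("Bearer ", "");
    const decoded = await admin.auth().verifyIdToken(idToken);
    const profile = await admin.firestore().doc(`users/${decoded.uid}`).get();
    req.companyId = profile.get("companyId"); // your own mapping, not an IAM concept
    next();
  } catch {
    res.status(401).send("Not signed in");
  }
}

// Only hand out signed URLs for objects under the caller's own company prefix.
app.get("/files/:name", withCompany, async (req: any, res) => {
  const objectPath = `${req.companyId}/${req.params.name}`;
  const [url] = await storage
    .bucket("my-app-data") // hypothetical bucket name
    .file(objectPath)
    .getSignedUrl({ action: "read", expires: Date.now() + 15 * 60 * 1000 });
  res.json({ url });
});

app.listen(8080);
```

The important part is the mapping from an authenticated user to a company, and restricting every storage or BigQuery call to that company's prefix or dataset; where that mapping lives (Firestore, Cloud SQL, ...) is up to you.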

How does the actual data get synced between the servers inside a web farm or cloud?

1- Let's say the web app is hosted on some cloud like Azure, AWS, etc.
2- And let's say a user changed his profile details on my web app.
3- I am assuming that the request with the new data will hit one of the servers/VMs inside the cloud.
4- Let's say the data got saved in a SQL Server DB hosted on the same server/VM that the request landed on.
Now the questions.
What I am really confused about is this:
1- The data will be saved in one single database in the first place, so how does it get synced to the other servers instantly (if that happens at all; I am not sure about this)? Because there is no guarantee that the next request from the same user will land on the same server.
2- And if the above scenario is wrong and there is instead a shared database server for all the application servers inside the cloud, isn't that going to become a bottleneck? Ultimately the database server will get overloaded, because all the servers/VMs hosting the application will be hitting the same DB at once.
I know it's a broad question and I don't know if I have explained it properly,
but please ask me about anything I haven't made clear.
Any help would be great, whether it is a good link explaining the internals or a series of Q&As with me, as I have to design such a mechanism and I couldn't find any standard approaches or how the market leaders are doing it.

Design: using a backend server to circumvent the Great Firewall of China

I have a front-end angular app using firebase to store user data.
I currently do not have a backend set up, such as a node.js server.
I would like to use the Google Docs API to upload files from my app.
Since the Great Firewall of China blocks (or makes unstable) the use of Google services, is it possible to place those services on the backend server and still use them reliably?
Perhaps after they have uploaded the document to Firebase, a backend script retrieves it, uploads it to Google Docs, and then removes the record from Firebase? I'm just trying to see if Google or similar services are even feasible for this use case.
I suppose the crux of my question is whether or not the calling of the Google API would be taking place on the user's computer, in which case would it become unstable?
** Updates for clarity:
I am deciding whether my Firebase-backed app needs a more traditional backend, like a Node server, to do things like upload images and documents, send mail via Mandrill, etc. It would be helpful to me to know whether, after putting in the time to create a server, some of the services I am after (i.e. the APIs) are any more resilient to the GFW than they would be if they ran on the client side. So if anyone has had success with such a task, I would like to know.
** Technical update:
So, for example, if I run the Google Maps API on the client side and the user is in China and not running a VPN, the API calls will either lag, time out, or (rarely) succeed in returning the scripts. If I were somehow able to process the map query "off-site", i.e. on the server, could I then return a static image of the map to a Chinese user without fail?
If I were somehow able to process the map query "off-site", i.e. on the server, could I then return a static image of the map to a Chinese user without fail?
Yes, of course. What you are going to miss this way is all the front-end interactive functionality Google Maps offers. But if that's ok in your use case, sure.
I have never tried it with the GFW, but what I would do is this:
Google Maps <-> Your Reverse proxy <-> User
So, instead of the user visiting the real Google Maps site, they will be visiting your maps.mydomain.com site, which sits in between, proxying everything.
Nginx is an excellent choice for a reverse proxy. If you need more control, there are good Node.js reverse proxying packages that you can use to rewrite the content extensively before serving it (perhaps to obfuscate it in case the GFW blacklists content based on pattern matching, or to change the script names/links, again to avoid pattern matching).
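For the Node.js route, a minimal sketch with the http-proxy package might look like this (maps.mydomain.com is the placeholder front, and real content rewriting is left out):

```typescript
import http from "http";
import httpProxy from "http-proxy";

// Forward everything to the upstream Maps host; your domain is what the user sees.
const proxy = httpProxy.createProxyServer({
  target: "https://maps.googleapis.com",
  changeOrigin: true, // present the upstream host, not maps.mydomain.com
  secure: true,
});

http
  .createServer((req, res) => {
    // A real deployment would also rewrite script URLs in the responses
    // (as described above) so they point back at this proxy.
    proxy.web(req, res, {}, (err) => {
      res.writeHead(502);
      res.end("Upstream unreachable: " + err.message);
    });
  })
  .listen(8080);
```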
You have a misunderstanding about the Great Firewall of China. I consulted for a couple of Chinese companies after the dot-com crash, so I can say this from personal experience, not hearsay.
It is mostly high-end Cisco hardware behind gateways behind their government telecom infrastructure. Nowadays they knock off what hardware they can, every chance they get, and spend money on specialized hardware to monitor cell phone systems.
There was a brief mention of the street-level surveillance hardware on 20/20 before the crash if you are interested in looking it up.
Not to discourage you: set up whatever open servers you want, with whatever frontends or backends you want, but the reality is the traffic is not going to be there.
That is why they call it an oppressive regime: they do not get to decide for themselves, remember?

Microsoft Azure and Silverlight

I am interested in developing a site similar to YouTube. I want to have a site where users upload videos.
I imagine that, technically, the website would upload the video to the Azure cloud, where the file would automatically be encoded for Silverlight and hosted.
Can Azure host my site, take care of encoding, and host the videos, all programmatically?
And can Azure host the rest of the website's pages that are not part of the app (like a homepage or an About Us page) under my own domain name, or do I need a separate web host?
thanks
Azure can do the lot.
You'll probably want to use Azure Blob Storage for the initial upload, then use queues and the worker role functionality to do the encoding and other processing. Then you can store the resulting file back in Blob storage, and have an index either in Azure Tables or SQL Azure, depending on the architecture of the rest of the application.
And yes, an Azure Web role can quite happily host static content, standard dynamic ASPX pages, and a whole lot more (and can do it all on your own domain).
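To make the first half of that pipeline concrete, here is a rough sketch of the upload-then-enqueue step. It uses today's @azure/storage-blob and @azure/storage-queue packages rather than the worker-role SDK the answer refers to, and the container/queue names are made up:

```typescript
import { BlobServiceClient } from "@azure/storage-blob";
import { QueueServiceClient } from "@azure/storage-queue";

const conn = process.env.AZURE_STORAGE_CONNECTION_STRING!;

async function acceptUpload(videoName: string, data: Buffer): Promise<void> {
  // 1. Put the raw upload into blob storage.
  const blobs = BlobServiceClient.fromConnectionString(conn);
  const container = blobs.getContainerClient("raw-uploads");
  await container.createIfNotExists();
  await container.getBlockBlobClient(videoName).uploadData(data);

  // 2. Drop a message on a queue; a separate worker picks it up and encodes.
  const queues = QueueServiceClient.fromConnectionString(conn);
  const queue = queues.getQueueClient("encode-jobs");
  await queue.createIfNotExists();
  await queue.sendMessage(JSON.stringify({ blob: videoName }));
}
```

The worker described above would then poll that queue, pull the blob, run the encode, and write the result back to another container, with the index kept in Tables or SQL Azure as mentioned.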
I suggest you grab the Windows Azure SDK (from http://www.microsoft.com/windowsazure/) and take a look through the documentation. Your example scenario is pretty simple actually, and working through the samples should give you all the information you need.
Good luck!
Azure can indeed host your site. However, don't forget that the costs will probably be a minimum of ~$80-90 per month, even without any load. If your website gets traffic, this amount will increase.
However, you will have to implement video encoding yourself (or better yet, find libraries to do it); Azure is purely a host.
