I am a beginner with GAE and still evaluating whether I should use it for my school project. I need to show how an application can be scalable; the definition I want to use here is whether it can serve thousands of users concurrently.
Now, load testing is one way of doing it. But load testing alone tells me little when Google will scale the application out across a number of different instances depending on the load.
Hence, I am thinking of simulating datastore reads/writes, memcache access, etc. to show the scalability prospects of the application.
Now, using JUnit tests seems a good way to do this, but they can only be run locally. Is there a way to run them on the server, in the actual production environment? If so, I could just write these tests, execute them via Eclipse, and be done!
The other way is to use functional testing with Selenium to simulate load under actual user conditions, but this would most probably crash my computer, and the tests would not run concurrently.
The other option is a Python load-testing script that throws requests with sample JSON data at the server URLs. I tried this, but I can't exercise the flows where genuine user interaction is needed, since the live site requires a Google sign-in.
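For reference, a simplified sketch of the kind of script I mean (here it targets a throwaway local server instead of my real app, and the JSON payload is made up):

```python
import json
import threading
from concurrent.futures import ThreadPoolExecutor
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer
from urllib.request import Request, urlopen

# Throwaway target server standing in for the real app URLs.
class EchoHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        body = self.rfile.read(int(self.headers["Content-Length"]))
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # silence per-request logging
        pass

server = ThreadingHTTPServer(("127.0.0.1", 0), EchoHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = "http://127.0.0.1:%d/api" % server.server_address[1]

sample = json.dumps({"user": "test", "action": "read"}).encode()

def hit(_):
    req = Request(url, data=sample,
                  headers={"Content-Type": "application/json"})
    with urlopen(req) as resp:
        return resp.status

# Fire 50 requests from a small pool of workers and count successes.
with ThreadPoolExecutor(max_workers=10) as pool:
    statuses = list(pool.map(hit, range(50)))

print(sum(1 for s in statuses if s == 200), "succeeded out of", len(statuses))
server.shutdown()
```

The real script would point at the deployed app's URLs, which is exactly where the Google sign-in gets in the way.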
Any ideas on how to proceed?
Look at either Siege (http://www.joedog.org/siege-home/) or JMeter (http://jmeter.apache.org/) for doing remote testing of your apps. The problem, though, is that you will hit the limit of your testing machine before you hit the limit of what you're trying to test, so a lot of people spin up a few EC2 instances and run the load tests from there. Siege is very simple: it just reads a list of URLs from a text file and clobbers the server as hard as you tell it to. JMeter lets you create more robust tests that can do things like log into a server and record more fine-grained details about how your app behaves.
Those are the two best free and easy to use tools out there (IMHO).
It sounds like you really want to simulate datastore operations.
You can write an HTTP request handler that loads your JUnit tests and calls them, then dumps the results either to the log or as the HTTP response.
If they take a long time to run, you can run them on a backend instance.
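A language-agnostic sketch of the pattern (shown with Python's unittest instead of JUnit for brevity; the request-handler wiring and the test case itself are stand-ins):

```python
import io
import unittest

# Stand-in for your real datastore/memcache test cases.
class DatastoreSmokeTest(unittest.TestCase):
    def test_write_then_read(self):
        store = {}                 # pretend datastore
        store["key"] = "value"
        self.assertEqual(store["key"], "value")

def run_suite_as_text():
    """Load the suite, run it, and return the report as a string --
    this is what the HTTP handler would dump to the log or response."""
    suite = unittest.defaultTestLoader.loadTestsFromTestCase(DatastoreSmokeTest)
    buf = io.StringIO()
    unittest.TextTestRunner(stream=buf, verbosity=2).run(suite)
    return buf.getvalue()

report = run_suite_as_text()
print(report)
```

In a real GAE app the handler would call `run_suite_as_text()` and write the report into the HTTP response body.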
Along the lines of Rick's suggestion, you can also probably run a test on a backend instance that makes HTTP requests to your frontend instances using the async HTTP API, and cause your front end instances to do a lot of work, if that's what you need to simulate.
Related
I am unable to load test an AngularJS-based web application. Please share suggestions, and an example if there is one.
If we cannot load test AngularJS using JMeter, then please suggest another tool or approach.
Thanks!!!
Load testing tools are backend-agnostic: they don't "know" anything about the underlying technology stack of the web application.
Just remember one simple requirement: a load test must represent real-life application usage. That means each JMeter thread (virtual user) needs to impersonate a real user accessing your application through a real browser.
Real browsers, in their turn, don't do any magic; they just send HTTP requests and render the responses. JMeter doesn't actually "render" the response; it waits for the response and measures the time taken.
So the answer is YES, you can use JMeter (as well as any other testing tool) for load testing an Angular web application; just make sure that the load pattern generated by JMeter matches the one a real browser generates.
The trickiest part is implementing AJAX requests: due to their asynchronous nature they don't map onto JMeter's thread model, so if your application relies on AJAX you will need to mimic those calls precisely as well. You can use the Parallel Controller for this (it is not an integral part of the JMeter distribution; it's a plugin that can be installed using the JMeter Plugins Manager).
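To illustrate why the parallel firing matters: a browser issues its AJAX calls concurrently, so a script that replays them one after another understates the burst the server actually sees. A small sketch with simulated calls (the latencies are made up, and `time.sleep` stands in for network round-trips):

```python
import time
from concurrent.futures import ThreadPoolExecutor

def fake_ajax_call(latency):
    """Stand-in for one AJAX request; sleeps instead of hitting a server."""
    time.sleep(latency)
    return latency

latencies = [0.1, 0.1, 0.1, 0.1]

start = time.perf_counter()
for l in latencies:                  # what a naive sequential script does
    fake_ajax_call(l)
sequential = time.perf_counter() - start

start = time.perf_counter()
with ThreadPoolExecutor() as pool:   # what the browser (and Parallel Controller) does
    list(pool.map(fake_ajax_call, latencies))
parallel = time.perf_counter() - start

print("sequential %.2fs vs parallel %.2fs" % (sequential, parallel))
```

The sequential run takes roughly the sum of the latencies; the parallel run takes roughly the maximum, which is the timing profile you want your load test to reproduce.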
I have developed an AngularJS web console. The web console is basically for creating, retrieving, and deleting users.
I want to do its performance testing using Chrome DevTools or JMeter.
If I use JMeter, how can I actually monitor the behavior of the web console itself? From JMeter I can only check the response time of the API.
If I use Chrome DevTools, how can I test it for multiple users against POST and GET operations?
For example, I have a scenario where 10 users are registering or signing in at a time. How can I test this behavior?
OR
50 persons are creating, deleting, or retrieving a user using a form at the same time.
OR
What will the behavior of the web console be if 50 users are using it at the same time?
NOTE: The web console is deployed on a server. I want to test it both locally and on the server.
Need help. Thanks in advance!
Server-side performance and client-side performance are different beasts, so you can break your performance-testing requirements down into two major parts:
Conduct the required load against your web console using JMeter HTTP Request samplers. Make sure you configure JMeter properly to handle cookies, cache, headers, and embedded resources (scripts, styles, images). See the How To Make JMeter Behave More Like A Real Browser article for a comprehensive explanation of how to configure JMeter properly. If you need the requests to be fired at exactly the same moment, also consider the Synchronizing Timer.
As JMeter neither renders pages nor executes client-side JavaScript, you can check client-side performance using one of the approaches below (or any combination):
Using YSlow software
Using aforementioned Chrome Dev Tools
Using the WebDriver Sampler (which integrates Selenium and JMeter), so you will be able to measure page rendering time. If necessary, you can add custom scripting logic using the Navigation Timing API to get extended information on page-loading events in an automated manner.
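The Synchronizing Timer mentioned in step 1 works like a barrier: virtual users block until N of them are ready, then all fire at once. The same idea in plain Python (the "request" is simulated; only the synchronization mechanism is the point):

```python
import threading
import time

N = 10                             # users that must fire together
barrier = threading.Barrier(N)
fire_times = []
lock = threading.Lock()

def virtual_user():
    # per-user setup (login, think time) would happen before the barrier
    barrier.wait()                 # block until all N users are ready
    now = time.perf_counter()      # stand-in for firing the real request
    with lock:
        fire_times.append(now)

threads = [threading.Thread(target=virtual_user) for _ in range(N)]
for t in threads:
    t.start()
for t in threads:
    t.join()

spread = max(fire_times) - min(fire_times)
print("all %d requests fired within %.4f s of each other" % (N, spread))
```

This is how you get "10 users signing in at the same instant" rather than 10 users spread over a few seconds.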
I started setting up Protractor tests for an Angular SPA. To speed up test runs, I've set up Selenium to run tests in Firefox and Chrome using multiple instances of each browser.
Before each test case we clear localStorage so we are certain of our starting position.
However, today I uncovered a conflict. If a browser runs two instances pointed at the same domain, they share localStorage. So if I have two tests running in parallel, and one requires a user to be logged in while the other requires the user to be logged out, the first test logs the user in, and now the user in the second browser instance is also logged in (because localStorage got set/updated for both).
So now, one of my tests just failed because it found itself on the wrong page.
Is there any simple way to overcome this problem? (That does not involve setting up a server and multiple instances, or spending $2,000/mo on Saucelabs/Browserstack...)
Alright, I was facing the same problem, and here is the solution I found:
You will have to make a copy of the framework and store it in two different places on your local machine.
You will have to create a different listening port for each Selenium instance you launch; you can do this by configuring the testng.xml that is used for launching the automation tests.
I have a front-end angular app using firebase to store user data.
I currently do not have a backend set up, such as a node.js server.
I would like to use the Google Docs API to upload files from my app.
Since the Great Firewall of China blocks (or makes unstable) the use of Google services, is it possible to place those services on a backend server and still use them reliably?
Perhaps after they have uploaded the document to firebase, a backend script retrieves it, uploads it to google docs, and then removes the record from firebase? Just trying to see if Google or similar services are even feasible for this use case.
I suppose the crux of my question is whether the calling of the Google API would take place on the user's computer, in which case would it become unstable?
** Updates for clarity:
I am deciding whether my firebase-backed app needs a more traditional backend like a node server to do things like: upload images and documents, send mail via Mandrill, etc... It would be helpful to me if I knew whether, after putting in the time to create a server, some of the services I am after (aka APIs) are any more resilient to the GFW than they would be if they ran on the client side. So if any one has had success in such a task, I would like to know.
** Technical update:
So, for example, if I run the Google Maps API on the client side and the user is in China without a VPN, the API calls will either lag, time out, or (rarely) succeed in returning the scripts. If I were somehow able to process the map query "off-site", aka on the server, could I then return a static image of the map to a Chinese user without fail?
If I was somehow able to process the map query "off-site" aka on the server, could I then return with a static image of the map to a Chinese user without fail?
Yes, of course. What you will miss this way is all the front-end interactive functionality Google Maps offers, but if that's okay in your use case, sure.
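A sketch of that server-side approach: build the Static Maps request on your backend and hand the resulting image bytes to the user, so the user's browser never talks to Google. The URL parameters follow the Google Static Maps API; the API key is a placeholder, and `fetch_map_png` is shown but not called here since it needs network access and a real key:

```python
from urllib.parse import urlencode
from urllib.request import urlopen

STATIC_MAPS = "https://maps.googleapis.com/maps/api/staticmap"

def static_map_url(lat, lng, zoom=12, size="640x400", key="YOUR_API_KEY"):
    """Build the request your backend (not the browser) would fetch."""
    params = {"center": "%f,%f" % (lat, lng), "zoom": zoom,
              "size": size, "key": key}
    return STATIC_MAPS + "?" + urlencode(params)

def fetch_map_png(lat, lng):
    """Server-side fetch; the returned bytes can be served to the user."""
    with urlopen(static_map_url(lat, lng)) as resp:
        return resp.read()

url = static_map_url(31.2304, 121.4737)  # Shanghai, as an example
print(url)
```

Your backend endpoint would call `fetch_map_png` and stream the bytes back with a `image/png` content type.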
I have never tried it against the GFW, but what I would do is this:
Google Maps <-> Your Reverse proxy <-> User
So, instead of the user visiting the real Google Maps site, it will be visiting your maps.mydomain.com site, which sits in between, proxying everything.
Nginx is an excellent choice for a reverse proxy. If you need more control, there are good Node.js reverse-proxying packages that you can use to rewrite the content extensively before serving it (perhaps to obfuscate it in case the GFW blacklists content based on pattern matching, or to change the script names/links, again to avoid pattern matching).
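A minimal sketch of the Nginx side (the domain is illustrative; real Google Maps proxying would also need the script and tile hosts rewritten, and TLS certificates configured):

```nginx
# maps.mydomain.com -> reverse proxy -> Google Maps
server {
    listen 443 ssl;
    server_name maps.mydomain.com;
    # ssl_certificate / ssl_certificate_key directives omitted

    location / {
        proxy_pass https://maps.googleapis.com;
        proxy_set_header Host maps.googleapis.com;
        proxy_ssl_server_name on;
        # Rewrite absolute links in responses back to our domain
        sub_filter 'maps.googleapis.com' 'maps.mydomain.com';
        sub_filter_once off;
    }
}
```

The `sub_filter` rewriting is the simple case; for heavier rewriting (renaming scripts to dodge pattern matching), a Node.js proxy gives you more control.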
You are misunderstanding the Great Firewall of China. I consulted for a couple of Chinese companies after the dot-com crash, so I can say this from personal experience, not hearsay.
It is mostly high-end Cisco hardware behind gateways behind their government telecom infrastructure. Nowadays they knock off what hardware they can, every chance they get, and spend the money on specialized hardware to monitor cell phone systems.
There was a brief mention of the street-level surveillance hardware on 20/20 before the crash if you are interested in looking it up.
Not to discourage you, but I say set up whatever open servers you want with whatever frontends or backends you want, but the reality is the traffic is not going to be there.
That is why they call it an oppressive regime; people there do not get to decide for themselves, remember?
I was handed an assignment but I don't know where to start.
The aim is to have two pieces of code running. One will run in an OpenStack private cloud and perform the task of indexing two sets of text; the other will run in EC2 with the task of matching the two indexed texts.
I want to access them via Google App Engine.
Ideally, I would like to click a button or perform an action on Google app engine, which then sends a request to Openstack to run the code and retrieve the output of a txt file.
The output text files will then be forwarded on to EC2, where the matching will occur, and the results will be sent back to Google App Engine.
My question is, how can I send the files between the systems using REST requests?
FrankN --
EC2, GAE, and OpenStack are disparate cloud computing platforms. Integrating them might mean, say, using one platform while saving backups to another.
CloudU.Rackspace.com is a vendor-neutral education site about cloud computing (note: it will take six or so hours to finish it all). This might help.
Disclaimer: I work for Rackspace.
Firstly, I'm not really sure what your requirements are, what your code does, or why you are trying to mix cloud providers in that way.
That said, I would suggest taking the upload from GAE and pushing it to AWS S3, where you can then retrieve and use it as you please from EC2.
I'm not sure what functionality you are trying to get out of OpenStack that is not present in AWS; however, I would suggest building whatever you are building on EC2 first, then duplicating it on OpenStack services to avoid future vendor lock-in.
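To make the hand-off concrete regardless of which clouds you settle on, here is a minimal sketch of shipping a text file from one service to another over a plain HTTP POST (the `/ingest` endpoint and file names are hypothetical, and a local server stands in for the receiving side):

```python
import threading
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer
from urllib.request import Request, urlopen

received = {}  # what the receiving ("EC2 matcher") side has accepted

class IngestHandler(BaseHTTPRequestHandler):
    """Stand-in for the receiving service's /ingest endpoint."""
    def do_POST(self):
        length = int(self.headers["Content-Length"])
        received[self.path] = self.rfile.read(length)
        self.send_response(204)
        self.end_headers()

    def log_message(self, *args):  # silence per-request logging
        pass

server = ThreadingHTTPServer(("127.0.0.1", 0), IngestHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
base = "http://127.0.0.1:%d" % server.server_address[1]

# The sending ("OpenStack indexer") side pushes its output file onward.
index_output = b"term1 doc3\nterm2 doc1\n"
req = Request(base + "/ingest/index-a.txt", data=index_output,
              headers={"Content-Type": "text/plain"})
with urlopen(req) as resp:
    status = resp.status

print("POST returned", status)
server.shutdown()
```

The same POST pattern works between GAE, OpenStack, and EC2; the S3 variant replaces the receiving endpoint with an S3 bucket upload.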