Capturing page resources (css, js, images) using Selenium 2 in IE - selenium-webdriver

I'm using Selenium 2 (in IE only) and I need to capture all page resources (js, css, images files etc.) and their HTTP status.
I tried to use HTTP Analyzer for this, but the tool is very unstable and crashes all the time.
Could you please advise how I can resolve my problem?

You will need to use a proxy to do something like this. Selenium does not intercept HTTP traffic, so it cannot do this itself (there is an old captureNetworkTraffic implementation in Selenium 1, but it relied on Firefox-specific code and did not work for any other browser).
To configure it:
import org.openqa.selenium.Proxy;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.firefox.FirefoxDriver;
import org.openqa.selenium.remote.CapabilityType;
import org.openqa.selenium.remote.DesiredCapabilities;

Proxy proxy = new Proxy();
proxy.setHttpProxy(<proxyAddress>); // supply your proxy address here, e.g. "host:port"
DesiredCapabilities cap = DesiredCapabilities.firefox();
cap.setCapability(CapabilityType.PROXY, proxy);
WebDriver driver = new FirefoxDriver(cap);
This should enable you to capture the network traffic and, as a result, the HTTP status codes of the various page resources.
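The answer does not name a specific proxy; one commonly used option is BrowserMob Proxy, which can record a HAR containing every resource request and its status code. A minimal sketch under that assumption (the net.lightbody.bmp artifact on the classpath, Selenium 2-era IE bindings, and a hypothetical example.com URL):

import net.lightbody.bmp.BrowserMobProxy;
import net.lightbody.bmp.BrowserMobProxyServer;
import net.lightbody.bmp.client.ClientUtil;
import net.lightbody.bmp.core.har.HarEntry;
import org.openqa.selenium.Proxy;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.ie.InternetExplorerDriver;
import org.openqa.selenium.remote.CapabilityType;
import org.openqa.selenium.remote.DesiredCapabilities;

public class CaptureResources {
    public static void main(String[] args) {
        // Start an embedded proxy on a free port and route the browser through it
        BrowserMobProxy proxy = new BrowserMobProxyServer();
        proxy.start(0);
        Proxy seleniumProxy = ClientUtil.createSeleniumProxy(proxy);
        DesiredCapabilities cap = DesiredCapabilities.internetExplorer();
        cap.setCapability(CapabilityType.PROXY, seleniumProxy);
        WebDriver driver = new InternetExplorerDriver(cap);

        // Record a HAR, load the page, then print every resource URL and its HTTP status
        proxy.newHar("page");
        driver.get("http://example.com"); // hypothetical URL
        for (HarEntry entry : proxy.getHar().getLog().getEntries()) {
            System.out.println(entry.getResponse().getStatus() + " " + entry.getRequest().getUrl());
        }

        driver.quit();
        proxy.stop();
    }
}

Because the proxy is embedded in the test process, there is no external proxy address to configure by hand.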

Related

How to record the Requests for Angular JS and React JS based application using JMeter

When I have recorded, the requests are not visible. How do I create a test case for performance testing of an AngularJS and ReactJS based application?
As per JMeter project main page:
JMeter is not a browser, it works at protocol level. As far as web-services and remote services are concerned, JMeter looks like a browser (or rather, multiple browsers); however JMeter does not perform all the actions supported by browsers. In particular, JMeter does not execute the Javascript found in HTML pages. Nor does it render the HTML pages as a browser does (it's possible to view the response as HTML etc., but the timings are not included in any samples, and only one sample in one thread is ever displayed at a time).
Given the above:
JMeter won't execute any JavaScript, hence it won't generate any of the traffic associated with AJAX requests
If a JavaScript call doesn't generate an HTTP request, you don't need to worry about it, as it runs only on the client side
If JMeter doesn't record anything, first of all check the jmeter.log file for any suspicious entries. The most common reasons are:
people forget to import JMeter's certificate into their browser; see the HTTPS recording and certificates chapter of the HTTP(S) Test Script Recorder user manual entry for more details
people fail to configure the browser properly, e.g. Firefox cannot record local traffic unless you set the network.proxy.allow_hijacking_localhost property to true
Also be aware of an alternative way of recording a JMeter test: the JMeter Chrome Extension. In this case you don't need to worry about proxies and certificates; just follow your test scenario steps in your browser and at the end you will be able to export the recorded script in the form of a JMeter .jmx test plan.

How to set up proxy settings for the chromedriver WebDriver Sampler in JMeter

I am trying to run a JMeter WebDriver script in BlazeMeter. The browser gets launched, but subsequent requests fail. On further investigation it was identified that chromedriver is not able to open the expected URL due to proxy requirements.
I tried to set the proxy in a JMeter setUp Thread Group using a JSR223 Sampler. Below is the code.
With this I am getting a "The driver is not executable" error in BlazeMeter.
import org.openqa.selenium.Proxy
import org.openqa.selenium.chrome.ChromeDriver
import org.openqa.selenium.chrome.ChromeOptions

Proxy proxy = new Proxy()
proxy.setHttpProxy("xyz.net:1234")
proxy.setSslProxy("xyz.net:1234")
ChromeOptions options = new ChromeOptions()
options.setCapability("proxy", proxy)
System.setProperty("webdriver.chrome.driver", "chromedriver")
def driver = new ChromeDriver(options)
I believe it's better to address this kind of question to BlazeMeter Support.
The "driver is not executable" error doesn't have anything to do with proxies; you need to amend your chromedriver permissions to allow its execution.
Take the following steps:
Add setUp Thread Group to your test plan
Add OS Process Sampler to your Thread Group
Configure it to make the chromedriver binary executable (the original answer showed the exact OS Process Sampler settings in a screenshot; in essence it runs chmod with +x and the chromedriver path as arguments)
That's it, you should be able to launch the browser now
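As a rough alternative sketch (not from the original answer): the same permission fix can be done from a JSR223 Sampler in the setUp Thread Group using plain Java file APIs; the chromedriver path below is hypothetical.

// JSR223 Sampler (language: groovy) placed in the setUp Thread Group.
// Marks the chromedriver binary as executable before any WebDriver Sampler runs.
def driverBinary = new File('chromedriver') // hypothetical path, adjust to where the binary is uploaded
if (driverBinary.exists()) {
    driverBinary.setExecutable(true)
    log.info('chromedriver marked executable: ' + driverBinary.absolutePath)
} else {
    log.warn('chromedriver not found at: ' + driverBinary.absolutePath)
}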

Cefpython app with html/js files in local filesystem

I'm trying to make a hybrid python-js application with cefpython.
I would like to have:
JS and HTML files local to the cef python app (e.g. in './html', './js', etc)
Load one of the HTML files as the initial page
Avoid any CORS issues with files accessing each other (e.g. between directories)
The following seems to work to load the first page:
browser = cef.CreateBrowserSync(url='file:///html/index.html',
                                window_title="Rulr 2.0")
However, I then hit CORS issues.
Do I need to run a webserver also? Or is there an effective pattern for working with local files?
Try passing "disable-web-security" switch to cef.Initialize or set BrowserSettings.web_security_disabled.
Try also setting BrowserSettings.file_access_from_file_urls_allowed and BrowserSettings.universal_access_from_file_urls_allowed.
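A minimal sketch of that configuration, assuming the cefpython3 package and that these switch and BrowserSettings names are available in your cefpython version:

from cefpython3 import cefpython as cef

# Application-wide command-line switch: disables web security (CORS checks).
cef.Initialize(settings={}, switches={"disable-web-security": ""})

# Per-browser settings: allow file:// URLs to access each other.
browser_settings = {
    "web_security_disabled": True,
    "file_access_from_file_urls_allowed": True,
    "universal_access_from_file_urls_allowed": True,
}
browser = cef.CreateBrowserSync(url="file:///html/index.html",
                                window_title="Rulr 2.0",
                                settings=browser_settings)
cef.MessageLoop()
cef.Shutdown()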
There are a few options in CEF for loading custom content that can be used to load filesystem content without any security restrictions: a resource handler, a scheme handler, and a resource manager. In CEF Python, only the resource handler is currently available; see the wxpython-response.py example on the README-Examples.md page.
The resource manager is a very easy API for loading various content; it is to be implemented in Issue #418 (a PR is welcome):
https://github.com/cztomczak/cefpython/issues/418
For scheme handler see Issue #50:
https://github.com/cztomczak/cefpython/issues/50
Additionally there is also GetResourceResponseFilter in upstream CEF which is an easier option than resource handler, to be implemented via Issue #229:
https://github.com/cztomczak/cefpython/issues/229
You could also run an internal web server inside your app (easy to do with Python) and serve files that way. Upstream CEF also has built-in web server functionality; however, I don't think this will be exposed in cefpython, as it's already easy to set up a web server in Python.
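For the internal web server route, a rough sketch (assuming Python 3.7+ and the project layout from the question, with ./html and ./js next to the script):

import threading
from functools import partial
from http.server import HTTPServer, SimpleHTTPRequestHandler
from cefpython3 import cefpython as cef

# Serve the project directory over localhost so all pages share one origin
# and no file:// CORS restrictions apply.
handler = partial(SimpleHTTPRequestHandler, directory=".")
server = HTTPServer(("127.0.0.1", 0), handler)  # port 0 picks a free port
threading.Thread(target=server.serve_forever, daemon=True).start()

cef.Initialize()
browser = cef.CreateBrowserSync(
    url="http://127.0.0.1:%d/html/index.html" % server.server_port,
    window_title="Rulr 2.0")
cef.MessageLoop()
cef.Shutdown()
server.shutdown()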

Stress test angularJS application (Jmeter)

I have JMeter and the WebDriver plugin (Chrome, Firefox, PhantomJS, ...).
The problem is that when I launch the scenario with multiple threads, the headless browsers (Chrome, PhantomJS) log in on the first thread but all the other threads don't log in, because we are already connected to the application (the aim is to have several users on the application at the same time). I don't know how to isolate sessions; Firefox does isolate them, but it is not headless and only version 45 works.
I also tried recording through the proxy with a Recording Controller in the WorkBench, but when I relaunch the test the requests don't replay well (they are asynchronous). There is an explanation that says "use a Transaction Controller", fine, but how? I don't want to go to the BlazeMeter website, I want to make it work locally. Could anyone make it work? Does nobody stress test AngularJS applications?
I would prefer the second solution, calling the browser via JMeter and testing the AJAX via HTTP requests, but I don't know how it works.
Any idea?
Depending on how many users you need:
You can parameterize your test so that different JMeter threads (virtual users) use different credentials to log into the application from different browsers, e.g. via the CSV Data Set Config. All browsers kicked off by the WebDriver Sampler should be isolated from each other, so given that you use different credentials you should be good to go. But this will only work for several users, as per the WebDriver Sampler 10 Minute Guide:
However, for the Web Driver use case, the reader should be prudent in the number of threads they will create as each thread will have a single browser instance associated with it. Each browser consumes a significant amount of resources, and a limit should be placed on how many browsers the reader should create.
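A rough sketch of that parameterization inside the WebDriver Sampler (assuming the plugin's groovy script language and a CSV Data Set Config defining username/password variables; the URL and element locators are hypothetical):

// WebDriver Sampler script (Script Language: groovy).
// Each JMeter thread gets its own browser instance and its own CSV row.
import org.openqa.selenium.By

def user = WDS.vars.get('username') // set by CSV Data Set Config
def pass = WDS.vars.get('password')

WDS.sampleResult.sampleStart()
WDS.browser.get('https://example.com/login') // hypothetical URL
WDS.browser.findElement(By.id('user')).sendKeys(user) // hypothetical locators
WDS.browser.findElement(By.id('pass')).sendKeys(pass)
WDS.browser.findElement(By.id('login')).click()
WDS.sampleResult.sampleEnd()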
If you go the HTTP Request way, the easiest option to mimic AJAX calls is to put them under a Parallel Controller, so your test would look like:
Transaction Controller
    Main Request
    Parallel Controller
        AJAX request 1
        AJAX request 2
        etc.
Strangely, I made a simple configuration and it works. My AngularJS application is embedded in a WAR, but I don't know if that makes a difference. The structure is like this:
Test Plan
    Thread Group
        HTTP Cookie Manager
        HTTP Header Manager
        HTTP Request Defaults
        Recording Controller
I recorded the scenario and simply replayed it (I assume the login happens in the right order). It is HTML pages; I don't see the JS because the application sits on an application server.

Silverlight 4 OOB + Browser HTTP Stack + Client Certificates = FAIL?

I'm having an issue where IIS 7.5 (on Windows 7 64-bit) fails when I call it from an out-of-browser Silverlight 4 app using SSL and a client certificate, with the message "The I/O operation has been aborted because of either a thread exit or an application request. (0x800703e3)". The request does make it to IIS. Here is a sample from the failed request trace:
(failed request trace screenshot: http://www.slipjig.org/IISError.gif)
I am using the browser HTTP stack, because the client HTTP stack does not support client certificates. The client code attempting to hit the server is the Prism module loader. If I run the app out-of-browser but ignore client certs, or if I run the application in-browser but require client certs, it works fine. It seems to be the combination of the two that is causing the problem.
I tried the following to gather more info:
Used Fiddler to view the failing request. It works if Fiddler is running (presumably because Fiddler is handling the client certificate differently?);
Created an .aspx web form to serve up the module .xaps;
Created an HTTPModule to see if I could intercept the request before it failed;
Used a packet sniffer to see if I could tell if the client certificate was being sent correctly.
None of the above gave me much useful information beyond what I could see in the trace file, although the Fiddler thing is interesting.
Any ideas? Thanks in advance!
Mike
I beat my head against the wall for weeks on this problem. Here's what I learned and how I finally worked around it.
Prism's FileDownloader class uses System.Net.WebClient to load modules. In OOB mode, WebClient seems to use the same stack as IE, but it apparently either doesn't send the client certificate, or (more likely) doesn't correctly negotiate the SSL/client cert handshake with the server. I say this because:
I was able to successfully request .xap files using Firefox and Chrome;
I was not able to successfully request .xap files using IE;
IIS would fail with a 500, not a 403.
I couldn't get good visibility into what was actually happening over the wire; if I used Fiddler, it would work, because Fiddler intercepts communications with the server and handles the client certificate handshake itself. And trying to use a packet sniffer obviously wouldn't tell me anything because of SSL.
So - I first spent a lot of time on the server side trying to eliminate things (unneeded handlers, modules, features, etc.) that might be causing the problem.
When that didn't work, I tried modifying the Prism source code to use the browser's HTTP stack instead of WebClient. To do this, I created a new class similar in design to FileDownloader, implementing IFileDownloader, that used the browser stack. I then made some changes to XapModuleTypeLoader (which instantiates the downloader) to make it use the new class. This approach failed with the same error I was originally experiencing.
Then I started researching whether a commercial third-party HTTP stack might be available. I found one that supported the features I needed and that supported the Silverlight 4 runtime. I created another implementation of IFileDownloader that used that stack, and BOOM - it worked.
The good news with this approach is that not only can I use this to load modules, I can also use it to protect communications between the client and our REST API (a benefit we were going to give up, before).
I plan to submit a patch to Prism to allow the downloader to be registered or bound externally, as it's currently hard-coded to use its own FileDownloader. If anyone is interested in that or in the commercial HTTP stack I'm using, contact me (msimpson -at- abelsolutions -dot- com) for links and code samples.
And I must say this - I still don't know for sure whether the root problem is in the HTTP stack on the client side or the server side, but it's a FAIL on Microsoft's part nonetheless.
What we (Slipjig and I) found out this week is that there does appear to be a way around these issues, or at least we're on the trail to determining whether there is a reliable, repeatable way. We're still not positive about that, but here's what we know so far:
At first pass, if you have code like the following, you can start making requests with either the browser or client stack:
First, place a "WebBrowser" control in your Silverlight XAML, and make it send a request to your HTTPS site.
This may pop up the certificate dialog box for the user. Big deal. Accept it. If you have only one cert, you can turn on an option in IE to suppress that message.
private void Command_Click(object sender, RoutedEventArgs e) {
    // This does not pop up the cert dialog if the option to take the first
    // certificate is turned on in IE settings:
    BrowserInstance.Navigate(new Uri("https://www.SiteThatRequiresClientCertificates.com/"));
}
Then, in a separate handler invoked by the user, create an instance of your stack, either client or browser:
private void CallServer_Click(object sender, RoutedEventArgs e) {
    // Works with the BrowserHttp factory also:
    var req = WebRequestCreator.ClientHttp.Create(new Uri("https://www.SiteThatRequiresClientCertificates.com/"));
    req.Method = "GET";
    req.BeginGetResponse(new AsyncCallback(Callback), req);
}
Finally, the Callback:
private void Callback(IAsyncResult result)
{
    var req = result.AsyncState as System.Net.WebRequest;
    var resp = req.EndGetResponse(result);
    var content = string.Empty;
    using (var reader = new StreamReader(resp.GetResponseStream())) {
        content = reader.ReadToEnd();
    }
    System.Windows.Deployment.Current.Dispatcher.BeginInvoke(() =>
    {
        Results.Text = content;
    });
}
I had the same issue and I fixed it by creating the certificate using makecert. Follow the steps from this article http://www.codeproject.com/Articles/24027/SSL-with-Self-hosted-WCF-Service and replace the CN with your IP/domain. In my case I tested the service on the local machine and ran the commands as follows:
1) makecert -sv SignRoot.pvk -cy authority -r signroot.cer -a sha1 -n "CN=Dev Certification Authority" -ss my -sr localmachine
After running the first command, drag the certificate from the "Personal" store to "Trusted Root Certification Authorities".
2) makecert -iv SignRoot.pvk -ic signroot.cer -cy end -pe -n "CN=localhost" -eku 1.3.6.1.5.5.7.3.1 -ss my -sr localmachine -sky exchange -sp "Microsoft RSA SChannel Cryptographic Provider" -sy 12
In case you want to run the Silverlight application on another machine, export the certificate created in step 1 and then import it on any machine where you want your application to run.
