tl;dr: Is there any way to set the order of execution of Firefox Extensions?
Currently I'm using uBlock to block any request whose URI contains tracking elements (like domain.com/?utm_source=facebook). I developed a Firefox extension that cuts those parts out (a simple RegEx). Unfortunately my extension gets executed after uBlock: when a request contains utm_source, uBlock stops the page from loading any further. If my extension executed first, I would get rid of some tracking and uBlock wouldn't stop the page from loading.
Is there any way to assign a "higher priority" to my extension?
Hi, I'm trying to make an offline version of this page:
https://u-he.com/tools/microtuning/. The script is written with AngularJS; how do I do that?
I saved the page with Ctrl+S and copied the file to the local server I'm running.
Then I browsed to the local IP. The page opened, but I get repeated notes: ng-repeat shows up as multiple boxes instead of one box that edits the same note in different octaves.
How do I solve this problem, please?
You can inspect the front-end code in your browser console. In Firefox it's in the section called "Debugger", in Chrome it's called "Sources". If you use Safari, you need to enable Developer mode first.
Once you have the appropriate view, just click on u-he.com -> tools/microtuning/ -> index
Hopefully it goes without saying that you shouldn't use large swaths of another person's code without at least giving appropriate credit, or better yet getting the developer's permission, unless there is an explicit open-source license.
I am writing an automation script using Selenium WebDriver to download multiple files one by one from a web site in Mozilla Firefox. I have downloaded the first file successfully; before starting the next one I need to wait for the current download to complete. Can anybody help me identify when an ongoing download has finished using C# Selenium WebDriver? Since I am not getting the download-complete status, I am unable to continue downloading the next file.
Assuming you are testing file downloading in Firefox as you mentioned, here is an approach to achieve what you want.
Open new window/tab in Firefox and navigate to 'about:downloads' directly.
i.e. driver.Navigate().GoToUrl("about:downloads");
You will get the list of downloads (i.e. already-downloaded files and the file that is currently being downloaded).
You will have to switch between your site tab and downloads tab.
Now on the downloads tab, looking at the HTML, you can easily find the in-progress download using the state or status attributes.
For example, for the first file in the list, if state="0" and/or status contains the text 'remaining', it means the download is in progress.
So you can wait until state becomes state="1" and/or status no longer contains the text 'remaining'.
Refer to the attached screenshot.
The explanation I gave here looks very high level, but I am sure this approach will work. It is simple too.
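Here is a minimal sketch of that wait in C#, assuming Selenium WebDriver with the WebDriverWait helper from Selenium.Support; the richlistitem element and its state/status attributes are what the about:downloads page exposes, and may differ between Firefox versions:
using System;
using OpenQA.Selenium;
using OpenQA.Selenium.Support.UI;

class DownloadWaiter
{
    // Waits, on the tab that has about:downloads open, until the newest
    // download reports state="1" and its status no longer contains 'remaining'.
    public static void WaitForDownloadToComplete(IWebDriver driver, string downloadsTabHandle)
    {
        driver.SwitchTo().Window(downloadsTabHandle);

        var wait = new WebDriverWait(driver, TimeSpan.FromMinutes(5));
        wait.Until(d =>
        {
            // the first richlistitem in the list is the most recent download
            var item = d.FindElement(By.CssSelector("#downloadsRichListBox richlistitem"));
            var state = item.GetAttribute("state");
            var status = item.GetAttribute("status") ?? string.Empty;
            return state == "1" && !status.Contains("remaining");
        });
    }
}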
Let me know if you have any queries on this.
I have been attempting to port an existing Chrome extension to Firefox.
The extension uses a content script to add an element to a page and proceeds to bootstrap the angular app from that element.
This part works as expected, but individual directives that use templateUrl are blocked. Stepping through the code provides error messages like Access to restricted URI denied and NS_ERROR_DOM_BAD_URI.
In Chrome this problem is solved with the 'web_accessible_resources' whitelist.
This previous question seems to indicate that there is no analogous way to access extension resources in Firefox. This would be rather unfortunate, because, although I can inline all of the relevant templates, the extension also includes a number of images which are programmatically inserted on the page.
Is there any way to get at extension resources in an SDK extension?
If not, is it reasonably easy to do so with a legacy extension (as implied here)?
You can use the self module in the SDK to get a resolvable resource URI for files in the data directory. If you have your angular file in ./data/, you can retrieve it via:
const self = require('sdk/self');
console.log(self.data.url('angular.js'));
// Prints the url of `./data/angular.js`
Pretty lame problem:
I have an xml file that gets updated everyday on a server. Chrome keeps on getting the original cached xml file and not the updated version. The file is hosted on azure.
Any ideas how I could force Chrome to get the latest version instead? (obviously, asking the user the clear the cache isn't an option)
Place the xml file and other similar files in a common folder. Configure the folder so that the following header is sent with any content from the folder:-
cache-control: no-cache
This should cause browsers including Chrome to re-validate any cached content before using it.
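If configuring the folder itself is not convenient, a rough alternative (just a sketch, assuming the files are served by an ASP.NET application; the /data/ folder name is hypothetical) is to send the same header from code:
// Global.asax.cs - sends Cache-Control: no-cache for everything under /data/
// so browsers re-validate the xml before using a cached copy
protected void Application_BeginRequest(object sender, EventArgs e)
{
    if (Request.Path.StartsWith("/data/", StringComparison.OrdinalIgnoreCase))
    {
        Response.Cache.SetCacheability(HttpCacheability.NoCache);
    }
}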
I would append something to the URL as a dummy query string, to make sure that no browser treats it as the same resource, forcing them to load the new version. You don't need to modify the server-side script, as it can safely ignore the new query string.
For this particular application, where updates are daily, it makes sense to append today's date to the request, like so:
/path/to/my.xml?d=20100214
That way, even if the browser caches that particular XML file, tomorrow the query string will be different and the resource will be fetched again.
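For example, a small sketch of building that request URL in C#, assuming you fetch the XML from code (the path is the placeholder used above):
// DateTime.Today gives today's date, e.g. /path/to/my.xml?d=20100214
string url = "/path/to/my.xml?d=" + DateTime.Today.ToString("yyyyMMdd");
// tomorrow the value changes, so the browser sees a different URL
// and fetches the file again instead of reusing its cache entry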
Unfortunately, I know nothing about Silverlight itself, but you seem to already be able to load the file.
I'm assuming the issue I'm having is related to caching. Code changes I make are not getting picked up when I debug. Most times I get served a previous version of the app. How do I prevent this from happening?
Ctrl+F5 is an easy way to refresh a page and clear the cache of that page at the same time - it may help :)
Try adding the following to the Page_Load of the page that hosts the Silverlight application:
// expire the hosting page immediately and mark it as not cacheable,
// so the browser re-requests it instead of using a cached copy
Response.Cache.SetExpires(DateTime.Now.AddSeconds(-100));
Response.Cache.SetCacheability(HttpCacheability.NoCache);
Append a "version" querystring to your XAP Url, something like:
http://localhost:1234/ClientBin/my_silverlight_app.xap?v=1.0.287.5361
This will trick the browser (and many web servers) to think that this is a different file. And when the cache problem appears again, increase the number.
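If you would rather not bump the number by hand, here is a sketch of deriving it from the web project's assembly version in the hosting page's code-behind (the type name MyApp.Global is hypothetical):
// every new build produces a new assembly version, and therefore a new query string
string version = typeof(MyApp.Global).Assembly.GetName().Version.ToString();
string xapUrl = "/ClientBin/my_silverlight_app.xap?v=" + version;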
If you then want to employ proper caching, do it on the server-side with OutputCache directives.
As far as I see, this seems to be a problem with Firefox - when I used IE8, this didn't happen to me (I realize this may open its own can of worms, but at least for debugging and testing Silverlight, IE is much better)
I have not had any issues with Silverlight assemblies getting cached - you might want to try debugging the HTTP requests that go back and forth, to see if maybe your server is instead returning incorrect information to the browser (e.g. a "not modified" response).
For general no-cache behavior, the only reliable method I have found is to turn off caching in the browser.
For IE, this has been the only reliable option - otherwise, even if proper no-cache headers are sent, certain things are still cached (specifically, dynamically loaded resources which are accessed via Javascript XmlHttpRequest). I have not specifically had issues with Silverlight getting cached when it should not, though - IE has always loaded the latest updates even if cache is enabled.
Firefox has been much more problematic - even when disabling cache, it still sometimes caches XmlHttpRequest-loaded resources. Manually hitting Refresh a few times has been the only solution in such a case. Once again, I have had no issues with Silverlight assemblies, even if cache is turned on.
In Firefox, I use the 'web developer' plugin and simply select to 'disable cache'. Works fine.
Firefox 3.5 under Tools has the option for Private Browsing. Click that to disable caching.
Here is how I have done it for Flex/Flash and Silverlight, and it works.
Code Behind ASPX or CSHTML
string slUrl = "/ClientBin/MySilverlight.xap";
string filePath = Server.MapPath(slUrl);
FileInfo info = new FileInfo(filePath);
// appending the last write time forces the browser to
// re-download the file whenever the XAP is updated
slUrl += "?t=" + info.LastWriteTime.Ticks;
ASPX or CSHTML
<embed ....
src="<%= slUrl %>"
..
/>
The trick is that you change the URL by adding something after the ?, either an arbitrary query string or the file's write time. To the browser, something?t=1 and something?t=2 are two different URLs, so it will not pick up the cached copy when t changes.
Instead of the write time, you can also use any standard config value (a short sketch of that follows the markup below), or even simply hardcode your ASPX or HTML and append something after the ? that forces browsers to download the Silverlight XAP file again.
<embed ....
src="/ClientBin/MySilverlight.xap?something-different-each-time"
...
/>
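And a short sketch of the config-value variant mentioned above (the 'XapVersion' appSettings key is hypothetical; bump it whenever you deploy a new XAP):
// read the version from web.config appSettings instead of the file write time
string version = System.Configuration.ConfigurationManager.AppSettings["XapVersion"];
string slUrl = "/ClientBin/MySilverlight.xap?v=" + version;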