MacRuby ScriptingBridge loading speed - macruby

Is there a faster way to load scripting bridge applications from MacRuby?
When I create the SBApplication object for Adobe InDesign, it takes more than 10 seconds to load. Here's the code I've been using:
framework 'ScriptingBridge'
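# the call below is the slow step: for InDesign it takes 10+ seconds, since its scripting definition is huge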
indesign = SBApplication.applicationWithBundleIdentifier("com.adobe.InDesign")
puts indesign
Of course the generated header file is huge for InDesign... Is there a way to force the use of a support file?

The long load time when using ScriptingBridge is a documented issue. An alternative is macruby-appscript.

Related

Initial page load performance for an AngularJS app

I'm working on an AngularJS app that uses webpack to bundle its resources. Currently we create a single app.js file that contains the CSS as well; its size is around 6 MB. If we break app.js into multiple chunks, does that improve page performance? My colleagues are trying to convince me that if we split the single JS file into 2 or 3 files, the page load time will double or triple. Is that really true? I remember reading somewhere that having a single file is better than multiple files, but I don't remember the reasons any more. Do I really need to break up the app.js file for page performance, or what other options can I apply here?
A single file is better because it requires fewer connections (and therefore less overhead), but this is really negligible when talking about fewer than 5 files. When you split the file, you gain the ability to cache the parts separately, which is often a great win. I'd therefore recommend splitting the files into logically cacheable sections (like vendor code and custom code).
Also note that if the client and server support HTTP/2, the fewer-connections argument also disappears, since HTTP/2 supports connection re-use.
Note that there is no real difference for the initial load time, since in that case all files need to be downloaded anyway.
A single file will usually mean better performance. You should also ensure that this file is properly cached on the browser side and gzipped when served by your web server.
I ran a practical test in Chrome (Mac, 54.0.2840.98, 64-bit) to check whether there is really a performance gain in breaking a huge JS file into many. I created a 10 MB JS file and made three copies of it, then concatenated the three copies into a 30 MB file. Referencing the single file with a normal script tag at the bottom of the page took around 1 minute to load; referencing the three 10 MB files one after another took nearly 20 seconds to load everything. So there really is a performance gain in breaking a huge JS file into several, although there is a limit to the number of files the browser can download in parallel.

Lighttpd: pre-file-upload action possible?

My scenario is this: on an embedded device we have a web interface using lighttpd and a cgicc-based application. Uploading new firmware takes a lot of time, especially when the CPU is under heavy load (which is the typical case in field operation). For example, with the CPU at 80% usage according to 'top', the upload takes 5-10 minutes(!); with the services switched off it takes only 1 minute.
Therefore I need to implement something that allows me to deactivate the services before the file upload starts. The problem is that my CGI only recognizes the file upload operation after lighttpd has uploaded the whole file into a temporary set of files. My only idea is to implement a second button that allows the human operator to manually disable the services before starting the upload, but that is not really elegant.
In JavaScript I could hide the second button and simulate its click from the upload button, maybe (can I catch the click on a file input?). But that sounds very dirty, especially since we follow the unobtrusive JavaScript pattern.
Is there some other way to initiate a pre-file-upload action, maybe through a lighttpd module or some other feature of HTTP or browsers that I don't know about?
PS: we need backward compatibility with IE6, so no HTML5 features can be used. We use XHTML 1.0 Strict.
Modern versions of lighttpd have improved memory management and a slightly reduced memory footprint, so this might no longer be an issue for you.
If it still is, then:
lighttpd mod_magnet can be configured to send a trigger:
https://wiki.lighttpd.net/Lua_Trigger_Functions
However, if you disable services, you should probably only do so for a limited time (e.g. two minutes) before automatically re-enabling the services.
Better still, identify the exact bottleneck that makes running the services and the upload together so slow. Is the embedded system running out of memory and swapping? Is it running out of CPU?

Microsoft Pivot JIT Collections

I'm attempting to implement a Microsoft PivotViewer within my application. I've decided that a cross between a JIT collection and a linked collection is the best fit, but I'm having a few issues.
The images for the collection are generated by a C# Windows Service that runs overnight generating new images that are required.
The CXML file is then generated dynamically when the user requests the data, using a custom HttpHandler.
The issue is that with 10000+ items my development machine runs out of memory whilst trying to turn the generated png/jpg images into a DZC.
Is there a tool available that I could use to turn the images I have into DZI images? I've looked at Deep Zoom Composer, but I really need something I can run overnight to convert the images; I currently have about 45,000 images to convert, and the Composer just locks up when trying to do anything with that many images.
Ideally I would like to be able to create the DZI images directly using the service instead of creating png/jpg images first.
I could not tell whether you already have these tools or not, but here goes nothing:
http://www.silverlight.net/archives/whitepapers/deep-zoom-tools
If you use these tools you can preprocess the images and use them later in your collection. If you need dynamic images, you might want to take a look at the Silverlight 5 RC (the latest version of which includes the new PivotViewer). This new version converts your XAML into DZI at runtime and is a really cool control!
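Since the images are already produced overnight by a C# Windows Service, that same service could call the Deep Zoom Tools library directly instead of going through Deep Zoom Composer. Below is a minimal sketch; it assumes the Microsoft.DeepZoomTools assembly from the download above is referenced and that the ImageCreator/CollectionCreator Create overloads shown exist in the version you have (all folder paths are placeholders):
using System;
using System.Collections.Generic;
using System.IO;
using Microsoft.DeepZoomTools;   // ships with the Deep Zoom Tools download above
class DeepZoomBatch
{
    static void Main()
    {
        string sourceDir = @"C:\pivot\source";                  // placeholder: folder of generated png/jpg files
        string dziDir = @"C:\pivot\dzi";                        // placeholder: output folder for per-image DZIs
        string dzcPath = @"C:\pivot\collection\collection.dzc"; // placeholder: output Deep Zoom Collection
        Directory.CreateDirectory(dziDir);
        // Step 1: convert each source image into its own DZI (tile pyramid plus descriptor), one image at a time.
        var imageCreator = new ImageCreator();
        var dziPaths = new List<string>();
        foreach (string src in Directory.GetFiles(sourceDir, "*.jpg"))
        {
            string dzi = Path.Combine(dziDir, Path.GetFileNameWithoutExtension(src) + ".dzi");
            imageCreator.Create(src, dzi);
            dziPaths.Add(dzi);
        }
        // Step 2: combine the individual DZIs into the single DZC that the PivotViewer collection points at.
        var collectionCreator = new CollectionCreator();
        collectionCreator.Create(dziPaths, dzcPath);
        Console.WriteLine("Created {0} DZIs and one DZC.", dziPaths.Count);
    }
}
Run as an extra step at the end of the existing overnight service, this keeps the conversion unattended instead of pushing 45,000 images through the interactive Composer.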

ExtJS file size is so big, what to do?

I am building an application in ExtJS; the ext-all.js file is 700 KB, which is very big and not acceptable to our technical architect. So what should I do?
Should I remove ExtJS and build on some other UI framework?
or
Can I do something about the size?
You should calmly but firmly explain the following to your technical architect:
The browser will treat the JavaScript file as a static resource and cache it after the initial download, so each visitor will only download the file once (unless they clear their browser cache, which most people don't), even if it is included on every page on the website.
Any modern web server supports automatic gzip compression of text documents (which includes things like JavaScript files). Assuming this is enabled, it means that the amount of data the client actually downloads is significantly less than 700 KB. You can see what the actual download size is by taking your 700 KB JavaScript file and compressing it with gzip (or any equivalent utility).
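If you want to put an actual number on that for your architect, here is a quick sketch in C# that gzip-compresses a file in memory and prints the before and after sizes (any command-line gzip tool gives the same answer; the file name is just a placeholder):
using System;
using System.IO;
using System.IO.Compression;
class GzipSizeCheck
{
    static void Main()
    {
        string path = "ext-all.js";                      // placeholder: the file your server would gzip
        byte[] original = File.ReadAllBytes(path);
        var compressed = new MemoryStream();
        using (var gzip = new GZipStream(compressed, CompressionMode.Compress))
        {
            gzip.Write(original, 0, original.Length);
        }                                                // disposing the GZipStream flushes the gzip footer
        Console.WriteLine("Original:   {0:N0} bytes", original.Length);
        Console.WriteLine("Compressed: {0:N0} bytes", compressed.ToArray().Length);
    }
}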
Have you considered using the Ajax Minifier on the ext-all.js file? It should drastically reduce its size. Not to mention that browser caching should make it a one-time download (unless you update the underlying file).
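If you do try the Ajax Minifier, it can also run as an automated build step rather than by hand. A minimal sketch, assuming the Microsoft.Ajax.Utilities assembly (the AjaxMin library) is referenced and with placeholder file names:
using System;
using System.IO;
using Microsoft.Ajax.Utilities;   // AjaxMin library
class MinifyExt
{
    static void Main()
    {
        string source = File.ReadAllText("ext-all.js");          // placeholder input path
        // MinifyJavaScript strips comments and whitespace and shortens local variable names.
        string minified = new Minifier().MinifyJavaScript(source);
        File.WriteAllText("ext-all-min.js", minified);
        Console.WriteLine("{0:N0} -> {1:N0} characters", source.Length, minified.Length);
    }
}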

How do I put and replace a file on a client machine in a Silverlight app?

I'm trying to make a Silverlight app that uses a local SQLite file to search some data when the app goes offline. I found the following library, http://code.google.com/p/csharp-sqlite/, and it seems pretty nice.
So, what I want to know is: what is a good approach for placing a file on the client that can be replaced automatically whenever the data on the server is updated?
I've tried to put a file into a folder under the app, but I couldn't access the file using csSQLite.sqlite3_open (this method is from the library above). Sorry, I'm pretty new to Silverlight, so my question might be very odd.
Thanks in advance,
yokyo
It doesn't look like this library has been specifically coded for Silverlight. Despite being a pure C# implementation, it is still likely to assume the full .NET API is available, which is not the case in Silverlight.
Specifically, Silverlight cannot ordinarily access the local file system. The SQLite code would need to be modified to understand Silverlight's IsolatedStorage, and it would also have to limit its file operations to those supported by the streams available in Isolated Storage.
The creation of a DB-esque data source in Silverlight is typically done by creating classes that represent records and collections of records, using LINQ to query them, and using XML serialisation into Isolated Storage to persist them.
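As a rough sketch of that pattern (the Record class, its properties, and the file name are invented for this example; IsolatedStorageFile and XmlSerializer are the standard Silverlight types):
using System.Collections.Generic;
using System.IO;
using System.IO.IsolatedStorage;
using System.Linq;
using System.Xml.Serialization;
public class Record
{
    public int Id { get; set; }          // example fields only
    public string Name { get; set; }
}
public static class RecordStore
{
    private const string FileName = "records.xml";   // lives inside the app's Isolated Storage
    public static void Save(List<Record> records)
    {
        using (var store = IsolatedStorageFile.GetUserStoreForApplication())
        using (var stream = store.CreateFile(FileName))            // creates or overwrites the file
        {
            new XmlSerializer(typeof(List<Record>)).Serialize(stream, records);
        }
    }
    public static List<Record> Load()
    {
        using (var store = IsolatedStorageFile.GetUserStoreForApplication())
        {
            if (!store.FileExists(FileName))
                return new List<Record>();
            using (var stream = store.OpenFile(FileName, FileMode.Open))
            {
                return (List<Record>)new XmlSerializer(typeof(List<Record>)).Deserialize(stream);
            }
        }
    }
    public static List<Record> Search(string namePrefix)
    {
        // querying is plain LINQ over the in-memory list
        return Load().Where(r => r.Name.StartsWith(namePrefix)).ToList();
    }
}
Refreshing the data when the server has something newer is then just a matter of downloading the new record set (for example with WebClient) and calling Save again, which replaces records.xml.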
Here is a hacked version of the SQLite code made to work with Silverlight; you can use it for some ideas on what to do: http://www.itwriting.com/blog/1695-proof-of-concept-c-sqlite-running-in-silverlight.html
