What are the reasons that analytics plugins use 1x1 img pixels to call their code? What restrictions does this shortcut work around?
The server hosting the 1x1px image logs the request, which allows collecting statistics about who loads (and renders) the page containing the image. Browsers generally allow cross-origin requests for img tags.
Also, this allows javascript to be loaded from the same domain as the image.
Behind this image is a server-side script (PHP, Python, or similar) that can extract and log information about the visitor (visited page, referer, IP, ...) for the tracking stats. The server can capture everything that is available on the server side from an HTTP request. The benefit of this method is that it works on nearly all browsers and devices. It's also lightweight, because a 1x1 image consumes almost no bandwidth.
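As a rough sketch of such an endpoint in ASP.NET (the log path and the extra "page" parameter are just illustrative assumptions, not anything prescribed - a real tracker would likely write to a database or queue):

```csharp
using System;
using System.IO;
using System.Web;

// Minimal tracking-pixel endpoint: logs request details, then returns a
// transparent 1x1 GIF so the <img> tag renders without a broken icon.
public class TrackingPixelHandler : IHttpHandler
{
    // 43-byte transparent 1x1 GIF.
    private static readonly byte[] Pixel =
    {
        0x47, 0x49, 0x46, 0x38, 0x39, 0x61,             // "GIF89a"
        0x01, 0x00, 0x01, 0x00, 0x80, 0x00, 0x00,       // 1x1, global colour table
        0x00, 0x00, 0x00, 0xFF, 0xFF, 0xFF,             // two palette entries
        0x21, 0xF9, 0x04, 0x01, 0x00, 0x00, 0x00, 0x00, // graphic control ext. (transparent)
        0x2C, 0x00, 0x00, 0x00, 0x00,
        0x01, 0x00, 0x01, 0x00, 0x00,                   // image descriptor
        0x02, 0x02, 0x44, 0x01, 0x00,                   // one pixel of LZW image data
        0x3B                                            // trailer
    };

    public void ProcessRequest(HttpContext context)
    {
        var request = context.Request;

        // Everything you can get on the server side from a plain HTTP request.
        string line = string.Join("\t", new[]
        {
            DateTime.UtcNow.ToString("o"),
            request.UserHostAddress,                                            // visitor IP
            request.UrlReferrer != null ? request.UrlReferrer.ToString() : "-", // page embedding the pixel
            request.UserAgent ?? "-",
            request.QueryString["page"] ?? "-"                                  // optional data passed by the embedder (assumption)
        });

        // Append to a simple log file; path is a placeholder.
        File.AppendAllText(context.Server.MapPath("~/App_Data/tracking.log"),
            line + Environment.NewLine);

        context.Response.ContentType = "image/gif";
        context.Response.Cache.SetCacheability(HttpCacheability.NoCache);
        context.Response.BinaryWrite(Pixel);
    }

    public bool IsReusable { get { return true; } }
}
```

The page being tracked then just embeds something like <img src="/track.ashx?page=home" width="1" height="1" alt="">.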
It's a common solution for tracking pageviews. To be more flexible, some tracking tools like Piwik don't even require JavaScript: they insert a tracking pixel, which also works when the user has JavaScript disabled or is using a plugin that strips tracking scripts.
It also works in emails, so the sender can tell which users read, say, a newsletter, and how many. But nowadays nearly all desktop and web clients block images in emails by default to prevent exactly this kind of tracking. Images are only displayed when the user explicitly allows them for a specific domain.
Is it possible to have a mobile website that can still function if there's no internet connection?
The user should still be able to use the website (if they have visited the page before), see the data that was loaded before, and add new content (cached locally).
When the internet connection comes back online, all locally changed data should be pushed to the server.
This should be a completely web-based solution, not a native app.
You should have a look at HTML5 offline storage, see http://diveintohtml5.ep.io/offline.html and the Offline Web Applications spec as a start. There are also quite a few posts here on SO.
Bookmarklets work when a user is offline. The trick with a bookmarklet is that it's entirely self-contained JavaScript, wrapped up in such a way that it can live within the bookmark itself - e.g. a javascript: URL. You can also have a data: URL as a bookmark, which could be a complete HTML page. Usually these are base64 encoded with a mime type.
Probably what I'd do would be to have a small base page as a data:text/html;base64,... URL which contained whatever offline content you cared about, but periodically tried to bootstrap the rest of the "real" content from wherever you host it.
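Purely as an illustration of the encoding step (a throwaway console sketch, not part of any web framework), this is how you might generate such a data: URL to save as a bookmark:

```csharp
using System;
using System.Text;

// Illustrative only: builds a data: URL that can be saved as a bookmark,
// so the page opens even with no connection at all.
class DataUrlBookmark
{
    static void Main()
    {
        string offlinePage =
            "<html><body><h1>Offline mode</h1>" +
            "<p>Cached content goes here; a script could try to reach the real site.</p>" +
            "</body></html>";

        // Mime type plus base64-encoded payload, as described above.
        string dataUrl = "data:text/html;base64," +
            Convert.ToBase64String(Encoding.UTF8.GetBytes(offlinePage));

        Console.WriteLine(dataUrl);
    }
}
```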
I have a WPF intranet app running in Trusted mode (local only).
I would like the users to be able to upload an image and attach it to an article in my newsletters section. I am having trouble deciding where these images should be stored.
Please provide me with your opinions.
At present I have a few ideas myself:
I could have an aspx page that runs parallel to this app and run it inside a browser (iframe). This page could then handle the upload and display of the image.
I could also have the users copy the image directly to a network share.
It seems that there should be a more elegant solution that I am not aware of.
Any ideas?
Don't force the solution towards ASPX just because you know how to do it there. It's unnatural to build a page, host a browser to show that page, etc., just so you can upload an image.
It's actually simpler to do this in the desktop client than on a web page. You have an open-file dialog - use it to get the path of the file the user wants to upload, and once you have that you can either (a sketch of the first option is below):
copy it (inside your application) to your share,
or if you have a service - send it through some method call,
or you can even store it inside a database (recommended if the files are small)
There are really lots of options here... it depends on whether your client has a connection to the database, whether you have a service in between, etc.
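For the first option (file-open dialog plus a copy to the share), a minimal sketch from a WPF client might look like this - the share path and file filter are placeholders, and there's no error handling:

```csharp
using System;
using System.IO;
using Microsoft.Win32;

// Sketch of the "pick a file, copy it to the share" option from a WPF client.
public static class ImageUploader
{
    // Placeholder path - point this at your real network share.
    private const string ImageShare = @"\\server\newsletter-images";

    public static string UploadImage()
    {
        var dialog = new OpenFileDialog
        {
            Filter = "Images|*.jpg;*.jpeg;*.png;*.gif"
        };

        if (dialog.ShowDialog() != true)
            return null; // user cancelled

        // Give the file a unique name so uploads can't overwrite each other.
        string target = Path.Combine(
            ImageShare,
            Guid.NewGuid() + Path.GetExtension(dialog.FileName));

        File.Copy(dialog.FileName, target);
        return target; // store this path with the article record
    }
}
```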
We are considering hosting the core of our site (everything that doesn't need to be dynamically generated) on a CDN, so that our root domain (e.g. "http://example.com/") would point to the CDN, then everything dynamic would either point to an alternate second-level domain (e.g. "http://search.example.com/ for searches) or be layered on top of the static content by AJAX calls to an alternate domain (e.g. http://ajax.example.com/).
This seems like something that would be very desirable for lots of sites, but I don't see much information about whole-site caching even on the CDN home pages. There is at least one obvious problem that occurs to me: we currently detect whether the user is coming from a mobile browser and serve mobile content if they are. The problem is that, as far as I know, most CDNs can only store one version of a page, so if you cache the regular page, mobile browsers will see that instead of the mobile version (and obviously vice versa).
We could get around this to some degree by moving the mobile stuff to a separate domain like m.example.com but we would need the CDN to detect mobile browsers and redirect them to that domain (which we would also like to have hosted on the CDN, but pointing at the mobile content instead of the regular content, obviously).
It seems like this should be widely supported but I can't find much information on it. Has anyone done something similar? If so, what CDN did you use and how did you address this issue? Were there other significant hurdles that needed to be overcome?
Edited to add a couple of things I forgot:
We also considered redirecting to the mobile site using javascript but then obviously older phones without javascript would be left out in the cold and they are the ones that probably need the mobile version the most.
One constraint that may factor into any answers to this question is that we need the URLs of our primary site to be very specific for SEO purposes but we don't care at all about SEO for the mobile version.
We have rules at our CDN (EdgeCast) that cache multiple versions (desktop, iPhone, BlackBerry, etc.) of the same incoming URL. The CDN rules append a querystring to the request to the origin server, and custom code at the origin server renders the proper version depending on the incoming querystring. For example:
Desktop: the CDN requests /?nomobile and the origin server returns the desktop rendering.
iPhone: the CDN requests /?iphone and the origin server returns the iPhone rendering.
BlackBerry: the CDN requests /?mobile and the origin server returns the mobile rendering.
As far as the CDN is concerned, these are three different URLs, so three different pages are cached. The querystring is completely transparent to the end user. Even if you use a responsive design with media queries, this approach is incredibly valuable in giving you the flexibility to alter the HTML at the server level.
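For illustration, the origin-side switch can be as simple as picking a master page based on the querystring the CDN appends; the page class and master page names here are made-up examples, not anything EdgeCast-specific:

```csharp
using System;
using System.Web.UI;

// Sketch of origin-side rendering selection based on the querystring the
// CDN appends (?nomobile / ?iphone / ?mobile).
public partial class Home : Page
{
    // MasterPageFile can only be changed this early in the page lifecycle.
    protected void Page_PreInit(object sender, EventArgs e)
    {
        string qs = Request.Url.Query;

        if (qs.Contains("iphone"))
            MasterPageFile = "~/Masters/Iphone.master";
        else if (qs.Contains("mobile") && !qs.Contains("nomobile"))
            MasterPageFile = "~/Masters/Mobile.master";
        else
            MasterPageFile = "~/Masters/Desktop.master"; // default / ?nomobile
    }
}
```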
If the rendering of your page differs for various devices (e.g. mobile phones), it is not static content and should not be on your CDN.
Put only real static files on your CDN and consider a different caching strategy for your pages.
Anyhow, instead of detecting the client's browser via JavaScript, you could also do this on the server side - and I would actually recommend that over JavaScript. Then you could implement the redirect approach.
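A minimal sketch of that server-side detection as an HttpModule (the keyword list is deliberately naive, m.example.com is taken from the question, and a real site would use a proper device database such as WURFL):

```csharp
using System;
using System.Web;

// Naive server-side mobile detection with a redirect to the mobile host.
public class MobileRedirectModule : IHttpModule
{
    private static readonly string[] MobileHints =
        { "iphone", "android", "blackberry", "windows phone", "opera mini" };

    public void Init(HttpApplication app)
    {
        app.BeginRequest += (sender, e) =>
        {
            var context = ((HttpApplication)sender).Context;
            string ua = (context.Request.UserAgent ?? string.Empty).ToLowerInvariant();

            foreach (string hint in MobileHints)
            {
                if (ua.Contains(hint))
                {
                    // Preserve the requested path on the mobile host.
                    context.Response.Redirect(
                        "http://m.example.com" + context.Request.RawUrl, true);
                    return;
                }
            }
        };
    }

    public void Dispose() { }
}
```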
Hope that helps.
I have a newsletter which I did in Silverlight. Is there a way to send it in an email? Just as you can include HTML tags, is there a way to include the Silverlight XAP package in it?
It's probably better to reference a web page containing your Silverlight content.
Technically, you could put the path to the .xap hosted on a website into an HTML email body, but nearly all mail clients will not display this - most even prevent images from loading by default.
Most email systems will prevent you from embedding active content like Silverlight, as it presents a security risk. Your only option is probably to put your Silverlight app on the web and just email a link to it.
Don't - not if you want your newsletter to be read by anyone. See this article for a good list of do's and don'ts when sending emails.
Don't listen to those guys, they're probably FlashHeads... ;)
Besides that they give up too easily. More power to ya!
I assume this newsletter is for an audience that specifically wants your content, i.e. a club or similar organization that doesn't have a Windows-based web server.
What you do is attach a zip containing the files that would normally be served from a website. The recipients drag it to their hard drive, right-click, extract all, and then run it by clicking an HTML file (with a .htm extension) that hosts the Silverlight plugin instead of an .aspx file.
One note that probably won't matter to you: without a server backing this up, the content can't really send you back any info, but it CAN pull in dynamic info from, say, RSS feeds or WCF services hosted on the web.
I have an application that shows a screen of image thumbnails; each image is around 80k and they are stored in a database. To keep response time reasonable, the application displays a placeholder image when it first starts and later downloads the real images from the server. I'm expecting to show around 40 images on the screen at once, so that's my batch size. What's the best way to serve these images to the client? I've got two options in mind.
Create an ADO.NET Data Service that exposes the Images database table to the client. The client can asynchronously request the images, one at a time, and display them as they come back from the server. I've implemented this solution and it seems to work OK; the speed isn't great, and I feel like I could utilize the HTTP pipe better by requesting maybe 3 images at a time.
Create an HttpModule on the server that looks for requests that look something like /Images/1.jpg, then reads the database and returns the requested data. On the client side I can have many Image objects whose Source points to the virtual URLs on the server. My theory is that by just giving Silverlight many URLs to deal with, it may be able to transfer the images more efficiently than my code in option 1.
Would either of these methods be more efficient or is there another technique for getting this done? Thanks!
I don't know if it's more efficient, but I've accomplished a very similar task using an HTTP handler (ashx). The handler pulls the image from the database based on the parameters in the URI (the image ID), and Silverlight fetches the images asynchronously by setting the Source property of an Image control to the URI of the handler with the specific ID I want in the query string. The Image control, in turn, is inside an ItemsControl, which lets me display multiple images.
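Roughly, such a handler looks like this (the connection string, table and column names are placeholders; adjust to your schema):

```csharp
using System.Data.SqlClient;
using System.Web;

// Sketch of an ashx-style handler that streams an image out of the database.
// Example request from Silverlight: <Image Source="/ImageHandler.ashx?id=42" />
public class ImageHandler : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        int id;
        if (!int.TryParse(context.Request.QueryString["id"], out id))
        {
            context.Response.StatusCode = 400; // bad or missing id
            return;
        }

        // Placeholder connection string and schema.
        using (var connection = new SqlConnection(
            "Data Source=.;Initial Catalog=Newsletter;Integrated Security=True"))
        using (var command = new SqlCommand(
            "SELECT Data FROM Images WHERE Id = @id", connection))
        {
            command.Parameters.AddWithValue("@id", id);
            connection.Open();

            var bytes = (byte[])command.ExecuteScalar();
            if (bytes == null)
            {
                context.Response.StatusCode = 404; // no such image
                return;
            }

            context.Response.ContentType = "image/jpeg";
            // Let the browser cache thumbnails so repeat visits are cheap.
            context.Response.Cache.SetCacheability(HttpCacheability.Public);
            context.Response.BinaryWrite(bytes);
        }
    }

    public bool IsReusable { get { return true; } }
}
```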
We are doing something very similar, and we just use an ASPX page to serve them up with a query parameter for the image identifier. We also cache the images, and the ASPX page will use the cached value if it exists. If not, we pull it from the data store, cache it, and send it down. It is working really well for us.
Have you looked at using Deep Zoom? It's very efficient about progressive image loading, and gives you a nicer user experience when the images are fully loaded.
Examples:
Hard Rock Memorabilia site
Deep Zoom Pix