Silverlight and SSL Client Certificates

Can anyone point me in the right direction of how I can use SSL client-side certificates with Silverlight to access a RESTful web service?
I can't seem to find anything on how to handle them, or even whether they are supported.
Cheers.

Slipjig mentioned this:
"The browser stack does, and pretty much automatically, if you're willing to live with its other limitations (lack of support for all HTTP verbs, coercion of response status codes, etc.)."
If that is acceptable to you, look at how Microsoft themselves deal with this in some of their APIs using the custom X-HTTP-Method header, like how they do it for WCF and OData:
http://www.odata.org/developers/protocols/operations
On MSDN, Microsoft also mentions this about using REST with SharePoint 2010's WCF-based REST API:
msdn.microsoft.com/en-us/library/ff798339.aspx
"In practice, many firewalls and other network intermediaries block HTTP verbs other than GET and POST. To work around this issue, WCF Data Services (and the OData standard) support a technique known as "verb tunneling." In this technique, PUT, DELETE, and MERGE requests are submitted as a POST request, and an X-HTTP-Method header specifies the actual verb that the recipient should apply to the request. For more information, see X-HTTP-Method on MSDN and OData: Operations (the Method Tunneling through POST section) on the OData Web site."
Don Box also had some words about this, regarding GData specifically:
www.pluralsight-training.net/community/blogs/dbox/archive/2007/01/16/45725.aspx
"If I were building a GData client, I honestly wonder why I'd bother using DELETE and PUT methods at all given that X-HTTP-Method-Override is going to work in more cases/deployments."
There's an article about Silverlight and Java interop which also addresses this limitation of Silverlight by giving the same advice:
www.infoq.com/articles/silverlight-java-interop
"Silverlight supports only the GET and POST HTTP methods. Some firewalls restrict the use of PUT and DELETE HTTP methods.
It is important to point out that true RESTful service can be created (conforming to all the REST principles listed above) only using the GET and POST HTTP methods, in other words the REST architecture does not require a specific mapping to HTTP. Google’s GData X-Http-Method-Override header is an example of this approach.
The following HTTP methods overrides may be set in the header to accomplish the PUT and DELETE actions if the web services interpret the X-HTTP-Method-Override header on a POST:
* X-HTTP-Method-Override: PUT
* X-HTTP-Method-Override: DELETE"
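For illustration, here is a minimal C# sketch of verb tunneling from Silverlight: a DELETE sent as a POST carrying the X-HTTP-Method-Override header. The URL and resource path are placeholders, and it assumes the target service actually honors the header:
// Requires: using System; using System.Net;
// Tunnel a DELETE through POST for a service that honors X-HTTP-Method-Override.
var request = (HttpWebRequest)WebRequest.Create(new Uri("https://example.com/odata/Items(1)"));
request.Method = "POST";
request.Headers["X-HTTP-Method-Override"] = "DELETE";

request.BeginGetRequestStream(ar =>
{
    // No body is needed for the tunneled DELETE; just close the empty request stream.
    request.EndGetRequestStream(ar).Close();
    request.BeginGetResponse(ar2 =>
    {
        using (var response = (HttpWebResponse)request.EndGetResponse(ar2))
        {
            // A 2xx status here means the service applied the tunneled DELETE.
        }
    }, null);
}, null);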
Hope this helps
-Josh

It depends on whether you're using the browser HTTP stack or the client HTTP stack. The client stack does not support client certificates, period. The browser stack does, and pretty much automatically, if you're willing to live with its other limitations (lack of support for all HTTP verbs, coercion of response status codes, etc.).
However, I have been running into a problem using the browser stack with client certificates in an out-of-browser (OOB) scenario. Prism module loading fails under these conditions - the request reaches IIS, but causes a 500 server error for no apparent reason. If I set IIS to ignore client certificates, or if I run the app in-browser, it works fine :-/
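For completeness, a minimal sketch of explicitly routing requests through the browser HTTP stack, which (as noted above) is the stack that handles client certificates:
// Requires: using System.Net; using System.Net.Browser;
// Route https:// requests through the browser HTTP stack so the browser
// presents the client certificate; the client stack cannot do this.
WebRequest.RegisterPrefix("https://", WebRequestCreator.BrowserHttp);
Any proxies or HttpWebRequests created after this call for matching URLs use the browser's networking, including its verb and status-code limitations.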

Take a look at this:
http://support.microsoft.com/kb/307267
Just change your URLs to https.
Hope this helps.

Dim url As Uri = New Uri(Application.Current.Host.Source, "../WebService.asmx")
Dim binding As New System.ServiceModel.BasicHttpBinding()

' Use transport (SSL) security when the app was served over HTTPS.
If url.Scheme = "https" Then
    binding.Security.Mode = System.ServiceModel.BasicHttpSecurityMode.Transport
End If

' Raised from the 64 KB defaults to work around message-size failures.
binding.MaxBufferSize = Integer.MaxValue
binding.MaxReceivedMessageSize = Integer.MaxValue

Dim proxy As New ServiceReference1.WebServiceSoapClient(binding, New System.ServiceModel.EndpointAddress(url))
proxy.InnerChannel.OperationTimeout = New TimeSpan(0, 10, 0)

Related

authentication/http headers support in forge.file trigger.io module?

In the official trigger.io docs there seems to be no provision for custom HTTP headers when it comes to the forge.file module. I need this so I can download files behind an HTTP authentication scheme. This seems like an easy thing to add, if support is not already there.
Any workarounds? Any chance of a quick fix in the next update? I know I could use forge.request instead, but I'd like to keep a local copy (saveURL).
Thanks
Unfortunately the file module just uses simple "download url" methods rather than a full HTTP request library, which makes it a fairly big task to add support for custom headers.
I've added a task to our backlog for this, but I don't have a timeframe for it being added.
Currently on iOS you can do basic auth by using URLs in the form http://user:password@url.com, in case that helps.
Maybe to avoid this you can configure your server differently, or put a proxy server in front that allows you to pass authentication details as GET parameters?

Youtube API (v3) and CORS preflight

EDIT: Looks like this post is much ado about nothing, as the OPTIONS preflighting was introduced late last fall; my problems have been more user error than anything. See comments for more details (I'll leave it all here for documentation's sake). --jlm
It's a known issue that the Youtube API does not support the OPTIONS request method, so any web app that tries to do a Cross-Origin preflight will fail, even if (as is the case with the Youtube API) the actual CORS request would succeed. As we've been facing this issue yet again this week, we've come up with three workarounds; however, each has its own pluses and minuses.
POSSIBLE WORKAROUND #1: Forget the preflight, and just make the simple cross-origin request.
Benefits -- A) it works. B) this is all the specification requires when doing most GET or POST requests.
Drawbacks -- A) The specification states that any PATCH, PUT, or DELETE requests, along with POST requests that use a content-type other than form-data, url-encoded, or text/plain, need to be preflighted. So while it would work, it would be breaking the spec (which we'd like to avoid if possible). B) Preflighting is also necessary when setting custom headers; so, for example, when I use AngularJS's $http method in the 1.0 branch, it sets a custom header and thus triggers preflighting of even GET requests. In this case, of course, I could write my own $http service or move to the 1.1 branch (as the problem would really be on Angular's end).
POSSIBLE WORKAROUND #2: Use JSON-P
Benefits -- A) it works, too (in some cases, anyway). B) It's fairly simple to set up.
Drawbacks -- A) An older technology, and it isn't clear if the Youtube API will continue to support JSONP. B) Requires a callback. C) Is limited to GET requests only.
POSSIBLE WORKAROUND #3: Set up a server-side proxy on the same domain, use it to communicate with the Youtube API.
Benefits -- A) Avoids the need to do CORS requests at all, since the client works on the same origin and the server-side proxy has no need to preflight anything.
Drawbacks -- A) Can get complicated to set up, especially if trying to work with credentials (OAuth2 through a proxy can be quite the beast).
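For what it's worth, here is a minimal sketch of the workaround #3 idea as an ASP.NET generic handler. The handler name, query parameter, and forwarding logic are illustrative assumptions; it relays only unauthenticated GET requests and omits API keys and error handling:
using System;
using System.IO;
using System.Net;
using System.Web;

// Hypothetical same-origin proxy, e.g. registered as /YouTubeProxy.ashx.
public class YouTubeProxy : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        // The client passes the API resource it wants to reach, e.g. ?path=videos
        string path = context.Request.QueryString["path"];

        var apiRequest = (HttpWebRequest)WebRequest.Create(
            "https://www.googleapis.com/youtube/v3/" + path);
        apiRequest.Method = "GET";

        using (var apiResponse = (HttpWebResponse)apiRequest.GetResponse())
        using (var stream = apiResponse.GetResponseStream())
        {
            // Relay the API response to the browser on the same origin,
            // so no CORS request (and no preflight) happens on the client.
            context.Response.ContentType = apiResponse.ContentType;
            stream.CopyTo(context.Response.OutputStream);
        }
    }

    public bool IsReusable { get { return true; } }
}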
So that this post has an actual question (or several, actually):
What take do you have (for better for worse) on any or all of the workarounds above?
Has anyone implemented other solutions?
Is there any information as to if/when the Youtube servers might support the OPTIONS method?
Any and all comments are welcome -- and I apologize in advance if this isn't the best forum for such a question (although I'm hoping that by putting it here on Stack Overflow, it'll prove useful to others facing the same issue)
The Youtube upload API (uploads.gdata.youtube.com) doesn't support CORS.
You may vote for the issues here and here. There are also several forum posts about it: CORS is not working!
It seems Google is still working to fix it, though they started working on it about a year ago, so most likely the API will be deprecated before the issue is fixed (as we already have a v3).
The only workaround is to use a proxy (which is not cool). Basically, upload the video to your server and then send it to Youtube.

How to get a SPNEGO / Kerberos Session key -and implement HTTP Authentication:Negotiate on my own client

I was recently exposed to a new authentication method I had no idea of.
After reading and researching a bit to understand it, I understood it has something to do with SPNEGO, or maybe it is just SPNEGO.
I'm running Windows XP on a large network. When my browser opens, it automatically
connects to a web service on the network, which requires authentication:
HTTP/1.1 401 Unauthorized
WWW-Authenticate: Negotiate
Then my browser automatically sends (along with more headers, of course):
Authorization: Negotiate (encrypted string).
I concluded this handshake uses the SPNEGO protocol.
What I need to do is create my own client (actually, it's a bot that uses this web service that requires that authentication). I need to get that encrypted string (exactly like my browser gets it, probably by using the SPNEGO protocol) without any user interaction (again, like my browser).
The thing is, I don't have enough time to study the SPNEGO protocol and how to implement it.
I'm using C/C++, but if I have no other option, C# would be okay as well.
Are there any functions / classes / codes or maybe even good tutorials to help me implement it shortly?
curl works with Kerberos/SPNEGO. I'm not sure how well this functionality works on Windows; you should try and see. It works well enough on Linux. You can look at the source to see how it is done.
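If C# turns out to be acceptable, note that the .NET HTTP stack can perform the Negotiate (SPNEGO/Kerberos) handshake for you when you attach the logged-on user's Windows credentials. A minimal sketch with a placeholder URL (in C/C++ the equivalent is the Windows SSPI API, e.g. AcquireCredentialsHandle/InitializeSecurityContext, which is considerably more involved):
using System;
using System.IO;
using System.Net;

class NegotiateClient
{
    static void Main()
    {
        var request = (HttpWebRequest)WebRequest.Create("http://webservice.example.corp/");

        // Answer the "WWW-Authenticate: Negotiate" challenge with the current
        // Windows credentials; the framework builds the SPNEGO token and sends
        // the "Authorization: Negotiate ..." header without user interaction.
        request.UseDefaultCredentials = true;

        using (var response = (HttpWebResponse)request.GetResponse())
        using (var reader = new StreamReader(response.GetResponseStream()))
        {
            Console.WriteLine(reader.ReadToEnd());
        }
    }
}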

Windows Phone 7 - Cookies not sent to WCF service

I have a bunch of WCF services at a domain:
AuthenticationService (the standard MS one, over HTTPS)
AppService (HTTP)
Usually, I call the Authentication Service and a cookie is returned. For desktop applications, I detach the cookie and reattach it with every new service call to the AppService which exposes the meat of my API.
Silverlight in the browser automatically attaches the cookie in all calls to the domain. I expected the phone to do the same.
It doesn't.
Access to the headers is not supported on the phone, so manual manipulation is out. I wonder if it's because some bright spark at MS thought that the phone should enforce that cookies are only reattached to HTTPS endpoints at the same domain, or something...
Help!!
This is a nightmare to troubleshoot, since the phone doesn't support the other major helpful setting: ignoring self-signed certificates.
Thanks,
Luke
** UPDATE **
While I'm following up the method using CookieContainer, I must point out that even though the Add method on the Headers collection is missing in Silverlight, one can still add headers using the indexer.
See http://cisforcoder.wordpress.com/2010/12/01/how-to-implement-basic-http-authentication-in-wcf-on-windows-phone-7/
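For illustration, a minimal sketch of setting a header through the indexer; the URL and credentials are placeholders:
// Requires: using System; using System.Net; using System.Text;
// WebHeaderCollection.Add is not available here, but the indexer works.
var request = (HttpWebRequest)WebRequest.Create(new Uri("https://example.com/AppService.svc"));
request.Method = "POST";
request.Headers[HttpRequestHeader.Authorization] =
    "Basic " + Convert.ToBase64String(Encoding.UTF8.GetBytes("user:password"));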
** UPDATE 2 **
CookieContainer can be set as per Lex's answer. I'm now stuck and continuing to investigate an ArgumentNullException thrown from within the WCF client on EndInvoke in References.cs. My server shows no sign of receiving the call.
Two key calls on the stack are:
System.Net.Browser.HttpWebRequestHelper.ParseHeaders
And
MS.Internal.InternalWebRequest.OnDownloadFailed
FINAL UPDATE
The ArgumentNullException seems to be thrown when called against a server with a self-signed certificate.
However, there's something odd with the emulator/SDK. I have had this exception against all my servers, even those with no SSL, and one with an issued certificate.
I've also had problems that have been resolved only by a local reboot. So I think my problems have been a result of having the right code, but thinking it was wrong because of other problems in the SDK.
Not sure what advice to give, except to be mistrusting of exceptions stemming from the WP7 WCF stack, particularly EndpointNotFoundException and ArgumentNullException, and to have a full framework test client app around as a sanity check.
Luke
In desktop Silverlight, I do it by sharing a cookie container between service proxies on the client HTTP stack. I did not try this on WP7, but you may check this out.
First, you need to insert <httpCookieContainer/> in your client-side binding configuration (usually in ServiceReferences.ClientConfig).
In code, switch to the client HTTP stack.
WebRequest.RegisterPrefix("http://", WebRequestCreator.ClientHttp);
WebRequest.RegisterPrefix("https://", WebRequestCreator.ClientHttp);
Then you need to create a CookieContainer instance and assign it to all service proxies.
var cc = new CookieContainer();
var service1 = new ServiceReference1.MyService1Client { CookieContainer = cc };
var service2 = new ServiceReference2.MyService2Client { CookieContainer = cc };
Now, when your cookie container receives a cookie, it will be reused for all web services. Make sure that the cookie comes with the correct Path setting.

Using a subdomain to identify a client

I'm working on building a Silverlight application where we want a client to be able to hit a URL like:
http://{client}.domain.com/
and log in, where the {client} part is their business name. So, for example, Google's would be:
http://google.domain.com/
What I was wondering is whether anyone has been able, in Silverlight, to use this subdomain model to make decisions on the call to the web server so that you can switch to a specific database to run a query. Unfortunately, it's something that is quite necessary for the project, as we are trying to make it easy for their employees to get their company-specific information from our software.
Wouldn't it work to put the service on a specific subdomain itself, such as wcf.example.com, and then set up a cross-domain policy file on the service to allow access to it?
As long as that works, you could just load the Silverlight app on the proper subdomain and then pass that subdomain to your service and let it do its thing.
Some examples of this below:
Silverlight Cross Domain Services
Silverlight Cross Domain Policy Helpers
On the server side you can check the HTTP 1.1 Host header to see how the user came to your server and do the necessary customization based on that.
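For example, in an ASP.NET/WCF backend this could look roughly like the following sketch; GetConnectionStringForClient is a hypothetical helper for whatever tenant-to-database mapping you use:
// Requires: using System.Web;
// e.g. host = "google.domain.com" when the request came in on that subdomain.
string host = HttpContext.Current.Request.Url.Host;
string client = host.Split('.')[0];

// Hypothetical lookup: map the subdomain to that client's connection string.
string connectionString = GetConnectionStringForClient(client);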
I think you cannot do this with Silverlight alone; I know you cannot do it without problems with JavaScript, Ajax, etc. That is because a subdomain is, for security reasons, treated differently than a sub-page by browsers.
What about the following idea: add a rewrite rule to your web server software. So if http://google.domain.com is called, the web server itself rewrites the URL to something like http://www.domain.com/google/ (or better: http://www.domain.com/customers/google/). Would that help?
Georgi:
That would help if it were static, but alas, it's all going to be dynamic. My hope was to have a single deployment of the application, and to use the http://google.domain.com/ idea to switch to the correct database for the user. I recall doing this once when we built an ASP.NET website, using the domain context to figure out what skin to use, etc.
Ates: Can you explain more about what you are saying... sounds like you are close to what I am trying to come up with. Have you seen such a tutorial for this?
The only other way I have come up with to make this work is to have a metabase so that when the user logs in, it switches them to the appropriate database as required... I was also thinking that telling client X to hit:
http://ClientX.domain.com/ would have been sweeter than telling them to hit http://www.domain.com/ and log in. Having them hit their own name and see the site personalized for them right from the login screen would have been much more appealing to the client base.
#Richard B: No, I can't think of any such tutorial that I've seen before. I'll try to be more verbose.
The server-side approach in more detail:
Direct *.example.com to the same IP in your DNS settings.
The backend app that handles login checks the Host HTTP header (e.g. the "HTTP_HOST" server variable in some platforms). That would contain the exact subdomain.example.com that the client used for reaching your server. Extract the subdomain part and continue...
There can also be a client-side-only approach. I don't know much about Silverlight but I'm assuming that you should be able to interface Silverlight with JavaScript. You could read document.location with JavaScript and pass it to your Silverlight applet, whereon further data fetching etc. logic would rely on the subdomain that was passed in by JavaScript.
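As a small addition: the host can also be read in Silverlight without JavaScript interop, assuming the XAP is served from the same subdomain the user browsed to. A minimal sketch:
// Requires: using System.Windows; (Application.Current.Host.Source)
// Host of the URL the XAP was downloaded from, e.g. "google.domain.com".
string host = Application.Current.Host.Source.Host;
string client = host.Split('.')[0];

// Pass "client" to the web service so it can pick the right database.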
#Ates:
That is what we did when we wrote the ASP.NET system... we pointed a slew of *.example.com hosts at the web server and handled them using the HTTP headers. The hold-up comes when dealing with WCF passing the info between the client and the server... it can only exist on one domain...
So, for example, when you have {client}.example.com and {sandbox}.example.com, the WCF service can't be registered to both. It also cannot be registered to just *.example.com or example.com, so that's where the catch-22 comes in. Everything else I have prior knowledge of handling.
I recall a method by which an application can "spoof" another domain name in certain instances. I take it that in this case I would need such a configuration? Much to research yet, I believe.
