Getting "Signature is invalid." when using Artifact Binding during the artifact consumption step - itfoxtec-identity-saml2

I have an IdP and an SP set up using the ITfoxtec SAML2 libraries, and everything works great when not using artifact binding, or when not validating signatures. When using artifact binding and validating signatures, I get a "Signature is invalid." exception in the ACS when trying to retrieve and bind the actual response/assertion.
It seems to unbind the artifact response fine; then, when it goes to retrieve and unbind the artifact from the ArtifactResolutionService, it fails, specifically on the last line of this block:
var soapEnvelope = new Saml2SoapEnvelope();
saml2AuthnResponse = new Saml2AuthnResponse(config);
await soapEnvelope.ResolveAsync(httpClient, saml2ArtifactResolve, saml2AuthnResponse);
I've checked that my signature validation certificate is correct and I've dug through the source code, but I'm scratching my head. I've tried to validate the saml2p:ArtifactResponse myself, but there isn't much out there.
If I put this line before the chunk above, everything works as expected, since it no longer validates the signature:
config.SignatureValidationCertificates.Clear();
One thing I noticed is that in the saml2p:ArtifactResponse there is a signature inside that node, but not inside the contained saml2p:Response node. Is it possible that the saml2p:Response is being isolated and then a signature check is being performed on it alone? I checked whether the response/assertion in the artifact cache on the IdP side (artifactSaml2AuthnResponseCache) was supposed to be signed, but the response isn't signed at all. I'm doing this before putting it in the cache, just like in the example and just like I do when using POST binding:
var token = saml2AuthnResponse.CreateSecurityToken(relyingParty.Issuer, subjectConfirmationLifetime: 5, issuedTokenLifetime: 60);
artifactSaml2AuthnResponseCache[saml2ArtifactResolve.Artifact] = saml2AuthnResponse;
EDIT: I have determined that the ArtifactResponse just isn't signed properly. Another tool reports that the digest in the XML doesn't match the computed value. This is after stepping through the source and grabbing the XML that the code is trying to validate directly. I can see that the ArtifactResolve is signed and validated properly (I confirmed this with the external tool as well), but the ArtifactResponse isn't. Even in the code it fails at the final validation of the signature (and not at any of the checks before it).
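If you want to reproduce that check outside the library, a standalone snippet along these lines can validate the captured XML against the signing certificate (a minimal sketch; the file names are placeholders, and note PreserveWhitespace, which matters for exactly the reason described in EDIT 2 below):
using System;
using System.Security.Cryptography.X509Certificates;
using System.Security.Cryptography.Xml;
using System.Xml;

// Minimal sketch: file names are placeholders.
var doc = new XmlDocument { PreserveWhitespace = true }; // do not re-format signed XML
doc.Load("artifact-response.xml");
var cert = new X509Certificate2("idp-signing.cer");

// The Signature element sits directly under saml2p:ArtifactResponse.
var signatureNode = (XmlElement)doc.GetElementsByTagName("Signature", "http://www.w3.org/2000/09/xmldsig#")[0];
var signedXml = new SignedXml(doc);
signedXml.LoadXml(signatureNode);

// Checks both the digest of the referenced content and the signature value.
Console.WriteLine(signedXml.CheckSignature(cert, verifySignatureOnly: true)
    ? "Signature is valid."
    : "Signature is invalid (digest or signature value mismatch).");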
EDIT 2: Found the problem in the source. The .ToXmlDocument() extension is breaking the signed XML: re-serializing the signed envelope changes the formatting of the signed content, so the digest no longer matches. The final test was done by replacing it in place with a new method that just returns the string directly via envelope.ToString(SaveOptions.DisableFormatting):
protected virtual XmlDocument ToSoapXml()
{
    var envelope = new XElement(Saml2Constants.SoapEnvironmentNamespaceX + Saml2Constants.Message.Envelope);
    envelope.Add(GetXContent());
    return envelope.ToXmlDocument();
}

protected string ToSoapXmlString()
{
    var envelope = new XElement(Saml2Constants.SoapEnvironmentNamespaceX + Saml2Constants.Message.Envelope);
    envelope.Add(GetXContent());
    return envelope.ToString(SaveOptions.DisableFormatting); //.ToXmlDocument();
}
And save that directly to the SoapResponseXml of the Saml2SoapEnvelope:
protected override Saml2SoapEnvelope BindInternal(Saml2Request saml2Request, string messageName)
{
    if (!(saml2Request is Saml2ArtifactResponse))
        throw new ArgumentException("Only Saml2ArtifactResponse is supported");

    BindInternal(saml2Request);
    SoapResponseXml = ToSoapXmlString(); // ToSoapXml().OuterXml;
    return this;
}
I would open a pull request for this change, but honestly I'm not that up to speed with Git. I'm also not sure whether this is the best way to fix the issue.

Thank you for your question and for the code to solve the problem. I'll look into it.
EDIT: I'm trying to reproduce the error, but no luck. The sample is both an IdP and an RP; what have you changed to get the error?

Related

CPQ Quote API, I can't save the quote

I can't save the quote.
Doing the query:
select
ApexClass.name, Id, CreatedDate, CreatedById, JobType,
ApexClassId, Status, JobItemsProcessed, TotalJobItems,
NumberOfErrors, CompletedDate, MethodName, ExtendedStatus,
ParentJobId, LastProcessed, LastProcessedOffset
from
AsyncApexJob
order by
CreatedDate desc
I get this error:
Calculation error on quote Q-13761: "UNAUTHORIZED"
Code:
public with sharing class QuoteCalculator {
    public void calculate(QuoteModel quote, String callbackClass) {
        system.debug('quote: ' + quote);
        system.debug('callbackClass: ' + callbackClass);
        QuoteCalculatorContext ctx = new QuoteCalculatorContext(quote, callbackClass);
        SBQQ.ServiceRouter.load('SBQQ.QuoteAPI.QuoteCalculator', null, JSON.serialize(ctx));
        system.debug('QuoteCalculator.calculate');
    }

    private class QuoteCalculatorContext {
        // The quote and callbackClass properties are referenced in the API code
        // by the exact names seen here. Altering these property names will cause
        // calculator API calls to fail.
        private QuoteModel quote;
        private String callbackClass;

        private QuoteCalculatorContext(QuoteModel quote, String callbackClass) {
            this.quote = quote;
            this.callbackClass = callbackClass;
        }
    }
}
anonymous window:
QuoteReader reader = new QuoteReader();
QuoteModel quote = reader.read('a0p1w000BhfXzAAJ');
System.debug(quote);
quote.lineItems[0].record.SBQQ__Quantity__c = 2;
QuoteCalculator calculator = new QuoteCalculator();
calculator.calculate(quote, 'MyCallback');
Preface
I had (almost) exactly the same code base as yours, and got the same error message.
In my case there was another sandbox where I could test my code, and it turned out to work properly there.
Cause
Later I found out that Salesforce CPQ's Calculation Quote API uses Heroku to do the calculations, in order to avoid exhausting Apex limits.
From this it can be deduced that it needs a Connected App. I checked the Apps -> Connected Apps setup and found that no record was listed under the "Connected Apps OAuth Usage" page for Salesforce CPQ. (On my other sandbox there was a "Steelbrick CPQ" row.)
From this I concluded that this might be the reason for the behaviour.
It seems something went wrong during the "Authorize new Calculation Service" process. (Or there was a sandbox refresh and something else went wrong during it.)
Solution
The bad news is that the option to authorize a new calculation service is only visible the first time you configure the package, which you may already have done. (Well... if you haven't, then this is great news, because your problem is probably solved. :D) (Otherwise read on.)
The good news is that I figured out a solution for the case when you have already done this, yet that "Steelbrick CPQ" row is missing.
I created a scratch org and installed the Salesforce CPQ package. Then, before clicking the "Authorize new Calculation Service" link under the "Pricing and Calculation" tab in the Settings Editor, I checked the source code in the hope of finding something of interest.
I did.
This link: https://rest-na.steelbrick.com/oauth/auth/https%3A%2F%2Ftest.salesforce.com/SBQQ
(⚠️NOTE: You might have to change it according to your location. There are several servers across the globe:
rest-au.steelbrick.com
rest-eu.steelbrick.com
rest-jp.steelbrick.com
rest-na.steelbrick.com
But for me the link pasted above was generated on the settings page. Which is only interesting because I live in the EU, yet for some reason I got the link to the rest-na server... whatever.gif
So if you do click on the link, just make sure you can see the appropriate Salesforce instance URL in the address bar.)
Conclusion
With this link you won't have to reinstall the package: just click on it and allow access from Steelbrick. The missing row will appear, and you will be authorized to use the Calculation API.

Issue creating SamlResponse when following your example Idp code - within the LoginResponse method

I have created an IDP using the code contained within https://github.com/ITfoxtec/ITfoxtec.Identity.Saml2/blob/master/test/TestIdPCore/Controllers/AuthController.cs
This throws an error when I attempt to bind the AuthnResponse using the following code:
var responseBinding = new Saml2PostBinding();
var samlResponse = responseBinding.Bind(saml2AuthnResponse).XmlDocument.OuterXml;
This is the same code as in the PostContent method, but I've opted to use it directly as I just need the SamlResponse.
The error is:
Microsoft.IdentityModel.Tokens.Saml2.Saml2SecurityTokenWriteException: 'IDX13129: The SAML2:AttributeStatement must contain at least one SAML2:Attribute.'
With the following abridged stack trace:
at Microsoft.IdentityModel.Tokens.Saml2.Saml2Serializer.WriteAttributeStatement(XmlWriter writer, Saml2AttributeStatement statement)
at Microsoft.IdentityModel.Tokens.Saml2.Saml2Serializer.WriteStatement(XmlWriter writer, Saml2Statement statement)
at Microsoft.IdentityModel.Tokens.Saml2.Saml2Serializer.WriteAssertion(XmlWriter writer, Saml2Assertion assertion)
at Microsoft.IdentityModel.Tokens.Saml2.Saml2SecurityTokenHandler.WriteToken(XmlWriter writer, SecurityToken securityToken)
at ITfoxtec.Identity.Saml2.Tokens.Saml2ResponseSecurityTokenHandler.WriteToken(SecurityToken token)
at ITfoxtec.Identity.Saml2.Saml2AuthnResponse.ToXml()
I have used your example code almost exactly, so is there an issue within it, or am I missing something?
Many thanks
Here's a deeper analysis and two possible solutions/workarounds.
The situation: creating an ITfoxtec.Identity.Saml2.Saml2AuthnResponse for a ClaimsIdentity that has only one claim: the nameidentifier.
The relevant code snippet (not complete, just the relevant part; the ITfoxtec samples have the full code):
var response = new Saml2AuthnResponse(config);
response.ClaimsIdentity = new ClaimsIdentity(new[] { new Claim(ClaimTypes.NameIdentifier, "someone#somewhere.com") });
response.NameId = new Saml2NameIdentifier(....etc...);
var token = response.CreateSecurityToken(appliesToAddress);
//so far all is well, but the problem has been sneakily introduced!
//which is why the next line will give the error: Microsoft.IdentityModel.Tokens.Saml2.Saml2SecurityTokenWriteException: 'IDX13129: The SAML2:AttributeStatement must contain at least one SAML2:Attribute
return binding.Bind(response).ToActionResult();
Explanation:
ITfoxtec code: the nameidentifier claim is removed from the claims when the token is created. This makes sense, as it is supposed to end up in the NameId property. The remaining claims are set as the Subject in the SecurityTokenDescriptor that is fed to the Saml2SecurityTokenHandler, which is Microsoft code.
var tokenDescriptor = new SecurityTokenDescriptor();
tokenDescriptor.Subject = new ClaimsIdentity(claims.Where(c => c.Type != ClaimTypes.NameIdentifier));
The claims in this token descriptor then end up as Attributes in the AttributeStatement of the generated Saml2SecurityToken (via a Saml2SecurityTokenHandler.CreateToken(tokenDescriptor) call).
Unfortunately, if the nameidentifier was the only claim you had, you end up with an AttributeStatement that has no Attributes, and you subsequently run into the problem when binding.Bind(response), deep down in its bowels, does its XML thing...
Unless you are always supposed to have an AttributeStatement, this looks to me like a bug / edge case in the Microsoft.IdentityModel.Tokens.Saml library.
There are two ways to solve it:
Prevent ending up with no claims: simply add another claim to the identity. It doesn't have to be email; it can be anything:
response.ClaimsIdentity.AddClaim(new Claim("x", "y"));
After the CreateSecurityToken call but before the call to Bind, check whether the AttributeStatement is empty and, if so, remove it. A quick and dirty example:
var attributeStatement = (Saml2AttributeStatement)token.Assertion.Statements
    .FirstOrDefault(s => s.GetType() == typeof(Saml2AttributeStatement));
if (attributeStatement?.Attributes.Count == 0)
{
    token.Assertion.Statements.Remove(attributeStatement);
}
Personally, I prefer option 1, as it is generally safer and less code. Plus, I'm sure there can always be 'something' to further attribute the identity with...
Maybe you are missing the part that adds claims to the token and creates the token?
saml2AuthnResponse.SessionIndex = sessionIndex;
var claimsIdentity = new ClaimsIdentity(claims);
saml2AuthnResponse.NameId = new Saml2NameIdentifier(claimsIdentity.Claims.Where(c => c.Type == ClaimTypes.NameIdentifier).Select(c => c.Value).Single(), NameIdentifierFormats.Persistent);
saml2AuthnResponse.ClaimsIdentity = claimsIdentity;
var token = saml2AuthnResponse.CreateSecurityToken(relyingParty.Issuer, subjectConfirmationLifetime: 5, issuedTokenLifetime: 60);
https://github.com/ITfoxtec/ITfoxtec.Identity.Saml2/blob/master/test/TestIdPCore/Controllers/AuthController.cs#L110
I have found that you need both the ClaimTypes.NameIdentifier and ClaimTypes.Email claims for the token to be generated successfully.
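Putting the pieces together, here is a minimal sketch of the claims setup that avoids the empty AttributeStatement (claim values are placeholders; with a second claim besides the nameidentifier, IDX13129 goes away):
var claims = new[]
{
    new Claim(ClaimTypes.NameIdentifier, "someone@somewhere.com"),
    new Claim(ClaimTypes.Email, "someone@somewhere.com")
};
var claimsIdentity = new ClaimsIdentity(claims);

saml2AuthnResponse.SessionIndex = sessionIndex;
saml2AuthnResponse.NameId = new Saml2NameIdentifier(
    claimsIdentity.Claims.Where(c => c.Type == ClaimTypes.NameIdentifier).Select(c => c.Value).Single(),
    NameIdentifierFormats.Persistent);
saml2AuthnResponse.ClaimsIdentity = claimsIdentity;

// The Email claim ends up as the single Attribute in the AttributeStatement.
var token = saml2AuthnResponse.CreateSecurityToken(relyingParty.Issuer, subjectConfirmationLifetime: 5, issuedTokenLifetime: 60);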

Provide a callback URL in Google Cloud Storage signed URL

When uploading to GCS (Google Cloud Storage) using the BlobStore's createUploadURL function, I can provide a callback together with header data that will be POSTed to the callback URL.
There doesn't seem to be a way to do that with GCS's signed URLs.
I know there is Object Change Notification, but that won't allow the user to provide upload-specific information in the header of a POST, the way it is possible with createUploadURL's callback.
My feeling is, if createUploadURL can do it, there must be a way to do it with signed URLs, but I can't find any documentation on it. I was wondering if anyone may know how createUploadURL achieves that callback-calling behavior.
PS: I'm trying to move away from createUploadURL because of the __BlobInfo__ entities it creates, which for my specific use case I do not need, and which somehow seem to be indelible and waste storage space.
Update: It worked! Here is how:
Short Answer: It cannot be done with PUT, but can be done with POST
Long Answer:
If you look at the signed-URL page, in front of HTTP_Verb, under Description, there is a subtle note that the page is only relevant to GET, HEAD, PUT, and DELETE; POST is a completely different game. I had missed this, but it turned out to be very important.
There is a whole page of HTTP Headers that does not list an important header that can be used with POST; that header is success_action_redirect, as voscausa correctly answered.
On the POST page, Google "strongly recommends" using PUT, unless you are dealing with form data. However, POST has a few nice features that PUT does not have. Perhaps they worry that POST gives us too many strings to hang ourselves with.
But I'd say it is totally worth dropping createUploadURL, and writing your own code to redirect to a callback. Here is how:
Code:
If you are working in Python, voscausa's code is very helpful.
I'm using apejs to write javascript in a Java app, so my code looks like this:
var exp = new Date()
exp.setTime(exp.getTime() + 1000 * 60 * 100); //100 minutes
json['GoogleAccessId'] = String(appIdentity.getServiceAccountName())
json['key'] = keyGenerator()
json['bucket'] = bucket
json['Expires'] = exp.toISOString();
json['success_action_redirect'] = "https://" + request.getServerName() + "/test2/";
json['uri'] = 'https://' + bucket + '.storage.googleapis.com/';
var policy = {'expiration': json.Expires
, 'conditions': [
["starts-with", "$key", json.key],
{'Expires': json.Expires},
{'bucket': json.bucket},
{"success_action_redirect": json.success_action_redirect}
]
};
var plain = StringToBytes(JSON.stringify(policy))
json['policy'] = String(Base64.encodeBase64String(plain))
var result = appIdentity.signForApp(Base64.encodeBase64(plain, false));
json['signature'] = String(Base64.encodeBase64String(result.getSignature()))
The code above first provides the relevant fields.
Then it creates a policy object, stringifies it, and converts it into a byte array (you can use .getBytes in Java; I had to write a function for JavaScript, included at the end).
A Base64-encoded version of this array populates the policy field.
Then it is signed using the appIdentity package. Finally the signature is Base64 encoded, and we are done.
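For comparison, if you are signing from .NET instead of the App Engine appIdentity service, the same policy-signing step looks roughly like this (a minimal sketch, assuming the service account's private RSA key is available in PEM form; all field values are placeholders):
using System;
using System.IO;
using System.Security.Cryptography;
using System.Text;

// Placeholder policy document; conditions mirror the JavaScript example above.
string policyJson = "{ \"expiration\": \"2030-01-01T00:00:00Z\", \"conditions\": [" +
    "[\"starts-with\", \"$key\", \"uploads/\"]," +
    "{ \"bucket\": \"my-bucket\" }," +
    "{ \"success_action_redirect\": \"https://example.com/callback\" } ] }";

// The 'policy' form field is the Base64 of the raw JSON.
string policyBase64 = Convert.ToBase64String(Encoding.UTF8.GetBytes(policyJson));

using (RSA rsa = RSA.Create())
{
    rsa.ImportFromPem(File.ReadAllText("service-account-key.pem")); // .NET 5+
    // The 'signature' form field is the Base64 of an RSA-SHA256 signature over the Base64 policy.
    byte[] signature = rsa.SignData(Encoding.UTF8.GetBytes(policyBase64),
        HashAlgorithmName.SHA256, RSASignaturePadding.Pkcs1);
    string signatureBase64 = Convert.ToBase64String(signature);
    Console.WriteLine(policyBase64);
    Console.WriteLine(signatureBase64);
}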
On the client side, all members of the json object will be added to the form, except the uri, which is the form's address.
var formData = new FormData(document.forms.namedItem('upload'));
var blob = new Blob([thedata], {type: 'application/json'})
var keys = ['GoogleAccessId', 'key', 'bucket', 'Expires', 'success_action_redirect', 'policy', 'signature']
for(field in keys)
formData.append(keys[field], url[keys[field]])
formData.append('file', blob)
var rest = new XMLHttpRequest();
rest.open('POST', url.uri)
rest.onload = callback_function
rest.send(formData)
If you do not provide a redirect, the response status will be 204 on success. But if you do redirect, the status will be 200. If you get a 403 or 400, something about the signature or policy may be wrong. Look at the responseText; it is often helpful.
A few things to note:
Both POST and PUT have a signature field, but these mean slightly different things. In the case of POST, this is a signature of the policy.
PUT has a base URL which contains the key (object name), but the URL used for POST may only include the bucket name.
PUT requires the expiration as seconds from the UNIX epoch, but POST wants it as an ISO string (see the snippet after this list).
A PUT signature should be URL-encoded (in Java: by wrapping it in a URLEncoder.encode call). For POST, Base64 encoding suffices.
By extension, for POST use Base64.encodeBase64String(result.getSignature()), and do not use the Base64.encodeBase64URLSafeString function.
You cannot pass extra headers with POST; only those listed on the POST page are allowed.
If you provide a URL for success_action_redirect, it will receive a GET with the key, bucket and eTag.
The other benefit of using POST is that you can set size limits. With PUT, by contrast, if a file breaches your size restriction, you can only delete it after it has been fully uploaded, even if it is multiple terabytes.
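To make the expiration difference concrete, here is a small illustration of the two formats (a C# sketch; the 100-minute lifetime mirrors the JavaScript example above):
using System;

DateTimeOffset expires = DateTimeOffset.UtcNow.AddMinutes(100);
long putExpiration = expires.ToUnixTimeSeconds();           // PUT: seconds since the UNIX epoch
string postExpiration = expires.UtcDateTime.ToString("o");  // POST: ISO 8601 string
Console.WriteLine(putExpiration);
Console.WriteLine(postExpiration);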
What is wrong with createUploadURL?
The method above is a manual createUploadURL.
But:
You don't get those __BlobInfo__ objects, which create many indexes and are indelible. This irritates me, as it wastes a lot of space (which reminds me of a separate issue: issue 4231. Please go give it a star).
You can provide your own object name, which helps create folders in your bucket.
You can provide different expiration dates for each link.
For the very, very few JavaScript app-engineers:
function StringToBytes(sz) {
    return sz.split('').map(function(x) { return x.charCodeAt(0); });
}
You can include success_action_redirect in a policy document when you use the GCS POST Object API.
Docs here: https://cloud.google.com/storage/docs/xml-api/post-object
Python example here: https://github.com/voscausa/appengine-gcs-upload
Example callback result:
def ok(self):
    """ GCS upload success callback """
    logging.debug('GCS upload result : %s' % self.request.query_string)
    bucket = self.request.get('bucket', default_value='')
    key = self.request.get('key', default_value='')
    key_parts = key.rsplit('/', 1)
    folder = key_parts[0] if len(key_parts) > 1 else None
A solution I am using is to turn on Object Change Notifications. Any time an object is added, a POST is sent to a URL; in my case, a servlet in my project.
In the doPost() I get all the info about the object added to GCS, and from there I can do whatever I need.
This worked great in my App Engine project.

Parsing Swagger JSON data and storing it in .net class

I want to parse Swagger data from the JSON I get from {service}/swagger/docs/v1 into a dynamically generated .NET class.
The problem I am facing is that different APIs can have different numbers of parameters and operations. How do I dynamically parse Swagger JSON data for different services?
My end result should be a list of all APIs and their operations in a variable on which I can easily perform a search.
Did you ever find an answer for this? Today I wanted to do the same thing, so I used the AutoRest open source project from MSFT, https://github.com/Azure/autorest. While it looks like it's designed for generating client code (code to consume the API documented by your swagger document), at some point on the way to producing this code it had to have done exactly what you asked in your question: parse the Swagger file and understand the operations, inputs and outputs the API supports.
In fact, we can get at this information: AutoRest publicly exposes it.
So use NuGet to install AutoRest. Then add a reference to AutoRest.core and AutoRest.Model.Swagger. So far I've simply gone for:
using Microsoft.Rest.Generator;
using Microsoft.Rest.Generator.Utilities;
using System.IO;
...
var settings = new Settings();
settings.Modeler = "Swagger";
var mfs = new MemoryFileSystem();
mfs.WriteFile("AutoRest.json", File.ReadAllText("AutoRest.json"));
mfs.WriteFile("Swagger.json", File.ReadAllText("Swagger.json"));
settings.FileSystem = mfs;
var b = System.IO.File.Exists("AutoRest.json");
settings.Input = "Swagger.json";
Modeler modeler = Microsoft.Rest.Generator.Extensibility.ExtensionsLoader.GetModeler(settings);
Microsoft.Rest.Generator.ClientModel.ServiceClient serviceClient;
try
{
    serviceClient = modeler.Build();
}
catch (Exception exception)
{
    throw new Exception(String.Format("Something nasty hit the fan: {0}", exception.Message));
}
The swagger document you want to parse is called Swagger.json and sits in your bin directory. The AutoRest.json file you can grab from their GitHub (https://github.com/Azure/autorest/tree/master/AutoRest/AutoRest.Core.Tests/Resource). I'm not 100% sure how it's used, but it seems to be needed to inform the tool about what it supports. Both JSON files need to be in your bin.
The serviceClient object is what you want. It will contain information about the methods, model types, and method groups the API supports.
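As a starting point, something along these lines should let you walk the parsed model into the searchable list the question asks for (a sketch only; I am going from memory, so treat the property names Methods, Name, HttpMethod and Url as assumptions to verify against the Microsoft.Rest.Generator.ClientModel types):
// Sketch: property names are assumptions to check against ClientModel.
// Needs using System.Collections.Generic; and using System.Linq;
var operations = new List<string>();
foreach (var method in serviceClient.Methods)
{
    operations.Add(string.Format("{0} {1} {2}", method.Name, method.HttpMethod, method.Url));
}
// 'operations' can now be searched easily:
var matches = operations.Where(o => o.Contains("pet")).ToList();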
Let me know if this works. You can try it with their resource files; I used their ExtensionsLoaderTests for reference when I was playing around (https://github.com/Azure/autorest/blob/master/AutoRest/AutoRest.Core.Tests/ExtensionsLoaderTests.cs).
(Also, thank you to Denis, an author of AutoRest.)
If this is still a question, you can use the Swagger Parser library:
https://github.com/swagger-api/swagger-parser
It is as simple as:
// parse a swagger description from the petstore and get the result
SwaggerParseResult result = new OpenAPIParser().readLocation("https://petstore3.swagger.io/api/v3/openapi.json", null, null);

How to change the URL for the SolrNet client

I am a newbie with SolrNet, and my question is how to change the URL for the SolrNet client.
I found this on the wiki.
Initializing code:
Startup.Init<Product>("http://localhost:8983/solr");
Invoking code:
var solr = ServiceLocator.Current.GetInstance<ISolrOperations<Product>>();
But I don't know how to change the URL. Could someone tell me how to do this? Thanks.
It cannot be changed with the existing SolrNet code, as it is implemented using the singleton pattern.
You have to download the code from GitHub.
If you simply initialize again with a new URL, the following exception is thrown:
"Key ... already registered in container". You can change the code so that it always creates a new instance (bypassing the singleton pattern).
The default request handler is "/select", so SolrNet will send your requests to:
http://localhost:8983/solr/select
If you wish to invoke a different request handler, you will need to get an instance of the SolrQueryExecuter and set the Handler property accordingly.
Assuming you have a request handler named "/browse":
Startup.Init<Product>("http://localhost:8983/solr");
var executor = ServiceLocator.Current.GetInstance<ISolrQueryExecuter<Product>>() as SolrQueryExecuter<Product>;
if (executor != null)
{
    executor.Handler = "/browse";
}
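Since the container hands out the same executer instance (see the singleton note above), queries issued afterwards through the normal SolrNet interface should then hit the new handler. A minimal usage sketch (the query string is a placeholder):
var solr = ServiceLocator.Current.GetInstance<ISolrOperations<Product>>();
// This request now goes to http://localhost:8983/solr/browse
var results = solr.Query(new SolrQuery("name:apple"));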
