Mule Salesforce Analytics Connector requirements do not match the schema - salesforce

We are using the MuleSoft sfdc-analytics connector. When we installed the connector into Anypoint Studio and used the connector's "Create data set" operation, the connector requires additional attributes, one of those being "operation"; however, in the schema (http://mulesoft.github.io/salesforce-analytics-connector/1.0.0/mule/sfdc-analytics-schema.html) the create-data-set element does not have "operation" defined, so when we run the application the package fails with the following issue:
INFO 2017-04-01 22:21:28,431 [main] org.mule.lifecycle.AbstractLifecycleManager: Disposing RegistryBroker
ERROR 2017-04-01 22:21:28,493 [main] org.mule.module.launcher.application.DefaultMuleApplication: null
org.xml.sax.SAXParseException: cvc-complex-type.3.2.2: Attribute 'operation' is not allowed to appear in element 'sfdc-analytics:create-data-set'.

It seems to me that you are using an old version of the Salesforce Analytics Connector. That schema belongs to version 1.0.0, which does not have the "operation" attribute. That attribute was added in version 2.0.0 (the latest is 2.1.0), as stated in the official release notes. And the related schema is here.
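If the application is still compiled against the 1.0.0 schema, its Mule XML also needs to reference the newer connector's schema. A minimal sketch, assuming the usual Mule 3 connector namespace convention in which the "current" alias resolves to the schema of the installed connector version:
<mule xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
      xmlns:sfdc-analytics="http://www.mulesoft.org/schema/mule/sfdc-analytics"
      xsi:schemaLocation="http://www.mulesoft.org/schema/mule/sfdc-analytics
          http://www.mulesoft.org/schema/mule/sfdc-analytics/current/mule-sfdc-analytics.xsd">
With a 2.x connector installed, the operation attribute on sfdc-analytics:create-data-set should then validate.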

Related

Apache Camel with Kafka Schema registry

I am building a Camel application to read messages from Confluent Kafka. The messages are in Avro format, and I added the below route configuration to read them using the schema registry in a Camel route. When I enable valueDeserializer=io.confluent.kafka.serializers.KafkaAvroDeserializer,
I do not get any messages from the Kafka topic. I tested the route without the schema registry and was able to consume messages.
Route definition:
from("kafka:topic1?sslTruststoreLocation=<jks file>
&valueDeserializer=io.confluent.kafka.serializers.KafkaAvroDeserializer
&brokers=host1:9092,host2:9092,host3:9092
&sslKeystoreType=JKS
&groupId=grp1
&allowManualCommit=true
&consumersCount=10
&sslKeyPassword=<password>
&autoOffsetReset=earliest
&sslKeystorePassword=<password>
&securityProtocol=SSL
&sslTruststorePassword=<password>
&sslEndpointAlgorithm=HTTPS
&maxPollRecords=10
&sslTruststoreType=JKS
&sslKeystoreLocation=<keystore_path>
&autoCommitEnable=false
&additionalProperties.schema.registry.url=https://localhost:8081
&additionalProperties.basic.auth.user.info=abc:xyz
&additionalProperties.basic.auth.credentials.source=USER_INFO");
Can you please let me know what is wrong in the above configuration for the schema registry? I also tried with EndpointRouteBuilder and hit the same issue. However, the producer application, which is also Camel based and uses the schema registry for publishing Avro messages, works fine.
I figured out how to configure basic auth with the Confluent schema registry. It needs to be configured as below:
from("kafka:topic1?sslTruststoreLocation=<jks file>
&valueDeserializer=io.confluent.kafka.serializers.KafkaAvroDeserializer
&brokers=host1:9092,host2:9092,host3:9092
&sslKeystoreType=JKS
&groupId=grp1
&allowManualCommit=true
&consumersCount=10
&sslKeyPassword=<password>
&autoOffsetReset=earliest
&sslKeystorePassword=<password>
&securityProtocol=SSL
&sslTruststorePassword=<password>
&sslEndpointAlgorithm=HTTPS
&maxPollRecords=10
&sslTruststoreType=JKS
&sslKeystoreLocation=<keystore_path>
&autoCommitEnable=false
&additionalProperties.schema.registry.url=https://localhost:8081
&additional-properties[basic.auth.user.info]=abc:xyz
&additional-properties[basic.auth.credentials.source]=USER_INFO");
Note that we need to use the additional-properties[...] syntax for basic.auth.user.info and basic.auth.credentials.source, as shown above.
My issue was that the schema registry password contained special characters, such as +.
So I had to wrap the property value in RAW() as described in the documentation [1].
Given the above example, it would then result in:
&additional-properties[basic.auth.user.info]=RAW(abc:xyz+)
[1] https://camel.apache.org/manual/faq/how-do-i-configure-endpoints.html#HowdoIconfigureendpoints-Configuringparametervaluesusingrawvalues
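As an alternative to the URI syntax, the same registry properties can be set on the Kafka component programmatically, which avoids URI parsing of special characters altogether. A minimal sketch, assuming Camel 3.x camel-kafka (KafkaConfiguration.setAdditionalProperties) and reusing the placeholder credentials from the example above; the class and method names are hypothetical:
import java.util.HashMap;
import java.util.Map;
import org.apache.camel.CamelContext;
import org.apache.camel.component.kafka.KafkaComponent;

public class RegistryAuthSetup {
    // Sets the Confluent schema registry properties on the shared Kafka component.
    public static void configure(CamelContext context) {
        KafkaComponent kafka = context.getComponent("kafka", KafkaComponent.class);
        Map<String, Object> props = new HashMap<>();
        props.put("schema.registry.url", "https://localhost:8081");
        props.put("basic.auth.credentials.source", "USER_INFO");
        // No RAW() wrapper is needed here: the value never passes through URI parsing.
        props.put("basic.auth.user.info", "abc:xyz+");
        kafka.getConfiguration().setAdditionalProperties(props);
    }
}
The route URI can then drop its additional-properties entries.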

WSO2 Identity Server XML config of service providers

My company is using WSO2 IS version 5.2. We have implemented it clustered, with 1 manager node and 3 worker nodes. We do not use multiple tenants. We are implementing a SAML approach to authentication. Our first implementation was in a development environment and included quite a bit of manual (UI-based) configuration. The following was done using the management console:
adding custom claims
adding service providers (we have 3 currently)
assigning custom claims to SPs
configuring the resident IdP
We now must set up and configure 50 more development, QA and UAT environments. We would like to be able to do this entirely through XML configuration, with no human data entry. Is there a specific resource that can walk me through the above 4 items? Note: we have determined how to add our own custom claims through XML config, so item #1 is no longer an issue, but I included it for reference. I am really mostly interested in items 2, 3 and 4.
We did find the following topic in the docs:
https://docs.wso2.com/display/IS520/Configuring+a+SP+and+IdP+Using+Configuration+Files
However, the above link does not go far enough to explain how to map custom claims to SPs. We also found this, which asks a very similar question but gives only part of what we are looking for.
Thanks for any assistance.
You could set up a basic environment and copy the database from the directory conf/repository/database.
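Beyond copying the database, the linked doc's file-based approach declares SAML service providers in repository/conf/identity/sso-idp-config.xml. A minimal sketch; the issuer and ACS URL are placeholders, and the exact element names should be double-checked against the IS 5.2.0 documentation since they have changed between releases:
<SSOIdentityProviderConfig>
    <ServiceProviders>
        <ServiceProvider>
            <Issuer>my-sp-issuer</Issuer>
            <AssertionConsumerServiceURLs>
                <AssertionConsumerServiceURL>https://sp.example.com/acs</AssertionConsumerServiceURL>
            </AssertionConsumerServiceURLs>
            <SignResponse>true</SignResponse>
            <EnableAttributeProfile>true</EnableAttributeProfile>
        </ServiceProvider>
    </ServiceProviders>
</SSOIdentityProviderConfig>
Claim mapping for file-based SPs is the part the linked page leaves underspecified, so treat this as SP registration only.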

Mule Salesforce Connector TLS 1.1 upgrade Issue

I was working on a POC which connects to a Salesforce account. The Mule version is 6.3.2 and the Salesforce connector version is 6.3.2. Until 2 days back it was working fine.
I came to know that last weekend Salesforce did a TLS upgrade from 1.0 to 1.1. When I was testing my flow, I got the below exception:
Root Exception stack trace:
[UnexpectedErrorFault [ApiFault exceptionCode='UNSUPPORTED_CLIENT'
exceptionMessage='TLS 1.0 has been disabled in this organization. Please use TLS 1.1 or higher when connecting to Salesforce using https.'
]
]
When I checked the Mule documentation, it said that Salesforce connector 7.1.2 has addressed this issue, so I updated my connector in Studio and retried the scenario, but it is still not working.
Can someone help me out on this?
Regards,
Vikram
I previously had to set the following property in the application settings:
https.protocols=TLSv1.1,TLSv1.2
And -Dhttps.protocols=TLSv1.1,TLSv1.2 in my wrapper.conf for Mule standalone.
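For reference, Mule standalone's wrapper.conf passes JVM flags through numbered wrapper.java.additional entries; the index (20 here is only an example) must not collide with entries already in the file:
wrapper.java.additional.20=-Dhttps.protocols=TLSv1.1,TLSv1.2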
You can put your configuration in tls-default.conf in the MULE_ESB/conf/ folder
and then set the values inside like below:
enabledProtocols=TLSv1.1, TLSv1.2
enabledCipherSuites=TLS_KRB5_WITH_3DES_EDE_CBC_MD5, TLS_KRB5_WITH_RC4_128_SHA, SSL_DH_anon_WITH_DES_CBC_SHA
Or, if you want to test from Anypoint Studio, just create the tls-default.conf and put it under your resources folder.
One more thing you can do is run your destination URL through https://www.ssllabs.com/ssltest/ to make sure TLSv1.1, and the cipher suites you need, are enabled by your endpoint.
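If you prefer a command-line check, openssl can attempt a handshake pinned to TLS 1.1 (the host below is an example; a successful handshake prints the negotiated protocol and cipher):
openssl s_client -connect login.salesforce.com:443 -tls1_1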
A similar question is answered in https://forums.mulesoft.com/questions/41012/getting-error-when-hitting-a-rest-api-via-https.html#answer-43960
Below is the answer I posted.
I resolved it on my system. When it is not working in the runtime that is attached to Anypoint Studio, follow the below steps:
Navigate to the Anypoint Studio installation directory
Search for "tls-default.conf" in the folder. This will show you the files for all the runtimes that you have installed.
There will be a property "enabledProtocols"; make sure that it contains TLSv1, as below:
enabledProtocols=TLSv1,TLSv1.1,TLSv1.2
The above should apply to CloudHub (most of the time it is already enabled) or on-premise systems.
Salesforce is now disabling TLS 1.0, forcing TLS 1.1 or higher.
For Java versions >= 1.8 this is not a problem, but for earlier releases you will want to set the SSLContext. This solution worked for me:
import java.security.KeyManagementException;
import javax.net.ssl.SSLContext;

private static final String SSL_VERSION_TO_USE_FOR_SALESFORCE_LOGIN = "TLSv1.2";

// Java versions > 1.7 are compatible with TLS 1.1 or higher by default - we want TLSv1.2 for our needs.
// Note: SSLContext.getInstance throws the checked NoSuchAlgorithmException, so handle or declare it at the call site.
if (Double.parseDouble(Runtime.class.getPackage().getSpecificationVersion()) <= 1.7) {
    setSSLContext(SSLContext.getInstance(SSL_VERSION_TO_USE_FOR_SALESFORCE_LOGIN));
}

private static void setSSLContext(SSLContext context) {
    SSLContext.setDefault(context);
    try {
        /* Either of the first two parameters may be null, in which case the installed security
           providers will be searched for the highest priority implementation of the appropriate
           factory. Likewise, the SecureRandom parameter may be null, in which case the default
           implementation will be used. */
        context.init(null, null, null);
    } catch (KeyManagementException e) {
        // handle exception
    }
}
Alternatively, you can deactivate the TLS enforcement on the Salesforce side:
Navigate to Setup
In the Quick Find bar, type in Critical Updates
Select Critical Updates
Locate Require TLS 1.1 or higher for HTTPS connections under the Update Name column
Click on Deactivate.

ADFS 2.0 Not handling 'Extension' tag in SAML AuthnRequest - Throwing Exception MSIS7015

We currently have ADFS 2.0 with hotfix 2 rollup installed and working properly as an identity provider for several external relying parties using SAML authentication. This week we attempted to add a new relying party, however, when a client presents the authentication request from the new party, ADFS simply returns an error page with a reference number and does not prompt the client for credentials.
I checked the server's ADFS 2.0 event log for the reference number, but it is not present (searching the Correlation ID column). I enabled the ADFS trace log, re-executed the authentication attempt, and this message was presented:
Failed to process the Web request because the request is not valid. Cannot get protocol message from HTTP query. The following errors occurred when trying to parse incoming HTTP request:
Microsoft.IdentityServer.Protocols.Saml.HttpSamlMessageException: MSIS7015: This request does not contain the expected protocol message or incorrect protocol parameters were found according to the HTTP SAML protocol bindings.
at Microsoft.IdentityServer.Web.HttpSamlMessageFactory.CreateMessage(HttpContext httpContext)
at Microsoft.IdentityServer.Web.FederationPassiveContext.EnsureCurrent(HttpContext context)
As the message indicates that the request is not well formed, I went ahead and ran the request through xmlsectool and validated it against the SAML protocol XSD (http://docs.oasis-open.org/security/saml/v2.0/saml-schema-protocol-2.0.xsd), and it came back clean:
C:\Users\ebennett\Desktop\xmlsectool-1.2.0>xmlsectool.bat --validateSchema --inFile metaauth_kld_request.xml --schemaDirectory . --verbose
INFO XmlSecTool - Reading XML document from file 'metaauth_kld_request.xml'
DEBUG XmlSecTool - Building DOM parser
DEBUG XmlSecTool - Parsing XML input stream
INFO XmlSecTool - XML document parsed and is well-formed.
DEBUG XmlSecTool - Building W3 XML Schema from file/directory 'C:\Users\ebennett\Desktop\xmlsectool-1.2.0\.'
DEBUG XmlSecTool - Schema validating XML document
INFO XmlSecTool - XML document is schema valid
So I'm thinking that ADFS isn't fully compliant with the SAML specification. To verify, I manually examined the submitted AuthnRequest and discovered that our vendor is making use of the 'Extensions' element to embed their custom properties, which is valid according to the SAML specification. (Note: "ns33" below is correctly bound to "urn:oasis:names:tc:SAML:2.0:protocol" elsewhere in the request.)
<ns33:Extensions>
<vendor_ns:fedId xmlns:vendor_ns="urn:vendor.name.here" name="fedId" value="http://idmfederation.vendorname.org"/>
</ns33:Extensions>
If I remove the previous element from the AuthnRequest and resubmit it to ADFS, everything goes swimmingly. In fact, I can leave the 'Extensions' container and simply edit out the vendor-namespaced element, and ADFS succeeds.
Now, I guess I have 3 questions:
Why was the reference number not logged to the ADFS log? That really would have helped my early debugging efforts.
Is it a known issue that ADFS's SAML handler cannot handle custom elements defined within the Extensions element, and if so, is there a way to add support (or at least not crash while handling it)? My vendor has offered to change the SAML AuthnRequest generated to omit that tag, but said that it 'may take some time'-- and we all know what that means...
Does anyone think that installing ADFS hotfix rollup 3 will address this situation? I didn't see anything in the doc to indicate the affirmative.
Thanks for your feedback.
When facing an MSIS7015 ADFS error, the best place to start is enabling ADFS tracing. Log in to the ADFS server as admin and run the following commands. If you have a very busy ADFS server, it might be wise to do this when the server is not as busy.
C:\Windows\System32\> wevtutil sl "AD FS Tracing/Debug" /L:5
C:\Windows\System32\> eventvwr.msc
In Event Viewer select "Application and Services Logs", right-click and select "View - Show Analytics and Debug Logs".
Go to AD FS Tracing - Debug, right-click and select "Enable Log" to start trace debugging.
Process your ADFS login/logout steps and, when finished, go back to the Event Viewer MMC, find the subtree AD FS Tracing - Debug, right-click and select "Disable Log" to stop trace debugging.
Look for Event ID 49 (the incoming AuthnRequest) and verify the values being sent. For example, in my case I was receiving the following values: IsPassive='False', ForceAuthn='False'.
In my case, to address the issue, all I needed to do was create an incoming claim transformer rule for the distinct endpoints.
Once the CAPs were transformed to lowercase true and false, authentication started working.

Dynamic Drools Endpoint Update in Drools Camel Server

The "User Guide" for Drools 6 states that a camel endpoint for drools shall be in the below format:
<to uri="kie:{1}/{2}" />
where
{1}: Execution Node identifier that is registered in the CamelContext
{2}: Knowledge Session identifier that was registered in the Execution Node with identifier {1}
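For instance, with an execution node registered as "node1" and a session registered under it as "ksession1" (both placeholder names), the endpoint would read:
<to uri="kie:node1/ksession1" />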
Doubt #1:
If the sessions are created before the endpoints are built, how will incremental changes in the kmodule be picked up by the sessions that were created earlier?
Statement about the KieScanner from the document:
"If the KieScanner finds, in the Maven repository, an updated version of the Kie project used by that KieContainer it automatically downloads the new version and triggers an incremental build of the new project. From this moment all the new KieBases and KieSessions created from that KieContainer will use the new project version."
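To illustrate the quoted behavior, here is a minimal sketch of wiring a KieScanner to a KieContainer using the Drools 6 KIE API; the Maven coordinates and session name are placeholders:
import org.kie.api.KieServices;
import org.kie.api.builder.KieScanner;
import org.kie.api.builder.ReleaseId;
import org.kie.api.runtime.KieContainer;
import org.kie.api.runtime.KieSession;

public class ScannerSetup {
    public static void main(String[] args) {
        KieServices ks = KieServices.Factory.get();
        // Placeholder GAV for the kjar that holds the kmodule.
        ReleaseId releaseId = ks.newReleaseId("com.example", "my-kjar", "LATEST");
        KieContainer kContainer = ks.newKieContainer(releaseId);

        // Poll the Maven repository every 10 seconds for a newer version.
        KieScanner scanner = ks.newKieScanner(kContainer);
        scanner.start(10000L);

        // Per the quoted statement, only KieBases/KieSessions created from the
        // container *after* an update pick up the new project version.
        KieSession kSession = kContainer.newKieSession("ksession1");
    }
}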
Doubt #2:
I am trying to configure this endpoint to route to ksessions dynamically.
Traversing the source code, I tried declaring the URI as kie:dynamic and adding new sessions to KieEndpoint.executorsByName, like:
KieEndpoint endPoint = (KieEndpoint)camel.getEndpoint("kie:dynamic");
endPoint.executorsByName.put(sessionName, kSession);
Is this the right way of adding dynamic sessions? I didn't find any hint in the user guide for this.
