Camel Kafka Connector - Azure Blob Storage Source Connectors Using SAS Token - apache-camel

We are implementing the Camel Kafka Connector - Azure Storage Blob source to download files from an Azure Blob Storage container.
We found that for authentication the connector requires an AccessKey, but an AccessKey has full access to the storage account, and we don't want to go with it because it carries no restrictions.
https://camel.apache.org/camel-kafka-connector/next/reference/connectors/camel-azure-storage-blob-source-kafka-source-connector.html
We tried to use a SAS Token for authentication instead and got an error in the HMAC encryption. Is it possible to use a SAS Token instead of an AccessKey?
org.quartz.JobExecutionException: java.lang.IllegalArgumentException: Input byte array has wrong 4-byte ending unit
at org.apache.camel.component.quartz.CamelJob.execute(CamelJob.java:80) ~[camel-quartz-3.18.2.jar:3.18.2]
at org.quartz.core.JobRunShell.run(JobRunShell.java:202) ~[quartz-2.3.2.jar:na]
at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:573) ~[quartz-2.3.2.jar:na]
Caused by: java.lang.IllegalArgumentException: Input byte array has wrong 4-byte ending unit
at java.base/java.util.Base64$Decoder.decode0(Base64.java:837) ~[na:na]
at java.base/java.util.Base64$Decoder.decode(Base64.java:566) ~[na:na]
at java.base/java.util.Base64$Decoder.decode(Base64.java:589) ~[na:na]
at com.azure.storage.common.implementation.StorageImplUtils.computeHMac256(StorageImplUtils.java:252) ~[azure-storage-common-12.15.2.jar:12.15.2]
at com.azure.storage.common.StorageSharedKeyCredential.generateAuthorizationHeader(StorageSharedKeyCredential.java:149) ~[azure-storage-common-12.15.2.jar:12.15.2]
at com.azure.storage.common.policy.StorageSharedKeyCredentialPolicy.process(StorageSharedKeyCredentialPolicy.java:38) ~[azure-storage-common-12.15.2.jar:12.15.2]
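The decode failure above happens because an account AccessKey is a Base64 string, while a SAS token is a URL query string (`sv=...&sig=...`). If a SAS token is fed through the accessKey option, the shared-key signer fails the moment it tries to Base64-decode it in computeHMac256. A minimal stdlib sketch of the mismatch (the key and token values below are made up for illustration):

```java
import java.util.Base64;

public class SasVsAccessKey {
    // True if the value survives a strict Base64 decode, as the Azure SDK's
    // shared-key signer requires of an account AccessKey.
    static boolean decodesAsBase64(String value) {
        try {
            Base64.getDecoder().decode(value);
            return true;
        } catch (IllegalArgumentException e) {
            return false;
        }
    }
}
```

So the error is not a broken SAS token but the token going down the shared-key code path. Whether the connector can take a SAS token directly depends on the connector/component version; if your version only exposes accessKey-based authentication, a SAS token will always hit this decode error.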

Related

Snowflake JDBC throws null pointer exception when downloading file from server-side encrypted internal stage

When I try to use Snowflake JDBC (3.13.16) to download a file via the downloadStream() method from an internal stage whose encryption option is set to "snowflake_sse", I get a NullPointerException. I found that if I change the stage encryption to "snowflake_full", which enables client-side encryption, the exception disappears and the file is downloaded successfully.
However, if encryption is set to "snowflake_full", I can't use a presigned URL to download the files because the downloaded content is not decrypted.
So my question is: is there a way to use downloadStream() from Snowflake JDBC with a server-side encrypted stage? Or is there a way to use a presigned URL to download a decrypted file when the stage is client-side encrypted?

What is the best way to send a file reference through ActiveMQ, when using SpringBoot 1.3.2.RELEASE?

I was using Spring Boot 1.3.1.RELEASE and had no problems sending file references through a message in ActiveMQ.
Since updating to Spring Boot 1.3.2.RELEASE I have been experiencing a problem.
The ActiveMQ client refuses to read the file reference, with the following error:
Caused by: java.lang.ClassNotFoundException: Forbidden class java.io.File! This class is not trusted to be serialized as ObjectMessage payload. Please take a look at http://activemq.apache.org/objectmessage.html for more information on how to configure trusted classes.
at org.apache.activemq.util.ClassLoadingAwareObjectInputStream.checkSecurity(ClassLoadingAwareObjectInputStream.java:112) ~[activemq-client-5.12.2.jar:5.12.2]
at org.apache.activemq.util.ClassLoadingAwareObjectInputStream.resolveClass(ClassLoadingAwareObjectInputStream.java:57) ~[activemq-client-5.12.2.jar:5.12.2]
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1613) ~[na:1.8.0_05]
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1518) ~[na:1.8.0_05]
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1774) ~[na:1.8.0_05]
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351) ~[na:1.8.0_05]
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:371) ~[na:1.8.0_05]
at org.apache.activemq.command.ActiveMQObjectMessage.getObject(ActiveMQObjectMessage.java:206) ~[activemq-client-5.12.2.jar:5.12.2]
... 16 common frames omitted
The only solution I found was to add the first two lines in the creation of the JmsTransactionManager bean:
@Bean
public JmsTransactionManager jmsTransactionManager(ConnectionFactory cf) {
    ActiveMQConnectionFactory amqCf = (ActiveMQConnectionFactory) cf;
    // Without this we are not able to send File objects through a JMS message.
    // This problem started when we migrated to Spring Boot 1.3.2.RELEASE.
    amqCf.setTrustAllPackages(true);
    JmsTransactionManager result = new JmsTransactionManager();
    result.setConnectionFactory(cf);
    return result;
}
Is there a more recommended way to solve this problem?
Java serialization has many security vulnerabilities that expose you to risk. The updated client now rejects most classes by default and requires you to create a whitelist of the packages you want to handle.
Documentation for this is on the ActiveMQ website.
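Rather than `setTrustAllPackages(true)`, the safer route is an explicit whitelist via `ActiveMQConnectionFactory.setTrustedPackages(...)`. The check ActiveMQ applies is essentially a package-prefix filter during deserialization; here is a plain-JDK sketch of the same idea (the class and method names below are mine, not ActiveMQ's):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.ObjectStreamClass;
import java.io.Serializable;
import java.util.List;

public class TrustedDeserializer {
    private final List<String> trustedPackages;

    public TrustedDeserializer(List<String> trustedPackages) {
        this.trustedPackages = trustedPackages;
    }

    // Deserialize, refusing any class outside the trusted packages,
    // much like ActiveMQ's ClassLoadingAwareObjectInputStream does.
    public Object read(byte[] bytes) {
        try (ObjectInputStream in = new ObjectInputStream(new ByteArrayInputStream(bytes)) {
            @Override
            protected Class<?> resolveClass(ObjectStreamClass desc)
                    throws IOException, ClassNotFoundException {
                String name = desc.getName();
                if (trustedPackages.stream().noneMatch(name::startsWith)) {
                    throw new SecurityException("Forbidden class " + name
                            + "! Not on the trusted-package list.");
                }
                return super.resolveClass(desc);
            }
        }) {
            return in.readObject();
        } catch (IOException | ClassNotFoundException e) {
            throw new IllegalStateException(e);
        }
    }

    public static byte[] serialize(Serializable obj) {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream out = new ObjectOutputStream(bos)) {
            out.writeObject(obj);
        } catch (IOException e) {
            throw new IllegalStateException(e);
        }
        return bos.toByteArray();
    }
}
```

With `trustedPackages = ["java.io"]` a serialized `java.io.File` deserializes fine; with any other list the reader refuses it, which is the behavior behind the "Forbidden class java.io.File!" error above.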

I am using Http Form Adapter in Ping Federate. How to get user attributes from SAML Response?

Http Form adapter serves as an authentication service in my application. I have not implemented any application on the Identity Provider to get user inputs.
Therefore, on successful authentication, the SP verifies the user's signature and redirects to the application. At my target resource, I receive an OpenToken. Is it still possible to utilize the OpenToken jar to read the user attributes from the OTK?
**Note:** On the Service Provider, I use the OpenToken Adapter.
Also, please let me know if there is any other possible way of getting the user attributes other than using the open token adapter/http form adapter.
Thanks.
There are numerous SP Adapters you can choose to use for your last-mile integration with your application. The OpenToken Adapter is just one of them. If your application is in Java and you are using the SP OpenToken Adapter, then you would most likely use the Java OpenToken Agent implementation within your application to read the OTK (documented in the Java Integration Kit). If you look at the Add Ons list, there are actually 3 flavors of OTK Agents (.NET, Java and PHP from Ping Identity; Ruby on Rails and Perl are available via respective open source repositories).
However, you are not limited to OpenToken Adapters. The Agentless Integration Kit is also very popular for SP/last-mile integration with PingFederate.
Unfortunately, the question is just too open ended for the Stackoverflow format. I would suggest talking to your Ping Identity Solution Architect who can help steer you in the right direction and ask the necessary follow-up questions on your use case.
If I understand the question correctly, you want attributes to be fulfilled that the web application can read and utilize. This starts with the SP Connection configuration. I am going to assume you are using Active Directory and have already configured that data source along with the Password Credential Validator (PCV) for the HTML Form IdP Adapter. In the SP Connection you will need to extend the attribute contract to define the values to put into the SAML assertion and then use the Active Directory data source to fulfill the attributes. When the SAML assertion is received by the PingFederate SP role server, the SP Adapter maps the attribute values from the SAML assertion into the OpenToken. When your application receives the OpenToken, it can read the values.

How to connect a GAE app and a GCE app to the same datastore locally?

I am running into an issue similar to this one. I have a GAE app and a GCE app that seem to work fine once in the cloud but I am having trouble getting my local environment working so that both of them access the same datastore.
I have set up the local datastore as described in the link above, except my code looks like this (I had to build it this way to get it working in the cloud):
print("Connecting to datastore - " + datasetId);
// Get credentials from GCE if not local.
Credential computeEngineCredential = DatastoreHelper.getComputeEngineCredential();
if (computeEngineCredential != null) {
    print("Compute Engine credentials are not null! Access token: "
            + computeEngineCredential.getAccessToken());
}
DatastoreOptions options = DatastoreHelper.getOptionsFromEnv()
        .credential(computeEngineCredential)
        .dataset(datasetId)
        .build();
print("Connection Host: " + options.getHost());
print("Connection Dataset: " + options.getDataset());
datastore = DatastoreFactory.get().create(options);
When I run the GCE app locally and try to connect to the running GAE datastore I get the following (I have replaced the actual data set id with "myDatasetId" in the output below):
Connecting to datastore - "myDatasetId"
Connection Host: http://localhost:8888
Connection Dataset: "myDatasetId"
com.google.api.services.datastore.client.DatastoreFactory makeClient
WARNING: Not using any credentials
There was a problem running query: Error:
runQuery Error 404 Error
404
com.google.api.services.datastore.client.DatastoreException: Error 404 Error
404
at com.google.api.services.datastore.client.RemoteRpc.makeException(RemoteRpc.java:114)
at com.google.api.services.datastore.client.RemoteRpc.call(RemoteRpc.java:80)
at com.google.api.services.datastore.client.Datastore.runQuery(Datastore.java:109)
My GAE console prints this out (I can access the admin console in the 8888 port just fine):
com.google.appengine.tools.development.AbstractModule startup
INFO: The admin console is running at http://localhost:8888/_ah/admin
com.google.appengine.tools.development.LocalResourceFileServlet doGet
WARNING: No file found for:
/datastore/v1beta2/datasets/"myDatasetId"/runQuery
I have verified that the dataset ID in the GCE app and the GAE app match. I have been able to successfully run each app locally on its own, and both are able to connect to the local datastore (using the gcd.cmd tool for the GCE app). Based on the answer in the link above, it sounds like this should be possible; am I doing something wrong?
Update
Not sure if this is related but I am getting the following error when starting up the GCD Tool:
SEVERE: Unable to load the App Engine dev agent. Security restrictions will not be completely emulated.
java.lang.RuntimeException: Unexpected exception during cast.
at com.google.apphosting.utils.clearcast.ClearCast$CasterImpl.cast(ClearCast.java:385)
at com.google.apphosting.utils.clearcast.ClearCast.staticCast(ClearCast.java:252)
at com.google.apphosting.utils.clearcast.ClearCast.staticCast(ClearCast.java:263)
at com.google.appengine.tools.development.agent.AppEngineDevAgent.premain(AppEngineDevAgent.java:61)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at sun.instrument.InstrumentationImpl.loadClassAndStartAgent(Unknown Source)
at sun.instrument.InstrumentationImpl.loadClassAndCallPremain(Unknown Source)
Caused by: java.lang.IllegalAccessException: Class com.google.apphosting.utils.clearcast.ClearCast$CasterImpl can not access a member of class com.google.appengine.tools.development.agent.$Proxy0 with modifiers "public"
at sun.reflect.Reflection.ensureMemberAccess(Unknown Source)
at java.lang.reflect.AccessibleObject.slowCheckMemberAccess(Unknown Source)
at java.lang.reflect.AccessibleObject.checkAccess(Unknown Source)
at java.lang.reflect.Constructor.newInstance(Unknown Source)
at com.google.apphosting.utils.clearcast.ClearCast$CasterImpl.cast(ClearCast.java:383)
... 9 more
If this initialized properly, could I somehow connect my GAE App to the GCD Tool datastore? So confused.
There's no officially supported way to share Datastore data between the Java Development Server (dev_appserver.sh) and the local Cloud Datastore tool (gcd.sh).
However, if your app is written in Java, you may be able to use the workaround described here and point dev_appserver.sh to the data file generated by gcd.sh by specifying the -Ddatastore.backing_store=<project dir>/WEB-INF/appengine-generated/local_db.bin option.
Thanks for replying, this was very useful. I was able to point my local GAE to the local_db.bin that the gcd tool uses through Eclipse by providing "-Ddatastore.backing_store" as a VM argument, as you suggested.
However, they still seem to have different views of the datastore. The admin viewer for the GAE app running on the default 8888 port only shows the data added by the GAE app; the gcd one running on the gcd tool port (8080) shows the data added by the GCE app.
I assumed this was just a visibility issue on the admin site, so I tried to access the GAE data through my GCE app, but the query is unsuccessful: it doesn't find the entity kind and thus returns no results. I was able to verify that, once deployed, the GCE app can access the data the GAE app wrote to the datastore with the same query, so I am assuming it is not a namespace issue but rather an issue of where the data is held. Even though they are both pointing to the same local_db.bin file, it seems the data is still split somewhere.
Am I supposed to run dev_appserver.cmd directly from the command line maybe? If so, how do I do this for an EAR project (currently on Eclipse)?

How do I obscure a password in a Camel configuration file

I am looking at using the Camel crypto tool for processing PGP data but have a requirement that the password to the keys used be either encrypted in the configuration file or be sourced from a secure server elsewhere. Is this possible without generating my own PGP processor?
Yes, see the security menu on the Apache Camel website: http://camel.apache.org/security.html
There is a section about configuration security, where you can use camel-jasypt for that: http://camel.apache.org/jasypt.html
This allows you to store encrypted usernames / passwords etc. in a .properties file, and then refer to these properties from Camel crypto using Camel's property placeholder: http://camel.apache.org/using-propertyplaceholder.html
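Under the hood the camel-jasypt model is simple: a property value is stored as `ENC(<Base64 ciphertext>)` and decrypted with a password-based cipher (jasypt's default algorithm is PBEWithMD5AndDES) when the placeholder is resolved. A self-contained JDK sketch of that scheme, to make the mechanism concrete (it is not wire-compatible with jasypt, and the class and method names are mine):

```java
import java.nio.charset.StandardCharsets;
import java.security.GeneralSecurityException;
import java.security.SecureRandom;
import java.util.Arrays;
import java.util.Base64;
import javax.crypto.Cipher;
import javax.crypto.SecretKey;
import javax.crypto.SecretKeyFactory;
import javax.crypto.spec.PBEKeySpec;
import javax.crypto.spec.PBEParameterSpec;

public class EncPlaceholder {
    // jasypt's default algorithm; any JDK PBE cipher would do for this sketch.
    private static final String ALGO = "PBEWithMD5AndDES";
    private static final int ITERATIONS = 1000;

    private static SecretKey key(char[] password) throws GeneralSecurityException {
        return SecretKeyFactory.getInstance(ALGO).generateSecret(new PBEKeySpec(password));
    }

    // Encrypt a value and wrap it as ENC(...), prefixing the random salt
    // to the ciphertext so decryption can recover it.
    public static String encrypt(String plain, char[] password) {
        try {
            byte[] salt = new byte[8];
            new SecureRandom().nextBytes(salt);
            Cipher c = Cipher.getInstance(ALGO);
            c.init(Cipher.ENCRYPT_MODE, key(password), new PBEParameterSpec(salt, ITERATIONS));
            byte[] enc = c.doFinal(plain.getBytes(StandardCharsets.UTF_8));
            byte[] out = new byte[salt.length + enc.length];
            System.arraycopy(salt, 0, out, 0, salt.length);
            System.arraycopy(enc, 0, out, salt.length, enc.length);
            return "ENC(" + Base64.getEncoder().encodeToString(out) + ")";
        } catch (GeneralSecurityException e) {
            throw new IllegalStateException(e);
        }
    }

    // Resolve a property value: decrypt ENC(...) values, pass others through.
    public static String resolve(String value, char[] password) {
        if (!value.startsWith("ENC(") || !value.endsWith(")")) {
            return value;
        }
        try {
            byte[] raw = Base64.getDecoder().decode(value.substring(4, value.length() - 1));
            Cipher c = Cipher.getInstance(ALGO);
            c.init(Cipher.DECRYPT_MODE, key(password),
                    new PBEParameterSpec(Arrays.copyOfRange(raw, 0, 8), ITERATIONS));
            return new String(c.doFinal(Arrays.copyOfRange(raw, 8, raw.length)),
                    StandardCharsets.UTF_8);
        } catch (GeneralSecurityException e) {
            throw new IllegalStateException(e);
        }
    }
}
```

In real camel-jasypt you would put `key.password=ENC(...)` in the .properties file, configure the JasyptPropertiesParser with the master password (ideally supplied via an environment variable rather than hard-coded), and reference `{{key.password}}` from the crypto endpoint.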
