Logic Apps encrypt/decrypt connectors - azure-logic-apps

I need to PGP-encrypt a file before I move it to a specified location in Logic Apps. I can see a file system connector that I can use to copy files, etc., but I am not able to find a connector to PGP encrypt/decrypt. My guess is that such a connector does not exist; if so, what is the best way to create a custom connector for this?

There's an Azure Function for that. You'd need to deploy this function, configure the keys (probably in Azure Key Vault), and then call the function from your logic app:
https://github.com/lfalck/AzureFunctionsPGPEncrypt
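To give a feel for the call, here is a hedged Python sketch of invoking an HTTP-triggered encrypt function like the one in that repo; the function URL, route, and key handling are assumptions to verify against the repo's README:

```python
# Hypothetical sketch: call an HTTP-triggered PGP-encrypt Azure Function
# before moving the file. URL, route, and "code" key are assumptions.
import requests

FUNCTION_URL = "https://<your-function-app>.azurewebsites.net/api/PGPEncrypt"
FUNCTION_KEY = "<function-key>"  # or fetch it from Azure Key Vault

with open("report.csv", "rb") as f:
    plaintext = f.read()

resp = requests.post(
    FUNCTION_URL,
    params={"code": FUNCTION_KEY},  # function-level auth key
    data=plaintext,                 # raw file bytes in the request body
)
resp.raise_for_status()

with open("report.csv.pgp", "wb") as f:
    f.write(resp.content)  # the function returns the PGP-encrypted bytes
```

In a Logic App you would express the same call with an HTTP action (or the Azure Functions connector) between the read and copy steps.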

Related

How to use Google Cloud KMS key as SigningCredential in IdentityServer4?

I have an IdentityServer4 app running on App Engine. As I prepare for production, I have a lot of confusion about using .AddSigningCredential() in this scenario. On App Engine, I don't have a file system or certificate store in the traditional sense, and it seems like I can only pass in a path to a cert/key or the contents itself.
As I'm hosting on Google Cloud, I would like to use KMS. The KMS API gives me the ability to sign something, but IdentityServer doesn't give me a way to use that, and I don't see a way for KMS to give IdentityServer what it wants.
The only workaround I came up with is to generate a key pair, save it as a secret, and pass that in to .AddSigningCredential(). The downside is that now I have to manage this key/secret manually. There must be another way.
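IdentityServer4 itself is C#, but to illustrate the "save it as a secret" half of that workaround, here is a hedged Python sketch that fetches a PEM-encoded key from Google Secret Manager (the project and secret names are placeholders):

```python
# Sketch of the "store the key pair as a secret" workaround described above.
# The retrieval step only; feeding the PEM into .AddSigningCredential() would
# happen in the C# app. Project/secret names here are hypothetical.
from google.cloud import secretmanager

def load_signing_key_pem(project_id: str, secret_id: str) -> bytes:
    client = secretmanager.SecretManagerServiceClient()
    name = f"projects/{project_id}/secrets/{secret_id}/versions/latest"
    response = client.access_secret_version(request={"name": name})
    return response.payload.data  # PEM bytes for the signing credential

pem = load_signing_key_pem("my-gcp-project", "identityserver-signing-key")
```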

Is it possible to upload files to Snowflake via REST protocol directly?

Does anybody know whether it is possible to upload files to Snowflake using a REST API endpoint directly, not using third-party drivers like https://docs.snowflake.com/en/user-guide/dotnet-driver.html
I didn't find such information in their general API docs: https://docs.snowflake.com/en/user-guide/data-load-snowpipe-rest-apis.html But I assume that maybe this information is not publicly available. Does anybody know?
The API you're referencing is the Snowpipe REST API. This API is supported and publicly documented, but I don't think it's what you want.
The Snowpipe REST API does not upload files. Instead, you can invoke it to inform Snowpipe that there are new files in an external stage ready for copying into a table. Something else needs to get the files uploaded to the external stage in S3, Azure Blob, or GCP.
As far as a general-purpose REST API goes, it's supported only for Snowflake and partner developers' use and is not publicly documented. The best method is to use one of the drivers or connectors (ODBC, JDBC, the .NET driver, etc.) to upload files. If that doesn't work for you, you can put the files into an external stage using whatever supported method you like for that cloud host. You can then use the Snowpipe REST API to initiate the copy into the table, or just use SQL and a warehouse to do the copy into the table.
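To make that flow concrete, here is a hedged Python sketch: stage the file yourself (S3 in this example), then call the Snowpipe insertFiles endpoint. The account, pipe, and bucket names are placeholders, and generating the key-pair JWT is out of scope:

```python
# Sketch of the flow described above: upload to an external stage yourself,
# then tell Snowpipe the file is there. Names below are made up.
import uuid
import boto3
import requests

# 1. Put the file into the external stage location (S3 here).
boto3.client("s3").upload_file("data.csv", "my-stage-bucket", "landing/data.csv")

# 2. Notify Snowpipe via the insertFiles endpoint (key-pair JWT auth).
jwt_token = "<key-pair-auth JWT>"  # see Snowflake's key-pair authentication docs
account = "myorg-myaccount"
pipe = "MYDB.MYSCHEMA.MY_PIPE"
url = (
    f"https://{account}.snowflakecomputing.com"
    f"/v1/data/pipes/{pipe}/insertFiles?requestId={uuid.uuid4()}"
)
resp = requests.post(
    url,
    json={"files": [{"path": "landing/data.csv"}]},
    headers={"Authorization": f"Bearer {jwt_token}"},
)
resp.raise_for_status()
```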

Jenkins external credentials storage on CyberArk

How can I use an external CyberArk vault to store credentials in the free version of Jenkins?
Here you can find info regarding the standard Jenkins credentials plugin, which provides an API for external storage.
But after digging a while on the net, I’ve found that:
1. The CyberArk vault integration is available on CloudBees Jenkins only.
2. The HashiCorp Vault plugin is available for free.
3. Here is a nice example of how a wrong permissions policy can lead to exposing all credentials. I tried it; it works like magic! :)
You really don't want to store credentials (or any sensitive secret, really) in Jenkins. It's not a vault and should never be used as one. Otherwise you'll end up with your Jenkins servers becoming a major target for attackers.
Instead, integrate your Jenkins pipelines to pull secrets securely into executors only when needed, and discard them when the build/test job is done. This is easily done with something like Summon, which is already integrated with many vaults, including Conjur (which is also a CyberArk product). Both are open-source offerings.
This blog post describes an approach to integrating Jenkins with a vault along the lines of what I've described above.
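Summon itself is a CLI that injects secrets as environment variables into a subprocess, so the job-side code stays trivial. A minimal sketch, with a hypothetical secrets.yml variable name and invocation:

```python
# Summon injects secrets as environment variables into a child process, e.g.:
#
#   summon -p summon-conjur -f secrets.yml python deploy.py
#
# so the build script just reads the injected variable (name is hypothetical).
import os

db_password = os.environ["DB_PASSWORD"]  # provided by Summon for this process only
# ... use the secret; it disappears when the process exits ...
```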
It appears that CyberArk has released a Jenkins plugin which supports that scenario:
https://docs.cyberark.com/Product-Doc/OnlineHelp/AAM-DAP/Latest/en/Content/Integrations/jenkins.htm
You might want to look at https://github.com/tmobile/t-vault.
This will eliminate the need to manage policies. You could create one safe per project folder or job in Jenkins.
You can create AppRoles and grant the AppRole access to a safe. Each project can use its corresponding AppRole. You can grant individual users access to the safe as well; users can then use the web UI to author and update the secrets.
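t-vault is backed by HashiCorp Vault, so a job could authenticate with its AppRole using the hvac client. A minimal sketch, where the Vault URL, role/secret IDs, and secret path are all placeholders:

```python
# Sketch of the per-project AppRole flow described above. The role and secret
# IDs would be injected into the Jenkins job; the secret path is hypothetical.
import hvac

client = hvac.Client(url="https://vault.example.com")
client.auth.approle.login(
    role_id="<project-role-id>",
    secret_id="<project-secret-id>",
)

# Read a secret from the project's safe (KV v2 engine assumed).
secret = client.secrets.kv.v2.read_secret_version(path="myproject/deploy-creds")
password = secret["data"]["data"]["password"]
```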

FTP to Google Cloud

I need to write an FTP script to move files from a file location to Google Cloud.
I could do this manually using CrossFTP, which has a component to connect to Google Cloud using an access key and secret key, but when I use an FTP script I am not able to connect.
If I want to use the traditional FTP way for Google Cloud, what should the server name, user ID, and password be? I tried using commondatastorage.googleapis.com, but it does not work with the access key and secret key.
Any information will be helpful.
All I need is to automate this FTP process.
If you're automating using a shell script, you can use gsutil. This allows you to upload/download files, modify ACLs, and so on.
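For example, a script (Python here) can simply shell out to gsutil; the bucket and paths below are placeholders:

```python
# Minimal automation sketch: shell out to gsutil, as suggested above.
# Source path and bucket name are placeholders.
import subprocess

subprocess.run(
    ["gsutil", "cp", "/data/outgoing/report.csv", "gs://my-bucket/incoming/"],
    check=True,  # raise if the copy fails
)
```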

How to manage asymmetric keys without checking them into source control?

I have a Google App Engine application which needs to be given a public/private key pair. I don't want to check this into source control because it would be accessible to too many people. Since this is GAE, I can't use the build system to write the keys to the file system of the server.
Is there a known best practice for this?
My first thought was: does Jenkins provide a way to manage keys securely? I know I can just copy the keys to a location on the Jenkins server and copy them into the build, but this project will be used by third-party teams, so I need to provide a UI-based solution in Jenkins. I did not find any relevant plugin, but I would like to make sure there isn't a better way before writing my own.
There are of course several approaches to this. I believe certificates are a concern of admins, not developers.
What we do is have custom admin pages where we upload certificates to the blobstore under a separate namespace. Then we have an internal "service" (just a simple factory) so that other pieces of code can retrieve the certs; a rough sketch follows.
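A minimal sketch of that factory on classic GAE Python, assuming the uploaded cert's blob key is recorded in a small datastore entity (the model, field, and namespace names here are invented):

```python
# Rough sketch of the "simple factory" idea described above, on classic GAE
# Python. Assumes each uploaded cert's blob key is stored in a datastore
# entity keyed by cert name, under a dedicated namespace.
from google.appengine.api import namespace_manager
from google.appengine.ext import blobstore, ndb

CERT_NAMESPACE = "certs"  # hypothetical namespace for uploaded certificates

class CertRef(ndb.Model):
    blob_key = ndb.BlobKeyProperty()

def get_cert_bytes(cert_name):
    """Factory other code calls to fetch certificate bytes by name."""
    previous = namespace_manager.get_namespace()
    try:
        namespace_manager.set_namespace(CERT_NAMESPACE)
        ref = CertRef.get_by_id(cert_name)
        reader = blobstore.BlobReader(ref.blob_key)
        return reader.read()
    finally:
        namespace_manager.set_namespace(previous)
```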
If you are happy to use a cloud-based Jenkins, we (CloudBees) have an OAuth-based solution at appengine.cloudbees.com
You could roll your own. It is not excessively tricky. You will need to:
1. Register with Google's API console to get a client key and secret, and define the endpoints that your app will show up as.
2. Write some way of feeding those credentials to your Jenkins build. I would recommend using the credentials plugin.
3. Either write a build wrapper that exposes the refresh token to your build (Python SDK deployment) or one that exposes the access token (Java SDK... got to love that the two SDKs do the same thing in different ways). A sketch of the refresh-token exchange follows below.
Or use our free service ;-)
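For step 3, the refresh-token exchange is a plain OAuth2 token request. A minimal Python sketch, with placeholder credentials (the real values would come from the credentials plugin):

```python
# Exchange a stored refresh token for a short-lived access token at deploy
# time. Client ID/secret and refresh token values are placeholders.
import requests

resp = requests.post(
    "https://oauth2.googleapis.com/token",
    data={
        "client_id": "<client-id-from-api-console>",
        "client_secret": "<client-secret>",
        "refresh_token": "<stored-refresh-token>",
        "grant_type": "refresh_token",
    },
)
resp.raise_for_status()
access_token = resp.json()["access_token"]  # hand this to the SDK deploy step
```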
