Dynamic datasource.json in loopback - angularjs

I am using LoopBack with Node.js.
In my datasources.json file I have the following connection settings:
"mongoConnector": {
"host": "127.0.0.1",
"port": 27017,
"url": "",
"database": "DB",
"password": "",
"name": "mongoConnector",
"user": "",
"connector": "mongodb"
}
I have a global config JSON file which contains DB connection settings such as password, host, etc.
Is there any way to modify mongoConnector dynamically according to the global config file?

You'll have to use a .js configuration file for this. You can create a server/datasources.local.js (it always takes precedence over the other config files), or you can use the NODE_ENV environment variable to select a different suffix (i.e. datasources.{NODE_ENV}.js). In that file, just export an object that contains your configuration. You can use process.env.FOO to read environment variables, or you can require() your global file inside the .js config file and pull the values from there. It's up to you; a sketch is below.
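For illustration, here is a minimal sketch of a server/datasources.local.js that pulls its values from a global config file; the file name global-config.json and its keys are assumptions, so adjust them to match your actual global config:
// server/datasources.local.js
// Minimal sketch: merge values from a (hypothetical) global-config.json
// and environment variables into the mongoConnector datasource.
var globalConfig = require('../global-config.json');

module.exports = {
  mongoConnector: {
    name: 'mongoConnector',
    connector: 'mongodb',
    host: globalConfig.host || process.env.DB_HOST || '127.0.0.1',
    port: globalConfig.port || 27017,
    database: globalConfig.database || 'DB',
    user: globalConfig.user || '',
    password: globalConfig.password || '',
    url: ''
  }
};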
You can find some more info in the LoopBack docs.

Related

ExtJS Package Classic Resources Missing

I have an ExtJS package with the following structure:
Package/
  classic/
    resources/
      file.json
When I build the app with the package in production mode, the file.json is missing.
How can I get the build to include the resources from the classic directory (within a package)?
EDIT
Adding the following to package.json enables copying files from both the toolkit-specific and the shared resources directories.
"resource": {
"paths": [
"${package.dir}/resources",
"${package.dir}/${toolkit.name}/resources"
]
},
However, all the files (from classic/resources/ and resources/) are copied to the same directory (build/production/AppName/classic/resources/PackageName/), and if the same filename exists in both directories, one file overwrites the other in the build directory:
build/production/AppName/classic/resources/PackageName/some_resource_file.json
How can they be separated so that both files exist in the build?
In your main project folder you have the file app.json, where you can define which directories should be copied when building the project, as well as a rule for skipping some of them:
{
  "production": {
    "output": {
      "resources": "resources",
      // ...
    },
  },
  "resources": [{
    "path": "resources", // in your case classic/resources
    "output": "shared"
  }],
  /**
   * File / directory name pattern to ignore when copying to the builds. Must be a
   * valid regular expression.
   */
  "ignore": [
    "(^|/)CVS(/?$|/.*?$)"
  ]
}
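As a follow-up sketch, one way to stop the collision could be to list both resource directories explicitly and route them to different output folders. Whether your Sencha Cmd version accepts an arbitrary output name per entry is an assumption here, so treat this as a starting point to experiment with, not a verified configuration:
"resources": [{
  "path": "resources",          // the package's shared resources
  "output": "shared"
}, {
  "path": "classic/resources",  // the toolkit-specific resources
  "output": "classic"           // hypothetical separate output folder
}],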

Visual Studio Code SQL Server connection to encrypted database with Azure Key Vault (Always Encrypted)

I'm trying to get a connection from VSCode on macOS to a SQL Server database that uses the Always Encrypted mechanism to protect some of its columns. The master key is stored in an Azure Key Vault.
Using the Always Encrypted guide provided by Microsoft I was able to connect successfully to the database.
The same is true for a simple connection using VSCode on my Mac without turning on the encryption/decryption. I used the mssql extension and, by providing the necessary information within the settings, I was able to query the data.
Settings
"mssql.connections": [
{
"server": "XXXXXXXX.database.windows.net",
"database": "AlwaysEncrypted",
"authenticationType": "SqlLogin",
"user": "XXXXX",
"password": "",
"emptyPasswordInput": false,
"savePassword": true,
"profileName": "AlwaysEncrypted"
}
]
Query
SELECT * FROM EmployeeDetails
Result
[
  {
    "EmployeeDetailsId": "1",
    "EmployeeNo": "FE00000001",
    "FirstName": "0x013EC8AB61767E1C3D934AB061BCA658B6948637812450C8245DCE4C447F59FD1D6252069A36A67E3477E1C5FB24D860E72FBCC65F98C92B92AB873CE55349672A",
    "MiddleName": "0x015354526EC17EB1151AE918514E565507EDCB5691B4215C45798CA86EB11C47EECA579242926EDFE9F6543006177CBFC03E0F95CD0D8CAE6C941AE173AAF2B925",
    "LastName": "0x0170B3FD2B0416E0607312FB2A67B0F42798EC1871FEAB90AB81235ADACDE1C4F5614099FA3B61E59FEB2D6AD599CB3A9FD031FE56F327F0C80F4BA963EE7E155A",
    "DateOfBirth": "1985-08-12 00:00:00.000"
  }
]
Following the two guides
https://learn.microsoft.com/en-us/sql/connect/odbc/using-always-encrypted-with-the-odbc-driver?view=sql-server-2017
https://github.com/Microsoft/vscode-mssql/wiki/manage-connection-profiles
I did try to create another connection using the mssql extension and providing an ODBC connection string, but ultimately failed to get decrypted data when querying (the connection was established just fine). The result was the same as posted above.
Settings with Connection String
"mssql.connections": [
{
"server": "XXXXXXXX.database.windows.net",
"database": "AlwaysEncrypted",
"authenticationType": "SqlLogin",
"user": "XXXXX",
"password": "",
"emptyPasswordInput": false,
"savePassword": true,
"profileName": "AlwaysEncrypted_WithKeyVault",
"connectionString": "SERVER=XXXXXX.database.windows.net;Trusted_Connection=Yes;DATABASE=AlwaysEncrypted;ColumnEncryption=Enabled;KeyStoreAuthentication=KeyVaultPassword;KeyStorePrincipalId=USER.NAME#DOMAIN.com;KeyStoreSecret=PASSWORD"
}
]
Can anyone help me figure out how to set up the connection correctly, so that the encryption/decryption is done transparently when using VSCode?
Bit of a stale question, but for anyone who also ends up finding this:
I managed to get a successful connection in VS Code by having the following settings in my settings.json mssql.connections array:
{
  "server": "XXXX.serverhost.domain",
  "database": "XXXX",
  "authenticationType": "SqlLogin",
  "user": "XXXX",
  "password": "",
  "savePassword": true,
  "profileName": "XXXX",
  // specifically the settings below were the important ones
  "encrypt": true,
  "trustServerCertificate": true,
  "persistSecurityInfo": true
}

Adding local dependency in Zeppelin Helium

I am creating a Zeppelin Helium visualization and I need to add one local dependency. I am working on the Zeppelin 0.8 snapshot version.
I am not able to do it. I have tried adding it in the following manner: I used "*" for my module, and I also tried providing a relative path, without success.
My module has to be added locally.
{
  "name": "zeppelin_helium_xxx",
  "description": "xxx",
  "version": "1.0.0",
  "main": "heliumxxx",
  "author": "",
  "license": "Apache-2.0",
  "dependencies": {
    "mymodule": "*",
    "zeppelin-tabledata": "*",
    "zeppelin-vis": "*"
  }
}
Currently, Zeppelin doesn't support relative paths in the Helium JSON. You need to provide an absolute path in the artifact field.
Here is one example from https://github.com/1ambda/zeppelin-highcharts-columnrange/blob/master/zeppelin-highcharts-columnrange.json
{
  "type": "VISUALIZATION",
  "name": "zeppelin-highcharts-columnrange",
  "version": "local",
  "description": "Column range chart using highcharts library",
  "artifact": "/Users/lambda/github/1ambda/zeppelin-highcharts-columnrange",
  "icon": "<i class=\"fa fa-align-center\"></i>"
}
Additionally, there is a JIRA ticket for this issue.
https://issues.apache.org/jira/browse/ZEPPELIN-2097
You might also see a misleading error message when you load local Helium packages:
ERROR [2017-03-05 12:54:14,308] ({qtp1121647253-68}
HeliumBundleFactory.java[buildBundle]:131) - Can't get module name and version of package zeppelin-markdown-spell
If you see that, check the artifact value again; it is probably invalid.
https://issues.apache.org/jira/browse/ZEPPELIN-2212
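Applied to the package above, the setup might look like the sketch below. The absolute paths are placeholders, and using npm's file: protocol for the local module in package.json is my own assumption about how to wire it in, not something the answer or the Helium docs confirm:
// Helium descriptor for the package (paths are placeholders)
{
  "type": "VISUALIZATION",
  "name": "zeppelin_helium_xxx",
  "version": "local",
  "description": "xxx",
  "artifact": "/absolute/path/to/zeppelin_helium_xxx",
  "icon": "<i class=\"fa fa-bar-chart\"></i>"
}
// package.json dependencies, pointing the local module at an absolute file: path
"dependencies": {
  "mymodule": "file:/absolute/path/to/mymodule",
  "zeppelin-tabledata": "*",
  "zeppelin-vis": "*"
}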

Accessing GAE log files using Google Cloud Logging (Python)

We have a running Google App Engine (GAE) service for which we would like to download the logs for archival on our server.
The GAE service has a service account, the credentials for which have been downloaded as a JSON file to our server. The following code, run on our server, attempts to create a client for the logging service:
from google.cloud import logging
client = logging.Client.from_service_account_json('credentials.json')
with the result:
ValueError: Service account info was not in the expected format, missing fields token_uri, client_email.
The error message is quite clear, but what is not clear is why these fields are expected yet missing from a JSON file that was created for exactly this purpose. Are we using credentials from the wrong type of service account?
You need to get the service account key file that contains the private key credentials; it's basically a different file from the one you have.
You can get it, or create a new one, by going to https://console.developers.google.com/iam-admin/iam/, selecting your project, then selecting "Service accounts" and creating a new account, for example with the "viewer" role for the project (or use one that already exists and click "Create new key").
The "key" is a JSON or P12 file that is downloaded when you create the account (or when you use "Create new key" there) and which contains the correct fields and credentials that will work for your code.
Example structure of the downloaded "key" file (when selecting JSON):
{
  "type": "service_account",
  "project_id": "zeta-handler-9999",
  "private_key_id": "123456789deedbeaf",
  "private_key": "-----BEGIN PRIVATE KEY-----\nREDACTED REDACTED...-----END PRIVATE KEY-----\n",
  "client_email": "projectname-service-account@zeta-handler-9999.iam.gserviceaccount.com",
  "client_id": "12345678909999",
  "auth_uri": "https://accounts.google.com/o/oauth2/auth",
  "token_uri": "https://accounts.google.com/o/oauth2/token",
  "auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs",
  "client_x509_cert_url": "https://www.googleapis.com/robot/v1/metadata/x509/projectname-service-account%40zeta-handler-9999.iam.gserviceaccount.com"
}
Example code to use that "key" file (Python):
#!/usr/bin/env python
from google.oauth2 import service_account

# Build credentials from the downloaded key file
credentials = service_account.Credentials.from_service_account_file('downloaded_key.json')

# Optionally narrow the credentials to the scopes you need
scoped_credentials = credentials.with_scopes(['https://www.googleapis.com/auth/drive.metadata.readonly'])
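Coming back to the logging use case from the question, the same downloaded key file can be passed straight to the Cloud Logging client; downloaded_key.json is just a placeholder name, and the listing below is only a minimal sketch to verify that the credentials work:
from google.cloud import logging

# Create the logging client directly from the downloaded service account key
client = logging.Client.from_service_account_json('downloaded_key.json')

# Quick sanity check: print a few recent log entries
for entry in client.list_entries(page_size=10):
    print(entry.timestamp, entry.payload)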

Sencha SDK Tool / JSBuilder - include all files with *

Using Sencha Touch 1, I am having to manually create my app.jsb3 file (and add all linked JavaScript files to that .jsb3 file) in order to use JSBuilder to minify all the files.
Does anybody know the correct syntax for using a wildcard (.*) (referred to as a filter) to easily include all files in a folder?
I know the docs here, http://dev.sencha.com/deploy/JSBuilder2/JSB2FileFormat.txt, state that ".*" can be used, but I've had no joy:
"files": [
{"path": "views/", "name": "file1.js"},
{"path": "views/", "name": "file2.js"}, // currently manually adding each file
...
{"path": "views/", "name": ".*"}, // include all?
{"path": "views/.*"}, // include all?
]
The docs only show filters in relation to listing resources, not for listing files, which may be why!
