Terraform nested loop array of objects with an array in object - arrays

Just having this input in my module
databases = [
{
db_name = "test_0"
db_owner = "testu_user_0",
extensions = ["unaccent"]
},
{
db_name = "test_db"
db_owner = "test_user"
extensions = ["uuid_ossp","pg_trgm"]
}
]
And then i need to loop through and make specifiec extensions. How can i achieve that?
For the database creation it was pretty straightforward
resource "postgresql_database" "db" {
for_each = {for db in var.databases : db.db_name => db}
name = each.key
owner = postgresql_role.specific_role["${each.value.db_owner}"].name
lifecycle {
prevent_destroy = false
}
}
But now when it comes to extensions im having a hard time to make it happen. I can see examples online, but they all use object with an array, not an array filled with objects and plus nested array inside.
# resource "postgresql_extension" "uuid_ossp" {
# for_each = {for db in var.databases : db.db_name => db}
# name = "uuid-ossp"
# database = each.key
# }
Please help

You have to flatten your data structure, in locals for instance, and then use that in postgresql_extension:
locals {
db_extentions = merge([
for db in var.databases :
{
for ext in db.extensions:
"${db.db_name}-${ext}" => {
db_name = db.db_name
db_owner = db.db_owner
extension = ext
}
}
]...) # <-- the dots are important! Don't remove them
}
resource "postgresql_extension" "uuid_ossp" {
for_each = local.db_extentions
name = each.value.extension
database = each.value.db_name
}
The three dots are for Expanding Function Arguments.

Related

How to use terraform to enable Managed private endpoint on datafactory azure sql database linked service

I am trying to use terraform to create adf linked services however the terraform resource doesn't give the option to select an already existing managed private endpoint for the linked service to communicate over but when creating from the portal, this is possible. bellow is my code
resource "azurerm_data_factory" "process-adf" {
resource_group_name = module.resourcegroup.resource_group.name
location = module.resourcegroup.resource_group.location
name = "adf"
managed_virtual_network_enabled = true
public_network_enabled = false
tags = var.tags
identity {
type = "SystemAssigned"
}
}
resource "azurerm_data_factory_linked_service_azure_sql_database" "process-mssql-adf" {
name = "mssql-adf"
data_factory_id = azurerm_data_factory.process-adf.id
integration_runtime_name = azurerm_data_factory_integration_runtime_azure.adf.id
connection_string = "data source=servername;initial catalog=databasename;user id=admin;Password=password;integrated security=True;encrypt=True;connection timeout=30"
}
resource "azurerm_data_factory_managed_private_endpoint" "adf-msssql-pe" {
name = "adf"
data_factory_id = azurerm_data_factory.process-adf.id
target_resource_id = azurerm_mssql_server.process-control.id
subresource_name = "sqlServer"
}
resource "azurerm_data_factory_integration_runtime_azure" "adf" {
name = "adf"
data_factory_id = azurerm_data_factory.process-adf.id
location = module.resourcegroup.resource_group.location
virtual_network_enabled = true
}
how do i point the resource azurerm_data_factory_linked_service_azure_sql_database to the resource azurerm_data_factory_managed_private_endpoint ?

Is it possible for an Asp.Net Core Web Application to be integrated with local storage?

I have the below working on local host. However when I publish to azure the connection to my network doesn't work? I guess, this is obvious - my web app doesn't have connection to my local network. My question is, is there a way to allow a connection?
try
{
var path = "//192.168.49.14/Data/18 Customer Requests/Customer Request System/Request Letters/";
//Fetch all files in the Folder (Directory).
string[] filePaths = Directory.GetFiles(Path.Combine(path));
//Copy File names to Model collection.
var localFileList = new List<FileData>();
var assocFiles = new List<FileData>();
var linkedFiles = new List<FileData>();
assocFiles = localFileList.Where(x => x.FileName.Contains(id.ToString())).ToList();
foreach (var filePath in filePaths)
{
localFileList.Add(new FileData { FileName = Path.GetFileName(filePath) });
}
assocFiles = localFileList.Where(x => x.FileName.Contains(id.ToString())).ToList();
foreach (var item in assocFiles)
{
linkedFiles.Add(new FileData { FileName = Path.GetFileName(item.FileName) });
}
ViewBag.localFiles = linkedFiles;
}
catch
{
;
}
return View(Complaint);

Create database schema with terraform

I created RDS instance using aws_db_instance (main.tf):
resource "aws_db_instance" "default" {
identifier = "${module.config.database["db_inst_name"]}"
allocated_storage = 20
storage_type = "gp2"
engine = "mysql"
engine_version = "5.7"
instance_class = "db.t3.micro"
name = "${module.config.database["db_name_prefix"]}${terraform.workspace}"
username = "${module.config.database["db_username"]}"
password = "${module.config.database["db_password"]}"
parameter_group_name = "default.mysql5.7"
skip_final_snapshot = true
}
Can I also create database schemas from file schema.sql with terraform apply?
$ tree -L 1
.
├── main.tf
└── schema.sql
You can use a provisioner (https://www.terraform.io/docs/provisioners/index.html) for that:
resource "aws_db_instance" "default" {
identifier = module.config.database["db_inst_name"]
allocated_storage = 20
storage_type = "gp2"
engine = "mysql"
engine_version = "5.7"
instance_class = "db.t3.micro"
name = "${module.config.database["db_name_prefix"]}${terraform.workspace}"
username = module.config.database["db_username"]
password = module.config.database["db_password"]
parameter_group_name = "default.mysql5.7"
skip_final_snapshot = true
provisioner "local-exec" {
command = "mysql --host=${self.address} --port=${self.port} --user=${self.username} --password=${self.password} < ./schema.sql"
}
}
#Apply scheme by using bastion host
resource "aws_db_instance" "default_bastion" {
identifier = module.config.database["db_inst_name"]
allocated_storage = 20
storage_type = "gp2"
engine = "mysql"
engine_version = "5.7"
instance_class = "db.t3.micro"
name = "${module.config.database["db_name_prefix"]}${terraform.workspace}"
username = module.config.database["db_username"]
password = module.config.database["db_password"]
parameter_group_name = "default.mysql5.7"
skip_final_snapshot = true
provisioner "file" {
connection {
user = "ec2-user"
host = "bastion.example.com"
private_key = file("~/.ssh/ec2_cert.pem")
}
source = "./schema.sql"
destination = "~"
}
provisioner "remote-exec" {
connection {
user = "ec2-user"
host = "bastion.example.com"
private_key = file("~/.ssh/ec2_cert.pem")
}
command = "mysql --host=${self.address} --port=${self.port} --user=${self.username} --password=${self.password} < ~/schema.sql"
}
}
mysql client needs to be installed on your device.
If you don't have direct access to your DB, there is also a remote-exec provisioner, where you can use a bastion host (transfer file to remote place with file provisioner first).
If your schema is not to complex, you could also use the MySQL provider of terraform:
https://www.terraform.io/docs/providers/mysql/index.html

How can i establish rpc properties with the datasource type DB in Corda community edition?

To establish an RPC connection in the community edition we need to specify the rpc username, password and permissions but when we are integrating external database like MySQL and change the datasource type from INMEMORY to "DB" it does not allows to give user properties.
these are the settings I am using in my node.conf
security = {
authService = {
dataSource = {
type = "DB"
passwordEncryption = "SHIRO_1_CRYPT"
connection = {
jdbcUrl = "jdbc:mysql://localhost:3306"
username = "root"
password = "password"
driverClassName = "com.mysql.jdbc.Driver"
}
}
options = {
cache = {
expireAfterSecs = 120
maxEntries = 10000
}
}
}
Maybe I didn't understand your question, but database setup in node.conf is separate from RPC user setup in node.conf:
Database (PostGres in my case)
extraConfig = [
'dataSourceProperties.dataSourceClassName' : 'org.postgresql.ds.PGSimpleDataSource',
'dataSourceProperties.dataSource.url' : 'jdbc:postgresql://localhost:5432/postgres',
'dataSourceProperties.dataSource.user' : 'db_user',
'dataSourceProperties.dataSource.password' : 'db_user_password',
'database.transactionIsolationLevel' : 'READ_COMMITTED',
'database.initialiseSchema' : 'true'
]
RPC User
rpcUsers = [[ user: "rpc_user", "password": "rpc_user_password", "permissions": ["ALL"]]]
Ok, I'm adding my node's node.config (it's part of Corda TestNet, and it's deployed on Google Cloud):
baseDirectory = "."
compatibilityZoneURL = "https://netmap.testnet.r3.com"
emailAddress = "xxx"
jarDirs = [ "plugins", "cordapps" ]
sshd { port = 2222 }
myLegalName = "OU=xxx, O=TESTNET_xxx, L=London, C=GB"
keyStorePassword = "xxx"
trustStorePassword = "xxx"
crlCheckSoftFail = true
database = {
transactionIsolationLevel = "READ_COMMITTED"
initialiseSchema = "true"
}
dataSourceProperties {
dataSourceClassName = "org.postgresql.ds.PGSimpleDataSource"
dataSource.url = "jdbc:postgresql://xxx:xxx/postgres"
dataSource.user = xxx
dataSource.password = xxx
}
p2pAddress = "xxx:xxx"
rpcSettings {
useSsl = false
standAloneBroker = false
address = "0.0.0.0:xxx"
adminAddress = "0.0.0.0:xxx"
}
rpcUsers = [
{ username=cordazoneservice, password=xxx, permissions=[ ALL ] }
]
devMode = false
cordappSignerKeyFingerprintBlacklist = []
useTestClock = false

Dynamically add a "Field Collection" in Drupal 7 by script?

I want to add a "field collection" dynamically. But I'm not familiar with Field API or Entity API. New Entity API in Drupal is very poorly documented.
Here is my code, until now:
$node = node_load(1);
$field_collection_item = entity_create('field_collection_item', array('field_name' => 'field_book_text'));
$field_collection_item->setHostEntity('node', $node);
// Adding fields to field_collection
$field_collection_item.save();
"Field Collection" module use function "entity_form_submit_build_entity" which I cannot use because there is no form in my case.
I would appreciate if you can tell me how can I add fields?
Based on some code I used in a live project:
// Create and save research field collection for node.
$field_collection_item = entity_create('field_collection_item', array('field_name' => 'field_article_references'));
$field_collection_item->setHostEntity('node', $node);
$field_collection_item->field_reference_text[$node->language][]['value'] = 'ABCD';
$field_collection_item->field_reference_link[$node->language][]['value'] = 'link-val';
$field_collection_item->field_reference_order[$node->language][]['value'] = 1;
$field_collection_item->save();
Anyone using the above code samples should consider using the entity_metadata_wrapper function from the Entity API to set the values of fields on an entity instead of using an assignment operator. So, the code from the "more complete example" above would be:
if ($node->field_collection[LANGUAGE_NONE][0]) {
// update
$fc_item = reset(entity_load('field_collection_item', array($node->field_collection[LANGUAGE_NONE][0]['value'])));
}
else {
// create
$fc_item = entity_create('field_collection_item', array('field_name' => 'field_collection'));
$fc_item->setHostEntity('node', $node);
}
// Use the Entity API to "wrap" the field collection entity and make CRUD on the
// entity easier
$fc_wrapper = entity_metadata_wrapper('field_collection_item', $fc_item);
// ... set some values ...
$fc_wrapper->field_terms->set('lars-schroeter.com');
// save the wrapper and the node
// Note that the "true" is required due to a bug as of this time
$fc_wrapper->save(true);
node_save($node);
A more complete example:
if ($node->field_collection[LANGUAGE_NONE][0]) {
// update
$fc_item = reset(entity_load('field_collection_item', array($node->field_collection[LANGUAGE_NONE][0]['value'])));
}
else {
// create
$fc_item = entity_create('field_collection_item', array('field_name' => 'field_collection'));
$fc_item->setHostEntity('node', $node);
}
// ... set some values ...
$fc_item->field_terms[LANGUAGE_NONE][0]['value'] = 'lars-schroeter.com';
// save node and field-collection
$node->field_collection[LANGUAGE_NONE][0] = array('entity' => $fc_item);
node_save($node);
You don't need to call node_save($node) when using entity_metadata_wrapper. It will ensure that only the entity's data and the reference to the host are saved without triggering any node_save, which is a good performance boost.
However, you would still need node_save() if you have any node_save-triggered actions that use this field collection (e.g. a rule that sends emails when the node is edited).
use the wrappers, they are your friend:
// Create an Entity
$e = entity_create('node', array('type' => 'CONTENT_TYPE'));
// Specify the author.
$e->uid = 1;
// Create a Entity Wrapper of that new Entity
$entity = entity_metadata_wrapper('node',$e);
// Specify the title
$entity->title = 'Test node';
// Add field data... SO MUCH BETTER!
$entity->field_FIELD_NAME->set(1111);
// Save the node.
$entity->save();
You can find Entity API documented in Entity API Tutorial at Drupal.org.
There you can find some useful examples, especially check for Entity metadata wrappers page.
Here is example based on your variables:
$node = node_load(1);
$field_collection_item = entity_create('field_collection_item', array('field_name' => 'field_book_text')); // field_book_text is field collection
$field_collection_item->setHostEntity('node', $node);
$cwrapper = entity_metadata_wrapper('field_collection_item', $field_collection_item);
// Adding fields to field_collection
$cwrapper->field_foo_text->set("value");
$cwrapper->field_foo_multitext->set(array("value1", "value2"));
$cwrapper.save();
Here is another example using field collections from above docs page:
<?php
// Populate the fields.
$ewrapper = entity_metadata_wrapper('node', $node);
$ewrapper->field_lead_contact_name->set($contact_name);
$ewrapper->field_lead_contact_phone->set($contact_phone);
$ewrapper->field_lead_contact_email->set($contact_email);
// Create the collection entity and set it's "host".
$collection = entity_create('field_collection_item', array('field_name' => 'field_facilities_requested'));
$collection->setHostEntity('node', $node);
// Now define the collection parameters.
$cwrapper = entity_metadata_wrapper('field_collection_item', $collection);
$cwrapper->field_facility->set(intval($offset));
$cwrapper->save();
// Save.
$ewrapper->save();
?>
Here is more advanced example of mine which for given entity it loads taxonomy term references from field_rs_property_features, then for each secondary term which has a parent term, adds its term name and its parent term name into field_feed_characteristics_value by grouping them together into title (parent) and value (child). It's probably more difficult to explain without seeing the code. So here it is:
/**
* Function to set taxonomy term names based on term references for given entity.
*/
function MYMODULE_refresh_property_characteristics(&$entity, $save = FALSE) {
try {
$w_node = entity_metadata_wrapper('node', $entity);
$collections = array();
foreach ($w_node->field_rs_property_features->getIterator() as $delta => $term_wrapper) {
if ($term_wrapper->parent->count() > 0) {
$name = $term_wrapper->name->value();
$pname = $term_wrapper->parent->get(0)->name->value();
if (array_key_exists($pname, $collections)) {
$collections[$pname]->field_feed_characteristics_value[] = $name;
} else {
// Create the collection entity, set field values and set it's "host".
$field_collection_item = entity_create('field_collection_item', array('field_name' => 'field_feed_characteristics'));
$field_collection_item->setHostEntity('node', $w_node->value());
$collections[$pname] = entity_metadata_wrapper('field_collection_item', $field_collection_item);
$collections[$pname]->field_feed_characteristics_title = $pname;
$collections[$pname]->field_feed_characteristics_value = array($name);
}
}
}
if ($save) {
$w_node->save();
}
} catch (Exception $e) {
drupal_set_message(t('Error setting values for field collection: #title, message: #error.',
array('#title' => $w_node->title->value(), '#error' => $e->getMessage())), 'error');
watchdog_exception('MYMODULE', $e);
return FALSE;
}
return TRUE;
}

Resources