SSDT: How to deploy a .dacpac without dropping columns that exist only on the target database

I'm using the Database project in Visual Studio.
In my scenario, my customers customize the database I provide for them on their own, e.g. adding columns to existing tables or adding new tables.
How can I ensure that, when I deploy the .dacpac, these objects that are not part of my schema are not dropped?
Important:
Setting DropObjectsNotInSource=False does not work for table columns.
EDIT
Ed, please see if I am doing something wrong:
using (DacPackage dacPackage = DacPackage.Load(DacPacFileName))
{
    var dacServices = new DacServices(ConnectionString);
    var dacOptions = new DacDeployOptions
    {
        AdditionalDeploymentContributorArguments = "SqlPackageFilter=KeepTableColumns(*)"
    };
    dacServices.Deploy(dacPackage, NomeBancoDados, true, dacOptions);
}
Does '*' in the table filter mean "all tables"?
I tried a specific table name too, but it did not work.

I wrote this for exactly this scenario:
https://agilesqlclub.codeplex.com
You can "Keep" objects, which deploys them if they do not exist but does not deploy changes to them, or "Ignore" them to leave them completely alone.
You can do this by type or by name (regex).
Ed
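For reference, a minimal sketch of wiring the filter into a DacFx deployment like the one in the question. The contributor name AgileSqlClub.DeploymentFilterContributor is an assumption (it has to match the contributor shipped in the filter DLL, and that DLL must be discoverable by DacFx at deploy time); the filter argument is copied verbatim from the question.

using Microsoft.SqlServer.Dac;

class DeploySketch
{
    static void Deploy(string dacPacFileName, string connectionString, string databaseName)
    {
        using (DacPackage dacPackage = DacPackage.Load(dacPacFileName))
        {
            var dacServices = new DacServices(connectionString);
            var dacOptions = new DacDeployOptions
            {
                // Keep objects that only exist on the target database
                DropObjectsNotInSource = false,
                // Register the filter contributor explicitly (name is an assumption)
                AdditionalDeploymentContributors = "AgileSqlClub.DeploymentFilterContributor",
                // Tell the filter what to keep (argument copied from the question)
                AdditionalDeploymentContributorArguments = "SqlPackageFilter=KeepTableColumns(*)"
            };
            dacServices.Deploy(dacPackage, databaseName, true, dacOptions);
        }
    }
}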

Related

ODI 12c CUSTOM_TEMPLATE extract option not working

I'm using the CUSTOM_TEMPLATE extract option on the source table to force the select to actually read from another table. That select is then used by a custom IKM in order to get the column list of the "forced" table with the odiRef.getColList API. But the template select query is not considered at all during execution, so the IKM still gets the columns from the original table, which I don't need.
The code in the CUSTOM_TEMPLATE is:
select *
from <%=odiRef.getObjectName("L", "#V_OFFL_TABLE_NAME", "OFFLOAD_AREA_HIST", "DWH_LCL", "D") %>
where src_date_from_dt = to_date('V_OFFL_TRANSFER_DATE','YYYY-MM-DD')
The code in the SOURCE tab of the custom IKM I made is:
select <%=odiRef.getSrcColList("","[COL_NAME]",",\n","")%>
from <%=odiRef.getObjectName("L", "#V_OFFL_TABLE_NAME", "OFFLOAD_AREA_HIST", "DWH_LCL", "D") %>
where src_date_from_dt = to_date('V_OFFL_TRANSFER_DATE','YYYY-MM-DD')
In this case I'm trying odiRef.getSrcColList in the IKM, but I've also tried odiRef.getColList - same result.
Try creating a dummy datastore in the model where you have that table.
You can add dummy or generic attributes to that datastore (Var1, Var2, Var3... Num1, Num2, etc.).
Drag this datastore to the source area and add the CUSTOM_TEMPLATE there.
Make sure the dummy datastore has the same logical schema as the table in your custom query.
This also works with multi-table queries.

Common strategy in handling concurrent global 'inventory' updates

To give a simplified example:
I have a database with one table, names, which has 1 million records, each containing a common boy's or girl's name, with more added every day.
I have an application server that takes as input an HTTP request from parents using my website 'Name Chooser'. With each request, I need to pick a name from the db and return it, and then NOT give that name to another parent. The server is concurrent so it can handle a high volume of requests, yet it has to respect "unique name per request" and still be highly available.
What are the major components and strategies for an architecture of this use case?
From what I understand, you have two operations: adding a name and choosing a name.
I have a couple of questions:
Question 1: Do parents choose names only, or do they also add names?
Question 2: If they add names, does that mean that when a name is added it should also be marked as already chosen?
Assuming that you don't want all name selection requests to wait for one another (by locking or queueing them):
One solution to resolve concurrency, in the case of choosing a name only, is to use an Optimistic Offline Lock.
The most common implementation is to add a version field to your table and increment it when you mark a name as chosen. You will need DB support for this, but most databases offer a mechanism for it. MongoDB adds a version field to documents by default; for an RDBMS (like SQL Server) you have to add the field yourself.
You haven't specified what technology you are using, so I will give an example using pseudocode for an SQL DB. For MongoDB you can check how the DB makes these checks for you.
NameRecord {
    id,
    name,
    parentID,
    version,
    isChosen,

    function chooseForParent(parentID) {
        if (this.isChosen) {
            throw Error/Exception;
        }
        this.parentID = parentID;
        this.isChosen = true;
        this.version++;
    }
}

NameRecordRepository {
    function getByName(name) { ... }

    function save(record) {
        var oldVersion = record.version - 1;
        var query = "UPDATE records SET .....
                     WHERE id = {record.id} AND version = {oldVersion}";
        var rowsCount = db.execute(query);
        if (rowsCount == 0) {
            throw ConcurrencyViolation;
        }
    }
}

// somewhere else in an object or module or whatever...
function chooseName(parentID, name) {
    var record = NameRecordRepository.getByName(name);
    record.chooseForParent(parentID);
    NameRecordRepository.save(record);
}
Before this object is saved to the DB, a version comparison must be performed. SQL provides a way to execute a query based on some condition and return the number of affected rows. In our case we check whether the version in the database is still the old one before the update; if it isn't, someone else has updated the record.
In this simple case you can even remove the version field and use the isChosen flag in your SQL query like this:
var query = "UPDATE records SET .....
             WHERE id = {record.id} AND isChosen = false";
When adding a new name to the database you will need a unique constraint, which will solve the concurrency issues there.
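A minimal sketch of that affected-rows check in C# with ADO.NET; the table and column names (NameRecords, IsChosen, ParentId, Version) are hypothetical and simply mirror the pseudocode above.

using System;
using System.Data.SqlClient;

class NameChooser
{
    // Marks the name as chosen only if nobody else has chosen it in the meantime.
    public static void ChooseName(string connectionString, int recordId, int parentId, int expectedVersion)
    {
        const string sql =
            @"UPDATE NameRecords
              SET IsChosen = 1, ParentId = @parentId, Version = Version + 1
              WHERE Id = @id AND Version = @expectedVersion AND IsChosen = 0";

        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(sql, connection))
        {
            command.Parameters.AddWithValue("@parentId", parentId);
            command.Parameters.AddWithValue("@id", recordId);
            command.Parameters.AddWithValue("@expectedVersion", expectedVersion);

            connection.Open();
            // Zero affected rows means the version (or flag) no longer matched:
            // another request won the race, so surface a concurrency violation.
            if (command.ExecuteNonQuery() == 0)
            {
                throw new InvalidOperationException("Concurrency violation: name already chosen.");
            }
        }
    }
}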

Is there a way to find the forced execution context on datastore objects somewhere in the ODI metadata database?

I have an ODI 12c project with 30 mappings. I need to check that every "Component context" on every datastore object (source or target) is set to "Execution context" (not forced).
Is there a way to achieve this by querying the ODI underlying database, so I don't have to do it manually and can avoid possible mistakes?
I have a list of ODI 12c repository tables and comments on their columns, which I got from the Oracle support website, and after hours of digging through the database I still can't find this information stored in any table.
My package is located in SNP_PACKAGE, SNP_MAPPING has info about the mapping, and SNP_MAP_COMP describes the objects in a mapping.
I have searched through many other tables as well.
A bit late, but for anyone else looking:
Messing about in the tables is a no-no. The APIs are better, especially if you are going to modify anything.
https://docs.oracle.com/en/middleware/data-integrator/12.2.1.3/odija/index.html
Run the following Groovy script in ODI (Tools/Groovy/New Script). It should be simple enough to modify. Using the SDK gets a lot easier if you manage to set up a complete development environment in IntelliJ or another Java IDE. Groovy in ODI opens up a whole new world.
//Created by DI Studio
import oracle.odi.domain.mapping.Mapping
import oracle.odi.domain.mapping.finder.IMappingFinder

tme = odiInstance.getTransactionalEntityManager()
IMappingFinder mapf = (IMappingFinder) tme.getFinder(Mapping.class)
Collection<Mapping> mappings = mapf.findByProject("PROJECT","FOLDER")
println("Found ${mappings.size()} mappings")
mappings.each { map ->
    map.physicalDesigns.each { phys ->
        phys.physicalNodes.each { node ->
            println("${map.project.name}...${map.parentFolder.parentFolder?.name}.${map.parentFolder.name}.${map.name}.${phys.name}.${node.name}.defaultContext=${(node.context.defaultContext) ? "default" : node.context.name}")
        }
    }
}
It prints either default or the set (forced) context. Forced context seems to have been deprecated in 12c; node.context.defaultContext on the physical node appears to mirror Component Context (Forced) in ODI Studio 12.2.1.3.
Update 2019-12-20 - including getExecutionContextName
The following script lists the same information hierarchically, which may make the output easier to read. I'm not sure it gives you exactly what you were originally after without having a mapping with your exact setup.
//Created by DI Studio
import oracle.odi.domain.mapping.Mapping
import oracle.odi.domain.mapping.finder.IMappingFinder
import oracle.odi.domain.mapping.component.DatastoreComponent

tme = odiInstance.getTransactionalEntityManager()
String project = "PROJECT"
String parentFolder = "PARENT_FOLDER"
IMappingFinder mapf = (IMappingFinder) tme.getFinder(Mapping.class)
Collection<Mapping> mappings = mapf.findByProject(project, parentFolder)
println("Found ${mappings.size()} mappings")
println "Project: ${project}"
mappings.each { map ->
    println "\tMapping: ..${map.parentFolder.parentFolder?.name}/${map.parentFolder.name}/${map.name}"
    map.physicalDesigns.each { phys ->
        println "\t\tPhysical: ${phys.name}"
        phys.physicalNodes.each { node ->
            println "\t\t\tNode: ${node.name}"
            println "\t\t\t\tdefaultContext: ${(node.context.defaultContext)}"
            println "\t\t\t\tNode context name: ${node.context.name}"
            println "\t\t\t\tDatastoreComponent ExecutionContextName: ${DatastoreComponent.getDatastoreComponent(node)?.getExecutionContextName(node).toString()}"
        }
    }
}
The repository tables and columns used in the query below might hold the value you are looking for.
They are from ODI 12.1.2; depending on the exact ODI version you are using, the structure could be a little different.
Here is a query to retrieve this information directly from the database.
-- Forced Contexts on Datastores in Mapping
SELECT MAPP.NAME MAP_NAME,
       MAPP_COMP.NAME DATASTORE_NAME,
       MAPP_REF.QUALIFIED_NAME FORCE_CONTEXT
FROM SNP_MAPPING MAPP
INNER JOIN SNP_MAP_REF MAPP_REF
    ON MAPP_REF.I_OWNER_MAPPING = MAPP.I_MAPPING
INNER JOIN SNP_MAP_PROP MAPP_PROP
    ON MAPP_REF.I_MAP_REF = MAPP_PROP.I_PROP_XREF_VALUE
INNER JOIN SNP_MAP_COMP MAPP_COMP
    ON MAPP_COMP.I_MAP_COMP = MAPP_PROP.I_MAP_COMP
WHERE MAPP_REF.ADAPTER_INTF_TYPE = 'IContext'
  AND MAPP.NAME LIKE '%yourMapping%'

Dynamic variables in Database Project views

I have a Database Project with some views.
The views should behave differently depending on the environment they are published to.
When published to the development environment, the INNER JOINs should use one prefix for the target schema name, and a different prefix in the test environment.
Is it possible to achieve this? In the code snippet below, I'd like to use Hub when developing locally and when publishing to the dev environment, and ISA when publishing to test.
Example:
CREATE VIEW [ISA].[v_CoveredRisk]
AS SELECT
CR.Bkey_CoveredRisk_Unique
,CO.Bkey_Coverage_Unique
,CO.Name
,PO.EKey_Policy
,CoObj.Bkey_CoveredObject
,CoObj.BKey_Building
,CoObj.Bkey_Home
,CoObj.BKey_Object
,CoObj.BKey_Person
,CoObj.BKey_Pet
,CoObj.BKey_Vehicle
,Risk_Excess
,Risk_Sum
,CAST(CurrentYearPremiumAmount AS float) AS CurrentYearPremiumAmount
,IsActive
,PO.BKey_Policy
,CR.Record_Timestamp
FROM Hub.[CoveredRisk] CR
INNER JOIN Hub.Coverage CO ON CR.EKey_Coverage = CO.EKey_Coverage
INNER JOIN Hub.CoveredObject CoObj ON CR.EKey_CoveredObject = CoObj.EKey_CoveredObject
INNER JOIN Hub.[Policy] PO ON CR.EKey_Policy = PO.Ekey_Policy
The first thing is that you should remove this requirement and have the code be the same in all your databases. You are almost 100% guaranteed to make a mistake at some point and deploy something that doesn't work in a different environment.
If you do want to do this, you can do it with synonyms: in your view, reference a synonym and have that synonym point to the respective schema. You can't make a synonym point to a schema directly, but you can point it at objects within a schema, so if you have:
a dev table devSchema.table
a prod table prodSchema.table
then in dev, create a synonym like:
create synonym Hub.table for devSchema.table
Your view then references Hub.table, and it will resolve to the dev table.
Have a T4 template that generates your SQL script.
In your template, find out which environment you are running against and create the output accordingly.
To find out which environment you need to generate the output for, have a look at publishing profiles and configuration-specific variables (a.k.a. "conditional compilation symbols" in your project's build properties).
You can probe those in the T4 template.
You can't use variables in schema or object names. If you really want to achieve what you describe, I can suggest two ways:
The first: you don't control it with variables but with release configurations, using conditional statements in the sqlproj file. So, I've created 2 views:
CREATE VIEW [HUB].[View1]
AS SELECT 1 as one;
CREATE VIEW [ISA].[View1]
AS SELECT 1 as one;
Then in sqlproj file I do following thing:
<Build Include="View1.sql" />
<None Include="View1.sql" Condition=" '$(Configuration)' == 'Debug'" />
<None Include="View1_1.sql" />
<Build Include="View1_1.sql" Condition=" '$(Configuration)' == 'Release' " />
And then just pick the right release configuration when you deploy.
NOTE: You need to both include and exclude the same file from the build to achieve this.
The second way is much simpler: always create the view with the same name and move it to the proper schema in the publish script. For example:
IF DB_NAME() = 'Dev' EXEC sp_rename ....

db.SubmitChanges() not updating the database in Windows phone 8

My question might be similar to many questions you would find in a Google search, but I have a specific query. I have written my code like this, where dB is the database and Items is the table, with FileName as one of its properties.
var query = from fs in dB.Items
            where fs.FilePath.Trim() == strOldpath.ToString()
            select fs;

foreach (var fs in query)
{
    fs.FileName = txtrename.Text.ToString();
}

try
{
    dB.SubmitChanges();
}
catch (Exception e)
{
}
This code runs fine, but after debugging I stop the emulator and run the following in the command prompt:
ISETool.exe ts xd 19xxxx-b6f2-474b-a747-6axxxxxxx E:\Practise\WinPhone\PhoneApp3\
It creates the *.sdf in the specified folder and I can open it in Server Explorer. But I can see that it shows the old file name instead of the updated FileName, even though the code runs fine. Any idea why the file name is not updated? I have set the primary key for the table as well.
You appear to have hit a known issue with trying to update the results of a read-only query:
Workaround for LINQ to SQL Entity Identity Caching and Compiled Query Bug?
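As a hedged diagnostic sketch (not the fix itself), you can check whether the DataContext is actually tracking your edit before calling SubmitChanges; if the pending change set is empty, the entities came back read-only/cached and you are hitting the issue above. MyDataContext and connectionString are assumptions here; Items, FilePath and FileName come from the question.

// Hypothetical diagnostic: verify that the edit is registered as a pending update.
using (var dB = new MyDataContext(connectionString))
{
    var query = from fs in dB.Items
                where fs.FilePath.Trim() == strOldpath.ToString()
                select fs;

    foreach (var fs in query)
    {
        fs.FileName = txtrename.Text.ToString();
    }

    // An empty Updates collection here means the DataContext never saw the change,
    // which points at the read-only/identity-caching behaviour described above.
    var changeSet = dB.GetChangeSet();
    System.Diagnostics.Debug.WriteLine("Pending updates: " + changeSet.Updates.Count);

    dB.SubmitChanges();
}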
