My question might be similar to many questions in a Google search, but I have a specific query. I have written my code like this, where dB is the database and Items is the table with FileName as one of its properties:
var query = from fs in dB.Items
            where fs.FilePath.Trim() == strOldpath.ToString()
            select fs;

foreach (var fs in query)
{
    fs.FileName = txtrename.Text.ToString();
}

try
{
    dB.SubmitChanges();
}
catch (Exception e)
{
}
This code runs fine, but after debugging I stop the emulator and run the following in the command prompt:
ISETool.exe ts xd 19xxxx-b6f2-474b-a747-6axxxxxxx E:\Practise\WinPhone\PhoneApp3\
It creates the *.sdf file in the specified folder and I can open it in Server Explorer. But instead of the updated file name it still shows the old file name, even though the code runs fine. Any idea why the file name is not updated? I have also set the primary key for the table.
You appear to have hit a known issue with trying to update the results of a read-only query:
Workaround for LINQ to SQL Entity Identity Caching and Compiled Query Bug?
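Separately from that bug, one common cause of silently discarded updates is a read-only DataContext. A minimal sketch of a check worth ruling out (MyDataContext and the connection string are assumptions; the rest reuses the names from the question):

using (var dB = new MyDataContext(connectionString))   // hypothetical context type
{
    // A read-only context (ObjectTrackingEnabled = false) silently ignores edits;
    // it must be true (the default) before the entities are queried.
    dB.ObjectTrackingEnabled = true;

    var item = dB.Items.FirstOrDefault(fs => fs.FilePath.Trim() == strOldpath.ToString());
    if (item != null)
    {
        item.FileName = txtrename.Text;
        dB.SubmitChanges();
    }
}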
I have an ODI 12c project with 30 mappings. I need to check if every "Component context" on every datastore object (source or target) is set to "Execution context" (not forced).
Is there a way to achieve this by querying the ODI underlying database, so I don't have to do it manually and can avoid possible mistakes?
I have a list of ODI 12c repository tables and comments on table columns, which I got from the Oracle support website, and after hours of digging through the database I still can't find this information stored in any table.
My package is located in SNP_PACKAGE, SNP_MAPPING has info about the mapping, and SNP_MAP_COMP describes the objects in the mapping.
I have searched through many different tables as well.
A bit late, but for anyone else looking:
Messing about in the tables directly is a no-no; the APIs are better, especially if you are going to modify anything.
https://docs.oracle.com/en/middleware/data-integrator/12.2.1.3/odija/index.html
Run the following Groovy script in ODI (Tools/Groovy/New Script). It should be simple enough to modify. Using the SDK gets a lot easier if you manage to set up a complete development environment in IntelliJ or another Java IDE. Groovy in ODI opens up a whole new world.
//Created by DI Studio
import oracle.odi.domain.mapping.Mapping
import oracle.odi.domain.mapping.finder.IMappingFinder

tme = odiInstance.getTransactionalEntityManager()
IMappingFinder mapf = (IMappingFinder) tme.getFinder(Mapping.class)
Collection<Mapping> mappings = mapf.findByProject("PROJECT","FOLDER")
println("Found ${mappings.size()} mappings")
mappings.each { map ->
    map.physicalDesigns.each { phys ->
        phys.physicalNodes.each { node ->
            println("${map.project.name}...${map.parentFolder.parentFolder?.name}.${map.parentFolder.name}.${map.name}.${phys.name}.${node.name}.defaultContext=${(node.context.defaultContext) ? "default" : node.context.name}")
        }
    }
}
It prints either default or the set (forced) context. It seems the forced context has been deprecated in 12c; the physical node's context.defaultContext seems to mirror Component Context (Forced) in ODI Studio 12.2.1.3.
Update 2019-12-20 - including getExecutionContextName
The following script prints the same information in a hierarchical manner, and the code may be easier to read. I'm not sure it gives you exactly what you were originally after without having a mapping with your exact setup.
//Created by DI Studio
import oracle.odi.domain.mapping.Mapping
import oracle.odi.domain.mapping.finder.IMappingFinder
import oracle.odi.domain.mapping.component.DatastoreComponent

tme = odiInstance.getTransactionalEntityManager()

String project = "PROJECT"
String parentFolder = "PARENT_FOLDER"

IMappingFinder mapf = (IMappingFinder) tme.getFinder(Mapping.class)
Collection<Mapping> mappings = mapf.findByProject(project, parentFolder)
println("Found ${mappings.size()} mappings")
println "Project: ${project}"
mappings.each { map ->
    println "\tMapping: ..${map.parentFolder.parentFolder?.name}/${map.parentFolder.name}/${map.name}"
    map.physicalDesigns.each { phys ->
        println "\t\tPhysical: ${phys.name}"
        phys.physicalNodes.each { node ->
            println "\t\t\tNode: ${node.name}"
            println "\t\t\t\tdefaultContext: ${(node.context.defaultContext)}"
            println "\t\t\t\tNode context name: ${node.context.name}"
            println "\t\t\t\tDatastoreComponent ExecutionContextName: ${DatastoreComponent.getDatastoreComponent(node)?.getExecutionContextName(node).toString()}"
        }
    }
}
Below are some tables and columns that might hold the value you are looking for.
They are from ODI 12.1.2; depending on the exact ODI version you are using, the structure could be slightly different.
Here is also a query to retrieve this information directly from the database.
-- Forced Contexts on Datastores in Mapping
SELECT MAPP.NAME MAP_NAME,
       MAPP_COMP.NAME DATASTORE_NAME,
       MAPP_REF.QUALIFIED_NAME FORCE_CONTEXT
FROM SNP_MAPPING MAPP
INNER JOIN SNP_MAP_REF MAPP_REF
        ON MAPP_REF.I_OWNER_MAPPING = MAPP.I_MAPPING
INNER JOIN SNP_MAP_PROP MAPP_PROP
        ON MAPP_REF.I_MAP_REF = MAPP_PROP.I_PROP_XREF_VALUE
INNER JOIN SNP_MAP_COMP MAPP_COMP
        ON MAPP_COMP.I_MAP_COMP = MAPP_PROP.I_MAP_COMP
WHERE MAPP_REF.ADAPTER_INTF_TYPE = 'IContext'
  AND MAPP.NAME LIKE '%yourMapping%'
When I have a query generated like this:
var query = from x in Entities.SomeTable
            select x;
I can set a breakpoint and, after hovering the cursor over query, see the SQL command that will be sent to the database. Unfortunately I cannot do that when I use Count:
var query = (from x in Entities.SomeTable
             select x).Count();
Of course I could see what reaches SQL Server using the profiler, but maybe someone has an idea how to do it (if it is possible) in VS.
You can use ToTraceString():
// ToTraceString() is exposed by ObjectQuery; Count() returns an int,
// so cast the query itself rather than the Count() result.
var query = (ObjectQuery<SomeTable>)(from x in Entities.SomeTable select x);
Console.WriteLine(query.ToTraceString());
You can use Database.Log to log any query made, like this:
using (var context = new MyContext())
{
    context.Database.Log = Console.Write;
    // Your code here...
}
Usually, in my context's constructor, I set that to my logger (whether it is NLog, log4net, or the stock .NET loggers) rather than the console; the actual logging tool is irrelevant.
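A minimal sketch of that constructor wiring, assuming EF6, a hypothetical MyContext, and NLog as the logger (swap in whatever logging tool you use):

using System.Data.Entity;

public class MyContext : DbContext
{
    private static readonly NLog.Logger Log = NLog.LogManager.GetCurrentClassLogger();

    public MyContext()
    {
        // Route every SQL statement EF generates to the logger instead of the console.
        Database.Log = sql => Log.Debug(sql);
    }
}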
For more information
In EF6 and above, you can use the following before your query:
context.Database.Log = s => System.Diagnostics.Debug.WriteLine(s);
I've found this to be quicker than pulling up SQL Profiler and running a trace.
Also, this post talks more about this topic:
How do I view the SQL generated by the Entity Framework?
I'm using the Database project in Visual Studio.
In my scenario I have customers who customize the database I provide for them on their own, e.g. adding columns to existing tables or adding new tables...
How can I ensure that, when deploying the .dacpac, these objects that are not part of my schema are not dropped?
Important:
Setting DropObjectsNotInSource=False does not work for table columns.
EDIT
Ed, please see if I am doing something wrong:
using (DacPackage dacPackage = DacPackage.Load(DacPacFileName))
{
    var dacServices = new DacServices(ConnectionString);
    var dacOptions = new DacDeployOptions
    {
        AdditionalDeploymentContributorArguments = "SqlPackageFilter=KeepTableColumns(*)"
    };
    dacServices.Deploy(dacPackage, NomeBancoDados, true, dacOptions);
}
Does '*' in the table filter mean "all tables"?
I tried a table name too, but it did not work.
I wrote this for this scenario:
https://agilesqlclub.codeplex.com
You can "Keep" which deploys objects if they do not exist but does not deploy if there are changes or "Ignore" to completely ignore.
You can do this on type or name (regex)
Ed
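A hedged sketch of how such a filter is typically wired up in code, building on the snippet in the question. The contributor name below is an assumption taken from the filter's documentation, not something verified here:

var dacOptions = new DacDeployOptions
{
    // The contributor has to be registered as well, not only given its arguments.
    // "AgileSqlClub.DeploymentFilterContributor" is assumed; check the filter's docs.
    AdditionalDeploymentContributors = "AgileSqlClub.DeploymentFilterContributor",
    AdditionalDeploymentContributorArguments = "SqlPackageFilter=KeepTableColumns(*)"
};
dacServices.Deploy(dacPackage, NomeBancoDados, true, dacOptions);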
I'm looking for a way to generate an XLS (XLSX) file from SQL Server in an ASP.NET MVC application.
Currently I use EPPlus: I execute a SQL query and the result is saved to XLSX by that library.
Here I have a performance issue: the more data there is, the longer generation takes (150 rows take about 10 seconds on average, which is too long).
My first idea was to execute a query that returns XML and then transform it to XLS with an XSLT template, but then Excel complains on opening that the format and extension don't match.
My second idea was to execute a query that returns XLS (or XLSX) directly from the DB, but I don't know how to do that, because the file is saved on the server and I don't know how to send it to the client for download.
Does anyone have experience with similar problems and can you help me?
Thanks for any ideas.
//EDIT
Here is a short example of my code:
int row = 0;
foreach (var obj1 in objList1)
{
    WriteObj1(ref row, obj1);
    var objList2 = GetObj2(obj1.Id);
    foreach (var obj2 in objList2)
    {
        WriteObj2(ref row, obj2);
        var objList3 = GetObj3(obj2.Id);
        foreach (var obj3 in objList3)
        {
            WriteObj3(ref row, obj3);
            var objList4 = GetObj4(obj3.Id);
            foreach (var obj4 in objList4)
            {
                WriteObj4(ref row, obj4);
                var objList5 = GetObj5(obj4.Id);
                foreach (var obj5 in objList5)
                {
                    WriteObj5(ref row, obj5);
                }
            }
        }
    }
}
and each inner write method contains code like this:
// create header in Excel
...
workSheet.Cells[row, column++].Value = someValue;
You can do this with a simple INSERT INTO.
Something like:
INSERT INTO OPENROWSET('Microsoft.Jet.OLEDB.4.0',
    'Excel 8.0;Database=[path];',
    'SELECT * FROM [Sheet1$]')
SELECT * FROM [table]
Where [path] is the full path to the Excel workbook, [Sheet1$] is the target worksheet, and the final SELECT statement should be self-explanatory.
For more information on OPENROWSET you can check out the Microsoft documentation
Since your updated question differs from the original one, I'm writing a new answer:
The problem with your code is the nested loops: this way you end up with thousands of individual cell writes.
Instead, try to build e.g. an array with all the values first and afterwards use a range (Excel.Range, or the EPPlus equivalent) to write the whole array at once, as sketched below.
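A minimal sketch of that approach with EPPlus (the BuildRows helper, sheet name, and file name are assumptions; the point is one bulk write instead of thousands of per-cell writes):

using OfficeOpenXml;

// Flatten the whole object hierarchy into rows up front (replaces the nested WriteObj* calls).
List<object[]> rows = BuildRows(objList1);   // hypothetical helper

using (var package = new ExcelPackage())
{
    var sheet = package.Workbook.Worksheets.Add("Export");
    // Write all rows in a single call starting at A1.
    sheet.Cells["A1"].LoadFromArrays(rows);
    package.SaveAs(new System.IO.FileInfo("export.xlsx"));
}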
Next time, please use the search function first. There are dozens of (answered) questions about slow Excel inserts.
In the code below, pathToNonDatabase is the path to a simple text file, not a real SQLite database. I was hoping sqlite3_open would detect that, but it doesn't (db is not NULL and result is SQLITE_OK). So, how do I detect that a file is not a valid SQLite database?
sqlite3 *db = NULL;
int result = sqlite3_open(pathToNonDatabase, &db);
if ((NULL == db) || (result != SQLITE_OK)) {
    // invalid database
}
SQLite opens databases lazily. Just do something immediately after opening that requires the file to be a database.
The best is probably pragma schema_version;.
This will report 0 if the database hasn't been created yet (for instance, an empty file). In that case, it's safe to work with (and run CREATE TABLE, etc.).
If the database has been created, it will return how many revisions the schema has gone through. This value might not be interesting, but that it's not zero is.
If the file exists and isn't a database (or empty), you'll get an error.
If you want a somewhat more thorough check, you can use pragma quick_check;. This is a lighter-weight integrity check, which skips checking that the contents of the tables line up with the indexes. It can still be very slow.
Avoid integrity_check. It not only checks every page, but then verifies the contents of the tables against the indexes. This is positively glacial on a large database.
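For illustration, a minimal sketch of the pragma schema_version check from C# with System.Data.SQLite (the method name and connection-string details are assumptions; the pragma throws on a file that is not a database):

using System.Data.SQLite;

private static bool LooksLikeSqliteDatabase(string filename)
{
    try
    {
        using (var db = new SQLiteConnection("Data Source=" + filename + ";FailIfMissing=True;"))
        {
            db.Open();
            using (var cmd = new SQLiteCommand("PRAGMA schema_version;", db))
            {
                // 0 means an empty, not-yet-created database, which is still usable.
                long version = System.Convert.ToInt64(cmd.ExecuteScalar());
                return version >= 0;
            }
        }
    }
    catch (SQLiteException)
    {
        // "file is encrypted or is not a database"
        return false;
    }
}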
For anyone needing to do this in C# with System.Data.SQLite, you can start a transaction and then immediately roll it back, as follows:
private bool DatabaseIsValid(string filename)
{
    using (SQLiteConnection db = new SQLiteConnection(@"Data Source=" + filename + ";FailIfMissing=True;"))
    {
        try
        {
            db.Open();
            using (var transaction = db.BeginTransaction())
            {
                transaction.Rollback();
            }
        }
        catch (Exception ex)
        {
            log.Debug(ex.Message, ex);
            return false;
        }
    }
    return true;
}
If the file is not a valid database the following SQLiteException is thrown - file is encrypted or is not a database (System.Data.SQLite.SQLiteErrorCode.NotADb). If you aren't using encrypted databases then this solution should be sufficient.
(Only the 'db.Open()' was required for version 1.0.81.0 of System.Data.SQLite but when I upgraded to version 1.0.91.0 I had to insert the inner using block to get it to work).
I think a pragma integrity_check; could do it.
If you only want to check whether the file is a valid SQLite database, you can check it with this function:
private bool CheckIfValidSQLiteDatabase(string databaseFilePath)
{
    // A valid SQLite database starts with the 16-byte header "SQLite format 3\0".
    byte[] bytes = new byte[16];
    using (FileStream fileStream = new FileStream(databaseFilePath, FileMode.Open, FileAccess.Read))
    {
        int bytesRead = fileStream.Read(bytes, 0, 16);
        if (bytesRead < 16)
        {
            return false;   // file is too short to contain the header
        }
    }
    string header = System.Text.Encoding.ASCII.GetString(bytes);
    return header.Contains("SQLite format");
}
as stated in the documentation:
sqlite database header