Spring Boot JPA with dynamic table and column names

How do I run a SQL query in Spring Boot when the table name is dynamic and the number of columns also varies depending on the requirement?
With an entity class we have to fix the table and column names statically.
E.g.
Table FunndTransfer_Category1 has columns: id, name, amount, abc
Table FunndTransfer_Category2 has columns: id, name, amount, xyz
Here the columns abc and xyz are added at table-creation time, when the user uploads the data.
Is there an alternative approach to run such queries?

You can use something like the following:

import javax.persistence.EntityManager; // jakarta.persistence on Spring Boot 3+
import javax.persistence.PersistenceContext;
import javax.persistence.Query;
import java.util.List;

@PersistenceContext
private EntityManager entityManager;

public List<Object[]> getData() {
    String tableName = "FunndTransfer_Category1"; // resolved at runtime
    Query query = entityManager.createNativeQuery("select * from " + tableName);
    return query.getResultList(); // each row comes back as an Object[] of column values
}

Since the table name cannot be bound as a query parameter, validate it against a whitelist of known tables before concatenating, to avoid SQL injection.


Audit trail with Entity Framework Core

I have an ASP.NET Core 2.0 application using Entity Framework Core on a SQL Server database.
I have to trace and audit everything the users do with the data. My goal is an automatic mechanism that records everything that happens.
For example, if I have the table Animals, I want a parallel table "Audit_animals" where you can find all the info about the data, the operation type (add, delete, edit) and the user who made the change.
I already built this some time ago in Django + MySQL, but now the environment is different. I found this and it seems interesting, but I'd like to know if there are better ways and what the best approach is to do this in EF Core.
UPDATE
I'm trying this and something happens, but I have some problems.
I added this:
services.AddMvc().AddJsonOptions(options => {
    options.SerializerSettings.ReferenceLoopHandling = ReferenceLoopHandling.Ignore;
});
public Mydb_Context(DbContextOptions<Mydb_Context> options) : base(options)
{
    Audit.EntityFramework.Configuration.Setup()
        .ForContext<Mydb_Context>(config => config
            .IncludeEntityObjects()
            .AuditEventType("Mydb_Context:Mydb"))
        .UseOptOut();
}

public MyRepository(Mydb_Context context)
{
    _context = context;
    _context.AddAuditCustomField("UserName", "pippo");
}
I also created a table to insert the audits (only one, just to test the tool), but the only thing I got is what you see in the image: a list of .json files with the data I created... why?
Read the documentation:
Event Output
To configure the output persistence mechanism please see Configuration and Data Providers sections.
Then, in the documentation on Configuration:
If you don't specify a Data Provider, a default FileDataProvider will be used to write the events as .json files into the current working directory. (emphasis mine)
Long and short, follow the documentation to configure the data provider you'd like to use.
If you are going to map the audit table (Audit_Animals) to the same EF context as the audited Animals table, you can use the EntityFramework Data Provider included on the same Audit.EntityFramework library.
Check the documentation here:
Entity Framework Data Provider
If you plan to store the audit logs in
the same database as the audited entities, you can use the
EntityFrameworkDataProvider. Use this if you plan to store the audit
trails for each entity type in a table with similar structure.
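As a rough sketch (pieced together from the Entity Framework Data Provider docs; the Audit_ name prefix and the audit fields are assumptions chosen to match the Audit_animals example), the provider can map each audited entity to its parallel audit table:

Audit.Core.Configuration.Setup()
    .UseEntityFramework(ef => ef
        // Animals -> Audit_Animals, Orders -> Audit_Orders, and so on.
        .AuditTypeNameMapper(typeName => "Audit_" + typeName)
        .AuditEntityAction((evt, entry, auditEntity) =>
        {
            // Copy common audit info onto the audit entity (assumed columns).
            dynamic audit = auditEntity;
            audit.AuditDate = DateTime.UtcNow;
            audit.AuditAction = entry.Action; // "Insert" / "Update" / "Delete"
            audit.AuditUser = evt.CustomFields["UserName"];
        }));

With a provider configured, tracked changes land in the matching Audit_* tables instead of .json files in the working directory.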
There is another library that can audit EF contexts in a similar way, take a look: zzzprojects/EntityFramework-Plus.
Cannot recommend one over the other since they provide different features (and I'm the owner of the audit.net library).
Update:
.NET 6 and Entity Framework Core 6.0 support SQL Server temporal tables out of the box.
See this answer for examples:
https://stackoverflow.com/a/70017768/3850405
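For reference, a minimal sketch of the EF Core 6 mapping (reusing the Question entity that appears in the test code further down):

protected override void OnModelCreating(ModelBuilder modelBuilder)
{
    // EF Core 6+: marks Questions as a system-versioned temporal table; the
    // generated migration creates the period columns and history table for you.
    modelBuilder.Entity<Question>()
        .ToTable("Questions", tableBuilder => tableBuilder.IsTemporal());
}

History can then be queried with operators like _dbContext.Questions.TemporalAll() or TemporalAsOf(someUtcTime) instead of hand-written SQL.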
Original:
You could have a look at temporal tables (system-versioned temporal tables) if you are using SQL Server 2016 or later, or Azure SQL.
https://learn.microsoft.com/en-us/sql/relational-databases/tables/temporal-tables?view=sql-server-ver15
From documentation:
Database feature that brings built-in support for providing
information about data stored in the table at any point in time rather
than only the data that is correct at the current moment in time.
Temporal is a database feature that was introduced in ANSI SQL 2011.
There is currently an open issue to support this out of the box:
https://github.com/dotnet/efcore/issues/4693
There are third party options available today but since they are not from Microsoft it is of course a risk that they won't be supported in future versions.
https://github.com/Adam-Langley/efcore-temporal-query
https://github.com/findulov/EntityFrameworkCore.TemporalTables
I solved it like this:
If you use the included Visual Studio 2019 LocalDB (Microsoft SQL Server 2016, 13.1.4001.0 LocalDB) you will need to upgrade if you use cascading DELETE or UPDATE, because temporal tables with cascading actions are not supported in that version.
Complete guide for upgrading here:
https://stackoverflow.com/a/64210519/3850405
Start by adding a new empty migration. I prefer to use Package Manager Console (PMC):
Add-Migration "Temporal tables"
Should look like this:
public partial class Temporaltables : Migration
{
    protected override void Up(MigrationBuilder migrationBuilder)
    {
    }

    protected override void Down(MigrationBuilder migrationBuilder)
    {
    }
}
Then edit the migration like this:
public partial class Temporaltables : Migration
{
    List<string> tablesToUpdate = new List<string>
    {
        "Images",
        "Languages",
        "Questions",
        "Texts",
        "Medias",
    };

    protected override void Up(MigrationBuilder migrationBuilder)
    {
        migrationBuilder.Sql("CREATE SCHEMA History");
        foreach (var table in tablesToUpdate)
        {
            // Add the hidden period columns and turn on system versioning,
            // storing history rows in History.[table].
            string alterStatement = $@"ALTER TABLE [{table}] ADD SysStartTime datetime2(0) GENERATED ALWAYS AS ROW START HIDDEN
                CONSTRAINT DF_{table}_SysStart DEFAULT GETDATE(), SysEndTime datetime2(0) GENERATED ALWAYS AS ROW END HIDDEN
                CONSTRAINT DF_{table}_SysEnd DEFAULT CONVERT(datetime2 (0), '9999-12-31 23:59:59'),
                PERIOD FOR SYSTEM_TIME (SysStartTime, SysEndTime)";
            migrationBuilder.Sql(alterStatement);
            alterStatement = $@"ALTER TABLE [{table}] SET (SYSTEM_VERSIONING = ON (HISTORY_TABLE = History.[{table}]));";
            migrationBuilder.Sql(alterStatement);
        }
    }

    protected override void Down(MigrationBuilder migrationBuilder)
    {
        foreach (var table in tablesToUpdate)
        {
            // Reverse order: stop versioning, drop the period, the default
            // constraints, the period columns and finally the history table.
            string alterStatement = $@"ALTER TABLE [{table}] SET (SYSTEM_VERSIONING = OFF);";
            migrationBuilder.Sql(alterStatement);
            alterStatement = $@"ALTER TABLE [{table}] DROP PERIOD FOR SYSTEM_TIME";
            migrationBuilder.Sql(alterStatement);
            alterStatement = $@"ALTER TABLE [{table}] DROP CONSTRAINT DF_{table}_SysStart, DF_{table}_SysEnd";
            migrationBuilder.Sql(alterStatement);
            alterStatement = $@"ALTER TABLE [{table}] DROP COLUMN SysStartTime, COLUMN SysEndTime";
            migrationBuilder.Sql(alterStatement);
            alterStatement = $@"DROP TABLE History.[{table}]";
            migrationBuilder.Sql(alterStatement);
        }
        migrationBuilder.Sql("DROP SCHEMA History");
    }
}
tablesToUpdate should contain every table you need history for.
Then run the Update-Database command.
Original source, slightly modified (table names escaped with square brackets, etc.):
https://intellitect.com/updating-sql-database-use-temporal-tables-entity-framework-migration/
Testing Create, Update and Delete will then show a complete history.
[HttpGet]
public async Task<ActionResult<string>> Test()
{
    var identifier1 = "OATestar123";
    var identifier2 = "OATestar12345";

    var newQuestion = new Question()
    {
        Identifier = identifier1
    };
    _dbContext.Questions.Add(newQuestion);
    await _dbContext.SaveChangesAsync();

    var question = await _dbContext.Questions.FirstOrDefaultAsync(x => x.Identifier == identifier1);
    question.Identifier = identifier2;
    await _dbContext.SaveChangesAsync();

    question = await _dbContext.Questions.FirstOrDefaultAsync(x => x.Identifier == identifier2);
    _dbContext.Entry(question).State = EntityState.Deleted;
    await _dbContext.SaveChangesAsync();

    return Ok();
}
Tested a few times but the log will look like this:
This solution has a huge advantage, IMO: it is not Object Relational Mapper (ORM) specific, and you will even get history for changes made in plain SQL.
The history tables are also read-only by default, so there is less chance of a corrupt audit trail. If you try to modify one you get: Cannot update rows in a temporal history table.
If you need access to the data you can use your preferred ORM to fetch it or audit via SQL.
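For example (my sketch, not part of the original answer, assuming the Questions table from the migration above), the full change history can still be read through EF Core with a raw FOR SYSTEM_TIME query:

// Returns every version of the matching rows, current and historical,
// including versions of rows that have since been deleted.
var allVersions = await _dbContext.Questions
    .FromSqlRaw("SELECT * FROM Questions FOR SYSTEM_TIME ALL WHERE Identifier = {0}",
                "OATestar12345")
    .AsNoTracking()
    .ToListAsync();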

Inserting/Updating Rows to DB table where Rows result from VO [Backed by EO] based on Union Query

JDeveloper version: 11.1.1.7
I have created a Department VO based on a Department EO with the following query:
SELECT DeptEO.DEPARTMENT_ID,
       DeptEO.DEPARTMENT_NAME,
       DeptEO.MANAGER_ID,
       DeptEO.LOCATION_ID,
       DeptEO.ACTIVE
FROM DEPARTMENTS DeptEO
WHERE DeptEO.DEPARTMENT_ID > 250
UNION
SELECT 280, 'Advertising', 200, 1700, 'Y' FROM Dual
For simplicity I have used a sample statement selecting from DUAL; in the real scenario the query after the UNION clause will be populated from a table.
After running the query, I get the desired result on the UI.
Now my requirement is to insert this newly created row, with DEPARTMENT_ID 280, into the DB table DEPARTMENTS.
While committing, ADF throws the error "oracle.jbo.RowAlreadyDeletedException: JBO-29114", which makes sense: the row is missing from the DB table, so when ADF goes to take a lock on the row for update, it doesn't find anything.
Is there any way I can instruct ADF to treat this row as an insert rather than an update?
We also tried copying this row's data into a new row instance created from a RowSetIterator, then removing the culprit row by calling removeFromCollection() and inserting the duplicated row, but still no luck.
Other approaches that we are thinking of are:
1. Create another VO/EO and insert values into the table through them.
2. Create a DB view for this query with a trigger on the view, so that whenever an update operation comes in we run our logic in the trigger, i.e. decide whether to update or insert the data.
Can you please guide us on what should be done in such a scenario?
Regards,
Siddharth
Edit: code for inserting the row (what I was trying, but it's not working):
RowSetIterator rsi = iterator.getRowSetIterator();
Row editableRow = rsi.createRow();
while (rsi.hasNext()) {
    Row r = rsi.next();
    if (("" + r.getAttribute("DepartmentId")).equals("280")) {
        System.err.println("Equality row found!!!");
        editableRow.setAttribute("DepartmentId", r.getAttribute("DepartmentId"));
        editableRow.setAttribute("DepartmentName", r.getAttribute("DepartmentName"));
        editableRow.setAttribute("ManagerId", r.getAttribute("ManagerId"));
        editableRow.setAttribute("LocationId", r.getAttribute("LocationId"));
        editableRow.setAttribute("Active", r.getAttribute("Active"));
        rsi.removeCurrentRowFromCollection();
    }
}
if (editableRow != null) {
    System.err.println("Row value after removal : " + editableRow.getAttribute("DepartmentName"));
    rsi.insertRow(editableRow);
    operBindingCommit.execute();
}
Your use case can be implemented in a couple of ways. The first way is to iterate over the row set in a managed bean and check whether a department with id 280 exists; if yes, update the row, otherwise invoke Create with parameters on the department VO. The second way, and I would say the better one, is to create a method for update/insert at the business-component level, either in the ViewObjectImpl or in the ApplicationModuleImpl, and then invoke it from the managed bean.
Here is sample code for an insert/update method written in the VOImpl:
public void updateInsertJobs(String jobId, String jobTitle,
                             String minSalary, String maxSalary)
{
    RowSetIterator rSet = this.createRowSetIterator(null);
    JobsViewRowImpl row = null;
    boolean jobExist = false;
    if (null != jobId)
    {
        try
        {
            while (rSet.hasNext())
            {
                row = (JobsViewRowImpl) rSet.next();
                if (row.getJobId().equals(jobId))
                {
                    row.setJobTitle(jobTitle);
                    row.setMinSalary(new Number(minSalary));
                    row.setMaxSalary(new Number(maxSalary));
                    jobExist = true;
                }
            }
            if (!jobExist)
            {
                JobsViewRowImpl r = (JobsViewRowImpl) this.createRow();
                r.setJobId(jobId);
                r.setJobTitle(jobTitle);
                r.setMinSalary(new Number(minSalary));
                r.setMaxSalary(new Number(maxSalary));
                this.insertRow(r);
            }
            this.getDBTransaction().commit();
        }
        catch (Exception e)
        {
            e.printStackTrace();
        }
    }
}
Make sure to expose the method in the Client Interface in order to be able to access it from the data control.
Here is how to invoke the method from a managed bean:
public void insertUpdateData(ActionEvent actionEvent)
{
    BindingContainer bc =
        BindingContext.getCurrent().getCurrentBindingsEntry();
    OperationBinding oB = bc.getOperationBinding("updateInsertJobs");
    oB.getParamsMap().put("jobId", "TI_STF");
    oB.getParamsMap().put("jobTitle", "Technical Staff");
    oB.getParamsMap().put("minSalary", "5000");
    oB.getParamsMap().put("maxSalary", "18000");
    oB.execute();
}
Some references which could be helpful:
http://mahmoudoracle.blogspot.com/2012/07/adf-call-method-from-pagedefinition.html#.VMLYaf54q-0
http://adftidbits.blogspot.com//2014/11/update-vo-data-programatically-adf.html
http://www.awasthiashish.com/2012/12/insert-new-row-in-adf-viewobject.html
Your view object becomes read-only due to the custom SQL query.
However, you can still create a row in the dept table using the entity.
Create a Java implementation, including accessors, for DeptEO.
Then create a custom method in the view object and create a new entity (or update an existing one) there using the entity definition. To find out whether the required row exists, you can check whether an entity with this key already exists. Something like this (assuming deptId is your primary key):
public void createOrUpdateDept(BigInteger deptId) {
    EntityDefImpl deptDef = DeptEOImpl.getDefinitionObject();
    Key key = new Key(new Object[] { deptId });
    DeptEOImpl dept = (DeptEOImpl) deptDef.findByPrimaryKey(getDBTransaction(), key);
    if (dept == null) {
        // Creating a new entity if it doesn't exist
        dept = (DeptEOImpl) deptDef.createInstance2(getDBTransaction(), null);
        dept.setDepartmentId(deptId);
    }
    // Changing other attributes
    dept.setDepartmentName("New name");
    // Committing changes and refreshing the ViewObject if required
    getDBTransaction().commit();
    executeQuery();
}
This code is just a sample, use it as reference/idea, don't blindly copy/paste.

How to control primary key values when seeding data with Entity Framework codefirst

I am creating an ASP.NET MVC 4 site using Entity Framework 5 with Code First and SQL Server Express 2012.
I have enabled migrations and now do this in my Configuration.Seed method (note that I want to set the primary key to 8 even though this is the first record in the database):
context.ProductCategoryDtoes.AddOrUpdate(x => x.Id,
    new ProductCategoryDto() { Id = 8, Name = "category1" }
);
My Model object is defined like this:
[Table("ProductCategory")]
public class ProductCategoryDto {
public long Id { get; set; }
public string Name { get; set; }
}
This results in a table (in SQL Server Express 2012) where the Id column has Identity = true, identity seed = 1 and identity increment = 1.
Now when I run migrations with PM> Update-Database, this results in a row with Id = 1.
So my questions are:
1) How can I control the values of auto-incremented primary keys when seeding data?
2) If the solution is to increment the key column's seed value, how is this to be done when I am using Database.SetInitializer(new DropCreateDatabaseAlways<MyContext>());? This nukes and rebuilds the database every time I update it, so how would the seed value be updated in the fresh database?
Just create dummy entities with default values, then add your real data and afterwards delete the dummies. Not the best way but I guess there is no other...
Have you tried adding this on top of your Id property:
[DatabaseGenerated(DatabaseGeneratedOption.None)]
public long Id { get; set; }
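With that attribute the column is created without IDENTITY, so EF writes whatever value you assign. The original seed call then stores Id = 8 exactly as written (a sketch of the combination, not tested against the asker's project):

context.ProductCategoryDtoes.AddOrUpdate(x => x.Id,
    new ProductCategoryDto() { Id = 8, Name = "category1" } // inserted with Id 8, no identity in play
);

The trade-off is that every insert must now supply its own unique Id; the database no longer generates them.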
It seems you are trying to defeat the purpose of an identity column. If you want to do this, your only choice is to use the SQL command SET IDENTITY_INSERT to allow you to insert the value, and then run DBCC CHECKIDENT to update the seed. Not a really good idea: these options have security and performance limitations.
You may want to consider using a GUID instead. You can create GUIDs in code which are guaranteed to be unique, and you can also generate GUIDs in SQL as a column default.
With GUIDs, which are non-sequential, you will need to think through a good indexing strategy. This approach is also debatable.
Ultimately, it looks like you need a different strategy than an identity column.
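A sketch of the GUID alternative (my example, not the answerer's code): the key is generated in code, so seed data can use fixed, well-known values without any identity machinery:

[Table("ProductCategory")]
public class ProductCategoryDto
{
    // Guid key assigned in code; no database-generated values involved.
    [DatabaseGenerated(DatabaseGeneratedOption.None)]
    public Guid Id { get; set; }
    public string Name { get; set; }
}

// Seeding with a stable, hand-picked GUID (hypothetical value):
context.ProductCategoryDtoes.AddOrUpdate(x => x.Id,
    new ProductCategoryDto
    {
        Id = Guid.Parse("00000000-0000-0000-0000-000000000008"),
        Name = "category1"
    });

As the answer notes, random GUIDs are non-sequential, so consider something like NEWSEQUENTIALID() as a column default if index fragmentation matters.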
It is very hackish, but I ran into a scenario where I had to do it because some reports had hard-coded PK values. Fixing the reports was beyond my scope of work.
Context.Database.ExecuteSqlCommand(
    "SET IDENTITY_INSERT dbo.ProductCategory ON " +
    "INSERT INTO dbo.ProductCategory (Id, Name) VALUES (8, 'category1') " +
    "SET IDENTITY_INSERT dbo.ProductCategory OFF");

Correct method of deleting over 2100 rows (by ID) with Dapper

I am trying to use Dapper to support my data access for my server app.
My server app has another application that drops records into my database at a rate of 400 per minute.
My app pulls them out in batches, processes them, and then deletes them from the database.
Since data continues to flow into the database while I am processing, I don't have a good way to say delete from myTable where allProcessed = true.
However, I do know the PK values of the rows to delete, so I want to do a delete from myTable where Id in @listToDelete.
The problem is that if my server goes down for even 6 minutes, I have over 2100 rows to delete.
Since Dapper takes my @listToDelete and turns each element into a parameter, my delete call fails. (Causing my data purging to fall even further behind.)
What is the best way to deal with this in Dapper?
NOTES:
I have looked at Table-Valued Parameters but from what I can see, they are not very performant. This piece of my architecture is the bottleneck of my system and I need to be very, very fast.
One option is to create a temp table on the server and then use the bulk load facility to upload all the IDs into that table at once. Then use a join, EXISTS or IN clause to delete only the records that you uploaded into your temp table.
Bulk loads are a well-optimized path in SQL Server and it should be very fast.
For example:
Execute the statement CREATE TABLE #RowsToDelete(ID INT PRIMARY KEY)
Use a bulk load to insert keys into #RowsToDelete
Execute DELETE FROM myTable where Id IN (SELECT ID FROM #RowsToDelete)
Execute DROP TABLE #RowsToDelete (the table will also be automatically dropped if you close the session)
(Assuming Dapper) code example:
conn.Open();

var columnName = "ID";
// Create the temp table that will hold the keys to delete.
conn.Execute(string.Format("CREATE TABLE #{0}s({0} INT PRIMARY KEY)", columnName));

using (var bulkCopy = new SqlBulkCopy(conn))
{
    bulkCopy.BatchSize = ids.Count;
    bulkCopy.DestinationTableName = string.Format("#{0}s", columnName);

    var table = new DataTable();
    table.Columns.Add(columnName, typeof(int));
    bulkCopy.ColumnMappings.Add(columnName, columnName);
    foreach (var id in ids)
    {
        table.Rows.Add(id);
    }
    bulkCopy.WriteToServer(table);
}

// ...or do other things with your table instead of deleting here
conn.Execute(string.Format(@"DELETE FROM myTable WHERE Id IN
    (SELECT {0} FROM #{0}s)", columnName));
conn.Execute(string.Format("DROP TABLE #{0}s", columnName));
To get this code working, I went to the dark side.
Since Dapper turns my list into parameters, and SQL Server can't handle a large number of parameters (I have never needed even double-digit parameter counts before), I had to go with dynamic SQL.
So here was my solution:
string listOfIdsJoined = "(" + String.Join(",", listOfIds.ToArray()) + ")";
connection.Execute("delete from myTable where Id in " + listOfIdsJoined);
Before everyone grabs their torches and pitchforks, let me explain.
This code runs on a server whose only input is a data feed from a Mainframe system.
The list I am dynamically creating is a list of longs/bigints.
The longs/bigints are from an Identity column.
I know constructing dynamic SQL is bad juju, but in this case I just can't see how it leads to a security risk.
Dapper expects a list of objects with the parameter as a property, so in the above case a list of objects having Id as a property will work. Dapper then executes the statement once per element, each run carrying a single Id parameter:
connection.Execute("delete from myTable where Id in (@Id)", listOfIds.AsEnumerable().Select(i => new { Id = i }).ToList());
This will work.
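A middle ground (my sketch, not from the original answers): keep parameterized SQL but chunk the IDs below SQL Server's ~2100-parameter limit, relying on Dapper's built-in list expansion for IN clauses:

// Delete in chunks of 1000 so each statement stays well under the
// ~2100 parameter limit; Dapper expands "in @Ids" to (@Ids1, @Ids2, ...).
const int chunkSize = 1000;
for (var i = 0; i < listOfIds.Count; i += chunkSize)
{
    var chunk = listOfIds.Skip(i).Take(chunkSize).ToList();
    connection.Execute("delete from myTable where Id in @Ids", new { Ids = chunk });
}

Each chunk is one round trip, so this is slower than the bulk-copy approach above, but it needs no temp table and no dynamic SQL.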

Obtain 'Identity' setting for a column in VistaDB

I am reading the database schema for a VistaDB 4.0 database using the standard ADO.NET DbConnection.GetSchema API. I haven't found a way to obtain the 'Identity' setting for a column. The 'Columns' schema collection doesn't seem to have a column for this, and I am not aware of any other collection that I should look into.
If it is not possible by querying any of the available collections, do I have to query some system table or view?
Any help would be appreciated.
There are no "sys" tables in VistaDB. There is a [database schema] table that contains most of what you need though.
[database schema]
You can get the identity columns for a database using the database schema table like this:
select * from [database schema] where typeid = 6
Look in the help file for the typeid list and what they mean.
Then once you have the list, you can match it up to the typeid for tables to see what table the identity column came from.
The only catch with the [database schema] table is that you cannot self-reference or join it to itself (a design limitation). So if you need to reference it against itself, you have to do it in two commands or through a temp table. The help file has an example of how to do this as well.
Alternate Way
You can also find all the identity columns using a VistaDB stored proc:
select * from VistaDBColumnSchema() where is_identity = true
DDA
If you need to find the next value, seed, etc., you can also get those through DDA (Direct Data Access) methods.
The Identities property on an IVistaDBTableSchema object is a collection of the identities for that table. That collection can then be walked to pull the individual values.
The identity information included is the Seed, Step, Tablename, and Columnname.
ADO.NET GetSchemaTable Way
And yes, there is still another way. You can call GetSchemaTable on a reader to get some more information about the underlying structure.
using (VistaDBConnection cn = new VistaDBConnection("Data Source=" + dbName))
{
    cn.Open();
    using (VistaDBCommand cmd = new VistaDBCommand("Select * from simpletable", cn))
    {
        using (VistaDBDataReader myReader = cmd.ExecuteReader(CommandBehavior.KeyInfo))
        {
            // Retrieve column schema into a DataTable.
            DataTable schemaTable = myReader.GetSchemaTable();
            foreach (DataRow myField in schemaTable.Rows)
            {
                foreach (DataColumn myProperty in schemaTable.Columns)
                {
                    System.Diagnostics.Debug.WriteLine(myProperty.ColumnName + " = " + myField[myProperty].ToString());
                }
            }
        }
    }
}
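For the identity question specifically, the standard ADO.NET IsAutoIncrement column of the schema table is the one to inspect. A sketch, assuming the VistaDB provider populates it the way other ADO.NET providers do:

// IsAutoIncrement is the standard ADO.NET schema-table flag for identity columns.
foreach (DataRow myField in schemaTable.Rows)
{
    if (myField["IsAutoIncrement"] != DBNull.Value && (bool)myField["IsAutoIncrement"])
    {
        System.Diagnostics.Debug.WriteLine(myField["ColumnName"] + " is an identity column");
    }
}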
