In my application I need to get the database date (SYSDATE in the case of an Oracle DB) and compare it with a user-input date (a String converted to java.util.Date). From this forum I got the following code, which works for the Oracle dialect:
public Date getDate() {
    Session session = getHibernateTemplate().getSessionFactory().openSession();
    SQLQuery query = session.createSQLQuery("select sysdate as mydate from dual");
    query.addScalar("mydate", Hibernate.TIMESTAMP);
    return (Date) query.uniqueResult();
}
And from this link I got the following approach, which uses a mapping file with a formula:
<property name="currentDate" formula="(select sysdate from dual)"/>
Again, this is specific to Oracle. I think the latter method is more performance friendly, because we can get the date from the same session, i.e. there is no need to open another session just to get the date.
I am looking for a generic solution to get the date, time and timestamp from any DBMS using Hibernate. Using HQL would be preferred. I hope such a solution is available.
For those who are looking for a .NET/C# solution, here is what worked for me:
// this works only with Oracle
public DateTime DbTimeStamp(ISession session)
{
    // Sample returned value = "12-OCT-11 01.05.54.365134000 AM -07:00"
    string sql = "SELECT SYSTIMESTAMP FROM DUAL";
    ISQLQuery query = session.CreateSQLQuery(sql)
        .AddScalar("SYSTIMESTAMP", NHibernate.NHibernateUtil.DateTime);
    return query.UniqueResult<DateTime>();
}
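The query above is still Oracle-specific because of the hard-coded SELECT ... FROM DUAL. As a rough sketch of a more dialect-neutral variant (my addition, assuming your NHibernate version exposes Dialect.CurrentTimestampSelectString; verify the member name against the version you use), you could ask the configured dialect for its own "current timestamp" statement:
// Sketch only: let the dialect supply the DBMS-specific "current timestamp" SQL
// instead of hard-coding Oracle's SYSTIMESTAMP/DUAL. CurrentTimestampSelectString
// is assumed to be available on the configured dialect; the raw value returned by
// the driver may still need provider-specific conversion.
public DateTime DbTimeStamp(ISessionFactory sessionFactory, ISession session)
{
    var factoryImpl = (NHibernate.Engine.ISessionFactoryImplementor)sessionFactory;
    string sql = factoryImpl.Dialect.CurrentTimestampSelectString;

    object raw = session.CreateSQLQuery(sql).UniqueResult();
    return Convert.ToDateTime(raw);
}
On the Java side, org.hibernate.dialect.Dialect exposes the analogous getCurrentTimestampSelectString(), so the same idea applies to the original Hibernate question.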
In my app I'm using the node mssql module to insert a datetime into a SQL Server database. The problem is that the datetime stored in the database is always changed: the time is one hour less than the one I input.
async function insertDate(date, logIO) {
    try {
        var d = new Date(date)
        var name = 'zdzmar'
        const pool = await sql.connect(config)
        let result = await pool.request()
            .input('date', TYPES.DateTime, d)
            .input('name', TYPES.VarChar, name)
            .input('logIO', TYPES.TinyInt, logIO)
            .query(`insert clock(date, name, logIO) values(@date, @name, @logIO)`)
    } finally {
        sql.close();
    }
}
Where is the problem?
It seems like a bug in the mssql package. I ran into the same problem: even when I formatted the datetime into a string, the date still got converted to a different datetime. In my case 12:04 PM got converted to 20:04, which is not directly related to a timezone or UTC offset.
A workaround for this problem is to use sql.VarChar instead of sql.DateTime.
I have an ASP.NET Core 2.0 application using Entity Framework Core on a SQL Server database.
I have to trace and audit everything the users do to the data. My goal is to have an automatic mechanism that records everything that happens.
For example, if I have the table Animals, I want a parallel table "Audit_animals" where you can find all the info about the data, the operation type (add, delete, edit) and the user who made the change.
I already built this a while ago in Django + MySQL, but now the environment is different. I found this and it seems interesting, but I'd like to know if there are better ways, and which is the best approach to do this in EF Core.
UPDATE
I'm trying this and something is happening, but I still have some problems.
I added this:
services.AddMvc().AddJsonOptions(options => {
    options.SerializerSettings.ReferenceLoopHandling = ReferenceLoopHandling.Ignore;
});
public Mydb_Context(DbContextOptions<Mydb_Context> options) : base(options)
{
    Audit.EntityFramework.Configuration.Setup()
        .ForContext<Mydb_Context>(config => config
            .IncludeEntityObjects()
            .AuditEventType("Mydb_Context:Mydb"))
        .UseOptOut();
}
public MyRepository(Mydb_Context context)
{
    _context = context;
    _context.AddAuditCustomField("UserName", "pippo");
}
I also created a table to store the audits (only one, to test this tool), but the only thing I get is a list of .json files with the data I created... why??
Read the documentation:
Event Output
To configure the output persistence mechanism please see Configuration and Data Providers sections.
Then, in the documentation on Configuration:
If you don't specify a Data Provider, a default FileDataProvider will be used to write the events as .json files into the current working directory. (emphasis mine)
Long and short, follow the documentation to configure the data provider you'd like to use.
If you are going to map the audit table (Audit_Animals) to the same EF context as the audited Animals table, you can use the EntityFramework Data Provider included in the same Audit.EntityFramework library.
Check the documentation here:
Entity Framework Data Provider
If you plan to store the audit logs in the same database as the audited entities, you can use the EntityFrameworkDataProvider. Use this if you plan to store the audit trails for each entity type in a table with similar structure.
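For what it's worth, the wiring looks roughly like this (a minimal sketch based on my reading of the Audit.EntityFramework docs; Audit_Animals and its property names are placeholders for whatever your audit entity actually contains):
// Sketch: persist audit events through the same EF context with the
// EntityFramework Data Provider. Audit_Animals and its properties are
// hypothetical; map them to your own audit entity.
Audit.Core.Configuration.Setup()
    .UseEntityFramework(ef => ef
        .AuditTypeMapper(entityType => typeof(Audit_Animals))   // e.g. Animals -> Audit_Animals
        .AuditEntityAction<Audit_Animals>((evt, entry, audit) =>
        {
            audit.Operation = entry.Action;                     // Insert / Update / Delete
            audit.AuditData = entry.ToJson();                   // changed columns and values
            audit.AuditDate = DateTime.UtcNow;
            audit.UserName = evt.CustomFields.ContainsKey("UserName")
                ? evt.CustomFields["UserName"]?.ToString()
                : null;                                         // set via AddAuditCustomField above
        }));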
There is another library that can audit EF contexts in a similar way; take a look: zzzprojects/EntityFramework-Plus.
I cannot recommend one over the other since they provide different features (and I'm the owner of the Audit.NET library).
Update:
.NET 6 and Entity Framework Core 6.0 support SQL Server temporal tables out of the box.
See this answer for examples:
https://stackoverflow.com/a/70017768/3850405
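In short, the EF Core 6 mapping looks roughly like this (a minimal sketch; the Question entity matches the one used later in this answer, see the linked answer for a complete example):
// Minimal sketch of the built-in EF Core 6+ support: mark the table as temporal
// in OnModelCreating and EF creates the period columns and history table for you.
protected override void OnModelCreating(ModelBuilder modelBuilder)
{
    modelBuilder.Entity<Question>()
        .ToTable("Questions", tb => tb.IsTemporal());
}

// Querying history then uses the SQL Server provider extensions, e.g.:
// var allVersions = await _dbContext.Questions.TemporalAll().ToListAsync();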
Original:
You could have a look at temporal tables (system-versioned temporal tables) if you are using SQL Server 2016 or later, or Azure SQL.
https://learn.microsoft.com/en-us/sql/relational-databases/tables/temporal-tables?view=sql-server-ver15
From documentation:
Database feature that brings built-in support for providing information about data stored in the table at any point in time rather than only the data that is correct at the current moment in time. Temporal is a database feature that was introduced in ANSI SQL 2011.
There is currently an open issue to support this out of the box:
https://github.com/dotnet/efcore/issues/4693
There are third-party options available today, but since they are not from Microsoft there is of course a risk that they won't be supported in future versions.
https://github.com/Adam-Langley/efcore-temporal-query
https://github.com/findulov/EntityFrameworkCore.TemporalTables
I solved it like this:
If you use the included Visual Studio 2019 LocalDB (Microsoft SQL Server 2016 (13.1.4001.0) LocalDB), you will need to upgrade if you use cascading DELETE or UPDATE, because temporal tables with cascading actions are not supported in that version.
Complete guide for upgrading here:
https://stackoverflow.com/a/64210519/3850405
Start by adding a new empty migration. I prefer to use Package Manager Console (PMC):
Add-Migration "Temporal tables"
Should look like this:
public partial class Temporaltables : Migration
{
    protected override void Up(MigrationBuilder migrationBuilder)
    {
    }

    protected override void Down(MigrationBuilder migrationBuilder)
    {
    }
}
Then edit the migration like this:
public partial class Temporaltables : Migration
{
    List<string> tablesToUpdate = new List<string>
    {
        "Images",
        "Languages",
        "Questions",
        "Texts",
        "Medias",
    };

    protected override void Up(MigrationBuilder migrationBuilder)
    {
        migrationBuilder.Sql($"CREATE SCHEMA History");
        foreach (var table in tablesToUpdate)
        {
            string alterStatement = $@"ALTER TABLE [{table}] ADD SysStartTime datetime2(0) GENERATED ALWAYS AS ROW START HIDDEN
                CONSTRAINT DF_{table}_SysStart DEFAULT GETDATE(), SysEndTime datetime2(0) GENERATED ALWAYS AS ROW END HIDDEN
                CONSTRAINT DF_{table}_SysEnd DEFAULT CONVERT(datetime2 (0), '9999-12-31 23:59:59'),
                PERIOD FOR SYSTEM_TIME (SysStartTime, SysEndTime)";
            migrationBuilder.Sql(alterStatement);
            alterStatement = $@"ALTER TABLE [{table}] SET (SYSTEM_VERSIONING = ON (HISTORY_TABLE = History.[{table}]));";
            migrationBuilder.Sql(alterStatement);
        }
    }

    protected override void Down(MigrationBuilder migrationBuilder)
    {
        foreach (var table in tablesToUpdate)
        {
            string alterStatement = $@"ALTER TABLE [{table}] SET (SYSTEM_VERSIONING = OFF);";
            migrationBuilder.Sql(alterStatement);
            alterStatement = $@"ALTER TABLE [{table}] DROP PERIOD FOR SYSTEM_TIME";
            migrationBuilder.Sql(alterStatement);
            alterStatement = $@"ALTER TABLE [{table}] DROP DF_{table}_SysStart, DF_{table}_SysEnd";
            migrationBuilder.Sql(alterStatement);
            alterStatement = $@"ALTER TABLE [{table}] DROP COLUMN SysStartTime, COLUMN SysEndTime";
            migrationBuilder.Sql(alterStatement);
            alterStatement = $@"DROP TABLE History.[{table}]";
            migrationBuilder.Sql(alterStatement);
        }
        migrationBuilder.Sql($"DROP SCHEMA History");
    }
}
tablesToUpdate should contain every table you need history for.
Then run the Update-Database command.
Original source, slightly modified to escape table names with square brackets etc.:
https://intellitect.com/updating-sql-database-use-temporal-tables-entity-framework-migration/
Testing Create, Update and Delete will then show a complete history.
[HttpGet]
public async Task<ActionResult<string>> Test()
{
    var identifier1 = "OATestar123";
    var identifier2 = "OATestar12345";

    var newQuestion = new Question()
    {
        Identifier = identifier1
    };
    _dbContext.Questions.Add(newQuestion);
    await _dbContext.SaveChangesAsync();

    var question = await _dbContext.Questions.FirstOrDefaultAsync(x => x.Identifier == identifier1);
    question.Identifier = identifier2;
    await _dbContext.SaveChangesAsync();

    question = await _dbContext.Questions.FirstOrDefaultAsync(x => x.Identifier == identifier2);
    _dbContext.Entry(question).State = EntityState.Deleted;
    await _dbContext.SaveChangesAsync();

    return Ok();
}
Tested it a few times; every create, update and delete showed up in the history tables.
In my opinion this solution has a huge advantage: it is not Object Relational Mapper (ORM) specific, and you will even get history if you write plain SQL.
The history tables are also read-only by default, so there is less chance of a corrupt audit trail. If you try to modify one, you get the error: Cannot update rows in a temporal history table ''
If you need access to the data you can use your preferred ORM to fetch it, or query the history directly via SQL.
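For example, here is a hedged sketch of reading the full history of a row through EF Core raw SQL (assuming the Questions entity from the test above and an EF Core version with FromSqlRaw):
// Sketch: fetch every historical version of a row. SELECT * still matches the
// Question entity because the SysStartTime/SysEndTime period columns are HIDDEN.
// AsNoTracking avoids duplicate-key issues when several versions share the same Id.
var history = await _dbContext.Questions
    .FromSqlRaw("SELECT * FROM [Questions] FOR SYSTEM_TIME ALL WHERE Identifier = {0}", "OATestar12345")
    .AsNoTracking()
    .ToListAsync();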
I've been searching but haven't found my answer so forgive me if this question is a duplicate.
I've got a .NET C# application that uses Entity Framework (EF) to communicate with a SQL Server database. I'm converting a large amount of data and I need to make sure my dates are valid SQL Server datetime values. My POCO classes use a datetime2 type for the dates, so a date like '0201-04-11 13:00:00 PM' is valid until the insert is actually attempted in the SQL Server database. I was attempting to use DateTime.TryParseExact with something like this:
if (DateTime.TryParseExact(legacyRecord.date_paid.ToString(), "M/d/yyyy hh:mm:ss tt", new CultureInfo("en-us"), DateTimeStyles.None, out datePaid))
{
    // Load record into lease payment table
    LoadLeasePayment loadLeasePayment = new LoadLeasePayment();
    Decimal LeasePaymentId = loadLeasePayment.AddRecord(prodLeaseId, legacyRecord.amount_paid, datePaid, prodContext, loadDate);
}
I'm sure the solution is obvious but I cannot see the forest for the trees. Any help is much appreciated.
After parsing the string DateTime value, you'll need to verify it is within the range of the target SQL data type. The SqlDateTime structure includes static MinValue and MaxValue fields to facilitate this.
if (DateTime.TryParseExact(legacyRecord.date_paid.ToString(), "M/d/yyyy hh:mm:ss tt", new CultureInfo("en-us"), DateTimeStyles.None, out datePaid))
{
    if ((datePaid >= SqlDateTime.MinValue) && (datePaid <= SqlDateTime.MaxValue))
    {
        // Load record into lease payment table
        LoadLeasePayment loadLeasePayment = new LoadLeasePayment();
        Decimal LeasePaymentId = loadLeasePayment.AddRecord(prodLeaseId, legacyRecord.amount_paid, datePaid, prodContext, loadDate);
    }
}
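A side note of mine, not from the original answer: SqlDateTime mirrors SQL Server's datetime range, so this check rejects anything before the year 1753. If the target column is datetime2, any .NET DateTime fits, because datetime2 covers the years 0001-9999.
using System;
using System.Data.SqlTypes;

// The bounds used in the check above correspond to the SQL Server datetime type:
Console.WriteLine(SqlDateTime.MinValue.Value); // 1753-01-01 00:00:00
Console.WriteLine(SqlDateTime.MaxValue.Value); // 9999-12-31 23:59:59.997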
I am retrieving data from SQL Server from a stored procedure using Dapper and I'm getting the error
Specified cast is not valid.
and details:
Error parsing column 4 (SubTotal=0.00 - Decimal)
On the SQL Server side the column SubTotal is decimal(18, 2) NULLABLE, and on the .NET side it's decimal?. The data being retrieved is 0.00.
I checked this answer: Dapper,decimal to double? Error parsing column X
As per that answer, I replaced
il.Emit(OpCodes.Ldtoken, unboxType);
with
il.Emit(OpCodes.Ldtoken, Nullable.GetUnderlyingType(unboxType) ?? unboxType);
on line 2360, but I'm still getting the same error.
Does anyone have any ideas about this? Thanks.
Update:
I tried making the column non-nullable. I also tried changing the column to float (on SQL Server) and double (on the .NET side). None of these worked and I kept getting the same error. Then I changed the column to int and the code now works fine. However, I'm working with monetary values and would like to use floating point numbers. I will investigate further...
I'm executing a stored procedure as follows
var transaction = this.db.Query<PaymentTransactions>("usp_PaymentTransactionsGetSingleIfPaid", new { registrationId }, commandType: CommandType.StoredProcedure);
The relevant part of the stored procedure that returns information is below.
SELECT * FROM PaymentTransactions WHERE RegistrationId = @registrationId AND TransactionStatus = 'SUCCESS';
UPDATE 2:
Dapper is working fine. Maybe there was something wrong with my dev environment; all it took was a VS restart.
Don't laugh, but I had this exact same problem with Dapper in an ASP.NET MVC project, and the solution from @erdinger's comment also worked for me:
Close Visual Studio
Start Visual Studio again
The problem was fixed this way...
It seems like this is not Dapper-specific; I just verified that the repro snippet further down works as expected.
Try enumerating your column names explicitly (instead of select *) so that the procedure returns exactly what should be mapped to PaymentTransactions. It's possible there is another non-decimal column that is misnamed.
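For example (a sketch of that suggestion, reusing only the columns visible in this question; your real table presumably has more):
// Sketch: select the mapped columns explicitly, either inside the stored
// procedure or as an ad-hoc query, so a stray or misnamed column cannot
// break Dapper's mapping to PaymentTransactions.
var transaction = this.db.Query<PaymentTransactions>(
    @"SELECT RegistrationId, TransactionStatus, SubTotal
      FROM PaymentTransactions
      WHERE RegistrationId = @registrationId AND TransactionStatus = 'SUCCESS';",
    new { registrationId });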
This is using Dapper v1.13 on .Net45:
Procedure:
create procedure dbo.Test
as
select [SubTotal] = cast('0.01' as decimal(18,2))
union all
select null;
Linqpad:
void Main()
{
    using (IDbConnection cnn = GetOpenConnection())
    {
        var users = cnn.Query<Sale>("yak.dbo.test", new { }, commandType: CommandType.StoredProcedure);
        users.Dump();
    }
}

public static readonly string connectionString = "Data Source=.;Initial Catalog=tempdb;Integrated Security=True";

public static IDbConnection GetOpenConnection()
{
    var connection = new SqlConnection(connectionString);
    connection.Open();
    return connection;
}

public class Sale
{
    public decimal? SubTotal;
}
Returns: both rows (0.01 and NULL) are mapped to Sale without an error.
I am using Hibernate to access my database. I would like to delete a set of rows depending on a criterion. My database is PostgreSQL and my Java code is:
public void deleteAttr(String parameter){
    Configuration cfg = new Configuration();
    cfg.configure(resource.getString("hibernate_config_file"));
    SessionFactory sessionFactory = cfg.buildSessionFactory();
    session = sessionFactory.openSession();
    Transaction tx = session.beginTransaction();
    tx.begin();
    String sql = "delete from attribute where timestamp > to_date('"+parameter+"','YYYY-MM-DD')";
    session.createSQLQuery(sql);
    tx.commit();
}
The method runs, but it doesn't delete any data from the database. I have also checked the SQL statement in pgAdmin and it works there, but not in the code. Why? Can someone help me?
Thanks in advance!
It's because you're creating a query, but you don't execute it:
String sql = "delete from attribute where timestamp > to_date('"+parameter+"','YYYY-MM-DD')";
Query query = session.createSQLQuery(sql);
query.executeUpdate();
You should really use bound named parameters rather than string concatenation to pass parameters into your query: it's usually more efficient, it's much more robust, and above all, it doesn't open the door to SQL injection attacks.
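A quick sketch of that last point, shown in C#/NHibernate to match the .NET examples earlier in this document (the Java Hibernate calls are analogous: createSQLQuery, setParameter, executeUpdate); cutoffDate is a hypothetical DateTime parsed from the user input:
// Sketch: bind the date as a named parameter instead of concatenating strings.
// The driver handles the conversion, and SQL injection is no longer possible.
int deleted = session.CreateSQLQuery("delete from attribute where timestamp > :cutoff")
    .SetParameter("cutoff", cutoffDate)
    .ExecuteUpdate();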