I've got a problem with flutter recognizing my DateTime as UTC time. Via http request, I am storing some data inside my SQL Server database which contains a UTC DateTime. This is done via Entity Framework.
var workingTime = new WorkingTime()
{
StartDateTime = dto.StartDateTime.ToUniversalTime(),
EndDateTime = dto.EndDateTime.ToUniversalTime(),
...
};
await _repository.AddAsync(workingTime);
This looks quite okay: the request was made at 2020-06-13 01-59-41:690 in Germany, so the stored value is the correct UTC time.
Now I am loading this data in my Flutter app, and when I debug the app, the loaded DateTime claims it is not a UTC time.
I am not sure whether I am storing the data wrong or whether the parsing inside Flutter is wrong, but I can't see what I'm doing wrong here.
Please tell me if you need code or more information.
Edit
So after a lot of testing, I found out something:
Debug.WriteLine(model.EndDateTime.Kind);
This prints "Unspecified". It seems like something is wrong either when storing the DateTime or when reading it back.
Be explicit about UTC every time: tell Dart explicitly that you want UTC.
To get millisecondsSinceEpoch in UTC:
DateTime dateTime = DateTime.now().toUtc();
int epochTime = dateTime.millisecondsSinceEpoch;
To get a DateTime in UTC:
DateTime dt = DateTime.fromMillisecondsSinceEpoch(millisecondsSinceEpoch, isUtc: true);
Store the time as an int in the database: milliseconds since the epoch.
DateTime dateTime = DateTime.now().toUtc();
int epochTime = dateTime.millisecondsSinceEpoch;
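The epoch-millis round-trip is language-agnostic; as a hedged illustration, here is the same idea sketched in plain Java (class and variable names are my own):

```java
import java.time.Instant;
import java.time.OffsetDateTime;
import java.time.ZoneOffset;

public class EpochRoundTrip {
    public static void main(String[] args) {
        // A fixed UTC wall-clock time stands in for "now".
        OffsetDateTime utcTime = OffsetDateTime.of(2020, 6, 13, 1, 59, 41, 0, ZoneOffset.UTC);
        // Store this long in the database instead of a DateTime column.
        long epochMillis = utcTime.toInstant().toEpochMilli();
        // Reading it back through Instant yields UTC regardless of the JVM's default zone.
        OffsetDateTime restored = Instant.ofEpochMilli(epochMillis).atOffset(ZoneOffset.UTC);
        System.out.println(restored);
    }
}
```

Because epoch millis denote an absolute instant, no "Kind" or zone metadata can be lost between writer and reader.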
After a lot of thinking I came up with a good approach for my case. I ended up implementing an extension method that sets the Kind of every DateTime loaded from the database to DateTimeKind.Utc.
Inside the DataContext:
protected override void OnModelCreating(ModelBuilder modelBuilder)
{
// Convert all DateTimes to UTC
modelBuilder.TreatDateTimeAsUtc();
}
The extension method
public static class ModelBuilderExtensions
{
public static void TreatDateTimeAsUtc(this ModelBuilder modelBuilder)
{
var dateTimeConverter = new ValueConverter<DateTime, DateTime>(
v => v.ToUniversalTime(),
v => DateTime.SpecifyKind(v, DateTimeKind.Utc));
var nullableDateTimeConverter = new ValueConverter<DateTime?, DateTime?>(
v => v.HasValue ? v.Value.ToUniversalTime() : v,
v => v.HasValue ? DateTime.SpecifyKind(v.Value, DateTimeKind.Utc) : v);
foreach (var entityType in modelBuilder.Model.GetEntityTypes())
{
foreach (var property in entityType.GetProperties())
{
if (property.ClrType == typeof(DateTime))
{
property.SetValueConverter(dateTimeConverter);
}
else if (property.ClrType == typeof(DateTime?))
{
property.SetValueConverter(nullableDateTimeConverter);
}
}
}
}
}
Related
I have a question.
Based on the timestamp in the class, I would like to build logic that excludes records from a user that arrive N or more times within 1 minute.
The UserData class has a timestamp field.
class UserData{
public Timestamp timestamp;
public String userId;
}
At first I tried to use a tumbling window.
SingleOutputStreamOperator<UserData> validStream =
stream.keyBy((KeySelector<UserData, String>) value -> value.userId)
.window(TumblingProcessingTimeWindows.of(Time.seconds(60)))
.process(new ValidProcessWindow());
public class ValidProcessWindow extends ProcessWindowFunction<UserData, UserData, String, TimeWindow> {
private int validCount = 10;
@Override
public void process(String key, Context context, Iterable<UserData> elements, Collector<UserData> out) throws Exception {
int count = -1;
for (UserData element : elements) {
count++; // start is 0
if (count >= validCount) // valid click count
{
continue;
}
out.collect(element);
}
}
}
However, the tumbling processing-time window is based on wall-clock time, so it does not take the timestamp of the UserData class into account.
How can I window the stream based on the UserData class's timestamp?
Thanks.
Additional Information
I use code like this.
stream.assignTimestampsAndWatermarks(WatermarkStrategy.<UserData>forBoundedOutOfOrderness(Duration.ofSeconds(1))
        .withTimestampAssigner((event, timestamp) -> Timestamps.toMillis(event.timestamp)))
    .keyBy((KeySelector<UserData, String>) value -> value.userId)
    .window(TumblingEventTimeWindows.of(Time.seconds(60)))
    .process(new ValidProcessWindow());
I ran a test.
150 sample records, with the timestamp of each record increasing by 1 second.
The result is |1,2,3....59| |60,61....119| .
I waited for the last 30 records, but they were never processed.
I expected |1,2,3....59| |60,61....119| |120...149|.
How can I get the remaining records?
Self Answer
I found the cause: I used only 150 sample records.
When using event time, Flink cannot make progress if there are no further elements to be processed.
https://ci.apache.org/projects/flink/flink-docs-release-1.9/dev/event_time.html#idling-sources
So I tested with the 150 sample records plus dummy data (the dummy records' timestamps also increased by 1 second each).
I then received the expected output: |1,2,3....59| |60,61....119| |120...149|.
Thank you for help.
As far as I understand your problem, you should just use a different time characteristic. Processing time uses the system time to calculate windows; you should use event time for your application. You can find more info about the proper usage of event time here.
EDIT:
That's how Flink works: there is no data to push the watermark past 150, so the window is not closed and thus there is no output. You can use a custom trigger that closes the window even if the watermark is not generated, or inject some data to move the watermark forward.
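The watermark behaviour can be illustrated without Flink at all. This plain-Java sketch (my own simplified model, not Flink's implementation) buckets event time into 60-second tumbling windows and only fires a window once the watermark reaches its end:

```java
import java.util.ArrayList;
import java.util.List;

public class WatermarkSketch {
    public static void main(String[] args) {
        long windowSize = 60;          // tumbling window of 60 seconds
        long outOfOrderness = 1;       // mirrors forBoundedOutOfOrderness(1s)
        long maxTimestampSeen = 150;   // timestamps 1..150, one per second
        long watermark = maxTimestampSeen - outOfOrderness; // 149

        // Simplified rule: a window [start, start + 60) fires once the
        // watermark reaches its end (Flink's actual rule is end - 1).
        List<String> fired = new ArrayList<>();
        for (long start = 0; start < 180; start += windowSize) {
            if (watermark >= start + windowSize) {
                fired.add("[" + start + "," + (start + windowSize) + ")");
            }
        }
        // Only the first two windows fire; [120,180) waits forever,
        // because nothing ever pushes the watermark past 149.
        System.out.println(fired);
    }
}
```

This is exactly why the last 30 records never appear until more data (or an idleness/trigger workaround) advances the watermark.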
I use JsonRowSerializationSchema to serialize Flink's Row into JSON, but the SQL timestamp serialization has timezone issues.
val row = new Row(1)
row.setField(0, new Timestamp(0))
val tableSchema = TableSchema
.builder
.field("c", DataTypes.TIMESTAMP(3).bridgedTo(classOf[Timestamp]))
.build
val serializer = JsonRowSerializationSchema.builder()
.withTypeInfo(tableSchema.toRowType)
.build()
println(new String(serializer.serialize(row)))
{"c":"1969-12-31T16:00:00Z"}
I see it uses PST (the local time zone) to interpret the timestamp, but then outputs UTC (note the Z in the output).
If I do TimeZone.setDefault(TimeZone.getTimeZone("UTC")), then it prints {"c":"1970-01-01T00:00:00Z"}. My timestamps are created for UTC time, and I want Flink to interpret them as UTC.
I am checking the Flink implementation, following two methods are in action.
private JsonNode convertLocalDateTime(ObjectMapper mapper, JsonNode reuse, Object object) {
return mapper.getNodeFactory()
.textNode(RFC3339_TIMESTAMP_FORMAT.format((LocalDateTime) object));
}
private JsonNode convertTimestamp(ObjectMapper mapper, JsonNode reuse, Object object) {
Timestamp timestamp = (Timestamp) object;
return convertLocalDateTime(mapper, reuse, timestamp.toLocalDateTime());
}
It looks like the implementation is hardcoded; is there any way to tell Flink to use UTC without changing the system time zone?
The java.sql.Timestamp is very problematic because it depends on a time zone. This is why we replaced it with the new java.time.* classes in the new Table/SQL type system.
For the outdated implementation, we recommend that all Flink JVMs are configured in the UTC time zone.
For Table/SQL, we use the new org.apache.flink.formats.json.JsonRowDataSerializationSchema, but this works on internal data structures. I would recommend just copying the source code of JsonRowSerializationSchema and implementing the format as you need it, or using the Jackson library directly, which would avoid dealing with TypeInformation at all.
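If you do copy the schema, the time-zone-independent part of the conversion is short. A stdlib-only sketch (not Flink code; the helper name is my own) that formats a java.sql.Timestamp as UTC regardless of the JVM's default zone:

```java
import java.sql.Timestamp;
import java.time.ZoneOffset;
import java.time.format.DateTimeFormatter;

public class UtcTimestampFormat {
    // Go through Instant (an absolute point in time) instead of
    // toLocalDateTime(), which applies the JVM's default zone.
    static String toUtcString(Timestamp ts) {
        return ts.toInstant()
                 .atOffset(ZoneOffset.UTC)
                 .format(DateTimeFormatter.ofPattern("yyyy-MM-dd'T'HH:mm:ss'Z'"));
    }

    public static void main(String[] args) {
        System.out.println(toUtcString(new Timestamp(0))); // 1970-01-01T00:00:00Z
    }
}
```

Swapping timestamp.toLocalDateTime() for a conversion like this is the key change when adapting the copied convertTimestamp method.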
I'm new to using EF to handle data in SQL. In an MVC Core project we're testing EF (Microsoft.EntityFrameworkCore, version 2.2.3) to handle data.
When an update failed for some reason (missing fields etc.), it seemed like EF actually deleted the record from the database (MSSQL 2014) instead of throwing an update error...
Is that possible?
Code for updating:
public void Update(Contact contact)
{
_dbContext.Update(contact);
_dbContext.SaveChanges();
}
When an update failed for some reason (missing fields etc.), it seemed like EF actually deleted the record from the database (MSSQL 2014) instead of throwing an update error...
Is that possible?
It should not.
To test it, try to debug here:
_dbContext.Update(contact);
_dbContext.SaveChanges();
var updated = _dbContext.Contacts.FirstOrDefault(x => x.Id == contact.Id); //debug here
Check if it has a value. If there is still none, these are the scenarios I can think of that may have caused your problem:
Investigate the missing field, especially if it is not nullable.
Is the _dbContext used here using the same connection string as everywhere else?
Is the [Key] attribute present on your Contact entity?
public class Contact
{
[Key]
public int Id { get; set; }
}
Have you overridden the SaveChanges function?
Does the Contact you are passing contain a Key, and is it non-zero?
Is a delete function called after Update?
Try using SQL Profiler to check whether EF really generated an UPDATE query and whether it points at the right [Key].
If it is still not working properly, you could do:
public void Update(Contact contact)
{
var selectedContactToBeUpdated = _dbContext.Contacts.FirstOrDefault(x => x.Id == contact.Id);
if (selectedContactToBeUpdated != null)
{
selectedContactToBeUpdated.PropertyToBeUpdated1 = newValue;
selectedContactToBeUpdated.PropertyToBeUpdated2 = newValue2;
//additional Properties
_dbContext.SaveChanges();
}
}
In the scenario above, it will only generate an UPDATE statement containing the fields you have changed.
I have developed a simple application with Spring 4.2.5 + Hibernate 5.1.0; the database system is MS SQL Server 2014.
For a few days I have been struggling with correctly storing time + timezone in the database.
The requirements I need to fulfill are:
Save all dates in the UTC time zone.
Store the timezone in a database column value.
To achieve it I created model called MyComment:
@Entity
@Table(name = "MY_COMMENT")
@EntityListeners(value = { MyCommentListener.class })
@Audited
public class MyComment implements Serializable {
@Id
@GeneratedValue(strategy = GenerationType.IDENTITY)
@Column(name = "ID")
private Long id;
@Column(name = "DATE_", nullable = false)
private Timestamp date;
...
}
To enforce saving dates in the UTC time zone I used the Jadira framework:
hibProperties.put("jadira.usertype.autoRegisterUserTypes", true);
hibProperties.put("jadira.usertype.javaZone", "UTC");
hibProperties.put("jadira.usertype.databaseZone", "UTC");
However, during each create/update operation on a MyComment object, MyCommentListener sets the date from my local timezone (not a UTC date!):
public class MyCommentListener {
@PreUpdate
@PrePersist
public void setLastUpdate(MyComment myComment) {
myComment.setDate(new Timestamp(System.currentTimeMillis()));
}
}
Do you know how I can solve this issue?
Should I use another date type in my model, different from Timestamp?
What type should the DATE_ column be in the MS SQL Server database?
I will appreciate any help. Thank you.
AFAIK, the problem is with the listener. Replace the following code in the listener and verify; change the date format as per your need.
@PreUpdate
@PrePersist
public void setLastUpdate(MyComment myComment) {
    SimpleDateFormat dateFormat = new SimpleDateFormat("dd-MM-yyyy");
    dateFormat.setTimeZone(TimeZone.getTimeZone("UTC"));
    // Wrap in java.sql.Timestamp, since the entity field is a Timestamp
    myComment.setDate(new Timestamp(dateFormat.getCalendar().getTimeInMillis()));
}
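Note that the time zone affects only how an instant is rendered, not the underlying millisecond value. A small stdlib sketch (standalone, not tied to the entity above) showing SimpleDateFormat pinned to UTC:

```java
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.TimeZone;

public class UtcFormatDemo {
    public static void main(String[] args) {
        // One absolute instant: the Unix epoch, 1970-01-01T00:00:00Z.
        Date epoch = new Date(0);
        SimpleDateFormat fmt = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");
        // Without this, format() would render in the JVM's default zone.
        fmt.setTimeZone(TimeZone.getTimeZone("UTC"));
        System.out.println(fmt.format(epoch)); // 1970-01-01 00:00:00
    }
}
```

The same Date prints different wall-clock strings in different zones; pinning the formatter to UTC is what makes the rendered value zone-independent.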
You can set a timezone property in the application properties file as below.
spring.jpa.properties.hibernate.jdbc.time_zone=UTC
It was very strange to me that there is no Spring property to set up the default TimeZone (at least I do not know of one).
After some googling I found that the best place in Spring to set the time zone is a WebApplicationInitializer, so I prepared the following code:
public class MyWebApplicationInitializer implements WebApplicationInitializer {
@Override
public void onStartup(final ServletContext servletContext) throws ServletException {
setupTimeZone();
}
private void setupTimeZone() {
TimeZone.setDefault(TimeZone.getTimeZone("UTC"));
}
}
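As a hedged illustration of why this works: java.sql.Timestamp.toString() renders in the JVM's default zone, so pinning the default to UTC changes what gets displayed and persisted. A stdlib sketch:

```java
import java.sql.Timestamp;
import java.util.TimeZone;

public class DefaultZoneDemo {
    public static void main(String[] args) {
        // Same call as in the WebApplicationInitializer above.
        TimeZone.setDefault(TimeZone.getTimeZone("UTC"));
        // With the default zone pinned to UTC, epoch 0 prints as the epoch.
        System.out.println(new Timestamp(0)); // 1970-01-01 00:00:00.0
    }
}
```

Keep in mind this is process-wide state: every library in the JVM that relies on the default zone is affected, which is both the point and the risk of this approach.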
Let's say I have a due date and a reminder timespan. How do I find the ones where the due date is less than the current date plus the reminder, with Hibernate 3.6 criteria queries? In other words, I want to find the Events whose reminder should be displayed. The reminder is a Long marking when the reminder should be sent, in either days or milliseconds, whichever is easier.
To summarize, my entities are following:
java.util.Date Event.DueDate
Long Event.Type.Reminder.Before // (in days or millis)
Examples
Today is 2012-06-11.
Included:
DueDate is 2012-06-15 and Before is 30 days.
Excluded:
DueDate is 2012-06-15 and Before is 1 day.
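The inclusion rule these examples describe (show the reminder once today >= DueDate - Before) can be sketched in plain Java; this is the in-memory equivalent of the intended query, not the criteria query itself, and the names are my own:

```java
import java.time.LocalDate;

public class ReminderFilter {
    // True once the reminder window has started: today >= dueDate - beforeDays.
    static boolean reminderDue(LocalDate today, LocalDate dueDate, long beforeDays) {
        return !today.isBefore(dueDate.minusDays(beforeDays));
    }

    public static void main(String[] args) {
        LocalDate today = LocalDate.of(2012, 6, 11);
        LocalDate due = LocalDate.of(2012, 6, 15);
        System.out.println(reminderDue(today, due, 30)); // true  (included)
        System.out.println(reminderDue(today, due, 1));  // false (excluded)
    }
}
```

The database-side challenge discussed below is expressing exactly this dueDate - beforeDays arithmetic in SQL, where interval support varies by vendor.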
Ultimately this is just what ANSI SQL calls date/time arithmetic, and specifically you are looking for INTERVAL datatype handling. Unfortunately, databases vary widely in their support for the INTERVAL datatype. I really want to support this in HQL (and possibly criteria queries, although that relies on agreement in the JPA spec committee). The difficulty, like I said, is the varied (if any) support for intervals.
The best bet at the moment (through Hibernate 4.1) is to provide a custom function (org.hibernate.dialect.function.SQLFunction) registered with either the Dialect (search Google to see how this is done) or the "custom function registry" (org.hibernate.cfg.Configuration#addSqlFunction). You'd probably want this to render to your database-specific representation of date arithmetic with an interval.
Here is an example using the Oracle NUMTODSINTERVAL function:
public class MySqlFunction implements SQLFunction
{
public Type getReturnType(Type firstArgumentType,
Mapping mapping) throws QueryException
{
return TimestampType.INSTANCE;
}
public String render(Type firstArgumentType,
List arguments,
SessionFactoryImplementor factory) throws QueryException
{
// Arguments are already interpreted into sql variants...
final String dueDateArg = (String) arguments.get( 0 );
final String beforeArg = (String) arguments.get( 1 );
// Again, using the Oracle-specific NUMTODSINTERVAL
// function and using days as the unit...
return dueDateArg + " + numtodsinterval(" + beforeArg + ", 'day')";
}
public boolean hasArguments() { return true; }
public boolean hasParenthesesIfNoArguments() { return false; }
}
You would use this in HQL like:
select ...
from Event e
where current_date() between e.dueDate and
interval_date_calc( e.dueDate, e.before )
where 'interval_date_calc' is the name under which you registered your SQLFunction.