I'm writing a macro for SlickEdit (using its C-like language, Slick-C).
As far as I can tell, there are no time functions to help me convert an epoch time (e.g. 1321357827) to a human-readable date and time.
(If any of you knows how to do it in Slick-C, that's great; those of you who don't know it, assume C without time.h or any other libraries.)
In SE16 you'll find a whole class for date/time manipulation in se/datetime/DateTime.e.
In addition, the built-in function _time has an option to return the epoch time.
You should find enough example code there.
For the basic algorithm, I found another SO question that answers this; it includes a link to a gmtime source. From there you should be able to adapt it to SlickEdit code.
If you need only the time you could do:
sec_of_day = epoch % (24 * 60 * 60);   /* seconds since midnight UTC */
hour   = sec_of_day / (60 * 60);
minute = sec_of_day % (60 * 60) / 60;
second = sec_of_day % 60;              /* same as epoch % 60, since 60 divides 86400 */
This is of course not considering the time zone of your system.
If you need the date, you need to consider leap years.
EDIT: Warning: this code does not take into account leap seconds.
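Putting that together, here is a minimal C-style sketch of the time-of-day part (the function name is just for illustration; it should adapt to Slick-C with little change, and it ignores time zones and leap seconds as noted above):

/* Minimal sketch: convert an epoch timestamp to UTC hours/minutes/seconds.
   Ignores time zones and leap seconds. */
void epoch_to_clock_time(long epoch, int *hour, int *minute, int *second)
{
    long sec_of_day = epoch % (24L * 60 * 60);   /* seconds since midnight UTC */
    *hour   = (int)(sec_of_day / (60 * 60));
    *minute = (int)(sec_of_day % (60 * 60) / 60);
    *second = (int)(sec_of_day % 60);
}

For example, the epoch time 1321357827 from the question gives 11:50:27 UTC.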
I'm trying to calculate the day of the week from a seconds-since-epoch timestamp. From time.h I could use gmtime(), but that increases the program size by 1.3 kB, probably because gmtime() also calculates the date.
So, I'm wondering what would be wrong about using something like:
(timestamp / (24*3600)) % 7
The only thing I could think of is leap seconds, such that something like 00:00:02 could be classified as the wrong day.
EDIT: This is for embedded programming; 1.3 kB is a substantial part of the 64 kB program/firmware. Furthermore, I'm not expecting to do anything with time zones or dates after this.
What would be wrong about using something like:
(timestamp / (24*3600)) % 7
Not much is wrong with the above, except that you have not specified the day of the week the epoch began on (e.g. Thursday) nor the day of the week the week begins on (e.g. Monday). See ISO 8601 for a deeper discussion of the first day of the week. Also watch out for computations like 24*3600, which overflow 16-bit int/unsigned and lead to trouble.
Example: Let us say the epoch began on day-of-the-week number 3 (Monday: 0, Thursday: 3). This takes care of two issues at once, the day of the week of the epoch and the first day of the week, as only the positive difference needs to be coded.
#define EPOCH_DOW 3
#define SECS_PER_DAY 86400
dow = ((timestamp / SECS_PER_DAY) + EPOCH_DOW) % 7;
If timestamp is a signed type, append the following to ensure a result in the [0...6] range.
dow = (dow + 7) % 7;
// or
if (dow < 0) dow += 7;
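For example, timestamp 0 (midnight on Thursday, January 1, 1970) gives ((0 / 86400) + 3) % 7 = 3, which is Thursday in the Monday = 0 numbering used above.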
I doubt leap seconds are used in your application. If they are, the task is far more complicated, as the code then needs to deal not only with a more complex calculation, but also with how to receive updates about the next scheduled leap second; they occur irregularly.
You're off by 4, since January 1, 1970 was a Thursday, and you may compute an incorrect result for dates before the epoch depending on the behavior of % on negative operands. (The simplest fix is to check whether the result is negative, and if so add 7.) But other than this, your algorithm is correct; it's the same as used by glibc internal function __offtime, which gmtime ultimately ends up calling. (You can't call it yourself, as it's an internal implementation detail).
There's no need to worry about leap seconds; Unix time ignores them.
I would recommend encapsulating the code in a (possibly inline) function, with the gmtime implementation as a comment or #if 0 block, so that you can easily switch to it if you start needing to compute months/years as well.
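A sketch of that suggestion, assuming a tm_wday-style result (0 = Sunday); the function name is illustrative and the #if 0 block only marks where a gmtime()-based fallback could go:

/* Sketch: day of week from Unix time, matching tm_wday (0 = Sunday).
   January 1, 1970 was a Thursday, hence the +4. */
static inline int day_of_week(long long timestamp)
{
#if 0
    /* Fallback if months/years are needed later:
       time_t t = (time_t)timestamp;
       return gmtime(&t)->tm_wday;  */
#endif
    long long days = timestamp / 86400;
    if (timestamp < 0 && timestamp % 86400 != 0)
        days--;                     /* floor toward the earlier day for pre-epoch times */
    int dow = (int)((days + 4) % 7);
    if (dow < 0)
        dow += 7;                   /* C's % can yield a negative remainder */
    return dow;
}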
I feel very stupid, as I can't seem to get a plain Natural number representing the seconds since the Unix epoch (01/01/1970 00:00:00) in Ada. I've read Ada.Calendar and its subpackages up and down but don't seem to find a sensible way of achieving that, even though Ada.Calendar.Clock itself is supposed to be exactly what I want...
I am at my wits end. Any nudge in the right direction?
Using Ada.Calendar.Formatting, construct a Time representing the epoch.
Epoch : constant Time := Formatting.Time_Of(1970, 1, 1, 0.0);
Examine the difference between Ada.Calendar.Clock and Epoch.
Put(Natural(Clock - Epoch)'Img);
Check the result against this epoch display or the Unix command date +%s.
See Rationale for Ada 2005: §7.3 Times and dates and Rationale for Ada 2012: §6.6 General miscellanea for additional details.
According to the POSIX standard, UNIX time does not account for leap seconds, while Ada.Calendar."-" handles them:
For the returned values, if Days = 0, then Seconds + Duration(Leap_Seconds) = Calendar."-" (Left, Right).
One option is to split Ada.Calendar.Time into pieces using Ada.Calendar.Formatting.Split and gather it back using the POSIX algorithm.
The best option seems to be to use Ada.Calendar.Arithmetic.Difference. It returns Days, Seconds and Leap_Seconds. You can then combine Days * 86_400 + Seconds to get UNIX time, and Leap_Seconds will be explicitly thrown away as required by POSIX.
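As a quick worked example of that combination (illustrative numbers only): if Difference reported Days = 19_000, Seconds = 3_600.0 and Leap_Seconds = 27, the UNIX time would be 19_000 * 86_400 + 3_600 = 1_641_603_600, and the 27 leap seconds would simply be dropped.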
I have recently been solving this problem and have posted a library into the public domain.
In the GNAT implementation of Ada, there is a private package Ada.Calendar.Conversions which contains Ada <-> Unix conversions used by the children of Calendar.
Sorry if this sounds like a puzzle, but it has had me puzzled for a while. :)
In a sqlite3 db file, one of the records has a last_visit_time field with the value 13010301178000000 (INTEGER type).
How come 13010301178000000 = 4/12/2013 9:32:58 PM? (4/12/2013 9:32:58 PM comes from an existing tool, and I know nothing about how it translates the value internally.)
Can someone shed some light on how this is translated?
I've looked at http://www.epochconverter.com/, but had no luck.
Thanks.
January 1, 1601 is the epoch for Windows timestamps.
However, those timestamps use 100-nanosecond intervals, so it appears your value got divided by 10, or you're missing a zero for some reason.
To convert to/from Unix timestamps, divide/multiply by 1,000,000 to convert between seconds and microseconds, and adjust by the offset between 1970 and 1601 in seconds (11,644,473,600 seconds).
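A small C sketch of that conversion, assuming the stored value is in microseconds since 1601-01-01 UTC (the format Chrome/WebKit history databases use); the function name is made up for illustration:

/* Sketch: convert microseconds since 1601-01-01 UTC to a Unix timestamp
   (seconds since 1970-01-01 UTC). */
#define EPOCH_DELTA_1601_1970  11644473600LL    /* seconds between the two epochs */

long long webkit_to_unix(long long usec_since_1601)
{
    long long secs_since_1601 = usec_since_1601 / 1000000;   /* microseconds -> seconds */
    return secs_since_1601 - EPOCH_DELTA_1601_1970;
}

For the value in the question, 13010301178000000 / 1000000 - 11644473600 = 1365827578, which is 2013-04-13 04:32:58 UTC, i.e. 4/12/2013 9:32:58 PM in a UTC-7 zone such as Pacific Daylight Time.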
I'm working on a horse racing application and need to store elapsed times from races in a table. I will be importing data from a comma-delimited file that provides the final time in one format and the interior elapsed times in another. The following is an example:
Final Time: 109.39 (1 minute, 9 seconds and 39/100th seconds)
Quarter Time: 2260 (22 seconds and 60/100th seconds)
Half Time: 4524 (45 seconds and 24/100th seconds)
Three Quarters: 5993 (59 seconds and 93/100th seconds)
I'll want to have the flexibility to easily do things like feet per seconds calculations and to convert elapsed times to splits. I'll also want to be able to easily display the times (elapsed or splits) in fifth of seconds or in hundredths.
Times in fifths: :223 :451 :564 1:091 (note the last digits are superscripts)
Times in hundredths: 22.60 :45.24 :56.93 1:09.39
Thanks in advance for your input.
Generally timespans are either stored as (1) seconds elapsed or (2) start / end datetime. Seconds elapsed can be an integer or a float / double if you require it. You could be creative / crazy and store all times as milliseconds in which case you'd only need an integer.
If you are using PostgreSQL, you can use the interval datatype. Otherwise, any integer (int4, int8) or numeric type your database supports is OK. Of course, store values in a single unit of measure: seconds, minutes, or milliseconds.
It all depends on how you intend to use it, but number of elapsed seconds (perhaps as a float if necessary) is certainly a favorite.
I think the 109.39 representing 1 min 9.39 sec is pretty silly. Unambiguous, sure, historical tradition maybe, but it's miserable to do computations with that format. (Not impossible, but fixing it during import sounds easy.)
I'd store time in a decimal format of some sort -- either an integer representing hundredths of a second, as all your other times are displayed, or a database-specific decimal-aware format.
Standard floating point representations might eventually lead you to wonder why a horse that ran two laps in 20.1 seconds each took 40.200035 seconds to run both laps combined.
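If you store hundredths of a second, the import-time conversion is straightforward; here is a hedged C sketch (the mm:ss.hh reading of values like 109.39 and the last-two-digits-are-hundredths reading of values like 2260 follow the question; the function name and error handling are illustrative):

#include <stdio.h>
#include <string.h>

/* Sketch: convert a raw race time such as "109.39" (1:09.39) or "2260"
   (22.60 s, no decimal point) into integer hundredths of a second. */
long raw_to_hundredths(const char *raw)
{
    long whole = 0, frac = 0;
    if (sscanf(raw, "%ld.%2ld", &whole, &frac) < 1)
        return -1;                        /* unparsable input */
    if (strchr(raw, '.') == NULL) {       /* e.g. 2260: last two digits are hundredths */
        frac  = whole % 100;
        whole = whole / 100;
    }
    long minutes = whole / 100;           /* 109 -> 1 minute ... */
    long seconds = whole % 100;           /* ... and 9 seconds  */
    return (minutes * 60 + seconds) * 100 + frac;
}

From integer hundredths it is easy to compute feet per second, differences between calls for splits, or a fifths display (the fractional hundredths divided by 20, truncated, gives the fifths digit).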
How do I calculate the time period between 2 dates in C (any library, etc.)?
The program should take two (local) dates as input and provide the duration period between them as output.
For example,
startDate = OCT-09-1976 and endDate = OCT-09-2008
should show a duration of 32 years.
startDate = OCT-09-1976 and endDate = DEC-09-2008
should show a duration of 32 years and 2 months.
Thanks.
Convert the dates into two struct tm structures with strptime().
difftime() then gives you the difference between the two in seconds (convert each struct tm to a time_t with mktime() first).
Convert that into months etc. with the code here (it's in C++, but the only C++ is for the string formatting, which is easy to change).
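A sketch of the first two steps (strptime is POSIX, not ISO C, and the "%b-%d-%Y" format assumes inputs like "OCT-09-1976"):

#define _XOPEN_SOURCE 700     /* for strptime */
#include <time.h>

/* Sketch: seconds between two local dates given as "MMM-DD-YYYY". */
double seconds_between(const char *start, const char *end)
{
    struct tm tm_start = {0}, tm_end = {0};
    strptime(start, "%b-%d-%Y", &tm_start);   /* parse into broken-down time */
    strptime(end,   "%b-%d-%Y", &tm_end);
    tm_start.tm_isdst = -1;                   /* let mktime decide DST */
    tm_end.tm_isdst   = -1;
    return difftime(mktime(&tm_end), mktime(&tm_start));
}

Turning those seconds into whole years and months is where the month-length questions in the EDIT below come in.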
EDIT: As a commenter observed, that sidesteps the month issue. There is (GPL'd) code for isodiff_from_secs that can be converted to do what you want, if you're happy with its assumption that months have 30 days. See Google Code Search and the description of the standard here.
Doing the fully correct solution, which takes account of the true months between the actual days, would be pretty complex. Is that required for your problem?
I did something very similar recently using Boost.Date_Time, presenting the resulting function as C, but this of course requires using the C++ linker.
Actually, the example leaves a little to be desired - will the start and end dates always be on the same day of the month? If so, you can ignore the day number and end up with a trivial subtraction of the month and year numbers.
However if your dates can be anywhere in the month it might be a bit more tricky. Remember that not all months have the same number of days!
C difftime doesn't help you with month calculations, which is why I used Boost, though you may not have that option.
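For the years-and-months output in the question, a direct field subtraction is simpler than going through seconds; here is a sketch using struct tm fields (it borrows a month when the end day-of-month is before the start day-of-month; the helper name is illustrative):

#include <time.h>

/* Sketch: whole years and months between two broken-down dates. */
void year_month_diff(const struct tm *start, const struct tm *end,
                     int *years, int *months)
{
    int total = (end->tm_year - start->tm_year) * 12
              + (end->tm_mon  - start->tm_mon);
    if (end->tm_mday < start->tm_mday)
        total--;                  /* the final month is not yet complete */
    *years  = total / 12;
    *months = total % 12;
}

For the examples in the question, OCT-09-1976 to OCT-09-2008 gives 32 years 0 months, and OCT-09-1976 to DEC-09-2008 gives 32 years 2 months.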