I am working on an MCU, and my aim is to implement time/date keeping on it.
I use a timer that ticks once per second and store the count in a uint32_t, which is large enough to hold about 136 years of seconds. I want to use 2000 as the reference year, with 2099 as the maximum.
Here is my data struct:
typedef struct
{
    uint8_t sec;   // Seconds. [0-60] (1 leap second)
    uint8_t min;   // Minutes. [0-59]
    uint8_t hour;  // Hours. [0-23]
    uint8_t day;   // Day. [1-31]
    uint8_t month; // Month. [0-11]
    uint8_t year;  // Year - offset from 2000. [00-99]
} osal_time_t;
What is the best way to convert the seconds (the uint32_t count) to min/hour/day/month/year correctly while using the fewest resources?
Minutes, hours, and years seem simple, but the day is tricky with 28/29/30/31-day months, and February has 29 days every fourth year.
I have seen the Linux source code implementations, but I think they are designed for an OS, not for a humble MCU.
Can anyone hint at what kind of algorithm I should use on an MCU so that it requires minimal resources?
As an example, what algorithm is used for this calculator: http://www.mathcats.com/explore/elapsedtime.html
If you have a code snippet, I would appreciate it if you could share it.
You just have to do the math; there is no way around it. You are converting from base 2 to base 10 (really base 60 represented in base 10).
Likewise for the month/day stuff: you have to grind through that as well, with a table of some sort for days per month, and deal with leap years.
The alternative to doing the math is changing how you count, using more memory but less calculation; basically a BCD approach. When the ones digit of seconds rolls from 9 to 10, increment the tens of seconds and set the ones of seconds to 0. Repeat all the way up to the date. Or meet halfway: seconds over 59 roll to zero and increment minutes, and so on; then do the base-10 work to separate the tens from the ones of seconds, minutes, and hours. You could use a table for that if you don't have a divide instruction.
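To make the first approach concrete, here is a minimal sketch, assuming the osal_time_t struct from the question (month 0-11, year as an offset from 2000) and a counter of seconds since 2000-01-01 00:00:00. The function name seconds_to_time is illustrative. It uses divide/modulo; on a core without a hardware divider you could replace those with the table-driven counting described above.

#include <stdint.h>
#include <stdbool.h>

static const uint8_t days_in_month[12] = {
    31, 28, 31, 30, 31, 30, 31, 31, 30, 31, 30, 31
};

// 2000 is a leap year and 2100 is out of range by design, so within
// 2000-2099 a simple divisibility-by-4 test is exact.
static bool is_leap(uint8_t year)
{
    return (year % 4u) == 0u;
}

void seconds_to_time(uint32_t count, osal_time_t *t)
{
    t->sec  = count % 60u;  count /= 60u;
    t->min  = count % 60u;  count /= 60u;
    t->hour = count % 24u;  count /= 24u;  // count is now whole days since 2000-01-01

    uint8_t year = 0;
    while (count >= (is_leap(year) ? 366u : 365u)) {
        count -= is_leap(year) ? 366u : 365u;
        year++;
    }
    t->year = year;

    uint8_t month = 0;
    for (;;) {
        uint8_t dim = days_in_month[month];
        if (month == 1 && is_leap(year))
            dim = 29u;                     // February in a leap year
        if (count < dim)
            break;
        count -= dim;
        month++;
    }
    t->month = month;                      // 0-11, matching the struct
    t->day   = (uint8_t)(count + 1u);      // day-of-month is 1-based
}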
This is not a programming problem, because you can't do this reliably with just a microcontroller. An internal RC oscillator will be way too inaccurate, and even if you use a high accuracy external crystal oscillator, it will drift over time and may vary with temperature.
The only correct solution is to add a real-time clock circuit to the hardware, preferably together with a back-up battery. How to communicate with the real-time clock circuit is hardware-specific.
Questions like this, on the borderline to hardware, are better asked on https://electronics.stackexchange.com/.
I am working on an algorithm where I can have any number of 16-bit values (for instance, 1000 16-bit values, all sensor data, so there is no particular series or repetition). I want to stuff all of this data into an 8- or 10-byte array (each and every one of the 1000 16-bit values should be inside the 10-byte array). The encoding should be such that I can also easily decode and read back each and every one of the 1000 values.
I have thought of using the sin function, dividing the values by 100 so that every data point would fit in 8 bits (the 0-1 range of sin values), but that only covers a small range of data, not a huge number of values.
Pardon me if I am asking for too much. I am just curious whether it is possible or not.
The answer to this question is rather obvious with a little knowledge of information science: it is not possible to store that much information in so little memory, and the data you are talking about simply contains too much information.
Some data, like repetitive data or data following some structure (such as constantly rising values), contains very little information. The task of a compression algorithm is to figure out the structure or repetition and, instead of storing the raw data, store the structure or the rule for reproducing it.
In your case the data comes from sensors, and unless you are willing to lose a massive amount of information, you will not be able to compress it by a factor of the magnitude you are talking about (1000 × 2 bytes into 10 bytes). If your sensors produce more or less the same values all the time with just a little jitter, good compression can be achieved (though for that your question is far too broad to be answered here), but it will probably never be in the range of reducing your 1000 values to 10 bytes.
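To put rough numbers on this argument: 1000 values × 16 bits = 16,000 bits of input, while 10 bytes hold only 80 bits. A lossless encoding would have to map each of the 2^16000 possible inputs onto one of only 2^80 distinct outputs, so by the pigeonhole principle almost every input would be unrecoverable; you would be keeping, on average, 80 / 1000 = 0.08 bits per 16-bit value.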
I am trying to classify the events above as 1 or 0; 1 would be the lower values and 0 would be the higher values. Usually the data does not look as clean as this. Currently the approach I am taking is to have two different thresholds, so that in order to go from 0 to 1 the signal has to pass the 1-to-0 threshold and stay beyond it for 20 sensor values. This threshold is set to the highest value I receive minus ten percent of that value. I don't think a machine-learning approach will work, because I have too few features to work with, and the implementation has to take up minimal code space. I am hoping someone may be able to point me in the direction of a known algorithm that would apply well to this sort of problem; googling it and checking my other sources isn't producing great results. The current implementation is very effective, and the hardware isn't going to change.
Currently the approach I am taking is to have two different thresholds, so that in order to go from 0 to 1 the signal has to pass the 1-to-0 threshold and stay beyond it for 20 sensor values
Calculate the area on your graph of those 20 sensor values. If the area is greater than a threshold (perhaps half the peak value), assign it 1; otherwise assign it 0.
Since your measurements are one unit wide (pixels, or sensor readings), the area ends up being simply the sum of the 20 sensor values.
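A minimal C sketch of that area test, assuming a 20-sample window as in the question; the function name and the threshold parameter are illustrative:

#include <stdint.h>

#define WINDOW 20u

// Sum the window (each sample is one unit wide, so the sum is the area)
// and compare against an area threshold, e.g. derived from the peak value.
static uint8_t classify_window(const uint16_t samples[], uint32_t area_threshold)
{
    uint32_t area = 0;
    for (uint32_t i = 0; i < WINDOW; ++i)
        area += samples[i];
    return (area > area_threshold) ? 1u : 0u;  // 1 if the area exceeds the threshold
}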
I have a device generating some number of values, say N, each value 32 bits wide.
I am logging these values every 10 seconds by writing a new row to an Excel file, and I will be creating a new file every day.
I have to estimate the hard-disk capacity necessary to store these log files for a period of 10 years.
Can someone give any hints regarding the calculation of the size of the log file generated per day?
Assuming worst-case 2's-complement 32-bit values stored as ASCII...
-2147483648 is 11 characters, so allow 13 bytes per value (sign, digits, separator, and line ending)
1 value / 10 seconds
3600 seconds / hour
24 hours / day
that's 8,640 values and 112,320 bytes per day, for each of the N values;
"round" that up to 112,640 bytes (divisible by 1024) per day
365.25 days per year
10 years
that's N * 411,417,600 bytes, or slightly more than N * 400 MBytes.
So if N were 10, that would be slightly more than 4 GBytes.
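For reference, a minimal C sketch of the same estimate (using the unrounded 112,320 bytes/day figure; the 13-bytes-per-value worst case is the assumption above):

#include <stdio.h>

int main(void)
{
    const double bytes_per_value = 13.0;                   // "-2147483648" plus separators
    const double values_per_day  = 24.0 * 3600.0 / 10.0;   // one sample every 10 seconds
    const double days            = 365.25 * 10.0;          // 10 years
    const unsigned n             = 10;                     // N values per row

    double per_day = bytes_per_value * values_per_day;     // bytes/day per value
    double total   = per_day * days * n;
    printf("%.0f bytes/day per value, %.0f bytes total for N=%u\n",
           per_day, total, n);
    return 0;
}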
Create a sample spreadsheet, add 1000 rows, and save it under a different name.
That will give you an estimate of the per-row cost.
Incremental writing is not a good fit for a complex format such as a spreadsheet; a text log file can simply be appended to.
A spreadsheet tends to re-write the whole file on each flush.
I just need some guidance on how to detect the frequency of a sine wave. I generated the sine wave via a D-to-A converter. Now I'm putting that signal back through an A-to-D converter to monitor and verify the output.
I don't know how to detect the frequency of the sine wave. Apparently I'm supposed to derive the period from the sine wave and apply hysteresis to compensate for noise.
Any hint is much appreciated. Thanks.
If this is about sine waves only, I'd check for zero crossings and calculate the average time between crossings over a couple of hundred cycles; that gives an accurate half-period length, from which you can calculate the frequency.
(Finding "zero" might not be trivial, as most µCs have 0-to-Vdd-range ADC inputs only, so "zero" might actually be at Vdd/2...)
(Very simple) pseudocode, fleshed out a little as C, could be:

const int zero = 0; // or Vdd/2 in ADC counts, if that's the case

uint32_t cyclesSoFar = 0;
uint32_t ticksAtLastCrossing = getTicks();
float avgPeriod = 0;
int lastSample = adcRead();

while (cyclesSoFar < enoughCycles) {
    int currentSample = adcRead();
    // detect a zero crossing in either direction (hysteresis still needs to be added)
    if ((lastSample > zero && currentSample <= zero) ||
        (lastSample < zero && currentSample >= zero)) {
        uint32_t now = getTicks(); // might have to check for over/underflow to get a correct value
        uint32_t period = now - ticksAtLastCrossing;
        ticksAtLastCrossing = now; // measure the next period from this crossing
        // running average of the half-period length
        avgPeriod = avgPeriod * cyclesSoFar / (cyclesSoFar + 1)
                  + (float)period / (cyclesSoFar + 1);
        cyclesSoFar++;
    }
    lastSample = currentSample;
}
freq = ticksFreq / (avgPeriod * 2); // two zero crossings per full cycle
Where enoughCycles is the number of crossings to average, zero is the DC offset of the sine wave, and ticksFreq is the frequency of the CPU tick counter, the most precise time base available. Of course, this is very, very simplified; lots of fluff and checks need to be added (for instance, discarding the first, partial period).
I am using NXP LPC17xx family microcontrollers (LPC1759 and LPC1768).
How can I determine for sure whether the RTC is running?
I am testing
LPC_RTC->CCR & RTC_CCR_CLKEN
but it does not seem very reliable.
I have seen values such as year 3197 when turning on my device.
How can I tell whether the RTC is running and its value is not corrupt?
EDIT:
I ended up adding a simple sanity check on the RTC values:
bool DateTime::validate( const RTC_TIME_Type &time_info )
{
    if ( time_info.YEAR > 2100
         || time_info.DOY > 366
         || time_info.MONTH > 12
         || time_info.DOM > 31
         || time_info.HOUR > 23
         || time_info.MIN > 59
         || time_info.SEC > 59 )
        return false;
    return true;
}
It is run during my POST, as suggested below.
I battled with the RTC on that chip's grandfather (LPC2148) about 5 years ago. If you look on the Yahoo LPC2000 group (it also covers the LPC1000 chips) you'll see the RTC & its issues come up a lot.
Anyway, I'm going from memory here, but I think I concluded that reading the status register wasn't reliable enough. Maybe the issue was that when power was removed, if the battery backup was absent, things would get scrambled...
So what I recall I did was the following, during the boot phase:
(1) Enable RTC peripheral
(2) Read ALL RTC registers. In firmware, have "out of bounds" min & max values for each field (e.g. year must be at least 2005, and no larger than 2030)
(3) If any value is out of range, reset date & time to some hard-coded value (e.g. Jan. 1, 2005) (product would let user adjust time/date once booted)
(4) Take a snapshot of the registers, wait AT LEAST one second (using a timer peripheral to measure the time), then make sure the value has changed. I might even have gone so far during boot-up as to set the values so that a 1-second tick would cause everything to roll over (probably 1 second before midnight, Dec. 31), ensure everything changes, then write back the original value + 1 second. (You would want to do this right as the value changes, to avoid slipping seconds.)
I will try to dig up the code and see if there was something more. I just recall finally concluding I had to run the damn thing & watch it work before I passed the P.O.S.T. for that peripheral.
(I kind of mentioned it in passing, but just to re-iterate... if your values seem corrupted on power-on, make sure your battery back-up circuitry is rock solid - even a basic circuit with a pair of diodes normally suffices. Could be that the clock is running when the product is running, but that when power is removed, its brains get scrambled.)
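A minimal sketch of the tick check from step (4), assuming the LPC17xx-style LPC_RTC register block from the question and a hypothetical delay_ms() helper:

#include <stdbool.h>
#include <stdint.h>
#include "LPC17xx.h"                    // vendor header providing LPC_RTC

extern void delay_ms(uint32_t ms);      // assumed helper, e.g. SysTick-based

// Returns true once the SEC register has been observed to advance.
static bool rtc_is_ticking(void)
{
    uint32_t start = LPC_RTC->SEC;
    // Poll for a little over one second; SEC must change at least once.
    for (uint32_t i = 0; i < 1200u; ++i) {
        delay_ms(1u);
        if (LPC_RTC->SEC != start)
            return true;                // the clock is running
    }
    return false;                       // SEC never changed: RTC is stuck
}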
Also confronted with temperamental RTCs here...
I don't think a really reliable test is possible. You could store the last recorded time somewhere in non-volatile memory and check that the clock hasn't moved to a past date, and you can also test that the delta between two checks is not too big. That would catch something like the year 3000, but you cannot reduce the tested time lapse to, say, 1 month: you want the thing to wake up even if it was shelved for, say, a year.
Do you have the possibility of consulting a time source at startup? E.g. an NTP server, or some other device that your controller talks to that can be considered synchronized with a reliable time source?
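A minimal sketch of that plausibility check; the NVM accessors and the maximum-gap constant are illustrative assumptions:

#include <stdbool.h>
#include <stdint.h>

#define MAX_PLAUSIBLE_GAP (366u * 24u * 3600u)  // e.g. allow up to about a year, in seconds

extern uint32_t nvm_read_last_seen(void);       // assumed non-volatile-memory accessors
extern void     nvm_write_last_seen(uint32_t t);

// 'now' is the RTC reading converted to seconds since some epoch.
bool rtc_value_plausible(uint32_t now)
{
    uint32_t last = nvm_read_last_seen();
    if (now < last)                      // clock moved into the past
        return false;
    if (now - last > MAX_PLAUSIBLE_GAP)  // jumped implausibly far ahead
        return false;
    nvm_write_last_seen(now);            // remember this reading for the next check
    return true;
}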
You can route the RTC clock to an external pin and watch it on an oscilloscope or a logic analyzer.
IIRC I did just that for LPC1766/1768 (I have two identical boards populated with different processors).