How to calculate a person's pace from their steps? - reactjs

I am developing a fitness app and want to calculate a person's pace (i.e. speed) from the steps they walk.
For this, I will use a fixed time interval.
For example, if a person walked 2,500 steps in 1 hour, I want to calculate their pace, i.e. whether they are walking slow, moderate, or fast.
How can pace be derived from steps?
Can anyone please help me out?
Thank you
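A minimal sketch of one way to do this (in TypeScript, since the question is tagged reactjs): convert steps to distance with an assumed average stride length, divide by the interval to get speed, and bucket the speed. The stride length and the slow/moderate/fast thresholds below are placeholder assumptions, not standard values.

```typescript
// Minimal sketch: derive pace from a step count over a known time interval.
// STRIDE_METERS and the pace thresholds are assumptions, not standard values.
const STRIDE_METERS = 0.75; // average stride length; ideally calibrated per user

type PaceCategory = "slow" | "moderate" | "fast";

function paceFromSteps(
  steps: number,
  intervalSeconds: number
): { kmPerHour: number; category: PaceCategory } {
  const distanceKm = (steps * STRIDE_METERS) / 1000;
  const hours = intervalSeconds / 3600;
  const kmPerHour = distanceKm / hours;

  // Placeholder thresholds; tune them for your app.
  let category: PaceCategory;
  if (kmPerHour < 3) category = "slow";
  else if (kmPerHour < 5) category = "moderate";
  else category = "fast";

  return { kmPerHour, category };
}

// The question's example: 2,500 steps in 1 hour ≈ 1.9 km/h → "slow".
console.log(paceFromSteps(2500, 3600));
```

With a 0.75 m stride, 2,500 steps in 1 hour works out to roughly 1.9 km/h, which this sketch would classify as slow.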

Related

Weekly data prediction with Neural Prophet

Hi, there is something that is unclear to me when reading the articles and documentation.
They say (https://github.com/facebook/prophet/issues/2112) that the underlying model is continuous.
This is hard for me to understand, because normally their X matrix is based on the data points observed at times t, no?
For example, since the data is sampled weekly, I normally should not use a seasonal component with a period of two weeks or less (i.e. a frequency at or above twice a month).
But under the hood, do they interpolate the time series to daily points?
In an experiment with weekly data I forced the weekly seasonality to 8 coefficients, and it drastically improves performance on an unseen validation set (from 0.3 R² without the weekly seasonality to 0.6 with it), which doesn't make sense to me.
I of course predict on weekly data too.
So I am not sure whether they first interpolate to daily points and then perform the seasonality computations.
Can someone help me understand please?
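As far as I understand, the sense in which the model is "continuous" is that each seasonality is a Fourier series in time, so it can be evaluated at any timestamp rather than only at the sampled points, and no interpolation to daily points is needed. A rough TypeScript sketch of evaluating such a term (the coefficients below are made-up illustrations; in the real model they are fitted):

```typescript
// Sketch of a Prophet-style seasonal term: a Fourier series in continuous time.
// The coefficients here are hypothetical; in the real model they are fitted.
function seasonalTerm(
  tDays: number,          // time as a continuous value, e.g. days since an epoch
  periodDays: number,     // e.g. 7 for a weekly seasonality
  coefficients: { a: number; b: number }[] // one (a, b) pair per Fourier order
): number {
  return coefficients.reduce((sum, { a, b }, i) => {
    const n = i + 1;
    const angle = (2 * Math.PI * n * tDays) / periodDays;
    return sum + a * Math.cos(angle) + b * Math.sin(angle);
  }, 0);
}

// The series is defined for any t, whether or not an observation exists there,
// which is the sense in which the underlying model is "continuous".
console.log(seasonalTerm(3.5, 7, [{ a: 1.2, b: -0.4 }, { a: 0.3, b: 0.1 }]));
```

The series is simply evaluated at whatever timestamps appear in the training and prediction data; whether forcing 8 Fourier coefficients on weekly-sampled data genuinely generalizes, or just happens to fit this particular validation set, is a separate question.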

How can I keep track of a total balance in an efficient way?

I have a webpage that keeps track of all the incomes and expenses of a business.
The user can see their business's total balance for any day since the business was created. So for example, if today is 9/15/22 and the business was created on 6/12/21, the user can see the business's total balance for any day since 6/12/21.
Calculating the total balance is easy: incomes - expenses = total balance.
The problem is that once the business has been running for a long time, there can be thousands of expenses and incomes, so querying them all from the db and operating on all of them at the same time can be very slow.
Can you think of any other way of keeping track of this? I thought about checkpoints every month, but I don't really know if that is the best idea. I'm working with nodejs and mysql. Thanks a lot.
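A hedged sketch of the monthly-checkpoint idea with nodejs and mysql (using the mysql2/promise client; the table and column names such as balance_checkpoints and transactions are hypothetical): store one precomputed balance per business per month, then sum only the transactions between that checkpoint and the requested date.

```typescript
// Sketch of the monthly-checkpoint idea. Table and column names
// (balance_checkpoints, transactions, amount, ...) are hypothetical.
import mysql from "mysql2/promise";

async function balanceAt(
  pool: mysql.Pool,
  businessId: number,
  date: string // "YYYY-MM-DD"
): Promise<number> {
  // Latest precomputed checkpoint on or before the requested date.
  const [checkpoints] = await pool.query<mysql.RowDataPacket[]>(
    `SELECT balance, checkpoint_date
       FROM balance_checkpoints
      WHERE business_id = ? AND checkpoint_date <= ?
      ORDER BY checkpoint_date DESC
      LIMIT 1`,
    [businessId, date]
  );
  const base = checkpoints[0] ?? { balance: 0, checkpoint_date: "1970-01-01" };

  // Sum only the transactions after that checkpoint
  // (incomes stored as positive amounts, expenses as negative).
  const [rows] = await pool.query<mysql.RowDataPacket[]>(
    `SELECT COALESCE(SUM(amount), 0) AS delta
       FROM transactions
      WHERE business_id = ? AND created_at > ? AND created_at <= ?`,
    [businessId, base.checkpoint_date, date]
  );
  return Number(base.balance) + Number(rows[0].delta);
}
```

A scheduled job would insert one checkpoint row per business at the start of each month, so any balance query has to scan at most about one month of raw transaction rows.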

Dangers of having too many Materialized Views?

My application is going to be creating a growing number of materialized views.
The number of materialized views will increase by 8,000 - 10,000 per month.
Most of the views will hold very little information, around 100-1000 rows with small fields, but a few (10 per month) will hold from 100,000 to millions of rows with small fields.
I want to check whether this is a good idea before going too far into the implementation.
Can anyone tell me the hard limits I may hit, or if this is a good idea at all?
If needed, I can explain further the use-case. It may be possible to drop some of the older views if needed (99%+), but not all (the 10/month big ones must stay).
EDIT: Explanation Requested
The app allows users to vote on content, and then we make charts of the content with the most votes. We have 5-minute units of voting, and charts generated by request. A user can look at any length of time with a granularity of 5 minutes. For example, I could look at just the past 5 minutes of votes, 15 minutes, 2 hours, etc.
To tally votes and make the "Top Charts" we must do an expensive query, searching all the content that has gotten votes within the time units that are requested, tallying, and sorting. To mitigate this, I wanted to make a sorted materialized view with the results of the vote for EVERY time unit, as after voting closes for that 5 minutes, the votes will never change. This way, a popular search (like the latest 5-minute chart) will not have to be generated and sorted every time a user searches (there are 8760 5-minute units per month). I also wanted to make mviews for the weeks, months, and days.
This is the table I will be using to generate the mviews, with tuid being a reference to the 5-minute voting time unit
Perhaps there is a better way to make this efficient with caching?
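One possible alternative to a materialized view per time unit, sketched below with hypothetical table names (votes, vote_tallies): keep a single pre-aggregated tally table keyed by (tuid, content_id), insert into it once each 5-minute unit closes (those rows never change afterwards), and build any chart by summing over the requested range of units. This keeps the same "compute once, read many" property without the per-object overhead of tens of thousands of mviews.

```typescript
// Sketch of a single pre-aggregated tally table instead of one mview per unit.
// Table/column names (votes, vote_tallies, tuid, content_id) are hypothetical.
import { Pool } from "pg";

const pool = new Pool();

// Run once, right after a 5-minute unit closes; closed units never change,
// so these rows are effectively immutable.
async function tallyClosedUnit(tuid: number): Promise<void> {
  await pool.query(
    `INSERT INTO vote_tallies (tuid, content_id, vote_count)
     SELECT $1::int, content_id, COUNT(*)
       FROM votes
      WHERE tuid = $1::int
      GROUP BY content_id`,
    [tuid]
  );
}

// A chart over any range of closed units is then a sum over the tally table,
// which is much smaller than the raw votes table.
async function topChart(fromTuid: number, toTuid: number, limit = 100) {
  const { rows } = await pool.query(
    `SELECT content_id, SUM(vote_count) AS total
       FROM vote_tallies
      WHERE tuid BETWEEN $1 AND $2
      GROUP BY content_id
      ORDER BY total DESC
      LIMIT $3`,
    [fromTuid, toTuid, limit]
  );
  return rows;
}
```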

How to store total visits statistics for user history efficiently?

I'm maintaining a system where users create something called "books" that are accessed by other users.
I need a convenient (good-performance) way to store events in the database for when users visit these books, so I can later display graphs with statistics. The graphs need to show a history where the owner of the book can see on which days of the week, and at which times, there is more visiting activity (across the months).
Using ERD (Entity-Relationship Diagram), I can produce the following conceptual model:
At first the problem seems to be solved, as we have a very simple situation here. This will give me a table with 3 fields. One will be the occurrence (timestamp) of the visit event, and the other 2 will be foreign keys: one represents the user, while the other represents which book was visited. In short, every record in this table will be a visit.
However, considering that a user averages about 10 to 30 book visits per day, and that the system has 100,000 users, in a single day this table could add many gigabytes of new records. I'm not the most experienced person in good database performance practices, but I'm pretty sure that this is not the solution.
Even though I do a cleanup on the database to delete old records, I need to keep a record history of the last 2 months of visits (at least).
I've been looking for a way to solve this for days, and I have not found anything yet. Could someone help me, please?
Thank you.
Note: I'm using PostgreSQL 9.x, and the system is written in Java.
As mentioned in the comments, you might be overestimating data size. Let's do the math. 100k users at 30 books/day at, say, 30 bytes per record.
(100_000 * 30 * 30) / 1_000_000 # => 90 megabytes per day
Even if you add index size and some amount of overhead, this is still a few orders of magnitude lower than "many gigabytes per day".
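If you still want to bound what the graphs have to scan, one optional complement (not something the size estimate above requires) is a small rollup table aggregated per book, day, and hour, which matches the day-of-week/time-of-day graphs directly. A sketch with hypothetical names (book_visits, book_visit_stats), written in TypeScript with node-postgres purely for illustration since the original system is in Java:

```typescript
// Optional rollup sketch: aggregate raw visits into per-book, per-day, per-hour
// counts. Table/column names (book_visits, book_visit_stats, ...) are hypothetical.
import { Pool } from "pg";

const pool = new Pool();

// Run periodically (e.g. nightly) for the previous day.
async function rollUpDay(day: string): Promise<void> {
  await pool.query(
    `INSERT INTO book_visit_stats (book_id, visit_date, visit_hour, visit_count)
     SELECT book_id,
            visited_at::date,
            EXTRACT(HOUR FROM visited_at)::int,
            COUNT(*)
       FROM book_visits
      WHERE visited_at >= $1::date
        AND visited_at <  $1::date + INTERVAL '1 day'
      GROUP BY book_id, visited_at::date, EXTRACT(HOUR FROM visited_at)`,
    [day]
  );
}
```

The graphs then read from the rollup table, and old raw rows beyond the 2-month retention window can be deleted without losing the aggregated history.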

Montecarlo simulation for forecasting loan portfolio growth

I have to forecast the future gold loan portfolio growth of a financial firm. I have the past 36 months of growth. I am planning to use Monte Carlo simulation to forecast, but growth is a deterministic process, not a random process. How can I apply Monte Carlo in this situation?
Please help me, friends. Thanks in advance.
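One common way to reconcile this (a sketch, not a recommendation specific to gold loan portfolios): treat the 36 historical month-over-month growth rates as an empirical distribution and bootstrap from it, so the randomness comes from which past growth rate is drawn in each simulated month. The input figures below are made up.

```typescript
// Minimal Monte Carlo sketch: bootstrap historical monthly growth rates to
// simulate possible future portfolio paths. All input figures are made up.
function simulatePaths(
  startingPortfolio: number,
  historicalGrowthRates: number[], // e.g. 36 month-over-month rates, 0.02 = +2%
  monthsAhead: number,
  numPaths: number
): number[][] {
  const paths: number[][] = [];
  for (let p = 0; p < numPaths; p++) {
    let value = startingPortfolio;
    const path: number[] = [];
    for (let m = 0; m < monthsAhead; m++) {
      // Draw a past growth rate at random (bootstrap resampling).
      const rate =
        historicalGrowthRates[Math.floor(Math.random() * historicalGrowthRates.length)];
      value *= 1 + rate;
      path.push(value);
    }
    paths.push(path);
  }
  return paths;
}

// Example: 5,000 simulated 12-month paths from a starting portfolio of 1,000,000.
const paths = simulatePaths(1_000_000, [0.01, 0.03, -0.005, 0.02], 12, 5000);
```

Percentiles of the final values across paths then give a rough forecast band rather than a single deterministic number.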
