Full & incremental data load - API - database

I am using the Application Insights API to pull events data into a database. However, I see that there is a limit of 500 rows returned.
My use case: dump all the historic data coming from the API into the database, then run a job every hour so that only new data is dumped into the database.
How do I achieve this?
Currently - the code consumes the API and stores the data in the database (only the 500 rows that the API returns).
Problems:
The 500-row limit in the Application Insights API
Unable to get all the historic data from the API
No known mechanism to set up an incremental load
Any ideas on this would be very helpful.
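A minimal sketch of one way to handle both the historic backfill and the hourly incremental load, assuming the Application Insights Query API (`/v1/apps/{appId}/query` with an `x-api-key` header) and a Kusto query over `customEvents`; `getLastLoadedTimestamp` and `saveRows` are hypothetical stand-ins for your own database code. The idea is to keep a high-water-mark timestamp in the database and repeatedly ask only for rows newer than it, 500 at a time, so the same loop drains the history on the first run and picks up only new rows on every hourly run.

```typescript
// Hedged sketch: incremental load driven by a high-water-mark timestamp.
// Assumed (not from the original post): the Application Insights Query API
// endpoint, the x-api-key header, and the shape of the customEvents table.

const APP_ID = process.env.AI_APP_ID ?? "";
const API_KEY = process.env.AI_API_KEY ?? "";

// Hypothetical persistence helpers; replace with your own database code.
async function getLastLoadedTimestamp(): Promise<Date> { return new Date(0); }
async function saveRows(rows: unknown[][]): Promise<void> { console.log(`saved ${rows.length} rows`); }

async function loadNewEvents(): Promise<void> {
  let since = await getLastLoadedTimestamp(); // newest timestamp already in the DB

  while (true) {
    // Ask only for rows newer than what we already have, oldest first, 500 at a time.
    const kusto =
      `customEvents ` +
      `| where timestamp > datetime(${since.toISOString()}) ` +
      `| order by timestamp asc ` +
      `| project timestamp, name, customDimensions ` +
      `| take 500`;

    const res = await fetch(
      `https://api.applicationinsights.io/v1/apps/${APP_ID}/query?query=${encodeURIComponent(kusto)}`,
      { headers: { "x-api-key": API_KEY } }
    );
    const body = await res.json();
    const rows: unknown[][] = body.tables?.[0]?.rows ?? [];
    if (rows.length === 0) break; // caught up

    await saveRows(rows);
    since = new Date(String(rows[rows.length - 1][0])); // first projected column is the timestamp
  }
}

// The first run drains all historic data; afterwards the same function runs hourly.
loadNewEvents().then(() => setInterval(() => loadNewEvents().catch(console.error), 60 * 60 * 1000));
```

Because the loop always filters on the last stored timestamp, the hourly job stays cheap once the backfill is done: it usually returns only the rows written since the previous run.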

Related

How should I push and pull data from a server using a REST API and later generate reports from it

I am new to REST APIs. What I basically understand is that you need to call the API each time to get the updated data.
In my case, I need to use the data received from the REST API to generate reports in Power BI. In addition, I should be able to "read" the data coming from the server and "write" to it as well.
I did find the option in Power BI of getting data from the Web to connect directly to the REST API, but if I do that I can only "read".
Can you help me with the different options for how I can both "read/pull" and "write/push" to the server when I have its REST API? I am not sure if a cloud service has to be used in between.
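One common setup, sketched below under assumptions not in the question: a small script or service of your own does both the reading (GET) and writing (POST) against the REST API, and Power BI either reports over whatever store that script fills or connects to the API directly via Get Data > Web for read-only reports. The base URL, auth header, and payload shape here are hypothetical placeholders.

```typescript
// Hedged sketch: "read/pull" with GET and "write/push" with POST against a REST API.
// The URL, auth header and payload shape are hypothetical placeholders.

const BASE_URL = "https://example.com/api";   // hypothetical API base URL
const TOKEN = process.env.API_TOKEN ?? "";    // however the API authenticates

// Pull: fetch the current data; this is also what a Power BI Web connector would read.
async function pullItems(): Promise<unknown[]> {
  const res = await fetch(`${BASE_URL}/items`, {
    headers: { Authorization: `Bearer ${TOKEN}` },
  });
  if (!res.ok) throw new Error(`GET failed: ${res.status}`);
  return res.json();
}

// Push: send new or changed data back to the server.
async function pushItem(item: { id: string; value: number }): Promise<void> {
  const res = await fetch(`${BASE_URL}/items`, {
    method: "POST",
    headers: { "Content-Type": "application/json", Authorization: `Bearer ${TOKEN}` },
    body: JSON.stringify(item),
  });
  if (!res.ok) throw new Error(`POST failed: ${res.status}`);
}

// Example: read everything, then write one record back.
pullItems()
  .then((items) => console.log(`pulled ${items.length} items`))
  .then(() => pushItem({ id: "42", value: 3.14 }))
  .catch(console.error);
```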

Preloading a cache in Express

I'm working on a back-end / API built with Express for a dashboard built with Next/React.
Each distinct API call requires data to be fetched from an Elasticsearch cluster. All endpoints accept a date-range parameter. Requests are cached using node-cache, but if a request is made with a new date-range, it requires a new Elasticsearch query, which takes time. I know that for any given hour, all requests can ask for at most 5 different predefined date-ranges (the date-ranges run until the end of the current hour).
I'm interested in preloading data into the cache at the beginning of every hour so that when an API call is made with the predefined parameters, the data is already in the cache.
How can I do this?
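One way to do this, sketched below: run a small scheduled job inside the same Express process that recomputes the five ranges at the top of every hour and writes the results into node-cache before any request asks for them. Using `node-cron` for scheduling is an assumption, and `getPredefinedRanges` / `queryElasticsearch` are hypothetical stand-ins for your own range logic and Elasticsearch query.

```typescript
// Hedged sketch: warm the node-cache for the five predefined date-ranges at the
// start of every hour. getPredefinedRanges() and queryElasticsearch() are
// hypothetical placeholders for your own range logic and Elasticsearch query.
import NodeCache from "node-cache";
import cron from "node-cron";

const cache = new NodeCache({ stdTTL: 60 * 60 }); // entries expire after an hour

function getPredefinedRanges(): Array<{ from: string; to: string }> {
  // Hypothetical example ranges: the last 1, 6, 12, 24 and 48 hours, all ending
  // at the end of the current hour ("date-ranges go until the end of the hour").
  const end = new Date();
  end.setMinutes(60, 0, 0); // roll forward to the end of the current hour
  return [1, 6, 12, 24, 48].map((hours) => ({
    from: new Date(end.getTime() - hours * 60 * 60 * 1000).toISOString(),
    to: end.toISOString(),
  }));
}

async function queryElasticsearch(range: { from: string; to: string }): Promise<unknown> {
  // Hypothetical: your existing Elasticsearch query for one date-range goes here.
  return { range, hits: [] };
}

async function warmCache(): Promise<void> {
  for (const range of getPredefinedRanges()) {
    const key = `${range.from}:${range.to}`;      // same key the route handlers use
    cache.set(key, await queryElasticsearch(range));
  }
}

cron.schedule("0 * * * *", warmCache); // minute 0 of every hour
warmCache();                           // also warm once at startup

// Route handlers then call cache.get(key) and only fall back to a live ES query on a miss.
```

The cache key just has to match whatever key the route handlers already build from the date-range parameter, so a request for one of the five predefined ranges hits the pre-warmed entry instead of Elasticsearch.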

Improving mobile apps' client-server communication efficiency and data availability in offline mode

My question is about how to store data that was once received online so it can still be processed after the mobile device has gone offline and/or been restarted.
I'm using AngularJS with Ionic (PhoneGap) for building apps, but my question does not explicitly address these technologies.
Best practices, patterns or algorithms would be very helpful to me, as would useful articles or keywords.
1) The simplest challenge is to make my app more user-friendly by making its functionality usable not only while the device is online but also in offline mode. In my case this means I have to make the last-fetched online data available for later use (while the device is offline and also after restarting the device!).
2) A bit more difficult is reducing communication costs by only synchronizing the data that changed on the server side when the device reconnects to the internet.
3) Entities can also be created on the client side while the device is offline, and they must be synchronized to the server too. There is no risk of conflicts because users don't share entities with write access.
4) I use Google's and Apple's push services to inform the devices about newer entity versions, which should then be updated on the client side, so polling isn't needed.
Client-side technologies: JavaScript, AngularJS, Ionic, SQLite (WebSQL) or IndexedDB, PhoneGap (Cordova)
Server-side technologies: Java EE, JPA, MySQL
Data format and communication: JSON over REST/HTTP, Google's and Apple's push services for server-to-client messaging
1) Store the needed data inside a local SQLite database and pull it out when the app starts/resumes.
2) In the MySQL database you need a table that gets a new entry whenever you create, update or change content. You would store an id and a timestamp (and maybe a boolean flag for content that was deleted).
On the device, make a request to the server for the data from that table and compare it with the locally stored data. If there is a new id, or a timestamp has changed, make another request to pull the updated data.
3) Store data created offline locally, with a flag marking it as not yet synced with the server. When the device goes online again, check for unsynced flags and send the data to the server with an identifier so the server knows which device it came from and where to save it.
4) See 2). A client-side sketch of the sync flow from 2) and 3) follows below.
You could also run a scheduled job on the server (e.g. in Java) that checks every x minutes for updated entries and sends an automatic push notification. For that you would need two tables: one with the newest updates and one with the updates that have already been pulled by the device (just ids and timestamps, not all the data).
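Here is that sketch of the delta sync described in 2) and 3), written in TypeScript for the client. The routes (`/changes`, `/entities/:id`, `/entities`) and the local-storage helpers are assumptions, not part of the original answer; swap in your own REST endpoints and SQLite (WebSQL) / IndexedDB code.

```typescript
// Hedged sketch of the delta sync from points 2) and 3). All endpoints and the
// local-storage helpers are hypothetical placeholders.

interface ChangeEntry { id: string; updatedAt: string; deleted: boolean; }
interface Entity { id: string; [key: string]: unknown; }

// Minimal stand-ins so the sketch type-checks; replace with real local-storage code.
async function deleteLocally(id: string): Promise<void> {}
async function upsertLocally(entity: Entity): Promise<void> {}
async function getUnsyncedLocally(): Promise<Entity[]> { return []; }
async function markSynced(id: string): Promise<void> {}

async function syncOnReconnect(lastSync: string, deviceId: string): Promise<string> {
  // 1. Ask the server which entities changed since the last successful sync.
  const changes: ChangeEntry[] = await (
    await fetch(`/changes?since=${encodeURIComponent(lastSync)}`)
  ).json();

  for (const change of changes) {
    if (change.deleted) {
      await deleteLocally(change.id);
    } else {
      // Pull the updated entity and merge it into the local store.
      const entity: Entity = await (await fetch(`/entities/${change.id}`)).json();
      await upsertLocally(entity);
    }
  }

  // 2. Push entities that were created offline and are still flagged as unsynced.
  for (const entity of await getUnsyncedLocally()) {
    await fetch(`/entities`, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ ...entity, deviceId }), // deviceId tells the server where it came from
    });
    await markSynced(entity.id);
  }

  return new Date().toISOString(); // persist this as the new lastSync value
}
```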
I hope this was helpful, if something new comes to my mind I will update this answer.

Load data when a web service starts

I have a scenario where I have to load data from SQL Server when I start running a web service. Later I have to use this data in my application instead of accessing it from the database every time. In addition, this data should be refreshed every hour without affecting website operation on the back end. If any of you has come across such a scenario, please let me know the solution. By the way, I am using ASP.NET web services, a SQL Server database, and DNN for my front end. Thanks in advance.
In the Global.asax Application_Start event you can load all your data into a DataSet.
By using a SQL cache dependency, you can refresh the data every hour. But loading the entire data set is not advisable: it will fill up memory and degrade performance.
http://www.codeproject.com/Articles/14976/ASP-NET-Caching-Dependencies
Pre-loading all of your data is not good practice because the database then loses its purpose. It's probably fine for data that is updated very rarely but needed very frequently, but definitely not for all the data you have in the database.
As for the loading of data, you can use the app start event as others have already suggested.
Regarding caching: use the Application object to make this data available to all parts of the application, and add a property to it that keeps the time of the last update. Then create a separate service that checks the last update time every X minutes and refreshes the data when the time comes.
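The question is ASP.NET-specific, but the pattern in these answers (load once at startup, keep an application-wide object with a last-update time, refresh on a timer while requests keep reading the previous copy) is language-agnostic. A minimal sketch in TypeScript, purely for illustration; `loadFromDatabase` is a hypothetical stand-in for the SQL Server query:

```typescript
// Hedged sketch of the startup-load + periodic-refresh pattern described above,
// shown in TypeScript only to illustrate the idea (the original question is ASP.NET).

interface AppCache<T> { data: T; lastUpdate: Date; }

let cache: AppCache<unknown> | undefined;

async function loadFromDatabase(): Promise<unknown> {
  // Hypothetical: query only the rarely-changing data needed on every request.
  return {};
}

async function refresh(): Promise<void> {
  const data = await loadFromDatabase();
  // Swap in the new copy atomically; readers never see a half-loaded state.
  cache = { data, lastUpdate: new Date() };
}

// "Application start": do the first load, then refresh every hour in the background.
refresh().then(() => setInterval(() => refresh().catch(console.error), 60 * 60 * 1000));
```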

Multiple data sources: data storage and retrieval approaches

I am building a website (probably in Wordpress) which takes data from a number of different sources for display on various pages.
The sources:
A Twitter feed
A Flickr feed
A database on a remote server
A local database
From each source I will mainly retrieve
A short string, e.g. for Twitter the tweet text, and from the local database the title of a blog page
An associated image, if one exists
A link identifying the content at its source
My question is:
What is the best way to a) store the data and b) retrieve the data?
My thinking is:
i) Write a script that runs every 2 or so minutes on a cron job
ii) The script retrieves data from all sources and stores it in the local database
iii) Application code then retrieves all data from the one source, the local database
This should make the application code easier to manage, since it only ever draws data from one source, and that's the main appeal. But is it overkill for a relatively small site?
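A minimal sketch of the plan in i)-iii): a script invoked by cron that pulls from each source, normalizes everything to the same shape (string, optional image, link), and writes it into the local database. The fetchers and `saveItems` are hypothetical placeholders; the Twitter and Flickr calls in particular depend on whichever API and authentication you end up using.

```typescript
// Hedged sketch of the cron aggregation script. All fetchers and the local
// database write are hypothetical placeholders to be replaced with real code.

interface FeedItem {
  source: "twitter" | "flickr" | "remote-db" | "local-db";
  text: string;        // tweet text, photo title, blog post title, ...
  imageUrl?: string;   // associated image, if one exists
  link: string;        // link back to the content at its source
}

async function fetchTwitter(): Promise<FeedItem[]> { return []; }   // hypothetical
async function fetchFlickr(): Promise<FeedItem[]> { return []; }    // hypothetical
async function fetchRemoteDb(): Promise<FeedItem[]> { return []; }  // hypothetical
async function saveItems(items: FeedItem[]): Promise<void> {}       // hypothetical local DB write

// Entry point invoked by cron every couple of minutes.
async function aggregate(): Promise<void> {
  const items = (await Promise.all([fetchTwitter(), fetchFlickr(), fetchRemoteDb()])).flat();
  await saveItems(items); // application code then reads only from the local database
}

aggregate().catch(console.error);
```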
I would recommend handling the Twitter feed and Flickr feed in JavaScript on the client. Both Flickr and Twitter have REST APIs. By putting it on the client you free up resources on your server, create less complexity, your users won't be waiting around for your server to fetch the data, and you can let Twitter and Flickr cache the data for you.
This assumes you know JavaScript. Once you get past its quirks, it's not a bad language. Give jQuery a try; there is a jQuery Twitter plugin and the Flickery jQuery plugin. There are others, those are just the first results from Google.
As for your data on the local server and remote server, that will depend more on the data being fetched. I would go with whatever you can develop the fastest that gives acceptable results. If that means making a REST call from server to server, then go for it. If the remote server is slow to respond, I would go with the AJAX REST API method.
And for the local database, you are going to have to write server side code for that, so I would do that inside the Wordpress "framework".
Hope that helps.
