UWP: storing a music library scan in a database

I want to know how I can achieve the following functionality in a C#/XAML UWP app.
When the app runs for the first time, it scans the whole music library: folders, files, and so on.
Then it stores the results in some kind of database on the backend,
so that whenever you launch the app again it doesn't have to rescan every time; it just starts up and already has all the necessary data.
I am guessing that a SQLite database can be used in ApplicationData.LocalFolder,
but is there a better way to save the state and data of all the visual elements of all the pages of my app, so that every time the app launches it appears as if it was just minimized and then restored?
Thanks in advance.

Save the metadata of the songs (name, path, etc.) to a database table on the first launch. On subsequent launches, compare the song count returned by the music library API with the count in the database table. If they match, fetch from the database; otherwise fetch from the API again. It will be much faster on the second launch.
Do the first-launch work, that is, looping through the song collection returned by the API and saving it to the database, in an async Task so that it doesn't hang the UI.
No, you can't store a StorageFile in the database; there is no compatible type in SQLite. Instead, store the path of the song and use that path to play it rather than the StorageFile itself. MediaPlayer can also take the path as a Uri to play the song. Check that.
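Below is a minimal sketch of that flow, assuming the sqlite-net-pcl package for the database layer; the Song class and the method names are made up for illustration, not anything prescribed.

using System.IO;
using System.Linq;
using System.Threading.Tasks;
using SQLite;                                 // sqlite-net-pcl (assumed package)
using Windows.Media.Core;
using Windows.Media.Playback;
using Windows.Storage;
using Windows.Storage.Search;

public class Song                             // hypothetical table for the song metadata
{
    [PrimaryKey, AutoIncrement] public int Id { get; set; }
    public string Title { get; set; }
    public string Path { get; set; }
}

public static class MusicCache
{
    static readonly string DbPath =
        Path.Combine(ApplicationData.Current.LocalFolder.Path, "music.db");

    // First launch (or when the library count changed): scan the music library and cache the metadata.
    public static async Task RefreshAsync()
    {
        var options = new QueryOptions(CommonFileQuery.OrderByName, new[] { ".mp3", ".m4a", ".wma" })
        {
            FolderDepth = FolderDepth.Deep    // include subfolders
        };
        var files = await KnownFolders.MusicLibrary
                                      .CreateFileQueryWithOptions(options)
                                      .GetFilesAsync();

        using (var db = new SQLiteConnection(DbPath))
        {
            db.CreateTable<Song>();
            if (db.Table<Song>().Count() == files.Count)
                return;                       // counts match, reuse the cached data

            db.DeleteAll<Song>();
            db.InsertAll(files.Select(f => new Song { Title = f.DisplayName, Path = f.Path }));
        }
    }

    // Later launches: play a song using only the path stored in the database.
    public static async Task PlayAsync(Song song)
    {
        var file = await StorageFile.GetFileFromPathAsync(song.Path);   // needs the Music Library capability
        var player = new MediaPlayer { Source = MediaSource.CreateFromStorageFile(file) };
        player.Play();
    }
}

Awaiting RefreshAsync rather than blocking on it keeps the UI responsive during the first scan. For the "looks like it was only minimized" part of the question, one common approach is to save Frame.GetNavigationState() and any page data into ApplicationData.Current.LocalSettings when the app suspends and restore them at the next startup.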

Related

Does my mobile app need a database or should I save everything to files instead?

I have created a video editor using React Native. I need to choose a way to save all of the user's projects locally. Each project has a video file, thumbnails (images) and its current Redux state.
My first idea is to save everything in files using RNFS. Each project's folder would have a video file, a Thumbnails folder, and a state.txt file containing my application's current state (currently selected text size, background color, etc.).
Do I need a database like SQLite, or should I save everything in files? I know I'll have to use RNFS for binary data like videos and images. But what about the state.txt file? Is that a good idea? The idea of each user having their own local database just for that sounds strange to me.
Your Redux state just represents local state. Have a look at https://github.com/rt2zz/redux-persist
It lets you serialize the Redux state and save it using whatever storage engine you provide; in React Native apps you can use AsyncStorage.
You are right about using RNFS for the videos, etc. For the metadata in state.txt, in my opinion it would be better to go with async-storage or another offline storage option (like Realm).
It will be much faster to query it than to read from a file.

What is the most efficient way to automatically scrape new news articles from news sources?

I want to build a news aggregator application. My one problem is that I don't know how I should pick up new news articles from news webpages.
I wrote a scraper script in Python that, when run, takes all the news from the source (published on the day it runs) and saves it to a CSV file (I save: URL, Title, Date, Time, Image URL, Category, Content). When I run the script again, it checks the CSV file for URLs it has already processed so that it doesn't write duplicate content, only new content. At the end I want to write these results to my database.
But I have to run this script periodically (let's say every 10 minutes) to check whether new content has been published.
Is this the right way to accomplish this?
Is there a better way to listen to news sources that picks up new content as soon as it is published?
If this is the way to do it, how can I set the script to run periodically?
Greatly appreciate your help.
You might add to your question:
the website address
the Python code you've already written
My suggestion: get the most recent URLs from the DB (say 100-200; the number should be comparable to the number of URLs on the page being scraped) and check them against the URLs currently present on the web page. If new URLs appear, scrape them.

How to load part of an application before loading the rest of the application in AngularJS?

I could not think of a better title; please suggest one.
I am planning to work on a large web application. It will take time to load the full application before it starts functioning.
Suppose it's something like asana.com. If you have a link to a task and you open the link, it loads the application first and then shows the details of the task.
Note: I have added another example in Update 2.
I want to do just the opposite: if I open the link directly, it should show me the task details first and then load the whole application in the background.
What development strategy should I follow to implement such a feature? Will Angular be good for this? I have worked with Angular on small projects and am capable of thinking in Angular :)
I just want to be pointed in the right direction.
Update 1:
I am using Apache2/PHP5 on the backend as a REST API. I am thinking of switching to a Go HTTP server, but that does not matter in this context :)
Update 2:
I have not yet started working on the application, but I know it is going to be big and will take time to load. It will be a JavaScript application, and almost all communication with the server will go through APIs. The APIs will be fast and won't slow the application down. My main concern is the JavaScript library and the approach to take, given that I want to display the content of the page before the application is loaded and then load the application in the background.
As second example: https://chrome.google.com/webstore/detail/a-journey-through-middle/gjgkjeheegjnnmheaflhdocglkiegoni?utm_source=chrome-ntp-icon
If you open this link in Chrome, it loads the application and then loads the specific content in a popup. I want to load the content of the popup first and then load the application in the background. How should I write my application to achieve that?
My suggestion (and I say this as someone starting to do something similar, rather than having proven it successful) would be to make some level of the framework fairly static, so that users get an almost instant response when the site loads, and then start the Angular app manually with something like this:
angular.bootstrap(document.getElementById("container"), ["app"])
Ref for the API: https://docs.angularjs.org/api/ng/function/angular.bootstrap
Ref for a demonstration: https://egghead.io/lessons/angularjs-angular-bootstrap-app-init
My expectation is that you will then be able to:
Load your static elements quickly (these will just have placeholders for your content/material)
Fetch the data you want, in the order you want it to appear on the screen
Load any other parts of the app needed to decorate the UI or populate side items

What's the best file location to download content to?

I am building a WPF application, and while I'm not a newbie, I am not an expert either. My WPF application streams images from a website when connected to the internet. If a user selects an image, I would like to save it to the hard drive; the user can then reuse the image later when loading the application without an internet connection. I will also be querying a folder to see whether the image is already in the saved location.
My question is where the best location to save it is. I think it might be Documents and Settings and then Application Data, but I'm not too sure.
Thank you in advance.
For behind-the-scenes data I usually use the local application data folder. You can get the directory like this (in case you did not know):
Environment.GetFolderPath(Environment.SpecialFolder.LocalApplicationData)
I create a root folder in there for the application to store its data in; cleanup can then be done just by deleting that whole folder.
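A short sketch of that pattern, just to make it concrete; the "MyWpfApp" and "Cache" folder names and the CacheImageAsync helper are placeholders, not anything prescribed.

using System;
using System.IO;
using System.Net.Http;
using System.Threading.Tasks;

// Download an image once and keep it under %LocalAppData%\MyWpfApp\Cache for offline reuse.
static async Task<string> CacheImageAsync(string imageUrl, string fileName)
{
    string root = Path.Combine(
        Environment.GetFolderPath(Environment.SpecialFolder.LocalApplicationData),
        "MyWpfApp", "Cache");
    Directory.CreateDirectory(root);              // no-op if the folder already exists

    string target = Path.Combine(root, fileName);
    if (!File.Exists(target))                     // only hit the network when we don't have it yet
    {
        using (var http = new HttpClient())
        {
            File.WriteAllBytes(target, await http.GetByteArrayAsync(imageUrl));
        }
    }
    return target;
}

Cleaning up is then just Directory.Delete(root, true) on that root folder.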

Most efficient way to transfer images to a Silverlight client

I have an application that shows a screen of image thumbnails; each image is around 80 KB, and they are stored in a database. To keep response time reasonable, the application displays a placeholder image when it first starts and downloads the real images from the server later. I'm expecting to show around 40 images on screen at once, so that's my batch size. What's the best way to serve these images to the client? I've got two options in mind.
Create an ADO.NET Data Service that exposes the Images database table to the client. The client can asynchronously request the images, one at a time, and display them as they come back from the server. I've implemented this solution and it seems to work OK; the speed isn't great, and I feel like I could use the HTTP pipe better by requesting maybe three images at a time.
Create an HttpModule on the server that looks for requests like /Images/1.jpg, reads the database, and returns the requested data. On the client side I can have many Image objects whose Source points to the virtual URLs on the server. My theory is that by just giving Silverlight many URLs to deal with, it may be able to transfer the images more efficiently than my code in option 1.
Would either of these methods be more efficient or is there another technique for getting this done? Thanks!
I don't know if it's more efficient, but I've accomplished a very similar task using an HTTP handler (ashx). The handler pulls the image from the database based on a parameter in the URI (the image ID), and Silverlight fetches the images asynchronously when I set the Source property of an Image control to the handler's URI with the specific ID in the query string. The Image control, in turn, sits inside an ItemsControl, which lets me display multiple images.
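Roughly what that handler and the Silverlight side can look like; the connection string name, table schema, and handler file name below are assumptions for illustration only.

// Image.ashx on the server: returns the bytes for /Image.ashx?id=123 (schema names are assumed).
using System.Configuration;
using System.Data.SqlClient;
using System.Web;

public class ImageHandler : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        int id = int.Parse(context.Request.QueryString["id"]);

        using (var conn = new SqlConnection(
                   ConfigurationManager.ConnectionStrings["Images"].ConnectionString))
        using (var cmd = new SqlCommand("SELECT Data FROM Images WHERE Id = @id", conn))
        {
            cmd.Parameters.AddWithValue("@id", id);
            conn.Open();
            byte[] data = (byte[])cmd.ExecuteScalar();

            context.Response.ContentType = "image/jpeg";
            context.Response.OutputStream.Write(data, 0, data.Length);
        }
    }

    public bool IsReusable { get { return true; } }
}

// Silverlight client: each Image control just points at the handler and downloads asynchronously.
// image.Source = new BitmapImage(new Uri("http://myserver/Image.ashx?id=123"));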
We are doing something very similar, except we are just using an ASPX page to serve them up, with the image identifier as a query parameter. We also cache the images, and the ASPX page uses the cached value if it exists; if not, we pull the image from the data store, cache it, and send it down. It is working really well for us.
Have you looked at using Deep Zoom? It's very efficient about progressive image loading, and gives you a nicer user experience when the images are fully loaded.
Examples:
Hard Rock Memorabilia site
Deep Zoom Pix
