Store intermediate form data - database

In a web application, we often come across a form submission process that spans several pages: for example, the first page captures basic information, the next captures some other information, and so on. I have a scenario where I have 7 screens to capture all the details about a user, and the "Submit" button appears on the 7th screen.
Usually we store all the intermediate values in HttpSession, and when it's time to submit we retrieve all the values from the session and create an entry in the database.
With this approach, by the time the user completes all the form entries (i.e. from Page 1 to Page 7), everything resides in the session.
I would like to know: is there any alternative to HttpSession for storing the intermediate values?
I'm actually trying to find ways to make my HttpSession less bulky.

You can store just a reference in the session that maps to an entry in a cache, e.g. Memcached. Or, if it is important that you don't lose the data while the user walks through the steps, you can persist the data in a database and refer to it from the session via a key. Storing too much data in the session is often not the best choice, so I would keep only a reference there.
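Here is a minimal sketch of that pattern. The question is about Java's HttpSession, but the idea is language-agnostic; this TypeScript version uses an in-memory Map as a stand-in for the database or Memcached, and every name in it (WizardDraft, saveStep, finalSubmit) is made up for illustration.

import { randomUUID } from 'node:crypto';

interface WizardDraft {
  step: number;
  fields: Record<string, string>;
}

// Stand-in for the external store (a database table or Memcached).
const draftStore = new Map<string, WizardDraft>();

// The session carries only this small key, never the form data itself.
interface SessionLike {
  draftId?: string;
}

// Called on every "Next": merge this page's fields into the stored draft.
function saveStep(session: SessionLike, step: number, fields: Record<string, string>): void {
  const id = session.draftId ?? randomUUID();
  session.draftId = id;
  const draft = draftStore.get(id) ?? { step: 0, fields: {} };
  draftStore.set(id, { step, fields: { ...draft.fields, ...fields } });
}

// Called on the final "Submit" (page 7): fetch everything by key and clean up.
function finalSubmit(session: SessionLike): WizardDraft | undefined {
  if (session.draftId === undefined) return undefined;
  const draft = draftStore.get(session.draftId);
  draftStore.delete(session.draftId);
  delete session.draftId;
  return draft;
}

The session never grows beyond a single UUID, no matter how many screens the wizard has.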

You can try the caching technology of .NET; this might be useful instead of using the session for all the data, and you can use the session ID as the cache key.
A second option, I think, is configuring your session-state mode to use SQLServer mode for the storage.
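For the second option, a minimal web.config sketch; the connection string is a placeholder, and the session database has to be created first (the aspnet_regsql.exe tool with its -ssadd switch does this):

<configuration>
  <system.web>
    <!-- Session data is serialized to SQL Server instead of held in-process. -->
    <sessionState mode="SQLServer"
                  sqlConnectionString="Data Source=YOUR_SERVER;Integrated Security=SSPI"
                  timeout="20" />
  </system.web>
</configuration>

Keep in mind that everything placed in the session must be serializable for SQLServer mode to work.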

Related

How to deal with web forms with lots of data?

I am curious to know about real-world solutions for dealing with forms that have a large amount of data/fields, or a wizard-like interface (maybe using an AngularJS-style GUI framework), especially if we want to handle the scenario where data persistence on the back end fails.
My questions are:
Is the form data saved in the session in the interim (or maybe in the browser itself using JS libraries) until the user clicks the final "Save" button?
Or is the data saved to a back-end database each time the user traverses from one screen to another using the "Previous" or "Next" buttons?
What happens if the form data has to be sent to an external web service (instead of a database) and the call fails (due to a timeout or some other error)?
There is a strong chance that we will lose all the user-entered data, unless we save it in a local database and retry the web-service call later, as sketched below.
Do any caching frameworks have a role to play here (including any AngularJS caching frameworks)?
Thanks for sharing your knowledge.
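On the third question, a hedged sketch of the save-and-retry fallback it describes: queue the payload when the web-service call fails, and drain the queue later. The names are illustrative, and a real implementation would persist the queue in localStorage or IndexedDB rather than an in-memory array.

// Save-and-retry fallback: never drop user data because a web service hiccupped.
type Payload = Record<string, unknown>;

// Stand-in for a durable local store (localStorage, IndexedDB, a local DB...).
const pendingQueue: Payload[] = [];

async function submitOrQueue(serviceUrl: string, payload: Payload): Promise<void> {
  try {
    const res = await fetch(serviceUrl, {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify(payload),
    });
    if (!res.ok) throw new Error(`HTTP ${res.status}`);
  } catch {
    pendingQueue.push(payload); // keep the user's data and retry later
  }
}

// Drain the queue in order; stop at the first failure and wait for the next tick.
async function retryPending(serviceUrl: string): Promise<void> {
  while (pendingQueue.length > 0) {
    try {
      const res = await fetch(serviceUrl, {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify(pendingQueue[0]),
      });
      if (!res.ok) throw new Error(`HTTP ${res.status}`);
      pendingQueue.shift(); // delivered; drop it
    } catch {
      return; // still failing, try again later
    }
  }
}

Calling retryPending on a timer (or on the browser's online event) gives at-least-once delivery, at the cost of possible duplicates that the service must be able to tolerate.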

What is the best approach to working with data while using token-based authentication

I am building a sample application that lets users store comments.
I've created the registration and login process. When the user registers, his details are stored in a MySQL database and a token is returned to the browser. Now he can access the Profile page.
When an existing user logs in he is redirected to the profile page. The profile page is accessible only when a user registers or logs in.
After logging in, I want to show all his comments if he has already added them.
My frontend is in Angular and the backend uses Laravel. For authentication I use Satellizer.
I want to know the best approach for working with this data, considering the fact that the user will add and edit his comments. Should I use localStorage and store the data as key-value pairs, or should I create a JSON file that gets updated every time the user adds a comment or makes a change?
I also want to know the most efficient way to deal with data from the server, so that the application stays fast even when it scales to 10,000 users with a lot of data for each user.
Thanks
You should be updating it on the server when changes are made rather than relying only on localStorage. You can use localStorage as a cache, but only for immutable data; it shouldn't really be used for data that is going to change.
So in this case you'll be adding and updating new comments via your API (ideally a RESTful one!). Once you've made a change, you could store the comments locally and only update them when the user makes a new comment; however, you'll quickly run into issues where the data is stale on different clients (i.e. if you update the comments on a different computer, the other computer won't be aware).
Alternatively, you could cache the comments and then simply ping the server to find out whether new comments have been added. This could be a HEAD request, for example, that checks the last-modified date on your comments resource.
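A rough sketch of that HEAD-based freshness check, assuming the API sends a Last-Modified header on the comments resource (the URL and cache shape are made up for illustration):

// Cache comments locally and revalidate with a cheap HEAD request.
interface CommentCache {
  lastModified: string;
  comments: unknown[];
}

let cache: CommentCache | null = null;

async function getComments(commentsUrl: string): Promise<unknown[]> {
  if (cache !== null) {
    const head = await fetch(commentsUrl, { method: 'HEAD' });
    const lastModified = head.headers.get('Last-Modified');
    if (lastModified !== null && lastModified === cache.lastModified) {
      return cache.comments; // nothing changed on the server
    }
  }
  // Cache is empty or stale: download the full resource once.
  const res = await fetch(commentsUrl);
  cache = {
    lastModified: res.headers.get('Last-Modified') ?? '',
    comments: await res.json(),
  };
  return cache.comments;
}

The HEAD request costs a round trip but carries no payload, so the full download happens only when something actually changed.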
You can store comment data locally in the user's browser, but you have to manage it properly.
I don't know how much load your server will have, or whether the time invested now is worth it.
You can fetch comments and store them locally.
When the user adds a comment, you update the local copy and send a request to the server.
You need to track the request's response: if the request fails, notify the user and remove the comment from the local copy.
If the request was successful, you can continue on your way.
** Facebook uses this "success first" approach: the user performs an action and sees it happen instantly; in the background it can take a few seconds, and only if it fails do they notify you.
** Look at their commenting process: when you comment, it appears instantly, with no loading indicator, but in the background the real work happens.
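A minimal sketch of that optimistic, "success first" flow; the endpoint and the comments array are placeholders for whatever your Angular view layer actually renders:

// Optimistic "success first" commenting: render immediately, roll back on failure.
interface PendingComment {
  id: string;
  text: string;
  pending: boolean;
}

// Stand-in for the list the view layer renders.
const comments: PendingComment[] = [];

async function addComment(apiUrl: string, text: string): Promise<void> {
  // 1. Show the comment instantly with a temporary id.
  const local: PendingComment = { id: `tmp-${Date.now()}`, text, pending: true };
  comments.push(local);

  try {
    // 2. Persist in the background.
    const res = await fetch(apiUrl, {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ text }),
    });
    if (!res.ok) throw new Error(`HTTP ${res.status}`);
    const saved: { id: string } = await res.json();
    local.id = saved.id; // swap in the server-assigned id
    local.pending = false;
  } catch {
    // 3. Only on failure does the user hear about it.
    comments.splice(comments.indexOf(local), 1);
    // ...and surface a "your comment failed to post" notice here.
  }
}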

ASP.NET OutputCache audit page visits

I have an internal corporate web application with a page that hosts static content (things like announcements), and I decided to implement OutputCaching on this page to reduce the amount of processing and DB calls.
The only problem is that I also need to keep track of who has and hasn't viewed the content. Specifically, I need to be able to audit who has seen a specific announcement (the announcement is determined by the GUID id passed to the MVC endpoint).
Is there any way to log who has accessed a page that has been output-cached? Anything would be better than nothing, but if the log could be sent to a SQL database, that would be the best solution for me.
I would normally log using an ActionFilterAttribute, but according to the Stack Overflow article Working with the Output Cache and other Action Filters, that does not work well with the default output cache; instead, you could try an alternative called DonutOutputCache.

What's the main advantage of saving results in the CakePHP cache (i.e. tmp/cache)?

I've seen so many plugins that use the cache to store results, such as results from a third-party API, and then read the results from the cache instead of sending a request to the third-party servers again.
1.) But what about the case where the results coming from the server change from time to time?
2.) Suppose we save the number of login attempts made by a user into the cache, and we check this count to secure the user's account: if the user makes 5 wrong login attempts, his/her account is locked for 5 minutes, and the user is allowed to log in again only after 5 minutes.
A situation like this can happen: the user makes 3 login attempts from one machine, and the count is saved into the cache (the cache key being the username). He makes the remaining 2 login attempts from another system; the count reaches 5, and the message "you've made 5 incorrect login attempts, please try again after 5 minutes" is shown.
Instead, I could use the session here to get better results.
In that case, what's the importance of the cache here?
Please tell me. Thanks in advance.
1) You don't cache results if you rely on changes. There are different APIs; some APIs send an expiration date with the result that can be used to cache it. Sometimes you simply don't need to fetch new data with every request, so you cache it. Geolocation is a good example of something that can be cached nearly forever.
2) This doesn't make any sense: you can't use the session if you try to access an account from multiple devices, because each device will get a new session ID. But in this case I would not use a cache engine at all; I would write it to the users table, which needs to be queried in any case to get the login data. So it's just one more write per attempt.
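A sketch of that table-backed lockout check; the 5-attempt / 5-minute policy comes from the question, while the column and function names are made up for illustration:

// Lockout decision based on two columns in the users table, so the count
// is shared across every device the user logs in from.
interface UserRow {
  failedAttempts: number;    // e.g. a failed_attempts column
  lastFailedAt: Date | null; // e.g. a last_failed_at column
}

const MAX_ATTEMPTS = 5;
const LOCK_WINDOW_MS = 5 * 60 * 1000; // 5 minutes, per the question

function isLockedOut(user: UserRow, now: Date = new Date()): boolean {
  if (user.failedAttempts < MAX_ATTEMPTS || user.lastFailedAt === null) {
    return false;
  }
  // Locked only while the 5-minute window is still open.
  return now.getTime() - user.lastFailedAt.getTime() < LOCK_WINDOW_MS;
}

// On a wrong password: increment failedAttempts, set lastFailedAt, UPDATE the row.
// On a successful login: reset both columns.

Because the count lives in the users table, it is shared across every device the user tries from, which is exactly where a per-device session breaks down.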

Is it better to process auto-complete/suggestions on the client or server?

I am building a web app that will use auto-complete/suggestions for the end user as they type their information in. This will be specifically for entering Country, Province, and City information.
One option is to do a wildcard search on the database on each keystroke:
SELECT CityName
FROM City
WHERE CityName LIKE '%#CityName%'
The other option is to return a list of all cities for a given province to the client and have the client do the matching:
SELECT CityName
FROM City
WHERE ProvinceID = #ProvinceID
These would be returned to the client as a JSON string via an AJAX call to a web service. My thinking is that JavaScript could search a list of 100+ entries from JSON faster than the database could do a wildcard search, but I'd like the community's input.
In the past, I have used both techniques. If you are talking about 100 or so entries, and assuming each entry is very small, it will likely be faster to do the autocomplete filtering on the client side. That will give you better response time (although probably negligible) and will reduce the load on your server.
Google actually does a live search while the user is typing, and it seems to be pretty responsive from the user's point of view. This is an example where the query must be executed server-side because the dataset is far too large to transfer to the client.
One thing you might do is wait until the user has typed two keystrokes before fetching the list from the server, thus narrowing down the results initially. Of course, that adds complexity: you would then need to refresh the list if the user changes either of the first two keystrokes.
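A rough sketch of that hybrid approach: fetch the candidate list once the user has typed two characters, then filter on the client for each further keystroke. The endpoint and names are placeholders:

// Fetch candidates once the user has typed two characters,
// then match on the client for every further keystroke.
let cities: string[] = [];
let cachedPrefix = '';

async function onKeystroke(input: string): Promise<string[]> {
  if (input.length < 2) return [];
  const prefix = input.slice(0, 2).toLowerCase();

  // Refetch only when the first two characters change.
  if (prefix !== cachedPrefix) {
    const res = await fetch(`/api/cities?startsWith=${encodeURIComponent(prefix)}`);
    cities = await res.json();
    cachedPrefix = prefix;
  }

  // Filtering ~100 small entries client-side is effectively instant.
  const needle = input.toLowerCase();
  return cities.filter(c => c.toLowerCase().includes(needle));
}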
We implemented the same functionality using an AJAX auto-complete control. We wait until the user has typed three keystrokes before fetching the list from the server. We didn't write any code on the client side; we just assigned the AJAX control a web-service method that returns the list, and it started working.
In the end user's interest, it is always better to handle this client-side.
The Telerik AutoComplete control allows for both approaches.
Of course, under load, server-side autocomplete is likely to make the application crawl.
