How to synchronize DB reading and writing from a servlet - database

I have a Servlet that, when a request comes in, checks for the user id and, if the id is not there, creates a new user id in the database. But if I get multiple requests within a very short delay, all those requests tend to see that there is no user yet and create multiple users with the same name. I don't want to make the user id field unique to solve this problem. Besides the user id, I store some related data as well.
I need to know how to keep a DB locked until one Servlet request is finished processing.

You need to make your servlet code synchronized.
An easy way is to make your servlet implement SingleThreadModel.
http://www.javatpoint.com/SingleThreadModel-interface
But this is not a good approach, as your servlet will then handle only one thread/request at a time. The better solution is to synchronize only the part where you check for and generate the uid.
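A minimal sketch of that approach, using a hypothetical in-memory map in place of the real database table (the method and field names are illustrative, not from the original code): the synchronized block ensures only one request at a time can run the check-then-create step, so concurrent requests for the same name cannot both see "no user" and insert duplicates.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class UserService {
    // Hypothetical in-memory stand-in for the real database table.
    final Map<String, String> users = new ConcurrentHashMap<>();
    private final Object userCreationLock = new Object();

    // Check-then-create, guarded so concurrent requests cannot
    // both miss the lookup and create duplicate users.
    public String getOrCreateUser(String name) {
        synchronized (userCreationLock) {
            String existing = users.get(name);
            if (existing != null) {
                return existing;
            }
            String id = "uid-" + users.size();
            users.put(name, id);
            return id;
        }
    }

    public static void main(String[] args) throws InterruptedException {
        UserService service = new UserService();
        // Simulate several near-simultaneous requests for the same name.
        Thread[] threads = new Thread[5];
        for (int i = 0; i < threads.length; i++) {
            threads[i] = new Thread(() -> service.getOrCreateUser("alice"));
            threads[i].start();
        }
        for (Thread t : threads) {
            t.join();
        }
        System.out.println(service.users.size()); // only one user was created
    }
}
```

Note that `synchronized` only serializes requests within a single JVM; if the application runs on multiple servers, you would still need a database-level guard (a unique constraint or a transactional upsert).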

Related

REST-api structure when creating associated records during update

In my database, I have two tables, User and Location. Each user should have a location. If I have a user and want to update it, I use PUT on "users/:id", and could add locationId if I want to. What if location does not already exist? I am aware that I could first create a location and then update with the new locationId, but I would like to be able to do it in one request.
I have implemented a way to do this, where I can send a location object under the key "location" in the body when doing the request to "users/:id". This works, but I understand it means I am creating records in my database even though the request itself is a PUT request. Is this allowed in REST, or do I need to do two requests to follow REST best practices? Two requests seem pretty annoying for the client if the user has multiple associations, and would mean a lot of requests.
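For illustration, the kind of PUT body described above might look like the following (all field names here are hypothetical, not taken from the actual schema):

```json
{
  "name": "Jane Doe",
  "location": {
    "city": "Oslo",
    "country": "NO"
  }
}
```

The server would then upsert the nested location and associate it with the user in one request.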

How to store a Correlation ID (X-Correlation-ID) in the application that is responsible for generating it

I'm implementing a correlation ID within my applications and would like some feedback on the design for it. The primary concern is that the correlation ID should be available for all logs.
Let's say I have a web (front-end) application which serves pages to my users. It talks with two APIs which provide data. The APIs are not exposed to the user, so all requests 'begin' in the front-end app.
The APIs' job is simple: they consume the correlation ID provided by the front-end app in the X-Correlation-ID header and include it in any logs.
The front-end app has to generate the ID and add it to the headers of outgoing requests, but it must also consume the ID itself.
My question is this: How does the front-end app store the correlation ID?
My first thought was that it would modify the incoming request and add the header if it did not exist, however this would make the incoming request somewhat 'unreliable' as it is now modified.
Another thought is perhaps it is stored as some kind of application global that is cleared per request.
Correlation ids (ids attached to headers like Request-ID, X-Request-ID, X-Trace-ID, X-Correlation-ID) are typically issued per request.
You seem to want to store it locally on the client though. What you describe sounds more like a “session id” that gets reset when the client “restarts”. If that is the case, then you simply use local/session storage or cookies to store and clear it when needed.
Do keep in mind that first sentence above though. Correlation ids are typically used per request. What I usually do:
Generate an id on the client per request
Pass it to the API via one of the aforementioned headers
Whoever gets the request first (an API gateway, HAProxy, etc.) checks for the existence of the header and proxies it further downstream, and so does any service calling other services. This is usually provided as a shared service/tool to most services/teams so that they don't forget to do it.
Profit?
That’s what heroku does for example. Same for many other services / companies.
It goes without saying that you can combine the two ids, the "session" one you refer to plus the ones generated per request, to get a better view of what is going on in the logs.
As per my design, I'd suggest always intercepting (using middleware/filters on the backend) each request from the client at the first point of contact on the backend (load balancer / gateway / controller) and checking for the Trace-ID/Correlation-ID in the request header. If it is present, forward the request as is; if it is not present (because this is the first call from that new client), generate a random ID as the Trace-ID, attach it to the request header, and pass the request further down the application.
One more thing: when sending back the response, make sure to add the ID generated earlier to the response header as well, so that the client can receive this unique ID and save it in localStorage or a cookie for further calls. The client can then easily be traced using that trace ID.
As for the logs: since you now have that correlation ID/trace ID, you can log it anywhere you want and easily reconstruct the complete flow of a client's request when investigating issues.
Steps:
Check for the Correlation ID in the Request Header (at Backend)
If already present, allow the request to pass through to the required service.
If not present, generate a unique random ID and add it as a custom header to the request.
Before sending the response, add this custom header to the response headers for the client as well.
I hope that answers your query.
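The steps above can be sketched as a small helper. This is a minimal illustration assuming the headers are available as plain maps; in a real servlet filter you would read them off HttpServletRequest and write to HttpServletResponse instead.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.UUID;

public class CorrelationId {
    static final String HEADER = "X-Correlation-ID";

    // Returns the incoming correlation id if present, otherwise
    // generates a new one; either way, records it on the outgoing
    // response headers so the client can save and reuse it.
    public static String resolve(Map<String, String> requestHeaders,
                                 Map<String, String> responseHeaders) {
        String id = requestHeaders.get(HEADER);
        if (id == null || id.isEmpty()) {
            id = UUID.randomUUID().toString();
        }
        responseHeaders.put(HEADER, id);
        return id;
    }

    public static void main(String[] args) {
        Map<String, String> req = new HashMap<>();
        Map<String, String> resp = new HashMap<>();
        // First call from a new client: no header, so an id is generated.
        String generated = resolve(req, resp);
        System.out.println(resp.get(HEADER).equals(generated)); // true

        // A downstream call that already carries the header keeps its id.
        req.put(HEADER, "abc-123");
        System.out.println(resolve(req, new HashMap<>())); // abc-123
    }
}
```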

Creating a new user in ADAL TokenValidated results in duplicates

I have a multi-tenant REST app. When a new user first tries to access my application ( and assuming their admin has already granted the app permission for their directory ) I create a user row in my User table and store their name/email and other fields. I perform this in the TokenValidated event of JwtBearerEvents.
Unfortunately, I'm ending up with multiple user rows being inserted because of simultaneous (parallel) requests hitting my web API. I do a simple SQL query for the User by ObjectId, and then create one if necessary. This isn't threadsafe. I tried wrapping it in a SQL transaction, but the select isn't blocking, and I'm not sure EF Core lets me perform the kind of locking I'd need to block other selects from completing.
I'm basing my code off the TailSpin PnP, and they perform the same logic here as well. My guess is their site logic forces a single call to the Web API first, as part of the sign-in/login process, where the new user is created if they don't exist. In my flow, the REST API is hit right off the bat with multiple HTTP GETs, and I just have to validate the bearer token in the headers and let ADAL cache it.
Aside from changing my client logic, and forcing the first call to API to be a single HTTP GET, how else can I make this work in a REST world? I can't use SESSION logic to block other calls in the same session. I'm not sure how I can perform a lock across the whole server ( Which works only if there's one server ). I could use the DB layer to hold a write lock, but that seems dirty. Maybe there's a better place to put the Create new user logic? Is there some other way for me to safely perform a one time atomic operation?
Based on the description, it seems you create the user record (sign-up) when users call the REST API, after the token is validated.
To fix the duplicate-records issue, one possible way is to separate the sign-up process from token validation, as in the TailSpin PnP code sample. For example, we can customize the token handler to verify whether the user has signed up, and provide a UI for sign-up.
Another way is to insert the users sequentially by using a lock. For example, here is code for your reference:
private static readonly object userCreationLock = new object();

private Task tokenValidated(TokenValidatedContext context)
{
    lock (userCreationLock)
    {
        // query the db and, if the user is missing, insert it here
        // (note: this only serializes requests within a single server process)
    }
    return Task.CompletedTask;
}

What is the best approach to work with data while using token based authentication

I am building a sample application that lets users store comments.
I've created the registration and login process. When the user registers, his details are stored in a MySQL database and a token is returned to the browser. Now he can access the Profile page.
When an existing user logs in he is redirected to the profile page. The profile page is accessible only when a user registers or logs in.
After logging in, I want to show all his comments if he has already added them.
My frontend is in Angular and the backend uses Laravel. For authentication I use Satellizer.
I want to know the best approach when working with this data, considering the fact that the user will add and edit his comments. Should I use localStorage and store data as key-value pairs, or should I create a JSON file that gets updated every time the user adds a comment or makes a change?
I want to know the most efficient way to deal with data from the server, so that the application stays fast even when it scales to 10,000 users with a lot of data per user.
Thanks
You should be updating it on the server when changes are made rather than only relying on localstorage. You can use localstorage to cache, but it should only be for immutable data, it shouldn't really be used for data that is going to change.
So in this case you'll be adding and updating new comments via your API (ideally a RESTful one!). Once you've made a change, you could store the comments locally and only update them when the user makes a new comment, however you'll quickly run into issues where the data is invalid on different clients. (i.e. if you update the comments on a different computer, the other computer won't be aware).
Alternatively, you could cache the comments and then simply ping the server to find out if new comments have been added. This could use a HEAD request, for example, to check the Last-Modified date on your comments resource.
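A sketch of that freshness check, assuming you keep the Last-Modified value from the last full fetch and compare it with what the server reports on the HEAD request (HTTP dates use the RFC 1123 format, which java.time can parse directly):

```java
import java.time.ZonedDateTime;
import java.time.format.DateTimeFormatter;

public class CacheCheck {
    // Returns true when the server's Last-Modified header is newer
    // than our cached copy, i.e. the cached comments should be
    // refetched from the API.
    public static boolean isStale(String cachedLastModified,
                                  String serverLastModified) {
        DateTimeFormatter fmt = DateTimeFormatter.RFC_1123_DATE_TIME;
        ZonedDateTime cached = ZonedDateTime.parse(cachedLastModified, fmt);
        ZonedDateTime server = ZonedDateTime.parse(serverLastModified, fmt);
        return server.isAfter(cached);
    }

    public static void main(String[] args) {
        String cached = "Tue, 01 Aug 2017 10:00:00 GMT";
        String newer  = "Tue, 01 Aug 2017 12:00:00 GMT";
        System.out.println(isStale(cached, newer));  // true
        System.out.println(isStale(cached, cached)); // false
    }
}
```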
You can store comments data locally in the user's browser, but you have to manage it properly.
I don't know how much load your server will have, or whether the time invested now is worth it.
You can fetch comments and store them locally.
When the user adds a comment, you update locally and send a request to the server.
You need to track the response: if the request fails, notify the user and remove the comment locally;
if it succeeds, you can continue on your way.
** Facebook uses this "success first" approach:
the user performs an action and sees it happen instantly; in the background it can take a few seconds, and only if it fails do they notify you.
** Look at their commenting process: when you comment, it appears instantly, with no loading... but in the background the load happens.

Is it possible to update/delete User by externalId

We are trying to develop a SCIM enabled Provisioning system for provisioning data from an Enterprise Cloud Subscriber(ECS) to Salesforce(Cloud Service Provider-CSP). We are following SCIM 1.1 standard.
What are we able to do:
We are able to perform CRUD operations on User object using Salesforce auto-generated userId field
Exact Problem:
We are not able to update/delete User object using externalId provided by ECS.
We tried something as below... But it is not working; an Unknown_Exception is thrown...
XXX/my.salesforce.com/services/scim/v1/Users/701984?fields=externalId
Please note that it is not possible to store Salesforce userId in ECS's database due to some compliance reasons. So we have to completely depend upon externalId only.
Possible Workaround:
Step1: Read the userId based on externalId from Salesforce
Step2: Update the User object using the salesforce UserId obtained in Step1.
But this two step process would definitely degrade the performance.
Is there any way to update/delete the User by externalId
Could you please guide us on this..
Thanks so much....
I realize this is an old thread, but I wanted to note that you CAN update Users via REST using an external ID. The endpoint in the question above is incorrect. The following is how it should be set; send it as a PATCH request:
[instance]/services/data/v37.0/sobjects/user/[external_id__c]/[external id value]
Instance = your instance i.e. https://test.salesforce.com/
external_id__c = API name of your custom external Id field on User
external id value = whatever the value of the user's external Id
NOTES:
Salesforce responds with an HTTP 204 status code and no content in the body; this isn't usual for PATCH requests, but it is the 'success' response.
The external id field on User has to be a custom field; make sure it is set as UNIQUE.
Ensure the profile/permission set of the user making the call has the Manage Users permission and has access to the external id field.
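For illustration, building such a PATCH request with Java 11's HttpClient might look like the following. The instance URL, field name, token, and body here are placeholders; the request is only built, not sent.

```java
import java.net.URI;
import java.net.http.HttpRequest;

public class SalesforcePatch {
    // Builds (but does not send) a PATCH request against the
    // external-id endpoint described above. All concrete values
    // passed in are placeholders.
    public static HttpRequest buildUpdate(String instance,
                                          String externalIdField,
                                          String externalIdValue,
                                          String token,
                                          String jsonBody) {
        String url = instance + "/services/data/v37.0/sobjects/user/"
                + externalIdField + "/" + externalIdValue;
        return HttpRequest.newBuilder(URI.create(url))
                .header("Authorization", "Bearer " + token)
                .header("Content-Type", "application/json")
                .method("PATCH", HttpRequest.BodyPublishers.ofString(jsonBody))
                .build();
    }

    public static void main(String[] args) {
        HttpRequest req = buildUpdate(
                "https://test.salesforce.com",
                "external_id__c", "E-42", "dummy-token",
                "{\"Title\":\"Engineer\"}");
        System.out.println(req.method()); // PATCH
        System.out.println(req.uri());
    }
}
```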
It is a pretty common pattern for other applications, too, to search first and then perform an update on the returned object. Your workaround seems fine to me. What performance problem are you concerned about? Are you concerned about Salesforce not being able to process more requests, or about the higher response time in your application because you need to make multiple requests? Have you actually measured how much an extra call costs?
