Is it appropriate to hash everything related to finance in a DB? - database

I'm working on an app with financial information about users.
I'm wondering whether the more secure way is to hash (or rather encrypt) everything related to the user's finances in the database and just decrypt it in the application frontend, or whether it is better to anonymise everything related to the user.
Or are there other options I didn't think about?
I have looked into different solutions, but I want to know what the best practices are for securing financial data.
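For context, hashing is one-way, so anything you need to show back to the user (account numbers, balances) would have to be encrypted rather than hashed, with the key kept outside the database; hashing fits values you only ever compare (like passwords), and anonymisation fits data you analyse but never need to tie back to a person. A minimal sketch of field-level encryption in Python, assuming the cryptography package and a key supplied through an environment variable:

    # Minimal sketch: application-side field encryption with the
    # "cryptography" package. The key name and the field value are just
    # illustrations; manage the real key outside the database.
    import os
    from cryptography.fernet import Fernet

    fernet = Fernet(os.environ["FIELD_ENCRYPTION_KEY"])  # key created once via Fernet.generate_key()

    def encrypt_field(plaintext: str) -> bytes:
        # Encrypt a sensitive value before storing it in the DB.
        return fernet.encrypt(plaintext.encode("utf-8"))

    def decrypt_field(ciphertext: bytes) -> str:
        # Decrypt a value read back from the DB.
        return fernet.decrypt(ciphertext).decode("utf-8")

    token = encrypt_field("DE89 3704 0044 0532 0130 00")  # a made-up IBAN
    print(decrypt_field(token))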

Related

Securing user data on strapi or firebase

I'm planning to make an app where a user can add content (like tweets), and I'm just curious what the usual practices are for storing this kind of data. As an admin, I would prefer not to see what my users post, so should I encrypt it?
I'm also planning on using either Strapi or Firebase for my app.
Any references or articles to get me started and better understand how to handle data would be a huge help.

How do I authenticate users of a web-app to access GCP data relevant only to them?

I have spent 3 days researching this problem and cannot find a solution or similar use case that shows how to solve the problem, so any pointers would be greatly appreciated.
I am creating a web app that uses Google Cloud Storage and BigQuery. A user registers on the web app and can then upload data to Cloud Storage and BigQuery. Two users could be from the same company and should therefore be able to view the same data - i.e. Jack and Jill work for Company A, and if Jack uploads a massive dataset via this app, Jill should also be able to view it later.
Another scenario will be I have two completely separate clients with users using this web-app. If users from Company A upload data, users from Company B should not be able to view Company A's data, and vice versa. But users from the same company should be able to view the data within their company.
Currently, I have an app that works for a single company. This has a React front-end that uses Firebase for authentication. Once the user is logged in, they can use the app which sends off API calls to a Flask back-end that does some error checking and authentication checking and then fires off an API call to GCP. This uses a service account and the key is loaded as an environment variable in the environment in which the Flask app is running.
However, if Company B wants to use the app now, both Company A and Company B will be able to see each other's data and visualize it through the app. In addition, they will be sharing a project (I would like to change this, to allocate billing more easily by having each client in their own project).
I ultimately want to get this app onto Kubernetes and ensure that each company is independent of the others; however, I do not want to have separate URLs for every company using the app. Also, I want to abstract GCP away from the client. I would prefer to authenticate a user based on their login credentials and then give them access to their GCP project (via my front-end) accordingly.
I thought about perhaps having separate service keys for each client and storing the service key info in Firebase, while using the respective keys for API calls, but I'm not sure this is best practice. It is, however, the only strategy I can think of.
If anyone could provide some help or guidance it would be very much appreciated. This is my first GCP project and I have not been able to find any answers on GCP, SO, Google Groups, Slack or Medium.
Thanks,
TJ
First of all, welcome to GCP! It's an awesome platform, very powerful and flexible. But not magic.
Indeed, the use case that you describe is specific to your business logic. GCP provides tools for securing access for users and VMs (through service accounts), but not for your customers. Here you have to implement your own custom authorisation logic, with a database (I don't recommend BigQuery for a website, the latency is too high) to list the users, the companies they work for, the blobs of each company, and so on.
Nothing is magic, and your use case is specific to you.
If you want to discuss which components to use and how to start, no problem. Leave a comment.
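To make that concrete, here is a minimal sketch of the kind of check the Flask backend could do, assuming the Firebase ID token arrives as a Bearer header, company membership lives in your own database (get_company_for_user is a hypothetical helper) and each company's objects sit under a per-company prefix in a single bucket:

    # Sketch only: per-company authorisation in the Flask backend.
    from flask import Flask, request, jsonify, abort
    from firebase_admin import auth, initialize_app
    from google.cloud import storage

    initialize_app()              # picks up GOOGLE_APPLICATION_CREDENTIALS
    app = Flask(__name__)
    gcs = storage.Client()
    BUCKET = "my-app-uploads"     # placeholder bucket name

    def get_company_for_user(uid: str) -> str:
        # Hypothetical lookup in your own users/companies tables.
        raise NotImplementedError

    @app.route("/api/files")
    def list_company_files():
        header = request.headers.get("Authorization", "")
        if not header.startswith("Bearer "):
            abort(401)
        decoded = auth.verify_id_token(header.split(" ", 1)[1])
        company = get_company_for_user(decoded["uid"])
        # Only list objects under this company's prefix.
        blobs = gcs.list_blobs(BUCKET, prefix=f"{company}/")
        return jsonify([b.name for b in blobs])

The same lookup could later decide which GCP project or dataset the backend touches if you split clients into separate projects.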

exporting data for analytics use in SaaS

We are a SaaS product and we would like to be able to have per-user data exports that will be used with various analytical (BI) tools like Tableau or Power BI. Instead of just managing all those exports manually, we thought of using a cloud database such as AWS Redshift (which will be part of our service). But then it is not clear how a user will access those databases naturally, unless we do some kind of SSO integration with AWS.
So - what is the best practice for exporting data for analytics use in SaaS products?
In this case you can build your security into your backend API layer.
First you can set up processes to load your data into Redshift, then make sure that only your backend API server/cluster has access to Redshift (e.g. through a VPC with no external IP access to Redshift).
Now that you have your data, you can validate your user as usual through your backend service; then, when a user requests a download through the backend API, the backend can create a query that extracts from Redshift only the correct data, based upon the user's security role. In order to make this possible you may need to build some kind of security column into your Redshift data model.
I am assuming getting data into Redshift is not a problem.
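A rough sketch of the security-column idea, with a made-up tenant_id column and table name; since Redshift speaks the PostgreSQL wire protocol, psycopg2 is used here, and the tenant id is always bound as a parameter rather than interpolated:

    # Sketch only: the backend resolves the caller's tenant first and
    # never queries Redshift without that filter.
    import psycopg2

    def export_for_tenant(tenant_id: str, conn_params: dict) -> list:
        query = """
            SELECT order_id, order_date, amount
            FROM analytics.orders          -- made-up table
            WHERE tenant_id = %s           -- the "security column"
        """
        with psycopg2.connect(**conn_params) as conn:
            with conn.cursor() as cur:
                cur.execute(query, (tenant_id,))
                return cur.fetchall()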
What you are looking for, if I understand correctly, is an OEM solution.
The problem is how to mimic the security model you have in place for your SaaS offering.
That depends on how complex your security model is.
If it is as simple as authenticating the user, who then has access to all the tenant's data, or if the data can easily be filtered per user, things are simple for you. Trusted authentication will allow you to authenticate that user, and user filtering will allow you to show them only what they have access to.
But here is the kicker: if your security is really complex, then it can become really difficult to mimic it within these products.
For integrating Tableau, this link will help:
https://tableau.github.io/embedding-playbook/#
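Roughly, trusted authentication means your backend (whose address is registered as a trusted host on Tableau Server) swaps the already-authenticated username for a short-lived ticket and builds the embed URL from it; something along these lines, with placeholder server, workbook and view names:

    # Sketch of a Tableau trusted-ticket request; details are in the
    # embedding playbook linked above.
    import requests

    TABLEAU_SERVER = "https://tableau.example.com"   # placeholder

    def build_embed_url(username: str, workbook: str, view: str) -> str:
        resp = requests.post(f"{TABLEAU_SERVER}/trusted", data={"username": username})
        ticket = resp.text.strip()
        if ticket == "-1":
            raise PermissionError("Tableau Server refused to issue a ticket")
        return f"{TABLEAU_SERVER}/trusted/{ticket}/views/{workbook}/{view}"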
As for Power BI, I am not a fan of this product. I tried to embed a view in one of my applications and data refresh was a big issue.
It's almost like they want you to be an Azure shop for real-time reporting (I like GCP more).
If you create the APIs and populate the datasets yourself, they have crazy restrictions like 1 MB/sec, etc.
In other cases, datasets can only be refreshed 8 times.
I gave up on them.
Very recently I got a call from Sisense and they seemed promising as well from an OEM perspective. You might want to try them.

How to log in on multiple sites with one log in?

I've tried to Google this for a while now but haven't found anything yet, mostly because I'm not even sure what type of resources, or what in general, I should be looking for.
This question might be too broad, so if there's no good way to answer this, I'd be glad to be pointed to corresponding resources from where I could read and learn more about this.
My scenario is following:
I have two sites, Site A and Site B.
A user logs in on Site A. Site A redirects the login, so that the person is asked to log in on Site B.
If the login succeeds, the person is also logged in on Site A and some information from Site B is sent to Site A. Otherwise the user isn't logged in on either site.
Login methods could be custom or Google sign-in, for example. I know from several sites that this kind of scenario is possible, but I have no idea how it could be implemented.
These systems could physically be on different servers, and there isn't really a need for two-way communication, though that would be great to learn too.
I know the basics of single-site login, how to do that, how to authorize access, etc., but is there a way to authorize a user on some other site as well with that single authorization?
The technologies I've planned to use for this are Meteor, React and OAuth 2.0, but a general or different solution is appreciated as well if there's no way to implement this with those technologies.
Thanks.
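For what it's worth, the flow described above maps onto the OAuth 2.0 authorization code grant, with Site A acting as the client and Site B as the authorization server. A rough Flask sketch of the two redirects, with placeholder URLs, client id and secret (Meteor/React would perform the same steps with different plumbing):

    # Sketch only: Site A delegates login to Site B via OAuth 2.0.
    from urllib.parse import urlencode
    import secrets

    import requests
    from flask import Flask, redirect, request, session

    app = Flask(__name__)
    app.secret_key = "replace-me"

    SITE_B_AUTHORIZE = "https://site-b.example.com/oauth/authorize"  # placeholder
    SITE_B_TOKEN = "https://site-b.example.com/oauth/token"          # placeholder
    CLIENT_ID = "site-a-client-id"                                   # placeholder
    CLIENT_SECRET = "site-a-client-secret"                           # placeholder
    REDIRECT_URI = "https://site-a.example.com/callback"             # placeholder

    @app.route("/login")
    def login():
        # Step 1: send the user to Site B to log in there.
        session["state"] = secrets.token_urlsafe(16)
        params = urlencode({
            "response_type": "code",
            "client_id": CLIENT_ID,
            "redirect_uri": REDIRECT_URI,
            "state": session["state"],
        })
        return redirect(f"{SITE_B_AUTHORIZE}?{params}")

    @app.route("/callback")
    def callback():
        # Step 2: Site B redirects back with a code; swap it for a token.
        if request.args.get("state") != session.get("state"):
            return "State mismatch", 400
        token = requests.post(SITE_B_TOKEN, data={
            "grant_type": "authorization_code",
            "code": request.args["code"],
            "redirect_uri": REDIRECT_URI,
            "client_id": CLIENT_ID,
            "client_secret": CLIENT_SECRET,
        }).json()
        session["access_token"] = token["access_token"]
        return "Logged in on Site A via Site B"

The information Site B wants to hand back to Site A would then come either inside that token or from a userinfo-style endpoint Site A calls with it.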

Best way to store users to allow single sign-on for multiple asp.net mvc websites

I have multiple ASP.NET MVC websites hosted in subfolders of a main domain. Each website has its own SQL Server database. Currently users need to sign up to each individual website if they want access, but I am looking for an SSO solution. I guess it's a little bit like how eBay works, where you can sign up on one country's domain but can log into eBay from any of the other domains eBay has sites for.
I am looking for the best architectural design to achieve this. After a lot of googling this seems to be the only solution that fits the bill, but I wanted to check first (http://arunendapally.com/post/implementation-of-single-sign-on-(sso)-in-asp.net-mvc). If this is the right approach, how does that affect the database design? Would the users now be stored only in their own database, which all of the other websites have access to?
I think that maybe moving the authentication part of your application somewhere else would be a better option.
You could try the approach suggested in your link.
Another option would be to look at external providers.
I'm thinking here about Azure Active Directory (https://azure.microsoft.com/en-us/documentation/articles/active-directory-whatis/, with this example: http://www.asp.net/identity/overview/getting-started/developing-aspnet-apps-with-windows-azure-active-directory) or Auth0 (https://auth0.com/), for example.
That way you have a separate place for your user accounts, which live in a database in the cloud that you don't have to manage. Something like Azure Active Directory also allows you to set which users are allowed to access which app, which is something you need to take into account as well.
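To illustrate the single user store: each website only validates tokens issued by that one central provider, so user accounts exist in exactly one place. A rough Python sketch using PyJWT against a provider's JWKS endpoint (URLs and audience are placeholders; ASP.NET has equivalent JWT bearer middleware that does the same job):

    # Sketch only: validate a token from the central identity provider.
    import jwt  # the PyJWT package, with its crypto extra installed

    JWKS_URL = "https://login.example-idp.com/.well-known/jwks.json"  # placeholder
    AUDIENCE = "https://main-domain.example.com/api"                  # placeholder

    jwks_client = jwt.PyJWKClient(JWKS_URL)

    def validate_token(raw_token: str) -> dict:
        # Returns the verified claims, or raises if the token is invalid.
        signing_key = jwks_client.get_signing_key_from_jwt(raw_token)
        return jwt.decode(
            raw_token,
            signing_key.key,
            algorithms=["RS256"],
            audience=AUDIENCE,
        )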
