I know Snowflake Reader Accounts let users query databases shared with them, but can they connect the reader account to a third-party BI or data-analysis tool like Tableau, as you can with a full account?
Yes. A reader account is a "full account" that can only read, so it cannot create tables and so on. But most BI tools only read anyway.
Now, maybe you mean a more subtle question: some tooling understands its target DB and uses things like temp tables to do its work, which a reader account does not allow. So perhaps what you are really asking is:
Can someone confirm Tableau works against a reader account?
That I cannot directly answer; we have customers that use our reader accounts, but I have not seen Tableau usage specifically.
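For context, reader accounts are provisioned from the provider side; a minimal sketch in Snowflake SQL (the account and admin names here are hypothetical):

-- Run in the provider account: provisions a read-only
-- managed (reader) account for the data consumer.
CREATE MANAGED ACCOUNT tableau_reader
  ADMIN_NAME = 'reader_admin',
  ADMIN_PASSWORD = 'Str0ng-Passw0rd-123',
  TYPE = READER;

The consumer's BI tool then connects to the reader account's own URL with those credentials, much as it would to a full account.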
I am a former Lotus Notes/Domino developer, learning the Microsoft stack. I am building a web application with a SQL Server back end. I want to build a basic CRUD application where users register, then log in. They should be able to work with data they created or that has been shared with them. I am unsure how to achieve that result.
One of the cool features of the Domino database was the ability to add reader and/or author fields to each record, containing user IDs or roles, and the server would automatically filter out those records when a user accessed the database. So if the user opened a view or queried the database, the server would never let them see records they were not assigned access to.
Does SQL Server have similar functionality? I've seen some information on row-level security; is that the best way to secure data? If not, what is the best practice to secure data so users can only see their own data?
Thanks for any help you can offer.
Yes. SQL Server has Row-Level Security:
Row-Level Security enables you to use group membership or execution context to control access to rows in a database table.
Row-Level Security (RLS) simplifies the design and coding of security in your application. RLS helps you implement restrictions on data row access. For example, you can ensure that workers access only those data rows that are pertinent to their department. Another example is to restrict customers' data access to only the data relevant to their company.
The access restriction logic is located in the database tier rather than away from the data in another application tier. The database system applies the access restrictions every time that data access is attempted from any tier. This makes your security system more reliable and robust by reducing the surface area of your security system.
But it's more common to embed the authorization logic in a server-side application like a web server or web API. Row-Level Security is typically used where users connect directly to the database, as in reporting and analytics, or in client/server desktop applications.
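As a minimal sketch of RLS, pairing an inline table-valued predicate function with a security policy (the Security schema, dbo.Documents table, and OwnerUser column are hypothetical):

-- Assumes dbo.Documents has an OwnerUser sysname column
-- recording which database user owns each row.
CREATE SCHEMA Security;
GO

-- Predicate function: returns a row only when the current
-- database user matches the row's owner.
CREATE FUNCTION Security.fn_RowOwnerPredicate (@OwnerUser sysname)
RETURNS TABLE
WITH SCHEMABINDING
AS
RETURN SELECT 1 AS fn_result
       WHERE @OwnerUser = USER_NAME();
GO

-- Security policy: applies the predicate to every read
-- against dbo.Documents.
CREATE SECURITY POLICY Security.DocumentFilter
ADD FILTER PREDICATE Security.fn_RowOwnerPredicate(OwnerUser)
ON dbo.Documents
WITH (STATE = ON);

With the policy on, SELECT * FROM dbo.Documents silently returns only the caller's rows, which is quite close to the Notes reader-field behavior described above.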
Background
5-10 data sources
Various formats (csv, psv, xml)
Different update schedules (weekly, monthly, quarterly)
Requirements
Only interested in some of the fields from each data source
Want to build a model from the various sources, into a single database (SQL Server)
Current platform/skillset
Azure
SQL Server
Considerations
Minimal code. Hopefully I can do this all via a UI/drag-drop interface.
Automation. Hoping I can drop the files onto a server when the data needs to be updated, and then "things" kick off (an Azure Functions blob/FTP trigger?)
Questions
I haven't done much in the ETL space, but my initial thoughts point to something like SQL Server Integration Services, mainly because that's the only thing I have any experience with, ETL-wise.
Now that we have things like Azure Data Factory, SQL Data Warehouse, etc., would that be a better solution? Obviously the answer is "it depends", so what questions do I need to ask myself in order to clarify that? Can someone please point me to a good article to get started in this space?
TIA
The main question is where you want to stage the data.
Many people are talking about Azure Data Lake as a staging area. There are pros and cons to this solution.
The pros: Azure Active Directory can be federated with your on-premises forest. Once that is done, regular access control lists (ACLs) can be used to restrict access.
The cons: you are using premium storage (SSD), which can cost a lot of money for a small-to-medium-size company.
On the other hand, Azure Blob Storage has been around for a long time. One of the pros is the cost of this storage. A shared access signature (SAS) can be used to give anyone access to the account.
The con is that the SAS is the key to the whole kingdom. Unlike ADLS, you cannot assign privileges at the file level.
If you like SQL Server OPENROWSET or BULK INSERT, you are in for a treat. Support for those functions reading directly from Azure Blob Storage was added earlier this year.
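A rough sketch of that pattern (the storage account, container, credential, and table names are hypothetical, and the SAS token is elided):

-- Assumes a database master key already exists.
-- Credential wrapping a blob SAS token (secret elided here).
CREATE DATABASE SCOPED CREDENTIAL BlobSasCredential
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
     SECRET = '<sas-token>';

-- External data source pointing at the staging container.
CREATE EXTERNAL DATA SOURCE StagingBlobStorage
WITH ( TYPE = BLOB_STORAGE,
       LOCATION = 'https://mystorageacct.blob.core.windows.net/staging',
       CREDENTIAL = BlobSasCredential );

-- Load a pipe-separated file straight into a staging table.
BULK INSERT dbo.StagingSales
FROM 'sales_latest.psv'
WITH ( DATA_SOURCE = 'StagingBlobStorage',
       FIELDTERMINATOR = '|',
       ROWTERMINATOR = '\n',
       FIRSTROW = 2 );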
Check out my article on MS SQL TIPS for the details.
As for scheduling, you can use a very simple PowerShell script in Azure Automation to create a hands-off process.
Azure Data Factory might be able to do some of these tasks; however, you are adding a lot more complexity compared to a simple T-SQL statement that loads data into a table.
Last but not least, learn to love PowerShell. You can pretty much do any type of file processing with that language and the right .NET components.
Happy coding.
John Miner
The Crafty DBA
In my company we have a select list of companies that are using our in-house-built tool (e.g., Northwind).
When we make changes we deploy these to all our client locations.
The structure currently is: the application is installed at the client's location and the databases sit with them.
However, we would like to consolidate all this information into one database and clients will connect via web services for any data requests.
For example, we have deployed the Northwind app and DB to companies X, Y, and Z, and would like to create a single database to maintain all these companies' data.
We have reviewed one option, which is to add a company field to the various tables to associate rows with a company, and another option, which is to create a schema for each company so that we can allocate permissions to the relevant company. Is there an alternative to these, and what are the pros and cons of the ways we could do this?
One con of adding a company field is that we have to cater for the indexes being the same as in all the client databases, which makes things more difficult, and there is the performance of the app as a whole to consider, due to multiple requests hitting the same DB. Please help!
Note: using SQL Server 2008.
Research "multi-tenant database architecture". (For your purposes, think of one tenant as one client.) You'll find a spectrum from "one database per tenant" to "every tenant in every table".
Read carefully. Writers in this field can confuse you. Expect technical terms like shared schema to mean different things to different writers.
See this SO answer for tradeoffs.
For your first step, I wouldn't consider anything besides simply moving those client databases in-house. Just doing that is going to give you and your application programmers enough headaches. You don't need an architectural change on top of it.
That will also give you time for research and testing.
I am developing an inventory project to sell, using C# and MS Access. After I sell this product, the client (or anyone else) may open the database file and read it. I need to overcome this lack of security.
I googled it and found the password-protection solution, but there are a lot of tools out there that can breach it. Ultimately I need to close off all the ways to steal and view the data.
If you feel that the password protection feature of the Access Database Engine (ACE) does not offer a sufficient level of security against unauthorized "snooping" of the data then you really should consider using SQL Server instead of ACE as the back-end database.
I have a rich client program installed on users' PCs where I want to start storing some user-created data on SQL Azure/SQL Server. The potential anonymous-to-me users would key in their name, email address, and a password, which would get stored on SQL Azure/SQL Server. Then they would start generating their own data. I'm anticipating volumes of maybe 1000 users.
There are times when those users would like to run their own queries against their own data, but, obviously, I must ensure that they can never view other users' data.
I'm thinking the best way to ensure the security of the data is for each user to be issued their own SQL Azure account and password. I will set up a SQL Azure user with a long password, known only to me, which only has permission to execute several stored procedures; appropriate parameters will be passed to those SPs, which will create the SQL Server accounts and logins and add the users to a role which I have created.
Obviously someone running debugging tools could figure out that user name and password, but I'm thinking this isn't a big deal. If all that particular SQL Azure account can do is execute a few SPs, so what if a malicious individual starts doing that? I will only allow a very limited amount of data to be uploaded before I require payment.
The users can only insert records using stored procedures which use the following:
SELECT @uName = SYSTEM_USER
and only select appropriate parent records. All stored procedures which users can execute would have the above as required to ensure they can only work with their own records.
All views will have WHERE clauses embedded in them, such as:
WHERE tbLoginName = SYSTEM_USER.
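A minimal sketch of that pattern (the table, view, and procedure names are hypothetical; tbLoginName is from the question):

-- Each row records which login owns it.
CREATE TABLE dbo.UserNotes
(
    NoteId      int IDENTITY PRIMARY KEY,
    tbLoginName sysname NOT NULL DEFAULT SYSTEM_USER,
    NoteText    nvarchar(4000) NOT NULL
);
GO

-- Users query through a view that filters to their own rows.
CREATE VIEW dbo.vMyNotes
AS
SELECT NoteId, NoteText
FROM dbo.UserNotes
WHERE tbLoginName = SYSTEM_USER;
GO

-- Inserts go through a stored procedure that stamps the owner.
CREATE PROCEDURE dbo.usp_AddNote @NoteText nvarchar(4000)
AS
BEGIN
    DECLARE @uName sysname;
    SELECT @uName = SYSTEM_USER;
    INSERT INTO dbo.UserNotes (tbLoginName, NoteText)
    VALUES (@uName, @NoteText);
END;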
I'm new to SQL Server so I may be missing some fundamental concepts so I'd appreciate any and all comments.
The issue is, as pointed out on http://msdn.microsoft.com/en-us/library/ms189751.aspx:
In SQL Azure, only the server-level principal login (created by the provisioning process) or members of the loginmanager database role in the master database can create new logins.
Those accounts are also capable of altering and dropping logins. So if you embed those accounts in the client application, you're essentially granting every user permission to alter or drop other users' accounts. While an average user won't do that, a hacker will. So you cannot let a client application manage SQL Azure logins, unless only trusted users (such as your IT administrator) are permitted to use the app.
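To make the exposure concrete: once the embedded credentials are extracted, any login with those rights could run statements like these in the master database (the login names are hypothetical):

-- Hijack or delete other customers' logins:
ALTER LOGIN some_other_customer WITH PASSWORD = 'OwnedByAttacker!1';
DROP LOGIN yet_another_customer;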
Best Regards,
Ming Xu.
I would like to point out a potential issue in the approach you mentioned: your master SQL Azure account needs to have the privilege to create new accounts and grant them access to particular tables. This means your master account itself also needs access to all those tables. If you store the master account on the client side, a clever user will get access to all users' data.
From my experience, connecting to a database directly from a client-side application will almost always make your solution less secure. You can do that for testing purposes, but in a real-world solution I would suggest using a service. You can host the service in Windows Azure. The service will access the database, and the client application can only access the service. In this way, you can authenticate clients using any mechanism you like, such as ASP.NET membership.
Best Regards,
Ming Xu.
You are essentially creating a physical two-tier database connection, allowing a client application to connect directly to the database. This is a problem in many ways, including security and performance. From a security standpoint, since you don't control where your customers will connect from, you will need to keep your firewall rules wide open for anyone in the world to try to hack every customer uid/pwd. And instead of having only 1 user id to play with, hackers will have up to 1,000...
The second issue is performance. Your applications will be unable to leverage connection pooling, creating undue stress on your database server and possibly hitting throttling issues at some point. Using a web service, with ASP.NET membership to manage logins, and using a service account (i.e. the same uid/pwd) to get data will ensure you leverage connection pooling correctly, provided you keep the connection string the same for all your requests.
With a web service layer you also have a whole slew of new options at your fingertips that a two-tier architecture can't offer. This includes centralizing your business and data access logic, adding caching for improved performance, adding auditing in a centralized location, allowing you to make updates to parts of your application without redeploying anything at your customer locations, and so much more...
In the cloud, you are much better off leveraging web services.
My 2 cents.