I run a system based around an Azure SQL Database.
A few different team members need to have read access to this database to perform support and management tasks.
However, I am concerned that by having access to the database, one of them may - with the best of intentions - export the database and manage the backup carelessly, resulting in a data breach.
How can I get Azure to notify me if somebody backs up the database (or downloads more than X million rows, say)? These people need to have database access; I would just like to know if they use it in a way that could pose a security risk to the platform.
You can use Extended Events for this.
To set it up on Azure you can follow this tutorial.
For your case
You create a session
You select the rpc_completed event (docs) and click Configure
In the Global Fields tab you can select the fields you want to keep track of, e.g.: username, sql_text, session_id, database_name, client_*
In the Filter tab you can select a filter condition. In your case row_count would be appropriate.
If a malicious user is smart and retrieves small numbers of rows, paging through the results, this will go undetected. So a second filter could be queries without a WHERE clause, or a different approach depending on your case.
Once Extended Events are set up to write to blob storage, you would have a separate process (Azure Function, Runbook, ...) that inspects the results and alerts you.
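A minimal sketch of such a session, assuming the storage container already has a database-scoped credential and picking an arbitrary 1-million-row threshold (the session name, storage URL and threshold are all placeholders):

CREATE EVENT SESSION [audit_large_reads] ON DATABASE
ADD EVENT sqlserver.rpc_completed (
    ACTION (sqlserver.username, sqlserver.sql_text, sqlserver.session_id,
            sqlserver.database_name, sqlserver.client_app_name, sqlserver.client_hostname)
    -- only capture calls that returned more than ~1 million rows
    WHERE (row_count > 1000000)
)
ADD TARGET package0.event_file (
    -- hypothetical storage URL; requires a database-scoped credential for the container
    SET filename = 'https://mystorageaccount.blob.core.windows.net/xevents/audit_large_reads.xel'
);

ALTER EVENT SESSION [audit_large_reads] ON DATABASE STATE = START;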
Extended Events are mostly used for troubleshooting; they replace SQL Profiler. So enabling them on a production server may have a performance impact.
I'm working on a Microsoft BI project.
I am currently in the process of connecting my systems to SQL Server. I want to connect my Active Directory to a table in SQL Server and sync it to that one table once per hour. This means that every hour the Active Directory details will be updated in the table.
I realized that it is necessary to use SSIS to do this, so I would be happy for help connecting my AD to SQL Server with the help of SSIS.
There are two routes available to you to sync AD user accounts to a table. You can use an ADO.NET source in an SSIS Data Flow Task, or you can write custom .NET code as part of a Script Source. The right answer depends on your team's ability to maintain and troubleshoot a particular solution, as well as the size of your AD tree/forest. If you're a small shop (under a thousand users), anything is going to work. If you're a larger shop, then you need to worry about the query mechanism and the total rows returned, as there is an upper boundary on how many results can be returned in a single query. In that case, a script task likely makes more sense, as you can more easily write a query to pull all the accounts that start with A, B, etc. I've never worked with Hebrew, so I assume one could do a similar filter for aleph, bet, etc.
General steps
Identify your domain controller, as you need to know which server to ask for information. I do not know how to deal with Azure Active Directory requests as I believe it works a bit differently there, but I haven't had client work that needed it.
Create an ADO.NET Connection Manager. Use the ".Net Providers for OleDb\OLE DB Provider for Microsoft Directory Services" and point it at your DC.
Write a query to pull back the data you need. Based on the comment, it seems you want something like this:
SELECT
distinguishedName
, mail
, samaccountname
, mobile
, telephoneNumber
, objectSid
, userAccountControl
, title
, sn
FROM
'LDAP://DC=domain,DC=net'
WHERE
sAMAccountType = 805306368
ORDER BY
sAMAccountName ASC
Using that query, we'll add a Data Flow Task and, within it, add an ADO.NET Source. Configure it to use our ADO.NET Connection Manager and the above query (adjusting the LDAP line and any other fields you do/don't need).
Add an OLE DB Connection Manager to your package and point it to the database that will record the data.
Add an OLE DB Destination to the Data Flow and connect the output line from the ADO.NET Source to this destination. Pick the table in the drop-down list and, on the Columns tab, make sure you have all of your columns connected. You might run into issues where the data types don't match, so you'll need to figure out how to handle that: either change your table definition to match the source, or add Data Conversion/Derived Column components to the data flow to mangle the data into the correct shape.
You might be tempted to pull in group membership. Do not. Make that a separate task as a person might be a member of many groups (at one client, I am in 94 groups). Also, the MemberOf data type is a DistinguishedName, DN, which SSIS cannot handle. So, check your types before you add them into an AD query.
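As an aside, if you want to sanity-check the LDAP query outside SSIS first, you can run it through an ADSI linked server from T-SQL. A sketch, assuming sufficient permissions and the same hypothetical domain as above:

-- one-time setup: a linked server over the ADSI OLE DB provider
EXEC sp_addlinkedserver
    @server = N'ADSI',
    @srvproduct = N'Active Directory Service Interfaces',
    @provider = N'ADSDSOObject',
    @datasrc = N'adsdatasource';

-- the same LDAP query via OPENQUERY (note the doubled single quotes)
SELECT *
FROM OPENQUERY(ADSI, '
    SELECT distinguishedName, mail, sAMAccountName
    FROM ''LDAP://DC=domain,DC=net''
    WHERE sAMAccountType = 805306368');

Note this route is subject to the same AD result-size limits discussed above.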
References
ldap query to get disabled user records with whenchanged within 30 days
http://billfellows.blogspot.com/2011/04/active-directory-ssis-data-source.html
http://billfellows.blogspot.com/2013/11/biml-active-directory-ssis-data-source.html
Is there a particular part of the AD that you want? In any but the smallest corporations the AD tends to be huge. Making a SQL copy of an entire forest every hour is a very strange thing that may have many adverse effects on your AD, network, security and domain-wide performance.
If you are just looking to backup your AD, I believe that there are other options available, specific to the Windows AD (maybe even built-in, I'm not an AD expert).
If you really, truly want to do this here is a link to get you started: https://social.technet.microsoft.com/Forums/ie/en-US/79bb4879-4d82-4a41-81a4-c62afc6c4b1e/copy-all-ad-objects-to-sql-database?forum=winserverDS. You can find many more articles on this just by Googling "Copy AD to Sql".
However, heed the warnings well: the AD is effectively a multi-domain-wide distributed database; attempting to copy it into a centralized database like SQL Server every hour is contraindicated. You are really fighting against its design.
UPDATE Based on the Comments:
Basically you've got too much in one question here. SQL Server, SSIS and Active Directory (AD) are each huge subjects in and of themselves, and the first time you attempt to use all of them together you will run into many individual issues depending on your environment, experience and specific project goals. We cannot anticipate all of them in a single answer on this site.
You need to start using the information you have from the following links to begin to implement this yourself, and then ask specific questions as you run into problems along the way.
Here are the links that you can start with:
The link I provided above from MS: https://social.technet.microsoft.com/Forums/ie/en-US/79bb4879-4d82-4a41-81a4-c62afc6c4b1e/copy-all-ad-objects-to-sql-database?forum=winserverDS
The link that you provided in the comments that explains how to setup ADSI as a linked server and how to use T-SQL on it: https://yiengly.wordpress.com/2018/04/08/query-active-directory-in-sql-server-with-linked-server/
This one explains how to use AD from within an SSIS Data Flow task (but is limited to 1000 rows): https://dataqueen.unlimitedviz.com/2012/05/importing-data-from-active-directory-using-ssis/
This related one explains how to use AD within an SSIS Script task to get around the Data Flow task limits: https://dataqueen.unlimitedviz.com/2012/09/get-around-active-directory-paging-on-ssis-import/
As you work your way through this you may run into specific problems, which you can ask about at https://dba.stackexchange.com, which has more specific expertise with SQL Server and SSIS.
Based on your goals, I think that you will want to use a staging table approach. That is, use your AD/SQL query to import all of the AD user records into a new/empty temporary table that has the same column definition as your production table, then use a MERGE query to find and update the changed user records and insert the new user records (this is called a differential or Type II update).
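A minimal sketch of that staging-plus-merge step, with hypothetical table names keyed on objectSid (NULL handling omitted for brevity):

MERGE dbo.ADUsers AS tgt
USING dbo.ADUsers_Staging AS src
    ON tgt.objectSid = src.objectSid
WHEN MATCHED AND (tgt.mail <> src.mail OR tgt.telephoneNumber <> src.telephoneNumber)
    THEN UPDATE SET tgt.mail = src.mail,
                    tgt.telephoneNumber = src.telephoneNumber
WHEN NOT MATCHED BY TARGET
    THEN INSERT (objectSid, mail, telephoneNumber)
         VALUES (src.objectSid, src.mail, src.telephoneNumber);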
We have a SQL Server 2008 R2 database whose tables are used by stored procedures themselves called by dedicated application code (VBA).
Until now all the final users were accessing the same data but for regulatory compliance they will be split into 2 legal entities and we'll have to ensure each user only accesses his entity's records.
Implementing this restriction at the application level is quite simple but not safe (AFAIK any XLA is easily broken).
So to be safe we must implement it at the database level.
My first idea was to simply change the stored procedures to join on the current caller's entity to transparently filter the records retrieved by the SQL queries.
Unfortunately the access is made via a generic SQL Server user, and, from what I've seen on SO and elsewhere, although we are on a full Microsoft infrastructure, there is no way to get the Windows user name.
And indeed all the functions I have tested return the SQL Server user name:
SELECT SUSER_NAME();
SELECT ORIGINAL_LOGIN();
EXEC sp_who
EXEC sp_who2
So, unless I've missed something, we'll have to switch the authentication mode to Windows.
Then, either join as described above, or:
create 2 database roles, one per legal entity, and manually assign users to each one,
create views dedicated to each legal entity, and restrict their access with the roles.
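For illustration, the second option would look something like this (entity, table and role names simplified):

CREATE ROLE EntityA_Users;
GO
CREATE VIEW dbo.vOrders_EntityA
AS
    SELECT OrderId, Amount
    FROM dbo.Orders
    WHERE LegalEntity = 'A';
GO
GRANT SELECT ON dbo.vOrders_EntityA TO EntityA_Users;
-- no grant on dbo.Orders itself, so access only goes through the view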
Is there any other option?
My client has been using Microsoft Access 2010 for quite a while and they received some Security Audit Requirements. They are using a Linked Tables approach connecting to Microsoft SQL Server 2012 Express.
The requirements state that all actions against the data must be logged (INSERT, UPDATE, DELETE and SELECT statements).
For the INSERT, UPDATE, DELETE statements I could create a trigger which would log the changes.
The issue is around the audit of SELECT statements. If the data was read-only, I could have used a Stored Procedure which would have logged the query. But executing a Stored Proc makes a Recordset not updatable.
Does anyone have an idea how to approach this challenge?
I'm open to a lot of strategies... (Connecting Access to SQL through a web service, anything...)
It's important to note that my client does not have $30k to spend on an Enterprise edition of SQL Server, as they are a small business with fewer than 10 employees.
SELECT statements are part of the database-level audit action groups in SQL Server. (Search that page for "database-level audit actions".) But that level of auditing requires SQL Server Enterprise edition.
Theoretically, you can limit all access to use only stored procedures regardless of whether the data is read-only. Write the stored procedure to write auditing information to the log first, then do whatever else needs to be done--SELECT, INSERT, etc.
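A minimal sketch of such a procedure, with hypothetical table and column names:

CREATE PROCEDURE dbo.GetOrders
    @CustomerId int
AS
BEGIN
    -- log the read before returning any data
    INSERT INTO dbo.AuditLog (EventTime, LoginName, Action, Detail)
    VALUES (GETDATE(), ORIGINAL_LOGIN(), 'SELECT',
            'dbo.GetOrders CustomerId=' + CAST(@CustomerId AS varchar(20)));

    SELECT OrderId, OrderDate, Amount
    FROM dbo.Orders
    WHERE CustomerId = @CustomerId;
END;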
Practically, you might not be able to do that. It depends on the applications that hit your database. Limiting all access to use only stored procedures can break applications that expect other things. (How would a Ruby on Rails application respond if you switched to just stored procedures?)
A bulletproof audit system that makes your database unusable isn't very good; it's simpler and cheaper to just shut down the database server altogether.
You could upgrade to a SQL Server edition that supports SQL Server Profiler.
The other option is to use a third-party auditing tool, such as SQL Audit.
You could turn on JET showplan. This would log all queries used by Access.
http://www.techrepublic.com/article/use-microsoft-jets-showplan-to-write-more-efficient-queries/?siu-container
As I pointed out in the comments, you are really fooling the audit requirements UNLESS each form is opened using a WHERE clause that limits the viewing of data in that form to the ONE record. If you don't do this, then a form opened against a linked table could have thousands of records, and a user hitting Ctrl-F to find and jump to one record means the SELECT statement tells you ZERO about what the user actually looked at. So while you can turn on showplan, the audit concept would not tell you anything about what the user actually looked at unless application design changes are made to restrict forms to one record. And to be fair, 99% of my applications do in fact open and restrict the main editing form to the one record via a WHERE clause.
So while you can, technology-wise, log all SELECT commands as per the above, it is not really in the spirit of such a log, since the log would not be of any use in determining which records the user actually looked at.
Is there any handy tool that can make updating tables easier? Usually I get an Excel file with the original value in one column and the new value in another column. Then I write a formula in Excel to create the UPDATE statement. Is there any way to simplify the updating task?
I believe the approach in SQL Server 2000 and 2005 would be different, so could we discuss them both? Thanks.
In addition, these updates are usually requested by "non-programmers" (meaning they don't understand SQL, so it may not be feasible to let them write queries). Is there any tool that can let them update the table directly without having DBAs do this task? Also, that tool needs to limit the privilege to modifying only certain tables, and ideally has a way to roll back changes.
Create a DTS package that will import a csv file, make the updates and then archives the file. The user can drop the file in a specific folder designated for the task or this can be done by an ops person. Schedule the DTS to run every hour, day, etc.
In case your users would insist that they keep using Excel, you've got several different possibilities of getting the data transferred to SQL Server. My preferred one would be to use DTS/SSIS, as mentioned by buckbova.
However, another method is by using OPENROWSET(), which makes it possible to query your Excel file as if it was a table. I wrote a small article about it here: http://blog.hoegaerden.be/2010/03/29/retrieving-data-from-excel/
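For example, on the older versions mentioned in the question the Jet provider does the job (the file path and sheet name are assumptions, and 'Ad Hoc Distributed Queries' must be enabled on the server):

SELECT *
FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0',
                'Excel 8.0;Database=C:\Data\updates.xls;HDR=YES',
                'SELECT * FROM [Sheet1$]');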
Another approach that hasn't been mentioned yet (I'm not a big fan of letting regular users edit data directly in the DB): any possibility of creating a small custom application for them?
There you go, a couple more possible solutions :-)
Valentino.
I think the best approach is to expose a view on your data, accessible to users who are allowed to do updates, and set up triggers on the view to perform the actual updates on the underlying data. Restrict changes to only the columns they should be changing.
This technique can work on SQL Server 2000 and 2005.
I would add audit triggers on the underlying tables so you can always track changes.
You'll have complete control, and they can connect to it with Access or whatever and perform their maintenance.
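A rough sketch of the view-plus-trigger idea, with hypothetical names:

CREATE VIEW dbo.vCustomerContact
AS
    SELECT CustomerId, Email, Phone
    FROM dbo.Customer;
GO
CREATE TRIGGER dbo.tr_vCustomerContact_Upd
ON dbo.vCustomerContact
INSTEAD OF UPDATE
AS
BEGIN
    -- apply only the permitted column changes to the base table
    UPDATE c
    SET c.Email = i.Email,
        c.Phone = i.Phone
    FROM dbo.Customer AS c
    JOIN inserted AS i ON i.CustomerId = c.CustomerId;
END;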
You could create some accounts in SQL Server for these users and limit their access to only certain tables and columns, along with only SELECT/UPDATE/INSERT privileges. Then you could create an Access database with linked tables to these.
We have an application that has 1000+ databases and 600+ sprocs. Each database represents a different client.
Problem: We need to move this to a single database while creating as little effect on the UI as possible, meaning don't change all the sproc signatures at one time.
The connection string currently sets the database attribute; a proposal is to move that to the user attribute. This attribute (via SYSTEM_USER) could be used to determine the site identifier, which would be used in the WHERE clause.
The above would not be the final solution, but it allows us to change the sproc signatures at a slow, controlled pace. Once all are done we can correct the connection string and get some connection pooling.
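To illustrate the interim idea, each sproc would start with a lookup like this (the mapping table is hypothetical):

DECLARE @SiteId int;
SELECT @SiteId = SiteId
FROM dbo.SiteLogin
WHERE LoginName = SYSTEM_USER;

SELECT OrderId, Amount
FROM dbo.Orders
WHERE SiteId = @SiteId;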
Are there any limitations to the number of logins/users that we can have on SQL Server 2005/2008? Or has anyone been down this path who could shed some light on a better option?
See my answer here
Ideas for Combining Thousand Databases into One Database
Sounds like you two are working on the same project. You will need to change every proc before you can move to one database, or each client will see the others' data.
As for the number of logins on SQL Server 2005 / 08 - I don't think anyone has ever run into a hard limit here. A few thousand will NOT be any problem at all.
What you could consider for this scenario might be one schema inside your single DB per customer, e.g. customer "Miller" has a "miller" schema, with its objects inside, and customer "Brown" will have a "brown" schema.
And contrary to what HLGEM just responded - no, customers won't see each other's data if you specify proper permissions; restrict each customer (and its users) to its own schema only and it should work just fine.
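For example, for the hypothetical Miller customer (login and user names are made up):

CREATE SCHEMA miller AUTHORIZATION dbo;
GO
CREATE USER MillerUser FOR LOGIN MillerLogin
    WITH DEFAULT_SCHEMA = miller;
GRANT SELECT, INSERT, UPDATE, DELETE ON SCHEMA::miller TO MillerUser;
-- nothing is granted on the brown schema, so Miller cannot see Brown's data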
Marc
You might also consider setting a distinctive application name in the connection string rather than using a distinctive user, which you can get into your where clause using APP_NAME(). I'm sure that SQL Server won't have a problem with thousands of logins, but you may prefer not to have to create them.
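A sketch of that approach, assuming a made-up naming convention like "Application Name=Site0042" in the connection string:

-- extract the numeric site identifier from the application name
DECLARE @SiteId int;
SET @SiteId = CAST(SUBSTRING(APP_NAME(), 5, 10) AS int);

SELECT OrderId, Amount
FROM dbo.Orders
WHERE SiteId = @SiteId;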