We're running SQL Server 2012 / .Net Framework 4.5.1
We have an application that does the following:
1. Extract all table data from the source database using an instance of .Net's SqlBulkCopy.
2. Delete all data in the target database using regular SQL statements.
3. Deploy the data from the source database to the target database using an instance of .Net's SqlBulkCopy.
The third step is successful when the SQL connection uses my Active Directory account, but fails with the following error message when using a SQL Server account created for this purpose: Cannot find the object "[SchemaName].[TableName]" because it does not exist or you do not have permissions.
Interestingly, the process runs through about a dozen tables before hitting one that causes this error. Manual verification proves that a) the table exists on the target, b) the problem user can select from the table, and c) the problem user can manually insert into the table with the standard INSERT INTO [SchemaName].[TableName] ([Columns]) VALUES ([Values]) format. BCP also works for that user; it's only SqlBulkCopy from a .Net application that fails for the same user.
Our DBA (a pretty seasoned guy, as far as I can tell) says that the database permissions on the target database are IDENTICAL between the two users, but reality would seem to suggest otherwise.
Googling the problem shows that the user should have the db_owner or db_ddladmin roles. The user actually belongs to both.
Anyway, solving the local problem is of secondary concern, since I can get done what I need done with my AD account. What I'd really like to know is whether there is a baked-in way to compare the differences in permissions between two users. If not, can this be done with a T-SQL query of some kind?
Thanks, guys and gals!
Here's my permissions script that I use. It's generally the approach that everyone uses, unless they have a schema compare product via Visual Studio, Red Gate, etc. http://www.csvreader.com/posts/permissions_list.php
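If you don't have a script handy, a minimal comparison can be built straight from the catalog views. This is only a sketch: 'UserA' and 'UserB' are placeholder names, it must be run in the target database, and it only covers explicit grants plus role memberships (not permissions inherited through ownership chaining):

-- Explicit object/database permissions for the two users, side by side.
SELECT
    pr.name AS principal_name,
    pe.class_desc,
    OBJECT_SCHEMA_NAME(pe.major_id) AS schema_name,
    OBJECT_NAME(pe.major_id) AS object_name,
    pe.permission_name,
    pe.state_desc
FROM sys.database_permissions AS pe
JOIN sys.database_principals AS pr
    ON pe.grantee_principal_id = pr.principal_id
WHERE pr.name IN (N'UserA', N'UserB')
ORDER BY schema_name, object_name, pe.permission_name, pr.name;

-- Role memberships for the same two users.
SELECT pr.name AS member_name, r.name AS role_name
FROM sys.database_role_members AS rm
JOIN sys.database_principals AS r
    ON rm.role_principal_id = r.principal_id
JOIN sys.database_principals AS pr
    ON rm.member_principal_id = pr.principal_id
WHERE pr.name IN (N'UserA', N'UserB')
ORDER BY pr.name, r.name;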
Are you specifying the schema on the destination table with SqlBulkCopy? Is it possible that you're running into a user-owned schema?
It's also been my experience that SqlBulkCopy only requires SELECT and INSERT on the destination table. BCP requires the escalated permissions that you described, which is another benefit of SqlBulkCopy.
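For what it's worth, the minimal grants that implies would look something like this (the user and table names are placeholders):

-- Minimal permissions for a plain SqlBulkCopy load into one table.
GRANT SELECT, INSERT ON [SchemaName].[TableName] TO [BulkLoadUser];

-- If the load uses SqlBulkCopyOptions.KeepIdentity, the identity insert
-- it performs also needs ALTER on the table:
-- GRANT ALTER ON [SchemaName].[TableName] TO [BulkLoadUser];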
I'm working on a Microsoft BI project.
I am currently connecting my systems to SQL Server. I want to pull my Active Directory details into a table in SQL Server and sync that one table every hour, so that every hour the Active Directory details in the table are refreshed.
I gather that it is necessary to use SSIS to do this. I would be happy for help connecting my AD to SQL Server with SSIS.
There are two routes available to you to sync AD user accounts to a table: you can use an ADO source in an SSIS Data Flow Task, or you can write custom .NET code as part of a Script Source. The right answer depends on your team's ability to maintain and troubleshoot a particular solution, as well as the size of your AD tree/forest. If you're a small shop (under a thousand users), anything is going to work. If you're a larger shop, then you need to worry about the query mechanism and the total rows returned, as there is an upper bound on how many results can be returned in a single query. In that case, a Script Source likely makes more sense, as you can more easily write a query to pull all the accounts that start with A, B, etc. I've never worked with Hebrew, so I assume one could do a similar filter for aleph, bet, etc.
General steps
Identify your domain controller, as you need to know which server to ask for information. I do not know how to deal with Azure Active Directory requests; I believe it works a bit differently there, but I haven't had client work that needed it.
Create an ADO.NET Connection Manager. Use the ".Net Providers for OleDb\OLE DB Provider for Microsoft Directory Services" and point it at your DC.
Write a query to pull back the data you need. Based on the comment, it seems you want something like this:
SELECT
distinguishedName
, mail
, samaccountname
, mobile
, telephoneNumber
, objectSid
, userAccountControl
, title
, sn
FROM
'LDAP://DC=domain,DC=net'
WHERE
sAMAccountType = 805306368
ORDER BY
sAMAccountName ASC
Using that query, we'll add a Data Flow Task and within it, add an ADO.NET Source. Configure it to use our ADO.NET Connection manager and use the above query (adjusting for the LDAP line and any other fields you do/don't need)
Add an OLE DB Connection Manager to your package and point it to the database that will record the data.
Add an OLE DB Destination to the Data Flow and connect the output from the ADO.NET Source to this destination. Pick the table in the drop-down list and, on the Columns tab, make sure you have all of your columns connected. You might run into issues where the data types don't match, so you'll need to figure out how to handle that: either change your table definition to match the source, or add Data Conversion/Derived Column components to the data flow to mangle the data into the correct shape.
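As a rough starting point, a landing table along these lines should line up with the query above; the column sizes are guesses on my part, so verify them against what your directory actually holds:

-- Sketch of a landing table for the AD attributes queried above.
-- Column sizes are assumptions; adjust to your directory's data.
CREATE TABLE dbo.ActiveDirectoryUsers
(
    distinguishedName  nvarchar(400) NOT NULL,
    mail               nvarchar(256) NULL,
    samaccountname     nvarchar(256) NULL,
    mobile             nvarchar(64)  NULL,
    telephoneNumber    nvarchar(64)  NULL,
    objectSid          varbinary(85) NULL,
    userAccountControl int           NULL,
    title              nvarchar(128) NULL,
    sn                 nvarchar(128) NULL
);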
You might be tempted to pull in group membership. Do not. Make that a separate task, as a person might be a member of many groups (at one client, I am in 94 groups). Also, the MemberOf attribute is a DistinguishedName (DN), which SSIS cannot handle. So check your types before you add them to an AD query.
References
ldap query to get disabled user records with whenchanged within 30 days
http://billfellows.blogspot.com/2011/04/active-directory-ssis-data-source.html
http://billfellows.blogspot.com/2013/11/biml-active-directory-ssis-data-source.html
Is there a particular part of AD that you want? In any but the smallest corporations, AD tends to be huge. Making a SQL copy of an entire forest every hour is a very strange thing that may have many adverse effects on your AD, network, security, and domain-wide performance.
If you are just looking to back up your AD, I believe there are other options available, specific to Windows AD (maybe even built in; I'm not an AD expert).
If you really, truly want to do this here is a link to get you started: https://social.technet.microsoft.com/Forums/ie/en-US/79bb4879-4d82-4a41-81a4-c62afc6c4b1e/copy-all-ad-objects-to-sql-database?forum=winserverDS. You can find many more articles on this just by Googling "Copy AD to Sql".
However, heed the warnings well: AD is effectively a multi-domain distributed database, and attempting to copy it into a centralized database like SQL Server every hour is contraindicated. You are really fighting against its design.
UPDATE Based on the Comments:
Basically, you've got too much in one question here. SQL Server, SSIS, and Active Directory (AD) are each huge subjects in and of themselves, and the first time you attempt to use all of them together you will run into many individual issues depending on your environment, experience, and specific project goals. We cannot anticipate all of them in a single answer on this site.
You need to start using the information you have from the following links to begin to implement this yourself, and then ask specific questions as you run into problems along the way.
Here are the links you can start with:
The link I provided above from MS: https://social.technet.microsoft.com/Forums/ie/en-US/79bb4879-4d82-4a41-81a4-c62afc6c4b1e/copy-all-ad-objects-to-sql-database?forum=winserverDS
The link that you provided in the comments that explains how to setup ADSI as a linked server and how to use T-SQL on it: https://yiengly.wordpress.com/2018/04/08/query-active-directory-in-sql-server-with-linked-server/
This one explains how to use AD from within an SSIS Data Flow task (but is limited to 1000 rows): https://dataqueen.unlimitedviz.com/2012/05/importing-data-from-active-directory-using-ssis/
This related one explains how to use AD within an SSIS Script task to get around the DataFlow task limits: https://dataqueen.unlimitedviz.com/2012/09/get-around-active-directory-paging-on-ssis-import/
As you work your way through this you may run into specific problems, which you can ask about at https://dba.stackexchange.com, which has more specific expertise with SQL Server and SSIS.
Based on your goals, I think you will want to use a staging-table approach. That is, use your AD/SQL query to import all of the AD user records into a new/empty staging table that has the same column definition as your production table, then use a MERGE query to find and update the changed user records and insert the new ones (this is called a differential, or Type II, update).
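A minimal sketch of that differential load, assuming a staging table dbo.ADUsers_Staging loaded by the package and a production table dbo.ADUsers keyed on samaccountname (these names, and the two compared columns, are placeholders):

-- Update changed rows, insert new rows, delete rows that left AD.
-- Note: <> comparisons ignore NULLs; use ISNULL or EXISTS/EXCEPT
-- for NULL-safe change detection in real code.
MERGE dbo.ADUsers AS tgt
USING dbo.ADUsers_Staging AS src
    ON tgt.samaccountname = src.samaccountname
WHEN MATCHED AND (tgt.mail <> src.mail OR tgt.title <> src.title)
    THEN UPDATE SET tgt.mail = src.mail, tgt.title = src.title
WHEN NOT MATCHED BY TARGET
    THEN INSERT (samaccountname, mail, title)
         VALUES (src.samaccountname, src.mail, src.title)
WHEN NOT MATCHED BY SOURCE
    THEN DELETE;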
I last looked at schemas (on Oracle) about 20 years ago, and I know that Microsoft changed schemas in SQL Server 2005. We're now about to create a new application, and I've long wanted to take another look at schemas.
We use one specific login to do the application's work, so it has the db_owner role, and one specific login for running all reports, so it has the db_datareader role.
I've done my research, poked around, and written some scripts. This script "appears" to be all I need in order to create a schema:
CREATE SCHEMA [MySchema];
I used the sa user to create the schema and related tables. From there, I've been able to create tables within the schema and access them just fine from the two users.
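For reference, my test script looks roughly like this (the table and login names are made up for the example):

CREATE SCHEMA [MySchema];
GO
CREATE TABLE [MySchema].[Customer]
(
    CustomerId int IDENTITY(1,1) PRIMARY KEY,
    Name       nvarchar(100) NOT NULL
);
GO
-- db_owner and db_datareader already cover both logins; this explicit,
-- schema-scoped grant is just me testing schema-level permissions.
GRANT SELECT ON SCHEMA::[MySchema] TO [ReportUser];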
My question is: was this very simple statement really all that is required to create a schema "correctly", and are there any specifics I should be watching out for?
We already access all db objects with the [dbo]. schema prefix in preparation for going multi-schema. I'm just not sure whether there's something sneaky waiting when we finally get into stored procedures, functions, views, indexes, foreign keys, and the like. So far all my testing has come up roses, but I'm concerned I'm missing something that's going to really beat me with a stick somewhere down the line.
I'm tired of searching for this, but I couldn't find anything.
I have three databases in SQL Server and although all stored procedures are in the Main database, they work with tables from the other databases.
My question is: if you have the query
select name
from SecondDatabase.dbo.SomeTable
where id = 56
and this query is stored in the main database, will it run in the main database and reach across to the second database to return the data, or will it run in the second database so that you get the select result directly?
(hope you understand my question)
I think you are misunderstanding the difference between a Database and an Instance.
An instance is the software running the SQL service. Each instance can have multiple databases. For example, there is a master database and a tempdb database for each instance of SQL Server, these are system databases. You can create any number of user databases. All these databases will be handled by the same SQL Server instance (on the same machine).
A particular client session is connected first to an instance and then to a particular database; that's why you specify which database to connect to by default in connection strings (or via the login's default database). When you write select name from SecondDatabase.dbo.SomeTable, you are telling the SQL service to retrieve data from SecondDatabase, even if your session is linked to another database. The engine will then use your login credentials to map to a user in the other database (since users exist per database and logins per instance) to validate whether it has enough privileges to query that table before searching for the data.
A completely different story would be trying to access data from another instance (machine), for which you would need a linked server, OPENROWSET, or the like.
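For completeness, a cross-instance query through a linked server would look something like this (REMOTESRV is a placeholder name; you'd create it first with sp_addlinkedserver):

-- Four-part name: linked server, database, schema, object.
SELECT name
FROM REMOTESRV.SecondDatabase.dbo.SomeTable
WHERE id = 56;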
use FirstDatabase
select name
from SecondDatabase.dbo.SomeTable
where id = 56
Question:
will it run in the main database and reach across to the second database to return the data, or will it run in the second database so that you get the select result directly?
Your first assumption is correct:
This query will run in the first database: it will use the context and all settings (ANSI options, query optimizer, and statistics-related settings) of the first database, but will get data from a table in the second database.
Just an example from real life: if a database has to stay at an old compatibility level but new T-SQL features occasionally need to be used, a query can switch context to tempdb (which is normally set to the latest compatibility level) and run queries referencing data from any other database where access is granted. Using those new features will then not raise an exception.
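As a sketch of that trick (OldCompatDb and its table are placeholder names): OPENJSON, for example, requires compatibility level 130+ in the current database, so running from tempdb lets it work against data in a database pinned to an older level:

USE tempdb;
GO
-- Runs under tempdb's (newer) compatibility level, but reads the
-- old-compatibility database through a three-part name.
SELECT j.[key], j.[value]
FROM OldCompatDb.dbo.ConfigTable AS c
CROSS APPLY OPENJSON(c.JsonColumn) AS j;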
The (now edited) query above will always execute against SecondDatabase.dbo.SomeTable, even if the active database context is another database and even if the active user has a different default schema. This is because the object SomeTable is fully qualified with its database and schema.
A test to illustrate: the following still returns results (assuming the objects exist and the active user context has access to them):
USE [OtherDatabaseSchema]
GO
SELECT TOP 10 *
FROM [SecondDatabase].[dbo].[SomeTable]
I have an Access 2003 database that holds all of my business data. This Access database gets updated every few hours during the day.
We're currently writing a website that will need to use the data from the Access database. This website (for the time being) will have only read-only capabilities, meaning there will only need to be a one-way transfer of data (Access -> SQL).
I'm imagining there's a way to perform this data migration from Access to SQL Server programmatically. Does anyone have any links to something I can read about this?
If this practice sounds odd and you'd like to suggest another way to do this (or a situation where data can go both ways: Access -> SQL, SQL -> Access), that's perfectly fine.
The company is going to continue using Access 2003 for its business functionality. There's no way around that. But I'd like to build the (read-only) website on top of SQL Server.
The strategy you outlined can be very challenging. You could use INSERT queries to copy new Access rows to SQL Server, as described in another answer.
However, if you have changes to existing Access rows, and you also want those changes propagated to SQL Server, it won't be so simple. And it will be more complicated still if you want deleted Access rows deleted from SQL Server, too.
It seems more reasonable to me to use a different approach. Migrate the data to SQL Server once. Then replace the tables in your Access database with ODBC links to the SQL Server tables. Thereafter, changes to the data from within your Access application will not require a separate synchronization step ... they will already be in SQL Server. And you won't need to write any code to synchronize them.
If your concern is that the connections between the web server and SQL Server be read-only, just set them up that way. You can still independently allow read-write permissions for your Access application.
To do the initial data migration and set up SQL Server automatically, I would use the SQL Server Migration Assistant. The only thing you should definitely change, that I can think of, is to turn off the Identity property on any columns that have it, as explained below (MS Access calls Identity AutoNumber). Once you have your tables loaded, you can set up a DSN-less connection to the database (and tables) you just created.
I haven't used the method just linked, but I believe it allows you to use SQL Server authentication to connect to the db. The benefit of using this method is that you can easily change which SQL Server instance and/or database you are connecting to for development and testing.
There might be a better, automated way, but you can create several insert queries doing left joins from the primary key of the Access table to the SQL Server table, with a WHERE clause specifying that the SQL Server primary key must be null. This is why you need to turn off the Identity property in the SQL Server tables: so that you can insert the new data.
Finally, put the name of each query in one function, then run the function periodically.
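A sketch of one such append query, assuming an Access table Customers linked to a SQL Server table that Access named dbo_Customers (names and columns are placeholders):

-- Append only the Access rows whose key doesn't exist in SQL Server yet.
INSERT INTO dbo_Customers (CustomerID, CompanyName)
SELECT a.CustomerID, a.CompanyName
FROM Customers AS a
LEFT JOIN dbo_Customers AS s ON a.CustomerID = s.CustomerID
WHERE s.CustomerID IS NULL;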
I have used Microsoft's free SQL Server Migration Assistant (SSMA) to migrate Access to SQL Server. The tool is very simple to use. The only problem I have encountered with it was overly generous data types when migrating: what I mean by this is that a small string will get converted to an NVARCHAR(MAX) in some instances. Otherwise, the tool is very handy and can be reused after setting up a 'profile'.
I have two databases, one on a remote server the other local. (SQL Server 2008)
The database on my local server has the entire structure set up but no data. I would like to copy the data from the remote server to my server, and I am wondering about the best method of doing this.
The main issue I am experiencing is that the user I have on the remote database has limited permissions. I cannot read the stored procedures or user-defined functions, so when I use the Import/Export wizard I do not get the schema, etc. So a regular dump/restore is not working for me, as it restores the tables without the primary keys/foreign keys and the stored procedures.
I'd like to do this:
INSERT INTO localtable SELECT * FROM remotedb.table
I was having issues because of the IDENTITY fields, and I had to explicitly name all of the columns. Also, I am not sure whether SQL Server Management Studio lets you span two different servers (remote and local) in one query, so I was looking for any advice.
I have also tried applications like SQL FTP and Backup, and it fails because it runs out of memory (I have 16 GB of memory on the machine and the DB is about 4 GB). I can also use the SQL Server Import/Export wizard, but then I don't get the schema information. I also tried SQL Compare from Red Gate, and it runs into issues with the permissions. Unfortunately, I do not have the time to request and gain access to a new user, so I was hoping someone had a creative idea.
You can definitely use SQL Server backups for this. It will not run out of memory; if it does, please tell us the message (because you are likely misinterpreting it). This is the fastest and most complete solution.
You can tell the export wizard to also script the schema. It is hidden under "advanced" somewhere (terrible UI). But the script will be extremely big and I know of no way to execute it.
You can drop all schema objects except PKs in the target database. Then you can use remote queries to copy all the data over. You will not get any problems with foreign keys and identity columns if you drop them beforehand. After you are done, you can recreate all those objects. It is probably best to use a transaction for all of this, because that way you get consistent source data from a point in time.
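A rough sketch of the copy step for one table, assuming a linked server named REMOTESRV pointing at the remote instance (all names are placeholders); keeping identity values via SET IDENTITY_INSERT avoids having to drop the identity property:

BEGIN TRANSACTION;

-- Only needed if the table has an identity column whose values must be kept.
SET IDENTITY_INSERT dbo.localtable ON;

INSERT INTO dbo.localtable (Id, Name)
SELECT Id, Name
FROM REMOTESRV.RemoteDb.dbo.remotetable;

SET IDENTITY_INSERT dbo.localtable OFF;

COMMIT TRANSACTION;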