User-created statistics adding dependency on columns in SQL Server

Recently I dropped all automatically created statistics (named _WA_Sys_%) and ran the following T-SQL command to create statistics for all columns of the database:
EXEC sp_createstats @indexonly = 'NO', @fullscan = 'FULLSCAN', @norecompute = 'NO'
All worked fine until I had to drop a column in a table; then error 5074 occurred, indicating that the statistics should be dropped before dropping the column.
Is there a way to get SQL Server to silently drop user-created statistics when a column is dropped?

I don't think you can make SQL Server do it silently; this is a common problem for people using user-created statistics. There is a way to drop the relevant statistics with a custom query (see the sketch below) - would that work for your needs?
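A minimal sketch of such a query, assuming a hypothetical table dbo.MyTable and column MyColumn (swap in your own names); it uses sys.stats and sys.stats_columns to generate the DROP STATISTICS statements for every user-created statistic that references the column:
-- Generate DROP STATISTICS statements for user-created statistics
-- that reference the column about to be dropped.
-- dbo.MyTable and MyColumn are placeholders.
SELECT 'DROP STATISTICS '
       + QUOTENAME(OBJECT_SCHEMA_NAME(s.object_id)) + '.'
       + QUOTENAME(OBJECT_NAME(s.object_id)) + '.'
       + QUOTENAME(s.name) AS drop_stmt
FROM sys.stats AS s
JOIN sys.stats_columns AS sc
    ON sc.object_id = s.object_id AND sc.stats_id = s.stats_id
JOIN sys.columns AS c
    ON c.object_id = sc.object_id AND c.column_id = sc.column_id
WHERE s.object_id = OBJECT_ID('dbo.MyTable')
  AND c.name = 'MyColumn'
  AND s.user_created = 1;  -- statistics created via CREATE STATISTICS / sp_createstats
You could copy the generated statements and run them (or EXEC them dynamically) right before the ALTER TABLE ... DROP COLUMN.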

Related

Executing query from SSDT

I'm using Visual Studio 2015 (SSIS) to run a set of SQL tasks in an Execute SQL Task and then do a data transfer between tables in the database (which I normally access through SSMS) by executing the package in SSIS. When we run a series of SQL statements in SSMS, we get results like rows affected for every successful SQL statement. Now I want to automate the process using SSIS to reduce the turnaround time, and I would like to get the rows affected for every SQL query (select, insert, delete) that is in the Execute SQL Task. How can it be done in SSIS? I don't have db_owner permission for stored procedures, so I'm thinking SSIS would be a quick way. But it is very important for me to log the rows affected to validate the data, as it is financial data. I have nearly 10 SQL statements in each SQL task (selects and deletes), but the output is only one table.
For example, my SQL task is like below:
select * from dbo.table1;
select * from dbo.table2 where city = 'Chicago';
create table dbo.table3(id int, name varchar(50));
insert into dbo.table3 values (1, 'a');
select * from dbo.table3;
If I execute this in SSMS I get rows affected for each statement and the table gets created. If I execute the same through a package in SSIS, how will I get the messages for each of them?
I assume your data lives on SQL Server. For the selects, you could use Data Flow Tasks with Row Count transformations instead of Execute SQL Tasks.
For inserts and updates there are a few ways to get the affected row count, like this: https://stackoverflow.com/a/1834264/5605866
or like this: http://microsoft-ssis.blogspot.fi/2011/03/rowcount-for-execute-sql-statement.html
Basically the same thing, just with slightly different syntax; both boil down to returning @@ROWCOUNT and mapping it to a package variable (see the sketch below).
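A minimal sketch of that approach, assuming the hypothetical dbo.table3 from the question and an Execute SQL Task whose Result Set is set to "Single row" so the value can be mapped to an SSIS variable (e.g. User::RowsAffected):
-- Statement run by the Execute SQL Task.
DELETE FROM dbo.table3
WHERE name = 'a';

-- @@ROWCOUNT reports the rows affected by the statement directly above;
-- map the RowsAffected column to an SSIS variable on the task's Result Set page.
SELECT @@ROWCOUNT AS RowsAffected;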
You can use the Row Count transformation after the data source and save the count to a variable. You can refer to that variable to get the number of rows returned from the source that should be processed.
Hope this helps.

Check SQL Server Replication Defaults

We have a database, A, that is replicated to a subscriber database, B (used for SSRS reporting), every night at 2:45 AM.
We need to add a column to one of the replicated tables, since its source file on our iSeries is having a column added that we need to use in our SSRS reporting database.
I understand (from Making Schema Changes on Publication Databases and the answer here from Damien_The_Unbeliever) that there is a default setting in SQL Server replication whereby, if we use a T-SQL ALTER TABLE DDL statement to add the new column to our table BUPF in the PION database, the change will automatically propagate to the subscriber database.
How can I check the replication of schema changes setting to ensure that we will have no issues with the replication following making the change?
Or should I just run ALTER TABLE BUPF ADD BUPCAT CHAR(5) NULL?
To add a new column to a table and include it in an existing publication, you'll need to use ALTER TABLE <Table> ADD <Column> syntax at the publisher. By default the schema change will be propagated to subscribers; for that to happen, the publication property @replicate_ddl must be set to true.
You can verify whether @replicate_ddl is set to true by executing sp_helppublication and inspecting the replicate_ddl value. Likewise, you can set @replicate_ddl to true by using sp_changepublication.
See Making Schema Changes on Publication Databases for more information.
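A minimal sketch of that check and change, assuming a hypothetical publication name MyPublication (run in the publication database at the publisher):
-- Check the current setting: look at the replicate_ddl column in the result set.
EXEC sp_helppublication @publication = N'MyPublication';

-- Turn on DDL replication if it is not already on (1 = replicate DDL statements).
EXEC sp_changepublication
    @publication = N'MyPublication',
    @property = N'replicate_ddl',
    @value = 1;
With that in place, running the ALTER TABLE BUPF ADD BUPCAT CHAR(5) NULL at the publisher should propagate the new column to the subscriber on its own.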

Unable to carry out operations (create trigger, drop table) for a table I created

I am using a SQL Server database with SQL Server Management Studio where I have existing tables. I add a few tables to it and it works just fine. However, for subsequent operations such as
Drop table XXX --OR
Create Trigger YYY on XXX
I run into an error message that reads:
i) Cannot drop table XXX as it does not exist or you do not have permissions
ii) The object 'XXX' does not exist or is invalid for this operation
I tried to carry out an INSERT operation, but that showed me a similar error (The object 'XXX' does not exist). I can see this may be a permissions issue since I am using an existing database; however, in that case, shouldn't I have been unable to create the table as well?
Can anyone pinpoint what the problem is and how I can work around it?
What is your default schema?
SELECT name, default_schema_name
FROM sys.database_principals
WHERE type = 'S';
Try qualifying your references to the table as SchemaName.XXX and see if that helps.
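For example, a quick sketch (the table name XXX is the placeholder from the question):
-- Find which schema the table actually lives in.
SELECT SCHEMA_NAME(schema_id) AS schema_name, name
FROM sys.tables
WHERE name = 'XXX';

-- Then qualify your statements with that schema, e.g. if it turns out to be dbo:
DROP TABLE dbo.XXX;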
Most of the time when I've had similar situations, the tables were created in a system database (master, tempdb, ...). Of course it was my mistake.
So maybe try searching for the table in other databases (see the sketch below)?
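One way to do that, as a sketch (sp_MSforeachdb is undocumented but widely used for this kind of one-off check; 'XXX' is the placeholder table name):
-- Look for the table in every database on the instance.
EXEC sp_MSforeachdb N'
    SELECT ''?'' AS database_name, s.name AS schema_name, t.name AS table_name
    FROM [?].sys.tables AS t
    JOIN [?].sys.schemas AS s ON s.schema_id = t.schema_id
    WHERE t.name = ''XXX'';';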

How to merge table from access to SQL Express?

I have one table named "Staff" in Access and also have this table (same name) in SQL Server 2008.
Both tables have thousands of records. I want to merge records from the Access table into the SQL table without affecting the existing records in SQL. Normally, I just export using the ODBC driver, and that works fine if the table doesn't exist in SQL Server. Please advise. Thanks.
A simple append query from the local Access table to the linked SQL Server table should work just fine in this case.
So, just drop the first (from) table into the query builder. Then change the query type to append, and you are prompted for the append table name.
From that point on, just drop in the columns you want (do not drop in the PK column, as it need not be used or transferred in this case).
You can also type the SQL directly in the query builder. Either way, you will wind up with something like:
INSERT INTO dbo_custsql
( ADMINID, Amount, Notes, Status )
SELECT ADMINID, Amount, Notes, Status
FROM custsql1;
This may help: http://www.red-gate.com/products/sql-development/sql-compare/
Or you could write a simple program to read from each data set and do the comparison, adding, updating, and deleting, etc.
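If the Access data can first be brought into a staging table on the SQL Server side (for example via the ODBC export mentioned in the question), the insert-only merge can also be done in T-SQL. A minimal sketch, assuming hypothetical tables dbo.Staff_Staging (copied from Access) and dbo.Staff (existing), keyed on a hypothetical StaffID column:
-- Insert only the staging rows that do not already exist in the target;
-- existing rows in dbo.Staff are left untouched.
INSERT INTO dbo.Staff (StaffID, FirstName, LastName)
SELECT s.StaffID, s.FirstName, s.LastName
FROM dbo.Staff_Staging AS s
WHERE NOT EXISTS (
    SELECT 1
    FROM dbo.Staff AS t
    WHERE t.StaffID = s.StaffID
);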

how can i change a field in my SQL database from numeric(18) to varchar(10)

I have a zipcode field in a database that I just took over. Previously it was set as a numeric field with 18 precision, but I am trying to convert it to varchar(10).
I need to make this change because the LINQ fields are coming in as decimal and are causing issues, and I want to change the LINQ fields to simply be strings.
I tried this in SQL Server Enterprise Manager but I get an error saying:
that the table will have to be dropped and re-created; you have either made changes to a table that can't be re-created or enabled the option to prevent saving changes that require the table to be re-created
Any suggestions?
To allow that kind of change in SQL Server Management Studio, uncheck the following option:
Tools / Options / Designers / Table and Database Designers / Prevent saving changes that require table re-creation
You could also run an ALTER statement to change the data type, as long as all of your data will fit in a varchar(10) column (a quick check is sketched below):
ALTER TABLE MyTable
ALTER COLUMN MyZipCodeColumn VARCHAR(10)
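As a quick sanity check before running the ALTER, something along these lines (table and column names follow the example above) shows whether any existing value would be too long for varchar(10):
-- Longest zip code once converted to a string; it must be <= 10
-- for the ALTER COLUMN ... VARCHAR(10) above to succeed without a conversion error.
SELECT MAX(LEN(CONVERT(VARCHAR(20), MyZipCodeColumn))) AS MaxLength
FROM MyTable;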
Are you using MS SQL Server 2008? Changes that require the table to be rebuilt are blocked by default.
Click Tools -> Options, then Designers. Uncheck "Prevent saving changes that require table re-creation".
Then you can change the column using the designer.
Screenshots on how to do it:
http://pragmaticworks.com/community/blogs/brianknight/archive/2008/06/04/sql-server-2008-designer-behavior-change-saving-changes-not-permitted.aspx
