I found this useful query to rename all my tables, indexes, and constraints, but I just figured out it doesn't rename the columns.
SELECT 'exec sp_rename ' + '''' + NAME + '''' + ', ' + '''' + replace(NAME, 'Tb', 'Tabela') + ''''
FROM sysObjects
WHERE
NAME LIKE 'Tb%'
I know there's syscolumns, but I'm not sure how to use it in this case.
Question: How can I get the same result of this query but for columns instead of tables?
I appreciate your help in this. I'm using SQL Server 2012. Thanks.
You have to do a little more work:
SELECT 'exec sp_rename ' + '''' + QUOTENAME(s.name) + '.' + QUOTENAME(o.name) + '.' + QUOTENAME(c.name) + '''' + ', ' + '''' + replace(c.name, 'Col', 'Column') + ''', ''COLUMN'''
FROM sys.columns c
INNER JOIN sys.objects o ON c.object_id = o.object_id
INNER JOIN sys.schemas s ON o.schema_id = s.schema_id
WHERE c.name LIKE 'Col%'
Since you are renaming a column, you must specify COLUMN as the third argument to sp_rename. You must also construct a three-part name in the form [schema].[table name].[current column name] so that it points to the correct column.
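To make that concrete, here is a hedged sketch of one row the query above would emit, assuming a hypothetical dbo.Orders table with a ColDate column (both names are made up for illustration):

```sql
-- Hypothetical generated row; dbo.Orders.ColDate is an assumed example object.
exec sp_rename '[dbo].[Orders].[ColDate]', 'ColumnDate', 'COLUMN'
```

You would copy the generated rows out of the resultset and run them as a separate batch.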
Related
Based on the examples here, here and here, I tried to build dynamic SQL so that I don't have to do this manually every time. However, I am facing a challenge when the PK is a combination of more than one column: this code creates a separate script for each additional column that is part of the PK. Instead, when the PK has more than one column, I would like a single script whose constraint name carries the key ordinals as a suffix: _12, _123, etc.
ALTER TABLE <Schema_Name>.<Table_Name>
DROP CONSTRAINT <constraint_name>
ALTER TABLE <Schema_Name>.<Table_Name>
ADD CONSTRAINT PK_<Schema_Name>_<Table_Name>_12 PRIMARY KEY (<Column1>,<Column2>)
Code built so far:
SELECT
STUFF((
select ';' + 'ALTER TABLE ' +
s.name + '.' +
t.name + ' DROP CONSTRAINT ' +
i.name + ';' +
'ALTER TABLE ' + s.name + '.' + t.name +
' ADD CONSTRAINT ' + 'PK__' + s.name + '__' + t.name + '__' +
--tc.name,
string_agg(ic.key_ordinal, ',') +
' PRIMARY KEY (' + tc.name + ')'
from
sys.schemas s
inner join sys.tables t on s.schema_id=t.schema_id
inner join sys.indexes i on t.object_id=i.object_id
inner join sys.index_columns ic on i.object_id=ic.object_id
and i.index_id=ic.index_id
inner join sys.columns tc on ic.object_id=tc.object_id
and ic.column_id=tc.column_id
where i.is_primary_key=1
GROUP BY s.name, t.name, i.name, tc.name, ic.key_ordinal
order by t.name, ic.key_ordinal
FOR XML PATH('')),1,1,'') + ';'
;
I have a list of tables across multiple databases, and I would like to find out the row counts for these tables and the name of these tables.
Note that the names of the tables may change, as I need to repeat this many times, so ideally, I would like to specify the tables and then query some of the data dictionary tables.
I am able to achieve what I want by writing multiple queries, one per database, and then sticking the results into one final table, but I wondered if there is a more elegant solution.
Below is an example that will get the row counts for all tables in the specified databases in a single resultset. Add additional filters as appropriate for your needs.
DECLARE @SQL nvarchar(MAX) =
STUFF((SELECT N'UNION ALL SELECT
N''' + QUOTENAME(d.name) + N'''
+ N''.''
+ QUOTENAME(OBJECT_SCHEMA_NAME(t.object_id, ' + CAST(d.database_id AS nvarchar(10)) + N'))
+ N''.''
+ QUOTENAME(t.name) AS TableName
, SUM(p.rows) AS Rows
FROM ' + QUOTENAME(d.name) + N'.sys.tables AS t
JOIN ' + QUOTENAME(d.name) + N'.sys.partitions AS p ON p.object_id = t.object_id AND p.index_id IN(0,1)
GROUP BY N''' + QUOTENAME(d.name) + N'''
+ N''.''
+ QUOTENAME(OBJECT_SCHEMA_NAME(t.object_id, ' + CAST(d.database_id AS nvarchar(10)) + N'))
+ N''.''
+ QUOTENAME(t.name)'
FROM sys.databases AS d
WHERE d.name IN(N'Database1', N'Database2')
FOR XML PATH(''), TYPE).value('(text())[1]','nvarchar(MAX)'),1,10,'');
EXEC sp_executesql @SQL;
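As an illustration, each per-database block that the STUFF/FOR XML loop above builds should look roughly like the following (Database1 and the database_id value 5 are assumed placeholders; the leading UNION ALL of the first block is removed by STUFF):

```sql
-- Hypothetical shape of one generated per-database block:
SELECT
    N'[Database1]' + N'.' + QUOTENAME(OBJECT_SCHEMA_NAME(t.object_id, 5)) + N'.' + QUOTENAME(t.name) AS TableName
  , SUM(p.rows) AS Rows
FROM [Database1].sys.tables AS t
JOIN [Database1].sys.partitions AS p ON p.object_id = t.object_id AND p.index_id IN(0,1)
GROUP BY N'[Database1]' + N'.' + QUOTENAME(OBJECT_SCHEMA_NAME(t.object_id, 5)) + N'.' + QUOTENAME(t.name)
```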
You can get the number of rows using the system tables as well. If you also need tables from other databases, just use UNION and change AdventureWorks to your database name.
select t.name as TableName, s.name as SchemaName, p.rows as NumberOfRows
from AdventureWorks.sys.tables t
join AdventureWorks.sys.schemas s on t.schema_id = s.schema_id
join AdventureWorks.sys.indexes i on i.object_id = t.object_id
join AdventureWorks.sys.partitions p on p.object_id = i.object_id and p.index_id = i.index_id
group by t.name, s.name, p.rows
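For the multi-database case described above, a hedged sketch is to repeat the block per database and UNION ALL the results (Database1 and Database2 are placeholder names; the index_id filter is an added assumption so that each table is counted once via its heap or clustered index):

```sql
-- Hypothetical sketch: one block per database, combined with UNION ALL.
select 'Database1' as DatabaseName, t.name as TableName, s.name as SchemaName, p.rows as NumberOfRows
from Database1.sys.tables t
join Database1.sys.schemas s on t.schema_id = s.schema_id
join Database1.sys.partitions p on p.object_id = t.object_id and p.index_id in (0, 1)
union all
select 'Database2', t.name, s.name, p.rows
from Database2.sys.tables t
join Database2.sys.schemas s on t.schema_id = s.schema_id
join Database2.sys.partitions p on p.object_id = t.object_id and p.index_id in (0, 1);
```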
I am running a dynamic SQL query which loops over all tables in a large database (about 1,600 tables) and builds the query I need to run afterwards. After the string is created, I save it and run it. However, the string concatenation is extremely slow. I need to add a lot more boilerplate code for each table, but in the example below I've omitted it to keep things simple.
My question is: how can I speed up this process of building queries via NVARCHAR concatenation? The query gets significantly slower the more text and parameter concatenations are added. It currently takes over 30 minutes, while the CTE itself executes in 1 second.
DECLARE @UpdateColumnsSql NVARCHAR(MAX) = '';
WITH TableColumns AS
(
SELECT
QUOTENAME(OBJECT_SCHEMA_NAME(t.object_id)) +
'.' + QUOTENAME(t.name) AS TableName
,LOWER(c.name) AS ColumnName
,ty.name AS TypeName
,c.max_length AS ColLength
FROM
sys.tables AS t
INNER JOIN
sys.columns AS c
ON
t.object_id = c.object_id
INNER JOIN
sys.types AS ty
ON
c.system_type_id = ty.system_type_id
WHERE
c.name IN
( 'Company_ID', 'Facility_ID', 'Premises_ID' )
)
SELECT
@UpdateColumnsSql = @UpdateColumnsSql +
'ALTER TABLE ' + TableColumns.TableName +
' ADD [' + TableColumns.ColumnName + '_New] ' + TypeName +
'(' + CONVERT(VARCHAR(4), ColLength) + ')' + ' NULL
SET @sql =
'''' UPDATE T
SET T.[' + TableColumns.ColumnName + '_New] = S.[NewValue]
from ' + TableColumns.TableName + ' T
inner join NewIDList S
on Company = Company_ID
where T.[' + TableColumns.ColumnName + '] = S.OldValue
AND S.[OldColumnName] = ''''''''' + TableColumns.ColumnName + ''''''''' ''''
exec(@sql);
'
FROM
TableColumns;
Using a single select query, I need to get the Minimum and Maximum values of Identity Columns along with the other columns specified in the query below, for all tables in a given database.
This is what I've been able to code to get a list of tables and their identity columns:
Select so.name as TableName
, sic.name as ColumnName
, i.Rows Count_NumberOfRecords
, IDENT_CURRENT(so.name)+IDENT_INCR(so.name) as NextSeedValue
from sys.identity_columns sic
inner join sys.objects so on sic.object_id = so.object_id
inner join sys.sysindexes I ON So.OBJECT_ID = I.ID
Where so.type_desc = 'USER_TABLE' and last_value is not null and indid IN (0,1);
The query needs to get these extra columns:
MaximumValue (IdentityColumn) and MinimumValue (IdentityColumn) for each table.
You should be able to use something along these lines:
DECLARE @cmd NVARCHAR(max);
SET @cmd = '';
SELECT @cmd = @cmd + CASE WHEN (@cmd = '') THEN '' ELSE ' UNION ALL ' END + 'SELECT ''' +
QUOTENAME(s.name) + '.' + QUOTENAME(t.name) + ''' AS TableName, ''' +
QUOTENAME(c.name) + ''' AS ColumnName, MAX(' + QUOTENAME(c.name) + ') AS MaxID, MIN(' +
QUOTENAME(c.name) + ') AS MinID, COALESCE(IDENT_CURRENT(''' + QUOTENAME(s.name) + '.' +
QUOTENAME(t.name) + '''),0) + COALESCE(IDENT_INCR(''' + QUOTENAME(s.name) + '.' +
QUOTENAME(t.name) + '''),0) AS NextValue FROM ' + QUOTENAME(s.name) + '.' + QUOTENAME(t.name)
FROM sys.tables t
INNER JOIN sys.columns c ON t.object_id = c.object_id
INNER JOIN sys.schemas s on t.schema_id = s.schema_id
WHERE c.is_identity = 1
SELECT @cmd; /* Shows the dynamic query generated, not necessary */
EXEC sp_executesql @cmd;
The query uses dynamic SQL to construct a UNION ALL query that gathers the table name, column name, and the minimum and maximum identity values currently in every table that has an IDENTITY column.
You could quite easily modify this to show the columns in the format you want, along with the other columns you mention in your question.
I've edited the query above to include the "NextValue" field; however, I agree with @AaronBertrand that this value is of little use, since on a busy system it will almost certainly be wrong immediately (or shortly) after the query executes.
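For a hypothetical database containing two identity tables (dbo.Orders.OrderID and dbo.Customers.CustomerID, both assumed for illustration), the generated command would have roughly this shape:

```sql
-- Hypothetical shape of the generated UNION ALL query:
SELECT '[dbo].[Orders]' AS TableName, '[OrderID]' AS ColumnName,
       MAX([OrderID]) AS MaxID, MIN([OrderID]) AS MinID,
       COALESCE(IDENT_CURRENT('[dbo].[Orders]'), 0) + COALESCE(IDENT_INCR('[dbo].[Orders]'), 0) AS NextValue
FROM [dbo].[Orders]
UNION ALL
SELECT '[dbo].[Customers]', '[CustomerID]',
       MAX([CustomerID]), MIN([CustomerID]),
       COALESCE(IDENT_CURRENT('[dbo].[Customers]'), 0) + COALESCE(IDENT_INCR('[dbo].[Customers]'), 0)
FROM [dbo].[Customers];
```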
I was wondering if there is an equivalent in SQL Server 2008 to Oracle's DBMS_METADATA.GET_DDL Function? You can pass this function a table name and it will return the ddl for that table so that you can use it to build a script for a schema.
I know I can go into SSMS and use that, but I would prefer to have a t-sql script that would generate the ddl for me.
Thanks,
S
I use this query to generate a CREATE TABLE script, but it only works for one table at a time:
declare @vsSQL varchar(8000)
declare @vsTableName varchar(50)
select @vsTableName = 'Customers'
select @vsSQL = 'CREATE TABLE ' + @vsTableName + char(10) + '(' + char(10)
select @vsSQL = @vsSQL + ' ' + sc.Name + ' ' +
st.Name +
case when st.Name in ('varchar','nvarchar','char','nchar') then '(' + cast(sc.Length as varchar) + ') ' else ' ' end +
case when sc.IsNullable = 1 then 'NULL' else 'NOT NULL' end + ',' + char(10)
from sysobjects so
join syscolumns sc on sc.id = so.id
join systypes st on st.xusertype = sc.xusertype
where so.name = @vsTableName
order by
sc.ColID
select substring(@vsSQL,1,len(@vsSQL) - 2) + char(10) + ')'
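Assuming a hypothetical Customers table with three made-up columns, the final SELECT would print something along these lines:

```sql
-- Hypothetical output; the column list is assumed for illustration.
CREATE TABLE Customers
(
 CustomerID int NOT NULL,
 Name varchar(100) NULL,
 City varchar(50) NULL
)
```

Note that this covers only column names, types, lengths, and nullability; constraints, defaults, and identity properties would need additional work.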
If you are looking for a TSQL solution, it is quite verbose, as [this example]¹ shows.
A shorter alternative would be using the SMO library (example)
¹ The link for this example was removed. The Internet Archive Wayback Machine displayed an error saying it could not show the content, and following the link to the original site led somewhere malicious (fake error pages, instructions to call a phone number, etc.).