ColumnMapping.Add alternative - sql-server

I have my PowerShell script:
$dataSource = "HPVIE1APARBLD21";
$database = "tempdb"
$connection = New-Object System.Data.SqlClient.SqlConnection; #setting connection
$connection.ConnectionString = "Server=$dataSource;Database=$database;Integrated Security=True;";
$connection.Open();
$command = New-Object System.Data.SqlClient.SqlCommand;
$command.Connection = $connection;
$as = New-Object System.Data.SqlClient.SqlDataAdapter;
$as.SelectCommand = $command;
$filename = 'C:\XML\apc.xml'; #file to be loaded into sql server database table
$ds = New-Object System.Data.DataSet;
$ds.ReadXml($filename); #reading from the file -- line gives output InferSchema
$dt = New-Object System.Data.DataTable;
$dt=$ds.Tables[0];
$bcp = New-Object "Data.SqlClient.SqlBulkCopy" $connection; #bulkcopy to the destination table.
$bcp.DestinationTableName = "dbo.tempor";
$bcp.ColumnMappings.Count;
$bcp.ColumnMappings.Clear();
$bcp.ColumnMappings.Add("StartTime","StartTime");
$bcp.ColumnMappings.Add("EndTime","EndTime");
$bcp.ColumnMappings.Add("Status","Status");
$bcp.ColumnMappings.Add("ProcessedTime","ProcessedTime");
$bcp.ColumnMappings.Add("ScheduleId","ScheduleId");
$bcp.ColumnMappings.Add("Model","Model");
$bcp.ColumnMappings.Add("ModelVersion","ModelVersion");
$bcp.ColumnMappings.Add("ModelVersionState","ModelVersionState");
$bcp.ColumnMappings.Add("ModifiedTime","ModifiedTime");
$bcp.ColumnMappings.Add("WriteBackLastRunTime","WriteBackLastRunTime");
$bcp.ColumnMappings.Add("ModifiedBy","ModifiedBy");
$bcp.ColumnMappings.Add("ModelType","ModelType");
$bcp.ColumnMappings.Add("IsTimeRange","IsTimeRange");
$bcp.ColumnMappings.Add("WriteBackStatus","WriteBackStatus");
$bcp.ColumnMappings.Add("RetryWriteBack","RetryWriteBack");
$bcp.ColumnMappings.Add("NumOfRetry","NumOfRetry");
$bcp.ColumnMappings.Add("FrequencyUnits","FrequencyUnits");
$bcp.ColumnMappings.Add("ScheduleType","ScheduleType");
$bcp.ColumnMappings.Add("CustomType","CustomType");
$bcp.ColumnMappings.Add("WritebackProcessedTime","WritebackProcessedTime");
$bcp.ColumnMappings.Add("DiagStatus","DiagStatus");
$bcp.ColumnMappings.Add("AccountingPeriodCase_PK_ID","AccountingPeriodCase_PK_ID");
$bcp.ColumnMappings.Count;
if ($connection.State -ne [Data.ConnectionState]::Open) {
    "Connection to DB is not open."
    Exit
}
$bcp.WriteToServer("$dt"); #writing to server -- not loading content of xml file to sql server table
$connection.Close();
When I run this script it completes, but nothing is loaded into the table. I want to insert the values of the following XML file (apc.xml):
<?xml version="1.0" standalone="yes"?>
<NewDataSet>
<Table>
<StartTime>2017-01-14T18:30:00+05:30</StartTime>
<EndTime>2017-01-15T18:30:00+05:30</EndTime>
<Status>1</Status>
<ProcessedTime>2017-04-06T05:51:18.843+05:30</ProcessedTime>
<ScheduleId>db6f3178-4702-456c-904b-2cd833b54efa</ScheduleId>
<Model>Gasco_Hotfix</Model>
<ModelVersion>G3</ModelVersion>
<ModelVersionState>1</ModelVersionState>
<ModifiedTime>2017-04-06T05:51:58.48+05:30</ModifiedTime>
<ModifiedBy>l4-dc\ilbinstall</ModifiedBy>
<ModelType>1</ModelType>
<IsTimeRange>true</IsTimeRange>
<RetryWriteBack>false</RetryWriteBack>
<NumOfRetry>0</NumOfRetry>
<FrequencyUnits>2</FrequencyUnits>
<ScheduleType>1</ScheduleType>
<CustomType>0</CustomType>
<DiagStatus>2</DiagStatus>
<AccountingPeriodCase_PK_ID>9544940d-8d1a-e711-80d3-0050569c2fb6</AccountingPeriodCase_PK_ID>
</Table>
<Table>
<StartTime>2017-01-15T18:30:00+05:30</StartTime>
<EndTime>2017-01-16T18:30:00+05:30</EndTime>
<Status>1</Status>
<ProcessedTime>2017-04-06T05:51:18.843+05:30</ProcessedTime>
<ScheduleId>db6f3178-4702-456c-904b-2cd833b54efa</ScheduleId>
<Model>Gasco_Hotfix</Model>
<ModelVersion>G3</ModelVersion>
<ModelVersionState>1</ModelVersionState>
<ModifiedTime>2017-04-06T05:52:29.073+05:30</ModifiedTime>
<ModifiedBy>l4-dc\ilbinstall</ModifiedBy>
<ModelType>1</ModelType>
<IsTimeRange>true</IsTimeRange>
<RetryWriteBack>false</RetryWriteBack>
<NumOfRetry>0</NumOfRetry>
<FrequencyUnits>2</FrequencyUnits>
<ScheduleType>1</ScheduleType>
<CustomType>0</CustomType>
<DiagStatus>2</DiagStatus>
<AccountingPeriodCase_PK_ID>9644940d-8d1a-e711-80d3-0050569c2fb6</AccountingPeriodCase_PK_ID>
</Table>
</NewDataSet>
into the table dbo.tempor in the tempdb database.
Could anyone help me with the error?
Also, is there a way to add the column mappings dynamically, without having to type out each column name manually? In the future I may have to add more columns to the original table, and I don't want to come back to the script and change it by hand every time. If there is an alternative method, please point me to it.
Thank you.

I don't think this approach can pick up the XML's schema automatically. I'd try an approach like this:
DECLARE @xml XML =
N'<?xml version="1.0" standalone="yes"?>
<NewDataSet>
<Table>
<StartTime>2017-01-14T18:30:00+05:30</StartTime>
<EndTime>2017-01-15T18:30:00+05:30</EndTime>
<Status>1</Status>
<ProcessedTime>2017-04-06T05:51:18.843+05:30</ProcessedTime>
<ScheduleId>db6f3178-4702-456c-904b-2cd833b54efa</ScheduleId>
<Model>Gasco_Hotfix</Model>
<ModelVersion>G3</ModelVersion>
<ModelVersionState>1</ModelVersionState>
<ModifiedTime>2017-04-06T05:51:58.48+05:30</ModifiedTime>
<ModifiedBy>l4-dc\ilbinstall</ModifiedBy>
<ModelType>1</ModelType>
<IsTimeRange>true</IsTimeRange>
<RetryWriteBack>false</RetryWriteBack>
<NumOfRetry>0</NumOfRetry>
<FrequencyUnits>2</FrequencyUnits>
<ScheduleType>1</ScheduleType>
<CustomType>0</CustomType>
<DiagStatus>2</DiagStatus>
<AccountingPeriodCase_PK_ID>9544940d-8d1a-e711-80d3-0050569c2fb6</AccountingPeriodCase_PK_ID>
</Table>
<Table>
<StartTime>2017-01-15T18:30:00+05:30</StartTime>
<EndTime>2017-01-16T18:30:00+05:30</EndTime>
<Status>1</Status>
<ProcessedTime>2017-04-06T05:51:18.843+05:30</ProcessedTime>
<ScheduleId>db6f3178-4702-456c-904b-2cd833b54efa</ScheduleId>
<Model>Gasco_Hotfix</Model>
<ModelVersion>G3</ModelVersion>
<ModelVersionState>1</ModelVersionState>
<ModifiedTime>2017-04-06T05:52:29.073+05:30</ModifiedTime>
<ModifiedBy>l4-dc\ilbinstall</ModifiedBy>
<ModelType>1</ModelType>
<IsTimeRange>true</IsTimeRange>
<RetryWriteBack>false</RetryWriteBack>
<NumOfRetry>0</NumOfRetry>
<FrequencyUnits>2</FrequencyUnits>
<ScheduleType>1</ScheduleType>
<CustomType>0</CustomType>
<DiagStatus>2</DiagStatus>
<AccountingPeriodCase_PK_ID>9644940d-8d1a-e711-80d3-0050569c2fb6</AccountingPeriodCase_PK_ID>
</Table>
</NewDataSet>';
--The query will read all columns and insert them into the target table
INSERT INTO dbo.tempor(StartTime,EndTime,[Status],ProcessedTime /*more columns*/)
SELECT t.value(N'(StartTime/text())[1]', N'datetime') AS StartTime
      ,t.value(N'(EndTime/text())[1]', N'datetime') AS EndTime
      ,t.value(N'(Status/text())[1]', N'int') AS [Status]
      ,t.value(N'(ProcessedTime/text())[1]', N'datetime') AS ProcessedTime
      --and so on
FROM @xml.nodes(N'/NewDataSet/Table') AS A(t);
Some explanation
To test this, you can copy my script (together with the XML declaration) into a new SSMS query window and execute it. .nodes() returns all (repeated) <Table> nodes as a derived table.
Using SQL Server's native XML support allows for fast and type-safe queries. Your XML looks clean and neat, so this should be rather easy.
If this is possible for you, I'd create a stored procedure accepting the XML as input (either as NVARCHAR(MAX) or XML). You can then call it from PowerShell as a one-liner.
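A minimal sketch of such a procedure, wrapping the query from above (the procedure name dbo.ImportApcXml is a placeholder):
CREATE PROCEDURE dbo.ImportApcXml  --hypothetical name
    @xml XML
AS
BEGIN
    INSERT INTO dbo.tempor(StartTime, EndTime, [Status], ProcessedTime /*more columns*/)
    SELECT t.value(N'(StartTime/text())[1]', N'datetime')
          ,t.value(N'(EndTime/text())[1]', N'datetime')
          ,t.value(N'(Status/text())[1]', N'int')
          ,t.value(N'(ProcessedTime/text())[1]', N'datetime')
          --and so on
    FROM @xml.nodes(N'/NewDataSet/Table') AS A(t);
END
The PowerShell side then shrinks to something like this (reusing $command and $connection from the question; the parameter name @x is a placeholder):
$command.CommandText = "EXEC dbo.ImportApcXml @xml = @x;";  # hypothetical procedure/parameter names
[void]$command.Parameters.Add("@x", [System.Data.SqlDbType]::Xml);
$command.Parameters["@x"].Value = [System.IO.File]::ReadAllText('C:\XML\apc.xml');
[void]$command.ExecuteNonQuery();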
If you expect issues (wrong or missing data), it is recommended to load into a tolerant staging table first, use that for validation and cleansing, and do the final import from there.
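Such a staging table might look like this (a sketch under the assumption that everything arrives as text first; TRY_CONVERT needs SQL Server 2012+):
CREATE TABLE dbo.tempor_staging  --hypothetical staging table
(
    StartTime NVARCHAR(100) NULL,
    EndTime   NVARCHAR(100) NULL,
    [Status]  NVARCHAR(100) NULL
    --one forgiving NVARCHAR column per XML element
);
--validate/cleanse in dbo.tempor_staging, then:
INSERT INTO dbo.tempor(StartTime, EndTime, [Status] /*more columns*/)
SELECT TRY_CONVERT(datetimeoffset, StartTime)  --the sample values carry a +05:30 offset
      ,TRY_CONVERT(datetimeoffset, EndTime)
      ,TRY_CONVERT(int, [Status])
FROM dbo.tempor_staging
WHERE TRY_CONVERT(datetimeoffset, StartTime) IS NOT NULL;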
In the future I may have to add more columns to the original table, and I don't want to come back to the script and change it by hand every time.
When you come back to add more columns to the table, it is easy to add them to the SP too...

Related

How to bulk import many Word documents into SQL Server database table

I need to import ~50,000 Word documents (.doc and .docx) from a single directory into a SQL Server 2016 database table so that I can use full text indexing and then search the documents' contents.
Since this is a one-off task and the database won't be required for long I'm not concerned with performance or the arguments for using FILESTREAM or FileTables.
I've just created a database with a single table:
CREATE TABLE [dbo].[MyDocument]
(
    [ID] INT IDENTITY(1,1) NOT NULL,
    [DocumentName] NVARCHAR(255) NOT NULL,
    [Extension] NCHAR(10) NOT NULL,
    [DocumentContent] VARBINARY(MAX) NOT NULL,
    CONSTRAINT [PK_MyDocument] PRIMARY KEY CLUSTERED ([ID] ASC)
)
Now I'm looking for a way to get my documents into the table. There are plenty of examples online for importing a single document into a SQL Server database table using OPENROWSET, but they require me to specify a name for the file, which is obviously no use for my requirements.
I can't believe there isn't a well-documented and straightforward way to do this, but a couple of hours of searching haven't turned anything up, which is starting to make me doubt this is even possible. Surely it is?
Can anybody give me an example snippet of T-SQL for importing multiple files into the database? Or suggest how else it might be achieved?
Below is a PowerShell script to import all ".docx" files in the specified folder using a parameterized query along with a FileStream parameter value to stream file contents to the database rather than loading the entire file contents into client memory.
# import all documents in specified directory using file stream parameter
try {
    $timer = [System.Diagnostics.Stopwatch]::StartNew()
    $insertQuery = @"
INSERT INTO dbo.MyDocument (DocumentName, Extension, DocumentContent)
VALUES(@DocumentName, @Extension, @DocumentContent);
"@
    $connection = New-Object System.Data.SqlClient.SqlConnection("Data Source=.;Initial Catalog=YourDatabase;Integrated Security=SSPI")
    $command = New-Object System.Data.SqlClient.SqlCommand($insertQuery, $connection)
    $documentNameParameter = $command.Parameters.Add("@DocumentName", [System.Data.SqlDbType]::NVarChar, 255)
    $documentExtensionParameter = $command.Parameters.Add("@Extension", [System.Data.SqlDbType]::NVarChar, 10)
    $documentContentParameter = $command.Parameters.Add("@DocumentContent", [System.Data.SqlDbType]::VarBinary, -1)
    $connection.Open()
    $filesToImport = Get-ChildItem "E:\DocumentsToImport\*.docx"
    $importedFileCount = 0
    foreach($fileToImport in $filesToImport) {
        $documentContentStream = [System.IO.File]::Open($fileToImport.FullName, [System.IO.FileMode]::Open)
        $documentNameParameter.Value = [System.IO.Path]::GetFileNameWithoutExtension($fileToImport.FullName)
        $documentExtensionParameter.Value = [System.IO.Path]::GetExtension($fileToImport.Name)
        $documentContentParameter.Value = $documentContentStream
        [void]$command.ExecuteNonQuery()
        $documentContentStream.Close()
        $importedFileCount += 1
    }
    $connection.Close()
    $timer.Stop()
    Write-Host "$importedFileCount files imported. Duration $($timer.Elapsed)."
}
catch {
    throw
}

How to backup statistics in sql?

Let's just say I have a table A with some data in it in SSMS. There are sub-objects such as columns, constraints, triggers, indexes, statistics, etc.
I want to create a similar table with the same properties as table A. I know I need to go to Script Table As -> Create To -> New Query Window to duplicate the table structure.
However, after doing that, I realized the statistics in my new table are empty while there are statistics in table A. Did I miss something?
You can script out just the statistics blob with the following bit of PowerShell (which I yoinked from an old blog post of mine):
pushd;
import-module sqlps -disablenamechecking;
popd;
$opts = new-object Microsoft.SqlServer.Management.SMO.ScriptingOptions;
$opts.OptimizerData = $true;
$server = new-object Microsoft.SqlServer.Management.SMO.Server ".";
$database = $server.Databases["AdventureWorks2008R2"];
foreach ($table in $database.Tables) {
    foreach ($stat in $table.Statistics) {
        $stat.Script($opts);
    }
}
The above will script out all statistics (including the histogram data) for all tables in the AdventureWorks2008R2 database. You should be able to tailor it to your needs.
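For example, to capture just one table's statistics scripts into a file, a variation along these lines should work (an untested sketch; the database, table, and output path are placeholders):
pushd;
import-module sqlps -disablenamechecking;
popd;
$opts = new-object Microsoft.SqlServer.Management.SMO.ScriptingOptions;
$opts.OptimizerData = $true;  # include the statistics data (histogram) in the script
$server = new-object Microsoft.SqlServer.Management.SMO.Server ".";
$table = $server.Databases["AdventureWorks2008R2"].Tables["A"];  # placeholder database/table
$table.Statistics |
    ForEach-Object { $_.Script($opts) } |
    Out-File -FilePath "C:\Backup\A_statistics.sql";  # placeholder path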

Extract DICOM Image From RayStation Microsoft SQL Database

I was given a backup RayStation database, RS_Patients.bak, and am trying to extract and view the DICOM images that are stored in it. The trouble is two-fold: I don't know which one of the 2,000+ fields (or combinations of fields) refer to the images themselves, and even if I did know where the images were, I don't know how to extract them from the database into .dcm files.
From examining the schema, I found a few fields that are large varbinary fields (BLOBs) and I think they might be the fields I'm looking for. FileStream is enabled for the database and there is an FS directory. I've tried to download those fields into files using the bcp Utility, but that hasn't generated successful DICOMs.
Does anyone have any experience with this sort of database/image structure? Any other suggestions for pulling out and viewing the images? Do you think an image could be made up of a couple of fields instead of just one? Next to what we believe is the image field there are fields holding header data for the DICOM image: in the table called ImageStack, next to a field called PixelData, there are fields called PixelSize, SlicePosition, NrPixels, etc.
Also, if you can think of another place to ask this, I would appreciate that too.
Edit: per @mcNets's suggestion, the bcp command:
DECLARE @Command Varchar(400)
SET @Command = 'bcp "SELECT TOP 1 PixelData FROM RayStationPatientDB.dbo.ImageStack" queryout "C:\Users\Administrator\Documents\test.dcm" -S WIN-123ABC\MSSQLSERVER01 -T -w'
EXEC xp_cmdshell @Command
Generally speaking, you're not going to be able to use SQL Server results to write image data directly. bcp.exe isn't going to help you, either. You need to either use something that understands that the binary string is raw file data, or, because this is a FILESTREAM, use something that will give you the path to the file on the SQL Server. I have limited experience with FILESTREAM, but here's what I would do.
I can't definitively answer which field to use. That will depend on the application. If we assume that the DICOM images are stored in a FILESTREAM, then you can find the available FILESTREAM columns with this:
select t.name TableName
,c.name ColumnName
from sys.tables t
join sys.columns c
on c.object_id = t.object_id
where c.is_filestream = 1
If we also assume that DICOM images are stored as raw image files -- i.e., as a complete binary version of what they would be if they were saved on a PACS disc -- then you can run this to determine the path for each file by the ID:
select TableName_Id
      ,FileData.PathName()
from SchemaName.TableName
See the documentation for the PathName() function of FILESTREAM columns.
If you instead want to pull the data through SQL Server in a traditional sense, then I would probably use a PowerShell script to do it. This has the advantage of letting you use arbitrary data from the server to name the files. This method also has the advantage that it will work on any binary or varbinary column. As a disadvantage, this method will be slower and uses more disk space, because the server has to read the data, send it to the client, and then the client writes the data to disk:
$SqlQuery = "select Name, FileData from SchemaName.TableName";
$OutputPath = 'C:\OutputPath';
$SqlServer = 'ServerName';
$SqlDatabase = 'DatabaseName';
$SqlConnectionString = 'Data Source={0};Initial Catalog={1};Integrated Security=SSPI' -f $SqlServer, $SqlDatabase;
$SqlCommand = New-Object -TypeName System.Data.SqlClient.SqlCommand;
$SqlCommand.CommandText = $SqlQuery;
$SqlConnection = New-Object -TypeName System.Data.SqlClient.SqlConnection -ArgumentList $SqlConnectionString;
$SqlCommand.Connection = $SqlConnection;
$SqlConnection.Open();
$SqlDataReader = $SqlCommand.ExecuteReader();
while ($SqlDataReader.Read()) {
    $OutputFileName = Join-Path -Path $OutputPath -ChildPath "$($SqlDataReader['Name']).dcm";
    [System.IO.File]::WriteAllBytes($OutputFileName, $SqlDataReader['FileData']);
}
$SqlConnection.Close();
$SqlConnection.Dispose();
It's also possible to use FILESTREAM functions to return Win32 API handles, but I have never done that.
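For what it's worth, a rough, untested sketch of that route: inside a transaction you read PathName() together with GET_FILESTREAM_TRANSACTION_CONTEXT(), then open the file through System.Data.SqlTypes.SqlFileStream (same placeholder server, table, and column names as above):
$conn = New-Object System.Data.SqlClient.SqlConnection('Data Source=ServerName;Initial Catalog=DatabaseName;Integrated Security=SSPI');
$conn.Open();
$tran = $conn.BeginTransaction();  # the FILESTREAM handle is only valid inside a transaction
$cmd = $conn.CreateCommand();
$cmd.Transaction = $tran;
$cmd.CommandText = 'select FileData.PathName(), GET_FILESTREAM_TRANSACTION_CONTEXT() from SchemaName.TableName where TableName_Id = 1;';  # placeholder names
$reader = $cmd.ExecuteReader();
[void]$reader.Read();
$logicalPath = $reader.GetString(0);
$txContext = $reader.GetSqlBytes(1).Buffer;
$reader.Close();
# SqlFileStream reads the FILESTREAM data over the Win32 API instead of TDS
$fileStream = New-Object System.Data.SqlTypes.SqlFileStream($logicalPath, $txContext, [System.IO.FileAccess]::Read);
$output = [System.IO.File]::Create('C:\OutputPath\test.dcm');
$fileStream.CopyTo($output);
$output.Close();
$fileStream.Close();
$tran.Commit();
$conn.Close();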

Pass a powershell variable into a SQL value during out-datatable (invoke-sqlcmd2)

I want to insert a PowerShell variable value with a Select as I build a datatable from a SQL query.
Borrowed function invoke-sqlcmd2 from TechNet gallery and dot-sourced it in.
$NewSequenceID = invoke-sqlcmd2 -ServerInstance "MyServer" -Database "MyDB" -Query "INSERT INTO [Sequence] (TimeStarted) SELECT GETDATE(); SELECT max(SequenceID) as SequenceID FROM [Sequence]" | foreach { $_.SequenceID }
This generates a new sequence ID and stamps the time we started the batch. Results in a single number which will identify this run. Verified with 'write $NewSequenceID'.
I want to keep later results from queries together with this SequenceID for analysis.
Then I have this:
$PollTime = Get-Date -format "yyyy-MM-dd HH:mm:ss"
Then I want to do this: (Edit: This statement is not working - error message at the bottom)
$AuditUserOutput = invoke-sqlcmd2 -ServerInstance "MyServer2" -Database "MyDB2" -Query "SELECT $NewSequenceID, $PollTime, [USERID], [PID], [UDATE] FROM [MyTable]" -As 'Datatable'
And do some things with the table, then write it after with write-datatable.
If I select NULL for the first two values and grab the other three from the existing table, it works fine. I want to add the $NewSequenceID and $PollTime from the previous statements.
I've read a dozen pages about using ` (backtick), $, {}, and on and on, but I haven't gotten it right. Can someone help with the correct syntax for inserting these variable values into the selection?
PS Error is: Exception calling "Fill" with "1" argument(s): "Invalid pseudocolumn "$NewSequenceID"."
You're interpolating the variables correctly in PowerShell. If I'm understanding this correctly, the problem is with your SQL query. I'm going to make an inference here, but I think this is probably what you want:
$AuditUserOutput = invoke-sqlcmd2 -ServerInstance "MyServer2" -Database "MyDB2" -Query "SELECT [NewSequenceID], [PollTime], [USERID], [PID], [UDATE] FROM [MyTable] WHERE NewSequenceID = '$NewSequenceID' AND PollTime = '$PollTime'" -As 'Datatable'
If not, please clarify.
I was able to work around this by first creating a variable to store the query text, which allowed for the natural substitution I needed:
$AuditUserQuery = "SELECT '$NewSequenceID', '$PollTime', [USERID], [PID], [UDATE] FROM [AUDITUSER]"
Then calling that variable as the $query when building the datatable.
This avoided the parameterization problem experienced before.
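Put together, the working version of the earlier statement looks something like this (same hypothetical server, database, and table names as above):
$AuditUserQuery = "SELECT '$NewSequenceID', '$PollTime', [USERID], [PID], [UDATE] FROM [AUDITUSER]"
$AuditUserOutput = invoke-sqlcmd2 -ServerInstance "MyServer2" -Database "MyDB2" -Query $AuditUserQuery -As 'Datatable'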

T-SQL query to show table definition?

What is a query that will show me the full definition, including indexes and keys for a SQL Server table? I want a pure query - and know that SQL Studio can give this to me, but I am often on "wild" computers that have only the most bare-bones apps and I have no rights to install studio. But SQLCMD is always an option.
UPDATE: I have tried sp_help, but it just yields one record, which shows Name, Owner, Type and Created_Datetime. Is there something else I am missing with sp_help?
Here is what I call:
sp_help airports
Note that I really do want the DDL that defines the table.
There is no easy way to return the DDL. However, you can get most of the details from the Information Schema Views and System Views.
SELECT ORDINAL_POSITION, COLUMN_NAME, DATA_TYPE, CHARACTER_MAXIMUM_LENGTH
, IS_NULLABLE
FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_NAME = 'Customers'
SELECT CONSTRAINT_NAME
FROM INFORMATION_SCHEMA.CONSTRAINT_TABLE_USAGE
WHERE TABLE_NAME = 'Customers'
SELECT name, type_desc, is_unique, is_primary_key
FROM sys.indexes
WHERE [object_id] = OBJECT_ID('dbo.Customers')
Have you tried sp_help?
sp_help 'TableName'
Visit http://www.stormrage.com/SQLStuff/sp_GetDDL_Latest.txt.
You will find the code of the sp_GetDDL procedure for SQL Server.
The purpose of the procedure is to script any table, temp table, or object.
USAGE:
exec sp_GetDDL GMACT
or
exec sp_GetDDL 'bob.example'
or
exec sp_GetDDL '[schemaname].[tablename]'
or
exec sp_GetDDL #temp
I tested it on SQL Server 2012, and it does an excellent job.
I'm not the author of the procedure. Please send any improvements you make to Lowell Izaguirre (scripts@stormrage.com).
The easiest and quickest way I can think of would be to use sp_help
sp_help 'TableName'
Use this little Windows command-line app that gets the CREATE TABLE script (with constraints) for any table. I've written it in C#. Just compile it and carry it on a memory stick. Perhaps someone can port it to PowerShell.
using System;
using System.Linq;
using Microsoft.SqlServer.Management.Common;
using Microsoft.SqlServer.Management.Smo;

namespace ViewSource
{
    public class ViewSource
    {
        public static void Main(string[] args)
        {
            if (args.Length != 6)
            {
                Console.Error.WriteLine("Syntax: ViewSource.exe <server>" +
                    " <user> <password> <database> <schema> <table>");
                return;
            }

            Script(args[0], args[1], args[2], args[3], args[4], args[5]);
        }

        private static void Script(string server, string user,
            string password, string database, string schema, string table)
        {
            new Server(new ServerConnection(server, user, password))
                .Databases[database]
                .Tables[table, schema]
                .Script(new ScriptingOptions { SchemaQualify = true,
                                               DriAll = true })
                .Cast<string>()
                .Select(s => s + "\n" + "GO")
                .ToList()
                .ForEach(Console.WriteLine);
        }
    }
}
Since SQL Server 2012 you can run the following statement:
EXEC sp_describe_first_result_set @tsql = N'SELECT * FROM <yourtable>'
If you enter a complex select statement (joins, subselects, etc.), it will give you the definition of the result set. This is very handy if you need to create a new table (or temp table) and you don't want to check every single field definition manually.
https://learn.microsoft.com/en-us/sql/relational-databases/system-stored-procedures/sp-describe-first-result-set-transact-sql
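For instance, a join is described just as readily (dbo.Customers and dbo.Orders are hypothetical tables here):
EXEC sp_describe_first_result_set @tsql = N'
    SELECT c.CustomerID, c.CompanyName, o.OrderDate
    FROM dbo.Customers AS c              --hypothetical table
    JOIN dbo.Orders AS o                 --hypothetical table
        ON o.CustomerID = c.CustomerID';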
sp_help 'YourTableName'
This will return columns, datatypes, and indexes defined on the table:
--List all tables in DB
select * from sysobjects where xtype = 'U'
--Table Definition
sp_help TableName
This will return triggers defined on the table:
--Triggers in SQL Table
select * from sys.triggers where parent_id = object_id(N'SQLTableName')
I know it's an old question, but it's exactly what I was looking for. Because I want to batch-script some tables, I rewrote Anthony Faull's C# code for PowerShell.
This one uses integrated security:
Import-Module sqlps
$serverInstance = "<server>"
$database = "<database>"
$table = "<table>"
$schema = "<schema>"
$options = New-Object -TypeName Microsoft.SqlServer.Management.Smo.ScriptingOptions
$options.DriAll = $true
$options.SchemaQualify = $true
$connection = New-Object -TypeName Microsoft.SqlServer.Management.Common.ServerConnection `
-ArgumentList $serverInstance
$server = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Server `
-ArgumentList $connection
$server.Databases.Item($database).Tables.Item($table, $schema).Script($options) `
| ForEach-Object -Process { $_ + "`nGO"}
And here with username and password:
Import-Module sqlps
$serverInstance = "<server>"
$user = "<user>"
$password = "<password>"
$database = "<database>"
$table = "<table>"
$schema = "<schema>"
$options = New-Object -TypeName Microsoft.SqlServer.Management.Smo.ScriptingOptions
$options.DriAll = $true
$options.SchemaQualify = $true
$connection = New-Object -TypeName Microsoft.SqlServer.Management.Common.ServerConnection `
-ArgumentList $serverInstance
$connection.LoginSecure = $false
$connection.Login = $user
$connection.Password = $password
$server = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Server `
-ArgumentList $connection
$server.Databases.Item($database).Tables.Item($table, $schema).Script($options) `
| ForEach-Object -Process { $_ + "`nGO"}
Simply type the table name, select it, and press ALT + F1.
Say your table name is Customer: open a new query window, type and select the table name, and press ALT + F1 (the default keyboard shortcut for sp_help).
It will show the complete definition of the table.
A variation of @Anthony Faull's answer for those using LINQPad:
new Server(new ServerConnection(this.Connection.DataSource))
.Databases[this.Connection.Database]
.Tables["<table>", "dbo"]
?.Script(new ScriptingOptions {
SchemaQualify = true,
DriAll = true,
})
You'll need to reference 2 assemblies:
Microsoft.SqlServer.ConnectionInfo.dll
Microsoft.SqlServer.Smo.dll
And add namespace references as mentioned in Anthony's snippet.
SELECT ORDINAL_POSITION, COLUMN_NAME, DATA_TYPE, CHARACTER_MAXIMUM_LENGTH
, IS_NULLABLE
FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_NAME = 'EMPLOYEES'
Try the sp_help stored procedure.
sp_help '<table name>'
Another way is to execute the sp_columns procedure.
EXEC sys.sp_columns @TABLE_NAME = 'YourTableName'
As an addition to Barry's answer: sp_help can also be used by itself to iterate over all of the objects in a particular database. You also have sp_helptext in your arsenal, which scripts out programmatic elements like stored procedures.
General table definition is shown using stored procedure sp_help, as said before:
sp_help 'table_name_in_current_db_context'
When working with tables in multiple database contexts, it is useful to prefix the command above with the desired database, instead of switching context with USE db_xyz:
DB_Products..sp_help 'MyTable' -- for definition of MyTable in DB_Products
And it works with temp tables also:
tempdb..sp_help '#TempTable' -- for definition of #TempTable in current context.
There is an easy way to get DDL scripts for any database object.
Open SQL Server Management Studio
Connect to the source database server.
Expand the databases tree.
Right-click on the database with the tables to export.
In the sub-menu, expand "Tasks".
In the sub-menu, choose "Generate Scripts..."
Use the wizard to choose the objects to export.
For the scripting options, choose 1 script per object.
This will export a file for every table, view, stored proc, user, role, or schema you chose. Take that to the destination computer.
