Windows PowerShell script to alter table - sql-server

I have the script below, named alterTable.ps1. I am trying to alter a table by adding two new columns.
Here is the script:
function die {
"Error: $($args[0])"
exit 1
}
function verifySuccess {
if (!$?) {
die "$($args[0])"
}
}
# Create variables and assign environment variables
$INSTANCE_HOST = $env:HOST
$INSTANCE_PORT = $env:PORT
$INSTANCE_NAME = $env:INSTANCE
$DATABASE_NAME = $env:DATABASE_NAME
# Execute the alter table, passing the variables
#sqlcmd -U sa -P sapassword -S "$INSTANCE_HOST\$INSTANCE_NAME,$INSTANCE_PORT" -q "use $DATABASE_NAME; ALTER TABLE dbo.tabletest ADD Test1 VARCHAR(6) NULL, Test2 VARCHAR(10) NULL"
VerifySuccess "sqlcmd failed to alter table tabletest"
When I execute the script, passing values to it:
C:\> .\alterTable.ps1 "WINDOWSHOST" "1433" "MSSQLSERVER" "dbname"
I get the following error:
HResult 0x57, Level 16, State 1 SQL Server Network Interfaces:
Connection string is not valid [87]. Sqlcmd: Error: Microsoft SQL
Server Native Client 10.0 : A network-related or instance-specific
error has occurred while establishing a connection to SQL Server.
Server is not found or not accessible.
If I hardcode those values in the script, it works fine.
One more quick thing: how can I exit the script after execution and check the exit status?

You are never accessing the arguments that you pass to the script; instead you are reading the values from four environment variables (that you never set?). You should use parameters, with default values taken from those environment variables if necessary. Ex:
param(
$INSTANCEHOST = $env:HOST,
$INSTANCEPORT = $env:PORT,
$INSTANCENAME = $env:INSTANCE,
$DATABASENAME = $env:DATABASE_NAME
)
function die {
"Error: $($args[0])"
exit 1
}
function verifySuccess {
if (!$?) {
die "$($args[0])"
}
}
# Execute the alter table, passing the variables
sqlcmd -U sa -P sapassword -S "$INSTANCEHOST\$INSTANCENAME,$INSTANCEPORT" -q "use $DATABASENAME; ALTER TABLE dbo.tabletest ADD Test1 VARCHAR(6) NULL, Test2 VARCHAR(10) NULL"
VerifySuccess "sqlcmd failed to alter table tabletest"
Use it like this:
C:\> .\alterTable.ps1 -INSTANCEHOST "WINDOWSHOST" -INSTANCEPORT "1433" -INSTANCENAME "MSSQLSERVER" -DATABASENAME "dbname"
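To address the follow-up question about exit status: sqlcmd is an external executable, so PowerShell records its exit code in the automatic variable $LASTEXITCODE (the $? check in verifySuccess only tells you whether the last operation succeeded, not which code it returned). A minimal sketch, reusing the parameter names from the script above:

```powershell
# Run sqlcmd, then propagate its exit code to the caller
sqlcmd -U sa -P sapassword -S "$INSTANCEHOST\$INSTANCENAME,$INSTANCEPORT" `
    -Q "use $DATABASENAME; ALTER TABLE dbo.tabletest ADD Test1 VARCHAR(6) NULL"
if ($LASTEXITCODE -ne 0) {
    "Error: sqlcmd failed with exit code $LASTEXITCODE"
    exit $LASTEXITCODE   # non-zero status signals failure to the caller
}
exit 0                   # explicit success
```

After the script finishes, the calling shell can read the status from $LASTEXITCODE (or %ERRORLEVEL% under cmd.exe).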

Related

Stored procedure call problem in PowerBuilder 10 on Windows 10 64-bit

I get this error message when the DW calls the stored procedure with several parameters:
DW error event occur
Select Error. An error occurred, yet no message was returned by the database driver. sqldbcode 999
The DB connection was made like this:
SQLCA.DBMS = "ODBC"
SQLCA.AutoCommit = False
SQLCA.DBParm = "ConnectString='DSN=XXX_prod;UID=aaa;PWD=xxxx'" + "StripParmNames='YES' and CallEscape='No'"
Connect using sqlca;
The "XXX_prod" is the 32bit ODBC Data Source DSN for SQL Server.
Before, it was "OLE DB" on Windows 7, but on Windows 10 I had to change it to use ODBC.
Since then, this error has occurred:
"DW error event occur",
"Select Error. An error occurred, yet no message was returned by the database driver. sqldbcode 999")
This is the procedure aaa.sp_dw_xxx being called by the DW
SET ANSI_NULLS OFF
GO
SET QUOTED_IDENTIFIER OFF
GO
ALTER PROCEDURE [aaa].[sp_dw_xxx]
(@as_st_ym varchar(8),
@as_yymmdd varchar(8),
@as_yymmdd2 varchar(8),
@as_model varchar(20),
@as_pn varchar(20),
@as_brand varchar(20),
@as_product varchar(20) = '%')
AS
SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED
SELECT
FROM
WHERE
This is the code of the exported DW source:
.
.
procedure="1 execute aaa.sp_dw_xxx;1 @as_st_ym = :as_st_ym, @as_yymmdd = :as_yymm, @as_yymmdd2 = :as_yymm2, @as_model = :as_model, @as_pn = :as_pn, @as_brand = :as_brand, @as_product = :as_product" arguments = (("as_st_ym", string), ("as_yymm", string), ("as_yymm2", string), ("as_model", string), ("as_pn", string), ("as_brand", string), ("as_product", string)))
.
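No answer was posted here, but one detail worth checking (offered as a guess based only on the connect code shown): PowerBuilder expects the individual DBParm parameters to form a single comma-separated list, while the string above splices ConnectString and the other parameters together with a literal " and " and no separator. A corrected sketch:

```
SQLCA.DBMS = "ODBC"
SQLCA.AutoCommit = False
// DBParm parameters are comma-separated inside one string
SQLCA.DBParm = "ConnectString='DSN=XXX_prod;UID=aaa;PWD=xxxx',StripParmNames='Yes',CallEscape='No'"
CONNECT USING SQLCA;
```

With the malformed DBParm, StripParmNames may silently never take effect, which would be consistent with a procedure call with several named parameters failing under ODBC while it worked under OLE DB.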

syntax error near go in a script run from powershell

I'm trying to bulk import into SQL Server, and I need to automate the task (there are thousands of directories) in PowerShell. I'm using bcp and have a format file because I need to skip a column when importing. Whenever I run this, it fails with the error:
Exception calling "ExecuteReader" with "0" argument(s): "Incorrect syntax near 'GO'.
The code is:
$query =
"USE Database;
GO
BULK INSERT $tableName
FROM 'C:\users\Name\documents\bcp_sql\File\$name\$dir_id${string}.csv'
WITH (FORMATFILE = 'C:\users\Name\documents\bcp_sql\formatFile.fmt');
GO
SELECT * FROM $tableName;
GO"
$sqlCmd2 = $connection.CreateCommand()
$sqlCmd2.Connection = $connection
$sqlCmd2.CommandText = $query
$sqlCmd2.ExecuteReader()
I've confirmed that the file paths do, in fact, exist (by cd-ing to them).
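GO is not part of Transact-SQL; it is a batch separator recognized by client tools such as sqlcmd and SSMS, so ADO.NET's ExecuteReader rejects it. A common workaround is to split the script on GO lines and run each batch separately — a sketch assuming the $query and $connection variables from the code above:

```powershell
# Split on lines consisting solely of GO (case-insensitive, multiline)
$batches = $query -split '(?im)^\s*GO\s*$'
foreach ($batch in $batches) {
    if ($batch.Trim()) {
        $cmd = $connection.CreateCommand()
        $cmd.CommandText = $batch
        $cmd.ExecuteNonQuery() | Out-Null
    }
}
```

(If you want the rows back from the final SELECT, run that batch with ExecuteReader instead of ExecuteNonQuery.) Alternatively, simply delete the GO lines: the three statements here can run as one batch.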

Using SQLCMD to execute a script returns Invalid object name

I have to insert a very large amount of data into SQL Server. The operation did not work via SQL Server Management Studio, so I investigated doing the insert via sqlcmd. What I did was the following:
Create a file.sql that contains the following query (repeated multiple times):
IF NOT EXISTS (SELECT * FROM [dbo].[tblAccount]
WHERE [AccountID] = 117242 AND
[TimeStamp] = CAST(N'2013-01-16 05:53:50.490' AS DateTime))
BEGIN
INSERT INTO
[dbo].[tblAccount] ([AccountID]
,[Name]
,[Comment]
,[IsMachine]
,[UserID]
,[Prefix]
,[Action]
,[Initials]
,[Name]
,[TimeStamp]
,[Reason]
,[Iscal])
VALUES (117242
,'blabla'
,'The users project)'
,1
,'val'
,39
,'val'
,'blabla'
,'blabla'
,CAST(N'2013-01-16 05:53:50.490' AS DateTime)
,'NORMAL'
,'0')
END
I saved the file into a folder, and then from the command line I ran the following:
C:\>sqlcmd -S pc_name\MSSQLEXPRESS -i"C:\Users\name\Desktop\OutPut\Result tblAccount.sql"
I get the following error:
Msg 208, Level 16, State 1, Server pc_name\MSSQLEXPRESS, Line 1
Invalid object name 'dbo.tblAccount'.
I'm not sure whether it is related to the syntax of the SQL or to the way I invoke sqlcmd.
I know this question is over a year old, but in case somebody runs into the same issue: make sure to provide credentials. I just ran into the same issue, and without providing credentials, sqlcmd raises that error. I don't know what's happening behind the scenes, but it seems that it connects to some default database when authentication fails and, as a result, can't find the table.
sqlcmd -S <computer name> -U <username> -P <password> -i <absolute path to your script>
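If credentials are in order but the error persists, it is also worth pinning down the database explicitly, since a login whose default database is master will not resolve dbo.tblAccount. sqlcmd's -d switch does that (the database name below is a placeholder):

```
sqlcmd -S pc_name\MSSQLEXPRESS -U <username> -P <password> -d <database_name> -i "C:\Users\name\Desktop\OutPut\Result tblAccount.sql"
```

Alternatively, a USE statement at the top of the .sql file achieves the same thing.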

SQL Server R Services - outputting data to database table, performance

I noticed that the rx* functions (e.g. rxKmeans, rxDataStep) insert data into a SQL Server table in a row-by-row fashion when the outFile parameter is set to a table. This is obviously very slow, and something like a bulk insert would be desirable instead. Can this be obtained, and how?
Currently I am trying to insert about 14 million rows into a table by invoking the rxKmeans function with the outFile parameter specified, and it takes about 20 minutes.
Example of my code:
clustersLogInitialPD <- rxKmeans(formula = ~LogInitialPD
,data = inDataSource
,algorithm = "Lloyd"
,centers = start_c
,maxIterations = 1
,outFile = sqlLogPDClustersDS
,outColName = "ClusterNo"
,overwrite = TRUE
,writeModelVars = TRUE
,extraVarsToWrite = c("LoadsetId", "ExposureId")
,reportProgress = 0
)
sqlLogPDClustersDS points to a table in my database.
I am working on SQL Server 2016 SP1 with R Services installed and configured (both in-database and standalone). Generally everything works fine except for this terrible performance when writing rows to database tables from an R script.
Any comments will be greatly appreciated.
I brought this up on this Microsoft R MSDN forum thread recently as well.
I ran into this problem and I'm aware of 2 reasonable solutions.
1. Use the sp_execute_external_script output data frame option
/* Time writing data back to SQL from R */
SET STATISTICS TIME ON
IF object_id('tempdb..#tmp') IS NOT NULL
DROP TABLE #tmp
CREATE TABLE #tmp (a FLOAT NOT NULL, b INT NOT NULL );
DECLARE @numRows INT = 1000000
INSERT INTO #tmp (a, b)
EXECUTE sys.sp_execute_external_script
@language = N'R'
,@script = N'OutputDataSet <- data.frame(a=rnorm(numRows), b=1)'
,@input_data_1 = N''
,@output_data_1_name = N'OutputDataSet'
,@params = N'@numRows INT'
,@numRows = @numRows
GO
-- ~7-8 seconds for 1 million row insert (2 columns) on my server
-- rxDataStep for 100K rows takes ~45 seconds on my server
2. Use SQL Server bcp.exe or BULK INSERT (only if running on the SQL box itself) after first writing a data frame to a flat file
I've written some code that does this, but it's not very polished, and I've had to leave sections with <<<VARIABLE>>> placeholders that assume connection-string information (server, database, schema, login, password). If you find this useful, or find any bugs, please let me know. I'd also love to see Microsoft incorporate the ability to save data from R back to SQL Server using the BCP APIs. Solution (1) above only works via sp_execute_external_script. Basic testing also leads me to believe that bcp.exe can be roughly twice as fast as option (1) for a million rows. BCP results in a minimally-logged SQL operation, so I'd expect it to be faster.
# Creates a bcp file format function needed to insert data into a table.
# This should be run one-off during code development to generate the format needed for a given task and saved in the .R file that uses it
createBcpFormatFile <- function(formatFileName, tableName) {
# Command to generate BCP file format for importing data into SQL Server
# https://msdn.microsoft.com/en-us/library/ms162802.aspx
# format creates a format file based on the option specified (-n, -c, -w, or -N) and the table or view delimiters. When bulk copying data, the bcp command can refer to a format file, which saves you from re-entering format information interactively. The format option requires the -f option; creating an XML format file, also requires the -x option. For more information, see Create a Format File (SQL Server). You must specify nul as the value (format nul).
# -c Performs the operation using a character data type. This option does not prompt for each field; it uses char as the storage type, without prefixes and with \t (tab character) as the field separator and \r\n (newline character) as the row terminator. -c is not compatible with -w.
# -x Used with the format and -f format_file options, generates an XML-based format file instead of the default non-XML format file. The -x does not work when importing or exporting data. It generates an error if used without both format and -f format_file.
## Bob: -x not used because we currently target bcp version 8 (default odbc driver compatibility that is installed everywhere)
# -f If -f is used with the format option, the specified format_file is created for the specified table or view. To create an XML format file, also specify the -x option. For more information, see Create a Format File (SQL Server).
# -t field_term Specifies the field terminator. The default is \t (tab character). Use this parameter to override the default field terminator. For more information, see Specify Field and Row Terminators (SQL Server).
# -S server_name [\instance_name] Specifies the instance of SQL Server to which to connect. If no server is specified, the bcp utility connects to the default instance of SQL Server on the local computer. This option is required when a bcp command is run from a remote computer on the network or a local named instance. To connect to the default instance of SQL Server on a server, specify only server_name. To connect to a named instance of SQL Server, specify server_name\instance_name.
# -U login_id Specifies the login ID used to connect to SQL Server.
# -P -P password Specifies the password for the login ID. If this option is not used, the bcp command prompts for a password. If this option is used at the end of the command prompt without a password, bcp uses the default password (NULL).
bcpPath <- .pathToBcpExe()
parsedTableName <- parseName(tableName)
# We can't use the -d option for BCP and instead need to fully qualify a table (database.schema.table)
# -d database_name Specifies the database to connect to. By default, bcp.exe connects to the user's default database. If -d database_name and a three-part name (database_name.schema.table, passed as the first parameter to bcp.exe) is specified, an error will occur because you cannot specify the database name twice. If database_name begins with a hyphen (-) or a forward slash (/), do not add a space between -d and the database name.
fullyQualifiedTableName <- paste0(parsedTableName["dbName"], ".", parsedTableName["schemaName"], ".", parsedTableName["tableName"])
bcpOptions <- paste0("format nul -c -f ", formatFileName, " -t, ", .bcpConnectionOptions())
commandToRun <- paste0(bcpPath, " ", fullyQualifiedTableName, " ", bcpOptions)
result <- .bcpRunShellThrowErrors(commandToRun)
}
# Save a data frame (data) using file format (formatFilePath) to a table on the database (tableName)
bcpDataToTable <- function(data, formatFilePath, tableName) {
numRows <- nrow(data)
# write file to disk
ptm <- proc.time()
tmpFileName <- tempfile("bcp", tmpdir=getwd(), fileext=".csv")
write.table(data, file=tmpFileName, quote=FALSE, row.names=FALSE, col.names=FALSE, sep=",")
# Bob: note that one can make this significantly faster by switching over to use the readr package (readr::write_csv)
#readr::write_csv(data, tmpFileName, col_names=FALSE)
# bcp file to server time start
mid <- proc.time()
bcpPath <- .pathToBcpExe()
parsedTableName <- parseName(tableName)
# We can't use the -d option for BCP and instead need to fully qualify a table (database.schema.table)
# -d database_name Specifies the database to connect to. By default, bcp.exe connects to the user's default database. If -d database_name and a three-part name (database_name.schema.table, passed as the first parameter to bcp.exe) is specified, an error will occur because you cannot specify the database name twice. If database_name begins with a hyphen (-) or a forward slash (/), do not add a space between -d and the database name.
fullyQualifiedTableName <- paste0(parsedTableName["dbName"], ".", parsedTableName["schemaName"], ".", parsedTableName["tableName"])
bcpOptions <- paste0(" in ", tmpFileName, " ", .bcpConnectionOptions(), " -f ", formatFilePath, " -h TABLOCK")
commandToRun <- paste0(bcpPath, " ", fullyQualifiedTableName, " ", bcpOptions)
result <- .bcpRunShellThrowErrors(commandToRun)
cat(paste0("time to save dataset to disk (", numRows, " rows):\n"))
print(mid - ptm)
cat(paste0("overall time (", numRows, " rows):\n"))
proc.time() - ptm
unlink(tmpFileName)
}
# Examples:
# createBcpFormatFile("test2.fmt", "temp_bob")
# data <- data.frame(x=sample(1:40, 1000, replace=TRUE))
# bcpDataToTable(data, "test2.fmt", "test_bcp_1")
#####################
# #
# Private functions #
# #
#####################
# Path to bcp.exe. bcp.exe is currently from version 8 (SQL 2000); newer versions depend on newer SQL Server ODBC drivers and are harder to copy/paste distribute
.pathToBcpExe <- function() {
paste0(<<<bcpFolder>>>, "/bcp.exe")
}
# Function to convert warnings from shell into errors always
.bcpRunShellThrowErrors <- function(commandToRun) {
tryCatch({
shell(commandToRun)
}, warning=function(w) {
conditionMessageWithoutPassword <- gsub(<<<connectionStringSqlPassword>>>, "*****", conditionMessage(w), fixed=TRUE) # Do not print SQL passwords in errors
stop("Converted from warning: ", conditionMessageWithoutPassword)
})
}
# The connection options needed to establish a connection to the client database
.bcpConnectionOptions <- function() {
if (<<<useTrustedConnection>>>) {
return(paste0(" -S ", <<<databaseServer>>>, " -T"))
} else {
return(paste0(" -S ", <<<databaseServer>>>, " -U ", <<<connectionStringLogin>>>," -P ", <<<connectionStringSqlPassword>>>))
}
}
###################
# Other functions #
###################
# Mirrors SQL Server parseName function
parseName <- function(databaseObject) {
splitName <- strsplit(databaseObject, '.', fixed=TRUE)[[1]]
if (length(splitName)==3){
dbName <- splitName[1]
schemaName <- splitName[2]
tableName <- splitName[3]
} else if (length(splitName)==2){
dbName <- <<<databaseName>>>  # a two-part name defaults to the configured database, not the server
schemaName <- splitName[1]
tableName <- splitName[2]
} else if (length(splitName)==1){
dbName <- <<<databaseName>>>
schemaName <- ""
tableName <- splitName[1]
}
return(c(tableName=tableName, schemaName=schemaName, dbName=dbName))
}

Unaccent issue when restoring a Postgres database

I want to restore a particular database under another database name, on another server as well. So far, so good.
I used this command :
pg_dump -U postgres -F c -O -b -f maindb.dump maindb
to dump the main database on the production server. Then I use this command:
pg_restore --verbose -O -l -d restoredb maindb.dump
to restore the database into another database on our test server. It restores mostly OK, but there are some errors, like:
pg_restore: [archiver (db)] Error while PROCESSING TOC:
pg_restore: [archiver (db)] Error from TOC entry 3595; 1259 213452 INDEX idx_clientnomclient maindbuser
pg_restore: [archiver (db)] could not execute query: ERROR: function unaccent(text) does not exist
LINE 1: SELECT unaccent(lower($1));
^
HINT: No function matches the given name and argument types. You might need to add explicit type casts.
QUERY: SELECT unaccent(lower($1));
CONTEXT: SQL function "cyunaccent" during inlining
Command was: CREATE INDEX idx_clientnomclient ON client USING btree (public.cyunaccent((lower((nomclient)::text))::character varying));
cyunaccent is a function that is in the public schema, and it does get created by the restore.
After the restore, I am able to re-create those indexes perfectly with the same SQL, without any errors.
I've also tried to restore with the -1 option of pg_restore to do a single transaction, but it doesn't help.
What am I doing wrong?
I just found the problem, and I was able to narrow it down to a simple test-case.
CREATE SCHEMA intranet;
CREATE EXTENSION IF NOT EXISTS unaccent WITH SCHEMA public;
SET search_path = public, pg_catalog;
CREATE FUNCTION cyunaccent(character varying) RETURNS character varying
LANGUAGE sql IMMUTABLE
AS $_$ SELECT unaccent(lower($1)); $_$;
SET search_path = intranet, pg_catalog;
CREATE TABLE intranet.client (
codeclient character varying(10) NOT NULL,
noclient character varying(7),
nomclient character varying(200) COLLATE pg_catalog."fr_CA"
);
ALTER TABLE ONLY client ADD CONSTRAINT client_pkey PRIMARY KEY (codeclient);
CREATE INDEX idx_clientnomclient ON client USING btree (public.cyunaccent((lower((nomclient)::text))::character varying));
This test case is from a pg_dump done in plain text.
As you can see, the cyunaccent function is created in the public schema, as it's later used by tables in other schemas.
psql/pg_restore won't re-create the index, as it cannot find the function, despite the fact that the schema name is specified to reference it. The problem lies in the
SET search_path = intranet, pg_catalog;
call. Changing it to
SET search_path = intranet, public, pg_catalog;
solves the problem. I've submitted a bug report to postgres about this, not yet in the queue.
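For completeness, another way to make the dump immune to search_path (an alternative sketch, not part of the fix submitted above) is to schema-qualify the unaccent() call inside the function body itself, so that inlining resolves it regardless of what search_path the restore script sets:

```sql
CREATE FUNCTION cyunaccent(character varying) RETURNS character varying
    LANGUAGE sql IMMUTABLE
    AS $_$ SELECT public.unaccent(lower($1)); $_$;
```

With the extension's function referenced by its full name, the index definition no longer depends on the SET search_path lines that pg_dump emits.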