Syntax error near GO in a script run from PowerShell - sql-server

I'm trying to bulk import into SQL Server, and I need to automate the task (there are thousands of directories) in PowerShell. I'm using bcp and have a format file because I need to skip a column when importing. Whenever I run this, it fails with the error:
Exception calling "ExecuteReader" with "0" argument(s): "Incorrect syntax near 'GO'."
The code is:
$query =
"USE Database;
GO
BULK INSERT $tableName
FROM 'C:\users\Name\documents\bcp_sql\File\$name\$dir_id${string}.csv'
WITH (FORMATFILE = 'C:\users\Name\documents\bcp_sql\formatFile.fmt');
GO
SELECT * FROM $tableName;
GO"
$sqlCmd2 = $connection.CreateCommand()
$sqlCmd2.Connection = $connection
$sqlCmd2.CommandText = $query
$sqlCmd2.ExecuteReader()
I've confirmed that the file paths do, in fact, exist (by cd-ing to them).
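The error itself has nothing to do with the file paths: GO is not T-SQL at all. It is a batch separator recognized only by client tools such as SSMS and sqlcmd, so SqlCommand rejects it. Either remove the GO lines and send everything as one batch, or split the script on GO lines client-side and execute each batch separately. A minimal sketch of the splitting logic (shown in Python purely to illustrate; the same regex idea works in PowerShell):

```python
import re

def split_batches(script: str) -> list[str]:
    """Split a T-SQL script on GO batch separators.

    GO is a directive for client tools (SSMS, sqlcmd), not T-SQL,
    so it must be stripped before the text is sent to the server.
    A separator is only recognized when GO stands alone on a line.
    """
    batches = re.split(r'(?im)^\s*GO\s*$', script)
    return [b.strip() for b in batches if b.strip()]

script = """USE Database;
GO
SELECT * FROM MyTable;
GO"""

for batch in split_batches(script):
    print(batch)   # each batch would be sent as its own CommandText
```

Each resulting batch would then be executed with its own ExecuteReader/ExecuteNonQuery call.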

Related

I am trying to run multiple query statements created when using the python connector with the same query id

I have created a Python function which creates multiple query statements.
Once it creates a SQL statement, it executes it (one at a time).
Is there any way to bulk-run all the statements at once (assuming I was able to create all the SQL statements and wanted to execute them once all the statements were generated)? I know there is an execute_stream in the Python Connector, but I think this requires a file to be created first. It also appears to me that it runs a single query statement at a time.
Since this question is missing an example of the file, here is some file content I have provided as an extra that we can work from.
# connection test file for python multiple queries
import snowflake.connector

conn = snowflake.connector.connect(
    user='xxx',
    password='',
    account='xxx',
    warehouse='xxx',
    database='TEST_xxx',
    session_parameters={
        'QUERY_TAG': 'Rachel_test',
    },
)
print(conn.sfqid)
try:
    conn.cursor().execute("CREATE WAREHOUSE IF NOT EXISTS tiny_warehouse_mg")
    conn.cursor().execute("CREATE DATABASE IF NOT EXISTS testdb_mg")
    conn.cursor().execute("USE DATABASE testdb_mg")
    conn.cursor().execute(
        "CREATE OR REPLACE TABLE "
        "test_table(col1 integer, col2 string)")
    conn.cursor().execute(
        "INSERT INTO test_table(col1, col2) VALUES " +
        " (123, 'test string1'), " +
        " (456, 'test string2')")
except Exception as e:
    conn.rollback()
    raise e
conn.close()
The reference for this question describes a method that works from a file; the example in the documentation is as follows:
from codecs import open
with open(sqlfile, 'r', encoding='utf-8') as f:
    for cur in con.execute_stream(f):
        for ret in cur:
            print(ret)
Reference to guide I used
Now, when I ran these they were not perfect, but in practice I was able to execute multiple SQL statements in one connection, just not many at once. Each statement had its own query ID. Is it possible to have a .sql file associated with one query ID?
Is it possible to have a .sql file associated with one query id?
You can achieve that effect with the QUERY_TAG session parameter. Set QUERY_TAG to the name of your .sql file before executing its queries, then access the .sql file's query IDs later using the QUERY_TAG field in QUERY_HISTORY().
I believe that even though you generated the .sql file, when executing in Snowflake each statement will have a unique query ID.
If you want to run one SQL statement independently of another, you may try the multiprocessing/multithreading concepts in Python.
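The multiprocessing/multithreading suggestion can be sketched with a thread pool. The snippet below uses a placeholder run_statement function (a real version would execute the SQL on its own cursor, with each worker typically holding its own cursor or connection; the function name and statements here are illustrative, not from the original question):

```python
from concurrent.futures import ThreadPoolExecutor

def run_statement(sql):
    """Placeholder for executing one SQL statement on its own cursor."""
    return f"ran: {sql}"

statements = ["SELECT 1", "SELECT 2", "SELECT 3"]

# Each statement is dispatched independently; map preserves input order.
with ThreadPoolExecutor(max_workers=3) as pool:
    results = list(pool.map(run_statement, statements))

print(results)
# → ['ran: SELECT 1', 'ran: SELECT 2', 'ran: SELECT 3']
```

Note that each statement still gets its own query ID server-side; the pool only changes how the client dispatches them.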
The Python and Node.js libraries do not allow multiple-statement executions.
I'm not sure about Python, but for Node.js there is a library that extends the original one and adds a method called "ExecutionAll" to it:
snowflake-multisql
You just need to wrap the multiple statements with BEGIN and END:
BEGIN
<statement_1>;
<statement_2>;
END;
With these operators, I was able to execute multiple statements in Node.js.

Issue while trying to insert data from a csv file into an accdb table

I have an Access database with a lot of information. I am not that good with Access databases and I am still a newbie when it comes to PowerShell. I have been trying to automate the insertion of data from CSV files into the Access database, as they are produced weekly.
After researching, I found PowerShell code that allows inserting a CSV into a database. However, I have not been able to make it work with my current Access database and CSV file.
write-host "current path is $PSScriptRoot"
$datafile="$PSScriptRoot\Input.csv" #trying to import this file into the database
$dbfile="$PSScriptRoot\database.accdb" #this is the test access database
$connectionString="Provider=Microsoft.Ace.OLEDB.12.0; Data Source=$dbfile" #general info I found researching
$conn = New-Object System.Data.OleDb.OleDbConnection($connectionString)
$conn.Open()
$cmd = $Conn.CreateCommand()
Import-Csv $dataFile |
ForEach-Object {
    $cmd.CommandText = "INSERT INTO [MyTable1]([Name],[First Data Set],[Description],[Enabled],[Last Logon Date],[Creation Date],[Modification Date],[Day of The year],[ID type],[ID Date],[Street Address],[date],[type])
    VALUES('{0}', '{1}', '{2}', '{3}', '{4}', '{5}', '{6}', '{7}', '{8}', '{9}', '{10}', '{11}', '{12}')" -f @(
        $_.Name,
        $_.'First Data Set',
        $_.Description,
        $_.Enabled,
        $(if ($_.'Last Logon Date') { ([datetime]$_.'Last Logon Date').ToString('M/d/yyyy H:mm') } else { [System.DBNull]::Value }),
        $(if ($_.'Creation Date') { ([datetime]$_.'Creation Date').ToString('M/d/yyyy H:mm') } else { [System.DBNull]::Value }),
        $(if ($_.'Modification Date') { ([datetime]$_.'Modification Date').ToString('M/d/yyyy H:mm') } else { [System.DBNull]::Value }),
        $_.'Day of The year',
        $_.'ID type',
        $(if ($_.'ID Date') { ([datetime]$_.'ID Date').ToString('M/d/yyyy H:mm') } else { [System.DBNull]::Value }),
        $_.'Street Address',
        $(if ($_.date) { ([datetime]$_.date).ToString('M/d/yyyy H:mm') } else { [System.DBNull]::Value }),
        $_.Type
    )
    $cmd.ExecuteNonQuery()
}
$conn.Close()
The access database has the corresponding table with the corresponding rows from the csv, so what should happen is that the data from the csv would be inserted into the table.
The input and database files are recognized via the $PSScriptRoot variable, as they point to the path of the script, which is in the same folder.
The error I am getting is the following:
"Exception calling "ExecuteNonQuery" with "0" argument(s): "Syntax error in INSERT INTO statement."
At C:\1 - database paths\Database1\database.accdb "
Yes, that is the current testing path; it has spaces and a couple of special characters. This cannot be changed, and since other scripts work fine from it, this one should too.
Since I have little knowledge about accdb files, I am having a hard time finding the cause of the error. From what it says, it seems my syntax for the INSERT command is not correct, yet I can't find an example of another syntax used for that command. I used the -f operator because the column names contain spaces and there is an issue with quotes in PowerShell. I cannot change the path name or the column names at all.
Edit: now I have another problem: the columns with date/time values cannot accept a null value. I tried to work around it by setting a DBNull value, but this does not work. I need to keep those blank values in my database for statistics purposes; is there a way to import those empty values?
I had to format each value into a data type, as PowerShell passes them as strings.
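The NULL problem in the edit is inherent to building the INSERT as one formatted string: inside '...' quotes everything is text, so there is no way to express a real NULL, and [System.DBNull]::Value just formats as an empty string. Parameterized commands do not have that limitation, because a null parameter value is transmitted as SQL NULL. The pattern is sketched here in Python with sqlite3 purely to illustrate the idea (not Access; the OleDb equivalent would use $cmd.Parameters with [DBNull]::Value):

```python
import sqlite3

def date_or_null(text):
    """Blank CSV cells become None, which the driver sends as SQL NULL."""
    return text if text.strip() else None

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (name TEXT, last_logon TEXT)")

rows = [("alice", "1/2/2020 9:30"), ("bob", "")]   # "" = blank CSV cell
for name, logon in rows:
    # Placeholders instead of string formatting: None arrives as a real
    # NULL, not as an empty quoted string.
    conn.execute("INSERT INTO t VALUES (?, ?)", (name, date_or_null(logon)))

print(conn.execute("SELECT name FROM t WHERE last_logon IS NULL").fetchall())
# → [('bob',)]
```

The same approach also removes the quoting headaches that motivated the -f operator in the first place.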

Restore SQL database - Incorrect syntax near '-'

I'm trying to restore a database with a - in the name. If there is no - in the name, the database is correctly restored.
If I put '' around the database name, it still doesn't work. I'm still searching for what to do, but I can't find it.
$sqlRestoreCommand.CommandText = "RESTORE DATABASE '$databaseName' FROM DISK = '$databaseBackup' " ;
The following exception occurs and the code is below:
Exception: Exception calling "ExecuteNonQuery" with "0" argument(s): "Incorrect syntax near '-'.
Incorrect syntax near the keyword 'with'. If this statement is a common table expression, an xmlnamespaces clause or a change tracking context clause, the previous statement must be terminated with a semicolon.".Exception.Message
$databaseName = "TempRpc-RC";
# Write the state for debugging purposes.
"SQL State" > sqlstatelog.txt;
$connection.State >> sqlstatelog.txt;
# Create the SQL restore command.
$sqlRestoreCommand = $connection.CreateCommand();
# Set the command text to a SQL restore command and fill in the parameters.
# With move option is needed!
$sqlRestoreCommand.CommandText = "RESTORE DATABASE $databaseName FROM DISK = '$databaseBackup' " ;
$sqlRestoreCommand.CommandText += "WITH FILE = 1, ";
$sqlRestoreCommand.CommandText += "MOVE N'$databaseName" + "'" + " TO N'C:\Program Files\Microsoft SQL Server\MSSQL14.SQLEXPRESS\MSSQL\Data\$databaseName" + "_Data.mdf', ";
$sqlRestoreCommand.CommandText += "MOVE N'$databaseName" + "_log'" +" TO N'C:\Program Files\Microsoft SQL Server\MSSQL14.SQLEXPRESS\MSSQL\Data\$databaseName" +"_Log.ldf'";
$sqlRestoreCommand.Connection = $connection;
# Execute the commands.
$sqlRestoreResult = $sqlRestoreCommand.ExecuteNonQuery();
# Write the query
Write-Verbose $sqlRestoreCommand.CommandText;
tl;dr:
Inside the expandable PowerShell string ("...") that defines the query string, replace '$databaseName' with [$databaseName]
As an aside: Unless you fully control all parts used to construct the query string, you should use parameters to avoid SQL injection attacks, as Ansgar advises.
Ansgar Wiechers provided the crucial pointer:
Since your database name contains a - character, it isn't a regular identifier, so it must be quoted / delimited, i.e., it must be specified as a delimited identifier.
Additionally, identifiers that conflict with reserved keywords must be specified as delimited identifiers too.
A delimited identifier is one enclosed in either [...] or "..."; by contrast, '...' is not supported, which explains your symptoms.
Note: [...] is SQL Server-specific, but is always supported there, whereas the standard "..." is only supported when SET QUOTED_IDENTIFIER ON is in effect in T-SQL. That is true by default, except in certain situations, such as when using sqlcmd.exe without -I.
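The delimiting rule itself is simple enough to sketch. This illustrative helper (in Python, not part of the original PowerShell script) mirrors what T-SQL's own QUOTENAME() function does: wrap the name in brackets and double any closing bracket that appears inside it, so the delimiter cannot be broken out of:

```python
def bracket_identifier(name: str) -> str:
    """Delimit a T-SQL identifier with brackets, doubling any closing
    bracket inside the name (mirrors T-SQL's QUOTENAME())."""
    return "[" + name.replace("]", "]]") + "]"

print(bracket_identifier("TempRpc-RC"))   # → [TempRpc-RC]
print(bracket_identifier("odd]name"))    # → [odd]]name]
```

With that, the command text becomes "RESTORE DATABASE [TempRpc-RC] FROM DISK = ...", which parses correctly. It is still identifier escaping rather than true parameterization, so the SQL-injection caveat above still applies.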

Laravel 5.5 - DB::statement error with \copy command (POSTGRES)

I'm trying to use the \copy command from Postgres using Laravel 5.5 to insert a large file into the DB, but I'm getting the error below.
I tried this way:
DB::statement( DB::raw("\\copy requisicoes FROM '".$file1."' WITH DELIMITER ','"));
Get this error:
SQLSTATE[42601]: Syntax error: 7 ERROR: syntax error at or near "\" LINE 1: \copy requisicoes FROM '/srv/www/bilhetagem_logs/bilhetagem_... ^ (SQL: \copy requisicoes FROM '/srv/www/bilhetagem_logs/bilhetagem_log1_2018-10-29' WITH DELIMITER ',')
Tried this way too:
DB::statement( DB::raw('\copy requisicoes FROM \''.$file1.'\' WITH DELIMITER \',\''));
Get this error:
SQLSTATE[42601]: Syntax error: 7 ERROR: syntax error at or near "\" LINE 1: \copy requisicoes FROM '/srv/www/bilhetagem_logs/bilhetagem_... ^ (SQL: \copy requisicoes FROM '/srv/www/bilhetagem_logs/bilhetagem_log1_2018-10-29' WITH DELIMITER ',')
If I execute the command returned in the error above with the psql command line, it works fine:
\copy requisicoes FROM '/srv/www/bilhetagem_logs/bilhetagem_log1_2018-10-29' WITH DELIMITER ','
Could somebody help me? :)
I have to use \copy instead of COPY because I don't have superuser privileges on the DB.
https://www.postgresql.org/docs/9.2/static/sql-copy.html
COPY naming a file is only allowed to database superusers, since it allows reading or writing any file that the server has privileges to access.
See this article on PostgreSQL and note this line:
Do not confuse COPY with the psql instruction \copy. \copy invokes
COPY FROM STDIN or COPY TO STDOUT, and then fetches/stores the data in
a file accessible to the psql client. Thus, file accessibility and
access rights depend on the client rather than the server when \copy
is used.
\copy is a psql instruction, so you cannot send it through a database driver; do not write \copy, just COPY.
This is my code to import data from a SQL database into a PostgreSQL database. First export a CSV file with the separator '^', then import the same file into pgsql using the COPY command:
$users = User::select('*')->get()->toArray();
$pages = "id,warehouse_id,name,email,email_verified_at,password,remember_token,created_at,updated_at\n";
foreach ($users as $where) {
    $pages .= "{$where['id']}^{$where['warehouse_id']}^{$where['name']}^{$where['email']}^{$where['email_verified_at']}^{$where['password']}^{$where['remember_token']}^{$where['created_at']}^{$where['updated_at']}\n";
}
$file = Storage::disk('local')->put('user.csv', $pages);
if ($file) {
    $data = "";
    try {
        $file_path = storage_path('app/user.csv');
        $data = DB::connection('pgsql')->statement("copy public.users (id, warehouse_id, name, email, email_verified_at, password, remember_token, created_at, updated_at) FROM '$file_path' DELIMITER '^' CSV HEADER ENCODING 'UTF8' ESCAPE '\"';");
    } catch (\Exception $e) {
        throw $e;
    }
}

sql server invalid precision on exists check query

Using SQL Server 2008, I am getting an invalid precision value error in the following Perl script:
use DBI;
$idx = '12345';
$query = 'if exists (select * from tbl where idx = ?) select top 10 * from tbl';
my $h = $dbh->prepare($query) or die "Couldn't prepare query: " . $dbh->errstr;
$h->execute($idx) or die "Couldn't execute statement: " . $h->errstr;
Note however that if I try this instead
use DBI;
$query = 'if exists (select * from tbl where idx = \'12345\') select top 10 * from tbl';
my $h = $dbh->prepare($query) or die "Couldn't prepare query: " . $dbh->errstr;
$h->execute() or die "Couldn't execute statement: " . $h->errstr;
then it works. I am really confused at how the ? in the query could possibly be causing an invalid precision error.
Thanks for any help anyone can provide.
Based on this article, please try the following:
You need to import the SQL type constants from DBI
and specify SQL_LONGVARCHAR as the type of the data to be added to the memo field.
To do that you do:
use DBI qw(:sql_types);
$h->bind_param(1, $idx, SQL_LONGVARCHAR);
Binding with a specific type overrides what DBD::ODBC decides. DBD::ODBC will bind the parameter based on what comes back from SQLDescribeParam. Sometimes SQL Server's SQLDescribeParam fails, especially in cases where you are using functions or subselects. The SQL Server ODBC driver takes your SQL and rearranges it to attempt to end up with something like "select idx from tbl" then it looks at the columns to answer SQLDescribeParam calls. I'm betting the SQL Server ODBC driver fails to rearrange your SQL in this case and either SQLDescribeParam failed or returned the wrong information. If you enable tracing in DBD::ODBC we could probably see this happening.
