From what was suggested here, I am trying to pipe the output from sqlcmd to 7zip so that I can save disk space when dumping a 200GB database. I have tried the following:
> sqlcmd -S <DBNAME> -Q "SELECT * FROM ..." | .\7za.exe a -si <FILENAME>
This does not seem to be working even when I leave the system for a whole day. However, the following works:
> sqlcmd -S <DBNAME> -Q "SELECT TOP 100 * FROM ..." | .\7za.exe a -si <FILENAME>
and even this one:
> sqlcmd -S <DBNAME> -Q "SELECT * FROM ..."
When I remove the pipe symbol, I can see the results and can even redirect them to a file, which finishes in 7 hours.
I am not sure what is going on when piping a large amount of output, but what I understand so far is that 7zip seems to wait to consume the whole input before it creates an archive file (I don't see a file being created at all), so I am not sure whether it is actually performing on-the-fly compression. So I tried gzip, and here's my experience:
> echo "Test" | .\gzip.exe > test.gz
> .\gzip.exe -d test.gz
gzip: test.gz: not in gzip format
I am not sure I am doing this the right way. Any suggestions?
Oh boy! It was PowerShell all along! I have no idea why this happens, at least with gzip: gzip kept complaining that the input was not in gzip format. I switched over to the normal Command Prompt and everything started working.
I had observed this before. It looks like | and > behave differently in PowerShell and Command Prompt. I'm not sure exactly what the difference is, but if someone knows, please add it here.
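For what it's worth, the difference seems to be that cmd.exe pipes and > pass raw bytes between processes, while Windows PowerShell (at least through 5.x) decodes a native program's output into strings and re-encodes it (as UTF-16 text for >), which corrupts binary streams such as gzip or 7-Zip output. A minimal check from cmd.exe, assuming gzip.exe and 7za.exe are in the current directory and using a hypothetical dump.7z file name:
REM Run these from cmd.exe, not PowerShell
echo Test | .\gzip.exe > test.gz
.\gzip.exe -d test.gz
REM The same applies to the original pipeline:
sqlcmd -S <DBNAME> -Q "SELECT * FROM ..." | .\7za.exe a -si dump.7z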
I am trying to run the below command from a Linux box:
sqlcmd -S<server> -d<db> -U<login> -P<pwd> -i /scripts/StoredProcedure1.sql -o /logs/output.log
If the SP is executed in SSMS, it would normally give:
Commands completed successfully.
Completion time: 2022-06-12T18:28:18.4580782-04:00
However, this is not getting captured in output.log, which is zero bytes.
How do I capture that output too?
I tried these, in different combinations:
-r[0 | 1] (msgs to stderr) - it did not work
-m error_level - it did not work
https://ss64.com/nt/syntax-redirection.html - it did not work
Maybe I'm missing something silly or am just using the wrong combination. Could anyone help here?
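One hedged suggestion (not from the original post): the "Commands completed successfully." and "Completion time" lines are, as far as I know, printed by SSMS itself, so sqlcmd will not emit them. Adding a PRINT at the end of StoredProcedure1.sql gives sqlcmd something to write to the log, and redirecting both stdout and stderr in the shell (instead of using -o) catches anything that -r moves to stderr:
sqlcmd -S<server> -d<db> -U<login> -P<pwd> -i /scripts/StoredProcedure1.sql > /logs/output.log 2>&1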
I have a small problem with BCP functionality in SQL Server 2012.
The thing is:
I'm loading a .jpg image (167 KB in size) using the below command:
INSERT [tabela_testowa] ( Data )
SELECT * FROM OPENROWSET (BULK N'C:\foty\ch6_MagicShop.jpg', SINGLE_BLOB) a;
and then I'm trying to export it back to disk using:
BCP "SELECT data FROM tabela_testowa WHERE ID = 1" queryout "C:\test\file.jpg" -T -n -d test
The file gets saved to disk no problem, and the size is also 167 KB, but it can't be opened like the original copy.
I don't know whether some parameter is wrong in the BCP export, or maybe it gets corrupted at the import stage?
Has anyone had similar problems?
Thank god. Thanks to @user_0's answer, @user3494351's cryptic answer and comment, and this ancient forum post, I finally figured this out after several hours of banging my head against the wall.
The issue is that BCP likes to add an extra 8 bytes to the file by default. This corrupts the file and makes it unable to be opened if you just use the native -n flag.
However, BCP allows you to specify a format file that lets you tell it not to add the extra 8 bytes. I have a table (to be used in a cursor) in SQL Server that has only ONE ROW and ONE COLUMN holding my binary data. The table must exist when you run the first command.
On the command line, first you need to do this:
bcp MyDatabase.MySchema.MyTempTable format nul -T -n -f formatfile.fmt
This creates formatfile.fmt in the directory you are in. I did it on the E:\ drive. Here's what it looks like:
10.0
1
1 SQLBINARY 8 0 "" 1 MyColumn ""
That 8 right there is the prefix length: the number of bytes bcp prepends to your data. It is the bastard that is corrupting your files. Change that sucker to a 0:
10.0
1
1 SQLBINARY 0 0 "" 1 MyColumn ""
Now just run your BCP script, drop the -n flag and include the -f flag:
bcp "SELECT MyColumn FROM MyDatabase.MySchema.MyTempTable" queryout "E:\MyOutputpath" -T -f E:\formatfile.fmt
BCP adds information to the file. It is only a few bytes, but you are not exporting just a jpg file.
You say 167 KB, but check the exact byte count, not the rounded size. There will be a difference.
You cannot export the image via BCP.
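To check that size difference exactly, here is a hedged cmd one-liner using the paths from the question (not part of the original answer); run it directly in cmd.exe, and use %%F instead of %F inside a batch file:
for %F in ("C:\foty\ch6_MagicShop.jpg" "C:\test\file.jpg") do @echo %~zF bytes - %F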
OK, so I solved the issue.
The format file has to be supplied using -f and the path to the file. It can be created by running bcp without specifying a format (so it prompts for each field) and telling it to save the format file to disk. Then we can use this format file, so bcp no longer needs to ask those questions, and the exported file itself has no additional data and can be opened without problems.
I've gotten some help with an earlier part of this batch file, but now I'm having trouble with the final component.
I've tried a few things with no success. I tried changing the CRLF to LF, which did nothing. I also tried rephrasing the commands a few ways, but I am still not getting anywhere. The following is my main batch file.
@echo on
REM delete deauth command file
SET "OutFile=C:\temp\Out2.txt"
IF EXIST "%OutFile%" DEL "%OutFile%"
plink -v -ssh *#x.x.x.x -pw PW -m "c:\temp\WirelessDump.txt" > "C:\temp\output.txt"
setlocal
for /f %%a in (C:\temp\output.txt) do >> "%OutFile%" echo wir cli mac-address %%a deauth forced
REM Use commands in out2 to deauth
plink -v -ssh *#x.x.x.x -pw PW -m "c:\temp\Out2.txt"
pause
Below this sentence is the command found in Out2, which I think is giving the actual trouble. The number of lines varies, but they are all this particular command, just with differing MACs.
wir cli mac-address xxxx.xxxx.xxxx deauth forced
If Out2 has only a single line, it runs fine, no issues. But when there are multiple lines, it fails with an error stating that the line has an invalid autocommand. It's almost as if it were reading it all as one contiguous command. As I mentioned above, I changed from CRLF to LF hoping IOS would like it better, but that failed. I've tried adding extra lines between the commands, and I've tried calling the login every time from that file.
I am hoping there is a way to pass the commands one at a time while keeping this down to a minimum number of files.
I had another thought, but it is kinda/very clunky: if there were a way to output each of those MAC deauth commands to its own file in a separate folder (out1, out2, out3), the BAT could then run all the generated files in that folder so that each one is a separate plink session.
Let me know if I need to change/add/elaborate on anything. Thanks in advance for anything you guys are willing to help with. I appreciate it.
EDIT: Martin has pointed out what the limitation actually is: Cisco devices do not accept blocks of commands through SSH. So I still have the same question really; I just need some help figuring out a workaround to this issue. I'm thinking the multiple-file solution I mentioned above may have some possibility, but I'm too much of a noob to know how to make that work. I'll update if I have any breakthroughs though. Thanks for any contributions!
It's actually a known limitation of Cisco: it does not support multiple commands in an SSH "exec" channel command.
Quoting section 3.8.3.6 ("-m: read a remote command or script from a file") of the PuTTY/Plink manual:
With some servers (particularly Unix systems), you can even put multiple lines in this file and execute more than one command in sequence, or a whole shell script; but this is arguably an abuse, and cannot be expected to work on all servers. In particular, it is known not to work with certain ‘embedded’ servers, such as Cisco routers.
Though you can probably still feed multiple commands to Plink input:
(
echo command 1
echo command 2
echo command 3
echo exit
) | plink -v -ssh user@host -pw password > output.txt
Or you can simply use an input file:
plink -v -ssh user@host -pw password < input.txt > output.txt
Similar question: A way of typing multiple commands in cmd.txt file using PuTTY batch against Cisco
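If each command really must go in its own session (as the asker's multiple-file idea suggests), a hedged batch sketch along these lines might work, since plink also accepts a single remote command as a trailing argument; nothing here is from the original answer, and it is untested against IOS:
@echo off
REM One plink session per MAC: each exec channel carries a single command.
for /f %%a in (C:\temp\output.txt) do (
    plink -ssh *@x.x.x.x -pw PW "wir cli mac-address %%a deauth forced"
)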
This works without cmd.exe and without using files:
function Invoke-PlinkCommandsIOS {
    param (
        # $Host is a reserved automatic variable in PowerShell, so the parameter is named $HostName here
        [Parameter(Mandatory=$true)][string] $HostName,
        [Parameter(Mandatory=$true)][System.Management.Automation.PSCredential] $Credential,
        [Parameter(Mandatory=$true)][string] $Commands,
        [Switch] $ConnectOnceToAcceptHostKey = $false
    )
    $PlinkPath = "$PSScriptRoot\plink.exe"
    # Pipe the command text to plink's stdin; -batch disables interactive prompts
    $Commands | & $PlinkPath -ssh -2 -l $Credential.GetNetworkCredential().UserName -pw "$($Credential.GetNetworkCredential().Password)" $HostName -batch
}
Usage: don't forget your exit and terminal length 0, or it will hang.
PS C:\> $Command = "terminal lenght 0
>> show running-config
>> exit
>> "
>>
PS C:\> Invoke-PlinkCommandsIOS -HostName ace-dc1 -Credential $cred -Commands $Command
....
Sounds like your file 'Out2.txt' has only LF at the end of each line. A simple way to convert that to CRLF is to use the MORE command, redirect the output to a new file, and then use the new file.
more Out2.txt > Out2CRLF.txt
I ran into the same issue when trying to pull the full list of ACLs on an ASA via plink in PowerShell.
Essentially, due to the abuse issue referenced in the documentation: https://the.earth.li/~sgtatham/putty/0.72/htmldoc/Chapter3.html#using-cmdline-m, I was getting inconsistent results in pulling the ACLs. Sometimes I would get 0, sometimes only 1 or 2, and sometimes I would get all of them. (I personally had about a 1-in-5 success rate.)
As I would occasionally be successful, I used a while loop that catches the unsuccessful attempts and retries. Just be sure to put some delay in the while loop to prevent it from spamming SSH connections too much.
It is not a good solution, but it worked as a last resort.
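A rough batch sketch of that retry idea (the original was done in PowerShell; the plink command, file name, and limits here are hypothetical):
@echo off
REM Retry until plink returns non-empty output, with a delay between attempts
set attempts=0
set size=0
:retry
set /a attempts+=1
plink -ssh -batch user@asa -pw %PW% "show access-list" > acl.txt
for %%F in (acl.txt) do set size=%%~zF
if %size% EQU 0 if %attempts% LSS 10 (
    timeout /t 10 /nobreak > nul
    goto retry
)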
I have looked all over the internet and can't seem to find a solution to this problem.
I am trying to output query results as a CSV using a combination of sqlcmd and Windows batch. Here is what I have so far:
sqlcmd.exe -S %DBSERVER% -U %DBUSER% -P %DBPASS% -d %USERPREFIX% -Q "SELECT Username, UserDOB, UserGender FROM TABLE" -o %USERDATA%\%USERPREFIX%\FACT_BP.CSV -h-1 -s","
Is there something I'm missing here? Some setting that only looks at the first column of the query results?
Any advice at all would be a huge help; I'm lost.
Here is the reference page from MSDN on SQLCMD.
http://technet.microsoft.com/en-us/library/ms162773.aspx
I placed this command in a batch file in C:\temp as go.bat.
sqlcmd -S(local) -E -dmaster ^
  -Q"select cast(name as varchar(16)), str(database_id,1,0), create_date from sys.databases" ^
  -oc:\temp\sys.databases.csv -h-1 -s,
Notice I hard coded the file name and removed the "" around the field delimiter.
I get the expected output below.
Either the command does not like the environment variables or something else is wrong. Please try my code as a baseline test. It works for SQL 2012.
Also, the row count ("N rows affected") is always dumped to the file. You must clear this out of the file. That is why I do not use SQLCMD for ETL.
Why not use BCP instead?
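For instance, a hedged bcp sketch of the same export in character mode with a comma terminator (the placeholders mirror the question, the dbo schema is assumed, and this is not from the original answer):
bcp "SELECT Username, UserDOB, UserGender FROM %USERPREFIX%.dbo.TABLE" queryout "%USERDATA%\%USERPREFIX%\FACT_BP.CSV" -c -t, -S %DBSERVER% -U %DBUSER% -P %DBPASS%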
I have written several articles on my website.
http://craftydba.com/?p=1584