Determining number of characters in SQL Script - sql-server

I'm looking to parameterize a SQL script that contains more than 8000 characters. Since a variable can only hold 8000 characters, is there a way to determine how many characters are in a specific script, so that I have some foresight about when I should start a new variable?
Any ideas?

I have had many cases like this, and the simplest free tool for the job is Notepad++. Just copy your script there and start selecting characters (you can press Ctrl+A first to see whether the script exceeds 8000 characters at all). There is a "Sel" counter in the status bar at the bottom; when it reaches about 8000 characters, just end your current variable and start a new one.
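If you'd rather not count characters by hand, the split itself can also be scripted. Here is a minimal Python sketch (the function name is my own, and a real split should ideally land on statement boundaries rather than cut mid-keyword):

```python
def split_script(script: str, limit: int = 8000) -> list[str]:
    """Split a long SQL script into chunks that each fit inside
    a varchar(8000) variable."""
    return [script[i:i + limit] for i in range(0, len(script), limit)]

chunks = split_script("SELECT 1;" * 2000)   # ~18000 characters
print([len(c) for c in chunks])             # -> [8000, 8000, 2000]
```

Each chunk can then be assigned to its own variable; note that a naive character split like this can break a keyword or string literal in half, so in practice you would shift each cut point to the nearest statement boundary.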

Related

BCP Fixed Width Import -> Unexpected EOF encountered in BCP data-file?

I have some sensitive information that I need to import into SQL Server that is proving to be a challenge. I'm not sure what the original database that housed this information was, but I do know it is provided to us in a Unix fixed length text file with an LF row terminator. I have two files: a small file that covers a month's worth of data, and a much larger file that covers 5 years' worth of data. I have created a BCP format file and command that successfully imports and maps the data to my SQL Server table.
The 5 year data is supposedly in the same format, so I've used the same command and format file on that text file. It starts processing records, but somewhere in the processing (after several thousand records) it throws "Unexpected EOF encountered". I can see in the database that some of the rows are mapped correctly according to the fixed lengths, but then something goes horribly wrong and parts of the data are inserted into columns they most definitely do not belong in. Is there a character that would cause BCP to mess up and terminate early?
BCP Command: BCP DBTemp.dbo.svc_data_temp in C:\Test\data2.txt -f C:\test\txt2.fmt -T -r "0x0A" -S "stageag,90000" -e log.rtf
Again, the format file and command work perfectly for the smaller data set, but something in the 5 year dataset is tripping up BCP.
Thanks in advance for the replies!
So I found the offending characters in my fixed width file. Whoever pulled the data originally (I don't have access to the source) escaped, or failed to escape correctly, the double quotes in some of the text, injecting extra spaces that broke the fixed width layout we were supposed to be following. After correcting the double quotes by hex editing the file, BCP was able to process all records using the format file without issue. I had used the -F and -L flags to examine specific rows of the data and narrow it down to where I could visually compare the rows that were OK against the rows where the problems started, which led me to discover the double quotes issue. Hope this helps somebody else with a similar issue!
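For anyone hunting a similar problem, a quick way to locate offending rows before resorting to a hex editor is to scan the file for lines whose length deviates from the fixed record width. A minimal Python sketch (the 10-byte width and the demo file contents are made up for illustration):

```python
import os
import tempfile

def find_bad_rows(path: str, expected_width: int):
    """Report (line number, length) for rows whose length differs from
    the fixed record width -- a common symptom of stray quotes/spaces."""
    bad = []
    with open(path, "rb") as f:
        for lineno, raw in enumerate(f, start=1):
            row = raw.rstrip(b"\n")          # LF row terminator
            if len(row) != expected_width:
                bad.append((lineno, len(row)))
    return bad

# demo: three 10-byte rows, the second corrupted with extra spaces
demo = b"AAAAAAAAAA\nBBBB    BBBBBB\nCCCCCCCCCC\n"
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(demo)
print(find_bad_rows(f.name, 10))   # -> [(2, 14)]
os.unlink(f.name)
```

Running this against the real data file (with the actual record width from your format file) gives you the row numbers to pass to BCP's -F and -L flags.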

What's the size limit for SQL Server SQLCMD mode :setvar?

I know to enclose a :setvar string value in double quotes, but what's the size limit there? Also, can it handle multi-line strings?
:SETVAR myWebsiteURL "https://something.somewhere.net"
:SETVAR myLongString "This is a long string that goes longer than this box but I am not sure if it will still work in a case like this one where I just keep typing willynilly"
I still don't know the definitive answer to my questions, but I have done some testing in SQL Server 2016 SP2 and found that string lengths work at least up to 11160 characters, which is enough for my purposes. I'm not sure how to do it multiline, but SQL Server Management Studio query windows allow at least 11160 characters on a single line.
Here's what I used to test (note that you need the single quotes to make it a valid string and that you have to escape double quotes with another double quote):
:SETVAR myLongString "'This is a long string that goes longer than this box but I am not sure if it will still work in a case like this one where I just keep typing willynilly'"
select $(myLongString)
select len($(myLongString))

SAS ExportPackage command exceeds 8191 characters

We have an automated process for exporting metadata items for promotion, using the ExportPackage commandline utility (documented here).
The command is written to a .bat file, and then executed (in SAS) via a filename pipe.
We recently observed strange behaviour when exporting multiple objects (around 60), which we believe is due to the Windows line length limitation for batch commands.
Basically, one character would be removed (meaning that particular object would not be found), but the rest of the line (after 8191 chars) executes successfully.
Am interested to know:
Can the ExportPackage command be executed in a way that does not hit the 8191 limitation?
Alternatively, can the ExportPackage command be split over multiple lines somehow?
Or is there some way to pass a file to the -objects parameter, rather than space separated values?
Or is it possible to append to (rather than replace) an .spk file?
I doubt there's any answer to this that you're going to like. The documentation you linked states that existing package files with the same names are overwritten and does not mention any way of appending to one.
You can split the command over multiple lines within the batch file using ^ characters, but this still doesn't get around the overall 8191 character limit after recombining the pieces.
Therefore, you will need to do one or more of the following:
Export your items to separate packages with different filenames or in different folders, e.g. 20 at a time
Move your objects into a limited set of folders and subfolders before exporting, and export only the top level folders rather than the individual objects. It looks as though you can still use the other command line options to limit which objects are exported.
Silly option: create a dummy object with dependencies on all of the objects you want to export, mention only that one explicitly in the objects list, and use the -includeDep parameter to force the export utility to export all its dependencies.
Disclaimer: I have never actually used the export utility in question.
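The first option above (separate packages, e.g. 20 objects at a time) can be sketched as follows. This is illustrative Python only; the object names, package file names, and batch size are made up:

```python
# hypothetical metadata objects to export
obj_list = ['"/Shared Data/object%02d(Table)"' % i for i in range(1, 61)]

def batches(items, size=20):
    """Yield successive groups of at most `size` items."""
    for i in range(0, len(items), size):
        yield items[i:i + size]

# one ExportPackage command per group, each well under the 8191 limit
cmds = [
    'ExportPackage -package "export_%02d.spk" -objects %s'
    % (n, " ".join(group))
    for n, group in enumerate(batches(obj_list), start=1)
]
print(len(cmds))   # -> 3
```

Each command can then be written to the .bat file on its own line, keeping every line comfortably below the 8191-character limit.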
I got around this issue by inserting a dummy character at the 8191 point. Note that this 8191-character limit applies to everything in the command AFTER "ExportPackage".
One solution is therefore as follows:
/* If the ExportPackage command line is more than 8191 characters, it will
fail due to a windows line length limitation. To avoid this, add a
hash character at the 8191 point.
*/
%let log= \path\to\my.log;
%let profile= -profile "\path\to\my\dummy\profile.swa";
%let package= -package "\path\to\my\desired.spk";
%let str=&my_list_of_objects; /* previously defined */
%let breakpoint=%eval(
8191 - %length(%str(&profile &package -objects))-1);
%if %length(&str)>=&breakpoint %then %let objects=%substr(
&str,1,&breakpoint-1)#%substr(&str,&breakpoint,%length(&str)-&breakpoint+1);
%else %let objects=&str;
The command could then be executed along the lines of:
ExportPackage &profile &package &objects -subprop -includeEmptyFolders -log &log
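The macro's arithmetic translates directly into other languages; here is the same workaround as an illustrative Python sketch (the 8191 limit and the # dummy character come from the answer above, the function name is mine):

```python
LIMIT = 8191  # Windows cmd.exe line length limitation

def insert_dummy(objects: str, prefix: str, dummy: str = "#") -> str:
    """Insert a dummy character just before the 8191 point so that no
    object name spans the boundary where a character gets dropped."""
    breakpoint_ = LIMIT - len(prefix) - 1
    if len(objects) >= breakpoint_:
        # mirrors the 1-indexed SAS %substr logic above
        return objects[:breakpoint_ - 1] + dummy + objects[breakpoint_ - 1:]
    return objects
```

The returned string takes the place of &objects in the generated command line; short object lists pass through unchanged.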
What DIDN'T work:
Inserting spaces between quotes at the 8191 point, eg:
"object1" "object2"
The extra spaces were ignored and a character was still removed from the second object.
Inserting spaces within the literal, eg:
"object1 " "object2"
Object 1 was not found, presumably due to the trailing spaces.

Stable text feed for vim through vimserver

I am searching for a highly stable way to feed text (the output of a program) into vim through vimserver. Assume that I have started a (g)vim session with gvim --servername vim myfile. The file myfile contains a (unique) line OUT: which marks the position where the text should be pasted. I can achieve this straightforwardly from the command line with vim --servername vim --remote-send ':%s/OUT:/TEXT\\rOUT:/<Enter>', and I can repeatedly feed more text using the same command. Inside a C program I can execute it with system(). However, TEXT, which is dynamic and arbitrary (received as a stream in the C program), needs to be passed on the command line, and hence it needs to be escaped. Furthermore, with the replacement command %s, vim will jump to the position where TEXT is inserted. I would like to find a way to paste large chunks of arbitrary text seamlessly into vim. One idea is to have vim read from a POSIX pipe with :r pipe and to write the string from within the C program to the pipe. Ideally the solution would be such that I can continue editing the same file manually without noticing that output is being added at OUT:, as long as that location is outside the visible area.
The purpose of this text feed is to create a command-line based front end for scripting languages. Blocks of input are entered manually by the user in a vim buffer and sent to the interpreter through a pipe using vim's :! [interpreter] command. The [interpreter] can of course write the output to stdout (preceded by the original lines of input), in which case the input lines are replaced by input and output (distinguished by some leading key characters, for instance). However, commands might take a long time to produce the actual output, while the user might want to continue editing the file. Therefore my idea is to have [interpreter] return OUT: immediately and to append subsequent lines of output at this place as they become available, using vimserver. However, the output must be inserted in a way which does not disturb or corrupt edits possibly made by the user at the same time.
EDIT
The proposed solutions seem to work.
However there seem to be at least two caveats:
* if I send text two or more times this way, the `` part of the commands will not take me back to the original cursor position (even if I do it just once, the markers are still modified, which may interrupt the user editing the file manually)
* if the user opens a different buffer (e.g. the online help), the commands will fail (or maybe insert the text in the present buffer)
Any ideas?
EDIT: After actually trying, this should work for you:
vim --servername vim --remote-send \
":set cpo+=C<CR>/OUT:<CR>:i<CR>HELLO<CR>WORLD<CR>.<CR>\`\`"
As far as I can see, the only caveats would be a period on a line by itself, which would terminate :insert instead of being inserted, and <..> sequences that might be interpreted as keypresses. Also, you need to replace any newlines in the text with <CR>. However, you have no worries about regular expressions getting muddled, the input is not the command line, the amount of escaping necessary is minimal, and the jumping is compensated for with the double backticks.
Check out :help 'cpoptions', :help :insert, and :help ''.
Instead of dealing with the escaping, I would rather use lower-level functions. Use let lnum = search('^OUT:$') to locate the marker line, then call setline(lnum, text) followed by call append(lnum, 'OUT:') to insert the text and restore the marker line. (Note that setline() with a list, such as setline(lnum, [text, 'OUT:']), would overwrite the line after the marker rather than insert.)
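Building on that lower-level idea, the whole call can be driven from the feeding program's side via --remote-expr instead of --remote-send, so no keystrokes are simulated at all. A sketch in Python for brevity (the function names are mine; search() with the 'n' flag finds the marker without moving the cursor, and append() before the marker line leaves the marker intact — whether this fully avoids the caveats mentioned in the EDIT above would need testing):

```python
import subprocess

def vim_escape(text: str) -> str:
    """Escape for a single-quoted Vim string, where '' stands for '."""
    return text.replace("'", "''")

def build_expr(text: str) -> str:
    """Build a Vim expression that inserts `text` just above the
    OUT: marker without moving the cursor."""
    items = ",".join("'" + vim_escape(line) + "'"
                     for line in text.split("\n"))
    return "append(search('^OUT:$', 'n') - 1, [%s])" % items

def feed_text(servername: str, text: str) -> None:
    # assumes a session started with: gvim --servername vim myfile
    subprocess.run(
        ["vim", "--servername", servername,
         "--remote-expr", build_expr(text)],
        check=True,
    )
```

Because the text travels as a Vim string literal rather than as keystrokes, lone periods and <..> sequences need no special handling; only single quotes must be doubled.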

Can you make SQLCMD immediately run each statement in a script without the use of "GO"?

When running a script in SQLCMD, is there a setting, parameter or any other way of making it effectively run a GO command at the end of each SQL statement?
This would prevent the need to insert periodic GO commands into very large scripts.
No.
SQLCMD would have to implement a full T-SQL parser itself in order to determine where the statement boundaries are. As it currently exists, all it has to do is find lines with "GO" on them.
Determining statement boundaries can be quite tricky, given that ";" terminators are still optional for most statements.
Hmmm, I guess this is not going to help but is an answer to your question:
http://msdn.microsoft.com/en-us/library/ms162773.aspx
-c cmd_end
Specifies the batch terminator. By default, commands are terminated and sent to SQL Server by typing the word "GO" on a line by itself. When you reset the batch terminator, do not use Transact-SQL reserved keywords or characters that have special meaning to the operating system, even if they are preceded by a backslash.
Especially the last sentence sounds daunting....
If all your statements end in ; and there is no ; anywhere else (in text fields, for example), try
sqlcmd -c ;
