The Server Explorer built into Visual Studio is a nice way to connect to your database, view existing data, and edit data. It's very useful for entering sample data into a database for testing and development purposes.
However, what if you have a "text" column and want to enter a long text field? For example, I have a text column in which I want to store a multi-line XML value. I'd like to be able to do multi-line editing from within the table data viewer.
I can't figure out how to do that. Does anyone know if it can be done, and if so, how?
I don't think it's possible. Management Studio displays multi-line text as a single string.
I don't think the editor supports multi-line input or editing. However, you can enter multi-line data by writing a simple INSERT statement. For example:
CREATE TABLE TableA (
id INT IDENTITY PRIMARY KEY,
data VARCHAR(MAX)
)
INSERT INTO TableA (data) VALUES
('row 1
row 2
row 3')
Then, just select the entire INSERT statement and all of the data, and hit F5 to execute it.
Also, FWIW, the TEXT data type has been deprecated; VARCHAR(MAX) is the replacement (it's better for many reasons).
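If an existing column is still TEXT, here is a minimal sketch of the switch, assuming the TableA example above had originally used TEXT for the data column:

ALTER TABLE TableA ALTER COLUMN data VARCHAR(MAX)
-- Optionally rewrite existing values so they are stored using the new type's storage format:
UPDATE TableA SET data = data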
I have a quite strange/unusual request for an SSRS report. The client wants to be able to paste in large lists of ID numbers from an Excel sheet column (normally < 100 values, but it can be as many as 20,000+ values) as a search parameter in the report.
Now, I know how to pass multi-value parameters from SSRS to a stored proc etc.; that's not the issue. The issue is with the requirement to literally paste a list of IDs into the multi-value parameter input box and then limit the dataset based on that list (rather than pre-populating the multi-value parameter with a list of values based on a query/SP as you normally would).
My question is: what would be the best method/approach to this problem? I have never had a similar ask in many years of SSRS development. I need to make the solution as "self-service" as possible too, ideally as easy as running an SSRS report from Report Manager. I know I could just import the Excel data into a table in the database and join to that, but ideally I'd like something the user can run without any tech input to import data or run SQL through SSMS to get the dataset.
When you copy/paste from Excel, it's just a tab-delimited string. You can configure a string parameter to allow multiple values and then, in the expression editor, split the string by tabs or by line breaks.
Here's how:
In the parameter properties, check the 'Allow multiple values' checkbox on the General tab.
Then, in the dataset properties, click the 'fx' (expression) button next to the parameter on the Parameters tab.
In the expression editor that appears, type the following:
=Split(Replace(Parameters!MyParameter.Value, vbTab, ""), vbCrLf)
This strips out the tab characters and then splits the string on line breaks. After that you can treat it like any multi-valued parameter.
If you have a typical multi-value parameter setup, you can just copy/paste the column from Excel directly into the parameter box; each cell copied from Excel is automatically placed on a new line.
So if your query looked something like
SELECT * FROM myTable WHERE EmployeeName IN (@empName)
and empName was your report parameter, then as long as no available values are configured you'll get an empty list when you click in the parameter field; just copy/paste directly from Excel and it will work.
I'm not sure if there would be any limits or how good performance would be, especially if copying thousands of values, but certainly for a reasonably small number of items this will work.
The only other way I can think of that means no real extra work for the user would be to have them drop the workbook into a specified folder (maybe with a subfolder based on their SSRS username), then use OPENROWSET to read the contents either directly into a dataset or, better still, into a permanent table with their username and the parameter value on each row.
The OPENROWSET statement could sit at the top of your main dataset query.
Then your query could do something simple like
DELETE FROM myParamValueTable WHERE UserName = @UserID
INSERT INTO myParamValueTable
SELECT * FROM (OPENROWSET .....) xl
SELECT * FROM myTable t
JOIN myParamValueTable p
ON t.EmployeeName = p.EmployeeName
WHERE p.UserName = @UserID
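For illustration, here is a hedged sketch of what the OPENROWSET read might look like. It assumes the ACE OLE DB provider is installed and 'Ad Hoc Distributed Queries' is enabled; the file path, sheet name, and Excel column name are hypothetical. Note that OPENROWSET only accepts literal arguments, so a per-user path would need dynamic SQL.

-- Hypothetical path/sheet/column names; requires the ACE provider and 'Ad Hoc Distributed Queries'.
INSERT INTO myParamValueTable (UserName, EmployeeName)
SELECT @UserID, xl.EmployeeName
FROM OPENROWSET(
         'Microsoft.ACE.OLEDB.12.0',
         'Excel 12.0;HDR=YES;Database=\\fileserver\paramdrop\jsmith\ids.xlsx',
         'SELECT EmployeeName FROM [Sheet1$]'
     ) AS xl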
In SSMS, I'm trying to add some test data by editing the top 200 rows. How do I get a UUID to paste into the cell? I'm trying not to have to write INSERT statements.
I have tried getting UUIDs from an online UUID generator (all of the formats it offers). Other Stack Overflow questions seem to relate to programming rather than issues with editing UUIDs. I've tried entering the UUID both with and without single quotes.
Unfortunately, the field is defined as binary(16), not null.
AFAIK, the SSMS edit grid doesn't allow pasting into binary fields.
If you define the column with the correct data type of uniqueidentifier your paste should work just fine.
https://learn.microsoft.com/en-us/sql/t-sql/data-types/uniqueidentifier-transact-sql?view=sql-server-2017
The fix isn't to use an INSERT statement (although that is a workaround for right now), the fix is to use the correct datatype.
As an added benefit, the uniqueidentifier column won't allow invalid values, while binary will handle any bytes you can jam in there.
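For example, here is a minimal sketch of a column using the correct type; the table and column names (dbo.Widgets, RowGuid) are hypothetical, and the NEWID() default means the grid can fill the value in for you:

CREATE TABLE dbo.Widgets (
    RowGuid UNIQUEIDENTIFIER NOT NULL
        CONSTRAINT DF_Widgets_RowGuid DEFAULT NEWID(),
    Name VARCHAR(50) NOT NULL
)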
The error message says:
You cannot use the results pane to set this field to values other than null
I would presume this means you cannot supply a GUID in the grid; you have to leave it null.
Does the column have a default value of NEWID()? If so, it should fill itself in when you commit the row (by moving focus to another row).
If not, try inserting data by writing an INSERT statement:
INSERT INTO table(list,of,columns,except,auto,generated,ones)
VALUES ('list','of','values'...)
Or by selecting/modifying them from elsewhere:
INSERT INTO table(list,of,columns,except,auto,generated,ones)
SELECT mixed,'list','of','values',and,columns
FROM othertable
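If the column has to stay binary(16), a hedged variation of that INSERT (the table and column names, dbo.MyTable and BinaryGuidColumn, are hypothetical):

-- NEWID() generates the GUID; CONVERT(BINARY(16), ...) fits it into the binary column.
INSERT INTO dbo.MyTable (BinaryGuidColumn, Name)
VALUES (CONVERT(BINARY(16), NEWID()), 'test row')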
I have a table [user], created in SQL Server Management Studio, with this structure:
id int PRIMARY KEY NOT NULL
login varchar(255) NOT NULL
password varchar(32) NOT NULL
Now, I want to insert the first user into the database. I right-click the table [user], choose Edit Top 200 Rows, and type the new user's values into the grid:
id | login | password
1 | admin | MD5('admin')
But after saving, the inserted password is the literal string MD5('admin'), whereas I expect the hash 21232f297a57a5a743894a0e4a801fc3.
How can I do that in Microsoft SQL Server Management Studio?
Thanks
P.S. I am using SQL Server 2008 Express 10.50.1600.1 and Microsoft SQL Server Management Studio 10.50.1600.1.
Hash values are byte arrays, not character strings. Use a VARBINARY column type.
Do not insert the MD5 of unsalted passwords. It takes about 2.96 seconds to reverse the hash back to the password with an online cracker. Use a properly salted password hash and store the salt in the table.
"Right click the table [user], choose Edit Top 200 Rows and type the new user values into the grid." Well, what do you expect? The table edit grid is a feature for entering values, and if you enter the string MD5('admin') then the value in the table will be... MD5('admin'). It is not an interactive function evaluator (aside from the fact that MD5 is not a SQL Server function...).
Do not reinvent the wheel, and especially do not reinvent a security wheel if you don't speak fluent crypto. Most frameworks have modules for membership management, e.g. Introduction to Membership.
You cannot evaluate expressions in the edit grid; you have to run a query to do the insert or update. To calculate the MD5 hash you can use:
CONVERT(VARCHAR(32), HashBytes('MD5', 'admin'), 2)
The Edit Top 200 Rows feature is for interactive data entry, where only values are accepted, not expressions that need to be evaluated before being stored.
If you want the actual value inserted to be the result of an expression, use a query window to insert the data. (I don't know if you used the Edit feature merely because you wanted to try that functionality or for some other reason, but if that was because you didn't know how to insert data using SQL, take a look at this manual.)
Also, as @Damien has correctly noted, there's no MD5 function in Transact-SQL. There is one called HASHBYTES, which can use various hashing algorithms, including MD5. But the result of this function is varbinary, not varchar. For MD5 specifically, it is varbinary(16). To store the direct result of HASHBYTES, therefore, you'll need to change the type of the password column accordingly.
So, change the type of the column, then open a new query window and type in a command (or statement) to insert the data. The one that should do the job for you might look something like this:
INSERT INTO [user] (id, login, password)
VALUES (1, 'admin', HASHBYTES('MD5', 'admin'));
To be fair, you may omit changing the column type, in which case you'll need to replace the simple HASHBYTES call like above with one like in @Garath's answer. Whether you really need to store the hashes as varchar(32) instead of varbinary(16) is an entirely different question, though.
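If you do keep the varchar(32) column, a hedged sketch combining the two approaches (same table and values as in the question):

-- Style 2 of CONVERT returns the hex digits without the 0x prefix,
-- giving the expected 21232f297a57a5a743894a0e4a801fc3.
INSERT INTO [user] (id, login, password)
VALUES (1, 'admin', CONVERT(VARCHAR(32), HASHBYTES('MD5', 'admin'), 2))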
OK, I'm using SQL Server 2008 and have a table field of type VARCHAR(MAX). The problem is that when saving information using Hibernate, the contents of the VARCHAR(MAX) field are getting truncated. I don't see any error messages on either the app server or the database server.
The content of this field is just a plain text file. The size of this text file is 383KB.
This is what I have done so far to troubleshoot this problem:
Changed the database field from VARCHAR(MAX) to TEXT; the same problem occurs.
Used the SQL Server Profiler, and I noticed that the full text content is being received by the database server, but for some reason the profiler freezes when trying to view the SQL with the truncation problem. Like I said, just before it freezes I did notice that the full text file content (383KB) is being received, so it seems that it might be a database problem.
Has anyone encountered this problem before? Any ideas what causes this truncation?
NOTE: I just want to mention that I'm simply going into SSMS, copying the TEXT field content, and pasting it into TextPad. That's how I noticed it's getting truncated.
Thanks in advance.
Your problem is that you think Management Studio is going to present you with all of the data. It doesn't. Go to Tools > Options > Query Results > SQL Server. If you are using Results to Grid, change "Maximum Characters Retrieved" for "Non XML data" (just note that Results to Grid will eliminate any CR/LF). If you are using Results to Text, change "Maximum number of characters displayed in each column."
You may be tempted to enter more, but the maximum you can return within Management Studio is:
65535 for Results to Grid
8192 for Results to Text
If you really want to see all the data in Management Studio, you can try converting it to XML, but this has issues also. First set Results To Grid > XML data to 5 MB or unlimited, then do:
SELECT CONVERT(XML, column) FROM dbo.table WHERE...
Now this will produce a grid result where the link is actually clickable. This will open a new editor window (it won't be a query window, so it won't have execute buttons, IntelliSense, etc.) with your data converted to XML. This means it will replace > with &gt;, etc. Here's a quick example:
SELECT CONVERT(XML, 'bob > sally');
Result: the grid shows the value as a clickable XML link. When you click it, a new window opens containing the value with XML entitization applied (bob &gt; sally). (It does kind of have IntelliSense, in that it validates the XML format, which is why you may see squiggly underlines.)
BACK AT THE RANCH
If you just want to sanity check and don't really want to copy all 383K elsewhere, then don't! Just check using:
SELECT DATALENGTH(column) FROM dbo.table WHERE...
This should show you that your data was captured by the database, and the problem is the tool and your method of verification.
(I've since written a tip about this here.)
Try using SELECT * FROM dbo.table FOR XML PATH.
I had a similar situation. I have an Excel sheet in which a couple of columns may have more than 255 characters, sometimes even 500. A simple workaround was to sort the rows of data, placing the rows with the most characters at the top; you really only need one such row. When SQL Server imports the data, it recognizes that the field holds more than 255 characters and imports the entire contents :)
Otherwise, the suggestion was to use regedit to change a specific registry value, which I didn't want to do.
Hope this helps
I have a table with a VARBINARY(MAX) field (SQL Server 2008 with FILESTREAM)
My requirement is that when I go to deploy to production, I can only supply my IT team with a group of SQL scripts to be executed in a certain order. A new table I am making in production has this VARBINARY(MAX) field. Usually with new tables, I will script out the CREATE TABLE script. And, if I have data I need to go with it, I will then script out the INSERT scripts. Not too complicated.
But with VARBINARY(MAX), the stored procedure I was using to generate the INSERT statements fails on that table. I tried selecting that field, printing it, copying it, converting it to hex, etc. The main issue is that it doesn't select all the data in the field. I check DATALENGTH([FileColumn]), and while the source row contains 1,004,382 bytes, the most I can get out of the copied or selected data when inserting it again is 8,000 bytes. So basically it is truncated (i.e. invalid) data...
How can I do this better? I tried Googling this like crazy but I must be missing something. Remember, I can't access the filesystem. This has to be all scripted.
If this is a one time (or seldom) thing to do, you can try scripting the data out from the SSMS Wizard as described here:
http://sqlblog.com/blogs/eric_johnson/archive/2010/03/08/script-data-in-sql-server-2008.aspx
Or, if you need to do this frequently and want to automate it, you can try the SQL# SQLCLR library (which I wrote and while most of it is free, the function you need here is not). The function to do this is DB_DumpData and it also generates INSERT statements.
But again, if this is a one time or infrequent task, then try the data export wizard that is built into Management Studio. That should allow you to then create the SQL script that you can run in Production. I just tested this on a table with a VARBINARY(MAX) field containing 3,365,964 bytes of data and the Generate Scripts wizard generated an INSERT statement with the entire hex string of 6.73 million characters for that one value.
UPDATE:
Another quick and easy way to do this in a manner that would allow you to copy / paste the entire INSERT statement into a SQL script and not have to bother with BCP or SSMS Export Wizard is to just convert the value to XML. First you would CONVERT the VARBINARY to VARCHAR(MAX) using the optional style of "1" which gives you a hex string starting with "0x". Once you have the hex string of the binary data you can concatenate that into an INSERT statement and that entire thing, when converted to XML, can contain the entire VARBINARY field. See the following example:
DECLARE @Binary VARBINARY(MAX) = CONVERT(VARBINARY(MAX),
    REPLICATE(
        CONVERT(NVARCHAR(MAX), 'test string'),
        100000)
    )

SELECT 'INSERT INTO dbo.TableName (ColumnName) VALUES (' +
       CONVERT(VARCHAR(MAX), @Binary, 1) + ')' AS [Insert]
FOR XML RAW;
Don't script from SSMS
bcp the data out/in, or use something like SSMS tools to generate INSERT statements
It's more than a bit messed up, but in the past and on the web I've seen this done using a base64-encoded string. You use an XML value to wrap the string, and from there you can convert it to a varbinary. Here's an example:
http://blogs.msdn.com/b/sqltips/archive/2008/06/30/converting-from-base64-to-varbinary-and-vice-versa.aspx
I can't speak personally to how effective or performant this is, though, especially for large values. Because it is at best an ugly hack, I'd tuck it away inside a UDF somewhere, so that if a better method is found you can update it easily.
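As a hedged sketch of what such a wrapper might look like (the function name is hypothetical; the xs:base64Binary cast is the technique from the linked post):

CREATE FUNCTION dbo.Base64ToVarbinary (@Base64 VARCHAR(MAX))
RETURNS VARBINARY(MAX)
AS
BEGIN
    DECLARE @Result VARBINARY(MAX);
    -- The XML value() method casts the base64 text to binary via xs:base64Binary.
    SELECT @Result = CAST('' AS XML).value(
        'xs:base64Binary(sql:variable("@Base64"))',
        'VARBINARY(MAX)');
    RETURN @Result;
END

For example, SELECT dbo.Base64ToVarbinary('dGVzdA==') should return 0x74657374 ('test').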
I have never tried anything like this before, but from the documentation for SQL Server 2008 R2, it sounds like using SUBSTRING will work to get the entire varbinary value, although you may have to work with it in chunks, using UPDATEs with the .WRITE clause to append the data.
Updating Large Value Data Types
Use the .WRITE (expression, @Offset, @Length) clause to perform a partial or full update of varchar(max), nvarchar(max), and varbinary(max) data types. For example, a partial update of a varchar(max) column might delete or modify only the first 200 characters of the column, whereas a full update would delete or modify all the data in the column.
For best performance, we recommend that data be inserted or updated in chunk sizes that are multiples of 8040 bytes.
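A hedged sketch of that chunked-append pattern, reusing the dbo.TableName and FileColumn names from earlier in the thread purely for illustration (the id column is assumed):

-- The column must already hold a non-NULL value before .WRITE can append to it.
INSERT INTO dbo.TableName (id, FileColumn) VALUES (1, 0x)

DECLARE @chunk VARBINARY(MAX) = 0x0123456789ABCDEF  -- each real chunk would ideally be a multiple of 8040 bytes

-- A NULL @Offset appends @chunk to the end of the existing value.
UPDATE dbo.TableName
SET FileColumn.WRITE(@chunk, NULL, NULL)
WHERE id = 1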
Hope this helps.