SQL Server FORMAT create varchar with very large lenght [duplicate] - sql-server

This question already has answers here:
Why does VARCHAR need length specification?
(7 answers)
Closed 1 year ago.
It's not really a problem, but I'm curious to know why.
I am using SQL Server, and using FORMAT on a number returns an nvarchar of length 4000. Why?
select format(1,'00')
This query returns "01"

Consider a query like
drop table if exists #t
create table #t(val int, fmt varchar(20))
insert into #t(val,fmt) values (1,'00')
insert into #t(val,fmt) values (1,'0000000')
insert into #t(val,fmt) values (1,'0000000000000000000')
select format(val,fmt) formatted from #t
What data type should formatted be? It can't change per row, so a single type must be chosen to hold all the values. And nvarchar(4000) is a reasonable choice, as it can hold any string with up to 4000 characters.
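You can verify the return type yourself with SQL_VARIANT_PROPERTY, and cast the result down when the wide type gets in the way (a quick sketch; FORMAT requires SQL Server 2012 or later):
select sql_variant_property(format(1,'00'), 'BaseType')  -- reports the underlying type (nvarchar)
select sql_variant_property(format(1,'00'), 'MaxLength') -- reports the maximum length in bytes
select cast(format(1,'00') as varchar(10))               -- narrow the result explicitly if you need to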


SQL Server varchar(max) is not working properly [duplicate]

This question already has answers here:
SQL Server reducing the length of the string to 8000 characters
(2 answers)
Closed 2 years ago.
In the following example, the txt column length is not correctly shown.
CREATE TABLE dbo.test
(
Id int IDENTITY,
txt varchar(max)
);
insert into test(txt) select REPLICATE('0',8000);
insert into test(txt) select REPLICATE('00',8000);
select id, len(txt) from test;
drop table test;
The result shown for both was 8000.
Could anybody help?
If you want REPLICATE to produce more than 8,000 characters, the string being replicated must be explicitly converted to varchar(max) first - otherwise the result is truncated to 8,000 characters. Try this:
insert into test(txt) select REPLICATE(cast('00' as varchar(max)), 8000);
That should give you a string 16,000 characters long in the table.
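As a quick check (a sketch), LEN shows the truncation and the fix side by side:
select len(REPLICATE('00', 8000))                       -- 8000: the plain varchar input caps the result
select len(REPLICATE(cast('00' as varchar(max)), 8000)) -- 16000: the varchar(max) input removes the cap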

Converting bigint to smallint shows an error

ALTER TABLE employee
ALTER COLUMN emp_phoneNo SMALLINT;
I am trying to alter the data type from BIGINT to SMALLINT and it is showing this error:
Arithmetic overflow error converting expression to data type int.
I am not able to understand what is wrong.
You have existing rows with values in that specific column that are bigger than the new data type allows.
You need to update or delete the rows that are currently "oversized"
(or skip the column alteration altogether, since most likely you don't want to lose the information).
You can find the rows with this query:
SELECT 'CurrentlyOverSized' as MyLabel, * FROM dbo.employee WHERE ABS(emp_phoneNo ) > 32767
Note that a phone number like 5555555555 (the numeric form of 555-555-5555) is far greater than 32767.
Even 5555555 (for 555-5555, with no area code) is too big for 32767.
Also:
Number versus string for storing phone numbers is a debatable topic; check out this link for food for thought:
What datatype should be used for storing phone numbers in SQL Server 2005?
Personally, I think numeric is the wrong data type for phone numbers.
Whatever you do, be consistent. If you go with a string (varchar(xyz), for example), store them all in the same format: with no extra characters (5555555555), with hyphens (555-555-5555), with dots (555.555.5555), or whatever - but do them all the same would be my advice.
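If you decide to go the string route instead of shrinking the numeric type, the change might look like this (a sketch; pick a length that fits your longest formatted number):
ALTER TABLE employee
ALTER COLUMN emp_phoneNo varchar(20);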

Ms Sql Convert and insert between 2 databases

I have two databases. I insert data from DATABASE_2 into DATABASE_1, but I need to convert some columns.
I must convert Customer_Telephone_Number from varchar to bigint before inserting it.
So, my question is below.
SET IDENTITY_INSERT DATABASE_1.dbo.CUSTOMER_TABLE ON
INSERT INTO DATABASE_1.dbo.CUSTOMER_TABLE
(
Customer_Id,
Customer_Telephone_Number
)
Select
Customer_Id,
Customer_Telephone_Number -- This is varchar, so I need to convert it to bigint.
from
DATABASE_2.DBO.CUSTOMER_TABLE
Any help will be appreciated.
Thanks.
If the data stored is without spaces or other non numeric symbols:
Select
Customer_Id,
CONVERT(BIGINT,Customer_Telephone_Number)
from
DATABASE_2.DBO.CUSTOMER_TABLE
For instance, a value like (222)-3333-333 would fail, while 2223333333 would succeed.
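On SQL Server 2012 or later, TRY_CONVERT returns NULL instead of raising an error for values that cannot be converted, so you can find the problem rows before running the insert (a sketch):
SELECT Customer_Id, Customer_Telephone_Number
FROM DATABASE_2.dbo.CUSTOMER_TABLE
WHERE TRY_CONVERT(BIGINT, Customer_Telephone_Number) IS NULL
AND Customer_Telephone_Number IS NOT NULL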

SQL INSERT using tables where the columns have different data types and getting error converting

I have two tables and I would like to insert from one into the other. In my staging (source) table, every column is defined as nvarchar(300), and this restriction cannot change.
In my destination table, the columns have various types. I want, for example, to select a column from the source table (data type nvarchar(300)) and insert it into a column of type decimal(28, 16).
When this happens I get the following error:
Error converting data type nvarchar to numeric.
Even when I use a cast I get the error.
INSERT INTO Destination (
Weighting
)
VALUES (
CAST(src.Weighting AS decimal(28, 16))
)
Could null values be affecting this at all? Is there anything else to consider?
If all the data in your staging table column can be implicitly converted to the target data type, then you do not have to set up an explicit cast.
But if any one value cannot be converted implicitly (i.e. one cell contains a non-numeric or ill-formatted string value that is supposed to end up in a decimal column), the entire transaction will fail.
You can mitigate the risk of a failing transaction by setting up the insert like this:
INSERT
LiveTable (
VarcharCol,
DecimalCol,
NonNullableCol
)
SELECT
NvarcharCol1,
CASE WHEN ISNUMERIC(NvarcharCol2) = 1 THEN NvarcharCol2 END,
ISNULL(NvarcharCol3, '')
FROM
StagingTable
But clearly that carries the risk of losing potentially relevant data or numeric precision.
You can read which data types are implicitly convertible to each other on MSDN (scroll down to the matrix). For all other conversions you'll have to use CAST or CONVERT.
This will search for non-numeric strings:
select src.Weighting from src where isnumeric(src.Weighting) = 0
INSERT INTO Destination (Weighting)
SELECT CAST(src.Weighting AS decimal(28, 16))
FROM [Source] src
should work OK, provided your varchar values are in the correct format.
If the error still occurs, please give an example of value being converted.
NULLs will successfully convert to NULLs.
T-SQL has functions for casting or converting data to the type you want it to be. If the values in your source are strictly what you are trying to store them as in the destination table, and within the specifications of the destination table, you won't have much trouble.
If you have a column of numbers and one of them is 'three' instead of '3', it gets complicated. Here is a question about converting a varchar to a decimal.
An example: I can cast '123' as a varchar(20), then cast the varchar to a decimal with no problem when the value is appropriate.
SELECT cast(cast('123' as varchar(20)) as decimal(8,2))
However, if I try to convert a value containing a letter, it will give an error.
SELECT cast(cast('1a3' as varchar(20)) as decimal(8,2))
NULL will only be a problem if the target column does not allow nulls. I think the problem is that the string format cannot always be converted into a decimal; check whether the decimal separator is a comma instead of a point.
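On SQL Server 2012 and later, TRY_CAST is a safer alternative to ISNUMERIC for this kind of insert: it returns NULL for any value that cannot be converted, instead of failing the whole statement (a sketch using the question's table and column names):
INSERT INTO Destination (Weighting)
SELECT TRY_CAST(src.Weighting AS decimal(28, 16))
FROM [Source] src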

SQL Server Row Length

I'm attempting to determine the row length in bytes of a table by executing the following stored procedure:
CREATE TABLE #tmp
(
[ID] int,
Column_name varchar(640),
Type varchar(640),
Computed varchar(640),
Length int,
Prec int,
Scale int,
Nullable varchar(640),
TrimTrailingBlanks varchar(640),
FixedLenNullInSource varchar(640),
Collation varchar(256)
)
INSERT INTO #tmp exec sp_help MyTable
SELECT SUM(Length) FROM #tmp
DROP TABLE #tmp
The problem is that I don't know the table definition (data types, etc..) of the table returned by 'sp_help.'
I get the following error:
Insert Error: Column name or number of supplied values does not match table definition.
Looking at the sp_help stored procedure does not give me any clues.
What is the proper CREATE TABLE statement to insert the results of a sp_help?
How about doing it this way instead?
CREATE TABLE tblShowContig
(
ObjectName CHAR (255),
ObjectId INT,
IndexName CHAR (255),
IndexId INT,
Lvl INT,
CountPages INT,
CountRows INT,
MinRecSize INT,
MaxRecSize INT,
AvgRecSize INT,
ForRecCount INT,
Extents INT,
ExtentSwitches INT,
AvgFreeBytes INT,
AvgPageDensity INT,
ScanDensity DECIMAL,
BestCount INT,
ActualCount INT,
LogicalFrag DECIMAL,
ExtentFrag DECIMAL
)
GO
INSERT tblShowContig
EXEC ('DBCC SHOWCONTIG WITH TABLERESULTS')
GO
SELECT * from tblShowContig WHERE ObjectName = 'MyTable'
GO
Try this:
-- Sum up lengths of all columns
select SUM(sc.length)
from syscolumns sc
inner join systypes st on sc.xtype = st.xtype
where id = object_id('table')
-- Look at various items returned
select st.name, sc.*
from syscolumns sc
inner join systypes st on sc.xtype = st.xtype
where id = object_id('table')
No guarantees, but it appears to be the same length that sp_help 'table' reports.
DISCLAIMER:
Note that I read the article linked by John Rudy, and in addition to the maximum sizes here you also need other things, like the NULL bitmap, to get the actual row size. Also, the sizes here are maximum sizes: if you have a varchar column, the actual size is smaller on most rows.
Vendoran has a nice solution, but I do not see the maximum row size anywhere (based on table definition). I do see the average size and all sorts of allocation information which is exactly what you need to estimate DB size for most things.
If you are interested in just what sp_help returns for length and adding it up, then I think (I'm not 100% sure) that the query against syscolumns returns those same numbers. Do they represent the full maximum row size? No; you are missing things like the NULL bitmap. Do they represent a realistic measure of your actual data? No. Again, VARCHAR(500) does not take 500 bytes if you are only storing 100 characters. Also, TEXT fields and other fields stored separately from the row do not show their actual size, just the size of the pointer.
None of the aforementioned answers is correct or valid.
The question is one of determining the number of bytes consumed per row by each column's data type.
The only method(s) I have that work are:
exec sp_help 'mytable' - then add up the Length field of the second result set (If working from Query Analyzer or Management Studio - simply copy and paste the result into a spreadsheet and do a SUM)
Write a C# or VB.NET program that accesses the second resultset and sums the Length field of each row.
Modify the code of sp_help.
This cannot be done using Transact SQL and sp_help because there is no way to deal with multiple resultsets.
FWIW: The table definitions of the resultsets can be found here:
http://msdn.microsoft.com/en-us/library/aa933429(SQL.80).aspx
I can't help you with creating a temp table to store sp_help information, but I can help you with calculating row lengths. Check out this MSDN article; it helps you calculate such based on the field lengths, type, etc. Probably wouldn't take too much to convert it into a SQL script you could reuse by querying against sysobjects, etc.
EDIT:
I'm redacting my offer to do a script for it. My way was nowhere near as easy as Vendoran's. :)
As an aside, I take back what I said earlier about not being able to help with the temp table. I can: You can't do it. sp_help outputs seven rowsets, so I don't think you'll be able to do something as initially described in the original question. I think you're stuck using a different method to come up with it.
This will give you all the information you need
Select * into #mytables
from INFORMATION_SCHEMA.columns
select * from #mytables
drop table #mytables
UPDATE:
The answer I gave was incomplete, NOT incorrect. If you look at the data returned, you'd realize that you could write a query using CASE to calculate a row's size in bytes. It has all you need: the data type, size, and precision. BOL lists the bytes used by each data type.
I will post the complete answer when I get a chance.
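On recent versions of SQL Server, the same sum of declared column lengths can be written against the modern catalog views (a sketch; max_length is in bytes, and -1 marks (max) types stored off-row):
SELECT SUM(c.max_length)
FROM sys.columns c
WHERE c.object_id = OBJECT_ID('dbo.MyTable')
AND c.max_length <> -1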