SQL Server 2008 R2 Varbinary Max Size

What is the max size of a file that I can insert using varbinary(max) in SQL Server 2008 R2? I tried to change the length of the column to more than 8,000 bytes but it won't let me, so I'm guessing the max is 8,000 bytes. However, this article on MSDN says that the max storage size is 2^31-1 bytes:
varbinary [ ( n | max) ]
Variable-length binary data. n can be a value from 1 through 8,000. max indicates that the maximum storage size is 2^31-1 bytes. The storage size is the actual length of the data entered + 2 bytes. The data that is entered can be 0 bytes in length. The ANSI SQL synonym for varbinary is binary varying.
So how can I store larger files in a varbinary field? I'm not considering FILESTREAM, since the files I want to save are only 200 KB to 1 MB. The code I'm using:
UPDATE [table]
SET file = ( SELECT * FROM OPENROWSET ( BULK 'C:\A directory\A file.ext', SINGLE_BLOB) alias)
WHERE idRow = 1
I have been able to execute that code successfully for files of 8,000 bytes or less. If I try with a file of 8,001 bytes it fails. The table's "file" column is of type varbinary(8000), which, as I said, I can't change to a larger value.

I cannot reproduce this scenario. I tried the following:
USE tempdb;
GO
CREATE TABLE dbo.blob(col VARBINARY(MAX));
INSERT dbo.blob(col) SELECT NULL;
UPDATE dbo.blob
SET col = (SELECT BulkColumn
FROM OPENROWSET( BULK 'C:\Folder\File.docx', SINGLE_BLOB) alias
);
SELECT DATALENGTH(col) FROM dbo.blob;
Results:
--------
39578
If this is getting capped at 8K then I would guess that one of the following is true:
The column is actually VARBINARY(8000) (see the fix sketched below).
You are selecting the data in Management Studio and judging by the length of the data displayed there. Results to Text mode is limited to a maximum of 8,192 characters, so using DATALENGTH() directly against the column is a much better approach.
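If the first possibility is the culprit, you can widen the column in place rather than recreate the table; a minimal sketch, assuming the table and column names from the question:
ALTER TABLE [table] ALTER COLUMN [file] VARBINARY(MAX);
After that, the OPENROWSET ... SINGLE_BLOB update should accept files well past 8,000 bytes.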

I would dare to say: use FILESTREAM for files bigger than 1 MB, based on the following from MS TechNet | FILESTREAM Overview.
In SQL Server, BLOBs can be standard varbinary(max) data that stores
the data in tables, or FILESTREAM varbinary(max) objects that store
the data in the file system. The size and use of the data determines
whether you should use database storage or file system storage. If the
following conditions are true, you should consider using FILESTREAM:
Objects that are being stored are, on average, larger than 1 MB.
Fast read access is important.
You are developing applications that use a middle tier for application logic.
For smaller objects, storing varbinary(max) BLOBs in the database
often provides better streaming performance.
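If you do go the FILESTREAM route, here is a rough sketch of such a column; it assumes FILESTREAM is enabled on the instance and the database has a FILESTREAM filegroup, and the table and column names are hypothetical:
CREATE TABLE dbo.Documents
(
    id UNIQUEIDENTIFIER ROWGUIDCOL NOT NULL UNIQUE DEFAULT NEWID(), -- FILESTREAM requires a ROWGUIDCOL
    content VARBINARY(MAX) FILESTREAM NULL -- bytes live in the file system, not in data pages
);
To client code it still reads like an ordinary varbinary(max) column.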

"SET TEXTSIZE" Specifies the size of varchar(max), nvarchar(max), varbinary(max), text, ntext, and image data returned by a SELECT statement.
select @@TEXTSIZE
The SQL Server Native Client ODBC driver and SQL Server Native Client OLE DB Provider for SQL Server automatically set TEXTSIZE to 2147483647 when connecting. The maximum setting for SET TEXTSIZE is 2 gigabytes (GB), specified in bytes. A setting of 0 resets the size to the default (4 KB).
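A quick sketch of inspecting and raising the limit for the current session, using the values quoted above:
SET TEXTSIZE 2147483647; -- let SELECT return blobs up to the 2 GB maximum
SELECT @@TEXTSIZE;       -- now reports 2147483647
SET TEXTSIZE 0;          -- back to the default (4 KB)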
As mentioned, for big files you should prefer FILESTREAM.

Related

Get large string from database

I have a large string in my PostgreSQL database. It's a base64-encoded mp3, and I have to select the column with that large string and get all the data with one select. If I write a normal select
SELECT * FROM public.song_data WHERE id=1;
it returns just 204 kB of that string, although the string is 2.2 MB.
DataGrip also shows me just 204 kB of data from that string. Is there a way to get all the data with just one select?
That's strange. Are you sure your data was not trimmed somewhere? You can use the length function to check the actual size.
postgres=# select length('aaa');
┌────────┐
│ length │
╞════════╡
│      3 │
└────────┘
(1 row)
Two MB is nothing for Postgres, but some clients (or protocols) can have problems with it. Sometimes it is necessary to use the functions lo_import and lo_export as a workaround for client/protocol limits. For selecting data from a table you should use a SELECT statement; there is no other way. Theoretically you can transform any string to a large object and then download it from the database via the special large-object protocol with lo_export. For 2 MB that should not be necessary, I think.
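For reference, a hedged sketch of that large-object detour; lo_from_bytea and lo_export are standard Postgres functions, but the column name data is an assumption, and lo_export writes a file on the server, so it needs the corresponding privileges:
-- wrap the text in a large object and note the returned oid
SELECT lo_from_bytea(0, convert_to(data, 'UTF8')) FROM public.song_data WHERE id = 1;
-- then export that oid to a server-side file
-- SELECT lo_export(<oid>, '/tmp/song.b64');
Again, for 2 MB this should not be needed.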
Please check whether your data was stored in Postgres correctly. The theoretical limit for text and varchar is 1 GB; the practical limit is lower, about 100 MB. Either way, that is significantly more than 2 MB.
Postgres has a special data type for binary data: bytea. It converts to hex encoding by default, and base64 encoding is supported too.
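A tiny illustration of both encodings with the built-in decode/encode functions:
select encode(decode('SGVsbG8=', 'base64'), 'hex'); -- base64 'Hello' in, hex out: 48656c6c6f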
You can control the number of characters shown by using left with a negative value (which removes characters from the end) together with concat:
SELECT left(concat(public.song_data,' '), -1) from mytable;
The client was actually the problem. DataGrip can't print 2 MB of data. I tried other clients (DBeaver and HeidiSQL) and it was OK. Then I selected a 2 MB row with a PHP select and got all the data.

RODBC ERROR: 'Calloc' could not allocate memory

I am setting up a SQL Azure database. I need to write data into the database on a daily basis. I am using 64-bit R version 3.3.3 on Windows 10. Some of the columns contain text (more than 4,000 characters). Initially, I imported some data from a csv into the SQL Azure database using Microsoft SQL Server Management Studio. I set the text columns up as ntext, because when I tried nvarchar the max was 4,000 and some of the values got truncated even though they were only about 1,100 characters long.
In order to append to the database I first save the records in a temp table, for which I have predefined the varTypes:
varTypesNewFile <- c("Numeric", rep("NTEXT", ncol(newFileToAppend) - 1))
names(varTypesNewFile) <- names(newFileToAppend)
sqlSave(dbhandle, newFileToAppend, "newFileToAppendTmp", rownames = F, varTypes = varTypesNewFile, safer = F)
and then append them by using:
insert into mainTable select * from newFileToAppendTmp
If the text is not too long, the above works. However, sometimes I get the following error during the sqlSave command:
Error in odbcUpdate(channel, query, mydata, coldata[m, ], test = test, :
'Calloc' could not allocate memory (1073741824 of 1 bytes)
My questions are:
How can I counter this issue?
Is this the format I should be using?
Additionally, even when the above works, it takes about an hour to upload about 5k records. Isn't that too long? Is this the normal amount of time it should take? If not, what could I do better?
RODBC is very old, and can be a bit flaky with NVARCHAR columns. Try using the RSQLServer package instead, which offers an alternative means to connect to SQL Server (and also provides a dplyr backend).

How to import huge blob into SQL Server database?

I have a .csv file with one column of blob data (from Cassandra) that is binary data. That data can be huge - much more than 8,000 bytes.
I tried to set the source and destination data types to DT_BYTES -> binary/varbinary in the SQL Server Import Wizard, but it failed with an error saying the data would be truncated.
How to import such data?
You need to set the column type to varbinary(max), not plain varbinary, so that the column will accept more than 8,000 bytes. See the following Microsoft link.
varbinary [ ( n | max) ]
Variable-length binary data. n can be a value from 1 through 8,000.
max indicates that the maximum storage size is 2^31-1 bytes.
The storage size is the actual length of the data entered + 2 bytes.
The data that is entered can be 0 bytes in length.
The ANSI SQL synonym for varbinary is binary varying.
For Integration Services data types, see the following link. What you want is DT_IMAGE:
DT_IMAGE
A binary value with a maximum size of 2^31-1 (2,147,483,647) bytes.
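So on the SQL Server side the fix is to give the Import Wizard a varbinary(max) destination column; a minimal sketch with hypothetical table and column names:
CREATE TABLE dbo.CassandraBlobs
(
    id INT IDENTITY(1,1) PRIMARY KEY,
    payload VARBINARY(MAX) -- accepts up to 2^31-1 bytes, unlike varbinary(8000)
);
Map the DT_BYTES/DT_IMAGE source column onto payload and the truncation error should go away.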

What data type should I use instead of the 'ntext' data type?

I want to write a trigger for one of my tables which has an ntext field, and as you know, a trigger can't be written against an ntext column.
Now I want to replace the ntext with an nvarchar data type. The ntext maximum length is 2,147,483,647 characters, whereas nvarchar is limited to 4,000 characters.
What data type can I use instead of ntext?
Or is there any way to write a trigger when the table has an ntext column?
I should add that my database was designed back on SQL Server 2000 and is full of data.
You're out of luck with SQL Server 2000, but you could possibly chain together a bunch of nvarchar(4000) variables. It's a hack, but it may be the only option you have. I would also do an assessment of your data to see how large the values in that column actually get. A lot of the time, columns are sized in anticipation of a large data set that, in the end, never shows up.
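A quick way to run that assessment (table and column names are hypothetical; DATALENGTH works on ntext and returns bytes, and ntext stores two bytes per character):
SELECT MAX(DATALENGTH(myNtextColumn)) / 2 AS max_chars
FROM dbo.myTable;
If the maximum is comfortably under 4,000, plain nvarchar(4000) will hold everything.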
In MSDN I see this:
* Important *
ntext, text, and image data types will be removed in a future version of Microsoft SQL Server. Avoid using these data types in new development work, and plan to modify applications that currently use them. Use nvarchar(max), varchar(max), and varbinary(max) instead.
Fixed and variable-length data types for storing large non-Unicode and Unicode character and binary data. Unicode data uses the UNICODE UCS-2 character set.
and it prefers nvarchar(max). You can see the details below:
nvarchar [ ( n | max ) ]
Variable-length Unicode string data. n defines the string length and can be a value from 1 through 4,000. max indicates that the maximum storage size is 2^31-1 bytes (2 GB). The storage size, in bytes, is two times the actual length of data entered + 2 bytes. The ISO synonyms for nvarchar are national char varying and national character varying.
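On SQL Server 2005 or later, the migration itself is short; a minimal sketch with hypothetical names (test on a copy of the data first):
-- change the column type in place
ALTER TABLE dbo.myTable ALTER COLUMN myNtextColumn NVARCHAR(MAX);
-- optionally rewrite the values so they move out of the old text pages
UPDATE dbo.myTable SET myNtextColumn = myNtextColumn;
After that, triggers can reference the column like any other nvarchar.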

VarBinary vs Image SQL Server Data Type to Store Binary Data?

I need to store binary files to the SQL Server Database. Which is the better Data Type out of Varbinary and Image?
Since image is deprecated, you should use varbinary.
per Microsoft (thanks for the link, @Christopher)
ntext , text, and image data types will be removed in a future
version of Microsoft SQL Server. Avoid using these data types in new
development work, and plan to modify applications that currently use
them. Use nvarchar(max), varchar(max), and varbinary(max) instead.
Fixed and variable-length data types for storing large non-Unicode and
Unicode character and binary data. Unicode data uses the UNICODE UCS-2
character set.
varbinary(max) is the way to go (introduced in SQL Server 2005)
There is also the rather spiffy FileStream, introduced in SQL Server 2008.
https://learn.microsoft.com/en-us/sql/t-sql/data-types/ntext-text-and-image-transact-sql
image
Variable-length binary data from 0 through 2^31-1 (2,147,483,647) bytes.
It IS still supported to use the image data type, but be aware of:
https://learn.microsoft.com/en-us/sql/t-sql/data-types/binary-and-varbinary-transact-sql
varbinary [ ( n | max) ]
Variable-length binary data. n can be a value from 1 through 8,000. max indicates that the maximum storage
size is 2^31-1 bytes. The storage size is the actual length of the
data entered + 2 bytes. The data that is entered can be 0 bytes in
length. The ANSI SQL synonym for varbinary is binary varying.
So both are equal in size (2 GB). But be aware of:
https://learn.microsoft.com/en-us/sql/database-engine/deprecated-database-engine-features-in-sql-server-2016#features-not-supported-in-a-future-version-of-sql-server
Though the end of the image data type is still not determined, you should use the future-proof equivalent.
But you have to ask yourself: why store BLOBs in a column at all?
https://learn.microsoft.com/en-us/sql/relational-databases/blob/compare-options-for-storing-blobs-sql-server
