How can I declare this table with varchar(max) in Firebird?
CREATE TABLE MyUser
(
Id INT, -- unique id of user
Signature VARCHAR(max),
Login VARCHAR(50),
UserPassword VARCHAR(100),
CONSTRAINT PK_MyUser PRIMARY KEY (Id)
);
COMMIT;
Is it possible?
Firebird doesn't have a VARCHAR(MAX) type. You either need to use VARCHAR(32765) (assuming you are using a single-byte character set) or VARCHAR(8191) (with UTF-8), or you need to use a BLOB SUB_TYPE TEXT (a blob type for text data).
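For unbounded text, BLOB SUB_TYPE TEXT is usually the better fit. A sketch of the table from the question using it:

```sql
CREATE TABLE MyUser
(
  Id INT, -- unique id of user
  Signature BLOB SUB_TYPE TEXT, -- text blob instead of VARCHAR(max)
  Login VARCHAR(50),
  UserPassword VARCHAR(100),
  CONSTRAINT PK_MyUser PRIMARY KEY (Id)
);
COMMIT;
```

Unlike VARCHAR, a text blob has no declared length limit, so it is the closest Firebird equivalent of SQL Server's VARCHAR(MAX).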
As far as I know, it isn't.
Use VARCHAR(32765); a Firebird VARCHAR can hold up to 32,765 bytes at most. Or you could also use a BLOB.
We are using the node-mssql package to insert into and read out of our azure mssql database.
The database is very simple, because it is just used as a key/value cache, where the value can be quite long and also contain special characters.
DB Schema:
create table cache
(
id int identity
constraint cache_pk
primary key nonclustered,
cacheKey varchar(500),
ttl bigint,
value varchar(max)
)
go
create index cache_cacheKey_index
on cache (cacheKey desc)
go
create index cache_ttl_index
on cache (ttl)
go
For some reason, when I insert values into "value", some special characters are not treated well.
Dash – example
turns into:
Dash example
I have seen the same thing happening with the french apostrophe.
I also tried to change the collation already, but that did not help.
Also tried it by using nvarchar(max) as column type.
This is the insert code (including the sql):
const result = await conn.request()
.input('key', mssql.VarChar, key)
.input('value', mssql.Text, value)
.input('ttl', mssql.BigInt, cacheEndTime)
.query(`INSERT INTO cache (cacheKey, value, ttl) VALUES (@key, @value, @ttl)`);
Can you please help with a correct table structure or sql statement to make this work?
I'm not sure if this will help you, but have you checked the collation of the table, the database, and the server? Collation exists at different levels.
The answer to your question is in one of these items:
Server collation
Table collation
Field collation
Cast the insert text
For example, if you create an nvarchar field (which I recommend if you have an international scenario), cast the inserted text like N'text to insert'.
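For instance (hypothetical column values), the N prefix marks the literal as Unicode, so the dash is not converted to the column's code page and lost:

```sql
-- Without N'', the en dash may be mangled on a non-Unicode column
INSERT INTO cache (cacheKey, value, ttl)
VALUES ('some-key', N'Dash – example', 1700000000);
```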
It will work ;)
I have found the answer.
Like @RealSeik and @Larnu already stated, it was probably not a problem with the database or the queries themselves, but rather an input problem.
I realized that node-mssql has a type for Unicode text, which takes care of casting it correctly.
So instead of mssql.Text I changed it to mssql.NText.
So now the insert command looks as follows:
const result = await conn.request()
.input('key', mssql.VarChar, key)
.input('value', mssql.NText, value)
.input('ttl', mssql.BigInt, cacheEndTime)
.query(`INSERT INTO cache (cacheKey, value, ttl) VALUES (@key, @value, @ttl)`);
I have also added collations to my other scripts, for good measure as well.
(that alone did not help, but for good measure)
ALTER DATABASE MyDbName
COLLATE Latin1_General_100_CI_AI_SC_UTF8 ;
create table cache
(
id int identity
constraint cache_pk
primary key nonclustered,
cacheKey varchar(500) COLLATE Latin1_General_100_CI_AI_SC_UTF8,
ttl bigint,
value varchar(max) COLLATE Latin1_General_100_CI_AI_SC_UTF8
)
go
create index cache_cacheKey_index
on cache (cacheKey desc)
go
create index cache_ttl_index
on cache (ttl)
go
Is it possible to create a user-defined type with a check constraint?
Something like:
create type json from nvarchar(max) check isjson(value)=1
You cannot define a user-defined type with a constraint as you have mentioned. You can define a table type with columns containing a CHECK constraint. Read on CREATE TYPE.
If you want to bind a rule to a column type, as you suggested, be aware that sp_bindrule might be removed in a future version. Reference
I would suggest defining a check constraint that uses the ISJSON() function, as given below:
CREATE TABLE TestJson
(
DocumentId BIGINT IDENTITY(1,1) PRIMARY KEY,
JsonText VARCHAR(max) CHECK (IsJson(JsonText) =1)
);
OK I can add a rule.
create rule json_rule
as
isjson(@range) = 1
GO
CREATE TYPE dbo.[json] FROM nvarchar(max) NULL
GO
EXEC sys.sp_bindrule @rulename=N'[dbo].[json_rule]', @objname=N'[dbo].[json]', @futureonly='futureonly'
I have a column of type float that contains phone numbers - I'm aware that this is bad, so I want to convert the column from float to nvarchar(max), converting the data appropriately so as not to lose data.
The conversion can apparently be handled correctly using the STR function (suggested here), but I'm not sure how to go about changing the column type and performing the conversion without creating a temporary column. I don't want to use a temporary column because we are doing this automatically a bunch of times in future and don't want to encounter performance impact from page splits (suggested here)
In Postgres you can add a "USING" option to your ALTER COLUMN statement that specifies how to convert the existing data. I can't find anything like this for TSQL. Is there a way I can do this in place?
Postgres example:
...ALTER COLUMN <column> TYPE <type> USING <func>(<column>);
Rather than use a temporary column in your table, use a (temporary) column in a temporary table. In short:
Create temp table with PK of your table + column you want to change (in the correct data type, of course)
select data into temp table using your conversion method
Change data type in actual table
Update actual table from temp table values
If the table is large, I'd suggest doing this in batches. Of course, if the table isn't large, worrying about page splits is premature optimization since doing a complete rebuild of the table and its indexes after the conversion would be cheap. Another question is: why nvarchar(max)? The data is phone numbers. Last time I checked, phone numbers were fairly short (certainly less than the 2 Gb that nvarchar(max) can hold) and non-unicode. Do some domain modeling to figure out the appropriate data size and you'll thank me later. Lastly, why would you do this "automatically a bunch of times in future"? Why not have the correct data type and insert the right values?
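A sketch of those steps for the phone-number column (the table name dbo.Customers, the key Id, and the column PhoneNumber are assumptions; adjust to your schema):

```sql
-- 1. Temp table keyed on the PK, holding the converted value.
--    STR avoids the scientific notation an implicit float->varchar cast can produce;
--    LTRIM strips STR's right-justification padding.
SELECT Id, LTRIM(STR(PhoneNumber, 20, 0)) AS PhoneText
INTO #converted
FROM dbo.Customers;

-- 2. Change the column type in place
ALTER TABLE dbo.Customers
ALTER COLUMN PhoneNumber varchar(20);

-- 3. Copy the cleanly converted values back over the implicit conversions
UPDATE c
SET c.PhoneNumber = t.PhoneText
FROM dbo.Customers c
JOIN #converted t ON t.Id = c.Id;

DROP TABLE #converted;
```

For a large table, run step 3 in batches keyed on Id ranges.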
In SQL Server:
CREATE TABLE dbo.Employee
(
EmployeeID INT IDENTITY (1,1) NOT NULL
,FirstName VARCHAR(50) NULL
,MiddleName VARCHAR(50) NULL
,LastName VARCHAR(50) NULL
,DateHired datetime NOT NULL
)
-- Change the datatype to support 100 characters and make NOT NULL
ALTER TABLE dbo.Employee
ALTER COLUMN FirstName VARCHAR(100) NOT NULL
-- Change datatype and allow NULLs for DateHired
ALTER TABLE dbo.Employee
ALTER COLUMN DateHired SMALLDATETIME NULL
-- Set SPARSE columns for Middle Name (sql server 2008 only)
ALTER TABLE dbo.Employee
ALTER COLUMN MiddleName VARCHAR(100) SPARSE NULL
http://sqlserverplanet.com/ddl/alter-table-alter-column
I have a SQL Server 2008 table which contains an external user reference currently stored as a bigint - the userid from the external table. I want to extend this to allow email address, open ID etc to be used as the external identifier. Is it possible to alter the column datatype from bigint to varchar without affecting any of the existing data?
Yes, that should be possible, no problem - as long as you make your VARCHAR field big enough to hold your BIGINT values :-)
You'd have to use something like this T-SQL:
ALTER TABLE dbo.YourTable
ALTER COLUMN YourColumnName VARCHAR(50) -- or whatever you want
and that should be it! Since all BIGINT values can be converted into a string, that command should work just fine and without any danger of losing data.
I want to build an employees table using SQL Server 2008, and in my table I want to have an ID for each employee.
I heard about GUID and I kind of understood that it's a data type, but I couldn't use it.
Could you please show me the way to use it...
By the way, let's say I want something like this:
CREATE TABLE Employees (
ID guid PRIMARY KEY,
Name NVARCHAR(50) NOT NULL
)
How can I do it? I want to benefit from it, but I couldn't find out how.
It's not called GUID in SQL Server. It's called uniqueidentifier
The type is called UNIQUEIDENTIFIER as others have pointed out already. But I think you absolutely must read this article before proceeding: GUIDs as PRIMARY KEYs and/or the clustering key
They are called uniqueidentifier
Don't use uniqueidentifier as primary key if clustered (which is the default for PKs)
Seriously: use a standard IDENTITY instead
You can also consider using NEWSEQUENTIALID() as the default value for your ID column, as it would be faster than using NEWID() to generate the GUIDs.
BUT (from the same link above):
If privacy is a concern, do not use this function. It is possible to guess the value of the next generated GUID and, therefore, access data associated with that GUID.
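A minimal sketch of that setup (constraint names are illustrative; note that NEWSEQUENTIALID() is only valid in a column DEFAULT, not as a standalone expression):

```sql
CREATE TABLE Employees
(
    ID uniqueidentifier NOT NULL
        CONSTRAINT DF_Employees_ID DEFAULT NEWSEQUENTIALID()
        CONSTRAINT PK_Employees PRIMARY KEY NONCLUSTERED,
    Name nvarchar(50) NOT NULL
);
```

Sequential GUIDs reduce the index fragmentation that random NEWID() values cause on insert.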
Practical demo, FWIW
DECLARE @guid1 AS uniqueidentifier
SET @guid1 = NEWID()
SELECT @guid1
The GUID in SQL Server is known as the UNIQUEIDENTIFIER data type. Below is the desired code.
CREATE TABLE Employees
(
Id UNIQUEIDENTIFIER PRIMARY KEY,
Name NVARCHAR (50) not null
)
GO