When creating an external table on SQL Server with wrong column mapping, I got this error:
105083;The following columns in the user defined schema are incompatible with the external table schema for table 'tasks':
user defined column: ([shared] BIT) is not compatible with the type of detected external table column: ([shared] NVARCHAR(5) COLLATE Latin1_General_CI_AS). The detected external table schema is:
(
[id] NVARCHAR(MAX) COLLATE Latin1_General_CI_AS NOT NULL,
[name] NVARCHAR(MAX) COLLATE Latin1_General_CI_AS,
[description] NVARCHAR(MAX) COLLATE Latin1_General_CI_AS,
[shared] NVARCHAR(5) COLLATE Latin1_General_CI_AS,
[deleted_at] DATETIME2(6)
).
This makes me wonder: is there a way to retrieve the detected external table schema with a query?
Or perhaps create an external table directly from the detected schema, so we don't need to specify the columns at creation time?
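As far as I know there is no documented way to get the detected schema before the table exists, but once an external table has been created you can read its column definitions back from the catalog views. A sketch (the table name 'tasks' is taken from the error message above):

```sql
-- Inspect the column definitions of an existing external table
SELECT c.name       AS column_name,
       t.name       AS type_name,
       c.max_length,
       c.is_nullable,
       c.collation_name
FROM sys.external_tables AS et
JOIN sys.columns AS c ON c.object_id = et.object_id
JOIN sys.types   AS t ON t.user_type_id = c.user_type_id
WHERE et.name = 'tasks';
```

This only works after a successful CREATE EXTERNAL TABLE, so it doesn't help with the failed creation itself.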
I have a table in a SQL Server 2008 database hosted on shared web hosting. I cannot change the collation of the database because I don't have permissions.
When I created the table, I set the collation for the columns I want, but it doesn't help: I still see ???? when I query the table. I tried nvarchar as well and it didn't work.
The table:
CREATE TABLE [dbo].[T_Client]
(
[ClientID] [int] IDENTITY(1,1) NOT NULL,
[ClientName] [varchar](200) COLLATE Hebrew_CI_AI NULL,
[Address] [varchar](200) COLLATE Hebrew_CI_AI NULL
)
You must ensure that the data is passed all the way to SQL Server in a format with compatible code points. Since you don't have Hebrew as your database or instance collation, a varchar variable can't be used to store the data. So this
declare @d varchar(100) = 'שלום לך עולם' collate Hebrew_CI_AI
select @d
outputs
???? ?? ????
In this scenario you have to pass the value to the database as NVARCHAR:
declare @d nvarchar(100) = N'שלום לך עולם' collate Hebrew_CI_AI
select @d
You could use a varchar column with a Hebrew collation to store the data, but you should just use an nvarchar column. Still use the collation to get the desired sorting and comparison semantics.
The problem is your INSERT/UPDATE statements. Unless you define those values as nvarchar, any characters outside the database's collation will be lost, which means you need to declare your parameters as nvarchar. As a result I would suggest not changing the collation of the columns, but instead changing them to nvarchar and using nvarchar throughout your code.
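A minimal sketch of that suggestion, keeping the Hebrew collation for comparison semantics but switching the columns to nvarchar (column names taken from the question; the inserted values are only illustrative):

```sql
CREATE TABLE [dbo].[T_Client]
(
    [ClientID]   int IDENTITY(1,1) NOT NULL,
    [ClientName] nvarchar(200) COLLATE Hebrew_CI_AI NULL,
    [Address]    nvarchar(200) COLLATE Hebrew_CI_AI NULL
);

-- The N prefix keeps the literal as Unicode all the way to the server
INSERT INTO [dbo].[T_Client] ([ClientName], [Address])
VALUES (N'שלום לך עולם', N'רחוב הדוגמה 1');
```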
I created an EXTERNAL DATA SOURCE in Azure as an alternative to LINKED SERVERS, which cannot be created on the Azure platform.
CREATE MASTER KEY ENCRYPTION BY PASSWORD = 'secretpassword';
CREATE DATABASE SCOPED CREDENTIAL LinkedServerCredential
WITH IDENTITY = 'login_name', SECRET = 'login_password-here';
CREATE EXTERNAL DATA SOURCE LinkedProductionDb
WITH
(
TYPE=RDBMS,
LOCATION='azure.database.windows.net',
DATABASE_NAME='ProductionDb',
CREDENTIAL= LinkedServerCredential
);
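If you want to double-check what was created, the catalog views expose both objects. A quick sanity check (sketch):

```sql
-- Confirm the data source and credential exist
SELECT name, type_desc, location FROM sys.external_data_sources;
SELECT name, credential_identity FROM sys.database_scoped_credentials;
```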
Everything goes great: no errors, and the statements execute successfully. Now I create an EXTERNAL TABLE with the identical structure as the table that resides in the underlying database defined in the EXTERNAL DATA SOURCE, but I want it assigned to a different SCHEMA so I can distinguish it as a linked table in the target database. So I attempt this:
CREATE EXTERNAL TABLE [LNK].[Transactions](
TransactionId BIGINT NOT NULL,
CustomerId BIGINT NOT NULL,
SubscriptionId BIGINT NOT NULL,
ProductId BIGINT NOT NULL,
TransType VARCHAR(100),
TransKind VARCHAR(100),
Success BIT,
Amount MONEY,
GatewayUsed VARCHAR(100),
RecordImportDate DATETIMEOFFSET
)
WITH
(
DATA_SOURCE = LinkedProductionDb
)
GO
Now surprisingly, I am able to create this EXTERNAL TABLE without any issue, except that when I attempt to access the data via a simple query, I receive the error: "Error retrieving data from one or more shards. The underlying error message received was: 'Invalid object name 'LNK.Transactions'.'."
It only took me a few minutes to think about what I'd done and why I received the error. Obviously, since the table in the original database isn't created under the LNK schema, it is an invalid object.
So is there any way or methodical practice that I can use to distinguish my EXTERNAL TABLES from my physical tables within my database?
Obviously, among other things, this is one of the advantages of having separately defined objects as LINKED SERVERS within SQL.
I know I cannot be the only person who sees the benefit of segregating the objects to clearly distinguish them when they both need to reside in the same database.
It is possible to create the external table with a different schema or name from the one used in the source database by using the WITH options SCHEMA_NAME and OBJECT_NAME.
From Microsoft:
CREATE EXTERNAL TABLE [ database_name . [ schema_name ] . | schema_name . ] table_name
( { <column_definition> } [ ,...n ] )
WITH ( <rdbms_external_table_options> )
[;]
<rdbms_external_table_options> ::=
DATA_SOURCE = <External_Data_Source>,
[ SCHEMA_NAME = N'nonescaped_schema_name',] -- This is what we want
[ OBJECT_NAME = N'nonescaped_object_name',] -- ^
In the WITH clause, if you specify SCHEMA_NAME and OBJECT_NAME, you can name the EXTERNAL TABLE something other than what the source database calls it.
Example:
CREATE EXTERNAL TABLE [LNK].[Transactions](
TransactionId BIGINT NOT NULL,
CustomerId BIGINT NOT NULL,
SubscriptionId BIGINT NOT NULL,
ProductId BIGINT NOT NULL,
TransType VARCHAR(100),
TransKind VARCHAR(100),
Success BIT,
Amount MONEY,
GatewayUsed VARCHAR(100),
RecordImportDate DATETIMEOFFSET
)
WITH
(
DATA_SOURCE = LinkedProductionDb,
SCHEMA_NAME = 'dbo', -- This is the name of the schema on the host database
OBJECT_NAME = 'Transactions' -- Name of the table on the host database
)
GO
Now you can use this table as LNK.Transactions (assuming the schema LNK exists).
I achieved this by creating a view on the external server with the [schema].[table] that I wanted to segregate. In your case, you'd create a view called [LNK].[Transactions] in LinkedProductionDb, and then everything works.
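A sketch of that approach, assuming the source table is dbo.Transactions in LinkedProductionDb and you are allowed to create schemas there:

```sql
-- Run on the source database (LinkedProductionDb)
CREATE SCHEMA LNK;
GO
-- The view's schema and name match the external table,
-- so the default name resolution works without SCHEMA_NAME/OBJECT_NAME
CREATE VIEW [LNK].[Transactions]
AS
SELECT TransactionId, CustomerId, SubscriptionId, ProductId,
       TransType, TransKind, Success, Amount,
       GatewayUsed, RecordImportDate
FROM dbo.Transactions;
GO
```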
We have a PowerBuilder app that ran fine on the 2000 database before we migrated to 2005. Now we get the following error:
sqlstate 5001 error "There is already an object named PKCURSOR in the
database"
The partial code below has been modified by adding DROP CONSTRAINT PKCURSOR. The error no longer pops up for two of the DBAs who have PowerBuilder installed and run the app from their network drive. The other user runs it from her network drive and still gets the error. I've also made that user a dbo, and she still gets the error.
Any ideas?
ALTER PROCEDURE [dbo].[GUMBO_SP_PROP_ACTUAL_ACCOMPLISHMENTS_PF]
@PROG_YEAR CHAR(4)
AS
DECLARE @PROGRAM_YEAR CHAR(4),
@SUM_LOW_MOD_PERSONS_PROPOSED INTEGER,
@SUM_LOW_MOD_PERSONS_ACTUAL INTEGER,
@SUM_TOTAL_PERSONS_PROPOSED INTEGER,
@SUM_TOTAL_PERSONS_ACTUAL INTEGER,
@ERROR_STRING CHAR(132),
@ACTIVITY_CODE CHAR(3)
CREATE TABLE #ACCOMPLISHMENTS(
PROGRAM_YEAR CHAR(4) NOT NULL,
ACTIVITY_CODE CHAR(3) NOT NULL,
TOTAL_PERSONS_PROPOSED DECIMAL(18,2) DEFAULT 0,
TOTAL_PERSONS_ACTUAL DECIMAL(18,2) DEFAULT 0,
LOW_MOD_PERSONS_PROPOSED DECIMAL(18,2) DEFAULT 0,
LOW_MOD_PERSONS_ACTUAL DECIMAL(18,2) DEFAULT 0
)
-- Alter the temporary table to have a primary key of application number and program year.
ALTER TABLE #ACCOMPLISHMENTS
ADD CONSTRAINT PKCURSOR PRIMARY KEY (PROGRAM_YEAR, ACTIVITY_CODE)
DECLARE ACTIVITY_CURSOR CURSOR FOR
SELECT dbo.ACTIVITY_CODE.activity_code
FROM dbo.ACTIVITY_CODE
WHERE
(dbo.ACTIVITY_CODE.activity_code LIKE 'P%%')
and (dbo.ACTIVITY_CODE.activity_code <> 'P01')
ORDER BY dbo.ACTIVITY_CODE.activity_code
ALTER TABLE #ACCOMPLISHMENTS
DROP CONSTRAINT PKCURSOR
This might happen because a constraint with the same name already exists in the same schema. To find which table it is on, run the following query:
SELECT
DISTINCT
Constraint_Name AS [Constraint],
Table_Schema AS [Schema],
Table_Name AS [TableName]
FROM INFORMATION_SCHEMA.KEY_COLUMN_USAGE
WHERE CONSTRAINT_NAME = 'PKCURSOR'
Solution:
Add the code below to drop the key if it exists; put this snippet after the CREATE TABLE #ACCOMPLISHMENTS part of your stored procedure.
IF EXISTS(select * from sys.key_constraints
WHERE name ='PKCURSOR')
ALTER TABLE #ACCOMPLISHMENTS
DROP CONSTRAINT PKCURSOR
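Note that the likely root cause is the explicitly named constraint: constraint names on temp tables must be unique across all sessions in tempdb, so two users running the procedure at the same time can collide on PKCURSOR. One way to avoid the collision entirely is to let SQL Server generate the name by declaring the primary key inline and unnamed (a sketch of the same table):

```sql
CREATE TABLE #ACCOMPLISHMENTS(
    PROGRAM_YEAR CHAR(4) NOT NULL,
    ACTIVITY_CODE CHAR(3) NOT NULL,
    TOTAL_PERSONS_PROPOSED DECIMAL(18,2) DEFAULT 0,
    TOTAL_PERSONS_ACTUAL DECIMAL(18,2) DEFAULT 0,
    LOW_MOD_PERSONS_PROPOSED DECIMAL(18,2) DEFAULT 0,
    LOW_MOD_PERSONS_ACTUAL DECIMAL(18,2) DEFAULT 0,
    -- Unnamed PK: SQL Server generates a unique name in tempdb
    PRIMARY KEY (PROGRAM_YEAR, ACTIVITY_CODE)
)
```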
I have a database where I have to convert all varchar columns to nvarchar. I have been able to convert every table except one. Before converting to nvarchar I drop all constraints (foreign keys, primary keys, unique keys, check constraints, default constraints) and indexes. I have also deleted all the data before altering to nvarchar.
The problem is that I am getting the error
ALTER TABLE ALTER COLUMN "Permanent Address" failed because one or more objects access this column.
When I execute the drop-and-create table statements it works, but when converting to nvarchar with the ALTER command I get the error.
I have the following scripts
--The actual table definition
USE [Customer]
GO
--Permanent_Address is a computed column
CREATE TABLE [dbo].[Customer_Contact](
[Customer_id] int NOT NULL,
[Present_Address] [varchar](250) NOT NULL,
[Permanent_Address] AS ([Present_Address]))
--Alter statement to convert from varchar to nvarchar
ALTER TABLE [Customer_Contact] ALTER COLUMN [Present_Address] NVARCHAR(250) NOT NULL
ALTER TABLE [Customer_Contact] ALTER COLUMN [Permanent_Address] NVARCHAR(250) NOT NULL
Result: ALTER TABLE ALTER COLUMN "Permanent Address" failed because one or more objects access this column.
--Drop and Create the table script
IF EXISTS (SELECT * FROM sys.objects WHERE object_id = OBJECT_ID(N'[dbo].[Customer_Contact]') AND type in (N'U'))
DROP TABLE [dbo].[Customer_Contact]
GO
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE TABLE [dbo].[Customer_Contact](
[Customer_id] int NOT NULL,
[Present_Address] [Nvarchar](250) NOT NULL,
[Permanent_Address] AS ([Present_Address]))
--The table is created successfully with Nvarchar Datatype
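An alternative to dropping and recreating the whole table is to drop only the computed column, alter the base column, and re-add the computed column afterwards. A sketch, assuming nothing else (indexes, views, constraints) references [Permanent_Address]:

```sql
ALTER TABLE [dbo].[Customer_Contact] DROP COLUMN [Permanent_Address];
ALTER TABLE [dbo].[Customer_Contact] ALTER COLUMN [Present_Address] NVARCHAR(250) NOT NULL;
ALTER TABLE [dbo].[Customer_Contact] ADD [Permanent_Address] AS ([Present_Address]);
```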
Is it possible to have collate SQL_Latin1_General_CP1_CS_AS in table variable columns' definitions?
The reason I want to do this is that I have case-sensitive information in my source table, but when I insert it into the table variable there is a problem with the primary key (it is clustered): duplicate values are detected, like 'All' and 'ALL'.
That's why I am trying to find a way to make the table variable columns case sensitive too, since the following statement:
SELECT SERVERPROPERTY ('Collation')
gives me: "SQL_Latin1_General_CP1_CI_AS"
Yes it is possible. You can specify the collation for each column when you declare your table variable.
declare @T table
(
Col varchar(20) collate SQL_Latin1_General_CP1_CS_AS
)
Yes. It took something like 2 minutes to write the following script:
declare @T table (
ID int not null,
Val1 varchar(10) collate SQL_Latin1_General_CP1_CS_AS not null primary key
)
insert into @T(ID,Val1) values (1,'All'),(2,'ALL')
insert into @T(ID,Val1) values (3,'All')
The first statement inserts both rows (the case-sensitive collation treats 'All' and 'ALL' as distinct), and the second statement then fails with a primary key violation.