On a Sybase ASE 15.7 database I'm trying to modify an image column from NULL to NOT NULL (I'm using SQSH, so ; is a valid terminator):
create table LOB_TEST (XML image NULL);
alter table LOB_TEST modify XML image NOT NULL;
Error message:
Msg 13907, Level 16, State 1
Server 'MYSERVER', Line 1
ALTER TABLE 'LOB_TEST' failed. You cannot modify column 'XML' to TEXT/IMAGE/UNITEXT type.
This works on an int type column:
create table NON_LOB_TEST (XML_ID int NULL);
alter table NON_LOB_TEST modify XML_ID int NOT NULL;
(0 rows affected)
Any clues why? I cannot find anything online. Thanks.
Text/image datatypes are stored very differently from the other datatypes internally, so it is no surprise that operations that work on an INT column do not work on a text/image column.
The documentation is not terribly clear on this point, but it implies that you cannot modify the nullability of a text/image column: http://infocenter.sybase.com/help/index.jsp?topic=/com.sybase.infocenter.dc36272.1600/doc/html/san1393050903443.html
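If you really need the column to be NOT NULL, the usual way around this restriction is to rebuild the table rather than alter it in place. A rough sketch, assuming you can take the table offline and have no dependent objects (LOB_TEST_NEW is a made-up name):
-- rebuild the table instead of altering the column in place
create table LOB_TEST_NEW (XML image NOT NULL);
-- NOT NULL means rows where XML is NULL cannot be carried over as-is
insert into LOB_TEST_NEW select XML from LOB_TEST where XML is not null;
drop table LOB_TEST;
exec sp_rename 'LOB_TEST_NEW', 'LOB_TEST';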
Backend code = Java, Hibernate, Maven, hosted in AEM.
DB = SQL db
Existing table had column of type INT.
Backup of the original table containing that column was made by Select * into backupTable from originalTable
Backup of the audit table (for the original table) containing that column was made by Select * into backupTable_AUDIT from originalTable_AUDIT
Column type INT was changed to VARCHAR(255) by dropping and recreating that column in originalTable.
Column type INT was changed to VARCHAR(255) by dropping and recreating that column in originalTable_AUDIT (see the SQL sketch after these steps).
All places in the code that used that column have been changed to accommodate VARCHAR.
BE code was rebuilt.
Code was deployed.
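For illustration, the backup and column-change steps above might have looked roughly like this (assuming SQL Server, given the nvarchar in the error below; mycolumn is the column named in that error):
-- backups of the original and audit tables
SELECT * INTO backupTable FROM originalTable;
SELECT * INTO backupTable_AUDIT FROM originalTable_AUDIT;
-- drop and recreate the column with the new type in both tables
ALTER TABLE originalTable DROP COLUMN mycolumn;
ALTER TABLE originalTable ADD mycolumn VARCHAR(255);
ALTER TABLE originalTable_AUDIT DROP COLUMN mycolumn;
ALTER TABLE originalTable_AUDIT ADD mycolumn VARCHAR(255);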
When trying to run the app, I get the error: "wrong column type encountered in column [mycolumn] in table [originalTable]; found [nvarchar (Types#NVARCHAR)], but expecting [int (Types#INTEGER)])"
After deleting backupTable_AUDIT => no error any more; all works fine.
As far as I know, each table in the DB schema has an id.
It seems the BE code/Hibernate was looking at the backup table's id?
Can somebody please explain why deleting the backup tables eliminated the error,
and which step was missed during deployment/backup that would have avoided it?
Many thanks
I've got a block of data with lat/longs and I'm trying to add a point to a Snowflake table from this data.
First I tried to accomplish this when creating the table:
create or replace table geo (
    best_lat double,
    best_lon double,
    geography geography as (ST_POINT(best_lon, best_lat)));
This errored out with SQL compilation error: error line 4 at position 2 Data type of virtual column does not match the data type of its expression for column 'GEOGRAPHY'. Expected data type 'GEOGRAPHY', found 'VARIANT'
Then I tried to add the column with:
alter table geo
add column geom geography as (select st_makepoint(best_lon, best_lat) from geo)
This errored out with SQL compilation error: Invalid virtual column expression [(SELECT ST_MAKEPOINT(GEO.BEST_LON, GEO.BEST_LAT) AS "ST_MAKEPOINT(BEST_LON, BEST_LAT)" FROM GEO AS GEO)]
Clearly I'm doing something wrong here. Can anyone provide some insight?
Snowflake doesn't really support calculated columns; what it does do is allow you to set a default value for a column, and this default can be a simple SQL expression. The syntax is documented here.
Because this isn't a pure calculated column, you can still insert values directly into the column, which will override the defined default value.
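As a sketch of that route: declare a plain GEOGRAPHY column and populate it when you load the rows, rather than in the column definition (staging_geo is a hypothetical source of the lat/long data):
-- plain column instead of a computed one
create or replace table geo (
    best_lat double,
    best_lon double,
    geom geography);
-- compute the point at load time; staging_geo is a made-up source table
insert into geo
select best_lat, best_lon, st_makepoint(best_lon, best_lat)
from staging_geo;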
I'm working on a legacy system using SQL Server in 2000 compatibility mode. There's a stored procedure that selects from a query into a virtual table.
When I run the query, I get the following error:
Error converting data type varchar to numeric
which initially tells me that something stringy is trying to make its way into a numeric column.
To debug, I created the virtual table as a physical table and started eliminating each column.
The culprit is a column called accnum, which stores a bank account number and has a source data type of varchar(21); I'm trying to insert it into a numeric(16,0) column, which obviously could cause issues.
So I made the accnum column varchar(21) as well in the physical table I created, and it imports 100%. I also added an additional column called accnum2 and made it numeric(16,0).
After the data was imported, I updated accnum2 to the value of accnum. Lo and behold, the update runs without an error, yet the same conversion wouldn't work in an insert into...select query.
I have to work with the data types provided. Any ideas how I can get around this?
Try using a conversion in your insert statement, like this:
SELECT [accnum] = CASE ISNUMERIC(accnum)
                      WHEN 0 THEN NULL
                      ELSE CAST(accnum AS NUMERIC(16, 0))
                  END
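Note that ISNUMERIC can return 1 for strings that still fail a CAST to NUMERIC (currency symbols, signs, 'e' notation), and TRY_CONVERT is not available at the 2000 compatibility level. A stricter digits-only guard is one option; a sketch, with target_table and source_table as placeholder names:
-- reject anything that is not purely digits before casting
INSERT INTO target_table (accnum2)
SELECT CASE
           WHEN accnum NOT LIKE '%[^0-9]%'       -- digits only
                AND LEN(accnum) BETWEEN 1 AND 16 -- fits numeric(16,0), not empty
               THEN CAST(accnum AS NUMERIC(16, 0))
           ELSE NULL
       END
FROM source_table;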
I've created a very simple migration that creates a table with an FK referencing a column on an existing table. The problem is that the migration creates an NCHAR datatype column, while the referenced column is of CHAR datatype, so the FK can't be created because the column datatypes differ.
Is there any way to enforce Laravel to use CHAR instead of NCHAR?
Thanks!
I've got a workaround for this issue; I got the idea from mikebronner's comment here: https://github.com/laravel/framework/issues/9636.
I've modified my migration to alter the cliente column's type, after it's been created, using raw SQL. This way I can override Laravel's default datatype of NCHAR when creating CHAR columns. The altered column can't have any constraints such as an FK or PK before being modified. Hope this helps anyone else having this problem in the future.
The following code is inside my migration file, right after the code that creates the table itself, inside the up() function.
Schema::table('UsuariosWeb', function ($table) {
    // Override the NCHAR type Laravel generated with plain CHAR,
    // before adding any key constraints on the column.
    DB::statement("ALTER TABLE UsuariosWeb ALTER COLUMN cliente CHAR(6) NOT NULL;");
    $table->primary('cliente');
    $table->foreign('cliente')->references('Cliente')->on('Clientes');
});
I am trying to execute a stored proc through SSIS and it gives me the following error:
[Execute SQL Task] Error: Executing the query "Exec sp1 ?" failed with the following error: "Procedure: sp1 Line: 66 Message: Cannot insert duplicate key row in object 'table.sq1' with unique index 'UIX_sp1_Key'.". Possible failure reasons: Problems with the query, "ResultSet" property not set correctly, parameters not set correctly, or connection not established correctly.
Actually, the stored proc sp1 truncates and reloads the data in a table.
I am not able to figure out where exactly it's trying to insert a duplicate record.
Thanks in advance.
Your data must have duplicate values across the key column(s). You could remove the unique index temporarily (or create another table with the same structure but without the index, though that would mean changing the stored procedure), then load the data into the table. Then use this SQL statement:
select keycol1 {,keycol2 {,keycol3} ...}, count(*)
from tablename
group by keycol1 {,keycol2 {,keycol3} ...}
having count(*) > 1
That will show you the data values that are duplicates across the key column(s).
If you are truncating the table before the load, then you must have duplicate data in the source.
Examine the source to see what's there; use Excel or Access or some such to assist if needed. Or drop the unique constraint, then query the staging table with an aggregate query.
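For instance, a hypothetical concrete version of the aggregate query above, assuming the unique index covers a single column named sp1_key and the data lands in a staging table named staging_sp1 (both names are made up for illustration):
-- sp1_key and staging_sp1 are hypothetical names
SELECT sp1_key, COUNT(*) AS dup_count
FROM staging_sp1
GROUP BY sp1_key
HAVING COUNT(*) > 1;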