Suppose there is the following table in DSE 4.8.3 (Cassandra):
CREATE TABLE retail.test_orders2 (
order_id uuid PRIMARY KEY,
order_name text
);
select * from test_orders2;
order_id | order_name
--------------------------------------+------------
d60c23c3-15e9-4687-9088-0be402eb90e7 | hello test
(1 rows)
I am using the Simba Spark SQL Connector to connect Tableau to Cassandra, but I am getting the following error:
[Simba][Hardy] (61) Server returned error with no error message during operation: FetchResults TStatus.statusCode=ERROR_STATUS TStatus.infoMessages= TStatus.sqlState= TStatus.errorCode=0 TStatus.errorMessage="" TStatus.__isset.errorCode: false TStatus.__isset.errorMessage: false TStatus.__isset.infoMessages: false TStatus.__isset.sqlState: false
This error seems to be caused by the UUID column order_id in the test_orders2 table. How can this be fixed?
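One possible workaround (an assumption on my part, not verified against this driver version) is to keep the raw uuid type from reaching the connector by casting it to a string on the Spark SQL side, e.g. in a Tableau Custom SQL query:

```sql
-- Hypothetical Custom SQL: cast the uuid so the driver only sees a string
SELECT CAST(order_id AS string) AS order_id,
       order_name
FROM retail.test_orders2
```

If the cast works, the FetchResults error should disappear because no uuid-typed column is returned through the ODBC layer.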
Related
I am using the Apache Kafka JDBC sink connector to insert/update data from SQL Server to SQL Server, and I am getting the following error:
java.sql.BatchUpdateException: Cannot insert explicit value for identity column in table 'table_name'
when IDENTITY_INSERT is set to OFF.
at com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.executeBatch(SQLServerPreparedStatement.java:2075)
The Source Configuration
name=jdbc-mssql-prod-5
connector.class=io.confluent.connect.jdbc.JdbcSourceConnector
connection.url=jdbc:sqlserver:
connection.user=
connection.password=
topic.prefix=source_topic.
mode=timestamp
table.whitelist=A,B,C
timestamp.column.name=ModifiedDateTime
connection.backoff.ms=60000
connection.attempts=300
validate.non.null=false
# enter timestamp in milliseconds
timestamp.initial=-1
The Sink Configuration
name=mysql-sink-prod-5
connector.class=io.confluent.connect.jdbc.JdbcSinkConnector
tasks.max=1
topics=sink_topic_a,sink_topic_b
connection.url=jdbc:sqlserver:
connection.user=
connection.password=
insert.mode=upsert
delete.enabled=true
pk.mode=record_key
errors.log.enable=true
errors.log.include.messages=true
In the table, the primary key column and the identity column are the same.
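One hedged sketch of a partial workaround: the JDBC sink's fields.whitelist setting restricts which value fields get written, so if the identity column also arrives as a value field, excluding it may avoid the explicit insert (the column names below are placeholders). Note that this only filters value fields; with pk.mode=record_key the key column is still used in the upsert, so this may not be sufficient on its own.

```properties
# Sink-side sketch: write only the non-identity value fields
fields.whitelist=ColA,ColB,ModifiedDateTime
```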
I'm setting up a data pipeline to export from a Kusto table to a SQL Server table. The only problem is that the target table has two GENERATED ALWAYS columns. I'm looking for some help implementing a solution in Kusto.
This is the export statement:
.export async to sql ['CompletionImport']
h@"connection_string_here"
with (createifnotexists="true", primarykey="CompletionSearchId")
<|set notruncation;
apiV2CompletionSearchFinal
| where hash(SourceRecordId, 1) == 0
Which gives the error:
Cannot insert an explicit value into a GENERATED ALWAYS column in table 'server.dbo.CompletionImport'.
Use INSERT with a column list to exclude the GENERATED ALWAYS column, or insert a DEFAULT into GENERATED ALWAYS column.
So I'm a little unsure how to implement this in Kusto. Would I just add a project pipe that excludes the GENERATED ALWAYS columns? Or, ideally, how could I insert a DEFAULT value into the GENERATED ALWAYS SQL Server columns from a Kusto query?
Edit: I'm trying to use materialize() to create a temporary table in the cache and export this cached table. However, I can't find any documentation on this, and the operation is failing:
let dboV2CompletionSearch = apiV2CompletionSearchFinal
| project every, variable, besides, generated, always, ones;
let cachedCS = materialize(dboV2CompletionSearch);
.export async to sql ['CompletionImport']
h@"connect_string"
with (createifnotexists="true", primarykey="CompletionSearchId")
<|set notruncation;
cachedCS
| where hash(SourceRecordId, 1) == 0
with the following error message:
Semantic error: 'set notruncation;
cachedCS
| where hash(SourceRecordId, 1) == 0'
has the following semantic error: SEM0100:
'where' operator: Failed to resolve table or column expression named 'cachedCS'.
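The failure above is expected: the let bindings are defined outside the query that .export actually runs, so cachedCS is not visible inside it. A sketch of an alternative that keeps everything inside the exported query and drops the generated columns with the project-away operator (the generated-column names below are placeholders for your actual GENERATED ALWAYS columns):

```kusto
.export async to sql ['CompletionImport']
    h@"connection_string_here"
with (createifnotexists="true", primarykey="CompletionSearchId")
<| set notruncation;
   apiV2CompletionSearchFinal
   | where hash(SourceRecordId, 1) == 0
   // Drop the GENERATED ALWAYS columns so SQL Server fills them in itself
   | project-away GeneratedAlwaysCol1, GeneratedAlwaysCol2
```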
So I'm using PostgreSQL 9.5 to store a bunch of data in a table named ids. The data is stored like this:
id1 (UNIQUE, NOT NULL) | bool1 | bool2 | bool3 | id2 (not unique)
id1 and id2 are text variables.
When I try to insert new data into the database I use this query:
"INSERT INTO ids (id1, bool2, id2) VALUES (%s, TRUE, %s) ON CONFLICT (id1) DO UPDATE SET bool2 = TRUE, id2 = %s;"
Now when I try to insert this specific id1: c1316a6a-3bdd-4aeb-a1b6-b3044851880a__19,
which is unique (I double-checked), the database hangs. I am calling the query from Python with psycopg2, but even when I run the insert by hand (connecting to the database via CLI or GUI) the database still hangs.
However, if I create a similar database and use the exact same command, it works fine.
Any ideas as to why it hangs and how to fix it?
EDIT:
Thank you, a_horse_with_no_name. It was indeed a waiting transaction: I killed its pid and everything worked fine afterwards. Thanks again.
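For future readers, a sketch of how the blocking session can be found (on 9.5, pg_stat_activity still has the boolean waiting column; from 9.6 on you would look at wait_event instead):

```sql
-- Sessions stuck "idle in transaction" are the usual culprits
SELECT pid, state, query
FROM pg_stat_activity
WHERE state = 'idle in transaction';

-- After identifying the offending pid:
-- SELECT pg_terminate_backend(<pid>);
```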
I can't import data into Cassandra because I am using DSE Solr now, and it created a solr_query virtual column in my table.
So I tried COPY table FROM 'file' WITH SKIPCOLS = "solr_query";
but I am still getting the following error:
Failed to import 10 rows: ParseError - Invalid row length 9 should be 10 - given up without retries.
So how can I import the data while ignoring the solr_query column?
The COPY command accepts the list of columns to import. Try listing them explicitly, leaving out the solr_query column, and it should work:
COPY table (colA, colB, colC, ...) FROM 'file'
I'm new to Grails and GORM mapping, and I have something that looks like this.
I have two domain classes and I need to create a relationship between them, in such a way that no changes are made to the existing tables in my PostgreSQL database.
class Insurance{
Integer id
String osg_name
String osg_logo
String osg_email
String osg_link
static hasMany = [ insurancePackage: InsurancePackage]
static constraints = {
id(blank: false)
osg_name (blank: false, size: 0..155)
osg_logo (size: 0..155)
osg_email (blank: false, size: 0..100)
osg_link (size: 0..155)
}
static mapping = {
table name: "insurance", schema: "common"
version false
id generator: 'identity', column: 'osg_id', type: 'integer'
}
}
class InsurancePackage{
Integer id
Integer osg_id
String osgp_comment
Integer tpo_id
String osgp_link
String osgp_label
//static belongsTo = Insurance
static belongsTo = [insurance: Insurance]
static constraints = {
id(blank: false)
osg_id (blank: false)
osgp_comment (blank: false, size: 0..500)
tpo_id (blank: false)
osgp_link (blank: false, size: 0..155)
osgp_label (blank: false, size: 0..10)
}
static mapping = {
table name: "insurance_package", schema: 'common'
version false
id generator: 'identity', column: 'osgp_id', type: 'integer'
}
}
This is the error that I'm getting
Error 2015-07-16 13:38:49,845 [localhost-startStop-1] ERROR hbm2ddl.SchemaUpdate - Unsuccessful: alter table revoco.insurance_package add column insurance_id int4 not null
| Error 2015-07-16 13:38:49,845 [localhost-startStop-1] ERROR hbm2ddl.SchemaUpdate - ERROR: column "insurance_id " contains null values
| Error 2015-07-16 13:38:49,845 [localhost-startStop-1] ERROR hbm2ddl.SchemaUpdate - Unsuccessful: alter table revoco.insurance_package add constraint FK684953517A89512C foreign key (insurance_id ) references revoco.insurance
| Error 2015-07-16 13:38:49,845 [localhost-startStop-1] ERROR hbm2ddl.SchemaUpdate - ERROR: column "insurance_id " referenced in foreign key constraint does not exist
So I can't connect the two tables and I keep getting the same error. For some reason Grails is looking for insurance_id, which is not defined in the classes, and it tries to alter my tables, which I don't want to happen.
Grails is creating a new column in the insurance_package table that holds the foreign key to the insurance table (hasMany plus belongsTo → one-to-many).
The problem is that this column gets a NOT NULL constraint by default, but the table already contains data.
The question now is what to do with the rows already in the table: Grails wants to add the NOT NULL constraint but can't, because the column was just created and its values are all NULL.
You have three options, depending on your use case:
Delete the rows already in the table (probably not what you want).
Use your DB management tool to set a foreign key value for the existing rows, then restart the server; the error should disappear.
Make the insurance reference (belongsTo) in your InsurancePackage class nullable: true.
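For the third option, a minimal sketch of the constraint change in InsurancePackage (assuming the default property name insurance from the belongsTo map):

```groovy
static constraints = {
    // Let existing rows keep a NULL foreign key so the schema update
    // no longer fails when adding the insurance_id column
    insurance nullable: true
}
```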