I cannot create a table in BigQuery due to a schema issue

I am having difficulty creating a table in BigQuery.
The table I tried to upload has the columns Id, SleepDay, TotalSleepRecords, TotalMinutesAsleep, and TotalTimeInBed.
First, I tried schema auto-detect, but it failed with this error:
Error while reading data, error message: Could not parse '4/12/2016 12:00:00 AM' as TIMESTAMP for field SleepDay (position 1) starting at location 65 with message 'Invalid time zone: AM'
So I entered the schema with "Edit as text", like this:
Id:INTEGER, SleepDay:DATETIME, TotalSleepRecords:INTEGER, TotalMinutesAsleep:INTEGER, TotalTimeInBed:INTEGER
Now I get this error instead:
Error while reading data, error message: Could not parse 'Id' as INT64 for field Id (position 0) starting at location 0 with message 'Unable to parse'
Isn't an Id like 1503960366 an integer?
How should I fix this?

Could not parse 'Id' as INT64 for field Id (position 0) starting at location 0 with message 'Unable to parse'
This error message suggests that the header row ('Id') is being read as an integer, too. There is an option to skip one or more header rows under "Advanced options" when adding a table to BigQuery; try entering 1 there to skip the top row.
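As for the earlier TIMESTAMP error: '4/12/2016 12:00:00 AM' is a 12-hour AM/PM timestamp, which BigQuery's default TIMESTAMP parser does not accept (hence 'Invalid time zone: AM'). A quick sketch of the matching format string, shown with Python's strptime purely as an illustration:

```python
from datetime import datetime

# '4/12/2016 12:00:00 AM' uses month/day/year and a 12-hour clock with
# an AM/PM marker, so the matching format needs %I and %p.
value = "4/12/2016 12:00:00 AM"
parsed = datetime.strptime(value, "%m/%d/%Y %I:%M:%S %p")
print(parsed)  # 2016-04-12 00:00:00
```

One workaround (an assumption on my part, not from the question) is to load SleepDay as STRING and convert it afterwards in BigQuery with PARSE_TIMESTAMP('%m/%d/%Y %I:%M:%S %p', SleepDay), which uses the same format elements.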


How to exclude a row from a result set in SQL Server?

I was reading about SQL injection in SQL Server and saw a technique that reveals information from the database via error messages.
Assume this is the vulnerable site:
www.victim.com/showproducts.aspx?category=bikes
When I put this code after the parameter value:
www.victim.com/showproducts.aspx?category=bikes AND 1=0/username
it triggers an error, because dividing an int by a string fails, and the error message leaks data from the username column:
Conversion failed when converting the varchar value 'admin' to data type int
It shows me the first value in the username column, e.g. admin.
To get another username, I tried to exclude the first value 'admin' like this:
www.victim.com/showproducts.aspx?category=bikes AND username not in ('admin') AND 1=0/username
but it still returns the first value 'admin'. Why? And how do I get the next value with this technique?

Talend parse Date "yyyy-MM-dd'T'HH:mm:ss'.000Z'"

I have an error parsing a date in Talend.
My input is an Excel file read as a String, and my output is a Date in the Salesforce format "yyyy-MM-dd'T'HH:mm:ss'.000Z'".
I have a tMap with this expression:
TalendDate.parseDate("yyyy-MM-dd'T'HH:mm:ss'.000Z'",Row1.firstDate)
but it throws the following error:
java.lang.RuntimeException: java.text.ParseException: Unparseable date: "2008-05-11T12:02:46.000+0000" at routines.TalendDate.parseDate(TalendDate.java:895)
Any help?
Thanks
In TalendDate.parseDate, the "pattern" parameter must match the format of the input String, not the format of the Date you want in the output.
You can try :
TalendDate.parseDate("yyyy-MM-dd'T'HH:mm:ss'.000+0000'", Row1.firstDate)
Formatting of Date output is accessible in the 'schema' menu, in "Date Model" column.
Or try this, with a pattern that matches the milliseconds and offset in the input:
TalendDate.parseDate("yyyy-MM-dd'T'HH:mm:ss.SSSZ", Row1.firstDate);
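The input string "2008-05-11T12:02:46.000+0000" can be sanity-checked against a matching pattern outside Talend; a quick sketch in Python (an illustration only, not Talend code):

```python
from datetime import datetime

# The incoming value carries milliseconds and a +0000 offset, so the
# parse pattern must account for both (.%f for the fraction, %z for the offset).
value = "2008-05-11T12:02:46.000+0000"
parsed = datetime.strptime(value, "%Y-%m-%dT%H:%M:%S.%f%z")
print(parsed.isoformat())  # 2008-05-11T12:02:46+00:00
```

The same idea applies in Java's SimpleDateFormat syntax: the fraction and zone must be consumed by the pattern (.SSS and Z), not left to a literal that does not match.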

Import JSON into ClickHouse

I create a table with this statement:
CREATE TABLE event(
date Date,
src UInt8,
channel UInt8,
deviceTypeId UInt8,
projectId UInt64,
shows UInt32,
clicks UInt32,
spent Float64
) ENGINE = MergeTree(date, (date, src, channel, projectId), 8192);
Raw data looks like:
{ "date":"2016-03-07T10:00:00+0300","src":2,"channel":18,"deviceTypeId ":101, "projectId":2363610,"shows":1232,"clicks":7,"spent":34.72,"location":"Unknown", ...}
...
Files with data loaded with the following command:
cat *.data|sed 's/T[0-9][0-9]:[0-9][0-9]:[0-9][0-9]+0300//'| clickhouse-client --query="INSERT INTO event FORMAT JSONEachRow"
clickhouse-client throws an exception:
Code: 117. DB::Exception: Unknown field found while parsing JSONEachRow format: location: (at row 1)
Is it possible to skip fields from the JSON object that are not present in the table description?
The latest ClickHouse release (v1.1.54023) supports the input_format_skip_unknown_fields option, which enables skipping of unknown fields for the JSONEachRow and TSKV formats.
Try
clickhouse-client -n --query="SET input_format_skip_unknown_fields=1; INSERT INTO event FORMAT JSONEachRow;"
See more details in documentation.
Currently, it is not possible to skip unknown fields.
You may create a temporary table with the additional field, INSERT the data into it, and then do an INSERT SELECT into the final table. The temporary table may use the Log engine, and INSERTs into that "staging" table will run faster than into the final MergeTree table.
It would be relatively easy to add the ability to skip unknown fields to the code (something like a 'format_skip_unknown_fields' setting).
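On versions without that setting, the unknown fields can also be stripped before the data reaches clickhouse-client. A minimal sketch in Python (the column list is taken from the CREATE TABLE above; the filter script itself is my assumption, not ClickHouse functionality):

```python
import json

# Columns defined in the `event` table above; any other JSON key
# (such as "location") is dropped before the line reaches clickhouse-client.
COLUMNS = {"date", "src", "channel", "deviceTypeId", "projectId",
           "shows", "clicks", "spent"}

def strip_unknown(line):
    """Return the JSON line with keys outside the table schema removed."""
    record = json.loads(line)
    return json.dumps({k: v for k, v in record.items() if k in COLUMNS})

sample = '{"date":"2016-03-07","src":2,"channel":18,"location":"Unknown"}'
print(strip_unknown(sample))  # {"date": "2016-03-07", "src": 2, "channel": 18}
```

Such a script (a hypothetical strip_unknown.py) would slot into the existing pipeline between cat and clickhouse-client, much like the sed command above.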

How to log the failed records while inserting data in bulk mode

I am inserting data in bulk mode, copying records from one DB table to another, using the Scatter-Gather message processor. There are 10 records in the source table; the second record has invalid data (firstname is null), while the remaining 9 are valid. In the target table, the firstname column is NOT NULL, so inserting the 10 records throws an error that firstname cannot be null. How do I identify which particular record has invalid data using exception handling in Mule? I am new to Mule ESB; can anyone help with this scenario?
%dw 1.0
%output application/java
---
payload map {
    id : $.Id,
    customerid : $.Customerid,
    address : $.Address,
    dob : $.Dob,
    firstname : $.Firstname,
    lastname : $.LastName,
    middlename : $.Middlename,
    phoneno : $.Phoneno,
    batch : $.Batch,
    recorddate : $.RecordDate
}
Kindly post the exception message you are getting along with your XML flow.
For now, I can suggest the following:
Use a collection splitter to split the payload and process each record individually.
Catch the exception in the error-handling block using the expression #[exception.causedBy(your.exception.Class)].
Then configure your strategy for what to do when this exception happens.
In your case, log a column value or record id that is unique for every message; this will help you see which particular record caused the exception.
Thanks!
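The split-and-catch pattern above can be sketched outside Mule; a Python illustration with made-up record data and a stand-in insert function (this is the idea, not Mule configuration):

```python
# Process each record individually so a single invalid one can be caught
# and logged instead of failing the whole bulk insert.
records = [
    {"id": 1, "firstname": "Alice"},
    {"id": 2, "firstname": None},   # invalid: firstname is required
    {"id": 3, "firstname": "Bob"},
]

failed = []

def insert(record):
    """Stand-in for the target-table insert; rejects a null firstname."""
    if record["firstname"] is None:
        raise ValueError("firstname is not null")

for record in records:
    try:
        insert(record)
    except ValueError as exc:
        # Log the unique record id so the bad record can be identified later.
        failed.append((record["id"], str(exc)))
        print(f"record {record['id']} failed: {exc}")
```

The valid records go through untouched while the failing one ends up in the log with its id, which is exactly what the per-record split buys you over one bulk statement.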

Sybase SQL Anywhere - Unable to export data to file

I am attempting to export a query in Sybase SQL Anywhere but am receiving an error when it gets to the OUTPUT TO command. My query looks like this:
SELECT User_Name as 'Remote Database', nDaysBehind as 'Days Behind', Time_Received as 'Last Message Received'
FROM DailySynchRptView
WHERE Time_Received < today() -1 AND nDaysBehind > 0
ORDER BY Time_Received ASC
OUTPUT TO c:\daysbehind.txt format ascii
The information that shows up in ISQL when I leave off the "OUTPUT TO" is the following:
Remote Database,Days Behind,Last Message Received
'Rem00027',23,'2011-02-23 16:10:14.000'
'Rem00085',7,'2011-03-11 04:47:02.000'
'Rem00040',5,'2011-03-13 15:22:15.000'
'Rem00074',4,'2011-03-14 16:01:25.000'
'Rem00087',3,'2011-03-15 06:04:16.000'
However, when the OUTPUT TO command is placed in the query, I receive the following error:
Could not execute statement.
Syntax error near 'OUTPUT' on line 5
SQLCODE=-131, ODBC 3 State="42000"
Line 1, column 1
I am open to any suggestions that might help me export the data from this query. I have run a similar query that returns a single line of information, and it exports without errors.
After a while looking at the code, I found that I was missing a semicolon (;) to separate the two statements. Once I added the semicolon before the OUTPUT line (i.e., at the end of the ORDER BY line), I was able to export the information.
