Importing csv file into pgAdmin 4

When I'm importing a .csv file into pgAdmin 4, it fails with the error "invalid input syntax for type integer: "NA"", CONTEXT: COPY results, line 42397, column home_score: "NA"
I don't understand what to do. I've attached an image below. Can anyone help? Thanks!
https://i.stack.imgur.com/FxER7.png
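The error means COPY is trying to put the literal string "NA" into the integer column home_score. One common fix is to tell PostgreSQL to treat that string as NULL. A minimal sketch of the equivalent COPY command, assuming the table from the error message and a placeholder file path:

-- Treat the literal string "NA" as NULL during the import.
-- The file path here is a placeholder; adjust it to your CSV.
COPY results
FROM '/path/to/results.csv'
WITH (FORMAT csv, HEADER true, NULL 'NA');

The same idea maps to the NULL Strings field in pgAdmin 4's Import/Export dialog, if your version exposes it.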

Related

How to import a .csv file with double quotes in column values to SQL table

I am trying to import the data from a .csv file into a SQL table using an SSIS data flow task. One row in my .csv file looks like this:
Col1,Col2,Col3
1200,"ABC","Value is \"greater\" than expected"
While creating the Flat File connection, I set Comma as the delimiter and " as the text qualifier, and I created a derived column (REPLACE(Col3,"\"","")) as a second step to remove \" from Col3.
But as soon as I start running the package, I get an error in the Flat File source itself: "Column delimiter for col3 was not found".
Can someone please guide me in solving this issue?
You may need to escape the backslash too, since the backslash is itself an escape character in SSIS expressions. Try this please and let us know:
(REPLACE(Col3,"\\\"",""))
Note that standard CSV escapes an embedded quote by doubling it ("") rather than with a backslash, which is why the Flat File parser can mistake \" for the end of the field and then fail to find the delimiter.

Is there a way to find out details of a data type error in Snowflake?

I am pretty new to the Snowflake cloud offering and was just trying to load a simple .csv file from an AWS S3 staging area into a table in Snowflake using the COPY command.
Here is what I used as the command:
copy into "database name"."schema"."table name"
from @S3_ACCESS
file_format = (format_name = format name);
When I run the above code, I get the following error: Numeric value '63' is not recognized
Please see the attached image. I'm not sure what this error means, and I'm not able to find any lead in the Snowflake UI itself as to what could be wrong with the value.
Thanks in Advance!
The error says it was expecting a numeric value but got '63', and that value cannot be converted to a number.
From the image you shared, I can see that there are some weird characters around the 6 and the 3. There could be an issue with the file encoding, or the data may be corrupted.
Please check the encoding option for the file format:
https://docs.snowflake.com/en/sql-reference/sql/create-file-format.html#format-type-options-formattypeoptions
By the way, I recommend always using UTF-8.
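A sketch of both ideas, with placeholder names: pin the file format to UTF-8 explicitly, and use COPY's VALIDATION_MODE option to have Snowflake report the offending rows without loading anything.

-- Hypothetical file format name; the CSV options are assumptions to adjust.
CREATE OR REPLACE FILE FORMAT my_csv_format
  TYPE = 'CSV'
  FIELD_DELIMITER = ','
  SKIP_HEADER = 1
  ENCODING = 'UTF8';

-- Dry-run the load and return the rows that fail conversion.
COPY INTO "database name"."schema"."table name"
FROM @S3_ACCESS
FILE_FORMAT = (FORMAT_NAME = my_csv_format)
VALIDATION_MODE = RETURN_ERRORS;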

Potential Loss of Data reading from CSV with decimal

I have read a large number of questions and answers on this and I still can't get it to work.
I have a csv like the following:
Field1;Field2;Field3
CCC;DDD;0.03464
EEE;FFF;0.08432
...
When I attach a Flat File Source in SSIS, it gives me the following:
[Sample CSV [2]] Error: Data conversion failed. The data conversion for column "Field3" returned status value 2 and status text "The value could not be converted because of a potential loss of data.".
I have already changed the output to DT_DECIMAL, with 5 as the scale value, in the advanced properties, but I still get the same error.
Any clue on this?
It seems like a simple solution that I am somehow overlooking.
Thanks!
There may be values that cannot be converted to DT_DECIMAL. You can detect the values causing this error by utilizing the Flat File Error Output, which redirects the rows that cause errors when loading data.
Helpful Links
ERROR HANDLING IN SSIS WITH AN EXAMPLE STEP BY STEP
SSIS error when loading data from flat files
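If it is easier to inspect the offending values on the database side, another approach is to load Field3 as plain text into a staging table and query for values that fail conversion. A sketch in T-SQL, assuming SQL Server 2012+ for TRY_CONVERT and a hypothetical staging table:

-- Find the rows whose Field3 cannot be converted to DECIMAL(10,5).
-- staging_table and its columns are placeholders for this sketch.
SELECT Field1, Field2, Field3
FROM staging_table
WHERE TRY_CONVERT(DECIMAL(10,5), Field3) IS NULL
  AND Field3 IS NOT NULL;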

Running a SQL query to a CSV file

I'm trying to run a stored procedure and output the result to CSV. I am running this via the command line using the following command:
SQLCMD -S SERVER/INSTANCE -E -Q "Exec DBNAME.STORED_PROCEDURE" -s "," -o "C:\temp\stats_data.csv"
This works but with 2 problems.
Problem 1) I end up with whitespace around my headers. The library I'm using to import the CSV file later in my project can trim these out, but ideally I'd like them not to be there if possible.
example:
StatId ,AsAtDate ,PracticeCode ,
0, 2017-03-09,EMIS-170 ,
Problem 2) It outputs a row of "----" to separate the headers from the data, which gets treated as my first row of data when I try to parse the file in my project and throws an exception.
example:
StatId ,AsAtDate ,PracticeCode ,
-----------,----------------,--------------------,
0, 2017-03-09,EMIS-170 ,
Does anyone know how to solve either of these problems? My priority is the separator line (problem 2), as I can handle the first problem if need be.
Any help is greatly appreciated.
To overcome the first problem, try using the -W parameter in your command line; it removes trailing spaces from each column.
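For the separator line, one option is -h -1, which suppresses the header output entirely, and the dashed underline goes with it. A sketch combining the fixes, keeping the names from the question (note that the header row itself disappears too, so the importing library would need to supply column names):

SQLCMD -S SERVER/INSTANCE -E -Q "SET NOCOUNT ON; EXEC DBNAME.STORED_PROCEDURE" -s "," -W -h -1 -o "C:\temp\stats_data.csv"

Adding SET NOCOUNT ON also keeps the trailing "(N rows affected)" message out of the file.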

pgadmin4 - Download Query result as CSV

I wrote a query using the query tool in pgAdmin 4. Now I want to download the results as a CSV. I've got two problems with that.
The 'Download as CSV' button sometimes does not work, especially when the result contains 1000+ rows.
When I finally have a CSV and I want to open it, this message is all I see:
"'ascii' codec can't encode character u'\xbb' in position 26: ordinal not in range(128)"
Since I'm fairly new to all of this, could someone enlighten me as to what is wrong?
On your questions:
The broken CSV download was a known bug that was fixed in pgAdmin v1.5 (Bug summary at the login-required https://redmine.postgresql.org/issues/2253; the gist is that there were multiple issues with exporting JSON data and Unicode). If you're not on that version, try updating and see whether you continue to have the issue.
You didn't specify where you're seeing that message regarding encoding, but the character referenced in the error is a "Right-Pointing Double Angle Quotation Mark" (») (http://www.codetable.net/hex/bb).
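If the export keeps misbehaving, a workaround is to run the same query from psql and write the CSV with an explicit encoding. A sketch with a placeholder query and output path:

-- Export a query result to CSV with an explicit UTF-8 encoding.
-- The SELECT and the file path are placeholders for this sketch.
\copy (SELECT * FROM your_table) TO '/tmp/result.csv' WITH (FORMAT csv, HEADER, ENCODING 'UTF8')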
