In Visual FoxPro, I have a cursor which is the result of a SQL query. When I export the content of that cursor to a CSV file using the statement:
COPY TO "c:\test.csv" type DELIMITED
all the data is messed up. I do not specify any delimiter, so FoxPro takes the default for every column in that cursor. Now when I run the same command to an XLS file, and then convert it to a CSV file, it works very well:
COPY TO "c:\test.xls" type XL5
Has anyone had this issue? Is anyone still using FoxPro and doing things like this?
Have you tried using TYPE CSV in the COPY TO command?
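For example, a minimal sketch (myCursor stands in for whatever alias your query produced; TYPE CSV quotes the fields and writes the field names as a header row):
SELECT myCursor
COPY TO "c:\test.csv" TYPE CSV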
Personally, I never liked the built-in DBF-to-CSV converters. They always seemed to do things I did not want them to do, so I just wrote my own. Here is some code to get you started.
LOCAL lnFields, lnFieldCount, lnHandle, lcRow
LOCAL ARRAY laFields[1]
SELECT DBF
lnFieldCount = AFIELDS(laFields)
lnHandle = FCREATE("filename.csv")  && FCREATE makes a new file; FOPEN only opens existing ones
ASSERT lnHandle > 0 MESSAGE "Unable to create CSV file"
SCAN
    lcRow = ""
    FOR lnFields = 1 TO lnFieldCount
        IF INLIST(laFields[lnFields, 2], 'C', 'M')
            && Quote character/memo values and double up any embedded quotes
            lcRow = lcRow + IIF(EMPTY(lcRow), "", ",") + '"' + ;
                STRTRAN(EVALUATE(laFields[lnFields, 1]), '"', '""') + '"'
        ELSE
            lcRow = lcRow + IIF(EMPTY(lcRow), "", ",") + ;
                TRANSFORM(EVALUATE(laFields[lnFields, 1]))
        ENDIF
    ENDFOR
    FPUTS(lnHandle, lcRow)  && FPUTS appends the CR/LF; FWRITE would run all rows together
ENDSCAN
FCLOSE(lnHandle)
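A note on the design: only Character and Memo fields get wrapped in quotes (with embedded quotes doubled, per CSV convention), while numeric, date, and logical fields go through TRANSFORM() unquoted, which is exactly the kind of control the built-in converters don't give you.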
I am trying to load .csv file data into a Snowflake table using the following command:
COPY INTO MYTABLE
FROM @S3PATH PATTERN='.*TEST.csv'
FILE_FORMAT = (type = csv skip_header = 1) ON_ERROR = CONTINUE PURGE=TRUE FORCE=TRUE;
I am seeing the following:
1) If even one column of the table is numeric, it throws the error:
Numeric value '""' is not recognized
2) If I change all the columns' data types to varchar, it loads the data, but it populates
all the columns with the double quotes included ("15" instead of 15).
Thanks in advance for your response!
You're likely missing FIELD_OPTIONALLY_ENCLOSED_BY = '\042' in your file_format. Add that in and try.
https://docs.snowflake.com/en/sql-reference/sql/create-file-format.html#type-csv
https://docs.snowflake.com/en/sql-reference/sql/copy-into-table.html
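For example, a sketch of your command with that option added (everything else unchanged):
COPY INTO MYTABLE
FROM @S3PATH PATTERN='.*TEST.csv'
FILE_FORMAT = (type = csv skip_header = 1 FIELD_OPTIONALLY_ENCLOSED_BY = '\042')
ON_ERROR = CONTINUE PURGE=TRUE FORCE=TRUE;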
Thanks CodeMonkey!
One issue is solved.
Current scenario:
One column is defined as NUMBER in the SF table, and if the csv file has a value populated for that column, those were the only rows loaded into the table. Basically, if the numeric column in the csv file is null (or blank), those records are not loaded.
I also tried using
EMPTY_FIELD_AS_NULL = TRUE
but still got the same result as above.
"first_error" message: Numeric value '' is not recognized
Here is what I did, and it is working:
FILE_FORMAT = (type = csv field_delimiter = ',' skip_header = 1 FIELD_OPTIONALLY_ENCLOSED_BY = '\042' EMPTY_FIELD_AS_NULL = TRUE NULL_IF = ('NULL','null','')) ON_ERROR = CONTINUE PURGE=TRUE FORCE=TRUE;
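If I read the docs right, it is the NULL_IF = ('NULL','null','') piece that lets blank fields land in the NUMBER column as SQL NULL instead of failing the numeric conversion.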
I'm pulling data from a SQL Server table using pyodbc in Python.
In the output file I’m getting records like this:
1, 1, None, None, None, None, None, None
The None values are Null in the SQL table.
I’d like to see records in the text file in this format. I do not want to see the None.
1, 1, , , , , ,
Any ideas how I can do this?
Here is the code I'm using:
import pyodbc
outputfile = 'MyOut.txt'
output_data = open(outputfile, 'w+')
conn = pyodbc.connect(
    r'Driver={SQL Server};'
    r'Server=MyServer;'
    r'Database=MyData;'
    r'Trusted_Connection=yes;')
crsr = conn.cursor()
crsr.execute('select * from MyTable')
for row in crsr:
    print(str(row))
    outrows = str(row).strip('(')
    outrows = outrows.strip(')')
    output_data.write(outrows + '\n')
output_data.close()
I understand that outrows is a string, but this would probably be easier with a list. Aside from that, the output is probably meant to be a string anyway, since you're writing to a .txt file.
You could modify your for loop like this:
for row in crsr:
    outrows = str(row).strip("(").strip(")")
    # create the list by splitting the string at each comma
    line = outrows.split(",")
    for i, component in enumerate(line):
        # compare with strip(), as there is most likely a space after each comma;
        # enumerate() avoids line.index(), which would always return the first match
        if component.strip() == "None":
            line[i] = ""
    output_data.write(",".join(line) + "\n")
I'm afraid I'm not particularly familiar with pyodbc, but I hope this was of help.
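Alternatively, you could skip the string parsing entirely and build each line from the row tuple itself; a minimal sketch using the same crsr and output_data as in your code:
for row in crsr:
    # substitute an empty string for each NULL (None) before joining
    fields = ["" if value is None else str(value) for value in row]
    output_data.write(", ".join(fields) + "\n")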
I have a pretty simple CSV file (separator ;, Notepad++ says CR LF as the line separator, encoding UCS-2 Little Endian) that I need to import into SQL daily.
I have now spent a day trying to make BULK INSERT or OPENROWSET work, but I keep failing.
BULK INSERT tt_MaterialSAPx
FROM '\\192.168.89.22\LandingZone\MAT.csv'
WITH (FIRSTROW = 2
     ,LASTROW = 5
     --,DATAFILETYPE = 'native'
     ,FIELDTERMINATOR = ';'
     ,ROWTERMINATOR = '\r'
     --,FORMATFILE = '\\192.168.89.22\LandingZone\formatfile.fmt'
     )
The BULK INSERT code (it doesn't import at all if I set the row terminator to \r\n) doesn't read the characters correctly (basically there is a "blank" between each character: UTF-16 vs UTF-8?). Furthermore, it fails if I take out LASTROW = 5, with "Bulk load: An unexpected end of file was encountered in the data file."
The OPENROWSET version:
SELECT * FROM OPENROWSET(
    BULK '\\192.168.89.22\LandingZone\MAT8.csv'
    ,SINGLE_CLOB
    --,FORMATFILE = '\\192.168.89.22\LandingZone\formatfile.fmt'
) AS DATA
This sticks all the data into the first column and first row (so it ignores the field and row terminators).
Utterly frustrated: what am I missing? Or is there a third way to solve this?
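For what it's worth, SINGLE_CLOB by design returns the whole file as a single varchar(max) value, so that OPENROWSET form will never split fields or rows. If the UTF-16 hunch is right, one thing worth trying in the BULK INSERT (a sketch, not tested against your file) is declaring the file as Unicode with DATAFILETYPE = 'widechar':
BULK INSERT tt_MaterialSAPx
FROM '\\192.168.89.22\LandingZone\MAT.csv'
WITH (FIRSTROW = 2
     ,DATAFILETYPE = 'widechar'   -- treat the file as UTF-16, matching UCS-2 LE
     ,FIELDTERMINATOR = ';'
     ,ROWTERMINATOR = '\n'        -- try '\r\n' if '\n' still hits end-of-file errors
     )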
I put a SQL statement into a button in Visual Basic to make it insert data into the DB, and when I click it, this error happens:
Conversion from string "Insert into TBL_Usuario_102 valu" to type 'Double' is not valid.
This is the code that's in the button:
Private Sub Guardar_Click(sender As Object, e As EventArgs) Handles Guardar.Click
    If NombreDePersona.Text <> "" And Cedula.Text <> "" And RepetirContraseña.Text <> "" And Contraseña.Text <> "" Then
        If (RepetirContraseña.Text = Contraseña.Text) Then
            instruccionSQL = New SqlClient.SqlCommand("Insert into TBL_Usuario_102 values" +
                "(" + Cedula.Text + "," +
                NombreDePersona.Text + "," + 3 +
                "," + Contraseña.Text + "," +
                FechaInclusion.Text + "," + 0 +
                "," + FechaInclusion.Text + "," + 3 + ")")
            MsgBox("Datos Guardados Correctamente")
            Cedula.Clear()
            NombreDePersona.Clear()
            Contraseña.Clear()
            RepetirContraseña.Clear()
        Else
            MsgBox("Las contraseñas no coinciden")
        End If
    Else
        MsgBox("Escriba en Cada Campo")
    End If
End Sub
The SQL connection is in a module, and it is working fine: when I insert the data manually in SQL Server, the login works.
The data types in the database table are, in this order:
varchar(15)
varchar(20)
int
varchar(50)
datetime
bit
datetime
int
Creating a SQL string like this is dangerous, as it can lead to SQL injection attacks. It is usually recommended to use command parameters; however, you can also escape single quotes in strings by doubling them, which should make such an attack impossible. Command parameters also have the advantage that you don't have to care about the formatting of strings (and escaping them), numbers, Booleans, and dates. E.g., see: How to pass a parameter from vb.net.
As it is now, there is another problem with your SQL statement: strings must be enclosed in single quotes. Also, use & for string concatenation, not + (it is this + that makes VB think you want to add Doubles).
The types of your text and number inputs do not seem to match the ones in the table (is NombreDePersona a varchar(20)?), and you are inserting FechaInclusion twice.
I would also specify the column names explicitly:
INSERT INTO TBL_Usuario_102 (column_name1, column_name2, ...) values ('a text', 3, ...)
Finally, you don't execute your command. After having opened a connection:
instruccionSQL.ExecuteNonQuery()
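Putting it together, a minimal parameterized sketch; the column names are hypothetical (substitute the real ones from TBL_Usuario_102), and conexion stands for the already open SqlConnection from your module:
Dim instruccionSQL As New SqlClient.SqlCommand(
    "INSERT INTO TBL_Usuario_102 (Cedula, Nombre, TipoUsuario, Contrasena, " &
    "FechaInclusion, Activo, FechaModificacion, Estado) VALUES " &
    "(@cedula, @nombre, @tipo, @contrasena, @fecha1, @activo, @fecha2, @estado)", conexion)
instruccionSQL.Parameters.AddWithValue("@cedula", Cedula.Text)                  ' varchar(15)
instruccionSQL.Parameters.AddWithValue("@nombre", NombreDePersona.Text)         ' varchar(20)
instruccionSQL.Parameters.AddWithValue("@tipo", 3)                              ' int
instruccionSQL.Parameters.AddWithValue("@contrasena", Contraseña.Text)          ' varchar(50)
instruccionSQL.Parameters.AddWithValue("@fecha1", CDate(FechaInclusion.Text))   ' datetime
instruccionSQL.Parameters.AddWithValue("@activo", False)                        ' bit
instruccionSQL.Parameters.AddWithValue("@fecha2", CDate(FechaInclusion.Text))   ' datetime
instruccionSQL.Parameters.AddWithValue("@estado", 3)                            ' int
instruccionSQL.ExecuteNonQuery()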
This sp_send_dbmail script works in one of our processes. It attaches an Excel file filled with whatever the query is. It knows to do this because of the extension on the file's name (.xls).
However, it changes a varchar(50) field into a number field, and removes the leading zeroes. This is a known annoyance dealt with in a million ways that won't work for my process.
EXEC msdb.dbo.sp_send_dbmail
    @profile_name = @profileName
    ,@recipients = @emailRecipientList
    ,@subject = @subject
    ,@importance = @importance
    ,@body = @emailMsg
    ,@body_format = 'html'
    ,@query = @QuerySQL
    ,@execute_query_database = @QueryDB
    ,@attach_query_result_as_file = 1
    ,@query_attachment_filename = @QueryExcelFileName
    ,@query_result_header = 1
    ,@query_result_width = @QueryWidth
    ,@query_result_separator = @QuerySep
    ,@query_result_no_padding = 1
Examples of the problem below: this simple query changes the StringNumber column from varchar to number in Excel and removes the leading zeroes.
SELECT [RowID],[Verbage], StringNumber FROM [dbo].[tblTestStringNumber]
In SQL Server (desired format):
Afterwards in Excel (leading zeroes missing):
Now, there might be a way. I only say this because in the SQL Server 2016 results pane, if you right-click in the upper left-hand corner, it gives the option "Open in Excel".
And... drum roll... the dataset opens in Excel and the leading zeroes are still there!
If you start a number with a single quote (') in Excel, it will interpret it as a string, so a common solution is to change the query to add one in:
SELECT [RowID]
,[Verbage]
, StringNumber = '''' + [StringNumber]
FROM [dbo].[tblTestStringNumber]
And Excel will usually not display the single quote because it knows that it's a way to cast to type string.
@JustJohn I think it will work fine:
SELECT [RowID]
,[Verbage]
, '="' + [StringNumber]+ '"' StringNumber
FROM [dbo].[tblTestStringNumber]
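The ="..." trick works because Excel treats the cell as a formula that returns a text value, so the leading zeroes survive; just be aware the column then contains formulas rather than plain values if the file is processed further downstream.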