Import data to SQL Server from R without reading it - sql-server

I need to import a list of files (txt) to SQL Server, but I would like to do this without reading them in R. Is there a way to do this?
I know that the following works, but I would like to avoid the read.table part:
tabla <- read.table("Cuota_Ingreso_201612.txt", header = TRUE)
require(RODBC)
dbWar <- odbcDriverConnect('driver={SQL Server};server=XX;database=Warehouse;trusted_connection=true')
sqlSave(dbWar, tabla, tablename = 'Cuota_Ingreso_201612', rownames = FALSE)
Thanks.
PS: I want to use R because it is a big list of files and I want to write a loop to do it. Please don't answer using another program.
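One possible approach (a minimal sketch, not from the original thread): keep the loop in R, but let SQL Server read each file itself through a BULK INSERT statement sent with RODBC::sqlQuery, so the data never goes through read.table. The folder path, tab delimiter and header-row setting below are placeholder assumptions; the target tables must already exist with matching columns, and the files have to sit on a path the SQL Server service account can reach.
library(RODBC)
dbWar <- odbcDriverConnect('driver={SQL Server};server=XX;database=Warehouse;trusted_connection=true')
# hypothetical folder; it must be visible to the SQL Server machine, not just to the R session
files <- list.files("\\\\fileserver\\cuotas", pattern = "^Cuota_Ingreso_.*\\.txt$", full.names = TRUE)
for (f in files) {
  tbl <- tools::file_path_sans_ext(basename(f))   # e.g. Cuota_Ingreso_201612
  sql <- sprintf("BULK INSERT [%s] FROM '%s' WITH (FIELDTERMINATOR = '\\t', ROWTERMINATOR = '\\n', FIRSTROW = 2)", tbl, f)
  sqlQuery(dbWar, sql)   # SQL Server parses and loads the file; R only issues the command
}
odbcClose(dbWar)
If the tables do not exist yet, they would still need to be created once (for instance with sqlSave on a small sample), since BULK INSERT only loads into existing tables.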

Related

Problem with dump command using DB Browser for SQLite

I am relatively new to SQLite3. I am currently using DB Browser. I have two very large .db files that I want to combine such that all the tables (all different) reside together in one .db (i.e., I don't want to just attach them).
I am trying to use a simple .dump command in order to achieve this, but keep getting errors. I first define the path/directory and then:
for db in OctoberData, OctoberData2; { sqlite3 ${db}.db .dump | sqlite3 OctoberDataAll.db }
and get
Result: near "for": syntax error
At line 1:
for
I'm sure it is something simple, but what am I doing wrong?
Thank you for your help!

disregard line breaks in field

I'm trying to import data into my MS SQL DB (as a flat file). However, there is a problem with one of the fields: it contains a line break within the data, which leads to the import wizard thinking it's the end of the line, hence breaking each row into two. I've tried to import the data into Excel as well (just to try it out), but it's the same behavior.
Does anyone know how to solve this? Any pre-import mechanism that might massage the data somehow?
(unfortunately, it's not practically possible for me to ask the source system to change the encoding)
//Eva-Lotta
Use the following to replace the new-line characters in columns that contain them:
Replace(Replace(columnName, char(13), ' '), char(10), ' ')
Regards
I've managed to find a work-around! I started by splitting the files into chunks (as they are 3.8 GB in size ...), opened them in UltraEdit, looped through them to join the two lines back together, and imported them into Excel / my SQL DB. It's not neat, but it has solved my immediate problem ... thanks for your engagement!
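For what it's worth, a rough sketch of that join step in R (hypothetical file name, and assuming every record was split into exactly two physical lines by the embedded line break):
raw <- readLines("chunk_01.txt")                  # each broken record arrives as two lines
fixed <- paste(raw[seq(1, length(raw), by = 2)],  # odd lines...
               raw[seq(2, length(raw), by = 2)])  # ...glued back to the even lines that follow
writeLines(fixed, "chunk_01_fixed.txt")           # cleaned copy for the import wizard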

GAE - How to import CSV file to Cloud SQL?

Is there a way for me to import CSV files to Cloud SQL? I've tried to follow this tutorial, but it didn't work: https://cloud.google.com/sql/docs/admin-api/v1beta4/instances/import#request-body
Is there any other way to accomplish this?
Thanks!
If you provide some more information regarding the failure message, we might be able to tell you what you need to fix in order to complete the tutorial. However, you can do something a little ugly, but it works. You'll need to import pymysql.
So let's assume that the CSV data is in a csv_file_lines list variable. Now we need to go over the lines and execute an INSERT into the table with the current line's data, in the following way:
import pymysql

db = pymysql.connect(host=<host>, port=<port>, user=<user>, password=<password>)  # placeholders from the original answer
cursor = db.cursor()
for line in csv_file_lines:
    cursor.execute('INSERT INTO TABLE....')  # build the INSERT from the current line's values
db.commit()  # pymysql does not autocommit by default

How do I exploit "EXEC @sql"?

My co-worker is being unsafe with his code and is allowing a user to upload an SQL file to be run on the server.
He strips out any key words in the file such as "EXEC", "DROP", "UPDATE", "INSERT", "TRUNC"
I want to show him the error of his ways by exploiting his EXEC ( @sql )
My first attempt will be with 'EXEXECEC (N''SELECT ''You DRDROPOPped the ball Bob!'')'
But he might filter that all out in a loop.
Is there a way I can exploit my co-worker's code? Or is filtering out the key words enough?
Edit: I got him to check in his code. If the code contains a keyword he does not execute it. I'm still trying to figure out how to exploit this using the binary conversion.
Tell your co-worker he's a moron.
Do an obfuscated SQL query, something like:
select @sql = 0x44524f5020426f627350616e7473
This will need some tweaking depending on what the rest of the code looks like, but the idea is to encode your code in hex and execute it (or rather, let it be executed). There are other ways to obfuscate code to be injected.
You've got a huge security hole there. And the funny part is, this is not even something that needs to be reinvented. The proper way to stop such things from happening is to create and use an account with the correct permissions (e.g., one that can only perform SELECT queries on tables x, y and z).
Have a look at ASCII Encoded/Binary attacks ... it should convince your friend he is doomed. ;)
And here is some help on how to encode the strings:
Converting a String to HEX in SQL
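As a side note, a small R snippet (purely illustrative, not taken from the linked answer) showing how a hex literal like the one above can be built from a plain string:
payload <- "DROP BobsPants"
hex_literal <- paste0("0x", paste(sprintf("%02x", utf8ToInt(payload)), collapse = ""))
hex_literal
# [1] "0x44524f5020426f627350616e7473"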

Script output to file when using SQL-Developer

I have a SELECT query producing a big output and I want to execute it in SQL Developer and get all the results into a file.
SQL Developer does not allow a result bigger than 5000 lines, and I have 100 000 lines to fetch...
I know I could use SQL*Plus, but let's assume I want to do this in SQL Developer.
Instead of using Run Script (F5), use Run Statement (Ctrl+Enter). Run Statement fetches 50 records at a time and displays them as you scroll through the results...but you can save the entire output to a file by right-clicking over the results and selecting Export Data -> csv/html/etc.
I'm a newbie SQLDeveloper user, so if there is a better way please let me know.
This question is really old, but I'm posting this in case it helps someone with a similar issue.
You can store your query in a query.sql file and run it as a script. Here is a sample query.sql:
spool "C:\path\query_result.txt";
select * from my_table;
spool off;
In Oracle SQL Developer you can just run this script as shown below, and you should be able to get the result in your query_result.txt file.
#"C:\Path\to\script.sql"
Yes, you can increase the size of the worksheet output by changing the setting under Tools -> Preferences -> Database -> Worksheet -> "Max rows to print in a script" (set it as high as you need).
Mike G's answer will work if you only want the output of a single statement.
However, if you want the output of a whole SQL script with several statements, SQL*Plus reports, and some other output formats, you can use the spool command the same way it is used in SQL*Plus.
