As mentioned in my previous question, I am having trouble loading data into my table from a CSV file. The error I am getting says the file is not found, but it is there. Could this be a WAMP issue, i.e. permissions? You can find the code below along with the error.
I appreciate your help as always:
function load_table() {
global $wpdb;
$filename = 'upper_db_.csv';
$table_nme = $wpdb->prefix . "upper_winds";
$sql = "LOAD DATA INFILE '" . $filename . "'
INTO TABLE $table_nme
FIELDS TERMINATED BY ','
ENCLOSED BY '\"'
ESCAPED BY '\"'
LINES TERMINATED BY '\n'
";
$wpdb->query($sql);
}
File 'c:\wamp\bin\mysql\mysql5.6.17\data\wp-test\upper_db_.csv' not found (Errcode: 2 - No such file or directory)
P.S. I have tried using backslashes and giving the full path, and I have also tried LOAD DATA LOCAL INFILE, but to no avail.
The problem has been solved using the line below, which doubles each backslash so the Windows path survives MySQL's string parsing:
$filename = str_replace("\\", "\\\\", __DIR__ . "/file.csv");
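For context, MySQL treats backslashes in string literals as escape characters, so each one in a Windows path has to be doubled (forward slashes also work on Windows). A rough sketch of the statement the fixed code produces, assuming the default wp_ table prefix and an illustrative path:
LOAD DATA INFILE 'C:\\wamp\\www\\wp-content\\plugins\\my-plugin\\file.csv'
INTO TABLE wp_upper_winds
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
ESCAPED BY '"'
LINES TERMINATED BY '\n';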
The plugin has now been completed and can be viewed at http://howtoflyahelicopter.com/aviation-weather-briefing/
Related
I have a pipe-delimited CSV file whose data is to be inserted into SQL Server.
I opened that file in Excel, added a comma in the column values, and saved it.
Here is the data in Notepad++ after adding the comma.
Then I bulk inserted that file into SQL Server.
BULK INSERT #temp1
FROM '\\as03\AppData\PipeDelimited.csv'
WITH (
FIRSTROW = 2
, FIELDTERMINATOR ='|'
, ROWTERMINATOR='\n'
, ERRORFILE ='\\as03\AppData\ErrorFiles\PipeDelimited.csv'
, CODEPAGE = '65001'
, MAXERRORS = 99999999
)
But I got double quotes (") in the first and last column values, and two consecutive double quotes ("") where a single double quote was already in the file.
Here is the data as inserted into SQL Server.
Is there some way to insert the data into SQL Server while ignoring the double quotes that were added by Excel or Notepad?
This appears not to be a PowerShell issue, as it actually works fine in PowerShell:
$Csv = @'
Name|Country|Address|Mobile
Tom|Nepal|Kathmandu,Nardevi|98456667365
Harry|India|Delhi"Patna,™ or R|9856524524
'@ -Split '\r?\n'
$Csv | ConvertFrom-Csv -Delimiter '|' # Import-Csv .\PipeDelimited.csv -Delimiter '|'
Yields:
Name Country Address Mobile
---- ------- ------- ------
Tom Nepal Kathmandu,Nardevi 98456667365
Harry India Delhi"Patna,™ or R 9856524524
In other words, you might simply convert your PipeDelimited.csv to a more common CommaDelimited.csv with text qualifiers, like:
Import-Csv .\PipeDelimited.csv -Delimiter '|' |Export-Csv .\CommaDelimited.csv
Your file was corrupted by editing it in Excel and saving it as a CSV. The best solution is to not use Excel to edit such files; instead use scripting or a text editor (depending on the complexity, just adding a comma to one field feels easiest in a text editor).
However, if we're saying the damage is done and you need a script to fix the issues, you could run something like this. It reads in the data as plain text, applies a regex to remove the offending quotes, then spits the result out to a copy of the file (I've written to a copy rather than back to the original so you can roll back more easily if this isn't what you wanted).
[string]$sourcePath = '.\PipeDelimited.csv'
[string]$outputPath = '.\PipeDelimitedCorrected.csv'
[System.Text.Encoding]$encoding = [System.Text.Encoding]::UTF8 # based on your code page
[string[]]$data = [System.IO.File]::ReadAllLines($sourcePath, $encoding)
$data = $data | ForEach-Object {((($_ -replace '^"', '') -replace '"$', '') -replace '""', '"')}
# replaces quotes at the start and end of the line with blanks,
# then replaces escaped double quotes with individual double quotes...
# it's a quick and dirty approach, but looking at your example should be sufficient
[System.IO.File]::WriteAllLines($outputPath, $data, $encoding)
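As an aside, and only if you're on SQL Server 2017 or later: BULK INSERT can parse the quotes itself when told the file is CSV, which avoids pre-processing entirely. A sketch assuming the same file and options as the original command (not something tested in this thread):
BULK INSERT #temp1
FROM '\\as03\AppData\PipeDelimited.csv'
WITH (
      FORMAT = 'CSV'        -- RFC 4180-style CSV parsing (SQL Server 2017+)
    , FIELDQUOTE = '"'      -- the quote character Excel added
    , FIRSTROW = 2
    , FIELDTERMINATOR = '|'
    , ROWTERMINATOR = '\n'
    , CODEPAGE = '65001'
);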
I am trying to use the COPY INTO command to load data from S3 into Snowflake.
Below are the steps I followed to create the stage and load the file from the stage into Snowflake.
JSON file
{
"Name":"Umesh",
"Desigantion":"Product Manager",
"Location":"United Kingdom"
}
create or replace stage emp_json_stage
url='s3://mybucket/emp.json'
credentials=(aws_key_id='my id' aws_secret_key='my key');
-- create the table with a VARIANT column
CREATE TABLE emp_json_raw (
json_data_raw VARIANT
);
-- load data from the stage into Snowflake
COPY INTO emp_json_raw from @emp_json_stage;
I am getting the error below:
Field delimiter ',' found while expecting record delimiter '\n'
File 'emp.json', line 2, character 18
Row 2, column "emp_json_raw"["JSON_DATA_RAW":1]
I am using a simple JSON file, and I don't understand this error.
What causes it and how can I solve it?
The file format is not specified, so it defaults to CSV, hence the error.
Try this:
COPY INTO emp_json_raw
from @emp_json_stage
file_format=(TYPE=JSON);
There are other options besides TYPE that can be specified with file_format. Refer to the documentation here: https://docs.snowflake.com/en/sql-reference/sql/copy-into-table.html#type-json
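Once loaded, a quick way to sanity-check the VARIANT column is to pull fields out with path notation; a sketch based on the table above (note that "Desigantion" is spelled as it appears in the source JSON):
SELECT json_data_raw:Name::STRING        AS name,
       json_data_raw:Desigantion::STRING AS designation,
       json_data_raw:Location::STRING    AS location
FROM emp_json_raw;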
try:
file_format = (type = csv field_optionally_enclosed_by='"')
The default settings do not expect the " wrapping around your data.
So you can either strip all the " characters or just set field_optionally_enclosed_by to a ". This does mean that if your data has " in it, things get messy.
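Put together, a minimal sketch (the table and stage names are placeholders):
COPY INTO my_table
FROM @my_stage
FILE_FORMAT = (TYPE = CSV FIELD_OPTIONALLY_ENCLOSED_BY = '"');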
https://docs.snowflake.com/en/user-guide/getting-started-tutorial-copy-into.html
https://docs.snowflake.com/en/sql-reference/sql/create-file-format.html#type-csv
Also, make it a standard practice to specify the type of file, whether CSV, JSON, Avro, Parquet, etc.
https://docs.snowflake.com/en/sql-reference/sql/create-file-format.html
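For example, a named file format keeps the COPY statement itself clean (the format name here is illustrative):
-- define the format once
CREATE OR REPLACE FILE FORMAT my_json_format TYPE = JSON;
-- then reference it from any COPY INTO
COPY INTO emp_json_raw
FROM @emp_json_stage
FILE_FORMAT = (FORMAT_NAME = 'my_json_format');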
In a file, a few of the rows have \ in a column value; for example, I have rows in the below format:
101,Path1,Z:\VMC\PSPS,abc
102,Path5,C:\wintm\PSPS,abc
I was wondering how to load the \ character. My COPY command is:
COPY INTO TEST_TABLE from @database.schema.stage_name FILE_FORMAT = ( TYPE = CSV FIELD_OPTIONALLY_ENCLOSED_BY = '\"' SKIP_HEADER = 1 );
Is there anything I can mention in the file_format line?
Are you still getting this error? I just tried to recreate it by creating a CSV from your sample data and a test table. I loaded the CSV into an internal stage and then ran your COPY command, and it worked for me.
Could you provide more details on the error you are facing? Perhaps there was something off with your table definition.
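One thing worth checking (an assumption on my part, since the thread doesn't confirm the root cause): for CSV files, Snowflake's ESCAPE_UNENCLOSED_FIELD option defaults to a backslash, which can silently consume \ characters in unenclosed fields. Setting it to NONE preserves them:
COPY INTO TEST_TABLE
FROM @database.schema.stage_name
FILE_FORMAT = (
    TYPE = CSV
    FIELD_OPTIONALLY_ENCLOSED_BY = '"'
    ESCAPE_UNENCLOSED_FIELD = NONE  -- keep literal backslashes such as Z:\VMC\PSPS
    SKIP_HEADER = 1
);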
I'm getting started with Snowflake and there's something I don't understand. I tried to issue a COPY command as below, but it shows no rows processed.
copy into customer
from @bulk_copy_example_stage
FILES = ('dataDec-9-2020.csv')
file_format = (type = csv field_delimiter = '|' skip_header = 1)
FORCE=TRUE;
I tried with another file from the same S3 folder:
copy into customer
from @bulk_copy_example_stage
FILES = ('generated_customer_data.csv')
file_format = (type = csv field_delimiter = '|' skip_header = 1)
FORCE=TRUE;
And this worked.
At this stage I'm pretty sure that something was wrong with my first file, but my question is: how do we get it to print out what the error was? All it shows in the console is as below, which is not really helpful.
You could try looking at the copy_history to find out what's wrong with the file.
Reference: copy_history
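For example, something along these lines (the table name and time window are placeholders) returns the per-file load status, including the first error for rejected files:
SELECT *
FROM TABLE(information_schema.copy_history(
    table_name => 'CUSTOMER',
    start_time => DATEADD(hours, -24, CURRENT_TIMESTAMP())
));
Alternatively, rerunning the COPY with VALIDATION_MODE = 'RETURN_ERRORS' reports the problems without loading anything.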
I'm trying to use the \copy command from Postgres using Laravel 5.5 to insert a large file into the DB, but I'm getting the errors below.
I tried this way:
DB::statement( DB::raw("\\copy requisicoes FROM '".$file1."' WITH DELIMITER ','"));
I get this error:
SQLSTATE[42601]: Syntax error: 7 ERROR: syntax error at or near "\" LINE 1: \copy requisicoes FROM '/srv/www/bilhetagem_logs/bilhetagem_... ^ (SQL: \copy requisicoes FROM '/srv/www/bilhetagem_logs/bilhetagem_log1_2018-10-29' WITH DELIMITER ',')
I tried this way too:
DB::statement( DB::raw('\copy requisicoes FROM \''.$file1.'\' WITH DELIMITER \',\''));
I get this error:
SQLSTATE[42601]: Syntax error: 7 ERROR: syntax error at or near "\" LINE 1: \copy requisicoes FROM '/srv/www/bilhetagem_logs/bilhetagem_... ^ (SQL: \copy requisicoes FROM '/srv/www/bilhetagem_logs/bilhetagem_log1_2018-10-29' WITH DELIMITER ',')
If I execute the command returned in the error above with the psql command line, it works fine:
\copy requisicoes FROM '/srv/www/bilhetagem_logs/bilhetagem_log1_2018-10-29' WITH DELIMITER ','
Could somebody help me? :)
I have to use \copy instead of COPY because I don't have superuser privileges on the DB.
https://www.postgresql.org/docs/9.2/static/sql-copy.html
COPY naming a file is only allowed to database superusers, since it allows reading or writing any file that the server has privileges to access.
See this article on PostgreSQL and note this line:
Do not confuse COPY with the psql instruction \copy. \copy invokes
COPY FROM STDIN or COPY TO STDOUT, and then fetches/stores the data in
a file accessible to the psql client. Thus, file accessibility and
access rights depend on the client rather than the server when \copy
is used.
\copy is a psql instruction, so you do not need to write \copy, just COPY.
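For what it's worth, the quote above is the key: \copy expands to a client-streamed COPY, which is why it avoids the superuser requirement. In SQL terms it sends something like the following, with the rows then streaming from the client over the existing connection rather than being read from a server-side file:
COPY requisicoes FROM STDIN WITH DELIMITER ',';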
This is my code to import data from a SQL database into a PostgreSQL database.
First, export a CSV file with the separator '^'.
Then import the same file into pgsql using the COPY command:
// Fetch all users as arrays.
$users = User::select('*')->get()->toArray();

// Header row; COPY's HEADER option only skips this line, so its delimiter doesn't matter.
$pages = "id,warehouse_id,name,email,email_verified_at,password,remember_token,created_at,updated_at\n";
foreach ($users as $where) {
    // One '^'-separated line per user.
    $pages .= "{$where['id']}^{$where['warehouse_id']}^{$where['name']}^{$where['email']}^{$where['email_verified_at']}^{$where['password']}^{$where['remember_token']}^{$where['created_at']}^{$where['updated_at']}\n";
}
$file = Storage::disk('local')->put('user.csv', $pages);

if ($file) {
    try {
        $file_path = storage_path('app/user.csv');
        // Server-side COPY: the path must be readable by the PostgreSQL server process.
        DB::connection('pgsql')->statement("copy public.users (id, warehouse_id, name, email, email_verified_at, password, remember_token, created_at, updated_at) FROM '$file_path' DELIMITER '^' CSV HEADER ENCODING 'UTF8' ESCAPE '\"';");
    } catch (\Exception $e) {
        throw $e;
    }
}