I am trying to import a JSON file using mongoimport:
mongoimport --db test --collection test1 --drop –file ~/Downloads/data/data.json –jsonArray
it sometimes gives this error:
error parsing command line options: invalid argument for flag `/j, /numInsertionWorkers' (expected int): strconv.ParseInt: parsing "sonArray": invalid syntax
and sometimes this:
error validating settings: can not use --fields when input type is JSON
I am new to mongoimport and I couldn't find any solution.
You entered
mongoimport --db test --collection test1 --drop –file ~/Downloads/data/data.json –jsonArray
but it should be
mongoimport --db test --collection test1 --drop --file ~/Downloads/data/data.json --jsonArray
Notice that –file and –jsonArray in your command start with an en-dash (–) rather than two hyphens (--); that is what confuses the option parser.
Note that you can use the option -j or --numInsertionWorkers=<number>; however, -j is not mentioned in the mongoimport documentation. You only see it in the output of mongoimport --help.
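For example, to run the corrected import with four parallel insertion workers (the worker count of 4 is only an illustration):
mongoimport --db test --collection test1 --drop --file ~/Downloads/data/data.json --jsonArray --numInsertionWorkers 4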
Having some trouble using the bulk:upsert command to update Account objects via a CSV file. Hopefully someone can help me with this. Below is what I'm doing:
My CSV file is named account.csv and it contains the following data:
Id,Name
0012F00000QjhC7QAJ,LimTest 1
0012F00000QjhkSQAR,LimTest 2
Below is the command that I'm running:
sfdx force:data:bulk:upsert -s Account -f account.csv -i Id -u dev
The above command gets submitted successfully, but the job failed.
The batch status and the request details were shown in screenshots, which are not reproduced here.
It works after I manually created an empty file and copied and pasted the data into this new file. The original file, account.csv, was created using this command:
sfdx force:data:soql:query -q "select Id, Name from Account" -r csv -u dev > account.csv
I guess the above command must have created the file in an encoding that bulk:upsert does not know how to handle.
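If I had to guess at the specifics (an assumption, not confirmed): on Windows, PowerShell's > redirection writes UTF-16 by default, while the Bulk API expects UTF-8. On a Unix-like shell you could verify and convert the file, for example:
file account.csv        # reports the encoding, e.g. "Little-endian UTF-16 Unicode text"
iconv -f UTF-16LE -t UTF-8 account.csv > account-utf8.csv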
I want to dump a shapefile from my PostgreSQL database with the following command line:
pgsql2shp -f output.shp -h localhost -u postgres -P admin parcel "SELECT * FROM parcel.export_output WHERE ParcelNoEng=116"
but it goes on showing the error:
ERROR: function postgis_version() does not exist
LINE 1: SELECT postgis_version()
^
HINT: No function matches the given name and argument types. You might need to add explicit type casts.
What shall I do to get it to work?
I have added Postgresql/version/bin to my PATH environment variable.
You're missing the PostGIS extension. Execute the following command in your PostgreSQL database using a client of your choice, e.g. psql, and try again:
CREATE EXTENSION postgis;
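For example, with psql and the connection parameters from the pgsql2shp command above (assuming the database is named parcel):
psql -h localhost -U postgres -d parcel -c "CREATE EXTENSION postgis;"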
Good Day,
I've been trying to restore a dump file using the psql client and I'm getting this error:
psql.bin:/home/user/Desktop/dump/dumpfile.sql:907:
ERROR: more than one function named "pg_catalog.avg"
CONTEXT: COPY pg_aggregate, line 1, column aggfnoid: "pg_catalog.avg"
I created the dump file from a different Postgres DB (version: 9.4.5) using the command:
pg_dump --username=pgroot ${tables} --no-owner --no-acl --no-security --no-tablespaces --no-unlogged-table-data --data-only dbname > dumpfile.sql
Where ${tables} is a variable in the form:
-T table1 -T table2 -T table3 ...
This is because I'm dumping specific tables listed in a newline-delimited file. Hence it's not the entire database but specific tables I want to dump.
I tried loading the dump file into another Postgres DB (9.6) using the following command:
psql -d dbname -U superuser -v "ON_ERROR_STOP=1" -f ${DUMP_DIR}dumpfile.sql -1 -a > ${LOG_ERR_DIR}dumpfile.log 2>${LOG_ERR_DIR}dumpfile.err
This gave the error mentioned above. It seems this error is occurring because the dump file tries to add the function "pg_catalog.avg" to the database and it gives an error because it already exists.
The SQL file generated by pg_dump does not create the pg_catalog.avg function anywhere, so I don't know why this is occurring.
So I tried dropping the database and creating it from template0, and still I got the error. It seems to me that it's a bug, based on the following post:
Re: BUG #6176: pg_dump dumps pg_catalog tables
I'm stuck trying to resolve this issue. If anyone can help, please respond.
Thank you in advance,
j3rg
I found out what was causing this issue. There was an extra newline in the file containing the table listing. This produced an extra -T argument with no table name, and in turn pg_dump exported the system catalog tables into the dump. The file I was searching in for the avg function was the wrong file, too.
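For anyone hitting the same problem, one way to make the ${tables} construction immune to blank lines is to filter them out before building the -T flags. A minimal sketch, assuming the table names live in a file called tables.txt (the file name here is hypothetical):
# Skip blank lines so a trailing newline never yields an empty -T argument
tables=$(grep -v '^[[:space:]]*$' tables.txt | sed 's/^/-T /' | tr '\n' ' ')
pg_dump --username=pgroot ${tables} --no-owner --no-acl --no-security \
  --no-tablespaces --no-unlogged-table-data --data-only dbname > dumpfile.sql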
I'm trying to import data into a SQL Server table from a file using a format file.
In fact I have two databases: a production database and a local database.
I want to insert some rows of the shipper table from the production database into the local one. The shipper table has neither the same columns nor the same column order in the two databases.
That's why I used a format file to do my bcp.
I generate a file containing the rows I want to insert into my local database with the following command:
bcp "SELECT shipper_id,Shipper_name FROM ProductionDatabase.dbo.shipper where shipper_id >5" queryout shipper.txt -c -T
It works!
I then generate the format file with the schema of my local table with the following command:
bcp LocalDatabase.dbo.shipper nul -T -n -f shipper-n.fmt
It works!
Unfortunately, when I tried to insert the file data into my local table with the following command:
bcp LocalDatabase.dbo.shipper in shipper.txt -T -f shipper-n.fmt
it generated the following error (translated from French):
unexpected end of file encountered in the bcp data file
Does anyone know what the problem is and how I can get around it?
Thanks in advance
Your format file does not match the data. You are exporting as text using -c:
bcp "SELECT shipper_id,Shipper_name FROM ProductionDatabase.dbo.shipper where shipper_id >5" queryout shipper.txt -c -T
but your format file is made for native (binary) data using -n:
bcp LocalDatabase.dbo.shipper nul -T -n -f shipper-n.fmt
Either export both as native (my recommendation) or both as text. To prevent this error, generate the data file and the format file at the same time: simply add -f shipper.fmt to your export.
Text version:
bcp "SELECT shipper_id,Shipper_name FROM ProductionDatabase.dbo.shipper where shipper_id >5" queryout shipper.txt -c -T -f shipper.fmt
or
Native version:
bcp "SELECT shipper_id,Shipper_name FROM ProductionDatabase.dbo.shipper where shipper_id >5" queryout shipper.txt -n -T -f shipper.fmt
P.S. Since you can run into scenarios where your field or row delimiters appear in the data, you should pick a character sequence that does not occur in your data as a separator, for instance -t"\t|\t" (tab-pipe-tab) for fields and -r"\t|\n" (tab-pipe-newline) for rows. If you combine the format statement with the export, the data and the format file will match, and you keep the freedom to change the separators on a single command line.
Specify the separators after the -n or -c on the command line.
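Putting the P.S. together with the text export, a sketch of the full round trip (same query as above, with the separators suggested in the P.S.):
bcp "SELECT shipper_id,Shipper_name FROM ProductionDatabase.dbo.shipper where shipper_id >5" queryout shipper.txt -c -t"\t|\t" -r"\t|\n" -T -f shipper.fmt
bcp LocalDatabase.dbo.shipper in shipper.txt -T -f shipper.fmt
The import reuses the format file, which already records the separators, so -t and -r do not need to be repeated there.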
The name of our production database (created by a previous vendor) begins with numbers (e.g. 3717_databasename). As such, I keep getting the message "an error occurred while processing the command line" when trying to export the contents of a table. Here is my command:
bcp 3717_databasename..TableName out "C:\Temp\TableName.dat" -E -n -Slocalhost -Umyusername -Pmypassword
In such a scenario, the solution is to specify the database name separately with the "-d" switch, like so:
bcp TableName out "C:\Temp\TableName.dat" -E -n -Slocalhost -d3717_databasename -Umyusername -Pmypassword