I tried to import the example with Nebula Graph Importer and the sample config file provided at https://docs.nebula-graph.io/3.1.0/nebula-importer/config-without-header/ . The importer throws the following error:
2022/06/03 13:37:29 yaml: line 94: did not find expected key
I tried to customize the config with my own data and ran into the same issue.
What's wrong?
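For reference (an illustration added here, not from the original question): this YAML error usually points to an indentation problem near the reported line, where a key no longer lines up with its siblings. A minimal illustration, not the actual importer config:

files:
  - path: ./student.csv
   csv:                  # one space short of aligning with "path",
      withHeader: false  # so the parser fails with "did not find expected key"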
In one of our Oracle 19c databases, any expdp or impdp command fails with the error below:
ORA-31626: job does not exist
ORA-31637: cannot create job SYS_EXPORT_SCHEMA_01 for user SYSTEM
ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
ORA-06512: at "SYS.KUPV$FT", line 1145
ORA-04045: errors during recompilation/revalidation of SYS.DBMS_LOGREP_UTIL
ORA-01775: looping chain of synonyms
ORA-06508: PL/SQL: could not find program unit being called: "SYS.DBMS_LOGREP_UTIL"
ORA-06512: at "SYS.KUPV$FT", line 957
I can see that the package SYS.DBMS_LOGREP_UTIL is in an invalid state, with the compilation errors shown in the screenshot.
How can I resolve the error and compile the package? The package is wrapped, so I cannot see the table that is throwing the looping-chain-of-synonyms error.
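(The screenshot is not reproduced here; as a side note, the same compilation errors can be listed with a standard dictionary query, run as a privileged user:)

SELECT line, position, text
  FROM dba_errors
 WHERE owner = 'SYS'
   AND name  = 'DBMS_LOGREP_UTIL'
 ORDER BY sequence;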
The issue was resolved after running catrep.sql and the other related SQL scripts in the rdbms/admin directory.
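A sketch of what that might look like from SQL*Plus (run as SYS; ? expands to $ORACLE_HOME; utlrp.sql is my assumption for the "other related" scripts):

-- Rebuild the replication packages, including DBMS_LOGREP_UTIL:
SQL> @?/rdbms/admin/catrep.sql
-- Recompile any remaining invalid objects (assumed follow-up step):
SQL> @?/rdbms/admin/utlrp.sql
-- Verify the package now compiles cleanly:
SQL> ALTER PACKAGE SYS.DBMS_LOGREP_UTIL COMPILE BODY;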
I have created a database, Restaurants, through Atlas and made an empty collection, restaurants.
I connected to my Atlas account with mongosh and attempted to import a .bson file with mongorestore, using the following command:
mongorestore -d Restaurants -c restaurants /Users/username/restaurants.sampleRestaurants.bson
However, I get the following error:
Uncaught:
TypeError: Cannot assign to read only property 'message' of object 'SyntaxError: unknown: Missing semicolon. (1:15)
I can't tell what's wrong. I've been looking through the docs but can't really find an answer.
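One detail worth checking (my observation, not from the original post): mongorestore is a standalone command-line tool, and the SyntaxError above is what mongosh's JavaScript parser prints when a shell command is typed at its prompt. Run from the operating-system shell instead, the command would look roughly like this (placeholder credentials and host):

mongorestore --uri "mongodb+srv://<user>:<password>@<cluster-host>/" -d Restaurants -c restaurants /Users/username/restaurants.sampleRestaurants.bson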
When I try to create an Avro file from data containing a date object, it throws an AvroTypeException.
This is the schema I have used:
[screenshot of schema]
This is the code that writes the data:
[screenshot of code]
This is the error shown while running the code:
[screenshot of error]
Please find here the link to the full version of the code I have tried.
NOTE: Python version: 3.7.10, avro-python3 version 1.10.2
Any help or suggestions are appreciated.
avro-python3 never supported logical types and was abandoned in favor of having the avro library support both Python 2 and 3 (and now it only supports Python 3).
To fix the problem, run pip uninstall avro-python3 and then pip install avro.
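A minimal sketch of writing a date logical type with the avro package (the record and field names here are stand-ins, since the original schema was only shown as a screenshot):

import datetime
import json

import avro.schema
from avro.datafile import DataFileWriter
from avro.io import DatumWriter

# Stand-in schema: one field carries the "date" logical type
# on an int base type.
schema = avro.schema.parse(json.dumps({
    "type": "record",
    "name": "Contact",
    "fields": [
        {"name": "name", "type": "string"},
        {"name": "birthday", "type": {"type": "int", "logicalType": "date"}},
    ],
}))

# The avro package (unlike avro-python3) understands logical types,
# so a datetime.date can be appended directly without AvroTypeException.
with DataFileWriter(open("contacts.avro", "wb"), DatumWriter(), schema) as writer:
    writer.append({"name": "Alice", "birthday": datetime.date(1990, 1, 1)})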
I've been trying for two days to import data into my Mongo cluster using the mongoimport command, but I keep getting an error.
My mongoimport command:
mongoimport --uri mongodb+srv://cabinetdermes:<MYPASSWORDHERE>@cluster2.ufzkq.mongodb.net/cabinetdermes --collection myData --type json --file /Users/zakisb/Desktop/inventory.crud.json
The error that I keep getting:
2021-10-08T13:33:35.808+0100 error parsing command line options: error parsing uri: lookup cluster2.ufzkq.mongodb.net on 172.20.10.1:53: cannot unmarshal DNS message
It's strange because I've already done an import before with another DB and it worked successfully, yet this time I'm getting this error. Please help!
I solved the problem by simply going to my settings and changing my DNS address.
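For anyone hitting the same resolver error (a note added here; the specific servers are my example, not the answerer's): pointing the machine at a public DNS resolver is the usual form of this fix, e.g. in the network settings or /etc/resolv.conf:

# Example values: Google's public DNS resolvers
nameserver 8.8.8.8
nameserver 8.8.4.4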
Use it with the --gzip option.
I registered a Snowflake trial account and tried to upload a local file to Snowflake, following this tutorial: https://docs.snowflake.net/manuals/user-guide/data-load-internal-tutorial.html
I am on Windows, and it fails when I run the PUT command.
https://docs.snowflake.net/manuals/user-guide/getting-started-tutorial-stage-data-files.html
The error is
john#MYWAREHOUSE@MYDATABASE.PUBLIC>put file://c:\temp\load\contacts*.csv @my_csv_stage auto_compress=true;
Unable to parse config file: C:\Users\john/.aws/credentials
Could you please help with that?
You might be trying to upload the file to an S3 stage for which you don't have the proper credentials.
Try running this instead:
put file://c:\temp\load\contacts*.csv @~ auto_compress=true;
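For context (my addition, not part of the original answer): @~ refers to the current user's internal stage, which Snowflake manages itself, so no AWS credentials file is consulted. If a named stage is preferred, creating it as an internal stage avoids the credential lookup as well; a minimal sketch with an assumed stage name:

-- An internal named stage (no URL or credentials), then the same PUT against it:
CREATE STAGE my_internal_stage;
PUT file://c:\temp\load\contacts*.csv @my_internal_stage AUTO_COMPRESS=TRUE;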