How to change the default log file location in CnosDB?

There are options to set the log level, query logging, and log format per this doc. But is there a way to change the default log file location in CnosDB?

You can set the log file location in the config file, for example:
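A minimal sketch of the relevant [log] section, assuming the standard TOML layout of the CnosDB config file; the path value below is only illustrative:

[log]
level = 'info'
path = '/var/log/cnosdb'

Point path at any directory the cnosdb process can write to.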
All other log-related settings can also be set in the config file.

Related

PutSFTP is taking a wrong path on the SFTP server in NiFi

I have a flow to fetch a file from an SFTP server, rename it, and put it back on the server in the same location.
My flow:
ListSFTP -> FetchSFTP -> UpdateAttribute -> PutSFTP
My file location is on the D drive, and I have set that location in the Remote Path property of PutSFTP, but it builds the path as
c:/users/myname/d:/file/location
And of course it is giving me an error.
Is there any solution for this?
Thanks in advance.
You can use the SFTP processors only when you are connecting to a server with a host, port, etc.
If you want to get files from your local disk (C:/ for example) you can use the GetFile processor.
An example flow could be this:
GetSFTP with the property Keep Source File set to false
UpdateAttribute
new property -> filename -> new_file_test.example
PutSFTP
You can combine GetSFTP/GetFile with PutSFTP/PutFile.

Image Extractor by AI Habitat produces a configuration error when importing Matterport dataset

I need help understanding the error message, which seems to say the file name should be changed to .json because the configuration fails. The full error message is long, so I pasted the part that is repeated most often throughout it:
/Users/kyra/Documents/GitHub/habitat-sim/matterport/scans/house1/8194nk5LbLH 13/poisson_meshes/8194nk5LbLH_10.stage_config.json
I0412 19:04:17.735939 42397184 AttributesManagerBase.h:296] AttributesManager::createFromJsonOrDefaultInternal (Stage) : Proposing JSON name : /Users/kyra/Documents/GitHub/habitat-sim/matterport/scans/house1/8194nk5LbLH 13/poisson_meshes/8194nk5LbLH_10.stage_config.json from original name : /Users/kyra/Documents/GitHub/habitat-sim/matterport/scans/house1/8194nk5LbLH 13/poisson_meshes/8194nk5LbLH_10.ply | This file does not exist.
I0412 19:04:17.736085 42397184 AbstractObjectAttributesManagerBase.h:182] AbstractObjectAttributesManager::createObject (Stage) : Done making attributes with handle : /Users/kyra/Documents/GitHub/habitat-sim/matterport/scans/house1/8194nk5LbLH 13/poisson_meshes/8194nk5LbLH_10.ply
I0412 19:04:17.736093 42397184 AbstractObjectAttributesManagerBase.h:189] File (/Users/kyra/Documents/GitHub/habitat-sim/matterport/scans/house1/8194nk5LbLH 13/poisson_meshes/8194nk5LbLH_10.ply) exists but is not a recognized config filename extension, so new default Stage attributes created and registered.
I0412 19:04:17.736124 42397184 SceneDatasetAttributes.cpp:46]
What I did: ran the image extractor after activating the Conda env. I modified the image extractor to change the file path to point to a .ply file in the Matterport dataset.
Setup: 1) Facebook's AI Habitat-sim built from source,
2) MacBook Air M1,
3) Conda environment with the dependencies (installed using pip install -r requirements.txt), but habitat-sim itself is not installed by Conda,
4) Matterport3D dataset (downloaded one house).
Thank you.

Rundeck - Failed to read SSH Private Key stored at path - Path does not exist

I am running the Rundeck WAR file directly:
java -jar rundeck-3.0.17-20190311.war
I get this error message when I trigger a build.
Failed to read SSH Private key stored at path:
keys/rundeck.pem: org.rundeck.storage.api.StorageException:
Path does not exist: keys/rundeck.pem
Failed: ConfigurationFailure: Failed to read SSH
Private key stored at path: keys/rundeck.pem
It makes sense that the reference in the Default Node Executor is invalid and that Rundeck cannot find the .pem file.
I've tried:
referencing the full working directory (/home/user/rundeck/keys/rundeck.pem), but it wants the location to start with keys/
referencing it by its relative path (keys/rundeck.pem)
copying the keys directory to /home/user/
In desperation, I ran chmod 700 on the .pem file.
Most of the questions and examples I found were for older versions of Rundeck.
I'd like to know where the .pem file must be configured and how it should be referenced. Any other information that could help me configure the SSH keys would be appreciated.
You must add the key using the GUI and use the path that you defined in your resources.xml.
To add your key, you can follow this video; although it is based on Rundeck 2.x, it is still valid for Rundeck 3.x:
https://www.youtube.com/watch?v=qOA-kWse22g
To generate your resources.xml file, select your new project and go to Project Settings > Edit Nodes > click the "Configure Nodes" button (top right) > click the "Add Sources +" button > select the "+ File" option > in the "Format" field select "resourcexml" and fill in the path in the "File Path" field (put the file name at the end, usually "resources.xml"), then select the "Generate", "Include Server Node" and "Writeable" checkboxes and click the "Save" button.
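As a rough sketch (not the exact file from this setup): once the key has been uploaded to Key Storage under keys/rundeck.pem, a node entry in resources.xml can reference it through the ssh-key-storage-path attribute; the node name, hostname, and username below are placeholders:

<?xml version="1.0" encoding="UTF-8"?>
<project>
  <node name="remote-node"
        hostname="192.168.1.50"
        username="rundeck"
        osFamily="unix"
        ssh-key-storage-path="keys/rundeck.pem"/>
</project>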

Solr extract text from image and imagePdf files

I am working with Solr 6.5.1 and I want to extract text from image files and image PDF files. For this I installed Tesseract OCR and configured it with Solr in two ways:
1. The TESSDATA_PREFIX environment variable is set to C:\Program Files (x86)\Tesseract-OCR, and I used the /update/extract request handler to index the image with its content.
2. I modified the tesseractOCRConfig.properties file in the tika-parsers-1.13 jar in the Solr lib to tesseractPath=C:/Program Files (x86)/Tesseract-OCR, and I used the /update/extract request handler to index the image/image PDF with its content.
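In both cases the indexing request was, roughly, an /update/extract call of this shape (core name, document id, and file name are placeholders):

curl "http://localhost:8983/solr/mycore/update/extract?literal.id=img1&uprefix=attr_&commit=true" -F "file=@sample.png"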
With both approaches I'm not getting any content; the response only contains attr_x_parsed_by=org.apache.tika.parser.ocr.TesseractOCRParser.
Is there any other configuration I need to set for Solr and Tesseract OCR to extract content from image/image PDF files?
Thanks in advance.

ODK Briefcase command-line form export

I am trying to automatically back up submitted forms and transform them to CSV.
I am using this command line:
java -jar ./ODK_Briefcase_v1.4.5_Production.jar --form_id NameOfTheForm
--odk_username USER --odk_password PASSWORD
--export_directory /var/www/data --storage_directory /var/www/data
--export_filename A_Chaufferie.csv --overwrite_csv_export
--export_start_date 2014/02/05 --export_end_date 2016/02/06
I get the error GRAVE: Form not found
I have no idea what the purpose of storage_directory is. I can't find any form submissions on my server (tried with the Linux find command).
Do you know what I am missing?
Here is the --help output:
java -jar briefcase.jar
-ed,--export_directory </path/to/dir> Directory to export the CSV and
media files into (relative path
unless it begins with / or C:\)
-em,--exclude_media_export Flag to exclude media on export
-end,--export_end_date <yyyy/MM/dd> Include submission dates before
(exclusive) this date in export
to CSV
-f,--export_filename <name.csv> File name for exported CSV
-h,--help Print help information (this
screen)
-id,--form_id <form_id> Form ID of form to download and
export
-oc,--overwrite_csv_export Flag to overwrite CSV on export
-od,--odk_directory </path/to/dir> /odk directory from ODK Collect
(relative path unless it begins
with / or C:\)
-p,--odk_password <password> ODK password
-pf,--pem_file </path/to/file.pem> PEM private key file (relative
path unless it begins with / or
C:\)
-sd,--storage_directory </path/to/dir> Directory to create or find ODK
Briefcase Storage directory
(relative path unless it begins
with / or C:\)
-start,--export_start_date <yyyy/MM/dd> Include submission dates after
(inclusive) this date in export
to CSV
-u,--odk_username <username> ODK username
-url,--aggregate_url <url> ODK Aggregate URL (must start
with http:// or https://)
-v,--version Print version information
I didn't know I had to pass --aggregate_url, even when Briefcase and ODK Aggregate are on the same server. Don't miss the http:// or it won't work:
java -jar ./ODK_Briefcase_v1.4.5_Production.jar --form_id NameOfTheForm
--odk_username USER --odk_password PASSWORD
--export_directory /var/www/data --storage_directory /var/www/briefcase
--export_filename A_Chaufferie.csv --overwrite_csv_export
--export_start_date 2014/02/05 --export_end_date 2016/02/06
--aggregate_url http://your.odk-aggregate.site
