Where is the BaseX database stored?

I'm using BaseX to run an XQuery against a set of XML files. I set this up by adding the XML files with the ADD command and then executing my XQuery. This works fine on my Windows 7 box.
I'd like to share this DB with others. I expected to find the corresponding files at the path of the BaseX DB, but this directory is empty.
What's the BaseX way of sharing this work with other developers?

By calling INFO or db:system(), you will see where your database is stored. Some general information on the location of databases can be found in the documentation of BaseX (→ Configuration).

I believe that if you use db:name or db:path on a known node, you can get the database name or the node's path within it.
The database may be in-memory, however, in which case you can use the export function to persist it on the file system.
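As a sketch of both steps in XQuery (run in the BaseX GUI or client; "mydb" and the target path are placeholders for your own database name and export directory):

```xquery
(: Report where BaseX stores its databases (look for DBPATH in the output),
   then export the database "mydb" as plain XML files for sharing. :)
db:system(),
db:export("mydb", "C:/share/mydb")
```

Colleagues can then re-create the database on their own machines by running CREATE DB or ADD against the exported XML files.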


Why does Liquibase generateChangeLog generate an empty changelog file?

I'm using Liquibase to generate a DB schema from an existing H2 database.
I use the following:
liquibase --driver=org.h2.Driver --classpath=./h2-1.4.199.jar --changeLogFile=db.schema.sql --url="jdbc:h2:mem:testdb" --username=sa --password= --logLevel=debug generateChangeLog
So, these are essentially the default values for connecting to an H2 instance. But the command above generates an empty changelog file (just some basic Liquibase headers).
I tried different URLs (file-based H2), different usernames and passwords, and I even tried defining the defaultSchemaName parameter, but the result is the same.
Liquibase maven plugin says: No changes found, nothing to do
Liquibase without maven plugin says: Liquibase command 'generateChangeLog' was executed successfully.
I also tried invalid credentials (wrong username or password), but still the same.
The generateChangeLog command exports the contents of the specified database, which means that the database must already exist and be populated with some data.
There is no point in specifying the in-memory embedded URL jdbc:h2:mem:testdb here. Each process has its own memory and therefore its own in-memory databases, so inside its own process Liquibase will only ever see an empty database.
You need to create a normal persistent database with your application and use its URL here. I strongly suggest specifying the absolute database path (without the file name extension) in the database URL, to be sure that the same database is used by your application and by Liquibase. Note that an embedded database cannot be used by two applications at the same time without additional parameters (auto-server mode), so you need to close the database in your application before you launch Liquibase. As an alternative, you can start the H2 Server process and use remote URLs, or you can use the auto-server mode.
You can also append ;IFEXISTS=TRUE to the database URL used by Liquibase only. It prevents the accidental, silent creation of a new empty database.
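Putting that together, a hedged sketch of the corrected command with a file-based URL (the path /data/testdb is a placeholder; use the absolute path your application actually writes to, without the .mv.db extension):

```shell
liquibase --driver=org.h2.Driver \
          --classpath=./h2-1.4.199.jar \
          --changeLogFile=db.schema.sql \
          --url="jdbc:h2:file:/data/testdb;IFEXISTS=TRUE" \
          --username=sa --password= \
          generateChangeLog
```

With IFEXISTS=TRUE, Liquibase fails fast with a connection error instead of silently creating and reading an empty database.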

Using SQL override in ODI 12c with a source data server different from the target

I am trying to load data into an OBIEE Oracle database in ODI 12c, using a custom source query that needs to be executed on the CRM DB. I am using a default LKM, as the source and target are on different servers; an LKM is required, otherwise the mapping fails on source connectivity.
I can see the C$ tables being created, which are not relevant, as I only need a direct load from the source query into the target. Can you suggest a suitable LKM, or a way to avoid using an LKM?
To answer your question: you can create your own LKM and leave it blank, so that no command is executed. In this way you can effectively "avoid" the LKM. But for this you need an IKM that actually performs your query.
I don't know offhand whether there is a predefined IKM that works without an LKM, but you can also create your own IKM quite easily. If you need help, please let me know.

Exporting a Neo4j database for importing into another Neo4j Desktop application

I should say that the neo4j-admin tool didn't work for me.
neo4j-admin dump --database=<database-name> --to=<database-address>
I get an error every time saying that the database doesn't exist.
Is there any other way to export my Neo4j database?
C:\Users\Shafigh\.Neo4jDesktop\neo4jDatabases\database-2912eb35-11ba-4ae1-b5b9-cb4b88a6f0a9\installation-3.4.7\bin> neo4j-admin dump --database=test_1 --to=C:/Users/Shafigh/Desktop/files
org.neo4j.commandline.admin.CommandFailed: database does not exist: test_1
        at org.neo4j.commandline.dbms.DumpCommand.execute(DumpCommand.java:83)
        at org.neo4j.commandline.admin.AdminTool.execute(AdminTool.java:127)
        at org.neo4j.commandline.admin.AdminTool.main(AdminTool.java:51)
Caused by: java.lang.IllegalArgumentException: Directory 'C:\Users\Shafigh\.Neo4jDesktop\neo4jDatabases\database-2912eb35-11ba-4ae1-b5b9-cb4b88a6f0a9\installation-3.4.7\data\databases\test_1' does not contain a database
        at org.neo4j.kernel.impl.util.Validators.lambda$static$3(Validators.java:111)
        at org.neo4j.commandline.dbms.DumpCommand.execute(DumpCommand.java:79)
        ... 2 more
command failed: database does not exist: test_1
[screenshots: Neo4j Desktop project view; graph_db size]
test_1 is just the name of your Neo4j Desktop "project". It is not the name of your "database".
Use this command line instead, which specifies the default database name (the one you are actually using):
neo4j-admin dump --database=graph.db --to=C:/Users/Shafigh/Desktop/files
[UPDATED with more details, from my comments]
Neo4j Desktop is an environment that allows you to have many projects, with each project having possibly multiple DBs and plugins with different versions. So, your "test_1" is just the name of your project. Within that project, each DB will have its own directory structure, and by default the directory in that structure that contains your DB is given the name "graph.db". You can change the name of that DB if you want by setting the dbms.active_database property in that DB's neo4j.conf file -- but that is rarely useful.
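To finish the transfer on the other machine, a hedged sketch of the import side (run from the target installation's bin directory; the dump file name graph.db.dump is an assumption based on the database name, so check what the dump step actually produced):

```shell
# Import the dump into the other installation's default database.
# --force overwrites any existing database of that name, so use with care.
neo4j-admin load --from=C:/Users/Shafigh/Desktop/files/graph.db.dump --database=graph.db --force
```

The target database must be stopped while neo4j-admin load runs.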

How do I use a SQL query to determine the TFS area name and build path, including the file name, for installers

I am trying to automate the release process associated with installers that have been created on my build server. To do this I was hoping to be able to use an SQL query to get the drop path of the installer including the file name.
In TFS when I go to "Build" and select "Artifacts" I can then use the "Explore" link to get the root path for the build. The subsequent folders "Installer\Disk" path is part of the configuration. However, the actual setup file is composed of the + " setup.exe". Since there are multiple projects in our TFS I was hoping to use a query to find all builds that have a build quality set to "Release" and dynamically find the installer on disk.
Generally our installer names are made up of the root area path name with all text removed. I can't figure out how to connect the build to the root area path in SQL.
Any ideas?
Generally, accessing the information directly from the database is not recommended, since it carries a high risk. I would recommend using the TFS API to do this instead.
The drop location of the build is stored in the tbl_Build table of the TFS collection database, and the quality information is stored in the tbl_BuildQuality table. Join these two tables to query the information you want.
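As a rough sketch only: the join might look like the following, but the column names used here (BuildNumber, DropLocation, BuildQualityId, BuildQuality) are assumptions that vary across TFS versions, so verify them against your actual collection database schema before relying on this:

```sql
-- Hypothetical: find the drop locations of builds marked "Release".
-- Column names are assumptions; check your TFS collection database schema.
SELECT b.BuildNumber, b.DropLocation
FROM tbl_Build AS b
JOIN tbl_BuildQuality AS q
  ON q.BuildQualityId = b.BuildQualityId
WHERE q.BuildQuality = 'Release';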

Checking whether a directory contains files or not in T-SQL

I'm trying to code an MSSQL job that does something with the files in a specific directory. But I don't know the names of the files; they will vary over time.
I've found the xp_cmdshell command, but I cannot use it for security reasons.
Is there any other way in T-SQL to check whether a directory contains .txt files (and if so, to get their names)?
Thanks in advance,
Without access to the xp_ stored procedures, no. The other way would be to use sp_OACreate to create a COM Scripting.FileSystemObject, but again, access to this may well be restricted, as it poses the same kind of security issue.
As you're describing this as an MSSQL job, I'm assuming it is going to be a scheduled task of some description? If so, your best option is probably to create a standard Windows batch file (.BAT), scheduled via SQL Server Agent, that checks which files exist and passes whatever it finds to your SQL script via sqlcmd/osql.
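A POSIX-shell sketch of that batch-file idea, so the shape of the loop is clear (the directory here is a throwaway temp dir and the sqlcmd call is a commented-out placeholder; a Windows .BAT would use `for %%f in (C:\drop\*.txt)` instead):

```shell
# Sketch: enumerate .txt files in a directory and hand each one to a SQL
# script. Directory contents are created here purely for demonstration.
DIR=$(mktemp -d)
touch "$DIR/orders.txt" "$DIR/invoices.txt"

for f in "$DIR"/*.txt; do
  [ -e "$f" ] || continue          # skip if the glob matched nothing
  echo "found: $(basename "$f")"
  # sqlcmd -S myserver -d mydb -v FileName="$f" -i process_file.sql   # hypothetical call
done
```

The `[ -e "$f" ]` guard matters: if the directory contains no .txt files, the unexpanded glob pattern itself would otherwise be passed through the loop once.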
