Following this QuickStart doc, I created a database but couldn't see it show up when querying.
Details are:
docker run -itd --env cpu=2 --env memory=4 -p 31007:31007 cnosdb/cnosdb
docker exec -it <container_id> sh
$ cnosdb-cli
CREATE DATABASE oceanic_station;
show databases;
SHOW DATABASES;
The error message:
❯ show databases;
"{"error_code":"0100000","error_message":"Error executiong
query: Failed to do execute statement, err:Failed to do parse. err:
sql parser error: Expected tables/databases, found: databases"}"
❯ SHOW DATABASES; "error sending request for url
(http://0.0.0.0:31007/api/v1/sql?db=public): connection closed before
message completed"
The Docker image may be behind the stable release versions.
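If the published image is older than the docs, pulling a fresh copy and recreating the container may help. A minimal sketch (tag availability is an assumption; check the cnosdb/cnosdb tags on Docker Hub):
docker pull cnosdb/cnosdb:latest
docker rm -f <container_id>
docker run -itd --env cpu=2 --env memory=4 -p 31007:31007 cnosdb/cnosdb:latest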
I am using Ubuntu 22.04 LTS as a sudo user.
I made a React application, built an image from it, and ran it in a container successfully. But I want to get inside the container, so I ran the command below:
docker exec -it e448b7024af bash
but I got the following error:
Error response from daemon: Container e448b7024af19a0bb is not running
I ran the command below to check whether the container is running:
docker ps
// my container showed up in the list
// I also performed some actions in the React application to double-check that the container was working, and it worked perfectly
Below is the output of the above command:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
56f8042d2f1 react_d "docker-entrypoint.s…" 12 minutes ago Up 12 minutes 0.0.0.0:3000->3000/tcp, :::3000->3000/tcp youthful_sammet
Then, based on another solution, I tried the command below:
docker run -it e448b7024af /bin/bash
and I got the following error:
Unable to find image 'e448b7024af:latest' locally
docker: Error response from daemon: pull access denied for e448b70254af, repository does not exist or may require 'docker login': denied: requested access to the resource is denied.
See 'docker run --help'.
Then I tried the following command based on a solution I found:
docker pull e448b7024af:latest
but I got the following error:
Error response from daemon: pull access denied for e448b7024af, repository does not exist or may require 'docker login': denied: requested access to the resource is denied
I also tried:
docker exec -it 568f8042d2f1 bash
and I got the following error:
OCI runtime exec failed: exec failed: unable to start container process: exec: "bash": executable file not found in $PATH: unknown
Below is my Dockerfile:
FROM node:alpine
WORKDIR /app
COPY /package*.json ./
RUN npm install
COPY . .
CMD ["npm","run","start"]
My container is working properly, but I am unable to get inside it. Any help is appreciated. Thanks in advance.
Based on the output from docker ps, your container ID is 56f8042d2f1 and not e448b7024af, which I suspect might be your image ID or a container ID from a previous run.
Another thing is that bash isn't installed in Alpine images by default. You can use sh instead.
You can use the more human-friendly container name of youthful_sammet in your command and do
docker exec -it youthful_sammet sh
or, if you prefer the id
docker exec -it 56f8042d2f1 sh
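If you really want bash available inside the container, it could also be installed in the image itself. A minimal sketch based on the Dockerfile above (apk is Alpine's package manager):
FROM node:alpine
# bash is not included in Alpine-based images, so install it explicitly
RUN apk add --no-cache bash
WORKDIR /app
COPY /package*.json ./
RUN npm install
COPY . .
CMD ["npm","run","start"]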
You are using the wrong container identifier in the docker exec command: the container ID shown by docker ps is not the one you passed.
I personally use the container name as the identifier; it is easier to remember.
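If you are unsure which names and IDs are in use, docker ps can print just those columns, for example:
docker ps --format "table {{.ID}}\t{{.Names}}\t{{.Status}}"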
I want to install the wal2json plugin in openGauss installed via Docker, but an error is reported. I installed it according to this article:
https://opengauss.org/en/blogs/blogs.html?post/lihongda/debezium-adapt-opengauss/
This is the error message when I execute the statement pg_recvlogical -d postgres -S test_wal2json --create -U gaussdb -h localhost -P wal2json:
could not send replication command "CREATE_REPLICATION_SLOT "test_wal2json" LOGICAL "wal2json"": FATAL: could not load library "wal2json.so": /usr/local/opengauss/lib/postgresql/wal2json.so: undefined symbol: _Z20RelationGetIndexListP12RelationData
Add shared_preload_libraries = 'wal2json' to postgresql.conf, then restart openGauss and try again.
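A rough sketch of applying this inside the container (the container name opengauss and the data directory path are assumptions; adjust them to your installation):
# append the setting to postgresql.conf inside the container, then restart it
docker exec opengauss bash -c "echo \"shared_preload_libraries = 'wal2json'\" >> /var/lib/opengauss/data/postgresql.conf"
docker restart opengauss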
I have a SQL Server that is started inside a Docker container from GitLab. The following YAML file does that:
services:
  - name: mcr.microsoft.com/mssql/server:2019-latest
    alias: mssql
variables:
  MSSQL_HOST: microsoft-mssql-server-linux
  ACCEPT_EULA: Y
  MSSQL_COLLATION: Latin1_General_CS_AS
  SA_PASSWORD: yourStrong(!)Password
Now when I start this pipeline, it fails while running the SQL script, saying that the login failed for user SA. The error is:
$ /opt/mssql-tools/bin/sqlcmd -S localhost -U SA -P
"yourStrong(!)Password" -i Scripts/DBScript.sql Sqlcmd: Error:
Microsoft ODBC Driver 17 for SQL Server : Login failed for user 'SA'..
Cleaning up file based variables ERROR: Job failed: command terminated
with exit code 1
This error gets resolved once I remove the collation "MSSQL_COLLATION: Latin1_General_CS_AS" from the YAML file.
This suggests that I am not able to change the MS SQL Server collation.
Note: the container spawned by GitLab is a Linux container, which also runs the Docker image "mcr.microsoft.com/mssql/server:2019-latest".
Any idea how to change the collation at the MS SQL Server level?
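For reference, SQL Server on Linux also exposes the server-level collation through the mssql-conf tool, which can be run inside an already-started container. A sketch against a locally run container (the container name mssql is an assumption; the tool prompts interactively and the server needs a restart afterwards, so it may not fit neatly into a CI service definition):
docker exec -it mssql /opt/mssql/bin/mssql-conf set-collation
docker restart mssql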
I am trying to work with SQL Server data using an Apache Zeppelin Docker image.
I run this Docker image:
docker run --name zeppelin -p 8088:8080 -v /zeppelin-sqlserver/notebook -v /zeppelin-sqlserver/conf -d yorek/zeppelin-sqlserver:latest
I open the Zeppelin GUI at http://localhost:8088.
I edit the SQL Server interpreter: I enter my SQL Server instance name, database, user name, and password.
I create a note in the Notebook.
Everything is OK, but when I try to run a query, for example:
Select @@version
to test how it works, I receive:
java.lang.NoClassDefFoundError: org/apache/hadoop/security/UserGroupInformation$AuthenticationMethod
What is incorrect? What do I need to fix?
Maybe I need a different Zeppelin Docker image?
I want to create a shell script that automates creating and running a Postgres container and creating a database in it, using Docker.
I want to use the official postgres Docker image.
The script that I use is as follows:
docker network create --subnet=172.18.0.0/16 shared_network;
docker kill postgres_linker;
docker rm postgres_linker;
docker run --name postgres_linker -e POSTGRES_PASSWORD=blahblahblah -d --net shared_network --ip 172.18.0.2 postgres:10-alpine;
docker exec -it postgres_linker psql -U postgres -c "create database linker;";
But when I run this, I get the following output and no database is created:
Error response from daemon: network with name shared_network already exists
postgres_linker
postgres_linker
b2a9fd4d6e25b62d60adb05c8b6b653a1b55ec7a869c4728677d6289f5cddd63
psql: could not connect to server: No such file or directory
Is the server running locally and accepting
connections on Unix domain socket "/var/run/postgresql/.s.PGSQL.5432"?
The first line of this log is OK, and so are the second and third. The problem is that psql does not run the command on the postgres container, although the command I am trying to run is correct. If I run the last command separately from my shell script:
docker exec -it postgres_linker psql -U postgres -c "create database linker;";
It does not give me an error, and it works!
Why is this happening?
I found the solution to my problem.
I had wrongly assumed that when the postgres container is run, the server is up and running immediately.
That is not true; it takes some time to come up, so I needed to add some delay.
So the resulting script file should look like this:
docker network create --subnet=172.18.0.0/16 shared_network;
docker kill postgres_linker;
docker rm postgres_linker;
docker run --name postgres_linker -e POSTGRES_PASSWORD=blahblahblah -d --net shared_network --ip 172.18.0.2 postgres:10-alpine;
sleep 5;
docker exec -it postgres_linker psql -U postgres -c "create database linker;";
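As an alternative to a fixed sleep, the script could poll until the server actually accepts connections. A minimal sketch of replacing the delay (pg_isready ships inside the official postgres image):
# instead of: sleep 5;
until docker exec postgres_linker pg_isready -U postgres; do sleep 1; done;
docker exec -it postgres_linker psql -U postgres -c "create database linker;";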