I have a Visual Studio 2017 (15.3) solution with two projects:
An API written in ASP.NET Core 2 MVC
A Database Project
I was able to "dockerize" the MVC project easily (right click, Add Docker Support), but while trying to dockerize the Database project I keep getting the error: Value cannot be null. Parameter name: stream. My Google-fu is failing me; the closest resource I found is for Visual Studio 15.2.
How I've Set Up the Database Project So Far
Added Dockerfile to root:
FROM microsoft/mssql-server-linux:latest
EXPOSE 1433
ENV ACCEPT_EULA=Y
ENV LANG en_US.UTF-8
ENV LANGUAGE en_US:en
ENV LC_ALL en_US.UTF-8
ENV MSSQL_TCP_PORT=1433
# Add Database project output from VS build process
RUN mkdir --parents /_scripts/generated
COPY ./_scripts /_scripts/
COPY ./_scripts/generated/*.sql /_scripts/generated/
# Add shell script that starts MSSQL server, waits 60 seconds, then executes script to build out DB (script generated from VS build process)
CMD /bin/bash /_scripts/entrypoint.sh
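The entrypoint script referenced by CMD might look roughly like this (a hypothetical sketch of what the comment above describes; the sqlservr/sqlcmd paths and the SA_PASSWORD variable are assumptions, not from the question):

```shell
#!/bin/bash
# Hypothetical sketch of /_scripts/entrypoint.sh: start SQL Server in the
# background, wait for it to come up, then run each generated .sql file.
# Binary paths and SA_PASSWORD are assumptions, not from the question.
SQLSERVR=/opt/mssql/bin/sqlservr
SQLCMD=/opt/mssql-tools/bin/sqlcmd

if [ -x "$SQLSERVR" ]; then
  "$SQLSERVR" &
  sleep 60
  for f in /_scripts/generated/*.sql; do
    "$SQLCMD" -S localhost -U sa -P "$SA_PASSWORD" -i "$f"
  done
  wait
else
  echo "sqlservr not found; this sketch only runs inside the container"
fi
```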
Modified docker-compose.yml file to include new project
version: '3'
services:
  webapp-api-service:
    image: webapp-api
    build:
      context: ./src/API
      dockerfile: Dockerfile
  webapp-db-service:
    image: webapp-db
    build:
      context: ./src/Database
      dockerfile: Dockerfile
Modified docker-compose.override.yml file to expose the port for dev SSMS access
version: '3'
services:
  webapp-api-service:
    environment:
      - ASPNETCORE_ENVIRONMENT=Development
    ports:
      - "80"
  webapp-db-service:
    ports:
      - "1433"
Here's the build output
2>C:\Program Files (x86)\Microsoft Visual Studio\2017\Professional\MSBuild\Sdks\Microsoft.Docker.Sdk\build\Microsoft.VisualStudio.Docker.Compose.targets(279,5): error : Value cannot be null.
2>C:\Program Files (x86)\Microsoft Visual Studio\2017\Professional\MSBuild\Sdks\Microsoft.Docker.Sdk\build\Microsoft.VisualStudio.Docker.Compose.targets(279,5): error : Parameter name: stream
2>Done building project "docker-compose.dcproj" -- FAILED.
Thanks in advance!
I ran into this same issue yesterday. I just solved it by removing the build portion of the database service. I'll just have to build the database project manually for now.
You can add a file named AppType.cache to the project's obj/Docker folder, with the content AspNetCore, as a workaround.
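A sketch of that workaround, run from the database project directory (the relative path follows the answer above):

```shell
# Create the AppType.cache file that the Visual Studio Docker tooling reads;
# obj/Docker may not exist yet on a fresh checkout.
mkdir -p obj/Docker
printf 'AspNetCore' > obj/Docker/AppType.cache
cat obj/Docker/AppType.cache
```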
Related
I have created a WPF project. It uses .NET 6 (no Blazor or Razor) and WebView2. The problem I have is with automating the creation of a Setup.exe or MSI file or whatever. I already have a setup.exe, but something is strange with my application's executable after installing the application with that setup.exe. What I tried: WiX (but I got a tip to use Inno Setup, because it's free and easy to use). So I tried Inno Setup, and I am so extremely close to getting it to work that I do not want to give up yet. I managed to create my Setup.iss file and build a setup.exe. For this I used GitLab CE, a runner, and the amake/innosetup image. This one here:
Tool_Build:
  stage: build
  image: mcr.microsoft.com/dotnet/sdk:6.0
  only:
    - master
  script:
    - dotnet build "$CI_PROJECT_DIR/Tool/Tool.csproj" /p:Version=$VERSION --configuration Release
    - dotnet publish -p:PublishProfile=FolderProfile
  tags:
    - runner
  artifacts:
    expire_in: 1 day
    paths:
      - "$CI_PROJECT_DIR/Tool/bin/publish"
      - "$CI_PROJECT_DIR/ATASetup.iss"

Tool_Setup:
  stage: setup
  image:
    name: amake/innosetup
    entrypoint: [""]
  dependencies:
    - Tool_Build
  only:
    - master
  tags:
    - runner
  script:
    - iscc "ATASetup.iss"
  artifacts:
    expire_in: 1 day
    paths:
      - "Output/Setup.exe"
The current problem I have is: when I install my application, it runs, BUT the EXE behaves strangely. It opens a console, and the console then opens my application. When I use the EXE in the project's "publish" folder, everything works well. When I run Inno Setup on my PC manually, it works perfectly. Does anyone have some help for me? I am just a beginner, and I am so close to getting it done (I hope).
In short: I am relatively new to programming and CI/CD.
I could just use a patient helping hand =)
I'm trying to get React to update the site's content whenever a file is saved.
I'm using VS Code, which doesn't have safe write. I'm using docker-compose on Windows via Docker Desktop.
Dockerfile:
FROM node:17-alpine
WORKDIR /front
ARG FRONT_CMD
ARG API_HOSTNAME
ENV REACT_APP_API_HOSTNAME=$API_HOSTNAME
COPY . .
RUN npm i @emotion/react @emotion/styled
CMD $FRONT_CMD
relevant part of docker-compose.yml:
frontend:
  volumes:
    - ./frontend/src:/front/src
    - /front/node_modules
  build:
    context: ./frontend
    dockerfile: Dockerfile
    args:
      - FRONT_CMD=${FRONT_CMD}
      - API_HOSTNAME=${API_HOSTNAME}
  env_file:
    - .env.dev
  networks:
    - internal
  environment:
    - CHOKIDAR_USEPOLLING=true
    - FAST_REFRESH=false
    - NODE_ENV=development
Everything is running behind Traefik. CHOKIDAR_USEPOLLING and FAST_REFRESH seem to make no difference. I start with 'docker-compose --env-file .env.dev up'; within that file FRONT_CMD="npm start", which behaves just fine. .env.dev should be a clear indication of a dev build to React (and is; it works the same without the addition), but I added NODE_ENV just to be safe. I tried adding all of them into the build args just to be super sure, but nothing changes. The React files live in the 'frontend' folder, which is in the same location as docker-compose.yml.
Every time React says it compiled successfully and warns me that it's a development build.
Only suspicion I have left is that there's some issue with updating files with Windows locally while docker uses Linux, but I have no idea where to go from there even if that's the case.
The shortest way I found was to start from the other side: attach the editor to the container instead of updating the container based on changes to files on the host. I followed the guide here: https://code.visualstudio.com/docs/remote/attach-container
I'm having this problem after creating the Oracle Database XE 18.4 image with WSL2. I'm trying to create a container based on this image and I keep receiving these errors, even though I'm doing exactly what this tutorial provided by Oracle asks.
The errors show up when I try to create the container and start it for the first time:
sed: can't read /etc/oratab: No such file or directory
/opt/oracle/runOracle.sh: line 194: /etc/init.d/oracle-xe-18c: No such file or directory
grep: /etc/oratab: No such file or directory
/opt/oracle/checkDBStatus.sh: line 18: oraenv: No such file or directory
#####################################
########### E R R O R ###############
DATABASE SETUP WAS NOT SUCCESSFUL!
Please check output for further info!
########### E R R O R ###############
#####################################
The following output is now a tail of the alert.log:
tail: cannot open '/opt/oracle/diag/rdbms/*/*/trace/alert*.log' for reading: No such file or directory
tail: no files remaining
I would like to know what on earth I could do to solve these errors and get this database running. As a matter of fact, I can't verify whether the files mentioned in the log exist, because I can't even connect to the container: it stays up for only around 5 seconds due to these errors.
I'm using Docker on Windows integrated with WSL2.
My Docker installation info is:
Client: Docker Engine - Community
Cloud integration: 1.0.12
Version: 20.10.5
API version: 1.41
Go version: go1.13.15
Built: Tue Mar 2 20:14:53 2021
OS/Arch: windows/amd64
Context: default
Server: Docker Engine - Community
Engine:
Version: 20.10.5
API version: 1.41 (minimum version 1.12)
Go version: go1.13.15
Git commit: 363e9a8
Built: Tue Mar 2 20:15:47 2021
OS/Arch: linux/amd64
containerd:
Version: 1.4.4
runc:
Version: 1.0.0-rc93
docker-init:
Version: 0.19.0
The Dockerfile for XE v18.4.0 is different from all the others in the same repo: it tries to pull the installer directly from the Oracle web server over https during the image build phase. But the build itself happens inside a container (FROM ...), so in my case it failed because the build container did not have the right internet connectivity (a flaky proxy with https problems). The image was created, but it did not contain the database engine.
I modified the Dockerfile inside OracleDatabase\SingleInstance\dockerfiles\18.4.0 so that instead of using the https origin it does a COPY of an (already downloaded) oracle-database-xe-18c-1.0-1.x86_64.rpm into the image and installs it "locally".
I used as a template the Dockerfile of another version, but it's quite simple.
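The modification might look something like this (a hypothetical fragment, not Oracle's actual Dockerfile; the RPM filename is the one mentioned above, and the /install path is an assumption):

```dockerfile
# Instead of downloading the installer over https during the build,
# COPY the pre-downloaded RPM into the image and install it locally.
COPY oracle-database-xe-18c-1.0-1.x86_64.rpm /install/
RUN yum -y localinstall /install/oracle-database-xe-18c-1.0-1.x86_64.rpm && \
    rm -rf /install
```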
Tucked away in docker-images\OracleDatabase\SingleInstance\README.md is the following:
IMPORTANT: You will have to provide the installation binaries of Oracle Database (except for Oracle Database 18c XE) and put them into the dockerfiles/<version> folder. You only need to provide the binaries for the edition you are going to install. The binaries can be downloaded from the Oracle Technology Network, make sure you use the linux link: Linux x86-64. The needed file is named linuxx64_<version>_database.zip. You also have to make sure to have internet connectivity for yum. Note that you must not uncompress the binaries. The script will handle that for you and fail if you uncompress them manually!
If you download the OracleXE RPM from https://www.oracle.com/uk/database/technologies/xe-downloads.html and copy it into the version directory as described in the doc, when building the Oracle XE docker image, it seems to do a full install of Oracle XE using the downloaded RPM.
If you skip the manual download of the RPM before building the Docker image, it seems the container will be built, but it tries to do the download when run, which for me at least, didn't work.
Once the Docker image has been built, a container can be started using:
docker run --name oraclexe \
-p 51521:1521 \
-p 55500:5500 \
-v [HOST_PATH]:/opt/oracle/oradata \
-e ORACLE_PWD=mysecurepassword \
-e ORACLE_CHARACTERSET=AL32UTF8 \
oracle/database:18.4.0-xe
Where HOST_PATH is the path to the local directory that Oracle will store the database in.
I spent most of the morning trying to figure out not only how to copy an initial SQL dump into the container, but also how to auto-import (execute) the dump into the DB. I have read countless other posts, none of which seem to work. I have the following docker compose file:
version: '3.8'
services:
  db:
    image: mariadb:10.5.8
    restart: always
    container_name: database
    environment:
      MYSQL_ROOT_PASSWORD: default
    volumes:
      - db-data:/var/lib/mysql
      - ./db-init:/docker-entrypoint-initdb.d

volumes:
  db-data:
The SQL dump is in the db-init folder. I got the docker-entrypoint-initdb.d mechanism from the official docs on Docker Hub.
After docker-compose up, the SQL file is correctly copied into docker-entrypoint-initdb.d but is never run against the DB, i.e. the dump is never imported and the DB remains empty.
I have tried moving the volumes directive around in the docker compose file, as suggested in another post. From what I've read, the SQL dump should be imported automatically when the volume is mounted.
Is there no way to accomplish this via the docker-compose.yml only?
Edit: Switching the version to 2.x did not work
EDIT2: Container logs:
2021-02-10 17:53:09+00:00 [Note] [Entrypoint]: /usr/local/bin/docker-entrypoint.sh: running /docker-entrypoint-initdb.d/wordpress.sql
ERROR 1046 (3D000) at line 10: No database selected
From your logs, a quick Google search pointed to this post. Adding MYSQL_DATABASE to the environment should solve the issue; the .sql should then be imported correctly on startup.
Final docker-compose should look like this:
services:
  db:
    image: mariadb:10.5.8
    restart: always
    container_name: database
    environment:
      MYSQL_DATABASE: wordpress
      MYSQL_ROOT_PASSWORD: default
    volumes:
      - db-data:/var/lib/mysql
      - ./db-init:/docker-entrypoint-initdb.d/

volumes:
  db-data:
Maybe not worded as strongly as it should be, but the docs mention this: SQL files will be imported by default to the database specified by the MYSQL_DATABASE variable.
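If you would rather not rely on MYSQL_DATABASE, an alternative sketch is to make the dump itself select its target database (assuming the dump should land in a database named wordpress; the placeholder dump below stands in for the real one):

```shell
# Prepend CREATE DATABASE / USE to the dump so the import has an explicit
# target database; the db-init path mirrors the compose file above.
mkdir -p db-init
printf -- '-- original dump contents stand-in\n' > db-init/wordpress.sql  # placeholder dump
{ printf 'CREATE DATABASE IF NOT EXISTS wordpress;\nUSE wordpress;\n'; cat db-init/wordpress.sql; } > db-init/wordpress.sql.tmp
mv db-init/wordpress.sql.tmp db-init/wordpress.sql
head -n 2 db-init/wordpress.sql
```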
I am studying how to automate the build and deployment of my Google App Engine application with Travis. So far it only lets me use a static, predefined version name for deployment in .travis.yml.
Is there any way to generate it dynamically at runtime? For example, in my .travis.yml below I have deployments for the production and staging versions of the application, labeled production and qa-staging, and I would like to suffix the version names with a timestamp or anything else that would be unique for every successful build and deployment.
language: node_js
node_js:
  - "10"
before_install:
  - openssl aes-256-cbc -K $encrypted_c423808ed406_key -iv $encrypted_c423808ed406_iv -in gae-creds.json.enc -out gae-creds.json -d
  - chmod +x test.sh
  - cat gae-creds.json
install:
  - npm install
script:
  - "./test.sh"
deploy:
  - provider: gae
    skip_cleanup: true
    keyfile: gae-creds.json
    project: traviscicd
    no_promote: true
    version: qa-staging
    on:
      branch: staging
  - provider: gae
    skip_cleanup: true
    keyfile: gae-creds.json
    project: traviscicd
    version: production
    on:
      branch: master
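One possible sketch is to build a unique label from Travis's built-in TRAVIS_BUILD_NUMBER variable plus a UTC timestamp, for example in a before_deploy step (whether the gae provider expands such a variable in its version field is something you would need to verify):

```shell
# Build a unique per-build version label from the Travis build number and a
# UTC timestamp; TRAVIS_BUILD_NUMBER is set by Travis CI, defaulted here only
# so the sketch runs outside a build.
TRAVIS_BUILD_NUMBER="${TRAVIS_BUILD_NUMBER:-0}"
VERSION_LABEL="qa-staging-${TRAVIS_BUILD_NUMBER}-$(date -u +%Y%m%d%H%M%S)"
echo "$VERSION_LABEL"
```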
Have you tried https://yaml.org/type/timestamp.html ?
I'm not sure if the context is correct, but it seems a good and elegant option for your YAML file.
Perhaps you can use go generate to generate a version string that can be included? You need to run go generate as part of the build process for it to work, though.