I've created a Java application with a database running on Derby 10.10.2.0 (JDK 1.7).
The problem: when NetBeans 7.3.1 is open and the database is connected, everything works fine.
But when I compile and build the application, go to the dist folder inside NetBeansProjects, and run the JAR from there, the database won't connect.
This is the code that connects to the database:
import java.sql.Connection;
import java.sql.DriverManager;

String url = "jdbc:derby://localhost:1527/reflet;create=true";
String driver = "org.apache.derby.jdbc.ClientDriver";
String userName = "root";
String password = "admin";

Class.forName(driver).newInstance();
Connection conn = DriverManager.getConnection(url, userName, password);
conn.close();
Hello, I have a localhost WordPress project. The last thing I need to do is connect a database, because I get a message in my browser: "Error establishing a database connection." Can anyone tell me step by step how to do this? Below is what the dev.js config asks for. My project runs on Node.js.
$GLOBALS['DB_HOST'] = 'localhost';
$GLOBALS['DB_NAME'] = 'name';
$GLOBALS['DB_USER'] = 'username';
$GLOBALS['DB_PASSWORD'] = 'password';
$GLOBALS['DB_TABLE_PREFIX'] = 'prefix_';
I have a little problem: I want to deploy a Java Spring app to Cloud Run and connect to Cloud SQL (SQL Server). I know you can connect via Unix socket for MySQL and PostgreSQL (https://cloud.google.com/sql/docs/mysql/connect-run?hl=es-419), but the SQL Server drivers don't support that yet.
Another way is to connect through the Cloud SQL proxy (https://medium.com/@petomalina/connecting-to-cloud-sql-from-cloud-run-dcff2e20152a).
I tried that but couldn't get it to work: when the script is deployed it tells me it is listening on 127.0.0.1 for my instance ID, but when I try to connect, I can't.
Here is my Dockerfile:
# Use the official maven/Java 8 image to create a build artifact.
# https://hub.docker.com/_/maven
FROM maven:3.5-jdk-8-alpine as builder
# Copy local code to the container image.
WORKDIR /app
COPY pom.xml .
COPY src ./src
COPY ohJpo-2.1.0.jar .
# download the cloudsql proxy binary
# RUN wget https://dl.google.com/cloudsql/cloud_sql_proxy.linux.amd64 -O ./build/cloud_sql_proxy
# RUN chmod +x ./build/cloud_sql_proxy
COPY cloud_sql_proxy /build/cloud_sql_proxy
RUN chmod +x /build/cloud_sql_proxy
# copy the wrapper script and credentials
COPY run.sh /build/run.sh
COPY credentials.json /build/credentials.json
# Build a release artifact.
RUN mvn install:install-file -Dfile=/app/ohJpo-2.1.0.jar -DgroupId=ovenfo -DartifactId=ohJpo -Dversion=2.1.0 -Dpackaging=jar
RUN mvn package -DskipTests
# Use AdoptOpenJDK for base image.
# It's important to use OpenJDK 8u191 or above that has container support enabled.
# https://hub.docker.com/r/adoptopenjdk/openjdk8
# https://docs.docker.com/develop/develop-images/multistage-build/#use-multi-stage-builds
FROM adoptopenjdk/openjdk8:jdk8u202-b08-alpine-slim
RUN /build/cloud_sql_proxy -instances=idInstanceID=tcp:1433 -credential_file=/build/credentials.json & sleep 10
COPY --from=builder /app/target/helloworld-*.jar /helloworld.jar
# Run the web service on container startup.
CMD ["java","-Djava.security.egd=file:/dev/./urandom","-Dserver.port=${PORT}","-jar","/helloworld.jar"]
In my Java application I connect this way; on my local PC it works with the proxy:
@GetMapping("/pruebacuatro")
String pruebacuatro() {
    Map<String, String> config = new HashMap<String, String>();
    config.put("type", "SQLSERVER");
    config.put("url", "127.0.0.1");
    config.put("db", "bd");
    config.put("username", "user");
    config.put("password", "pass");

    Object data = null;
    Jpo miJpo = null;
    try {
        miJpo = new Jpo(config);
        Procedure store = miJpo.procedure("seg.menu_configuraciones");
        data = store.ejecutar();
    } catch (Exception e) {
        e.printStackTrace();
    } finally {
        if (miJpo != null) {
            try {
                miJpo.finalizar();
            } catch (SQLException e) {
                e.printStackTrace();
            }
        }
    }
    return "Contents json: " + (new Gson().toJson(data));
}
I want to connect using the public or private IP of my SQL Server instance, but I can't find information about that either. Do you have any suggestions?
The Cloud SQL proxy works in 2 modes: Unix socket and TCP.
When you use it on your computer, you use TCP mode and you can connect to it on the localhost IP. With Cloud Run, however, Unix socket mode is used, and there is no SQL Server client that can use that connection mode.
Thus, you have to use the Cloud SQL IP to connect your Cloud Run service to your Cloud SQL instance.
For your local tests, continue to use the Cloud SQL proxy in TCP mode.
For Cloud Run, I recommend using the private IP of your SQL Server instance:
Expose your instance in your VPC
Create a serverless VPC connector in the correct region
Attach the Serverless VPC connector to your Cloud Run service
Use the Cloud SQL private IP in your code to connect to your DB.
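The steps above can be sketched with gcloud commands roughly like the following (connector name, region, IP range, and service/image names are placeholders; check the flags against the current gcloud documentation):

```shell
# Create a serverless VPC connector in the same region as the Cloud Run service
gcloud compute networks vpc-access connectors create sql-connector \
    --region=us-central1 \
    --network=default \
    --range=10.8.0.0/28

# Attach the connector when deploying the Cloud Run service
gcloud run deploy my-service \
    --image=gcr.io/my-project/helloworld \
    --vpc-connector=sql-connector
```

With the connector attached, the application can reach the Cloud SQL private IP directly on port 1433, so the in-container proxy is no longer needed.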
I wanted to do a real-time deployment of my model on Azure, so I planned to create an image that first queries an ID in an Azure SQL DB to get the required features, then predicts using my model and returns the predictions. The error I get from the pyodbc library is that the drivers are not installed.
I tried to establish the connection from the Azure ML Jupyter notebook and found that no drivers are installed in that environment. After some research I found that I should create a Docker image and deploy it there, but I still met with the same results.
driver = '{ODBC Driver 13 for SQL Server}'
cnxn = pyodbc.connect(
    'DRIVER=' + driver +
    ';SERVER=' + server +
    ';PORT=1433;DATABASE=' + database +
    ';UID=' + username +
    ';PWD=' + password +
    ';Encrypt=yes;TrustServerCertificate=no;Connection Timeout=30;')
('01000', "[01000] [unixODBC][Driver Manager]Can't open lib 'ODBC
Driver 13 for SQL Server' : file not found (0) (SQLDriverConnect)")
I want the result of the query, but instead I get this message.
Alternatively, you could use pymssql==2.1.1 if you add the following Docker steps to the deployment configuration (using either Environments or ContainerImages; Environments are preferred):
from azureml.core import Environment
from azureml.core.environment import CondaDependencies
conda_dep = CondaDependencies()
conda_dep.add_pip_package('pymssql==2.1.1')
myenv = Environment(name="mssqlenv")
myenv.python.conda_dependencies=conda_dep
myenv.docker.enabled = True
myenv.docker.base_dockerfile = 'FROM mcr.microsoft.com/azureml/base:latest\nRUN apt-get update && apt-get -y install freetds-dev freetds-bin vim gcc'
myenv.docker.base_image = None
Or, if you're using the ContainerImage class, you could add these Docker Steps
from azureml.core.image import Image, ContainerImage
image_config = ContainerImage.image_configuration(runtime= "python", execution_script="score.py", conda_file="myenv.yml", docker_file="Dockerfile.steps")
# Assuming this :
# RUN apt-get update && apt-get -y install freetds-dev freetds-bin vim gcc
# is in a file called Dockerfile.steps, it should produce the same result.
See this answer for more details on how I've done it using an Estimator step and a custom Docker container. You could use this Dockerfile to locally create a Docker container for that Estimator step (no need to do that if you're just using an Estimator run outside of a pipeline):
FROM continuumio/miniconda3:4.4.10
RUN apt-get update && apt-get -y install freetds-dev freetds-bin gcc
RUN pip install Cython
For more details, see this posting: using estimator in pipeline with custom docker images. Hope that helps!
In my experience, I think the comment from @DavidBrowne-Microsoft is right.
There is a similar SO thread, "I am getting an error while connecting to a SQL DB in Jupyter Notebook", answered by me, which I think will help you install the latest msodbcsql driver for Linux on an Azure Notebook or in Docker.
Meanwhile, there is a detail about the connection string for Azure SQL Database which you need to note carefully: you should use {ODBC Driver 17 for SQL Server} instead of {ODBC Driver 13 for SQL Server} if your Azure SQL Database was created recently (ignore the connection string shown in the Azure portal).
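A minimal sketch of the corrected connection string with Driver 17 (the server, database, and credential values are placeholders for your own):

```python
# Build a pyodbc-style connection string using ODBC Driver 17 for SQL Server.
# All server/database/credential values below are placeholders.
driver = '{ODBC Driver 17 for SQL Server}'
server = 'yourserver.database.windows.net'
database = 'yourdb'
username = 'youruser'
password = 'yourpassword'

conn_str = (
    f'DRIVER={driver};SERVER={server};PORT=1433;DATABASE={database};'
    f'UID={username};PWD={password};'
    'Encrypt=yes;TrustServerCertificate=no;Connection Timeout=30;'
)
print(conn_str)
# pyodbc.connect(conn_str) would then open the connection,
# provided msodbcsql17 is installed in the image.
```

The only change from the question's code is the driver name; the rest of the string keeps the same keywords.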
You can use the AzureML built-in Dataset solution to connect to your SQL Server.
To do so, first create an azure_sql_database datastore (reference here).
Then create a dataset by passing in the datastore you created and the query you want to run (reference here).
Sample code:
from azureml.core import Dataset, Datastore, Workspace

workspace = Workspace.from_config()
sql_datastore = Datastore.register_azure_sql_database(
    workspace=workspace,
    datastore_name='sql_dstore',
    server_name='your SQL server name',
    database_name='your SQL database name',
    tenant_id='your directory ID/tenant ID of the service principal',
    client_id='the Client ID/Application ID of the service principal',
    client_secret='the secret of the service principal')
sql_dataset = Dataset.Tabular.from_sql_query((sql_datastore, 'SELECT * FROM my_table'))
You can also do it via the UI at ml.azure.com, where you can register an Azure SQL datastore using your user name and password.
I am new to Hadoop HDFS and Sqoop.
I installed Hadoop HDFS on one machine as a single node. I am able to put and get files to/from HDFS, respectively.
Requirement [do not want to install Hadoop and Sqoop on the client machine]:
I want to access HDFS from a different machine using WebHDFS without installing Hadoop on the client machine, and that part is working fine.
To access HDFS, I am using the WebHDFS Java client JAR.
Now I want to execute Sqoop's export/import commands against the remote HDFS.
Case: export to the local file system, where neither Hadoop nor Sqoop is installed; we are using only the Hadoop and Sqoop client JARs.
public int importToHdfs(String tablename, String stmpPath) {
    int resultcode = -1;
    try {
        String s = File.separator;
        String outdir = stmpPath.substring(0, stmpPath.lastIndexOf(s));
        String[] str = {"import"
            , "--connect", jdbcURL
            , "--table", tablename
            , "--username", sUserName, "--password", sPassword
            , "-m", "1"
            , "--target-dir", "/tmp/user/hduser"
            , "--outdir", outdir
        };
        Configuration conf = new Configuration();
        resultcode = Sqoop.runTool(str, conf);
    } catch (Exception ex) {
        System.out.print(ex.toString());
    }
    return resultcode;
}
I am trying to access MS SQL Server 2005 from a servlet. I am using the JDBC 4.0 driver.
I have already added the JAR files sqljdbc.jar and sqljdbc4.jar to my Tomcat /lib folder.
But while running the code I am getting an error:
HTTP Status 500 - Java Runtime Environment (JRE) version 1.7 is not supported by this driver. Use the sqljdbc4.jar class library, which provides support for JDBC 4.0.
How is this caused and how can I solve it?
My code is:
Class.forName("com.microsoft.sqlserver.jdbc.SQLServerDriver");
Connection conn = DriverManager.getConnection("jdbc:sqlserver://localhost;databaseName=school;user=sa;password=123");
PrintWriter pwOut = res.getWriter();
pwOut.println("Connected");
Statement st = conn.createStatement();
String searchCriteria = req.getParameter("txtSearch");
ResultSet rs = st.executeQuery("select * from student");
res.setContentType("text/html");
The error message is pretty clear: Tomcat is using the wrong driver.
You state that you copied sqljdbc.jar and sqljdbc4.jar into the Tomcat lib folder. That is most probably the cause of your problem.
You only need sqljdbc4.jar; otherwise Tomcat picks up the wrong one.
Try deleting sqljdbc.jar from the Tomcat lib folder.
Here is my code to connect Java to Microsoft SQL Server 2012.
You only need sqljdbc4.jar, which is available on the official Microsoft website. Here is the link:
http://download.microsoft.com/download/0/2/A/02AAE597-3865-456C-AE7F-613F99F850A8/sqljdbc_4.0.2206.100_enu.exe
It contains 2 JAR files, and I am using sqljdbc4.jar. This is the code I use to connect:
package com.Sql.ConnectDB;

import java.sql.*;

public class DbClass {

    public static void main(String[] args) {
        try {
            String url = "jdbc:sqlserver://localhost;databaseName=Student"; // important
            String user = "username";
            String pass = "password";
            Class.forName("com.microsoft.sqlserver.jdbc.SQLServerDriver"); // important
            Connection con = DriverManager.getConnection(url, user, pass);
            System.out.println("Connected Successfully");
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}