Connecting Apache Superset to an external database

I am running Apache Superset on Docker, and I have been trying to connect to an external database (Postgres) using the example from the SQLAlchemy docs for connecting to a Postgres database (postgresql://scott:tiger@localhost/mydatabase // postgresql://username:password@localhost:5433/postgres). However, I keep getting the following error: Connection failed, please check your connection settings. Could someone please help me with this?

Are you sure that your Postgres is on the same network (localhost)? For an external database, it would likely be on another host (and therefore you would use its IP address).
If these are the docs you are looking at --> https://docs.sqlalchemy.org/en/12/core/engines.html#database-urls
then you might want to think in terms of 'host', meaning an IP(v4) address and/or a DNS name.

As was recommended, you may need to whitelist the Superset host's IP address in pg_hba.conf.
You may also need to check that the right driver (e.g. psycopg2) is installed in the Docker container that is running Superset.
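If you want to rule out driver and network problems separately from the Superset UI, a minimal sketch along these lines can help (run it inside the Superset container, e.g. with docker exec; SQLAlchemy and psycopg2 are assumed to be installed, and the user, password, host, and port below are placeholders):

# quick_pg_check.py - a minimal connectivity sketch, not Superset-specific
from sqlalchemy import create_engine, text

# Placeholder URI: from inside a container, "localhost" means the container itself.
# On Docker Desktop, host.docker.internal usually points at the Docker host;
# on Linux, use the database server's IP address instead.
uri = "postgresql+psycopg2://username:password@host.docker.internal:5433/postgres"

engine = create_engine(uri)
with engine.connect() as conn:
    print(conn.execute(text("SELECT version()")).scalar())

If this script connects but Superset still fails, the problem is most likely the URI entered in the UI rather than the network or the driver.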

Related

Uploading Databases

How does one go about uploading a database like Apache Cassandra after creating one? Furthermore, is there a way to upload/share only its skeleton structure, without the data gathered in it? I'm on MacOS and would like to use Python to do all of this. Thank you!
Based on your second comment, I take it to mean you want the database to be remotely accessible to clients/apps that are not installed locally.
Clients/apps connect to Cassandra on the IP address set for rpc_address and the CQL port set for native_transport_port (default is 9042) set in cassandra.yaml.
You mentioned that your Cassandra instance is running on your laptop, so only clients/apps on your local network will be able to access it, and only if you configure rpc_address to an IP address reachable on that network (the default is localhost).
If you're just trying out Cassandra and want to collaborate with other developer friends, try Astra and launch a Cassandra instance on the free tier (no credit card required). With it you can share the database credentials with your friends and they can connect to it over the internet.
You can connect to Astra from your Python app using the Python driver. Otherwise, Astra includes Stargate.io pre-configured and ready to use. Stargate is a data access gateway that lets you connect to Cassandra from your app using REST API, GraphQL API or JSON/Doc API without having to learn CQL. For more info, see Connecting to your Astra database. Cheers!
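As a rough illustration of the driver-based approach, here is a minimal sketch using the DataStax Python driver (pip install cassandra-driver). The contact point and port are placeholders taken from the defaults mentioned above; for Astra you would instead pass the secure connect bundle and token credentials as described in the Astra docs.

# cassandra_check.py - minimal connection sketch with the Cassandra Python driver
from cassandra.cluster import Cluster

# Placeholders: use the rpc_address and native_transport_port from cassandra.yaml.
cluster = Cluster(contact_points=["127.0.0.1"], port=9042)
session = cluster.connect()

# List the keyspaces to confirm the connection works.
rows = session.execute("SELECT keyspace_name FROM system_schema.keyspaces")
for row in rows:
    print(row.keyspace_name)

cluster.shutdown()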

MediaWiki installation issue - port problems

I am trying to install MediaWiki version 1.31 locally and I have run into some issues that I can't get past. Mainly, when I input the database connection information (I am trying to connect to a PostgreSQL database), it returns this error.
The thing is, the port I am trying to connect to is 5433, not 5432; also, the names "template1" and "postgres" are not included in my input through the dialogue screen - I don't know where they came from. "test1" is the name of the database I am trying to connect to.
Any help or advice on how to get through this error would be greatly appreciated. Thank you.
The fact that the port you specify is ignored while the database schema is being set up is a long-standing known bug. One workaround is to run your database on the default port until you have the wiki set up, then change it back to the port you want.
In order to create a new database, you need to connect to an existing database in the same cluster. 'template1' and 'postgres' are pre-existing databases (usually created at the time the cluster was created) commonly used to connect to in order to create a new database. These names are "well-known", you don't need to specify them.
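To see what the installer is doing, you can reproduce the same step by hand: connect to one of those pre-existing databases and create the new one from there. A minimal sketch with psycopg2 (the user and password are placeholders; the database name and the non-default port 5433 are taken from the question):

# create_wiki_db.py - create a new database by connecting to an existing one first
import psycopg2

# Connect to the pre-existing "postgres" database on the non-default port.
conn = psycopg2.connect(host="localhost", port=5433, dbname="postgres",
                        user="wikiuser", password="secret")
conn.autocommit = True  # CREATE DATABASE cannot run inside a transaction block

with conn.cursor() as cur:
    cur.execute("CREATE DATABASE test1 OWNER wikiuser")

conn.close()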

Google Data Studio MySql data source connection does not exist Error

Platform: Google Data Studio
Data Source: MySQL
The connection was working before, meaning there are no issues with credentials. All of a sudden, I am getting the below error. All IPs from the Google Data Studio list of IPs have been whitelisted. The only thing that comes to mind is a limitation of GDS to process data: the data source table has around 200K+ rows, and I am not sure what the limitation for GDS with MySQL is; there is no indication anywhere.
Anyone out there who can help solve this or provide some info would be appreciated. Thanks
If you use a firewall, be sure to double-check the Google IP addresses. They may have added new IPs (in my case, the last one was missing).
Check them here!
After doing so, I had to change the host name of the database connection to a URL alias (www.yourserver.com, i.e. a URL pointing at your server), and then change it back to the IP to make it work.
Sounds like the connector cannot establish a new connection.
Cloud SQL Connector:
At the time of writing this, the connector seems unable to establish a new connection once the existing one has timed out, and modifying the JDBC URL to include query parameters gives you an error when authenticating.
This is probably due to the connector appending its own parameters.
(Seems to be a possible bug here when a connection no longer exists.)
MySQL Connector (with IP address):
This connector allows you to add query parameters to the JDBC URL. Enable SSL and append useSSL=true to the URL, e.g.
jdbc:mysql://<ip>/<database>?useSSL=true
This worked as expected and establishes new connections when required.
Example Source Setup
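If you want to confirm outside of Data Studio that the instance accepts SSL connections with the same credentials, a small check along these lines can help separate a database-side problem from a connector-side one (this assumes the PyMySQL package; the host, credentials, database name, and CA path are placeholders):

# check_mysql_ssl.py - verify the MySQL instance is reachable over SSL
import pymysql

conn = pymysql.connect(
    host="203.0.113.10",        # placeholder: your MySQL / Cloud SQL public IP
    user="datastudio",          # placeholder credentials
    password="secret",
    database="analytics",
    ssl={"ca": "/path/to/server-ca.pem"},  # enable SSL; path is a placeholder
)
with conn.cursor() as cur:
    cur.execute("SELECT NOW()")
    print(cur.fetchone())
conn.close()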
Suffering from this issue too; my experience is that using the MySQL connector instead of the Cloud SQL connector provides better stability, in combination with setting wait_timeout to a value above 12 hours.
This issue has been reported on the official Google Data Studio bug tracker. Please vote them up if you are also suffering from this!
🐛 130205306 MySQL connection does not exist Apr 9, 2019 04:36PM
🐛 118470083 Data source password not stored for MySQL sources. Oct 26, 2018 01:24PM

Reasons for "The network path was not found" in ASP.NET MVC

I made an ASP.NET MVC web application and uploaded the files as well as the database, but I get the following error when browsing it:
The network path was not found
I'm using Entity Framework and this is the connection string in my web.config file
<connectionStrings>
<add name="[mydatabase]Entities" connectionString="metadata=res://*/Models.Model1.csdl|res://*/Models.Model1.ssdl|res://*/Models.Model1.msl;provider=System.Data.SqlClient;provider connection string="data source=sql.[somedomain].net;initial catalog=[mydatabase name];User ID=[myUsername];Password=[myPassword];MultipleActiveResultSets=True;App=EntityFramework"" providerName="System.Data.EntityClient" />
</connectionStrings>
I've uploaded many websites using an IP address, e.g. xxx.xxx.x.xxx, as the data source, but this is the first time I'm using a server name, e.g. sql.[somedomain].net (I cannot get the server IP), so I'm not sure if this caused the error or if I should do something special to make it work.
So, I'm asking if there is something I should do to use a server name as the data source; if not, then what else may cause this error?
This answer doesn't really help.
Thanks in advance.
Update
If I ping the server sql.[somedomain].net, I get this result
Ping request could not find host sql.[somedomain].net. Please check the name and try again.
And if I nslookup it, I get this result
*** Unknown can't find sql.[somedomain].net: Non-existent domain
So does that mean - for sure - that the server is not accessible? And is there anything I can do besides contacting the hosting provider's technical support?
Solved
It was the hosting provider's error/misconfiguration. After 3 days of searching and contacting customer support, they realized it was their issue. I'm leaving this question up to tell future viewers to host only with reliable, well-known hosting providers, no matter what.
The info you got from tech support seems fairly contradictory. It's common to disable remote access for database servers, but if that's the case, then using the domain to connect doesn't really help you.
If you're trying to connect to this database from a published MVC project hosted with the same provider that supplies your database, then you should have no issues connecting, as you're no longer "remote". However, unless your DNS is also hosted at the same provider, using the domain may make the connection appear remote, since it goes outside and comes back in. The safest bet is to simply use the IP address of the database server in the connection string.
Also, pay attention to the IP address you have for the database server. If it's in the 10.* range or the 172.16.*-172.31.* range, it's a local (private) IP, but if it's something else, it's most likely an outside IP address. Trying to connect to such an address may also make the connection appear remote, as you're going outside the network to come back in. Also, while disallowing remote access to a database server is a good idea, you can generally safely allow remote access to certain IPs. You most likely don't have control over that directly, but you can check with your provider to see if they can add an explicit rule for your web server's IP so that even if the connection comes through as remote, it'll still work.
As far as working locally in development, you'll just have to use a local database. You may already have that covered, but your question wasn't entirely clear on that aspect.
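Since ping and nslookup already fail for sql.[somedomain].net, a quick way to confirm from the web server itself whether the name resolves and the database port is reachable is a small check like the one below (a diagnostic sketch only; the host name mirrors the question and 1433 is assumed as the default SQL Server port):

# reachability_check.py - does the database host resolve, and is the port open?
import socket

host = "sql.somedomain.net"  # placeholder for the real host name
port = 1433                  # default SQL Server port (assumed)

try:
    ip = socket.gethostbyname(host)
    print(f"{host} resolves to {ip}")
    with socket.create_connection((ip, port), timeout=5):
        print(f"TCP connection to {ip}:{port} succeeded")
except socket.gaierror:
    print(f"DNS lookup for {host} failed - same symptom as the nslookup error")
except OSError as exc:
    print(f"Port {port} not reachable: {exc}")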
I also faced the same issue. I had used a forward slash instead of a backslash in the instance name (clustername\instancename). Once I changed it to a backslash, it worked fine.

Connecting to Google Cloud SQL from Eclipse Not Using App Engine

We are trying to connect to Google Cloud SQL from Eclipse using the Database Development perspective. To do so I'm trying to add a new Database Connection, which I was able to do successfully for a local MySQL instance running on my machine.
The motivation for doing this is that we currently run our JUnit tests against the local instance. However, we are switching to Hibernate and want to make sure that all of our configuration files work with Cloud SQL. As a general guide I've been using:
https://developers.google.com/appengine/articles/using_hibernate
We're diverging slightly in that we're using hibernate.cfg.xml instead of persistence.xml, but I don't think this will actually have a bearing on the current issue of simply connecting to the database. From another answer as well as some Google documentation I'm aware that I can't use the com.google.appengine.api.rdbms.AppEngineDriver, because that needs to be run from an AppEngine instance. Instead I'm trying to follow the directions here:
https://developers.google.com/cloud-sql/docs/external
and am using com.mysql.jdbc.Driver.
I have assigned my Cloud SQL instance an IP address and have added my current IP address to the whitelist, as described here:
https://developers.google.com/cloud-sql/docs/access-control#appaccess
My driver is the Connector/J driver I've been using successfully with the local instance, and the URL I'm using is:
jdbc:google:rdbms://my-app:my-cloud-sql-instance/myDatabase
which I got based on:
https://developers.google.com/appengine/articles/using_hibernate
After adding the connection and setting the information I click Test Connection, which worked successfully on my local instance. However, this throws the following error:
java.lang.Exception: Connection failed with unspecified error.
at org.eclipse.datatools.connectivity.DriverConnectionBase.internalCreateConnection(DriverConnectionBase.java:110)
at org.eclipse.datatools.connectivity.DriverConnectionBase.open(DriverConnectionBase.java:54)
at org.eclipse.datatools.connectivity.drivers.jdbc.JDBCConnection.open(JDBCConnection.java:73)
at org.eclipse.datatools.enablement.internal.mysql.connection.JDBCMySQLConnectionFactory.createConnection(JDBCMySQLConnectionFactory.java:28)
at org.eclipse.datatools.connectivity.internal.ConnectionFactoryProvider.createConnection(ConnectionFactoryProvider.java:83)
at org.eclipse.datatools.connectivity.internal.ConnectionProfile.createConnection(ConnectionProfile.java:359)
at org.eclipse.datatools.connectivity.ui.PingJob.createTestConnection(PingJob.java:76)
at org.eclipse.datatools.connectivity.ui.PingJob.run(PingJob.java:59)
at org.eclipse.core.internal.jobs.Worker.run(Worker.java:53)
Obviously this isn't very helpful.
I've tried fiddling with the url, tried a number of users (none of which require passwords, so I'm leaving the password fields blank), and different versions of the driver for different versions of MySQL. Nothing has worked.
There are perhaps more deep-seated issues with doing it this way, such as how I will easily switch between test and deployment versions of my hibernate.cfg.xml, and I don't have good answers. I was just planning on editing them by hand back to the AppEngineDriver, which means I might run into further configuration issues at that point even if the JUnit tests are passing. Nevertheless, I think getting a connection set up to Cloud SQL that will allow JUnit testing will be a step in the right direction. I'd appreciate any input!
You should use jdbc:mysql://<cloudsql-instance-ip>:3306/<database-name> to connect from an external network. The connection string you are using is to connect from Google App Engine.
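Before fighting with the Eclipse connection profile, it can also be worth confirming that the instance's external IP actually accepts your whitelisted client IP and credentials. A minimal, tool-agnostic sketch (Python with PyMySQL here purely because it is quick to run; the IP, user, and database name are placeholders):

# cloudsql_external_check.py - verify the whitelisted client IP can reach the instance
import pymysql

conn = pymysql.connect(
    host="203.0.113.25",   # placeholder: the Cloud SQL instance's assigned IP
    port=3306,
    user="root",           # placeholder user; adjust the password as needed
    password="",
    database="myDatabase",
)
with conn.cursor() as cur:
    cur.execute("SELECT VERSION()")
    print(cur.fetchone())
conn.close()

If this connects but Eclipse still fails, the problem is in the DTP connection profile (driver JAR or URL) rather than in the Cloud SQL access configuration.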
