Oracle public server - database

I want to learn about Oracle and try some queries and other SQL features of the Oracle database, but I don't want to install it and mess with all the related issues. So my question is: is there any publicly available Oracle server that I can connect to through a terminal and play with?
I mean a service where I can register and have some space allocated to my profile.

Take a look at: http://apex.oracle.com/

The only thing I can think of is SQLFiddle: http://sqlfiddle.com/
But it won't let you have a "private" space. You need to re-create your schema each time (but you can bookmark your script which might be enough for you).

You could also try one of the pre-built virtual appliances - see
http://www.oracle.com/technetwork/community/developer-vm/index.html

If you need direct database access, you can run it in a Docker instance:
docker run -d -p 1521:1521 -p 8080:8080 alexeiled/docker-oracle-xe-11g
Then connect to it with sqlplus
sqlplus system/oracle@localhost:1521/xe
See here for more passwords, info on apex, etc.
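If you don't have an Oracle client installed on the host, you can also try running sqlplus from inside the container itself; a small sketch, assuming sqlplus is on the PATH inside this particular image:
docker ps                                            # find the container name or ID
docker exec -it <container-id> sqlplus system/oracle@//localhost:1521/xe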

Just came across this: Oracle Live SQL. It is browser-based, so there is nothing to install locally, but you need to have an Oracle account.
Browser-based SQL worksheet access to an Oracle database schema

Related

How to get a SQL dump for development in Shopware 6?

We are developing a customers shop and want to develop locally, but without all the user / order data.
How can we achieve this?
The question has those points:
How to create such dumps?
How to import them?
How to manage storage / make them easily accessible for developers?
You can use the Smile SA gdpr-dump tool to create an anonymized dump, which is fine for development.
This can either be stored in Git, downloaded from the production server, or you might set up another server to store those dumps.
There is also a nice script from Kellerkinder especially for Shopware 6: https://github.com/kellerkinderDE/shopware6-database-dump
To import, just use mysql --force -u <user> -p my-database < dump.sql
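Putting it together, a minimal sketch of the whole flow, assuming gdpr-dump is installed via Composer on the production server and writes to stdout by default; host names, paths and database names are placeholders:
# 1) On the production server: create an anonymized dump
vendor/bin/gdpr-dump gdpr-dump.yaml | gzip > /tmp/shopware-dev.sql.gz
# 2) On your machine: download it
scp deploy@prod.example.com:/tmp/shopware-dev.sql.gz .
# 3) Import into your local database
gunzip -c shopware-dev.sql.gz | mysql --force -u <user> -p shopware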

How to copy a database from server to localhost

I'm trying to copy a database from a server (to which I'm connected through ssh) to my localhost. But all that I find is using the copyDatabase() method which is now deprecated, and the documentation doesn't explain how to do something similar (Or I didn't understand how to)
Also, I'd like to know how can I generalize that to also copy a DB from atlas if it's possible.
If you are using MongoDB, then it goes like this:
Step 1: create an SSH tunnel
ssh username@yourdomainOrIP -L 27017:localhost:27017
Step 2: connect and copy the database
mongo
use admin
db.copyDatabase(<fromdb>,<todb>,"localhost:27017",<username>,<password>)
mongodump dumps either the whole database or a specific collection
mongorestore restores it into your local database
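For the mongodump/mongorestore route, a minimal sketch (host, credentials and database names are placeholders; the tunnel uses local port 27018 so it doesn't clash with a mongod already running on your machine):
# 1) Tunnel the remote MongoDB to local port 27018
ssh username@yourdomainOrIP -N -L 27018:localhost:27017 &
# 2) Dump the remote database through the tunnel
mongodump --host localhost --port 27018 --username <user> --password <pass> --authenticationDatabase admin --db mydb --out /tmp/dump
# 3) Restore into your local MongoDB (--drop replaces existing collections)
mongorestore --port 27017 --db mydb --drop /tmp/dump/mydb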

How to push data from local SQL Server to Tableau Server on AWS

We are developing Tableau dashboards and deploying the workbooks on an EC2 Windows instance in AWS. One of the data sources is the company SQL Server inside the firewall. The server is managed by IT and we only have read permission to one of the databases. The current solution is to build the workbook locally in Tableau Desktop by connecting to the company SQL Server. Before publishing the workbooks to Tableau Server, the data is extracted from the data sources, and the static extracts get uploaded with the workbooks when they are published.
Instead of linking to static extracted data on Tableau Server, we would like to set up a database on AWS (e.g. PostgreSQL), probably on the same instance, and push the data from the company SQL Server to the AWS database.
There may be a way to push directly from SQL Server to Postgres on AWS, but since we don't have much control of the server, and the IT folks are probably not willing to push data externally, this is not an option. What I can think of is as follows:
Set up Postgres on AWS instance and create the tables with same schemas as the ones in SQL server.
Extract data from SQL server and save as CSV files. One table per file.
Enable file system sharing on the AWS Windows instance, so the instance can read files from the local file system directly.
Load data from CSV to Postgres tables.
Set up the data connection on Tableau Server on AWS to read data from Postgres.
I don't know if others have come across a situation like this and what their solutions are, but I think this is not an uncommon scenario. One change would be to have both the local Tableau Desktop and the AWS Tableau Server connect to Postgres on AWS. Not sure if local Tableau could access Postgres on AWS though.
We also want to automate the whole process as much as possible. On the local server, I can probably run a Python script as a cron job to frequently export data from SQL Server and save it to CSVs. On the server side, something similar would run to load the data from CSV into Postgres. If the files are big, though, it may be pretty slow to import the data from CSV into Postgres. But there seems to be no better way to transfer files from local to the AWS EC2 instance programmatically, since it is a Windows instance.
I am open to any suggestions.
A. Platform choice
If you use a database other than SQL Server on AWS (say Postgres), you need to perform one (or maybe two) conversions:
In the integration from the on-prem SQL Server to the AWS database, you need to map SQL Server data types to Postgres data types
I don't know much about Tableau, but if it is currently pointing at SQL Server, you probably need some kind of conversion to point it at Postgres
These two steps alone might make it worth your while to investigate a SQL Express RDS. SQL Express has no licensing cost, but obviously Windows does. You could also run SQL Express on Linux, which would have no licensing costs, but would require a lot of fiddling about to get running (i.e. I doubt there is a SQL Express Linux RDS available)
B. Integration Approach
Any process external to your network (i.e. in the cloud) that pulls data from your network will need the firewall opened. Assuming this is not an option, that leaves us only with push-from-on-prem options
Just as an aside on this point, Power BI achieves its desktop data integration by using a desktop 'gateway' that coordinates data transfer, meaning that cloud Power BI doesn't need an open port to get what it needs; it uses the desktop gateway to push it out
Given that we only have push options, we need something on-prem to push data out. Yes, this could be a cron job on Linux or a Windows scheduled task. Please note, this is where you start creating shadow IT
To get data out of SQL Server to be pushed to the cloud, the easiest way is to use BCP.EXE to generate flat files. If these are going into a SQL Server, they should be in native format (to save complexity). If they are going to Postgres, they should be tab delimited
If the files are being uploaded to SQL Server, then it's just another BCP command to push the native files into tables in SQL Server (prior to this you need to run a SQLCMD.EXE command to truncate the target tables)
So for three tables, assuming you'd installed the free* SQL Server client tools, you'd have a batch file something like this:
REM STEP 1: Clear staging folder
DEL /Q C:\Staging\*.TXT
REM STEP 2: Generate the export files
REM -T = trusted connection (Windows auth), -N = native format
BCP database.dbo.Table1 OUT C:\Staging\Table1.TXT -T -S LocalSQLServer -N
BCP database.dbo.Table2 OUT C:\Staging\Table2.TXT -T -S LocalSQLServer -N
BCP database.dbo.Table3 OUT C:\Staging\Table3.TXT -T -S LocalSQLServer -N
REM STEP 3: Clear target tables
REM Your SQL RDS is unlikely to support single sign on
REM so need to use user/pass here
SQLCMD -U username -P password -S RDSSQLServerName -d databasename -Q"TRUNCATE TABLE Table1; TRUNCATE TABLE Table2; TRUNCATE TABLE Table3;"
REM STEP 4: Push data in
BCP database.dbo.Table1 IN C:\Staging\Table1.TXT -U username -P password -S RDSSQLServerName -N
BCP database.dbo.Table2 IN C:\Staging\Table2.TXT -U username -P password -S RDSSQLServerName -N
BCP database.dbo.Table3 IN C:\Staging\Table3.TXT -U username -P password -S RDSSQLServerName -N
(I'm pretty sure that BCP and SQLCMD are free... not sure but you can certainly download the free SQL Server tools and see)
If you wanted to push to Postgres instead:
in step 2, you'd replace the -N option with -c (character format), which makes the file plain text, tab delimited, readable by anything
in step 3 and step 4 you'd need to use the associated Postgres command line tool (psql), but you'd need to deal with data types etc. (which can be a pain - ambiguous date formats alone are always a huge problem); see the sketch below
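For example, a rough sketch of steps 3 and 4 against Postgres using psql's \copy; host, credentials and table names are placeholders, and the target tables must already exist with matching columns:
REM Clear the target table, then bulk load the tab-delimited file
REM (NULL markers and date formats may still need extra handling)
SET PGPASSWORD=password
psql -h AWSPostgresHost -U username -d databasename -c "TRUNCATE TABLE table1;"
psql -h AWSPostgresHost -U username -d databasename -c "\copy table1 FROM 'C:\Staging\Table1.TXT' WITH (FORMAT text)"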
Also note here the AWS RDS instance is just another database with a hostname, login, password. The only thing you have to do is make sure the firewall is open on the AWS side to accept incoming connections from your IP Address
There are many more layers of sophistication you can build into your integration: differential replication, retries etc. but given the 'shadow IT status' this might not be worth it
Also be aware that I think AWS charges for data uploads, so if you are replicating a 1G database everyday, that's going to add up. (Azure doesn't charge for uploads but I'm sure you'll pay in some other way!)
For this type of problem I would strongly recommend use of SymmetricDS - https://www.symmetricds.org/
The main caveat is that the SQL Server would require the addition of some triggers to track changes but at that point SymmetricDS will handle the push of the data.
An alternative approach, similar to what you suggested, would be to have a script export the data into CSV files, upload them to S3, and then have a bucket event trigger on the S3 bucket that kicks off a Lambda to load the data when it arrives.
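For the upload side of that approach, a minimal sketch using the AWS CLI (bucket name and paths are placeholders; an S3 event notification on the bucket would then trigger the loader Lambda, which is not shown here):
aws s3 cp C:\Staging\Table1.csv s3://my-tableau-staging/incoming/Table1.csv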

Google cloud sql instance super privilege error

I am very new to Google App Engine, please help me solve my problem.
I have created an instance in Google Cloud SQL, and when I import a SQL file it shows me an error like this:
ERROR 1227 (42000) at line 1088: Access denied; you need (at least one of) the SUPER privilege(s) for this operation
How do I add the SUPER privilege to my instance?
As stated in the Cloud SQL documentation:
The SUPER privilege is not supported.
You can take a look at this page that explains how to import data to a Cloud SQL instance.
I also faced the same issue, but the problem was in the dumped SQL database. When exporting the database, use these flags:
--hex-blob --skip-triggers --set-gtid-purged=OFF
Here is the complete documentation of how to do it: https://cloud.google.com/sql/docs/mysql/import-export/importing. Once the data is exported, it can be imported using the gcloud command line, Cloud Shell, or the import option in the Cloud SQL console.
I used the import feature of gcloud sql console and it worked for me.
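For reference, a minimal sketch of the command-line route (bucket, instance and database names are placeholders; the dump has to sit in a Cloud Storage bucket first):
gsutil cp dump.sql gs://your-bucket/dump.sql
gcloud sql import sql your-instance gs://your-bucket/dump.sql --database=your-database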
I ran into the same error when backporting a gzipped dump (procured with mysqldump from a 5.1 version of MySQL) into a Google Cloud SQL instance of MySQL 5.6. The following statement in the SQL file was the problem:
DEFINER=`username`@`%`
The solution that worked for me was removing all instances of it using sed:
cat db-2018-08-30.sql | sed -e 's/DEFINER=`username`@`%`//g' > db-2018-08-30-CLEANED.sql
After the removal, the backport completed with no errors. Apparently the SUPER privilege, which isn't available in Google Cloud SQL, is needed to use DEFINER.
Another reference: Access denied; you need (at least one of) the SUPER privilege(s) for this operation
Good luck!
I faced the same issue. You could try giving the 'super permission' to a user, but it isn't available in GCP Cloud SQL.
The statement
DEFINER=`username`@`%`
is the issue in your backup dump.
The workaround is to remove all such entries from the SQL dump file and then import the data from the GCP console:
cat DUMP_FILE_NAME.sql | sed -e 's/DEFINER=`<username>`@`%`//g' > NEW-CLEANED-DUMP.sql
After removing the entries from the dump, you can try reimporting; it should complete successfully.
For the use case of copying between databases within the same instance, it seems the only way to do this is using mysqldump, to which you have to pass some special flags so that it works without SUPER privileges. This is how I copied from one database to another:
DB_HOST=... # set to 127.0.0.1 if using cloud-sql proxy
DB_USER=...
DB_PASSWORD=...
SOURCE_DB=...
DESTINATION_DB=...
mysqldump --hex-blob --skip-triggers --set-gtid-purged=OFF --column-statistics=0 -h $DB_HOST -u $DB_USER -p"$DB_PASSWORD" $SOURCE_DB \
| mysql -h $DB_HOST -u $DB_USER -p"$DB_PASSWORD" $DESTINATION_DB
Or if you just want to dump to a local file and do something else with it later:
mysqldump --hex-blob --skip-triggers --set-gtid-purged=OFF --column-statistics=0 -h $DB_HOST -u $DB_USER -p"$DB_PASSWORD" $SOURCE_DB \
> $SOURCE_DB.sql
See https://cloud.google.com/sql/docs/mysql/import-export/exporting#export-mysqldump for more info.
It's about how you export the data. When you export from the console, it exports the whole instance, not just the schema, which requires the SUPER privilege for the project in which it was created. To export data to another project, simply export by targeting the schema(s) in the advanced options. If you run into a "could not find storage or object" error, save the exported schema locally, then upload it to your other project's storage and select it from there.
In case somebody is searching for this in 2018 (at least August), the solution is:
Create a database. You can do this from the UI: just go to the Database menu and click "Create a database".
After you click "Import" and select your sql_dump (previously saved in a bucket), press "Show advanced options" and select your DB (not that advanced, are they?!). Otherwise, the default is the system mysql database, which of course cannot support the import.
Happy importing.
I solved this by creating a new database in the SQL instance. (The default database is sys for MySQL.)
Steps (non-CLI version):
1) In GCP > SQL > Databases, create a new database, e.g. newdb
2) In your SQL script, add: USE newdb;
Hope that helps someone.
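If you'd rather not edit the dump by hand, you can prepend the statement on the command line (in a Linux/macOS shell or Cloud Shell); a small sketch, with newdb being the database created in step 1:
( echo "USE newdb;"; cat sql_dump.sql ) > sql_dump_with_db.sql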
The SUPER privilege is exclusively reserved for GCP.
To answer your question: you need to import the data into a database of your own, in which you have the necessary permissions.

Tool to Generate ER diagrams from Views in SQL Server

Is there any third-party tool (free or paid) which could be useful for generating an Entity Relationship Diagram from views in SQL Server 2005/2008 or a higher version?
For example, I have a view in my database and I wish to generate an ER diagram based on all the tables that are referred to in the view.
Let me know if anything is unclear.
Thanks!
You can use a free Java-based tool named SchemaSpy. It basically works with any RDBMS as long as it has a JDBC connector.
I discovered this while scratching my own itch, and also created a detailed post on it, which can be found here: http://blog.kmonsoor.com/generate-er-diagram-from-sql-database/
Summary:
First of all, your system should have a Java runtime properly installed. SchemaSpy is a .jar file; get it from http://sourceforge.net/projects/schemaspy/files/
You also need the JDBC connector for your DB. Make sure to match your DBMS version.
SchemaSpy depends on GraphViz to generate the ER diagrams, so you need to have it installed on your system: http://www.graphviz.org/Download..php
The target DB instance must be up and running.
Now, this command will do the magic:
$ java -jar ./schemaSpy_5.0.0.jar -t pgsql -host 127.0.0.1:5432 -db your_database_name \
-u your_DB_user_name -p your_password -s public \
-dp ./database_specific.jdbc3.jar \
-o output_folder
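Since the question is about SQL Server rather than Postgres, the same call would look roughly like this; note that the -t type name varies between SchemaSpy versions (e.g. mssql05), so treat it as an assumption and check the database types bundled with your jar, and that host, credentials and the JDBC driver jar are placeholders:
$ java -jar ./schemaSpy_5.0.0.jar -t mssql05 -host your-sql-server:1433 -db your_database_name \
    -u your_DB_user_name -p your_password -s dbo \
    -dp ./sqljdbc4.jar \
    -o output_folder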
There are a couple of ways to do this. The easiest approach would be to use SQL Server's own diagrammer, described at the link below:
http://msdn.microsoft.com/en-us/library/aa224825(v=sql.80).aspx
Just do it in SQL Server Management Studio:
Connect to your DB, open Object Explorer, right-click your view and select "Design". This gives you the graphical view designer, which is a pretty good ER diagram of your view.
Use MySQL Workbench. It provides tools to generate an ER diagram directly from the database.
