Unknown database [...] when selecting the database (Ubuntu)

I'm trying to work with MySQL on my laptop (Ubuntu), and every time I import a .sql file into the database, the console shows the same message: "Unknown database 'Spotify' (for example) when selecting the database".
The SQL script is correct and should work, but it always shows the same message. Any solution?
CREATE DATABASE Spotify;
USE Spotify;
DROP TABLE IF EXISTS Spotify.Usuarios;
CREATE TABLE IF NOT EXISTS Spotify.Usuarios
(
iduser INT NULL,
user VARCHAR(10) NULL,
password VARCHAR(45) NULL,
reg VARCHAR(45) NULL
) ENGINE = InnoDB;

Finally, I solved it: there was a problem with the Ubuntu packages, and the MySQL installation hadn't finished correctly.
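After reinstalling the packages, a quick sanity check from the mysql client (a sketch; 'Spotify' stands in for whatever database your script creates) confirms the server responds and the database exists:

```sql
-- Check that the server is reachable and the database was created
SELECT VERSION();
SHOW DATABASES LIKE 'Spotify';
```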

Related

Create table using sqlcmd || Azure SQL Server

I have a create table statement in a file. Using sqlcmd command, I want to create a table. Below is the table structure present in the file column.sql:
CREATE TABLE [dbname].[accessforms].tblename1
(
pk_column int PRIMARY KEY,
column_1 int NOT NULL
);
GO
I run it like this:
sqlcmd -S server_name -U username -P password -i /home/usr/columns.sql -o /home/usr/columns.txt
And I am getting this error:
Reference to database and/or server name in 'dbname.accessforms.tblename1' is not supported in this version of SQL Server
Could you please help me? Why am I getting this error, and how can we solve it?
You're running that query in the cloud.
Azure SQL Database doesn't allow three-part names such as database_name.schema_name.object_name.
You'll have to drop the database name from your reference and use only schema.object.
Your script will have to become:
CREATE TABLE [accessforms].tblename1
(
pk_column int PRIMARY KEY,
column_1 int NOT NULL
);
GO

Syntax Error Creating Identity/Primary Key Column on Azure through SSMS

I came across a syntax error message when I tried creating a table in SQL Azure through a local SSMS connection. This error does not occur when I run the same query on a localhost DB instead of my Azure connection.
Create table Portfolio_Company_Financials
(
PCF_ID int not null identity(1,1) PRIMARY KEY,
CompanyID int,
ReportingDate date,
Revenue float
)
Above throws the error:
Parse error at line: 3, column: 21: Incorrect syntax near 'identity'
It executes when I comment out identity(1,1) and everything after it. It has the same issue when using only PRIMARY KEY:
...PCF_ID int not null PRIMARY KEY,...
Additionally, it looks like I cannot manually change column properties through the SSMS Object Explorer; I can only refresh/delete when right-clicking the column.
It looks like an SSMS/permissions/Azure issue? Can anyone help me here?
This error occurs when you try to create the table in Microsoft Azure SQL Data Warehouse. Azure SQL Data Warehouse does not yet support primary keys or the identity property. You can confirm your version with the following SQL:
SELECT @@VERSION;
https://learn.microsoft.com/en-us/azure/sql-data-warehouse/sql-data-warehouse-tables-overview
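As a sketch, the table from the question could be created in SQL Data Warehouse by dropping the unsupported clauses; PCF_ID would then need to be populated during load (for example via ROW_NUMBER()) rather than by an identity:

```sql
-- Same table without IDENTITY or PRIMARY KEY, which Azure SQL
-- Data Warehouse did not support at the time
CREATE TABLE Portfolio_Company_Financials
(
    PCF_ID int NOT NULL,   -- assign values during load, e.g. with ROW_NUMBER()
    CompanyID int,
    ReportingDate date,
    Revenue float
);
```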

Importing all mysql database at once

mysql allows you to export the complete set of databases at once, but I have found it really tough to import them all at once.
I used mysqldump -u root -p --all-databases > alldb.sql, and when I try to import everything with mysql -u root -p < alldb.sql, it gives me a very weird error.
Error
SQL query:
--
-- Database: `fadudeal_blog`
--
-- --------------------------------------------------------
--
-- Table structure for table `wp_commentmeta`
--
CREATE TABLE IF NOT EXISTS `wp_commentmeta` (
`meta_id` BIGINT(20) UNSIGNED NOT NULL AUTO_INCREMENT,
`comment_id` BIGINT(20) UNSIGNED NOT NULL DEFAULT '0',
`meta_key` VARCHAR(255) COLLATE utf8mb4_unicode_ci DEFAULT NULL,
`meta_value` LONGTEXT COLLATE utf8mb4_unicode_ci,
PRIMARY KEY (`meta_id`),
KEY `comment_id` (`comment_id`),
KEY `meta_key` (`meta_key`(191))
) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_unicode_ci AUTO_INCREMENT=1;
MySQL said: Documentation
#1046 - No database selected
It says #1046 - No database selected. My question is: if mysql knows that I exported the complete set of databases at once, then how can I specify just one database name?
I don't know if I am right or wrong, but I tried it multiple times and got the same problem. Please let me know how I can import the complete database at once.
Rohit, I think you will have to create the database, then issue a USE, and then run the SQL. I looked at the manual here https://dev.mysql.com/doc/refman/5.7/en/mysql-batch-commands.html and it also provides an option for you to mention the db_name when you connect itself,
something like mysql -u root -p db_name < sql.sql (assuming the db is already created). It may be worth a try doing it that way.
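The suggestion above can be sketched as follows (the database name fadudeal_blog is taken from the error output; adjust it and the dump path to your setup):

```sql
-- Create the target database first, make it the default,
-- then replay the dump from inside the mysql client
CREATE DATABASE IF NOT EXISTS fadudeal_blog;
USE fadudeal_blog;
SOURCE /path/to/alldb.sql;
```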
shell> mysqldump --databases db1 db2 db3 > dump.sql
The --databases option causes all names on the command line to be treated as database names. Without this option, mysqldump treats the first name as a database name and those following as table names.
With --all-databases or --databases, mysqldump writes CREATE DATABASE and USE statements prior to the dump output for each database. This ensures that when the dump file is reloaded, it creates each database if it does not exist and makes it the default database so database contents are loaded into the same database from which they came. If you want to cause the dump file to force a drop of each database before recreating it, use the --add-drop-database option as well. In this case, mysqldump writes a DROP DATABASE statement preceding each CREATE DATABASE statement.
from : https://dev.mysql.com/doc/refman/5.7/en/mysqldump-sql-format.html

How can I update a nullable column to be not nullable, in SQL Server using DACPAC

I'm trying to update a database that is maintained and deployed using a database project (.sqlproj) in Visual Studio 2012. This is easier with SQL Server Management Studio, but in this case I have to deploy using a DACPAC.
What is the correct way to change a column to not be nullable, using DACPAC and without risking data loss?
A nullable column was added to a table. Now I need to publish an update that sets the column to not null and sets a default. Because there are rows in the table, the update fails. There is a setting to 'allow data loss' but that isn't an option for us and this update should not result in data loss. Here's a simple example that shows the problem:
CREATE TABLE [dbo].[Hello]
(
[Id] INT IDENTITY(100,1) NOT NULL PRIMARY KEY,
[HelloString] NVARCHAR(50) NULL ,
[Language] NCHAR(2) NOT NULL
)
Now publish that database and add rows, at least one row should have a null for HelloString.
Change the table definition to be:
CREATE TABLE [dbo].[Hello]
(
[Id] INT IDENTITY(100,1) NOT NULL PRIMARY KEY,
[HelloString] NVARCHAR(50) NOT NULL DEFAULT 'Hello' ,
[Language] NCHAR(2) NOT NULL
)
This cannot be published.
Error:
Rows were detected. The schema update is terminating because data loss might occur.
Next, I tried to add a pre-deployment script to set all NULL to be 'Hello':
UPDATE Hello SET HelloString = 'Hello' WHERE HelloString IS NULL
This publish attempt also fails, with the same error. Looking at the auto-generated publish script, it is clear why, but this seems to be incorrect behavior:
The NOT NULL alteration is applied BEFORE the default is added.
The script checks for ANY rows; it doesn't matter whether there are nulls or not.
The advice in the generated comment (to avoid this issue, you must add values to this column for all rows) doesn't solve this.
/*
The column HelloString on table [dbo].[Hello] must be changed from NULL to NOT NULL. If the table contains data, the ALTER script may not work. To avoid this issue, you must add values to this column for all rows or mark it as allowing NULL values, or enable the generation of smart-defaults as a deployment option.
*/
IF EXISTS (select top 1 1 from [dbo].[Hello])
RAISERROR (N'Rows were detected. The schema update is terminating because data loss might occur.', 16, 127) WITH NOWAIT
GO
PRINT N'Altering [dbo].[Hello]...';
GO
ALTER TABLE [dbo].[Hello] ALTER COLUMN [HelloString] NVARCHAR (50) NOT NULL;
GO
PRINT N'Creating Default Constraint on [dbo].[Hello]....';
GO
ALTER TABLE [dbo].[Hello]
ADD DEFAULT 'hello' FOR [HelloString];
Seen in SQL Server 2012 (v11.0.5343), SQL Server Data Tools 11.1.31009.1
When publishing a dacpac using SSMS, you'll not have access to the full set of publish options that are available when publishing from SqlPackage.exe or Visual Studio. I would suggest publishing with either SqlPackage.exe or with Visual Studio and enabling the "Generate smart defaults, where applicable" option. In the case of SqlPackage.exe, you would run a command like:
"C:\Program Files (x86)\Microsoft SQL Server\120\DAC\bin\SqlPackage.exe" /a:publish /sf:"C:\MyDacpac.dacpac" /tcs:"Data Source=MYSERVER;Initial Catalog=MYDATABASE;Integrated Security=true" /p:GenerateSmartDefaults=true
In the case of Visual Studio, you'd check the Generate smart defaults option in the Advanced publish options dialog.
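For comparison, the T-SQL that a smart default effectively produces is roughly the following (a sketch; the constraint name DF_Hello_HelloString is illustrative): backfill the existing NULLs first, then tighten the column and add the default.

```sql
-- Backfill existing rows so the NOT NULL alteration cannot lose data
UPDATE [dbo].[Hello] SET [HelloString] = 'Hello' WHERE [HelloString] IS NULL;

-- Now the column can be tightened safely
ALTER TABLE [dbo].[Hello] ALTER COLUMN [HelloString] NVARCHAR(50) NOT NULL;

-- Add the default for future inserts
ALTER TABLE [dbo].[Hello]
    ADD CONSTRAINT DF_Hello_HelloString DEFAULT 'Hello' FOR [HelloString];
```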

SQL Server to DB2 Conversion: Migration Toolkit Issues

I am trying to convert from MS SQL Server 2008 Express to DB2 9.7.
I have installed IBM Migration Toolkit and successfully connected to my SQL Server database (hosted locally).
I tried to extract from the database, keeping all default data mappings, but when I extract I get the following (for all the tables). Has anyone had a problem like this, where all column types come out as null, with the Migration Toolkit?
At first I thought it might be because I am using the Express edition of SQL Server, and maybe it couldn't see the column types under this edition, but the fact that Name and Unit are null(50), and therefore it has understood that these columns have associated lengths, makes me doubt this.
CREATE TABLE [Action] ([Id] null IDENTITY , [Name] null(50) NOT NULL , [UserId] null NOT NULL , [FreqId] null NOT NULL , [Quant] null NULL , [Unit] null(50) NULL , [Start] null NOT NULL , [End] null NOT NULL , [Created] null NOT NULL , [Modified] null NOT NULL , [Deleted] null NOT NULL )
I currently use the DCW (Database Conversion Workbench) toolkit; it's very useful for extracting all the data tables from SQL Server 2008.
You may try DCW. A reminder: after installing the plugin into DB2, you must add the SQL Server driver definition in DB2 (Window -> Preferences -> Data Management -> Connectivity -> Driver Definitions).
