We are evaluating Tableau and noticed that it doesn't appear to recognize sparse columns in our SQL Server 2008 tables. Is this possible or are there any common workarounds?
Yes, you can use it like that; the visualization will simply cut off the parts that are empty, that's all.
Tableau displays the columns it finds from a SELECT * query.
In reviewing this blog we see that SELECT * omits SPARSE columns. https://blogs.msdn.microsoft.com/sreekarm/2009/01/08/sparse-columns-in-sql-server-2008/
You would have to explicitly name the columns you want to see, using a Custom SQL connection in Tableau.
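For example, a Custom SQL query along these lines will bring the sparse columns through (the table and column names below are just placeholders):

SELECT OrderId,
       OrderDate,
       SparseAttribute1,  -- declared SPARSE in the table definition
       SparseAttribute2   -- declared SPARSE in the table definition
FROM dbo.Orders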
I have to compare the tables in Server1, database A, dbo.X and Server2, database B, dbo.Y. Both table X and table Y should contain the same values.
So I need to validate that both tables contain the same values in every row and column. Is it possible to do this?
Thanks
If you do not want to use a tool like SSIS or Visual Studio, then a linked server will be required.
SELECT * FROM Server1.databaseA.dbo.X
EXCEPT
SELECT * FROM Server2.databaseB.dbo.Y
EXCEPT returns distinct rows from the left input query that aren’t output by the right input query.
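Note that EXCEPT is one-directional: the query above only reports rows that exist in X but are missing from Y. To be sure the tables really match, run the reverse check as well; if both queries return zero rows, the data is identical.

SELECT * FROM Server2.databaseB.dbo.Y
EXCEPT
SELECT * FROM Server1.databaseA.dbo.X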
Sure, you can do it by creating a linked server. Please follow this manual to create one:
Creating Linked Servers
After this you will be able to run SQL queries against the other server like this:
SELECT name FROM [SRVR002\ACCTG].master.sys.databases ;
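If you prefer T-SQL over the GUI, a linked server can be created with sp_addlinkedserver and a login mapping added with sp_addlinkedsrvlogin, roughly like this (the server name and credentials below are placeholders):

EXEC sp_addlinkedserver
    @server = N'Server2',                -- name you will use in four-part queries
    @srvproduct = N'',
    @provider = N'SQLNCLI',
    @datasrc = N'Server2Host\INSTANCE';  -- actual network name of the remote server

EXEC sp_addlinkedsrvlogin
    @rmtsrvname = N'Server2',
    @useself = 'FALSE',
    @locallogin = NULL,
    @rmtuser = N'remote_user',           -- placeholder credentials
    @rmtpassword = N'remote_password';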
There is an easier way if you have Visual Studio installed. There is an option to compare schema and data with any server, and it is very efficient, as you can update the target server from within the tool as well.
Visual Studio -> Tools -> SQL Server -> Data Comparison
In SQL Server Management Studio 2014, I'm frequently writing a query, copying the results grid to Excel, and adding filters as I explore data.
Is there a way to filter the results grid from within SSMS? I'd prefer something that's built into the program, but a plugin would work. This appears to be a feature in some other programs like DbVisualizer and DataGrip, but I don't have a license for those.
EDIT: what I'm asking for is quick sorting and filtering of the result set in the same point-and-click way that I can sort and filter an Excel spreadsheet. I was thinking this would be more convenient and faster than adding additional clauses and conditions to queries.
If this still isn't a real question, I would not be opposed to a mod closing it, since I don't like the idea of a question floating out there that implies I don't know what a where clause is :-(
What you are asking for isn't an SSMS function. What you'd probably find valuable is a data visualization tool like Tableau or Spotfire. In addition to the graphical features, they have a lot of grid features. Or an SSIS package to do some of the filtering and cleansing you need. Then you wouldn't have to copy and paste every time.
To filter the result set you have to use a WHERE clause.
For example, the result set query:
SELECT firstName, lastName FROM employee
To filter it:
SELECT firstName, lastName FROM employee
WHERE lastName = 'abcd'
I'm used to scripting in Python or Matlab, and my first couple hours with SQL have been infuriating. I would like to make a list of columns appear on the screen in any way, shape, or form; but when I use commands like
select *
from "2Second Log.dbo.TagTable.Columns"
I keep getting the error:
Invalid column name '[the first column in my table]'.
even though I never explicitly asked for [the first column in my table], it found it for me. How can you correctly identify the first column name, and then still claim it's invalid!? Babies will be strangled.
This db was generated by Allen Bradley's FactoryTalk software. What I would really like to do is produce an actual list of "TagName" strings...but I get the same error when I try that. If there were a way to actually double click the table and open it up and look at it (like in Matlab), that would be ideal.
Echoing juergen's suggestion in the comment above. It looks like you're running the query on the master database, not the 2Second Log database that actually has your table. (You can tell this by looking at the database in the dropdown in the top left of your screenshot). Two things you can do:
Change the dropdown in the top left to 2Second Log. This will point your query at the right database.
Put your database name in brackets as suggested by juergen i.e. select * from [2Second Log].dbo.TagTable
As an aside, if you're looking for a good SQL tutorial, I highly recommend the Mode SQL tutorial. It's a fantastic interactive platform to get your SQL feet wet.
Always use brackets when names/fields have spaces or dashes.
select * from [2Second Log].dbo.TagTable
I believe there is a new feature where you can define columns as HIDDEN so that a SELECT * returns all except hidden columns.
Is this possible? If yes, how would you achieve it with SQL Server 2016 or SQL Azure?
Adding info from the comments into the answer:
We can't specify an arbitrary column as HIDDEN and then have SELECT * return all columns except the hidden one, the way temporal tables do. This feature applies only to the ValidFrom/ValidTo period columns of temporal tables, though it would be nice to have it more generally. As Satya mentioned, you can use views to achieve more or less the same thing.
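To illustrate the difference (table and column names below are made up): HIDDEN is only accepted on the period columns of a system-versioned temporal table, while for ordinary columns a view that simply leaves them out gives roughly the same effect.

CREATE TABLE dbo.Orders
(
    OrderId   INT IDENTITY PRIMARY KEY,
    Amount    DECIMAL(10, 2) NOT NULL,
    ValidFrom DATETIME2 GENERATED ALWAYS AS ROW START HIDDEN NOT NULL,
    ValidTo   DATETIME2 GENERATED ALWAYS AS ROW END HIDDEN NOT NULL,
    PERIOD FOR SYSTEM_TIME (ValidFrom, ValidTo)
)
WITH (SYSTEM_VERSIONING = ON);
-- SELECT * FROM dbo.Orders now returns OrderId and Amount only.
GO

-- For any other column you want to hide, a view is the usual workaround:
CREATE VIEW dbo.Orders_NoAmount AS
SELECT OrderId FROM dbo.Orders;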
I'm trying to export some tables from SQL Server 2005 and then create those tables and populate them in Oracle.
I have about 10 tables, varying from 4 columns up to 25. I'm not using any constraints/keys, so this should be reasonably straightforward.
Firstly I generated scripts to get the table structure, then modified them to conform to Oracle syntax standards (ie changed the nvarchar to varchar2)
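For example, a generated definition like the first statement below had to be edited by hand into the second (the table and column are just an illustration):

-- SQL Server script as generated:
CREATE TABLE Customers (CustomerName NVARCHAR(100));

-- Oracle equivalent after editing:
CREATE TABLE Customers (CustomerName VARCHAR2(100));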
Next I exported the data using SQL Server's export wizard, which created a CSV flat file. However my main issue is that I can't find a way to force SQL Server to double quote column names. One of my columns contains commas, so unless I can find a method for SQL Server to quote column names then I will have trouble when it comes to importing this.
Also, am I going the difficult route, or is there an easier way to do this?
Thanks
EDIT: By quoting I'm referring to quoting the column values in the CSV. For example I have a column which contains addresses like
101 High Street, Sometown, Some
county, PO5TC053
Without changing it to the following, it would cause issues when loading the CSV
"101 High Street, Sometown, Some
county, PO5TC053"
After looking at some options with SQL Developer, and at manually trying to export/import, I found a utility in SQL Server Management Studio that gets the desired results and is easy to use. Do the following:
Go to the source schema on SQL Server
Right click > Export data
Select source as current schema
Select the destination as "Oracle Provider for OLE DB"
Select Properties, then add the service name into the first box, then the username and password; be sure to click "remember password"
Enter query to get desired results to be migrated
Enter table name, then click the "Edit" button
Alter mappings, change nvarchars to varchar2, and INTEGER to NUMBER
Run
Repeat process for remaining tables, save as jobs if you need to do this again in the future
Use the SQL Developer migration tools
I think quoted column names in Oracle are something you should avoid. They cause all sorts of problems.
As Robert has said, I'd strongly advise against quoting column names. The result is that you'd have to quote them not only when importing the data, but also whenever you want to reference that column in a SQL statement - and yes, that probably means in your program code as well. Building SQL statements becomes a total hassle!
From what you're writing, I'm not sure if you are referring to the column names or the data in these columns. (Can SQL Server really have a comma in a column name? I'd be really surprised if there was a good reason for that!) Quoting the column content should be done for any string-like columns (although I found that other characters usually work better, as the need to "escape" quotes becomes another issue). If you're exporting in CSV that should be an option... but then I'm not familiar with the export wizard.
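If the wizard doesn't offer a quoting option, one workaround is to do the quoting in the query that feeds the export, something along these lines (table and column names are invented for the example):

-- Wrap string columns in double quotes and double up any embedded quotes,
-- so commas inside the value no longer break the CSV.
SELECT
    CustomerId,
    '"' + REPLACE(ISNULL(Address, ''), '"', '""') + '"' AS Address
FROM dbo.Customers;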
Another idea for moving the data (depending on the scale of your project) would be to use an ETL/EAI tool. I've been playing around a bit with the Pentaho suite and their Kettle component. It offered a good range of options to move data from one place to another. It may be a bit oversized for a simple transfer, but if it's a big "migration" with the corresponding volume, it may be a good option.