Can't query data from my postgres database in Qt

I am very new to SQL (3 days) and I'm using pgAdmin. Within pgAdmin, I can query data from any available database as normal, but from Qt, after connecting successfully, I can only query data from the default database 'postgres' that I created a table in. Attempting to query data from any other database I created does nothing from Qt. My guess is that it could be an access-privilege problem, so I used the grant wizard to grant all access to the table I want to view, but still no luck:
Access privileges
 Schema |  Name  | Type  |        Access privileges         | Column privileges | Policies
--------+--------+-------+----------------------------------+-------------------+----------
 public | tTable | table | postgres=a*r*w*d*D*x*t*/postgres+|                   |
        |        |       | =arwdDxt/postgres                |                   |
(1 row)
Like I said, I am really new and kinda lost with this issue. Any help appreciated. Thanks

Related

Import data from Excel to existing table in the bottom while handling auto Increment column in SSMS

I have an existing table like the one below in SSMS (SQL Server Management Studio).
Students_Table
UID | ID | Name | Location
----+----+------+---------
  1 |  1 | Jogn | US
  2 |  2 | Alia | UK
where UID is an auto-increment (identity) column and ID, Name, Location are my data columns.
So I have an Excel sheet with data like below:
ID | Name | Location
---+------+---------
 3 | Jk   | Santa
 4 | Lima | PS
Now these two rows should be appended to the bottom of the existing Students_Table in SQL Server, but how do I handle the auto-increment column?
Am I supposed to add that column in my Excel sheet and fill in the next incremented numbers myself?
Please help.
To be able to copy and paste the data from Excel to SQL Server, you will have to include an empty column in Excel for the identity (UID) column.
You can check out my blog article that describes how to copy data from Excel to SQL Server for further details. Section 2b includes step-by-step instructions with screenshots for the case of an identity column.
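If you are loading with T-SQL instead of pasting into the edit grid, you can simply omit the identity column and let SQL Server assign it. A minimal sketch, assuming the table and column names from the question and a hypothetical staging table Students_Staging that the Excel sheet was first imported into (e.g. via the Import Data wizard):

```sql
-- Students_Staging is a hypothetical staging table holding the Excel rows.
INSERT INTO dbo.Students_Table (ID, Name, Location)
SELECT ID, Name, Location
FROM dbo.Students_Staging;
-- UID is an identity column, so SQL Server assigns the next values automatically.
```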

SSIS - Insert result set from stored proc into table on another DB server

I think I have stumbled upon an edge case where neither an ADO.NET Data Source nor an OLEDB Data Source can fully meet my needs:
Control Flow -> Execute SQL Task -> ADO.NET Data Source allows handling stored procedures with user-defined tables (table-valued parameters) as parameters.
However, I don't have a way/know how to then insert the data into a table on another server.
Data Flow -> OLEDB Data Source allows piping the results of a data source on one server directly into the data source on another server.
However, all Data Flow sources (OLEDB Data Source or even ADO.NET Data Source) do not appear to allow passing in Parameter Mapping, and therefore have no way to pass in a complex user-defined type.
I also cannot use Variable Expressions, because there doesn't appear to be a way to insert an Object as an Expression Value.
Server1 has the following stored procedure:
CREATE TYPE [dbo].[OrderKeyList] AS TABLE(
    [OrderKey] [varchar](50) NULL
)
GO
CREATE PROCEDURE [dbo].[GetAllOrdersInOrderList] (
    @OrderList dbo.OrderKeyList READONLY  -- table-valued parameters must be READONLY
)
AS
BEGIN
    SELECT o.*
    FROM dbo.Orders o
    WHERE o.OrderKey IN (SELECT ol.OrderKey FROM @OrderList ol);
END
GO
SSIS Package is as follows:
.-[Sequence Container]-----------------------------------------------.
| |
| .-[Data Flow Task - Populate User Variable User::OrderList]-. |
| | | |
| '-----------------------------------------------------------' |
| | |
| \|/ |
| .-[Execute SQL Task - call dbo.GetOrdersByOrderList]--------. |
| | | |
| '-----------------------------------------------------------' |
| | |
| \|/ |
| .-[ ?????????????????????????????????????? ]----------------. |
| | | |
| '-----------------------------------------------------------' |
'--------------------------------------------------------------------'
The only solution I can think of is to add a second stored procedure on the source DB, which takes an @OrderList varchar(max) instead of a dbo.OrderKeyList table, calls dbo.Split(',', @OrderList), and passes the result to the real stored procedure:
CREATE PROCEDURE [dbo].[GetAllOrdersInOrderListWrapper] (
    @OrderList varchar(max)
)
AS
BEGIN
    DECLARE @tmpOrderList dbo.OrderKeyList;

    INSERT INTO @tmpOrderList (OrderKey)
    SELECT DISTINCT CAST(o.Data AS varchar(50))
    FROM dbo.Split(',', @OrderList) o;

    EXEC dbo.GetAllOrdersInOrderList @tmpOrderList;
END;
GO
But I really dislike this approach, because:
How will it scale to thousands of rows?
It requires adding another stored procedure on the source, just for SSIS.
If you go with the Control Flow / Execute SQL Task approach, you can move the data to a table on another server via a Linked Server.
If you go with the wrapper stored procedure approach you mentioned, it won't scale horribly for an ETL process, even at thousands of rows. If it were me, this is the option I would go with.
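A minimal sketch of the linked-server variant, assuming a linked server named [Server2] and a target table with the same shape as dbo.Orders (both names are placeholders), run on Server1 after the OrderKeyList table variable has been populated:

```sql
-- @OrderList is a previously declared and populated dbo.OrderKeyList variable.
-- INSERT ... EXEC can target a table on a linked server; note that this may
-- require distributed-transaction (MSDTC) configuration between the servers.
INSERT INTO [Server2].[TargetDb].[dbo].[Orders]
EXEC [dbo].[GetAllOrdersInOrderList] @OrderList;
```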
Your problem might be related to trying to pass an object as a user-defined type; I have never tried passing a 'table' as a parameter.
I would do this:
Instead of pushing data to an object variable, push it to a table, possibly a temp table, on the source. Your stored procedure can then take the name of that table as a parameter and execute dynamic SQL:
CREATE PROCEDURE [dbo].[GetAllOrdersInOrderList] (
    @OrderList varchar(500)  -- name of the table holding the order keys
)
AS
BEGIN
    DECLARE @Sql nvarchar(max);

    SET @Sql = 'SELECT o.*
                FROM dbo.Orders o
                WHERE o.OrderKey IN (SELECT t.OrderKey FROM ' + @OrderList + ' t)';

    EXEC (@Sql);
END
Maybe someone else has a way to do it exactly as you describe, I will be very interested to see such a solution myself.
After researching some more, I found a suggestion on Stack Overflow which pointed me to this article by @Jon Seigel -- Using table-valued parameters in SSIS.
The net-net of it is to pass the User::OrderList as a parameter to a Script Task, and write a C# program to use ADO.NET directly, thereby bypassing the limitations of SSIS GUI.
Why would you want to do it this way? Why not just write everything in C#? Well, by doing it this way, I personally still see the benefit of SSIS, in that the overall orchestration of the ETL process is still graphical and so should be very easy to read, even if it is more painful to write (Ha! when is SSIS not painful to write?).
I am trying this approach now, as I can see multiple benefits, including the option in the future to use C# SqlBulkCopy class to bulk insert the data for faster loads.

Using Full-Text indexing to crawl binary blobs

If I store binary files (e.g. doc, html, xml, xps, docx, pdf) inside a varbinary(max) column in SQL Server, how can I use Full-Text indexing to crawl the binary files?
Imagine I create a table to store binary files:
CREATE TABLE Documents (
    DocumentID int IDENTITY,
    Filename nvarchar(max),
    Data varbinary(max)
)
How can I leverage the IFilter system provided by Windows to crawl these binary files and extract useful, searchable information?
The motivation for this, of course, is that Microsoft's Indexing Service is deprecated and replaced with Windows Search. Indexing Service provided an OLE DB provider (MSIDX) that SQL Server could use to query the Indexing Service catalog.
Windows Search, on the other hand, offers no such provider; there is no way for SQL Server to query the Windows Search catalog.
Fortunately, the capabilities of Windows Search (and Indexing Service before it) were brought into SQL Server proper. The SQL Server Full-Text indexing service uses the same IFilter mechanism that has been around for 19 years.
The question is: how to use it to crawl blobs stored in the database.
SQL Server fulltext can index varbinary and image columns.
You can see the list of all file types currently supported by SQL Server:
SELECT * FROM sys.fulltext_document_types
For example:
| document_type | class_id | path | version | manufacturer |
|---------------|--------------------------------------|----------------------------------------------------------------------------------|-------------------|-----------------------|
| .doc | F07F3920-7B8C-11CF-9BE8-00AA004B9986 | C:\Windows\system32\offfilt.dll | 2008.0.9200.16384 | Microsoft Corporation |
| .txt | C7310720-AC80-11D1-8DF3-00C04FB6EF4F | c:\Program Files\Microsoft SQL Server\MSSQL11.MSSQLSERVER\MSSQL\Binn\msfte.dll | 12.0.6828.0 | Microsoft Corporation |
| .xls | F07F3920-7B8C-11CF-9BE8-00AA004B9986 | C:\Windows\system32\offfilt.dll | 2008.0.9200.16384 | Microsoft Corporation |
| .xml | 41B9BE05-B3AF-460C-BF0B-2CDD44A093B1 | c:\Program Files\Microsoft SQL Server\MSSQL11.MSSQLSERVER\MSSQL\Binn\xmlfilt.dll | 12.0.9735.0 | Microsoft Corporation |
When creating the varbinary (or image) column to contain your binary file, you must also have a string column that gives the file type through its extension (e.g. ".doc"):
CREATE TABLE Documents (
DocumentID int IDENTITY,
Filename nvarchar(max),
Data varbinary(max),
DataType varchar(50) --contains the file extension (e.g. ".docx", ".pdf")
)
When adding the binary column to the full-text index SQL Server needs you to tell it which column contains the data type string:
ALTER FULLTEXT INDEX ON [dbo].[Documents]
ADD ([Data] TYPE COLUMN [DataType])
You can test by importing a binary file from the filesystem on the server:
INSERT INTO Documents(filename, DataType, data)
SELECT
'Managing Storage Spaces with PowerShell.doc' AS Filename,
'.doc', *
FROM OPENROWSET(BULK N'C:\Managing Storage Spaces with PowerShell.doc', SINGLE_BLOB) AS Data
You can view the catalog status using:
DECLARE @CatalogName varchar(50);
SET @CatalogName = 'Scratch';

SELECT
    CASE FULLTEXTCATALOGPROPERTY(@CatalogName, 'PopulateStatus')
        WHEN 0 THEN 'Idle'
        WHEN 1 THEN 'Full population in progress'
        WHEN 2 THEN 'Paused'
        WHEN 3 THEN 'Throttled'
        WHEN 4 THEN 'Recovering'
        WHEN 5 THEN 'Shutdown'
        WHEN 6 THEN 'Incremental population in progress'
        WHEN 7 THEN 'Building index'
        WHEN 8 THEN 'Disk is full. Paused.'
        WHEN 9 THEN 'Change tracking'
        ELSE 'Unknown'
    END + ' (' + CAST(FULLTEXTCATALOGPROPERTY(@CatalogName, 'PopulateStatus') AS varchar(50)) + ')' AS PopulateStatus,
    FULLTEXTCATALOGPROPERTY(@CatalogName, 'ItemCount') AS ItemCount,
    CAST(FULLTEXTCATALOGPROPERTY(@CatalogName, 'IndexSize') AS varchar(50)) + ' MiB' AS IndexSize,
    CAST(FULLTEXTCATALOGPROPERTY(@CatalogName, 'UniqueKeyCount') AS varchar(50)) + ' words' AS UniqueKeyCount,
    FULLTEXTCATALOGPROPERTY(@CatalogName, 'PopulateCompletionAge') AS PopulateCompletionAge,
    DATEADD(second, FULLTEXTCATALOGPROPERTY(@CatalogName, 'PopulateCompletionAge'), 0) AS PopulateCompletionDate;
And you can query the catalog:
SELECT * FROM Documents
WHERE FREETEXT(Data, 'Bruce')
Additional IFilters
SQL Server has a limited set of built-in filters. It can also use IFilter implementations registered on the system (e.g. Microsoft Office 2010 Filter Pack that provides docx, msg, one, pub, vsx, xlsx and zip support).
You must enable the use of the OS-level filters by enabling the option:
EXEC sp_fulltext_service 'load_os_resources', 1
and restart the SQL Server service.
load_os_resources int
Indicates whether operating system word breakers, stemmers, and filters are registered and used with this instance of SQL Server. One of:
0: Use only filters and word breakers specific to this instance of SQL Server.
1: Load operating system filters and word breakers.
By default, this property is disabled to prevent inadvertent behavior changes by updates made to the operating system. Enabling use of operating system resources provides access to resources for languages and document types registered with Microsoft Indexing Service that do not have an instance-specific resource installed. If you enable the loading of operating system resources, ensure that the operating system resources are trusted signed binaries; otherwise, they cannot be loaded when verify_signature is set to 1.
If using SQL Server before SQL Server 2008, you must also restart the Full-Text indexing service after enabling this option:
net stop msftesql
net start msftesql
Microsoft provides filter packs containing IFilters for the Office 2007 file types:
Microsoft Office 2007 Filter Packs
Microsoft Office 2010 Filter Packs
And Adobe provides a free IFilter for indexing PDFs (Foxit also provides one, but theirs is not free):
Adobe PDF iFilter 64 11.0.01
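After installing any of these packs, the registration sequence described in KB945934 is roughly the following (a sketch; the verify_signature step is only needed when the installed filter DLLs are not trusted signed binaries):

```sql
EXEC sp_fulltext_service @action = 'load_os_resources', @value = 1;
-- Only if the filter binaries are not trusted signed:
-- EXEC sp_fulltext_service @action = 'verify_signature', @value = 0;

-- After restarting the SQL Server service, confirm the new types are visible:
SELECT * FROM sys.fulltext_document_types WHERE document_type = '.docx';
```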
Bonus Reading
KB945934: How to register Microsoft Filter Pack IFilters with SQL Server
MS Blogs: Getting a custom IFilter working with SQL Server 2008/R2 (IFilterSample)

How to join tables in different database on the same Sybase server

I am working on Sybase ASE 15.5.
I have 2 databases created in the same server, "DatabaseA" and "DatabaseB". The database owner is "User".
Logging in as "User", I created a table in "DatabaseB" called "TableA".
Now, my user has access to both databases, but the default database is "DatabaseA".
This is successful when I log in to DatabaseA:
USE DatabaseB
GO
SELECT * from DatabaseB.User.TableA
GO
But this is not:
USE DatabaseA
GO
SELECT * from DatabaseB.User.TableA
GO
It tells me that there is "No such object or user exists in the database".
I have Googled, and most sites say that if the user has rights, then you only need to qualify the table name with the database and owner to access it. But it does not seem to work in my case.
I have tried creating a non-DBO user "User2" and assigning it select rights using
GRANT SELECT ON DatabaseB.User.TableA to User2
and sp_helprotect shows that the right is there for this user. But the results are exactly the same as when I query with User.
Below is the result from sp_helprotect
grantor | grantee | type | action | object | column | grantable
'User' | 'User2' | 'Grant' | 'Select' | 'TableA' | 'All' | 'FALSE'
Is there any configuration or setting that needs to be checked to enable this?
EDIT (22 July 2015)
Just discovered something. There are a few tables within DatabaseB that I can access from DatabaseA, but not all tables.
For example, there are TableA, TableB, TableC, and TableD in DatabaseB, of which TableB and TableD can be queried from DatabaseA using
USE DatabaseA
GO
SELECT * from DatabaseB.User.TableB
GO
SELECT * from DatabaseB.User.TableD
GO
which is successful. And
USE DatabaseA
GO
SELECT * from DatabaseB.User.TableA
GO
SELECT * from DatabaseB.User.TableC
GO
fails.
Help!!!
Try SELECT * from DatabaseB..TableA to fetch the result from a different database.
You can also do the below:
use DatabaseB
SELECT * from TableA
You must at least have read access to the database.
Just another example:
select tabA.*,tabC.* from DatabaseB..TableA tabA, DatabaseA..TableC tabC
where tabA.xxx = tabC.xxx
The ASE server login seems to be 'User' -- a login is the thing that needs a password. This is not the same as the database user inside a database. This mapping is established with 'sp_adduser'.
To resolve your problem, you need to figure out which DB user you are.
Run the following:
use DatabaseA
go
select user_name() user_in_A, suser_name() login_name
go
use DatabaseB
go
select user_name() user_in_B, suser_name() login_name
go
The output of these queries should help you move forward.
Having a login and user called "USER" is not a good idea, as it is also the name of the system function that returns the name of the current user. Can you try the same query using a different user altogether (e.g. "bob")?
You may need to grant the appropriate permissions to bob but at least there won't be any confusion about user names.
This is further to RobV's comments (I don't have enough rep to add comments though!)
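A sketch of setting up such a user, assuming a server-level login "bob" already exists (all names here are placeholders):

```sql
use DatabaseB
go
sp_adduser bob            -- map the login to a user in this database
go
grant select on TableA to bob
go
```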

How to Auto-Update Database according to date?

How do I auto-update the database according to the date?
Message ID | Message  | StartDate | EndDate   | Status
-----------+----------+-----------+-----------+--------
         1 | Hello    | 07/7/2012 | 08/7/2012 | Expired
         2 | Hi       | 10/7/2012 | 12/7/2012 | Ongoing
         3 | Hi World | 11/7/2012 | 18/7/2012 | Pending
How to auto update the status according to the date today?
More information : I'm using SQL-Server Management Studio. Sorry for not stating.
I would create a stored procedure that sets the status to "Expired" for all messages whose EndDate has already passed (i.e. EndDate < GETDATE()) and schedule it using a job in SQL Server:
CREATE PROCEDURE UpdateMessages
AS
UPDATE Messages SET Status = 'Expired' WHERE EndDate < GETDATE()
GO
The best thing you can do is create a stored procedure that updates the records in your table based on the current date and time, then create a SQL Server job and schedule it to execute at your desired time.
I don't think there is a way for the table to update itself. You should consider scheduling a Job in SQL Server.
See this MSDN article here
The job would run daily and consider each row and update the status where appropriate.
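The daily job's body could extend the procedure above to maintain all three statuses rather than just "Expired". A sketch, using the column names from the question's table:

```sql
-- Derive every row's status from today's date in one pass.
UPDATE Messages
SET Status = CASE
                 WHEN EndDate < GETDATE()    THEN 'Expired'
                 WHEN StartDate <= GETDATE() THEN 'Ongoing'
                 ELSE 'Pending'
             END;
```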
