I get the good old
Incorrect syntax near 'EXTERNAL'.
error. I am doing exactly what this answer describes, but SQL Server returns the aforementioned error when I reach this code chunk:
CREATE EXTERNAL FILE FORMAT csvformat
WITH (
FORMAT_TYPE = DELIMITEDTEXT,
FORMAT_OPTIONS (FIELD_TERMINATOR =',')
);
GO
What am I doing wrong?
What I tried
Java runtime environment is installed (Java 8 Update 201)
PolyBase is installed with "PolyBase Query Service for External Data"
I set the Hadoop connectivity level with EXEC sp_configure 'hadoop connectivity', 4;. I also tried setting that option to 1 and to 7 - I still get the error
Using EXEC sp_configure, I also set 'polybase enabled' to 1
I checked SELECT SERVERPROPERTY ('IsPolybaseInstalled') AS IsPolybaseInstalled; - it returns 1
TCP/IP is enabled
The PolyBase services are running
Setup: SQL Server 2019 on a Virtual Machine (Azure), no Azure SQL Server or Azure DWH.
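For reference, the configuration steps above boil down to roughly this script (a sketch of what I ran; the connectivity level shown is just the last value I tried):
-- Enable PolyBase and set the Hadoop connectivity level (SQL Server needs a restart afterwards)
EXEC sp_configure 'polybase enabled', 1;
RECONFIGURE;
GO
EXEC sp_configure 'hadoop connectivity', 7;  -- I also tried 1 and 4
RECONFIGURE;
GO
-- Confirm PolyBase is installed
SELECT SERVERPROPERTY('IsPolybaseInstalled') AS IsPolybaseInstalled;
GO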
Perhaps too simple an answer, but can you restart the entire virtual server and try it again?
Update: A reboot of the server/service after installing PolyBase is not mentioned in the documentation and is not requested by the installer, but plenty of posts on user boards say it is required to make PolyBase work.
Related
I regularly end up developing on projects where either the Central Test server, or Prod server is an older version of SQL Server than my local install.
e.g. Prod is SQL Server 2014. Local install in SQL Server 2019.
Mostly that's fine, it just means I have to remember to only use old syntaxes. (:cry:)
But occasionally I forget which syntaxes are old. Ooops.
Obviously our test environments catch this, but it would be great to be able to tell my Local Server ... "only accept SS2014 syntax", and have these mistakes caught before they're committed/pushed.
I thought this was what CompatibilityLevel was supposed to do. But either it doesn't, or I'm using it wrong. (See below)
How should I achieve this? (other than just installing a different SQL version!)
Attempt so far:
Run ALTER DATABASE MyLocalProjectDB SET COMPATIBILITY_LEVEL = 120; (120 represents SS2014)
Run SELECT name, compatibility_level FROM sys.databases WHERE name = db_name(); to confirm that the previous command "worked".
Run DROP TABLE IF EXISTS [ATableThatDoesExist], to see if the syntax is accepted. It was.
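Put together, the whole attempt is this short script (database and table names are from my local setup):
-- 120 = SQL Server 2014
ALTER DATABASE MyLocalProjectDB SET COMPATIBILITY_LEVEL = 120;
-- Confirm the level actually changed
SELECT name, compatibility_level FROM sys.databases WHERE name = DB_NAME();
-- 2016+ syntax, yet it still executes without complaint
DROP TABLE IF EXISTS [ATableThatDoesExist];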
DROP IF EXISTS was new to SS2016:
MSDN: IF EXISTS ... Applies to: SQL Server (SQL Server 2016 (13.x) through current version).
Additional Source
Why hasn't this worked?
I have an issue whereby I can execute an SSRS report which calls an Oracle Stored Procedure in VS2017, but when I deploy to the SSRS Server and run, it returns the following message:-
An error has occurred during report processing. (rsProcessingAborted)
Query execution failed for dataset 'spTestSubDet'. (rsErrorExecutingCommand)
For more information about this error navigate to the report server on the local server machine, or enable remote errors
The dataset 'spTestSubDet' is the Oracle Stored Proc.
Some configuration details:-
Oracle Database 19c Standard Edition 2 Release 19.0.0.0.0 - Production
SSRS version is 15.0.19528.0.
SQL Server version is 2014.
I can execute SQL code and Views against the Oracle server with the same DSN from the deployed report (without the oracle stored proc being present), so I know the DSN configuration is not the issue.
I have also checked the "Use single transaction when processing the queries" box in the data source properties.
I’m guessing that it might be some form of “Execute” permissions issue on Oracle, rather than the Report Server, where the Stored Proc is concerned.
As a developer, I don’t have any DBA permissions to interrogate how the SSRS Server is set up, or the Oracle DB, so any suggestions will have to be passed on to my ICT dept.
I also can't enable "remote errors" on the Report Server, but have requested that with the ICT dept.
Any help greatly appreciated.
It seems I got lucky with enabling "Remote Errors" on the report server, and I didn't personally have to restart the service.
I now have a more explicit error message from the SSRS report:-
“ORA-06550: line 1, column 7: PLS-00306: wrong number or types of arguments in call to 'SPTESTSUBDET' ORA-06550: line 1, column 7: PL/SQL: Statement ignored”
As mentioned in my original post, the report works fine locally from VS2017, so I don't know why, when deployed and run from the server, it tells me there is a problem with the SQL code:-
CREATE OR REPLACE PROCEDURE SPTESTSUBDET
(s1 OUT SYS_REFCURSOR)
IS
BEGIN
  OPEN s1 FOR
    SELECT *
    FROM onemain.sbceysubmitted sbceysub
    WHERE sbceysub.STUD_ID = 167071;
END SPTESTSUBDET;
It’s as simple a test as I can put together and doesn’t use any parameters to complicate things.
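For what it's worth, a quick way to confirm the procedure itself runs outside SSRS is to call it from an anonymous PL/SQL block (a sketch; it assumes you are connected as a user with EXECUTE on the procedure and that it resolves without a schema prefix):
-- Bind the OUT ref cursor and call the procedure directly
DECLARE
  rc SYS_REFCURSOR;
BEGIN
  SPTESTSUBDET(rc);
END;
/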
I’m wondering if it might be a driver issue, though why it works locally and not on the server is baffling me.
I have Oracle Developer tools “ODAC v18.3.0” installed for VS2017.
The user in the post referenced below had what looks to be the same problem, but it's not clear which version of the ODAC tools was used to resolve the issue:-
https://stackoverflow.com/a/60569788/2053847
Any thoughts/help greatly appreciated.
The easiest thing to do is check the log files. I bet this is a SQL exception related either to the way you are calling the stored procedure or to the stored procedure itself. The log files reside on the SSRS instance under SQL Server install dir\MSSQL.15 (or the directory for your SSRS version)\Reporting Services\LogFiles; logs for both the SSRS manager and the SSRS service are saved there. Open the log for the SSRS service after you encounter the error, search for "spTestSubDet", and you should see the detail of the exception that is causing your problems.
This question is related to: Debezium How do I correctly register the SqlServer connector with Kafka Connect - connection refused
In Windows 10, I have Debezium running on an instance of Microsoft SQL Server that is outside of a Docker container. I am getting the following warning every 390 milliseconds:
No maximum LSN recorded in the database; please ensure that the SQL
Server Agent is running
[io.debezium.connector.sqlserver.SqlServerStreamingChangeEventSource]
I checked Debezium's code on GitHub, and the code comments at the only place I can find this warning state that it should only be logged if the Agent is not running. I have confirmed that the SQL Server Agent is running.
Why is this warning showing up and how do I fix it?
Note:
My current solution appears to only work in a non-production environment - per Docker's documentation.
The LSN is the piece of information that tracks your SQL Server changes. If you don't have an LSN, it is possible that CDC is not running or not configured properly. Debezium consumes LSNs to replicate, so your SQL Server needs to generate them.
Some approaches:
Did you check whether your tables have CDC enabled? This will list the tables with CDC enabled:
SELECT s.name AS Schema_Name, tb.name AS Table_Name
, tb.object_id, tb.type, tb.type_desc, tb.is_tracked_by_cdc
FROM sys.tables tb
INNER JOIN sys.schemas s on s.schema_id = tb.schema_id
WHERE tb.is_tracked_by_cdc = 1
Is CDC enabled and running on your database? (see here)
Check if enabled:
SELECT *
FROM sys.change_tracking_databases
WHERE database_id=DB_ID('MyDatabase')
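Note that sys.change_tracking_databases reports change tracking rather than CDC; a check aimed specifically at CDC would be something like:
-- is_cdc_enabled = 1 means CDC is enabled on the database
SELECT name, is_cdc_enabled
FROM sys.databases
WHERE name = 'MyDatabase';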
And enable it if it is not:
EXECUTE sys.sp_cdc_enable_db;
GO
Is the CDC capture job running on SQL Server? If not, start it (see the docs):
EXEC sys.sp_cdc_start_job;
GO
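To see whether the capture and cleanup jobs exist and how they are configured, you can also run:
-- Lists the CDC capture and cleanup jobs with their settings
EXEC sys.sp_cdc_help_jobs;
GO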
When enabling a table for CDC, I had some issues with the role name. In my case, setting it to NULL solved the problem (more details here):
EXEC sys.sp_cdc_enable_table
    @source_schema = N'dbo',
    @source_name = N'AD6010',
    @capture_instance = N'ZZZZ_AD6010',
    @role_name = NULL,
    @filegroup_name = N'CDC_DATA',
    @supports_net_changes = 1
GO
Adding more to William's answer.
If the SQL Server Agent is not running, you can start it as follows:
Control Panel >
Administrative Tools >
Click "Services"
Look for SQL Server Agent
Right-click and Start
Now you can run the CDC job queries against your MSSQL instance.
PS: you need login access to the Windows server.
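If you prefer checking from T-SQL instead of the Services console, something like this (it needs VIEW SERVER STATE permission) shows whether the Agent service is running:
-- Shows the SQL Server Agent service and its current state
SELECT servicename, status_desc
FROM sys.dm_server_services
WHERE servicename LIKE 'SQL Server Agent%';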
Another possibility for this error (I just ran into this warning myself this morning trying to bring a new DB online) is that the SQL login does not have the permissions needed. Debezium runs the following SQL. Check that the SQL login you are using has access to run this stored procedure and that it returns the tables you have set up for CDC. If you get an error or zero rows returned, work with your DBA to get the appropriate permissions set up.
EXEC sys.sp_cdc_help_change_data_capture
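If the login turns out to be missing rights, the fix will look roughly like this (a sketch only; the user, login, and database names are placeholders, and the exact grants depend on how CDC was set up, e.g. whether a gating role was specified):
USE MyDatabase;
GO
-- Map the Debezium login into the database and give it read access to the captured tables
CREATE USER debezium_user FOR LOGIN debezium_login;
EXEC sp_addrolemember N'db_datareader', N'debezium_user';
GO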
I am trying out the new PolyBase feature in SQL Server by connecting to a CSV file. However, I cannot connect to Azure Blob Storage:
CREATE EXTERNAL DATA SOURCE AzureBlob WITH (
TYPE = HADOOP,
LOCATION = 'wasbs://myfolder@myblob.blob.core.windows.net',
CREDENTIAL = mycredential
);
GO
I always get an error saying:
Incorrect syntax near 'HADOOP'
My SQL Server runs on an Azure VM; however, I am not sure which services are supposed to be running.
I also checked TCP/IP is enabled.
I also tried using SSDT and dsql-files as suggested in this post - but the error doesn't go away.
However I do not manage to connect to the Azure Blob Storage
Should it not be a Type=BLOB_STORAGE?
CREATE EXTERNAL DATA SOURCE AzureBlob WITH (
TYPE = BLOB_STORAGE,
LOCATION = 'wasbs://myfolder@myblob.blob.core.windows.net',
CREDENTIAL = mycredential
);
Update 2020-02-18:
I encountered the same famous message recently:
Incorrect syntax near 'HADOOP'
It can be fixed running:
exec sp_configure 'polybase enabled', 1;
GO
RECONFIGURE
Microsoft built a nice page: Configure PolyBase to access external data in Azure Blob Storage. However, they didn't include that important command.
I think it could also be the cause of the initial issue here.
While I accepted Alexander's answer, it turns out that the BLOB_STORAGE option doesn't allow creating external tables. The HADOOP option was the correct one for me. There were three steps I needed to take to make the HADOOP option work:
Re-install Java Runtime Environment
Repair the SQL Server Installation
Restart the Virtual Machine
Then the SQL-Statement from my question worked.
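For completeness, an external table over the CSV then looks something like this (a sketch; the column list and file path are placeholders, while AzureBlob and csvformat are the objects created in the statements above):
CREATE EXTERNAL TABLE dbo.MyCsvData (
    Col1 NVARCHAR(100),
    Col2 INT
)
WITH (
    LOCATION = '/mydata.csv',
    DATA_SOURCE = AzureBlob,
    FILE_FORMAT = csvformat
);
GO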
We are now in the process of moving all our production databases from a SQL Server 2005 32-bit instance to a brand new SQL Server 2012 64-bit instance.
One of the main hardships our developers still suffer from is linked servers.
We have a lot of programs that need to get data from text, CSV or Excel files, and the way it's implemented is with a linked server to text files, so you can easily throw a SELECT statement at the text file and insert the result into a table.
The problem is that the 32-bit server used the Microsoft.Jet.OLEDB.4.0 driver, the files were on a shared directory that had full permissions for everyone, and we never ran into security issues.
On the new 64 bit server we added a linked server with the following syntax:
USE [master]
GO
EXEC master.dbo.sp_addlinkedserver
@server = N'TEMP_FILES_1'
, @srvproduct=N''
, @provider=N'Microsoft.ACE.OLEDB.12.0'
, @datasrc=N'\\SERVER-APP01\BWA\TempFiles'
, @provstr=N'Text'
Note:
The data source is on a network share.
The MSSQL service runs as the domain administrator account.
I'm logged in remotely as the domain administrator, which is of course a local administrator too.
The \\SERVER-APP01\BWA\TempFiles directory has full access set for everyone.
Now when I run EXEC sp_testlinkedserver [TEMP_FILES_1] I get the following error message:
OLE DB provider "Microsoft.ACE.OLEDB.12.0" for linked server "TEMP_FILES_1" returned message "'\\SERVER-APP01\BWA\TempFiles' is not a valid path. Make sure that the path name is spelled correctly and that you are connected to the server on which the file resides.".
This is definitely a security issue, but the funny part is that when I run xp_cmdshell 'dir \\SERVER-APP01\BWA\TempFiles' it returns records, so obviously the service has access to this folder...
On the other hand, on my local computer I also have a 64-bit instance with the same linked server, and it works like a charm!
I've been crawling around the internet looking for a solution, but it seems linked servers to text files are rarely used, especially on 64-bit.
We had to copy the file to a drive on the destination SQL Server system before we could read the data as below:
EXEC xp_cmdshell 'net use y: \\[source directory path] [pw] /USER:[active user]'
EXEC xp_cmdshell 'copy y:\[source file] [destination directory]'
EXEC xp_cmdshell 'net use y: /delete'
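For example, against the share from the question (the drive letter, file name, local destination folder and credentials here are placeholders):
EXEC xp_cmdshell 'net use y: \\SERVER-APP01\BWA\TempFiles MyP@ssw0rd /USER:MYDOMAIN\svc_sql';
EXEC xp_cmdshell 'copy y:\somefile.csv D:\ImportFiles\';
EXEC xp_cmdshell 'net use y: /delete';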