I am trying to connect Snowflake to Matillion ETL using key-pair authentication.
I'm facing the error 'Default Database must not be empty'.
We have also set a default database on the Snowflake side.
In this environment, the Default Database dropdown does not populate. We even tried passing the default values manually, but then we get the error 'invalid JWT token'. Our key pair is correct, since we have tested it in another environment where it works.
When we establish connectivity from another of our environments, it succeeds, and we can see a list of options to select from in the Default Database dropdown (which is how it should ideally be).
The Default Database, Warehouse and Schema are all set in the 3rd dialog in Matillion's Environment setup window.
According to this Matillion guide to Connecting Matillion ETL to Snowflake, these dropdowns can show no selectable values if there are any problems with the previous (2nd) dialog.
If it is an account-level problem (incorrect account, or no network access), the Default Database dropdown will show a "Loading..." message for several seconds before rendering an empty list. I guess it tries to make network contact in the background and eventually times out. You will see this if you go back and forth between the 2nd and 3rd dialogs.
In contrast, if it is a user-level problem (bad credentials, or not enough privileges), the Default Database dropdown will be blank immediately when you enter the 3rd dialog.
According to this Matillion document on environments you can use either a password or a private key to authenticate into Snowflake. So - as per the comments - I agree that if you can connect to Snowflake using the same Account, username and private key using a different SQL client, then it should also work in Matillion ETL.
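If it does turn out to be a user-level problem, one quick check (a sketch only, using a hypothetical user name MATILLION_USER) is to confirm in a Snowflake worksheet that the public key is actually registered on the user and that the user's role can see a database and warehouse at all:
-- The RSA_PUBLIC_KEY_FP property should be non-empty and should match the
-- fingerprint of the private key Matillion is configured with.
DESC USER MATILLION_USER;
-- If the role Matillion connects with cannot see any database or warehouse,
-- the Default Database dropdown will have nothing to list.
SHOW GRANTS TO USER MATILLION_USER;
SHOW DATABASES;
SHOW WAREHOUSES;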
I have an application with Microsoft Access front end and SQL Server back end. The link is implemented via ODBC data source using SQL Server Native Client 11.0.
There is a table with a column Attachments of OLE Object data type. The back end is a table with VARBINARY(MAX) data type for the Attachments column.
I save files in the Attachment field using a Bound Object Frame. Everything works fine until the file size exceeds about 20 MB. The statement BoundOLEFrame.Action = acOLECreateEmbed takes about 2.5 minutes to complete. It does not throw any exceptions, but when the following MoveNext, or any other recordset re-positioning statement, is executed, it fails with run-time error 3426:
"The action was cancelled by an associated object."
As a result, the file does not appear to be stored in the database. An attempt to open the file through the Access UI by double-clicking the field causes the error:
"A problem occurred while Microsoft Access was communicating with the
OLE server or ActiveX Control. Close the OLE server and restart it
outside of Microsoft Access. Then try the original operation again in
Microsoft Access."
Suspecting that the issue could be related to the ODBC timeout, which by default is set to 60 seconds, I tried setting the current database's QueryTimeout to 600 seconds. But this did not help...
Inserting these large files directly into the table (in Access datasheet view, right-clicking the field and selecting Insert Object... from the pop-up menu) at first appears successful, because the file looks like it was inserted and can be opened by double-clicking the field. But when I try to close the table, I am prompted to save it. Answering in the affirmative leads to the following error:
"You can't save this record at this time. Microsoft Access may have
encountered an error while trying to save a record. If you close this
object now, the data changes you made will be lost. Do you want to
close the database object anyway?"
According to the Access specifications, the maximum size of an OLE Object field is 1 GB, which is well above the size of my files.
Any suggestions would be appreciated. I am looking for a way to resolve this particular problem; I don't think an alternative design for file storage is pertinent to the topic.
I do not have an option to store files any other way.
Well, the OLE Object field size limit may be 1 GB, but the overall file size limit for an Access database is 2 GB.
So if you have more than about 100 x 20 MB fields, there is going to be a problem.
It's unclear how many records are actually transferring to the front-end file. I would sanity test a very limited set of records to see whether it is the field size or the overall file size that is coming into play... perhaps that will give some insight (see the sketch below).
I can tell from your post that you DON'T want me to suggest putting those attached files in their own folder and just storing the link - but hey - it is the better design.....
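For that sanity test, one quick way to see how much binary data is actually in play on the SQL Server side (a sketch, assuming the back-end table is named dbo.tblDocuments and the column is Attachments):
-- Row count plus total and largest stored attachment size, in MB.
SELECT COUNT(*) AS RecordCount,
       SUM(DATALENGTH(Attachments)) / 1048576.0 AS TotalMB,
       MAX(DATALENGTH(Attachments)) / 1048576.0 AS LargestMB
FROM dbo.tblDocuments;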
I'm facing an issue with a linked table, an Access DB, and an Excel report. Here is the scenario; please suggest a solution if you are aware of one:
1. The admin user creates a linked table in the Access DB. This table is open 24/7. Data is inserted into the DB via the linked table/worksheet.
2. The end user generates the report from the Access DB.
3. If the admin user makes any data update and the end user then refreshes the report, the linked worksheet changes to "Read-Only".
Note:
The linked worksheet and Access DB are stored on Server-1.
End users may be connected to Server-1 or to a different server.
This same scenario worked fine in 2007, but it's causing issues after migration to 2013.
Thanks in advance.
You can install the Power Query add-in and select the option "From Database" => "From Microsoft Access Database" to import the data from Access into Excel (to generate the report for end users).
Download Power Query from the link below:
https://www.excelcampus.com/install-power-query/
Note:
This feature is already available in Excel 2016.
Your post is not clear. Let's review:
Admin user, creates Linked Table in Access DB. This table is open 24/7. Data inserted to DB via Linked Table/Worksheet
*** Do you mean that the Access DB is linking to an Excel file? If so, what is meant by "inserted to DB"? There are two choices: link to the Excel file, or import the Excel data into an Access table.
End User, generates the report from Access DB
*** ok
If the admin user does any data update and when the end user refreshes the report, the Linked worksheet changes to "Read-Only"
*** Is the admin changing data in the Excel sheet?
*** In general, Access does not like the Excel file being open while Access is communicating with it, i.e. while generating a report. It wants exclusive use. There should be no problem working in Excel when the Access app is closed.
I have just deployed my project onto my reporting server.
I have multiple datasets which reference views that exist in the database on that server.
When I try to go into any report part I am getting this message:
An error has occurred during report processing. (rsProcessingAborted)
Query execution failed for dataset 'dataset1'. (rsErrorExecutingCommand)
For more information about this error navigate to the report server on the local server machine, or enable remote errors
Can anyone help?
I enabled remote errors to pinpoint the problem.
I identified that a column in a particular dataset (one of my views) was throwing an error.
So using a tool, "SQL Delta", I compared the development version of the database with the live version on the reporting server. I noticed that one of the views had an extra column on the development server that was not in the live version of the db.
SQL Delta generated the script I needed to run to update the view on my live db.
I ran this script, re-ran the report, everything worked.
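For anyone who needs to enable remote errors first: besides the supported route (Management Studio > report server properties > Advanced > EnableRemoteErrors), a commonly used shortcut is to flip the flag directly in the SSRS catalog database and restart the Reporting Services service. This is only a sketch; verify it against your SSRS version, and note the catalog database is usually, but not always, named ReportServer.
USE ReportServer;
-- Turn on detailed error messages for remote clients.
UPDATE dbo.ConfigurationInfo
SET    Value = 'True'
WHERE  Name  = 'EnableRemoteErrors';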
I encountered a similar error message. I was able to fix it without enabling remote errors.
In Report Builder 3.0, when I used the Run button to run the report, an error alert appeared, saying
An error has occurred during report processing. (rsProcessingAborted)
[OK] [Details...]
Pressing the details button gave me a text box where I saw this text:
For more information about this error navigate to the report server
on the local server machine, or enable remote errors
----------------------------
Query execution failed for dataset 'DataSet1'. (rsErrorExecutingCommand)
I was confused and frustrated, because my report did not have a dataset named 'DataSet1'. I even opened the .rdl file in a text editor to be sure. After a while, I noticed that there was more text in the text box below what I could read. The full error message was:
For more information about this error navigate to the report server
on the local server machine, or enable remote errors
----------------------------
Query execution failed for dataset 'DataSet1'. (rsErrorExecutingCommand)
----------------------------
The execution failed for the shared data set 'CustomerDetailsDataSet'.
(rsDataSetExecutionError)
----------------------------
An error has occurred during report processing. (rsProcessingAborted)
I did have a shared dataset named 'CustomerDetailsDataSet'. I opened the query (which was a full SQL query entered in text mode) in SQL Server Management Studio, and ran it there. I got error messages which clearly pointed to a certain table, where a column I had been using had been renamed and changed.
From that point, it was straightforward to modify my query so that it worked with the new column, then paste that modification into the shared dataset 'CustomerDetailsDataSet', and then nudge the report in Report Builder to recognise the change to the shared dataset.
After this fix, my reports no longer triggered this error.
Like many others here, I had the same error. In my case it was because the execute permission was denied on a stored procedure it used. It was resolved when the user associated with the data source was given that permission.
I experienced the same issue; it was related to security not being granted on some of the tables. Review whether your user has access to the databases/tables/views/functions etc. used by the report.
The solution for me came from GShenanigan:
You'll need to check out your log files on the SSRS server for more detail. They'll be somewhere like: "C:\Program Files (x86)\Microsoft SQL Server\MSRS10_50.DEV\Reporting Services\LogFiles\"
I was able to find a permissions problem on a database table referenced by the view, in a different database from the one the view itself was in. I had been focused on permissions in the view's database, so this helped pinpoint where the error was.
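In case it helps someone in the same spot, the fix amounts to granting the report user read access in that other database as well; a minimal sketch, with a hypothetical user ReportUser and underlying table OtherDb.dbo.SourceTable:
USE OtherDb;
-- The view lives in a different database, so SELECT must be granted here too.
GRANT SELECT ON dbo.SourceTable TO [ReportUser];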
I just dealt with this same issue. Make sure your query lists the full source name, with no shortcuts. Visual Studio can recognize the shortcuts, but your Reporting Services application may not be able to work out which tables your data should be coming from. Hope that helps.
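To illustrate what is meant by the full source name (the names below are made up), qualify the table explicitly instead of relying on a default schema or database:
-- Shortcut that may resolve in Visual Studio but not on the report server:
SELECT CustomerID, CustomerName FROM Customers;
-- Fully qualified version:
SELECT CustomerID, CustomerName FROM SalesDb.dbo.Customers;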
I had a similar issue, showing the error
For more information about this error navigate to the report server on
the local server machine, or enable remote errors Query execution
failed for dataset 'PrintInvoice'.
Solution:
1) The error may be with the dataset in some cases. You can always check whether the dataset is returning the exact data you expect by going to the dataset properties, choosing 'Query Designer', and trying 'Run'. If you can successfully pull the fields you are expecting, then you can be sure there isn't any problem with the dataset, which takes us to the next solution.
2) Even though the error message says query execution failed for the dataset, another probable cause is the data source connection. Make sure you have connected to the correct data source, that it contains the tables you need, and that you have permission to access that data source.
In my situation, I created a new SSRS report and a new stored procedure for the dataset. I forgot to add the stored procedure to the database role that had permission to execute it. Once I granted EXECUTE on the procedure to that SQL database role, all was fine!
The error message encountered by the user was "An error occurred during client rendering. An error has occurred during report processing (rsProcessingAborted). Query execution failed for dataset 'DataSet1'. (rsErrorExecutingCommand) For more information..."
Very grateful I found this great post. As for my case, the user executing the stored procedure did not have EXECUTE permission. The solution was to grant EXECUTE permission for the user by adding the code below to the end of the stored procedure script.
GRANT EXECUTE ON dbo.StoredProcNameHere TO UsernameRunningreports
GO
I also had a very similar issue with a very similar error message. My issue was that the database could not be connected to. In our case, we have mirrored databases and the connection string did not specify the Failover Partner. So when the database couldn't connect, it never went to the mirror and was throwing this error. Once I specified the Failover Partner in the connection string for my datasource, it resolved the issue.
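For reference, the only change needed in my case was adding the Failover Partner keyword to the data source's connection string; roughly like this (server and database names are placeholders):
Data Source=PrimaryServer;Failover Partner=MirrorServer;Initial Catalog=ReportDb;Integrated Security=SSPI;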
BIGHAP: A SIMPLE WORKAROUND FOR THIS ISSUE.
I ran into the same problem when working with SharePoint lists as the data source, and read the blogs above, which were very helpful. I had made changes to both the data source and dataset object names and query fields in Visual Studio, and the query worked in Visual Studio. I was able to deploy the report to SharePoint, but when I tried to open it I received the same error.
I guessed that the issue was that I needed to redeploy both the DataSource and the DataSet to SharePoint so that the changes in the rendering tools were all synced.
I redeployed the DataSource, the DataSet and the report to SharePoint, and it worked.
As one of the blogs stated, although Visual Studio allowed the changes I made in the dataset and data source, this error can occur if you have not set Visual Studio to automatically redeploy the data source and dataset when you deploy the report (which can be dangerous, because it can affect other reports that share these objects).
So, of course, the fix in this case is to redeploy the data source, the dataset and the report to resolve the issue.
I was also facing the same issue. I checked the things below to fix it:
If you have recently changed the database name that the data source points to, then first check that all the stored procedures for that report exist in the changed database.
If there are multiple subreports on the main report, then make sure each report runs perfectly on its own.
Also check the security panel - the user must have access to the databases/tables/views/functions for that report.
Sometimes we also need to check the dataset1 stored procedure. If you are trying to show the report as user1, and this user doesn't have access rights to the database used by dataset1, then it will throw the same error as above, so make sure the user has db_datareader access in SQL Server.
Also, if that stored procedure references some other database (Database2), like
SELECT * FROM XYZ INNER JOIN Database2..Table1 ON ... WHERE ...
then the user must have access to this database too.
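A sketch of what that looks like in T-SQL (the user name is hypothetical): give the reporting user read access in the second database as well.
USE Database2;
-- Create the user in Database2 (skip this line if the user already exists there),
-- then add read access via the built-in db_datareader role.
CREATE USER [ReportUser] FOR LOGIN [ReportUser];
ALTER ROLE db_datareader ADD MEMBER [ReportUser];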
Note: you can check the log files at this path for more details:
C:\Program Files\Microsoft SQL Server\MSRS11.SQLEXPRESS\Reporting Services
I got the same error, but this worked and solved my problem:
If the report is connected to an Analysis Services server, then give the required permissions to the user (who is accessing the report server to view the reports) in your Analysis Services model.
To do this, add the user to the roles of the model or cube and deploy the model to your Analysis Services server.
Using SSRS, Report Builder 3.0, MSSQL 2008 and a query to an Oracle 11g database:
I found that the Oracle stored procedure ran well and produced consistent results with no errors. When I tried bringing the data into SSRS, I got the error listed in the OP's question. I found that the data loaded and displayed only if I removed the parameters (not a good idea).
On further examination, I found that under dataset properties > parameters I had set the parameter name for the start date to P_Start and the parameter value to #P_Start.
Entering the parameter value as [#P_Start] cleared the problem, and the data loads well with the parameters in place.
This problem was caused by an orphaned SQL Login. I ran my favorite sp_fixusers script and the error was resolved. The suggestion above to look at the logs was a good one...and it led me to my answer.
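If you don't have such a script handy, orphaned users can be listed and re-linked with a couple of standard commands; a sketch, assuming the database user and the SQL login are both named ReportUser:
-- List users in the current database whose SIDs no longer match a server login.
EXEC sp_change_users_login 'Report';
-- Re-link the database user to the login of the same name.
ALTER USER [ReportUser] WITH LOGIN = [ReportUser];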
This might be a permissions issue with your view or stored procedure.
In addition to the above answers, it could be due to a missing SQL stored procedure or SQL function. For example, the function may not have been migrated from a non-prod region to the production (prod) region.
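A quick way to confirm this (the object names below are placeholders) is to check whether the procedure or function actually exists in the production database:
-- Returns NULL for any object that was never migrated to this database.
SELECT OBJECT_ID('dbo.usp_ReportData')  AS ProcId,
       OBJECT_ID('dbo.fn_ReportHelper') AS FuncId;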
Removing all comments from the SELECT query fixed this for me. My dataset was working in the Preview, but when I went to Design / Query Designer and tried the query there, I was getting ORA-01006: bind variable does not exist. After removing all comments from the SELECT, it worked.
I am having to modify an old web project that uses classic ASP. There are actually two different projects that are clones of each other; they just point to different databases.
I modified the code from the first project (asp, db, stored procs etc.) and it all works great.
I then copied all that code to the other project since they are clones. All works just fine there too. I can execute the stored procs in query analyzer and all the data comes back as expected and it shows up on the display asp pages.
When I hit the edit button on the page, I get "Microsoft OLE DB Provider for SQL Server error '80040e09'", and it shows the SELECT part of the query in the error window.
I don't get anything about permissions, etc. If I view the page source, the data is actually in there. I am really confused as to what is going on.
Does anyone have any suggestions or things to look for?
Thanks
This appears to be a permissions error based on the usual meanings of this error code.
I would manually log in to the database using the same credentials you have configured in your application's connection string. Then run the same query and see what happens.
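If you have sysadmin access, you can also impersonate the application's login directly in Management Studio and run the offending statement; a sketch, with the login name and procedure made up:
EXECUTE AS LOGIN = 'web_app_login';
-- Run the same SELECT or stored procedure the ASP page uses; a permissions
-- problem will now surface as an explicit error for that login.
EXEC dbo.usp_GetEditData;
REVERT;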