QGIS can't load and edit MSSQL layers anymore

I have QGIS 3.4 Madeira LTR connected to my Microsoft SQL Server (managed through SQL Server Management Studio 17). I have a lot of data on the SQL Server, and since the start of the new year I can't edit my layers in QGIS anymore. I can load the data, but it doesn't visualize and I can't zoom to the Layer Extent (in Options it says Extent = Empty, but that's not true, because I checked the tables on the server and they are structured like before). The weird thing is that when I load a layer from my hard drive, everything works just fine. The layers loaded from my SQL Server show up, but I can't open the attribute table or select features. In some cases I'm able to open the attribute table, but it only shows one entry (no filters activated).

I was thinking that something is wrong with the geometry or the CRS, but I did not update the software or change anything in the SQL tables. QGIS even crashes when trying to open attribute tables. It gave me the option to "try to repair the map document", but after trying it the connected SQL table disappeared from the server, although it is still visible in the MSSQL drop-down menu on the left (the data on the SQL Server is definitely gone). Also weird: saved map documents show the data when I open them, but when I add a new SQL layer the data doesn't show up. I would really appreciate some help.
I checked the SQL tables to see if maybe some primary keys or the geometry column were missing. I checked my update history, but nothing was updated. I'm a bit lost about where to start and scared of losing more of my data.
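For anyone wanting to reproduce the server-side check, something like the following T-SQL sketch can confirm whether the geometry column still has rows, valid shapes and a consistent SRID (dbo.MyLayerTable and geom are placeholder names for your own table and geometry column):

SELECT COUNT(*) AS total_rows,
       COUNT(geom) AS rows_with_geometry,
       MIN(geom.STSrid) AS min_srid,
       MAX(geom.STSrid) AS max_srid,
       SUM(CASE WHEN geom.STIsValid() = 0 THEN 1 ELSE 0 END) AS invalid_geometries
FROM dbo.MyLayerTable;

If rows_with_geometry is 0, the SRIDs are inconsistent, or invalid_geometries is non-zero, that would go a long way towards explaining the empty extent QGIS reports.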

Related

How can I update existing SQL datasources (Live) to connect to correct views (reconnect) in Tableau Desktop?

I have one question. I am new to Tableau and I have to update existing SQL data sources (live data sources) to connect to the correct views (reconnect) in Tableau Desktop, because of some mappings/new data that were added to the SQL database.
For the same reasons, I also have to create new Tableau data sources.
How do I update existing SQL data sources and create new ones? Also, is it possible to create new data sources from Tableau Server, or only from Tableau Desktop?
Thanks.
I believe Tableau only allows you to create data sources in the desktop environment.
If your SQL source is using a SQL statement you can alter your query to bring in the new data and then refresh the extract.
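For example, if the connection is built on Custom SQL, the edit can be as small as pointing the statement at the corrected view and refreshing (view names here are made up for illustration):

SELECT * FROM dbo.vw_Sales_Old      -- current statement
SELECT * FROM dbo.vw_Sales_Current  -- updated statement pointing at the new view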
It's worth noting that the desktop extract is no longer used once you've published your workbook to the server (until you need to make more changes in the desktop environment).
I find making changes on Tableau server limited, so it's almost always best to make significant changes on desktop.
Also, if you're changing to new data sources, once you've loaded the data into Tableau, in the Sheet view you should be able to right click on one source and choose to point your views to the new source.
It all depends on the desired result.

Import SQL server database without empty columns

I'm importing a set of tables from a SQL Server DB to Power BI Desktop.
It's a huge database and I'm selecting only some of its tables.
However, these tables have empty columns or columns with zeros only, so I've created two M functions to apply to each and every table to "clean" them.
Is it possible to specify a SQL command in the import settings to fetch only the "valid" columns, so I avoid loading the empty ones into Power BI and don't need the custom functions?
Can anyone please share such a SQL command?
Thank you!
You can specify an initial SQL query in the connection object. If you click on the settings for it and open the advanced section, you can enter it there (screenshot is from the MySQL connector but I'm sure the SQL Server one is similar).
The output code would look something like
SQL.Database("sql.server.address:1234", "your_sql_server", [Query="SELECT columns, you, want FROM table_you_want"])
However note that if you are using the native connector in Power Query/BI, it attempts to convert whatever you do in Power Query into SQL on the back-end. If you right click on steps in your PQ process and click "Native Query" it will show you the SQL code it is converting to. If at some point that is greyed out it means whatever you were doing was something it couldn't convert.
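As for a SQL command that helps decide which columns are worth importing, a per-table profiling query is one option. This is only a sketch with made-up table and column names (dbo.Sales, Amount, Comments); COUNT skips NULLs, so any column whose count comes back as 0 (or whose non-zero count is 0) can simply be left out of the SELECT you paste into the connector:

SELECT COUNT(*) AS total_rows,
       COUNT(Amount) AS amount_non_null,
       SUM(CASE WHEN Amount <> 0 THEN 1 ELSE 0 END) AS amount_non_zero,
       COUNT(Comments) AS comments_non_null
FROM dbo.Sales;

The final import query then lists only the surviving columns, exactly like the SELECT inside the SQL.Database call above.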

Element X in the DataSet references an object missing from the Database

When I first created my app, I created a database using Microsoft SQL Server Management Studio and connected my app to it.
I created another DB with the same tables and everything, but with different names, and I let my app connect to the second one because I want to make some changes. When I try to edit my DataSet with the wizard, I get this tables page:
As you can see, my app couldn't find the right tables, and when I try to select the LastWork table as in the picture, it makes the table name in the DataSet LastWork1.
How can I fix this problem and let it find the right tables?
I've seen this problem when using copies of databases as well, after pointing to a different connection in the settings area of the project properties. The XSD evidently hard codes each DbObjectName with the name of the database and schema in use at design time. One approach to fixing it is to open the wizard for the appropriate dataset, uncheck the red-x objects with the missing references, close the wizard, then re-open it and re-select the objects that are needed. This is not ideal in a large xsd if many findby queries, custom columns, etc. have been added. So an alternative is to do a find and replace on the database name within the XSD itself.
Interestingly, my experience has been that an application runs fine when the connection string points to a differently named but otherwise identical database.

How can I change the source of a Data Source View and the Report Models based on it to a different database?

I have a number of reports deployed to a SQL Server 2005 Reporting Services server. They were all developed using the same Report Model (SDML) that references the same Data Source View (DSV) that points to a test database filled with mostly dummy data. Now, I would like to make those reports pull data from the live database with our real data instead. The two databases have exactly the same structure.
It seems to me, that if I could just change the Data Source being referenced in the Data Source View, then I could redeploy the report model, and all the reports based on it would also reference the correct data. I can see in Business Intelligence Development Studio 2005 that there's an option in the Data Source View property list in Design mode to change the Data Source. So I changed the Data Source, thinking that would work. However, when I try to redeploy the report model after changing the Data Source in the Data Source View, I get a number of error messages like this one:
Error 1 The Table property of the Entity 'Address' refers to the Table 'dbo_address', which is not in the primary data source. Events.smdl 0 0
Is there something else I need to be doing here? Something in the Report Model or Data Source View that should be updated? Is there another way to do what I need to?
Edit 1:
I tried changing the datasource of the report model on the server after the reports were deployed, and that seemed to work pretty well. It's not exactly what I wanted to do, but it works. Thanks everyone.
The strategy that has worked best for me is to deploy the "test" shared datasource to the server then edit it via the Report Manager interface to point to the "production" database (changing the connection string). Making sure of course Overwrite Datasources is set to false on deploy.
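In practice that edit is just swapping the connection string on the shared data source in Report Manager, for example from something like (server and database names below are placeholders)
Data Source=TESTSQL01;Initial Catalog=Reports_Test;Integrated Security=SSPI
to
Data Source=PRODSQL01;Initial Catalog=Reports_Live;Integrated Security=SSPI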
Also, your database schema must be the same in test as it is in production.
I don't have as much experience with the report models but generally SSRS doesn't like it when you make changes to the datasource and asks you to refresh all the datasets that you have if you do.
Alternatively, just change the datasource definition on the report server itself.
It sounds like you're changing the data source that the DSV references. Instead, why don't you try to change the connection string of the data source? Internally the DSV uses GUIDs to identify the various tables and fields; I suspect that by creating a new data source the GUIDs will change, and that is why you're seeing these error messages (as the error message is mapping the internally used GUID to its "friendly name").
The setup I use has an identically named Data Source (.rds) file for each environment, in the same folder the reports are deployed to. It's just a connection string...
My experience has been the same as zalzaw's - if you change the Data Source, you have to refresh all the datasets associated with the report while pointing at the new environment based on the data source changes. It's very tedious - you go to the Data tab for the report in Business Intelligence Development Studio 2005:
Select a Dataset from the dropdown menu
Click the Refresh button (2nd to the right of the Dataset dropdown, icon looks like recycle)
Repeat steps until all datasets have been refreshed.
Make sure that the database(s) (and stored procedures) are in sync. It's all for naught if a table exists in Dev but not in Test or Prod...

Migrate Access 2007 Database Application to SQL Server 2005 using SSMA - Issues

I have managed to get SQL Server 2005 Express up and running on my computer OK in order to do some testing before trying this in the "real world". I have a fairly large MS Access 2007 database application I need to migrate to SQL Server, retaining the "front end" as the user interface. (The app is already a "split" database with a front end and a back end....)

I have done some initial testing on using SSMA to migrate my Access database to SQL Server Express. Clearly I don't understand some things, and I thought I'd see if anyone has any ideas.

Conceptually, I thought that what needed to happen was that the back end of the database that resides on the server needed to be migrated to SQL Server, and then the front end re-linked to the (now linked to SQL) tables in the back end. When I do this using SSMA, I end up with renamed tables in the back-end Access file that look something like "SSMA$myTableNameHere$local". I also get the original table names underneath, showing as ODBC linked tables. So far so good.

BUT.... when I go to re-establish the linked tables from the FRONT END (the user interface), all I can see is the "SSMA$myTableNameHere$local" names, NOT the original table names (now linked via ODBC). I can link to the "SSMA...." tables, but it would mean changing the names of every table in every query, on every form and in all code on the front end! Not something I really want to do.

SO.... I thought I'd try to migrate the FRONT END and see what happens. What I ended up with is a situation where, basically, it works (there are some serious errors and issues that I haven't even looked at yet... like missing data etc.!!!!) and I still get the "SSMA$myTableNameHere$local" tables and the ODBC linked tables with the original names. I'm trying to understand...... does this mean that we would do the migration on the front end and then just copy the same file to each user's computer?

Another subject I'm a little confused about is that I can't link via ODBC to SQL Server Express on the local machine (i.e. my computer), so I can't test migrating the back end and then linking to the tables via the front end as I have in the past in more of a client/server situation.
Assuming that SSMA replaces the tables in your back end with links to the SQL Server, all you need to do is delete the original table links in your front end and import the newly-created table links from the back end. You can then discard the back end, since it's not used for anything at all any longer.
I transferred all my tables one by one to SQL Server 2005 from the Access DB back end using ODBC. Instructions:
Open the Access DB (back end)
Right-click on the table you need to transfer
Scroll down the drop-down box and select ODBC Databases
The Select Data Source dialog box opens; click the "New" button
The Create New Data Source dialog box opens
Scroll to the bottom and select SQL Server, click Next
Give your data source a name, click Next, click Finish
The Create a New Data Source to SQL Server dialog opens
Give some description or leave it empty, and type the name of your SQL Server (you named it when you installed SQL Server on your machine)
Click Next, click Next
Check the "Change the default database to" check box
Select the DB you want your data transferred to
Click Next, click Finish
NOTE: You need to create a new (empty) DB on SQL Server before doing all this (see the one-line example below, after these notes)
Now: right-click any table, select Export, select ODBC from the drop-down list, select the data source you created from the Data Sources window, and click OK
Use SQL Server with SQL Server Management Studio Express.
All dates must have an input mask; all Text and Memo fields must have Allow Zero Length = Yes.
After all this, disconnect all the links from the Access back end and establish links from SQL. RENAME all newly linked tables to the old names. Use the front-end user interface until you build something new.
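Regarding the NOTE about creating the empty DB first: in Management Studio Express that is a single statement (the database name is only an example):

CREATE DATABASE AccessBackEnd;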
Forgive my lack of knowledge of Acronym Soup, but I assume SSMA is the SQL Server 2005 "import data wizard" or the wizard in Access to send the data to SQL Server. It appears that you sent the data to SQL Server from Access - something you don't want to do. You want to use the DTS in SQL Server (now called SSIS or something?) to import the data into SQL Server. Then you'll have your tables in SQL Server. Then, simply create your DSN entry for the SQL Server and re-link your tables. All should be well.
Overall, the general rule is to import Access tables using SQL Server instead of using Access to send the data to SQL Server.
I'd bite the bullet and rename the tables on the SQLServer side back to the friendly names that you had in the original database. You'll probably have less problems. Especially if you have any embedded code the MS Access side.
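If the prefixed names really did end up on the SQL Server side, the rename itself is straightforward. A hedged sketch, assuming a table landed as SSMA$Customers$local and should simply be Customers again:

EXEC sp_rename 'dbo.SSMA$Customers$local', 'Customers';  -- repeat for each table, then re-link from the front end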
As far as how you will deploy the MS Access side now, it should be pretty much create the ODBC link on the user's workstation, and copy the MS Access file to their desktop (although you might want to make an MDE (or the 2007 equivalent) to prevent them from accidentally breaking it).
Frankly, now that you have migrated, you need to look at the design of your tables. It is my experience that the wizards for Access migration do a poor job of selecting the correct datatype. For instance, if you had a memo field, you might easily get away with a varchar field instead, but the last wizard I used (an earlier version) always converted them to text fields. Now would also be the time to consider some fixes, such as making date fields datetime instead of character-based, if you have had that mistake in the past.
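For example, fixing those two cases after the migration might look like this (table and column names are placeholders, and you'd want to verify the data converts cleanly first):

-- memo field the wizard turned into text: make it nvarchar(max) instead
ALTER TABLE dbo.Orders ALTER COLUMN Notes nvarchar(max) NULL;
-- character-based date column: convert it to a real datetime
ALTER TABLE dbo.Orders ALTER COLUMN OrderDate datetime NULL;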
I would never consider using a wizard again to do data migration myself having experienced how very badly they can do it.
You will also find that just converting the data to SQL Server is often not enough to really get any performance benefit. You will need to test all the queries and consider whether you can convert the slow ones to stored procs instead. Eliminating the translation from Jet SQL to T-SQL can bring performance improvements. Plus there are many features of T-SQL that can improve performance that do not have Access equivalents. Access is not big on performance tuning, but to get the benefit of performance tuning with a SQL Server back end, you need to have SQL Server-specific queries written. Indexing also needs to be considered if the Access tables were not indexed properly.
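A minimal sketch of both suggestions, using made-up table and column names: a simple parameterised stored procedure the front end can call instead of a linked-table query, plus an index on the column it filters on:

CREATE PROCEDURE dbo.usp_GetOrdersByCustomer
    @CustomerID int
AS
BEGIN
    SET NOCOUNT ON;
    SELECT OrderID, OrderDate, Amount
    FROM dbo.Orders
    WHERE CustomerID = @CustomerID;
END;
GO
CREATE INDEX IX_Orders_CustomerID ON dbo.Orders (CustomerID);

Calling the proc from a pass-through query means the Jet-to-T-SQL translation mentioned above no longer happens for that query.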
Using SSMA is different from using ODBC directly. If you have an application that is entirely Access (back end and front end), you can easily manipulate objects, bind forms, use DAO, etc. without problems. When you need to migrate the database to SQL Server, you can either link the tables to SQL Server yourself via ODBC or use SSMA; the main problem is how to preserve the bound forms, queries and code on the client side.
If you use ODBC directly, you must relink all objects yourself and change code, but if you use SSMA you don't have to do anything; you can continue to work as you did before. The problem with SSMA is how to deploy the front end to the clients if you developed the client side somewhere else, using another SQL server.
