As a .NET Desktop developer, I have a lot of experience working with various databases that are already up and running, but I'm not a DBA. I'm currently working at a company where I am the only software guy, here to build them software from scratch -- their previous enterprise-level solution was an Access database with macros and a couple of forms built in. So, I basically have no one else to go to.
With that preface, how the heck do I get a database -- ANY DATABASE!!! -- added to my VS solution? I've been beating my head against this for almost 6 hours and have made zero headway. At this point, I'm ready to say, "Screw MS databases!" and start looking at MySQL or PostgreSQL or something.
The desktop application I'm developing has to work whether there is an internet connection or not, so I need a local database that installs with ClickOnce. From what I've found so far:
SQL Server [Express] 2016+ requires Windows 8 or later (a non-starter since 95% of our customers are still running Windows 7)
SQL Server Compact is being deprecated and won't work past VS2013
I think LocalDB is what replaces Compact in 2016+ (?)
Okay, so I started with this tutorial: https://learn.microsoft.com/en-us/visualstudio/data-tools/create-a-sql-database-by-using-a-designer. However, trying to add a "Service-based Database" just gives me this error once: "The 'DBProviderFactories' section can only appear once per config file." If I try again, I get this error repeatedly: "Unable to find DbProviderFactory for type System.Data.SqlClientConnection". I've Googled both errors, and all the answers I've been able to find pertain to VS2010 or earlier; their solutions are either not applicable or don't work.
Next, I tried this tutorial: https://msdn.microsoft.com/en-us/library/aa983322.aspx. I've tried adding new data connections through the "Server Explorer" panel. I don't see "[*] Compact" as an option. When I try "Microsoft SQL Server Database File", I just get the error: "Unable to find the requested .Net Framework Data Provider. It may not be installed."
I've even tried adding data sources through the "Data Sources" panel; that doesn't work either.
I've installed the "Data storage and development" addon from the Visual Studio Installer, several versions of SQL Server 2014, SQL Server Compact 4.0, and maybe a few other executables from Microsoft's website.
Nothing works.
Help...
I think I just found it!
Evidently, there are several "machine.config" files on your computer (one per .NET Framework version and bitness). Search for them all, and make sure each one contains only a single "DbProviderFactories" section. I can add a database object now. Hopefully, this puts me in business...
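For reference (not from the original post), the relevant block in machine.config should end up looking roughly like this - a single DbProviderFactories element under system.data, with any duplicate removed; the exact provider entries vary by machine:
<system.data>
  <DbProviderFactories>
    <add name="SqlClient Data Provider"
         invariant="System.Data.SqlClient"
         description=".Net Framework Data Provider for SqlServer"
         type="System.Data.SqlClient.SqlClientFactory, System.Data, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089" />
    <!-- ...other provider entries... -->
  </DbProviderFactories>
  <!-- a second, usually empty, <DbProviderFactories/> element here is what triggers the
       "can only appear once per config file" error - delete it -->
</system.data>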
https://social.msdn.microsoft.com/Forums/vstudio/en-US/7b4f353b-77fd-427c-976b-5968abc88c13/visual-studio-2010-unable-to-find-the-requested-net-framework-data-provider-for-sql?forum=vseditor
If what you are saying is that you are writing a browser-based application, then one would migrate the tables to SQL Server (Express) or even MySQL - it really doesn't matter. Then write a new web app. The existing Access app would serve as a model for features & screen layout but is otherwise not portable.
On the other hand, if you are re-writing a Windows application, then the decision is whether the payload requires a server solution or whether one can stay at the PC level. If the payload is suitable for a PC, then a re-write using either Visual Studio or Access again works.
Access is a front-end db - the tables in the back end, whether they are stored in SQL Server or an Access file, are entirely passive. All the processing is done by the user's PC. If the payload allows that, then this is the lowest-cost re-write option.
If you've outgrown a PC-level payload, then one must develop a back-end database feature set with a more passive front end.
I have designed an SSIS project and deployed it to SQL Server, and I also created a job to run it on a daily basis, but it gives me this error when executed as a job (it doesn't give any error within VS):
There is a CLSID in this error message, but there is no application associated with it in
--> Component Services -> Computers -> My Computer -> DCOM Config
However, this CLSID is registered in the Registry Editor.
About the particular task on which this error is occurring: it is a script task which modifies and deletes the unwanted rows from the Excel file into which I am trying to write SQL table data.
Script task code looks like this:
I have been working for hours now trying to fix this problem, but with no success. Kindly guide me on how I can fix this issue. If any other information related to this project is required, please let me know.
Doing Excel automation in a SQL Server agent job is totally unsupported and probably won't work.
To have even a ghost of a chance of making this work you'll need to run a real desktop session on the server and automate Excel in that. Excel expects a real user to be logged in with a full profile. And Excel has failure conditions where it displays a popup window, which you'll need to be able to access via remote desktop.
You can read and write Excel files on a server with the OpenXML SDK, without actually having to run Excel. There's also a wrapper library called ClosedXML which you may find easier to use than using OpenXML directly.
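The original script task code isn't shown in the question, so purely as a sketch of the ClosedXML approach - the file path and the 2-to-11 row range are assumptions, not taken from the post:
' Requires the ClosedXML NuGet package; no Excel installation is needed on the server.
Imports ClosedXML.Excel

Module RemoveRowsSketch
    Sub Main()
        ' Hypothetical path to the workbook the package writes to.
        Using wb As New XLWorkbook("C:\Data\Report.xlsx")
            Dim ws = wb.Worksheet(1)
            ' Delete the unwanted rows (2 through 11 here), bottom-up so the
            ' remaining row numbers don't shift while deleting.
            For rowNumber As Integer = 11 To 2 Step -1
                ws.Row(rowNumber).Delete()
            Next
            wb.Save()
        End Using
    End Sub
End Module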
tl;dr;
You need to install Office (Excel) on the server AND ensure that you install it in a manner that mirrors the SQL Agent's expected bitness. The default for Agent is going to be 64-bit; the default for Office is still 32-bit :(
Error guessing
You have a script task that uses the Office interop libraries to delete some rows (2 through 11?) out of a spreadsheet.
You have Office installed on your machine and therefore you have the interop libraries installed. Excel still has COM-based "stuff" in it, hence the interop and the errors shrieking about the CLSID, registry, etc., but those are likely just secondary errors because there is no base "application is not installed" exception to be thrown.
If Office is installed, then ensure your agent execution model matches the version of Office. If 32-bit Excel is already installed, don't potentially break everyone else's stuff by uninstalling and reinstalling as 64-bit; just go to the Advanced section of the SQL Agent job step and check the 32-bit box.
Once all that's done, if you're still getting errors (but new ones), then the existing comments mentioning permissions may come into play - it depends on where the Excel document actually lives (somewhere the SQL Agent account can access it, somewhere it cannot, or a networked drive).
Good luck in not finding people on the sanctions lists.
I am attempting to use a Packaged Solution for my Access 2010 application that has its backend linked to SQL Server. At the moment, I'm using the .accdb file as the frontend, and I would like to distribute my application to some other Windows computers, but the Packaged Solution does not work. I had the package include Access Runtime, so their version of the frontend is running on Runtime and not full Access. However, once the application makes a request to the backend, the application does nothing, and I am not even prompted for the SQL password as per usual with the full version. I've read that including a .dsn file in the package can secure the SQL connection (see here), but going through the steps of other tutorials to create .dsn files hasn't led to any results. Would anyone know how to correctly generate the .dsn file, or if I've done something else wrong at this point?
(And yes, I understand using Access 2010 in the year 2019 is almost a joke at this point, but I'm doing this for testing purposes. I plan to completely remake the frontend in Angular in the future.)
One other unrelated note... would it be a better idea to have the frontend hosted as a .html file like through the "Publish to Access Services" process? I did read that Access Services was discontinued last year, so would that not be possible?
Edit: This is not a duplicate of "DSN Less Connection (MS Access to SQL2016)" because A) I want to utilize a DSN Connection, not DSN-less and B) I am not using connection strings in my code to hook up with SQL.
You should be able to just create a FILE DSN, link your tables, and then distribute the compiled accDE to each desktop.
However, which SQL ODBC driver did you use? If you used the "SQL Server" ODBC driver, then that is installed by default on each computer.
However, if you linked using Native Client 11 (or later), then that driver is NOT installed on each workstation by default. So, I HIGHLY recommend you create a FILE DSN (not a user or system DSN) and link the tables using that. (Access will create DSN-less links for you.)
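For illustration only (the server, database and login names here are placeholders), a FILE DSN is just a small text file along these lines:
[ODBC]
DRIVER=SQL Server
SERVER=MyServerName
DATABASE=MyDatabase
UID=MySqlLogin
Note that the password is not stored in the .dsn file itself - that is what the "save password" checkbox during linking is for.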
And you should NOT be seeing a logon prompt with your application. This suggests you forgot or missed the save password option.
So, I would re-link your tables, creating a new FILE DSN. And if you're using the Linked Table Manager, then make sure you check the "always prompt for new location" option to force creation of a NEW DSN. If you just refresh, then you DO NOT get a chance to click on the save password option during the linking process.
So, which ODBC driver are you using? Native Client 11 or later are better, but they are not installed by default on each workstation. However, CAUTION is required here, since the older "SQL Server" driver does NOT support the newer datetime2 formats. If you used these newer SQL column types, they will be returned as string data types in Access and create a mess of issues.
So, first, I would re-link using a FILE dsn.
Make sure you check the save password during the re-link.
You then compile your accDB into an accDE, and then distribute that. You don’t really need to use the package wizard, since once each workstation has the runtime installed, then a simple copy of the accDE to each person’s computer will thus work fine. There is NO special connection between your accDE and the package wizard. Once the runtime is installed, then any and all mdb, accDB, and your accDE can simply be clicked on to launch + run. So for testing, you can skip the package wizard, and just copy the accDE to the target machine, click on it, and see if it works.
Edit
The prompt and check box during this process look like this:
So you have to check that box to save the password. Note that you ONLY get this dialog WHEN you create a new FILE dsn.
I have written a VB.Net application that uses an SQL Express DB file containing a single table and a handful of stored procedures.
I have successfully built and exported the application to my VPS.
The problem comes in knowing what to do concerning the database file; there is a wealth of stuff online, but nothing that specifically suits my needs.
I plan to use LocalDB on the VPS, but since it is command-line only, it is hard to know whether the scripts that I have run have been successful after creating an instance, starting it, etc.
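(Not from the original post, but for anyone in the same spot: SqlLocalDB.exe can report back what it has done, which takes some of the guesswork out of the command line. Assuming SQL Server 2012 LocalDB, where the automatic instance is named v11.0:)
sqllocaldb versions          (lists the LocalDB versions installed)
sqllocaldb info              (lists the existing instances)
sqllocaldb start "v11.0"     (starts the automatic 2012 instance)
sqllocaldb info "v11.0"      (shows its state - Running/Stopped - the owner, and the instance pipe name)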
I want to keep installation requirements to an absolute minimum on my VPS machine and (in time) other end users' machines... hence using LocalDB and not SQL Express.
So, what do I have to do on the VPS to enable my application to connect to the database? This was simple when it was Access: supply the MDB file and run the AccessDatabaseEngine redistributable - job done.
The connection on my devt. machine runs as expected.
The connection string in my code is:
Const strSQLConnection As String = "Data Source= (localdb)\v11.0;Database=SoccerTrader;Trusted_Connection=True"
Can anyone help please? This is driving me around the bend... surely it can't be that difficult?
===========================
I have found the following in an MSDN blog which says:
Database as a File: LocalDB connection strings support AttachDbFileName property that allows attaching a database file during the connection process. This lets developers work directly with databases instead of the database server. Assuming a database file (*.MDF file with the corresponding *.LDF file) is stored at “C:\MyData\Database1.mdf” the developer can start working with it by simply using the following connection string: “Data Source=(localdb)\v11.0;Integrated Security=true;AttachDbFileName=C:\MyData\Database1.mdf”.
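Based on that, a minimal VB.Net check along these lines (using the blog's example path rather than my real one) should show whether the attach actually works:
Imports System.Data.SqlClient

Module AttachTest
    Sub Main()
        ' Path taken from the blog's example - substitute the real MDF location.
        Const connStr As String = "Data Source=(localdb)\v11.0;Integrated Security=true;AttachDbFileName=C:\MyData\Database1.mdf"
        Using cn As New SqlConnection(connStr)
            cn.Open() ' throws if LocalDB isn't installed/running or the file can't be attached
            Console.WriteLine("Connected to: " & cn.Database)
        End Using
    End Sub
End Module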
================ ADDED 12th June =====================
OK, this is really bugging me now... I have read around this till it is coming out of my ears and nothing specifically seems to target what I am trying to do. All the blogs I read refer to installing / running SQL Server and changing permissions etc.
As I have mentioned I am using a VPS and propose to use LocalDB on the VPS to access a simple/small database file for a VB.Net application I am writing.
This is the story so far.
1) I have built a working prototype on my development PC and connected using SQL Express to a database file SoccerTrader.mdf - no problem.
In the Visual Studio project properties I have added a prerequisite to the project that checks for SQL Server, and if it is missing, installs it...
2) I install the project on the VPS and as expected SQL Server 2012 LocalDB is installed .... see here..
3) I have copied the SoccerTrader.MDF and SoccerTrader.LDF files into "C:\BESTBETSoftware\SoccerBot" on the VPS
4) For practical reasons, given the problems I am having getting this to work, I have implemented an InputBox so I can specify the connection string when the application runs... the connection strings I have used give the following:
http://i.stack.imgur.com/i2tro.png
I have not changed any file permissions on the development PC and the database state is NOT read only....
So, the question is where do I go from here...? What have I missed.. why is it not working..?
I have managed to sort the problem.
Seemingly, the connection string I was using was OK. It was my error handling that wasn't 'clean' enough. It transpired that the connection was being made on my VPS, but when the application attempted to update the table, the directory I had created and put the MDF file into would not permit write access.
I moved the MDF into the C:\Users\Public\Documents folder and all works as it should.
You have to specify the full path of the DB file, including the folder name/IP address.
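For example (using the folder from the accepted fix above - adjust the path to wherever the MDF actually lives):
Data Source=(localdb)\v11.0;Integrated Security=true;AttachDbFileName=C:\Users\Public\Documents\SoccerTrader.mdf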
This is the biggest problem I've had with my project so far. Every time I try to publish it, I get hangups and crashes whenever the project tries to access a database file (which is fairly often).
I have two database objects, both created as part of my Visual Studio project under the main solution, as BUILDERDATA.mdf and CHARACTERS.mdf. When I access them during test builds, everything functions just fine. However, they always fail in the published project.
I get the feeling either they're not exporting correctly, or the act of publishing the project is breaking the connection strings, so I'm hoping someone can help me pinpoint the issue.
Under my project settings, I have a connection set up to each database. They're both Connection Strings, their scope is Application, and the value is this:
Data Source=(LocalDB)\MSSQLLocalDB;AttachDbFilename=|DataDirectory|\BUILDERDATA.mdf;Integrated Security=True;Connect Timeout=30
Data Source=(LocalDB)\MSSQLLocalDB;AttachDbFilename=|DataDirectory|\CHARACTERS.mdf;Integrated Security=True;Connect Timeout=30
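(Side note, in case it helps: since both strings rely on |DataDirectory|, I can at least check where that token points at runtime - under ClickOnce it is redirected to the deployment's data directory rather than the install folder. A rough VB.Net sketch:)
' Sketch: show where |DataDirectory| resolves at runtime. Under ClickOnce it
' normally points at the deployment's data directory, not the install folder.
Module DataDirectoryCheck
    Sub ShowDataDirectory()
        Dim dataDir As Object = AppDomain.CurrentDomain.GetData("DataDirectory")
        If dataDir Is Nothing Then
            Console.WriteLine("(not set - SqlClient falls back to the application base directory)")
        Else
            Console.WriteLine(dataDir.ToString())
        End If
    End Sub
End Module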
Under Project Properties -> Publish -> Application Files, both databases and their log files are set to Data File (Auto) for Publish Status, Required for Download Group, and Include for Hash.
Under my Prerequisites, I have SQL Server 2012 Express LocalDB marked, as well as Microsoft .NET Framework 4. Part of my code uses .NET, so that's required anyway, and I included SQL Server Express so it can access the database files. However, 2012 is the newest version available for me to include, so I'm not sure if there's an issue with that version?
If there's anything else you guys would need to see, please let me know. I just want to get this figured out and fixed so I can have my friends start testing my program. :(
The goal: decom the old server where TFS/SQL was originally installed, and run TFS/SQL on new server. To add insult to injury, the old server I will reference here is SBS 2011 - if you know anything about that environment, you may understand why it is slated for decom.
I performed a restoration-based move last week. While it was successful with respect to functionality, I now have what I would describe as a dual data + application tier implementation. Today, I have TFS/SQL installed on two servers, both with TFS Version: 11.0.60610.1 (Tfs2012.Update3) and SQL Version: 2008R2. Both servers in the same domain.
My curiosity lies in the behavior of the Tfs_Configuration db. I restored both the Tfs_ db as well as the Tfs_Configuration db (via .BAK files) to the new SQL server, but I still see activity happening on the old server here: "c:\Program Files\Microsoft Team Foundation Server 11.0\Application Tier\Web Services\_tfs_data", and no updated/recent files in the same location on the new server, suggesting the Tfs_Configuration db really did not move/restore properly.
In the TFS console on the new server, I see the URLs in the “Application Tier Summary” section referring to the old server, but the Machine Name is the new server. I also see in the "Application Tiers" section, a reference to the old server Machine name. Yet, in verifying change logs, the Tfs_ db is now resident on the new server and accepting Visual Studio commits/check-ins. There is a Tfs_Configuration db on the new server, but it seems to be the default install copy and not my restored db.
In the various guides I have read, I do understand that the web.config file holds the pointer to the configuration database, i.e. "appSettings … add key="applicationDatabase" value="Data Source=instance name;Initial Catalog=Tfs_Configuration;Integrated Security=True". I was expecting to change that entry once it migrated to the new server, but it is still parked on the old server.
I have turned off the TFS and SQL services on the old server as a trial to see if the new installation would pick up the load, but as you might expect, TFS then goes into an unavailable state to the users.
The primary questions are:
Why did the Tfs_Configuration db not restore to the new server in the same fashion as the Tfs_ db?
How can I move that Tfs_Configuration db and turn off that old SBS 2011 unit?
Any tips or tricks are welcomed and appreciated.
Thank you.
What you did is a non-trivial operation (see Move Team Foundation Server from One Hardware Configuration to Another).
Typical missing steps (see the command sketch after this list):
Changing URLs
Cleaning caches
Changing server ID if you want to keep both instances live
Changing accounts in case you used local user accounts
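Not part of the original checklist, but as a hedged sketch of the kind of commands involved (TfsConfig.exe from the TFS 2012 tools folder, run from an elevated prompt on the new application tier; NEWSQLSERVER is a placeholder, and exact switches can vary by update level):
TfsConfig RemapDBs /DatabaseName:NEWSQLSERVER;Tfs_Configuration /SQLInstances:NEWSQLSERVER
TfsConfig Accounts /ResetOwner /SQLInstance:NEWSQLSERVER /DatabaseName:Tfs_Configuration
TfsConfig ChangeServerID /SQLInstance:NEWSQLSERVER /DatabaseName:Tfs_Configuration
The public-facing URLs themselves are changed from the TFS Administration Console ("Change URLs").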
I have completed this process successfully. It required a triangulation between the three servers. The essential aspects involved solid SQL backups, coupled with the settings.xml from healthy TFS Console backups.
It was certainly a process that took a lot of planning and anticipating snags.
All-in-all, it was a great exercise in watching the data flow and understanding the roles of the configuration and collection DBs. Thank you for responding to my inquiry.