This is probably going to be an underspecified question, as I'm not looking for a specific fix:
I want to run a machine learning algorithm on some data in a SQL Server database. I'd like to use R to do the calculations -- which would involve using R to connect to the database, process the data, and write a table of results back to the database.
Is this possible? My guess is yes. Shouldn't be a problem using a client...
however, would it be possible to set this up on a linux box as a cron job?
Yes to all!
Your choices for scripting are either Rscript or littler as discussed in this previous post.
Having struggled with connecting to MSSQL databases from Linux, my recommendation is to use RJDBC for database connections to MSSQL. I used RODBC to connect from Windows but I was never able to get it working properly in Linux. To get RJDBC working you will need to have Java installed properly on your Linux box and may need to change some environment variables (seems I always have SOMETHING mis-configured with rJava). You will also need to download and install the JDBC drivers for Linux which you can get directly from Microsoft.
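For what it's worth, a minimal sketch of pointing rJava at a specific JVM - the JDK path here is an assumption, substitute the one you actually installed (and re-run sudo R CMD javareconf after changing the system Java):
Sys.setenv(JAVA_HOME = "/usr/lib/jvm/java-8-openjdk-amd64")  # hypothetical path
library(rJava)
.jinit()  # initializes the JVM; an error here usually means JAVA_HOME is wrong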
Once you get RJDBC installed and the drivers installed, the code for pulling data from the database will look something like the following template:
require(RJDBC)

# register the Microsoft JDBC driver, pointing at the jar downloaded from Microsoft
drv <- JDBC("com.microsoft.sqlserver.jdbc.SQLServerDriver",
            "/etc/sqljdbc_2.0/sqljdbc4.jar")

# open the connection (server name and credentials here are placeholders)
conn <- dbConnect(drv, "jdbc:sqlserver://mySqlServer", "userId", "Password")

sqlText <- paste("
    SELECT *
    FROM SomeTable
    ;")

# pull the result set into a data frame
myData <- dbGetQuery(conn, sqlText)
You can write a table back to the database with something like
dbWriteTable(conn, "SomeTable", myData, overwrite=TRUE)  # table name first, then the data frame
When I do updates to my DB, I generally use dbWriteTable() to create a temporary table on the database server, then issue a dbSendUpdate() that appends the temp table to my main table, and a second dbSendUpdate() that drops the temporary table. You might find that pattern useful.
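A minimal sketch of that pattern, with made-up table names (adjust the column lists to your schema):
# stage the new rows in a temporary table on the server
dbWriteTable(conn, "tmp_myData", myData, overwrite=TRUE)
# append the staged rows into the main table
dbSendUpdate(conn, "INSERT INTO MainTable SELECT * FROM tmp_myData;")
# drop the staging table
dbSendUpdate(conn, "DROP TABLE tmp_myData;")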
The only "gotcha" I ran into was that I could never get a Windows domain/username to work in the connection sequence. I had to set up an individual SQL Server account (like sa).
You may just write a script containing R code and put this in the first line:
#!/usr/bin/env Rscript
change the file permissions to allow execution, and add it to your crontab just as you would a bash script.
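For example - the script path, schedule, and log file below are all made up:
#!/usr/bin/env Rscript
# /home/me/score_data.R -- hypothetical nightly job
library(RJDBC)
# ... dbConnect() to the server, run the model, dbWriteTable() the results back ...
Then make it executable and schedule it:
chmod +x /home/me/score_data.R
crontab -e    # and add: 0 2 * * * /home/me/score_data.R >> /home/me/score_data.log 2>&1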
Related
Is it possible to connect to a SQL Server via SAS without using ODBC?
If the answer is yes can you give a code example of how to do it?
OleDb. Those are your two choices. You could also use a web service to front the SQL Server db and use PROC HTTP or PROC SOAP (preference would be REST/JSON). For direct access you need the SAS/ACCESS engine for SQL Server, ODBC, or OleDB.
Yes, it is possible. You can do it either with a libname statement or in proc sql. I suggest using the INSERTBUFF option when uploading. Just make sure your username, password, and server are not quoted, just as in OleDb connections.
libname My_libname odbc noprompt=
    "DRIVER=SQL Server; server=&server.; Uid=&userName.; Pwd=&password.; DATABASE=&Database.;"
    INSERTBUFF=32767;
The second way is pass-through SQL in proc sql:
proc sql;
    connect to odbc as my_conn
        (noprompt = "DRIVER=SQL Server; server=&server.; Uid=&user.; Pwd=&password.; DATABASE=&database.;");
    create table Query_result as
    select * from connection to my_conn (
        select * from table_in_server_database
    );
quit;
For more, you can read the SAS support pages.
Whilst this technically uses ODBC/OLEDB, you don't need it licensed within SAS... write a VBScript (potentially receiving arguments passed from SAS) which connects to your data source using the system ODBC/OLEDB drivers, executes the query, and returns delimited records.
Then execute said VBScript via a pipe infile and read the records into SAS.
I do this regularly against multiple data sources (Informix, SQL Server, DB2, MSAccess/Excel).
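For illustration, the SAS side of that pattern might look like the following - the script path, delimiter, and input fields are all assumptions:
/* run the VBScript and read its delimited output as if it were a file */
filename vbsout pipe 'cscript //nologo "C:\scripts\get_data.vbs"';

data work.results;
  infile vbsout dlm='|' dsd truncover;
  input custid :$20. amount :8.;
run;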
I cannot comment on Chris J's answer but it is a cool idea. I would make some modifications (suggestions) to it. Skip VBScript since it is out of date and is not allowed in a lot of places. Use .NET Core/C# and Entity Framework. EF is a full ORM, easy to use, and will auto-magically discover the entire DB structure on almost every db out there. Stream the data back using a memory buffer.
Like Chris' answer but just updating it to more modern mechanisms. Will have to give it a try since it is a clever way around the issue.
I have a bunch of legacy access based databases that I've been using for years without issue - queries have been running between them for years using ODBC/DAO/ADO. Now suddenly in the last few days, I've started getting the "The database has been placed in a state by user...." error on a bunch of them.
I have tried to narrow the problem down, but it seems to be getting worse. I have tried making a local copy of the database file, opening it, and then, on the same machine, trying to create an ODBC connection to it, and I get the error. I have also tried running successive queries on the database and still get the same thing (with a copy of the file on my local machine, so there is only my single connection: connect to the database, run a query, close the connection, wait 2 minutes, then try to open a new connection - FAIL). So it is definitely not a multi-user limit problem or anything like that.
The issue is consistent across multiple platforms: directly in MS Access (2010 and 2013), with Excel (2010 and 2013) queries to the Access DB, and with Windows Forms VB.net applications trying to query the Access DB (through datasets, OLEDB, and ADO).
Until this week all of these applications were working as designed, and had been for years. I am the only dev working on this stuff, so I know that nothing in the programming has changed; it must be an external issue.
The back end databases reside on a shared server drive (server is running Windows Server 2008) - and we have had no other connection issues to the server or network; it is limited to connections to access database files.
Does anyone know if something has changed lately (in the last week or so) with the ODBC drivers? Maybe an MS update?
Thanks in advance!
It seems that you can fix this issue by buffering the Access binary. Use the Binary.Buffer function in a query that defines your Access database, then reference that query in order to use the binary in a query that pulls each table. Note: I also define parameters for my folder path and file names.
For example:
//myDbBinary
let
    // buffer the whole .mdb file into memory; the folder path and file name
    // come from the DataFolder_param and FileName_param parameters
    Source = Binary.Buffer(File.Contents(DataFolder_param & FileName_param))
in
    Source

// Table1 Query
let
    // CreateNavigationProperties belongs here, on Access.Database
    Source = Access.Database(myDbBinary, [CreateNavigationProperties=true]),
    _Table1 = Source{[Schema="",Item="Table1"]}[Data]
in
    _Table1
The source is this
I'm fairly new to SQL Server. I have done basic admin, backups, etc. I have also spent 2 years doing MySQL for a software company, offering support for their bespoke MySQL program. I'm mainly a tech guy (desktop, networking) but I'm getting my head round this DB stuff!
I have started with a company that runs SQL Server 2005 and needs some work done, and I am struggling with the syntax more than anything. The company has 4 SQL Servers running the same DBs (program-wise) for 4 different locations.
What I am trying to do is copy the updated cost price list from one server's table to the others, applying some multipliers. Basically: copy server1.parts to server2.parts, multiplied by the currency-conversion field and a markup (%).
That bit seems quite easy, except I cannot get the DBs to link. I enter the server name, which contains a hyphen, and the syntax is flagged as wrong; e.g. for uk-server1 I get "can't find uk". Also, I am unsure whether the 4-part address is correct: servername.dbname.schema.table?
Right, OK. Previously I was unable to link the two servers; I have now resolved this and the servers are linked. I was told the server name might need to be quoted with []. I tried this with no success. The problem seems to be the hyphen in the server name, uk-efacs: as soon as I type it (remembering it is now linked), the error is that it can't find server "efacs" and "uk" is wrong. It's not reading the full server name. Why?
Figured this out by trial and error: the server name just needs [] around it, i.e. [uk-efacs].db.table.field. This is now OK; I just need to work on my syntax, as the query still shows errors.
Try creating a Linked Server record on the server you're running this from. In Object Explorer (in SSMS) expand Server Objects, right-click Linked Servers and select New. Select SQL Server, type the name of your remote server, and then try your query again. I'm a bit puzzled, as the snippet you provided
update partmaster
set partmaster.fsunit = uk-efacs.efacsdb.partmaster.fsunit * uk-efacs.efacsdb.currency.currate * 1.32
Seems to parse just fine.
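That said, since the hyphenated server name needs bracket-quoting, a fuller version might look like the following - note the dbo schema and the join keys are assumptions about the efacs schema:
update pm
set pm.fsunit = rm.fsunit * c.currate * 1.32
from partmaster as pm
join [uk-efacs].efacsdb.dbo.partmaster as rm
  on rm.partno = pm.partno     -- assumed join key
join [uk-efacs].efacsdb.dbo.currency as c
  on c.curcode = rm.curcode    -- assumed join key
;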
I ultimately want to query an existing MS-Access database (say, contacts.mdb) from a Lazarus program I will write.
It appears that configuring a User DSN is the first step.
In the ODBC Data Source Administrator, I am stuck at adding a driver for MS-Access databases. What do I need to download for this?
OK, I'm starting over from scratch. Please bear with me.
I'm now trying to use the Lazarus example given at wiki.lazarus.freepascal.org/MS_Access.
Following the instruction given as "Goto your [Data sources (ODBC)] at the control panel administrative tools...", I run /usr/bin/ODBCConfig and get an error which says "Invalid window handle." Clicking OK closes it without doing anything.
Does this mean I've gotten everything so FUBAR that it's hopeless?
If your program is written in PHP, you can use php5-odbc to access any ODBC source, like MSSQL Server or Access databases.
This post http://phplens.com/phpeverywhere/node/view/9 illustrates all the necessary steps.
This might be the thing you need:
http://www.easysoft.com/products/data_access/odbc-access-driver/
I think it should be enough to put something like the following in ~/.odbc.ini:
[Contacts]
Description = The Contacts Database
Driver = /usr/lib/libmdbodbc.so
Database = /home/dkjmusic/data/contacts.mdb
Of course you need an MDB ODBC driver (e.g. libmdbodbc) to be installed.
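Once the DSN is defined, you can sanity-check it from a terminal with unixODBC's isql tool (assuming unixODBC is your driver manager):
isql -v Contacts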
We have a classic ASP application that simply works and we have been loathe to modify the code lest we invoke the wrath of some long-dead Greek gods.
We recently had the requirement to add a feature to an application. The feature implementation is really just a database operation and requires minimal change to the UI.
I changed the UI and made the minor modification to submit a new data value to the sproc call (sproc1).
In sproc1, which is called directly from ASP, we added a call to another sproc, sproc2, which happens to be located on another server.
Somehow, this does not work via our ASP app, but works in SQL Management Studio.
Here's the technical details:
SQL 2005 on both database servers.
Sql Login is authenticating from the ASP application to SQL 2005 Server 1.
Linked server from Server 1 to Server 2 is working.
When executing sproc1 from SQL Management Studio - works fine. Even when credentialed as the same user our code uses (the application sql login).
sproc2 works when called independently of sproc1 from SQL Management Studio.
VBScript (ASP) captures an error which is emitted in the XML back to the client. Error number is 0, error description is blank. Both from the ADODB.Connection object and from whatever Err.Number/Err.Description yields in VBScript from the ASP side.
So, with no error details and no way to reproduce it (it works through SQL Mgmt Studio) - does anyone know the issue?
Our current plan is to break down and dig into the code on the ASP side and make a completely separate call to Server 2.sproc2 directly from ASP rather than trying to piggy-back through sproc1.
Have you got SET NOCOUNT ON set in both stored procedures? I had a similar issue once, and whilst I can't remember exactly how I solved it at the moment, I know that it had something to do with that!
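For reference, that means something like this at the top of each procedure (the name and body here are placeholders):
create procedure dbo.my_proc
as
begin
    set nocount on;  -- suppresses the "n rows affected" messages that can confuse ADO
    select 1 as ok;  -- placeholder for the real work
end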
You could be suffering from the double-hop problem
The double-hop issue is when the ASP/X page tries to use resources that are located on a server that is different from the IIS server.
Windows NT Challenge/Response does not support double-hop impersonations (in that once passed to the IIS server, the same credentials cannot be passed to a back-end server for authentication).
You should verify the attempted second connection using SQL Profiler.
Note that with your manual testing you are not authenticating via IIS. It's only when you initiate the sql via the ASP/X page that this problem manifests.
More resources:
http://support.microsoft.com/kb/910449
http://support.microsoft.com/kb/891031
http://support.microsoft.com/kb/810572
I had a similar problem and I solved it by setting nocount on and removing print commands.
My first reaction is that this might not be an issue of calling cross-server, but one of calling a second proc from a first, and that this might be what's acting differently in the two different environments.
My first question is this: what happens if you remove the cross-server aspect from the equation? If you could set up a test system where your first proc calls your second proc, but the second proc is on the same server and/or in the same database, do you still get the same problem?
Along these same lines: In my experience, when the application and SSMS have gotten different results like that, it has often been an issue of the stored procedures' settings. It could be, as Luke says, NOCOUNT. I've had this sort of thing happen from extraneous PRINT statements in the code, although I seem to remember the PRINTed value becoming part of the error description (very counterintuitively).
If anything is returned in the Messages window when you run this in SSMS, find out where it is coming from and make it stop. I would have to look up the technical terms, but my recollection is that different querying environments have different sensitivities to "errors", and that a default connection via SSMS will not throw an error at certain times when an ADO connection from a scripting language will.
One final thought: in case it is an environment thing, try different settings on your ASP page's connection string. E.g., if you have an OLEDB connection, try ODBC. Try the native and non-native SQL Server drivers. Check out what connection string options your provider supports, and try any of them that seem like they might be worth trying.
Example code might help :) Are you trying to return two tables from the stored procedure? I don't think ADO 2.6 can handle multiple tables being returned.
I did consider that (double-hop), but what is the difference between a sproc-in-a-sproc call like I am referring to vs. a typical cross-server join via INNER JOIN? Both would be executed on Server1, using the Linked Server credentials, and authenticating to Server 2.
Can anyone confirm that calling a sproc cross-server is different than doing a join on data tables? And why?
If the Linked Server config is a sql account - is that considered a double-hop (since what you refer to is NTLM double-hops?)
In terms of whether multiple resultsets are coming back - no. Both Server1.Sproc1 and Server2.Sproc2 would be "ExecuteNonQuery()" in the .net world and return nothing (no resultsets and no return values).
Try checking the permissions to the database for the user specified in the connection string.
Use the same user name from the connection string to log in to the database in SQL Mgmt Studio.
Create a temporary table to write intermediate values and exceptions to, since that can be an effective way of debugging your application.
Can I just check: You made the addition of sproc2? Prior to that it was working fine for ages.
Could you not change where you call sproc2 from? Rather than calling it from inside sproc1, can you call it from the ASP? That way you control the authentication to SQL in the code, and don't have to rely on setting up any trusts or shared remote authentication on the servers.
How is your linked server set up? You generally have some options as to how it authenticates to the remote server, which include logging in as the currently logged in user or specifying a SQL login to always use. Have you tried setting it to always use a specific account? That should eliminate any possible permissions issues in calling the remote procedure...
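For example, to make the linked server always authenticate with one specific SQL login on the remote side (the login name and password below are placeholders):
EXEC sp_addlinkedsrvlogin
    @rmtsrvname  = N'Server2',              -- the linked server name
    @useself     = 'FALSE',                 -- don't pass through the caller's credentials
    @locallogin  = NULL,                    -- applies to all local logins
    @rmtuser     = N'appuser',              -- placeholder SQL login on Server2
    @rmtpassword = N'placeholder_password';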