Progress ProvideX database OPENQUERY SQL error - sql-server

I get this error when I do an OPENQUERY select against a linked server using a ProvideX ODBC driver. The database I am trying to connect to is built on Progress.
Cannot get the current row value of column "[MSDASQL].IVD_PRICE" from OLE DB provider "MSDASQL" for linked server "FCEU". Conversion failed because the data value overflowed the data type used by the provider.
Is there a workaround for this? I do not have access to the server I am trying to query.
Thanks!

The Progress database implements all datatypes as variable length. The "format" is just a suggestion for default display purposes. Progress applications routinely ignore that suggestion and "over-stuff" fields.
This gives most SQL clients hissy fits.
The cure depends on the version of Progress/OpenEdge.
All versions of Progress starting with version 9 support a utility called "dbtool" which will scan the db and adjust the "SQL-WIDTH" attribute for any fields that have been over-stuffed. You must run this on the server. (Or convince the DBA to do it.)
http://knowledgebase.progress.com/articles/Article/P24496
This is a very common, routine procedure for Progress databases.
You can also use the -checkwidth parameter to keep these things from happening -- but in your case the horse is already out of the barn and it might break the application. So it probably isn't useful to you right now.
Starting with OpenEdge 11.5 there are new features to automatically handle width violations when a SQL client connects:
http://knowledgebase.progress.com/articles/Article/How-to-enable-Authorized-Data-Truncation-in-a-JDBC-or-ODBC-connection
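If none of those server-side fixes are an option, one client-side mitigation that is sometimes tried (not guaranteed, and dependent on the remote SQL dialect accepting CAST) is to widen the offending column inside the OPENQUERY pass-through text so the remote engine, rather than MSDASQL, does the conversion. A rough sketch; FCEU and IVD_PRICE come from the question, while the remote table name and the VARCHAR width are assumptions:
SELECT *
FROM OPENQUERY(FCEU, 'SELECT CAST(IVD_PRICE AS VARCHAR(30)) AS IVD_PRICE FROM PUB.invoice_detail');
-- PUB.invoice_detail is a hypothetical table name; adjust the cast to a type wide enough for the over-stuffed data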

Related

SQL Server 2019 "Invalid object name sys.sysrowsets"

I get the following error, SQL Server 2019 "Invalid object name sys.sysrowsets", when I try to select data from the sys.sysrowsets table.
I am using SQL Server 2019.
Do you know how to solve this?
Thank you
This is a system base table. It still exists in SQL Server 2019:
Exists in every database. Contains a row for each partition rowset for an index or a heap.
It can be seen in the execution plan when selecting from sys.partitions but (as the docs explain)
To bind to a system base table, a user must connect to the instance of SQL Server by using the dedicated administrator connection (DAC). Trying to execute a SELECT query from a system base table without connecting by using DAC raises an error.
So if you have appropriate permissions then technically the software can run the SELECT by connecting to the DAC port.
It is not advisable to do this though. Having software that routinely connects via the DAC rather than using documented views is not a good idea and is explicitly warned against in the docs:
Important: Access to system base tables by using DAC is designed only for Microsoft personnel, and it is not a supported customer scenario.
What is this software doing for you? Why is it accessing base tables directly rather than using documented interfaces?
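To illustrate, a small sketch contrasting the base table with the documented catalog view (only the latter works over a normal connection):
-- over a normal (non-DAC) connection this raises the error from the question:
-- SELECT * FROM sys.sysrowsets;
-- the supported way to see the per-partition row information, no DAC required:
SELECT object_id, index_id, partition_number, rows
FROM sys.partitions;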

Optimizing OLE DB Destination for Fast load from Oracle to SQL Server for SSIS

I'm working with an SSIS package that imports from an Oracle table into a SQL Server table; in between I had to put a Data Conversion transformation.
The OLE DB Source retrieves the complete table, the rows are converted by the Data Conversion transformation, and then sent to the OLE DB Destination with the current setup.
Now, the table I'm trying to import has around 7.3 million records with 53 columns.
I need to know how I should set this up (or what changes to make to the current setup) to speed this process up as much as possible.
This package is going to run as a scheduled job under SQL Server Agent.
The last run inserted 78k records in 15 minutes; at that pace (roughly 87 rows per second) the full 7.3 million rows would take well over 20 hours, which is far too slow.
I believe I have to tune the "Rows per batch" and "Maximum insert commit size" settings, but looking around I haven't found guidance on what values should work, and the different values I've tried haven't made any noticeable difference.
UPDATE: After a bit more testing, the delay is in getting records from Oracle, not in inserting them into SQL Server. I need to check how I can improve this.
I think the main problem is not loading data into SQL Server; check the OLE DB provider you are using to extract data from Oracle.
There are several suggestions you can go with:
Use the Attunity connectors, which are the fastest ones available
Make sure you are not using the old Microsoft OLE DB Provider for Oracle (part of MDAC); use the Oracle Provider for OLE DB (part of ODAC) instead
If that doesn't work, try using an ODBC connection / ODBC Source to read data from Oracle
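If you want to confirm which OLE DB providers are actually registered on the server that will run the Agent job, this system procedure lists them; the old Microsoft provider shows up as MSDAORA and the Oracle-supplied one as OraOLEDB.Oracle:
-- lists the OLE DB providers registered on this instance's host
EXEC master.dbo.sp_enum_oledb_providers;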

How do you remove linked server metadata cache when there are no defined linked servers?

Microsoft Linked Server instances cache metadata for faster query resolution. This does not get properly refreshed when the Progress Database resides on a remote server.
Dropping the Linked Server instance does not remove the metadata cache.
Any new linked server continues to use the old cache from the previous linked server. This causes an error like this:
The OLE DB provider "MSDASQL" for linked server "ANY NAME" supplied inconsistent metadata. An extra column was supplied during execution that was not found at compile time.
This problem did not happen until someone made a schema change on the remote Progress DB, specifically dropping columns from the table that causes the error above.
I'm using SQL Server 2012 Standard Edition, so don't ask about lazy schema validation. ;)
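For reference, a sketch of the drop-and-recreate sequence that, as described above, still comes back with the stale cached metadata (the linked server name and DSN here are hypothetical):
EXEC sp_dropserver @server = N'PROGRESS_LS', @droplogins = 'droplogins';
EXEC sp_addlinkedserver
    @server = N'PROGRESS_LS',
    @srvproduct = N'',
    @provider = N'MSDASQL',
    @datasrc = N'ProgressDSN';  -- hypothetical ODBC system DSN pointing at the Progress database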
Back up the data and drop the offending table in the Progress OpenEdge system. Create a new table and load the data.
Now the issue is resolved.
What's different? There are now no ID gaps in SYSPROGRESS.SYSCOLUMNS or ordinal issues in the Microsoft linked server instance's cached metadata.
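If it helps to verify, a check along these lines against the Progress side should now show contiguous ids for the rebuilt table (the table name is hypothetical, and the SYSCOLUMNS column names can vary by OpenEdge version):
SELECT TBL, COL, ID
FROM SYSPROGRESS.SYSCOLUMNS
WHERE TBL = 'offending_table'
ORDER BY ID;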

Getting Data from an Oracle database to SQL Server

I am a SQL Server database developer. We have a requirement to link our product with an existing application of a client. Our product has a SQL Server 2012 database, while the client's existing application uses Oracle 11g. Since the time for project completion is limited, we cannot migrate our application to Oracle.
The requirement is that we have to get the customer details from the Oracle database to do billing activities in our system.
So I went through a few links and found that a SQL Server linked server can be used to do this. I have successfully created a view which uses the Customer table from the Oracle database via a linked server. It will solve our problem.
Now here are my questions:
Are there any better solutions for doing this other than a linked server?
Are there any drawbacks to using a linked server for this?
Thanks in advance
One drawback to consider is that the filtering on your view may take place at "your" end, rather than in Oracle. For example, if you have a linked server (using, say, an OPENQUERY statement) and a view based on that, and you do this:
select id from myView where id = 4711
expecting the result to come back very quickly (assuming id is indexed, etc.), you may be in for a shock, as what will actually happen is:
the entire contents of the Oracle table are passed to SQL Server
SQL Server then filters this data, i.e. the filtering cannot be "pushed down" into the view objects (as they are remote).
N.B.: I know there are two ways to query a linked server (OPENQUERY and the other one, whose details I forget), so this may not always apply, but you should be aware of the potential performance hit.
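To make the difference concrete, here is a sketch (the linked server name ORCL and the remote customers table are hypothetical): the first form filters locally after pulling the remote data, the second embeds the filter in the pass-through text so Oracle returns only the matching rows.
-- filtering happens on the SQL Server side; the remote table may be streamed across in full
SELECT id FROM dbo.myView WHERE id = 4711;
-- filtering happens in Oracle; only the matching rows come back
SELECT id FROM OPENQUERY(ORCL, 'SELECT id FROM customers WHERE id = 4711');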

Migrate Access 2007 Database Application to SQL Server 2005 using SSMA - Issues

I have managed to get SQL Server 2005 Express up and running on my computer OK in order to do some testing before trying this in the "real world".
I have a fairly large MS Access 2007 database application I need to migrate to SQL Server, retaining the "Front End" as the user interface. (The app is already a "split" database with a Front End and Back End....)
I have done some initial testing on using SSMA to migrate my Access database to SQL Server Express.
Clearly I don't understand some things, and I thought I'd see if anyone has any ideas.
Conceptually I thought that what needed to happen was that the Back End of the database that resides on the server needed to be migrated to SQL Server, and then the Front End re-linked to the (now linked to SQL) tables in the Back End.
When I do this using SSMA I end up with renamed tables in the Back End Access file that look something like "SSMA$myTableNameHere$local". I also get the original table names underneath showing as ODBC linked tables.
So far so good.
BUT... when I go to re-establish the linked tables from the FRONT END (the user interface), all I can see is the "SSMA$myTableNameHere$local" names, NOT the original table names (now linked via ODBC).
I can link to the "SSMA$..." tables, but it would mean changing the names of every table in every query, on every form, and in all code on the Front End! Not something I really want to do.
SO....
I thought I'd try to migrate the FRONT END and see what happens.
What I ended up with is a situation where, basically, it works (there are some serious errors and issues that I haven't even looked at yet... like missing data etc.!!!!) and I still get the "SSMA$myTableNameHere$local" tables and the ODBC linked tables with the original names.
I'm trying to understand... Does this mean that we would do the migration on the Front End and then just copy the same file to each user's computer?
Another subject I'm a little confused about is that I can't link via ODBC to SQL Server Express on the local machine (i.e. my computer), so I can't test migrating the Back End and then linking to the tables via the Front End as I have in the past in more of a client/server situation.
Assuming that SSMA replaces the tables in your back end with links to the SQL Server, all you need to do is delete the original table links in your front end and import the newly-created table links from the back end. You can then discard the back end, since it's not used for anything at all any longer.
I transferred all my tables one by one to SQL Server 2005 from the Access DB back-end using ODBC. Instructions:
Open the Access DB (back-end)
Right-click on the table you need to transfer
Scroll down the drop-down box and select ODBC Databases
The Select Data Source dialog box opens; click the "New" button
The Create New Data Source dialog box opens
Scroll to the bottom and select SQL Server, click Next
Give your Data Source a name, click Next, click Finish
The Create a New Data Source to SQL Server dialog opens
Give it a description or leave it empty, and type the name of your SQL Server (you named it when you installed SQL Server on your machine)
Click Next, click Next
Check the "Change the default database to" check box
Select the DB you want your data transferred to
Click Next, click Finish
NOTE: You need to create a new (empty) DB on SQL Server before doing all this
Now: right-click any table, select Export, select ODBC from the drop-down list, select the Data Source you created from the Data Sources window, and click OK
Use SQL Server with SQL Server Management Studio Express.
All dates must have an input mask; all Text and Memo fields must have Allow Zero Length = Yes.
After all this, disconnect all links from the Access back-end and establish links from SQL Server. RENAME all newly linked tables to the old names. Keep using the front-end user interface until you build something new.
Forgive my lack of knowledge of Acronym Soup, but I assume SSMA is the SQL Server 2005 "import data wizard" or the wizard in Access to send the data to SQL Server. It appears that you sent the data to SQL Server from Access - something you don't want to do. You want to use the DTS in SQL Server (now called SSIS or something?) to import the data into SQL Server. Then you'll have your tables in SQL Server. Then, simply create your DSN entry for the SQL Server and re-link your tables. All should be well.
Overall, the general rule is to import Access tables using SQL Server instead of using Access to send the data to SQL Server.
I'd bite the bullet and rename the tables on the SQL Server side back to the friendly names that you had in the original database. You'll probably have fewer problems, especially if you have any embedded code on the MS Access side.
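If the migrated tables really did end up with SSMA-style names on the SQL Server side, the rename itself is a one-liner per table (object names here are hypothetical):
-- renames the table only; references in views, queries and code are not updated automatically
EXEC sp_rename 'dbo.[SSMA$Customers$local]', 'Customers';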
As far as how you will deploy the MS Access side now, it should pretty much be: create the ODBC link on each user's workstation and copy the MS Access file to their desktop (although you might want to make an MDE, or the 2007 equivalent, to prevent them from accidentally breaking it).
Frankly, now that you have migrated, you need to look at the design of your tables. It is my experience that the wizards for Access migration do a poor job of selecting the correct datatype. For instance, if you had a Memo field you might easily get away with a varchar field instead, but the last wizard I used (an earlier version) always converted them to text fields. Now would also be the time to consider some fixes, such as making date fields datetime instead of character-based if you have had that mistake in the past.
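As a rough sketch of that kind of cleanup (table and column names are hypothetical, and the conversion style should match however the dates were actually stored as text):
-- add a real datetime column, copy over the convertible values, then swap the columns
ALTER TABLE dbo.Orders ADD OrderDateNew datetime NULL;
GO
UPDATE dbo.Orders
SET OrderDateNew = CONVERT(datetime, OrderDate, 112)  -- 112 = yyyymmdd; adjust to the stored format
WHERE ISDATE(OrderDate) = 1;
GO
-- after checking the results, drop the old column and rename OrderDateNew with sp_rename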
I would never consider using a wizard again to do data migration myself, having experienced how badly they can do it.
You will also find that just converting the data to SQL Server is often not enough to really get any performance benefit. You will need to test all the queries and consider converting the slow ones to stored procedures instead. Eliminating the translation from Jet SQL to T-SQL can bring performance improvements, and there are many features of T-SQL that can improve performance that have no Access equivalents. Access is not big on performance tuning, but to get the benefit of performance tuning with a SQL Server back end, you need SQL Server specific queries written. Indexing also needs to be considered if the Access tables were not indexed properly.
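For example, putting back an index that the migration did not carry over might look like this (table and column names are hypothetical):
CREATE NONCLUSTERED INDEX IX_Orders_CustomerID ON dbo.Orders (CustomerID);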
Using SSMA is different from using ODBC directly. If you have an application that is fully Access (back end and front end), you can easily manipulate objects (bound forms, DAO, etc.) without problems. When you need to migrate the database to SQL Server, you can link the tables to SQL Server yourself directly over ODBC, or use SSMA. The main problem is how to preserve the bound forms, queries and code on the client side.
If you use ODBC directly, you must relink all the objects yourself and change code; but if you use SSMA you have to do nothing and can continue working as you did before. The problem with SSMA is how to deploy the front end to the clients if you developed the client side somewhere else, against another SQL Server.
