I created an SSIS package in BIDS that pulls data from a SharePoint list and inserts it into a SQL Server database using an OLE DB Destination. This SSIS package works perfectly, and data is being populated. I then needed to pull data from a different SharePoint list on the same site.
I am using the exact same credentials, same read/write access, and almost identical SSIS packages. However, on this new SSIS package, there are 0 records being loaded.
There are no errors or warnings in the execution results page. To ensure there is not some setting I missed, I copied the data flow task from the working SSIS package into the SSIS package that is not loading any records.
The data flow task that I just copied pulls information from its SharePoint List as expected, so I know there is not some setting that I am missing.
Any thoughts about why this is happening? Google hasn't been much help on this and I am stumped.
The execution results page says:
[SharePoint List Source] Information: Loaded 0 records from list 'LISTNAME' at 'SITEURL'. Elapsed time is 131ms
Thank you in advance.
Check the name of the view that you put in the SharePoint List Source (SLS) Component Properties. The SLS will only pull items that are visible in the specified view. For example, if you create a new calendar and add 10 items in the next month, they will not show up in the default "calendar" view. But if you change the view (in SharePoint or SSIS) to "All Events", you will see your 10 items show up in your browser and also as rows in the SSIS package.
Alternatively, use the OData Source component available in SSIS for SQL Server 2014/2016, which successfully loads the whole list and its data as well.
Can someone help me automate an SSIS package with the following criteria?
Pull data from a website URL. This website uploads monthly Excel sheets with data.
The data in the Excel sheets needs to be updated monthly in our SQL Server table.
We only need the latest data, so every month we can truncate & refill or do a delta.
Background info:
There is already a table in place in our DEV environment, and the necessary data is being populated monthly; however, the data is incorrect and inconsistent. Our company does not manage the DB, so we do not have control over this. That's why I need to make our own table within the DB so we can efficiently load the correct data.
So far these are the things that I have done to test it out:
Created the package with a Data Flow Task in VS 2019: Excel Source -> Data Conversion -> OLE DB Destination.
Package ran perfectly fine, with the exception of one column whose data type did not convert correctly (I will work on this issue).
These are the things I need to do to successfully execute this requirement:
Automate the execution of the SSIS package by creating a SQL maintenance job which will run monthly.
My questions:
How can I automate the process of pulling the data from the official source (Excel sheets) to our DEV environment on a monthly basis? Or do I have to do this manually?
How can I track the changes that were made in the table monthly? I know I have these options, but I'm not sure which one is better:
Configure CDC
Configure change tracking
DDL Triggers
Any help and best practices will be much appreciated.
How can I automate the process of pulling the data from the official source (Excel sheets) to our DEV environment on a monthly basis? Or do I have to do this manually?
Use a Script Task. It requires very little code:
// Download the monthly Excel file from its URL to a local path.
System.Net.WebClient wc = new System.Net.WebClient();
wc.DownloadFile("URL of file", "where you want the file to go");
This assumes you can get out of your dev environment.
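If you wrap that in an SSIS Script Task, a minimal sketch could look like the following. The package variables User::SourceUrl and User::LocalFilePath are assumptions for illustration (you would create them yourself), and the method goes inside the Script Task's generated ScriptMain class.

using System;
using System.Net;
using Microsoft.SqlServer.Dts.Runtime;

public void Main()
{
    try
    {
        // Read the download URL and target path from (hypothetical) package variables.
        string url = Dts.Variables["User::SourceUrl"].Value.ToString();
        string localPath = Dts.Variables["User::LocalFilePath"].Value.ToString();

        using (WebClient wc = new WebClient())
        {
            // Run as the account executing the package (e.g. the SQL Agent proxy).
            wc.Credentials = CredentialCache.DefaultNetworkCredentials;
            wc.DownloadFile(url, localPath);
        }

        Dts.TaskResult = (int)ScriptResults.Success;
    }
    catch (Exception ex)
    {
        // Surface the download failure to the package so the scheduled job reports it.
        Dts.Events.FireError(0, "Download monthly Excel file", ex.Message, string.Empty, 0);
        Dts.TaskResult = (int)ScriptResults.Failure;
    }
}

Once the file is on disk, your existing Excel Source -> Data Conversion -> OLE DB Destination data flow can pick it up, and the whole package can be scheduled from the monthly SQL job.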
I need to migrate data from SQL tables to SharePoint 2013 lists. The database size is approximately 80 GB, and I need to move the contents from the SQL tables to SharePoint lists with the same schema.
I just want to know if there is any tool available for this, or do we need to create an application (probably in .NET) to fetch data from SQL and write it back to the SharePoint lists?
Any suggestions?
Take a look at this article:
How to: Create an External Content Type Based on a SQL Server Table
In addition, if you will use SSIS to perform the export, see the SharePoint List Source and Destination adapters in the Microsoft SQL Server Community Samples: Integration Services project on CodePlex.
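If you end up writing your own .NET application instead, a rough sketch using ADO.NET plus the SharePoint 2013 client object model (CSOM) might look like the example below. The site URL, list title, connection string, table, and column names are placeholders, and with ~80 GB of data you would want to batch the round trips (as shown) and keep an eye on list throttling limits.

using System.Data.SqlClient;
using Microsoft.SharePoint.Client;

class SqlToSharePointCopy
{
    static void Main()
    {
        // Placeholder values - replace with your own site, list, and connection string.
        string siteUrl = "http://yoursharepointsite";
        string listTitle = "YourList";
        string connStr = "Server=YourSqlServer;Database=YourDb;Integrated Security=SSPI;";

        using (ClientContext ctx = new ClientContext(siteUrl))
        using (SqlConnection conn = new SqlConnection(connStr))
        {
            ctx.Credentials = System.Net.CredentialCache.DefaultNetworkCredentials;
            List targetList = ctx.Web.Lists.GetByTitle(listTitle);

            conn.Open();
            using (SqlCommand cmd = new SqlCommand("SELECT Title, Amount FROM dbo.YourTable", conn))
            using (SqlDataReader reader = cmd.ExecuteReader())
            {
                int pending = 0;
                while (reader.Read())
                {
                    // One SharePoint list item per source row, copying the columns across.
                    ListItem item = targetList.AddItem(new ListItemCreationInformation());
                    item["Title"] = reader["Title"];
                    item["Amount"] = reader["Amount"];
                    item.Update();

                    // Send the queued items to SharePoint in batches rather than row by row.
                    if (++pending % 100 == 0)
                        ctx.ExecuteQuery();
                }
                ctx.ExecuteQuery();
            }
        }
    }
}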
Using the current toolchain, you may consider this path:
Open Excel
Add a data connection to your SQL Server
Import all the data you want to have as a SharePoint list - you now have a sheet
Save as *.xlsx file
Go to SharePoint 365, select "New List"
Choose "Import from Excel" in the "New" dialog
A sheet preview appears where you can adjust column types (most likely needed)
Click "Next", give the list a name (you may need to remove the silly GUID attached to the generated name; apart from that, the name is the table name)
Click "Create"
Done
For a single table with roughly 800 records, it takes approximately 2 minutes.
I've created an SSIS package in SQL Server to export data from an ODBC source (a QuickBase app). I've also scheduled it through a SQL Server Agent job, and everything is working perfectly.
When there are changes in the source, the job fails.
I'm encountering a VS_NEEDSNEWMETADATA error when I modify a column of the source tables from which I'm exporting data.
After refreshing the metadata it works, but I want a permanent solution for this error.
How can I refresh the metadata automatically? Please help me.
Thanks.
There is no such functionality provided by SSIS itself. The metadata is a design time function and hence static. You'll have to refresh and revalidate each time the underlying metadata changes.
There are some custom components on CodePlex which do the automatic metadata refresh. But the last time I saw one, it was able to do that only for file-system storage.
Just curious: why do you need to change the metadata so frequently? This error comes when an existing column is modified. If the database design is correct, then I would assume such changes will be minimal.
This happens when:
-> You have a column name mismatch between source and destination (correct the name and it should work)
-> You added/deleted column(s) in the source or destination (refresh the metadata)
I fixed it by opening the Advanced Editor for the source and changing the setting "ValidateExternalMetadata" to False.
I have a SQL Server database with defined (1) tables, (2) corresponding views, and (3) stored procedures.
To my knowledge, there is one and only one SSIS package, which takes care of loading the data into the mentioned database.
At the end, there is a web interface for the end user that simply runs SELECT statements on top of one view in the database and displays the data.
The problem I am facing is that some values (specific logs) are outdated.
That is, the load job (the SSIS package) executes successfully, but the newest data is not displayed.
I am assuming that the problem is that either the log files are not placed where they should be, or the source for the logs has changed.
Therefore, I opened the production version of the SSIS package. But then I got these error messages:
The script is corrupted.
There were errors during task validation.
There should be no errors, because the package runs every day successfully.
I tried to find something like a 'Rebuild the Project' option, but could not find anything.
How can I test it in order to find the cause for outdated rows?
-> Where in the package can I see, for example, where a particular table is being filled?
You should open the SSIS package with the proper version of BIDS or Visual Studio; SSIS packages differ between SQL 2008 and SQL 2012. Which version of SQL was the package made with, and with which tool are you opening the package?
What's the correct way of exporting data from an Excel 2013 file to a SQL Server database? The data from the Excel file should be transferred into the SQL Server database when the Excel file is saved.
I know many answers for this are available, but my question is a bit different: every time the Excel data changes, or the user clicks the Save button, the data in the database also needs to be updated.
The easiest way to do this is with an SSIS package. SSIS (SQL Server Integration Services) is a component built into SQL Server that allows transformations between data formats.
You can create a package by right-clicking on your target database in SQL Server Management Studio and selecting Tasks > Import Data. In the wizard that comes up asking for a data source, choose "Microsoft Excel" from the top drop-down labelled Data Source, then follow the wizard through. You'll have the choice of importing the Excel data into a new table or mapping it into an existing table.
If you want to do this programmatically, you can save your package at the end of the wizard and then invoke it via code. But that's a different question.
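If you do go down the programmatic route, a minimal sketch (assuming the wizard output was saved as a .dtsx file and the Microsoft.SqlServer.ManagedDTS assembly is referenced; the path is a placeholder) could be:

using System;
using Microsoft.SqlServer.Dts.Runtime;

class RunImportPackage
{
    static void Main()
    {
        // Path to the package saved at the end of the Import Data wizard (placeholder).
        string packagePath = @"C:\Packages\ExcelToSql.dtsx";

        // Load the package from the file system and execute it in-process.
        Application app = new Application();
        Package package = app.LoadPackage(packagePath, null);

        DTSExecResult result = package.Execute();
        Console.WriteLine("Package finished with result: " + result);
    }
}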
What you want is not possible (as far as I know). You can use an SSIS package to migrate the Excel sheet into SQL Server, but it is impossible for SSIS to determine whether someone clicked Save or made changes to the Excel file. An SSIS package can be programmed to run on a schedule or on demand. You should investigate SSIS packages. They are not easy to learn, but they do what you need.
Maybe you would like to try a tool I have developed? It's an Excel Add-In that exports Excel data to SQL Server.
There is a feature to automatically export the data to SQL Server every time you press the Save button in Excel. If you need to update the database every time a cell value changes, you will need to add a few lines of VBA code that will push the data to SQL Server.
I currently give beta testers a free license, so if you are interested in testing it out, send me an email. :)
www.sqlpreads.com