Update Access Database Linked to SharePoint Using CSV File

Synopsis
I need to bridge the gap between a CSV file and an Access 2007 database linked to a SharePoint list. The end goal is to fully automate this process: Remedy > CSV > Access > SharePoint.
Problem Details
Remedy --> CSV
I am using a macro in BMC Remedy to run a report and export data to a CSV file. Each time the macro runs, it is set up to overwrite the previous version of the CSV file.
CSV --x Access
While I can import the CSV table (as a text file) into Access, the program won't let me update the database. Creating macros, relationships, or queries is impossible, since the CSV file is overwritten each time the Remedy macro runs. When I attempt to copy the CSV table to the linked database (using the export feature in Access), I get the following error:
You can't delete the table 'tablename'; it is participating in one or more relationships.
Access --> SharePoint
The Access database I want to update is linked to a SharePoint list, so that any edits made (and saved) in Access update SharePoint.
Work-Around
I can copy and paste the data from the CSV into the Access database, but I want a better solution that doesn't require any maintenance.
I have tried creating a macro, but when I use RunCommand > SelectAll I get the following error:
The command or action 'SelectAll' isn't available now.
Is it possible to do this with a macro or do I need a VB script?
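A sketch of one possible approach, with hypothetical table and column names: link the CSV file into Access as a table (rather than importing it), so that Remedy can overwrite the file freely, then refresh the SharePoint-linked table with a delete query followed by an append query. Access runs one SQL statement per query, so these would be two saved queries that a macro executes in order:

-- tblRemedyCsv: the CSV file linked into Access as a table (hypothetical name).
-- tblSharePoint: the table linked to the SharePoint list (hypothetical name).

-- Saved query 1: clear the current rows.
DELETE * FROM tblSharePoint;

-- Saved query 2: append the fresh Remedy export.
INSERT INTO tblSharePoint (TicketID, Status, AssignedTo)
SELECT TicketID, Status, AssignedTo
FROM tblRemedyCsv;

A macro with two OpenQuery actions (or RunSQL calls in VBA) could then replace the manual copy and paste.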

Related

How to generate Insert statement from PGAdmin4 Tool?

We are writing a new application, and while testing we will need a bunch of dummy data. I've added that data by using MS Access to dump Excel files into the relevant tables in the Postgres database.
What should I do now to generate Insert statements from the pgAdmin4 tool, similar to the way SQL Server Management Studio lets us generate Insert statements for SQL Server? I don't see any options for this, and I can't use the closest alternative, which is to export and import the data via CSV.
I understand that you cannot import the CSV file into the actual DB, as this needs to be done through ASP.NET Core EF. However, you can create a test schema and import the CSV file into it. Once you have the data imported into the test schema, you can use it to generate SQL statements with the steps below:
Right-click on the target table and select "Backup".
Select a file path to store the backup. You can save the file name as data.backup.
Choose "Plain" as the Format.
Open the "Options" tab and check "Use Column Inserts".
Click the Backup button.
Once the file is generated, you can open it with Notepad++ or VSCode to get the SQL insert statements.
You can then use the generated statements and delete the test schema you created.
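For illustration, with "Use Column Inserts" enabled the plain-format backup contains one INSERT per row, along these lines (the schema, table, and column names here are made up):

-- Hypothetical excerpt of data.backup generated with "Use Column Inserts".
INSERT INTO test_schema.customers (id, name, email) VALUES (1, 'Alice', 'alice@example.com');
INSERT INTO test_schema.customers (id, name, email) VALUES (2, 'Bob', 'bob@example.com');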
Here is a resource that might help you load data from an Excel file into PostgreSQL, if you still need to take this path: Transfer Data from Excel to PostgreSQL

How to load data from UNIX to snowflake

I have created CSV files on a UNIX server using Informatica, which resides on it. I want to load those CSV files directly from the UNIX box into Snowflake using SnowSQL. Can someone help me with how to do that?
Log into SnowSQL:
https://docs.snowflake.com/en/user-guide/getting-started-tutorial-log-in.html
Create a Database, Table and Virtual Warehouse, if not done so already:
https://docs.snowflake.com/en/user-guide/getting-started-tutorial-create-objects.html
Stage the CSV files, using PUT:
https://docs.snowflake.com/en/user-guide/getting-started-tutorial-stage-data-files.html
Copy the files into the target table using COPY INTO:
https://docs.snowflake.com/en/user-guide/getting-started-tutorial-copy-into.html
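Putting those four steps together, a minimal SnowSQL session might look like this; the warehouse, database, table, column, and file path names are all placeholders:

-- Create the target objects (skip any that already exist).
CREATE WAREHOUSE IF NOT EXISTS demo_wh WITH WAREHOUSE_SIZE = 'XSMALL';
USE WAREHOUSE demo_wh;
CREATE DATABASE IF NOT EXISTS demo_db;
CREATE OR REPLACE TABLE demo_db.public.remedy_data (
    ticket_id STRING,
    status    STRING,
    opened_at TIMESTAMP
);

-- Stage the local CSV files in the table's internal stage
-- (PUT runs from a client such as SnowSQL, not from the web UI).
PUT file:///data/informatica/output/*.csv @demo_db.public.%remedy_data;

-- Load the staged files into the table.
COPY INTO demo_db.public.remedy_data
  FROM @demo_db.public.%remedy_data
  FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1 FIELD_OPTIONALLY_ENCLOSED_BY = '"');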

SQL Compare Command-line does not synchronize database

In order to automate database synchronization using the SQL Compare command line, I have created a project to compare and deploy from my local database to a database on another server. Then I created a bat file; here it is:
SET curdir=%~dp0
SET sqlcompare="C:\Program Files (x86)\Red Gate\SQL Compare 13\sqlcompare.exe"
%sqlcompare% /project:"%curdir%IcLoyalty.scp" /sync /include:Identical
The result of the command is:
Registering data sources
Creating mappings
Comparing
Summarizing Project Selections
Retrieving migration scripts
Checking for identical databases
Creating SQL
Deploying changes (from DB1 to DB2)
When I check the destination database, the changes are not applied. Note that the project works correctly when I open it in the SQL Compare application.
What have I missed in the bat file?
It's possible that you had your selection (the checkboxes on the comparison results) deselected when you saved the project. Try selecting them all and resaving the project.
This is from the documentation page.
"If you want to include or exclude objects from an existing project, you must modify your selection using the graphical user interface."

Spoon - read SQL code from txt file and execute on DB

I'm learning to develop ETL using Pentaho Spoon; I'm still pretty new to it.
Instead of storing SQL operations inside the transformation file, I'd like to have them in their own .sql files. That makes it easier to track changes in Subversion, and if needed I can just open the .sql file in a DB manager and execute it directly.
How could I do that? I suppose I could use some component to read a txt file into a variable, and another component to take that variable and execute it against the DB.
What's the simplest way to achieve that?
In the standard Table input step, you can define the query to be a parameter, ${my_query}, and this parameter has to be defined (without the ${...} decoration) in the transformation properties: right-click anywhere, select Properties from the popup menu, then open the Parameters tab.
Each time you run the transformation, you'll be presented with the list of parameters, among them my_query, which you can overwrite.
To automate it, follow the example that ships with the installation zip. In the same directory as your spoon.bat/spoon.sh there is a folder named sample, in which you will find a job to read_all_files or read all_tables. Basically, this job lists the files in a directory and, for each one, puts its contents in a variable and uses it as a parameter to run the transformation. Much easier to do than to explain.
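For illustration only (the file name and query are invented), the Table input step's SQL box would then contain nothing but the parameter reference, while the versioned .sql file holds the real statement:

-- SQL box of the Table input step:
${my_query}

-- Contents of a versioned file such as queries/load_customers.sql,
-- whose text the job reads and passes in as the my_query parameter:
SELECT id, name, email
FROM customers;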

How to execute folder with SQL Server 2008 scripts

I have a folder with .sql files, one file per query. I want to execute all the queries/SQL files and save the results as CSV.
Is there a way to do that in an automated fashion without using the Windows CLI (disabled in my environment)? I do have SQL Server Management Studio.
I would approach this task using SSIS, provided you have Business Intelligence Development Studio (BIDS) installed.
First, create a 'Foreach Loop Container' pointed at the folder with the SQL files, then use a variable to retrieve each file name.
Next, create a flat file connection and set the 'Connection String' property to the variable that contains the file location.
Next, using the 'Execute SQL Task' component, set the 'SQLSourceType' to 'File Connection' and the 'FileConnection' to the one created in the previous step.
Finally, depending on how the data is returned, you have a couple of options. If the result set is small (only a row or a single column), you can save the results to a variable, and in a 'Dataflow' task create a 'Derived Column' component and export the contents of that variable to a CSV file. If the dataset is larger, you could dump the results to a temp table and then, using an 'OLE DB Source' and 'OLE DB Destination', push the full result set straight into a CSV.
Hopefully this isn't too convoluted a solution. This approach has the advantage that it can be run from either a remote machine or from the server itself, plus you can automate its execution with a SQL Agent Job.
Create a VB.NET console application.
Generate a list of files that end in .sql from the folder in question.
Load the contents of each file into individual SQL commands.
Execute the SQL command for each, storing the results in DataSets.
For each table in each DataSet, create a new .csv file.
For each .csv file, iterate over each cell in the DataTable and apply proper escaping for .csv files.
Use 'for' in combination with either the sqlcmd or bcp command for each file in the script folder.
