Experts,
Have a question on external table refresh. We have an external table pointing to S3 folders and refresh it manually using an ALTER command. We have a process in NiFi that places a file in the S3 folder, and another process in Talend that executes the ALTER command to refresh the table.
The question is this: within a given time frame NiFi has written 3 files and is in the middle of writing the fourth. When I list the S3 folder it shows 4 file names, but the 4th file is still being written by the NiFi process. If I execute the ALTER ... REFRESH command at that moment, will 3 files be refreshed into the external table, or 4? The 4th file is still not complete.
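For context, the refresh we run from Talend is essentially this Snowflake command; in the sketch below, my_ext_table and the sub-path are hypothetical placeholders:
ALTER EXTERNAL TABLE my_ext_table REFRESH;
-- or limit the refresh to one folder:
ALTER EXTERNAL TABLE my_ext_table REFRESH 'subfolder/';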
Please share your thoughts; meanwhile we are planning to do a PoC.
Thanks,
Gopi
Related
I have created CSV files on a UNIX server using Informatica, which resides on the same box. I want to load those CSV files directly from the UNIX box into Snowflake using SnowSQL; can someone help me with how to do that?
Log into SnowSQL:
https://docs.snowflake.com/en/user-guide/getting-started-tutorial-log-in.html
Create a Database, Table and Virtual Warehouse, if not done so already:
https://docs.snowflake.com/en/user-guide/getting-started-tutorial-create-objects.html
Stage the CSV files, using PUT:
https://docs.snowflake.com/en/user-guide/getting-started-tutorial-stage-data-files.html
Copy the files into the target table using COPY INTO:
https://docs.snowflake.com/en/user-guide/getting-started-tutorial-copy-into.html
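Putting steps 3 and 4 together, here is a minimal sketch run from SnowSQL on the UNIX box. The table name, column list, file path and CSV options are assumptions and will need to match your files; it also assumes the database and schema from step 2 are already in use.
-- target table (names and types are assumptions)
CREATE TABLE IF NOT EXISTS my_table (name STRING, amount NUMBER, load_date DATE);
-- stage the CSV files from the UNIX box into the table stage
PUT file:///data/informatica/output/*.csv @%my_table;
-- load the staged files into the table
COPY INTO my_table
  FROM @%my_table
  FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
  ON_ERROR = 'CONTINUE';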
For a joint project with a 3rd party I need to build a new database in SQL Server (2014 SP2) which will include only the main objects from our current one and no data, apart from some basic settings I will INSERT INTO later.
I maintain our DDL scripts on GitHub, so on a daily basis every new script is saved there as well as on our local SQL Server with a .sql extension, and is then picked up and processed by my stored procedure mySP across our test DBs. Once everyone is happy and things have been tested, I send the script to our DBAs to apply in live.
In order to address this new request I was thinking of implementing the same approach:
To have all the relevant scripts saved on the new SQL Server.
Run stored procedure mySP, which will pick them up and process them one by one: it will execute the scripts, record the errors (I already know there will be some, e.g. foreign keys on tables which may not exist yet), and send me an email with the list of errors, after which I will manually fix them (e.g. re-build those foreign keys). A sketch of the execute-and-log step follows this list.
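For illustration only, a minimal sketch of that execute-and-log step; dbo.ScriptErrors is an assumed logging table and the variables are hypothetical (this is not the actual mySP code, which is linked further down):
DECLARE @ScriptName sysname       = N'2018-04-27_add_myColumn.sql';  -- hypothetical file name
DECLARE @ScriptText nvarchar(max) = N'ALTER TABLE myTable ADD myColumn BIT DEFAULT 0 NOT NULL;';
BEGIN TRY
    EXEC (@ScriptText);   -- run the DDL script
END TRY
BEGIN CATCH
    -- log the failure so it can be reported by email and fixed manually later
    INSERT INTO dbo.ScriptErrors (ScriptName, ErrorNumber, ErrorMessage, LoggedAt)
    VALUES (@ScriptName, ERROR_NUMBER(), ERROR_MESSAGE(), SYSDATETIME());
END CATCH;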
My question is: is there any better approach, please?
Example of my current process:
Step 1 - save a script with a .sql extension on the SQL Server. An example of such a script is below:
/*********************************************************************/
-- Date |Developer's Name | Version | Description
--27/04/2018| Mr Smith | 12 | Added new column myColumn
-----------------------------------------------------------------------
ALTER TABLE myTable
ADD myColumn BIT DEFAULT 0 NOT NULL;
Step 2 - my stored procedure is scheduled to run every hour; it picks up all the .sql files from the predefined location, processes them and sends me the email. The script is here: http://www.sqlfiddle.com/#!18/e3309/1
Any advice will be much appreciated.
I have created my first SSIS package. I have a Foreach Loop container that loops through a folder. The data flow has a derived column task and 4 lookups. When I run the package in Visual Studio (2013) it starts with the first file and arrives at the destination, but it does not insert the data; it just hangs with the text "Validating OLE DB Destination 1" in the status bar.
The files are located on my hard drive and the destination database is on the local network. I'm using a sysadmin account to be sure that the user has sufficient access rights.
I'm also unable to query the destination database table from SSMS.
Does anyone have an idea what the problem could be and how I could solve it?
Sorry for the unspecific question. In my control flow in SSIS I have a Foreach Loop container that contains a data flow task to import all the data from every file the container loops over. Connected to the task are two move-file tasks, dependent on success or failure of the import task. The strange part is that one file is moved, yet no data is inserted into the database, and the Foreach Loop hangs after the first iteration (the folder contains 150 files). While this SSIS process hangs, I'm unable to query the database with SELECT *; there is no error, it just says "executing query".
The latter. It finishes the first round (moves the file to my success folder) and then halts with the "still working" icon on the data import task. But the data is not inserted even though the file is moved. Will the transaction only commit once it has finished processing all the files?
Edit: Image of the control flow and the data flow
The answer was found in the "Table lock" option. Since both destinations write to the same table, I guess the first destination locks it; when the second destination hits the same table it is already locked, so it waits until it is unlocked, and that never happens because the first destination is not ready to commit yet.
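For anyone hitting a similar hang, a small diagnostic sketch (not part of the original post) that shows which session is blocked and which session is blocking it while the package appears stuck:
-- list blocked requests together with the blocking session
SELECT r.session_id,
       r.blocking_session_id,
       r.wait_type,
       r.wait_time,
       t.text AS running_sql
FROM sys.dm_exec_requests AS r
CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS t
WHERE r.blocking_session_id <> 0;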
Synopsis
I need to bridge the gap between a CSV file and an Access 2007 database linked to a SharePoint list. The end goal is to fully automate this process: Remedy > CSV > Access > SharePoint
Problem Details
Remedy ---> CSV
I am using a macro in BMC Remedy to run a report and export data to a CSV file. Each time the macro runs, it is set up to overwrite the previous version of the CSV file.
CSV --x Access
While I can import the CSV table (as a text file) into Access, the program won't let me update the database. Creating macros, relationships or queries is impossible since the CSV file is overwritten each time the Remedy macro runs. When I attempt to copy the CSV table to the linked database (using the export feature in Access) I get the following error:
You can't delete the table 'tablename'; it is participating in one or more relationships.
Access --> SharePoint
The Access database I want to update is linked to a SharePoint list so that any edits made (and saved) in Access update SharePoint.
Work-Around
I can copy & paste the data from the CSV to the Access database, but am wanting a better solution that doesn't require any maintenance.
I have tried creating a macro, but when I use RunCommand > SelectAll I get the following error:
The command or action 'SelectAll' isn't available now.
Is it possible to do this with a macro or do I need a VB script?
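One hedged sketch, in Access SQL, assuming the CSV is attached as a linked table (so it is re-read each time Remedy overwrites the file) rather than imported. RemedyCsv, SharePointTickets and the column names are all hypothetical. A saved append query like this can be run from a macro via the OpenQuery action, so VBA may not be required:
INSERT INTO SharePointTickets (TicketID, Summary, Status)
SELECT TicketID, Summary, Status
FROM RemedyCsv;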
I have some CSV files. I want to write a SQL Server script that reads the files at a certain interval and inserts the records into the SQL Server db if they are not already there, and skips a file if it has already been read by the scheduler on a previous run. Each CSV will contain one record only.
Like:
1.csv => John,2000,2012/12/12
2.csv => Tom,3000,2012/12/11
It would be great if someone could provide an example script.
Thanks!
If I were you I would create an SSIS package that uses the multi-file input. This input lets you pull data from every file in a directory.
Here are the basic steps for your SSIS package.
Check if there are any files in the "working" directory. If not, end the package.
Move every file from your "working" directory to a "staging" directory.
You do this so that if additional files appear in the "working" directory while the package is running, you won't lose them.
Read all of the files in the "staging" directory. Use a data flow with the multi file input.
Once the reading has been completed, move all of the files to a "backup" directory.
This of course assumes you want to keep them for some reason. You could just as easily delete them from the "staging" directory.
Once you have completed the package, schedule it with SQL Server Agent to run at whatever interval you are interested in.
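If a plain T-SQL route is preferred over SSIS, here is a minimal sketch of the per-file load the question asked for. The table names (dbo.Payments, dbo.ProcessedFiles) and the file path are hypothetical; the idea is simply to record each file name once so files that were already loaded are skipped on later runs:
IF NOT EXISTS (SELECT 1 FROM dbo.ProcessedFiles WHERE FileName = '1.csv')
BEGIN
    -- load the single record from the CSV file
    BULK INSERT dbo.Payments
    FROM 'C:\csv\incoming\1.csv'
    WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');

    -- remember that this file has been processed
    INSERT INTO dbo.ProcessedFiles (FileName, LoadedAt)
    VALUES ('1.csv', SYSDATETIME());
END;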