How to load data from UNIX to Snowflake

I have created CSV files on a UNIX server using Informatica, which resides there. I want to load those CSV files directly from the UNIX box into Snowflake using SnowSQL. Can someone help me with how to do that?

Log into SnowSQL:
https://docs.snowflake.com/en/user-guide/getting-started-tutorial-log-in.html
Create a Database, Table, and Virtual Warehouse, if you have not already done so:
https://docs.snowflake.com/en/user-guide/getting-started-tutorial-create-objects.html
Stage the CSV files, using PUT:
https://docs.snowflake.com/en/user-guide/getting-started-tutorial-stage-data-files.html
Copy the files into the target table using COPY INTO:
https://docs.snowflake.com/en/user-guide/getting-started-tutorial-copy-into.html
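Putting the staging and loading steps together, a minimal SnowSQL sketch might look like the following; the database, warehouse, table, and file paths are placeholders, and the CSV layout (header row, quoted fields) is assumed:

-- run inside snowsql after connecting
USE DATABASE my_db;
USE WAREHOUSE my_wh;

-- upload the CSV files from the UNIX box to the table's internal stage
PUT file:///data/informatica/output/*.csv @%my_table AUTO_COMPRESS=TRUE;

-- load the staged files into the target table
COPY INTO my_table
  FROM @%my_table
  FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1 FIELD_OPTIONALLY_ENCLOSED_BY = '"')
  ON_ERROR = 'CONTINUE';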

Related

Importing data into SingleStore DB using CSV

I am new to MemSQL and trying to restore data into a MemSQL db using a .csv file, but it fails with the below error:
ERROR 1017 ER_FILE_NOT_FOUND: Can't find file: '\home\vagrant\filename.csv' (errno: 2)
The CSV data comes from another server.
I have MemSQL on a virtual machine.
I have copied the table dump CSV to the \home\vagrant\ location.
I am trying the below command to restore the data:
LOAD DATA INFILE '\home\vagrant\filename.csv' INTO TABLE "tableName" FIELDS TERMINATED BY '\t' LINES TERMINATED BY '\n';
Thanks in advance.
Just to clarify, is the filename.csv file located inside the virtual machine running MemSQL, or is it located on the same machine you are running the LOAD DATA from?
If the file is on the same machine you are running LOAD DATA from, you need to add the LOCAL keyword (LOAD DATA LOCAL INFILE ...).
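A minimal sketch of that LOCAL form, using the table and delimiters from the question and assuming the file sits at /home/vagrant/filename.csv on the machine the client runs from (note the forward slashes, since Linux paths use them):

LOAD DATA LOCAL INFILE '/home/vagrant/filename.csv'
INTO TABLE tableName
FIELDS TERMINATED BY '\t'
LINES TERMINATED BY '\n';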

How to export/generate a backup or script using SQL Server on Ubuntu?

I have SQL Server installed on Ubuntu 18.04 LTS and I am trying to generate a backup or script for the data of a specific table, for example the products table. I do not know how to do this because the server is a virtual machine with no GUI.
I cannot find information on the internet about how to generate a script for a specific table, since most of the documentation shows how to do it through a GUI, which I do not have. How do I solve this?
You can use the bcp command to import or extract data from your database in a shell script. Ref: https://learn.microsoft.com/en-us/sql/linux/sql-server-linux-migrate-bcp?view=sql-server-ver15
The link below talks about this solution and exporting data to a CSV file:
Export table from database to csv file
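As a rough sketch of a single-table bcp export to CSV, assuming a local instance, SQL authentication, and a database named MyDb (server name, credentials, and paths are placeholders):

bcp MyDb.dbo.products out /tmp/products.csv -c -t, -S localhost -U sa -P 'YourPassword'

The -c switch exports character data and -t sets the field terminator; for a filtered or column-specific export, bcp also accepts a query with the queryout option.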

SQL FILESTREAM PathName doesn't show in Windows File Explorer

I have a FILESTREAM sample database and I have added records to a table.
When I use file.PathName() in my C# sample project, the SqlFileStream class recognizes this address and retrieves my file, but the path does not show up in Windows File Explorer.
What is this address? Is it a fake address? Does this class look at the filegroup path to find the real address? If not, how does this class find the path?
\\ComputerName\SQL2016\v02-A60EC2F8-2B24-11DF-9CC3-AF2E56D89593\FileStreamTestDB\dbo\BLOB_Table\FileData\00953530-2F65-4AC9-81E9-0281EFB89592\VolumeHint-HarddiskVolume3
Data in a FILESTREAM column are stored inside of the database. You can see the internal files stored in the database by browsing the local file system FILESTREAM filegroup directory but that path is not exposed for remote access and shouldn't be used at all. You'll need to use SqlFileStream to get a handle for access to FILESTREAM data via the Win32 API.
If you want to access files stored in the database via Windows Explorer or any other application, consider using a FileTable instead. A FileTable leverages FILESTREAM internally but exposes the files stored in the table via a UNC path for non-transactional access. That allows files to be added, changed, or deleted via the share just like regular files, or with T-SQL INSERT/UPDATE/DELETE statements. In both cases, the changes are stored in the database FileTable and reflected in the FileTable directory share too.
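As a hedged illustration of the FileTable route, using the database name from the question's path but with made-up directory and table names, the setup might look roughly like this:

-- enable non-transactional FILESTREAM access at the database level
ALTER DATABASE FileStreamTestDB
SET FILESTREAM (NON_TRANSACTED_ACCESS = FULL, DIRECTORY_NAME = N'FileStreamTestDB');
GO

-- create a FileTable; its files appear under the database's FILESTREAM share
CREATE TABLE dbo.Documents AS FILETABLE
WITH (FILETABLE_DIRECTORY = N'Documents');
GO

-- returns the UNC root path that Windows Explorer can browse
SELECT FileTableRootPath(N'dbo.Documents');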

How to download a CSV file from an FTP location and update a table using a stored procedure

I want to download a CSV file from an FTP location and load that data into tables using a stored procedure. I am not sure how to do that, or whether a stored procedure is the right approach. I have gone through many posts, but most of them talk about pushing data to an FTP location.
Any help much appreciated.
Thank you.
For this requirement you may have to write a shell script to connect to the required FTP server, download the file to the local system, and then use the SQL loader utility supported by the respective database.
If you are using a programming language, you can write a program to download the file from the FTP location, read the CSV file, and insert the rows as a batch.
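A minimal shell sketch of that approach; the FTP host, credentials, file names, and the use of Oracle's SQL*Loader are all assumptions, since the question does not say which database is involved:

#!/bin/sh
# download the CSV from the FTP server (host, user, password and path are placeholders)
curl -u ftpuser:ftppass -o /tmp/data.csv ftp://ftp.example.com/exports/data.csv

# load it into the target table (assumes an Oracle database and a prepared SQL*Loader control file)
sqlldr userid=dbuser/dbpass control=/tmp/load_data.ctl data=/tmp/data.csv log=/tmp/load_data.log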

Update Access Database Linked to SharePoint Using CSV File

Synopsis
I need to bridge a gap between a CSV file and an Access 2007 database linked to a SharePoint list. The end goal is to fully automate this process: Remedy > CSV > Access > SharePoint
Problem Details
Remedy ---> CSV
I am using a macro in BMC Remedy to run a report and export data to a CSV file. Each time the macro runs, it is set up to overwrite the previous version of the CSV file.
CSV --x Access
While I can import the CSV table (as a text file) into Access, the program won't let me update the database. Creating macros, relationships, or queries is impossible since the CSV file is overwritten each time the Remedy macro runs. When I attempt to copy the CSV table to the linked database (using the export feature in Access) I get the following error:
You can't delete the table 'tablename'; it is participating in one or more relationships.
Access --> SharePoint
The Access database I want to update is linked to a SharePoint list so that any edits made (and saved) in Access update SharePoint.
Work-Around
I can copy and paste the data from the CSV into the Access database, but I want a better solution that doesn't require any maintenance.
I have tried creating a macro, but when I use RunCommand > SelectAll I get the following error:
The command or action 'SelectAll' isn't available now.
Is it possible to do this with a macro or do I need a VB script?
