Is there a way to import data from a csv file into a temporary table on Snowflake? Preferably using the web interface.
It is not possible to load a CSV file into a temporary table from the UI. It would be interesting to know the use case behind the question.
Temporary tables only exist within the session in which they were created and persist only for the remainder of that session. As such, they are not visible to other users or sessions. Once the session ends, the data stored in the table is purged completely from the system and is therefore not recoverable, either by the user who created the table or by Snowflake.
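If the SnowSQL CLI is an option instead of the web UI, something along these lines should work - the table definition, local file path and format options below are just placeholders:

    -- create the temporary table (columns are hypothetical)
    CREATE TEMPORARY TABLE my_temp_table (id INTEGER, name STRING, amount NUMBER(10,2));

    -- upload the local CSV to the user stage (PUT is a SnowSQL client command; it gzips the file by default)
    PUT file:///tmp/data.csv @~/csv_staging/;

    -- load the staged file into the temporary table
    COPY INTO my_temp_table
      FROM @~/csv_staging/
      FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);

The temporary table (and its data) still disappears when the session ends, exactly as described above.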
I'm trying to create an external table in Snowflake using an external stage in Azure Blob Storage. The query runs without errors, but since there are several files in the external stage I cannot get the table to pick up only the specific file I want to load. Even though I have tried to specify the file name with different paths, I still don't get a table with the right data.
Does anyone know how to specify a single file among many files in an external stage?
I would also like to update the table in Snowflake every time the file has been updated in Azure Blob Storage. How would you do that?
Thank you in advance!
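One way to narrow the external table down to a single file is to point the location at the file's folder and add a PATTERN that only matches that file - the stage name, path, columns and regex below are assumptions:

    CREATE OR REPLACE EXTERNAL TABLE my_ext_table (
      col1 VARCHAR AS (VALUE:c1::VARCHAR),
      col2 NUMBER  AS (VALUE:c2::NUMBER)
    )
    WITH LOCATION = @azure_stage/subfolder/     -- folder inside the stage that holds the file
    AUTO_REFRESH = TRUE                         -- needs an Azure Event Grid notification integration set up
    PATTERN = '.*my_file[.]csv'                 -- regex matching only the wanted file
    FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);

For keeping the table in sync when the file changes, AUTO_REFRESH with an Azure notification integration is one route; otherwise ALTER EXTERNAL TABLE my_ext_table REFRESH; can be run (or scheduled in a task) to re-read the stage metadata.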
I frequently need to validate CSVs submitted by clients to make sure that the headers and values in the file meet our specifications. Typically I do this by using the Import/Export Wizard and having the wizard create the table based on the CSV (the file name becomes the table name, and the headers become the column names). Then we run a set of stored procedures that check the information_schema for said table(s) and match that up with our specs, etc.
Most of the time this involves loading multiple files at a time for a client, which becomes very time-consuming and laborious when using the Import/Export Wizard. I tried using an xp_cmdshell SQL script to load everything from a path at once to achieve the same result, but xp_cmdshell is not supported by Azure SQL DB.
https://learn.microsoft.com/en-us/azure/azure-sql/load-from-csv-with-bcp
The above says that one can load using bcp, but it also requires the table to exist before the import... I need the table structure to mimic the CSV. Any ideas here?
Thanks
If you want to load the data into your target SQL DB, you can use Azure Data Factory (ADF) to upload your CSV files to Azure Blob Storage and then use a Copy Data activity to load the data from the CSV files into Azure SQL DB tables - without creating those tables upfront.
ADF supports 'auto create' of sink tables via the Copy activity's table option.
My application, Fusion BICC, is dumping data into Oracle Cloud object storage in the form of CSV files. I need to load this data into my target database, so I am loading it into an external table and then comparing the external table with the target table using MINUS; if a record is new I insert it, and if it already exists I update it. I need a few suggestions:
1) What is the best way to compare records when there is a huge amount of data?
2) Instead of going through an external table, is there any better way? SQL*Loader, UTL_FILE, etc.?
3) If a record gets deleted in BICC, it does not come through in the CSV file, but I have to delete such records if they are not in the file. How do I tackle that?
Other than DBMS_CLOUD, is there any other package to load the data? I am very new to this, so please advise.
To be clear, BICC is an application which dumps data as CSV files into Oracle Cloud; I am basically interested in reading data from cloud storage into DBaaS.
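On points 1) and 2): the compare-and-upsert described above can usually be collapsed into a single MERGE against the external table, rather than a MINUS followed by separate INSERT and UPDATE passes. A minimal sketch, with hypothetical table and column names:

    MERGE INTO target_table t
    USING bicc_ext_table s                -- external table created over the CSV in object storage
       ON (t.record_id = s.record_id)     -- business key
    WHEN MATCHED THEN
      UPDATE SET t.col1 = s.col1,
                 t.col2 = s.col2
    WHEN NOT MATCHED THEN
      INSERT (record_id, col1, col2)
      VALUES (s.record_id, s.col1, s.col2);

For point 3), if the CSV is a full extract, deletes could be handled afterwards with a DELETE ... WHERE NOT EXISTS against the external table; if it is an incremental extract, the file alone cannot tell you what was deleted.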
I have worked with external tables in Oracle; they can be created over a file containing data (subject to various other conditions). How are global temporary tables different from external tables?
An external table gets its content from e.g. a CSV file. The database itself does not store any data. Its content is visible to all sessions (= connections) to the server (provided the necessary access privileges exist). The data exists independently of the database and is only deleted (or changed) if the file is changed externally (as far as I know, Oracle cannot write to an external table, only read from it - but I haven't used them for ages, so maybe this changed in Oracle 18 or later).
The data for a temporary table is stored and managed inside the database, but each session keeps its own copy of the data in the table. The data is automatically removed by Oracle when the session is disconnected or when the transaction ends (depending on the definition of the temporary table). Data in a temporary table never survives a restart of the database server.
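A small sketch contrasting the two definitions (directory, file and column names are made up):

    -- External table: the data stays in customers.csv on the server's file system
    CREATE TABLE customers_ext (
      customer_id NUMBER,
      name        VARCHAR2(100)
    )
    ORGANIZATION EXTERNAL (
      TYPE ORACLE_LOADER
      DEFAULT DIRECTORY data_dir            -- an Oracle DIRECTORY object pointing at the OS folder
      ACCESS PARAMETERS (
        RECORDS DELIMITED BY NEWLINE
        FIELDS TERMINATED BY ','
      )
      LOCATION ('customers.csv')
    )
    REJECT LIMIT UNLIMITED;

    -- Global temporary table: the data lives inside the database, per session,
    -- and is gone at COMMIT or at session end depending on the ON COMMIT clause
    CREATE GLOBAL TEMPORARY TABLE customers_tmp (
      customer_id NUMBER,
      name        VARCHAR2(100)
    ) ON COMMIT PRESERVE ROWS;              -- or ON COMMIT DELETE ROWS for transaction scope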
Broadly, an external table is a placeholder definition which points to a file somewhere on the OS. These are generally used (though not exclusively) when you have an external interface sending you data in files. You can either load the data into a normal table using SQL*Loader, or use an external table to point at the file itself and simply query the table to read from the file. There are some limitations, though; for example, you cannot update an external table.
GTTs - global temporary tables - are used when you want to keep some on-the-fly information in a table such that it is only visible in the current session. There are good articles on both kinds of table if you want to go into more detail.
One more thing: accessing a GTT is generally faster than accessing an external table.
I am moving data from a folder in Azure Data Lake to SQL Server using Azure Data Factory (ADF).
The folder contains hundreds of .csv files. However, one inconsistent problem with these CSVs is that some (not all) have a final row containing a special character, which causes the load to fail when the target SQL table has data types other than NVARCHAR(MAX). To get around this, I first use ADF to load the data into staging tables where all columns are set to NVARCHAR(MAX), and then I insert the rows that do not contain a special character into tables with the appropriate data types.
This is a weekly process involving over a terabyte of data, and moving it takes forever, so I am looking into ways to import into my final tables directly rather than going through a staging component.
I notice that there is a 'pre-copy script' field that can execute before the load to SQL Server. I want to add code there that will allow me to filter out special characters or null rows before loading into SQL Server.
I am unsure how to approach this, since the CSVs are not stored in a table, so SQL code wouldn't work on them. Any guidance on how I can use the pre-copy script to clean my data before loading it into SQL Server?
The pre-copy script is a script that you run against the database before copying new data in; it is not meant to modify the data you are ingesting.
I already answered this on another question, providing a possible solution using an intermediate table: Pre-copy script in data factory or on the fly data processing
Hope this helped!
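For reference, a pre-copy script is typically just a short statement that prepares the sink, along these lines (table name is hypothetical):

    -- clear the target before the Copy activity inserts the new batch
    TRUNCATE TABLE dbo.WeeklyStaging;
    -- or remove only the slice being reloaded:
    -- DELETE FROM dbo.WeeklyStaging WHERE LoadDate = '2023-01-02';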
You could consider a stored procedure as the sink: https://learn.microsoft.com/en-us/azure/data-factory/connector-azure-sql-database#invoking-stored-procedure-for-sql-sink
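A rough sketch of what such a sink procedure could look like, assuming the Copy activity is configured with a table type so that ADF hands the copied rows to the procedure as a table-valued parameter; all names, columns and the filter rule below are assumptions:

    -- table type matching the columns coming out of the CSV (all NVARCHAR, like the staging tables)
    CREATE TYPE dbo.WeeklyRowType AS TABLE (
      Id     NVARCHAR(MAX),
      Amount NVARCHAR(MAX)
    );
    GO

    CREATE PROCEDURE dbo.usp_LoadWeeklyRows
        @Rows dbo.WeeklyRowType READONLY      -- ADF passes the copied rows in here
    AS
    BEGIN
        SET NOCOUNT ON;

        -- keep only rows made of printable ASCII, then cast to the real types
        INSERT INTO dbo.FinalTable (Id, Amount)
        SELECT TRY_CAST(Id AS INT),
               TRY_CAST(Amount AS DECIMAL(18, 2))
        FROM   @Rows
        WHERE  PATINDEX('%[^ -~]%', Id COLLATE Latin1_General_BIN) = 0
          AND  PATINDEX('%[^ -~]%', Amount COLLATE Latin1_General_BIN) = 0;
    END;

The sink is then pointed at dbo.usp_LoadWeeklyRows and the dbo.WeeklyRowType table type in the Copy activity settings, so the rows never have to land in a permanent staging table first.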