Azure Data Factory - File System to Oracle Cloud Storage

Is it possible to copy files from an on-prem file system to Oracle Cloud Storage? Note that we are not concerned with the data inside the files.
In simple terms, it is like copying files from one folder to another.
Here is what I have tried:
1. Created a self-hosted integration runtime for the file system (testing on my local machine)
2. Created a linked service for the file system
3. Created a linked service for Oracle Cloud Storage (OCS)
4. Created a dataset for the file system
5. Created a dataset for Oracle Cloud Storage (OCS)
However, when the connection is tested in step 2, I get an error saying that my C:\ path cannot be resolved.
In step 5 it says the dataset cannot be used as a sink because that is not supported for OCS. At this point it seems like it is not possible to copy files into OCS?
I tried different configurations to see if OCS can be used as a drop container for files.

Related

Create files and folders on a network drive using SSIS

I have created SSIS packages which create a new folder at the output location using a File System Task and then create data files in that folder using a Data Flow Task.
I have created a network drive which is mapped to an Azure blob. I pass the network drive path, such as Z:\, to the package, and the folder and files are created as expected and are also reflected in the Azure blob.
Now, when I schedule this package through a SQL Agent job, the File System Task fails with an error that it cannot find part of the path Z:\folderName. I thought this was because the SQL Server Agent service was not running under my user ID, so I started SQL Server Agent with my credentials, but it still gives the same error.
Note: my ID does not have direct access to the Azure blob, and the network drive is only accessible by my ID.
We are currently using Azure blob storage for dev, but we may use a separate server to store files, which is why I cannot use the Flexible File Task available in the SSIS Azure Feature Pack.
We had reasons we could not use UNC paths (\\server\share\file.txt) and struggled with mapped drives as well. The approach we ended up using was to explicitly map and unmap drives as part of our ETL process.
Execute Process Task - Mount drive
We used a batch file to mount the drive, as it made things easier for us, but the command takes a form something like
net use Z: \\server\share mypassword /user:user#azure
Execute Process Task - Unmount drive
Again, an Execute Process Task:
net use Z: /delete
We ended up with a precursor step, though I don't remember exactly how we accomplished it, that tested whether the drive had already been unmounted before we attempted to mount it. I think it was a File System Task that checked whether the drive existed and, if so, ran a duplicate of the unmount task. Something like the sketch below.
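Put together, a minimal batch sketch of that check/mount/unmount flow could look like this (the share, drive letter and credentials are placeholders, not the values we actually used):

@echo off
rem If Z: is still mapped from a previous run, drop it first
if exist Z:\ net use Z: /delete /y

rem Map the share for this ETL run
net use Z: \\server\share mypassword /user:user#azure

rem ... package steps that read from or write to Z:\ run here ...

rem Clean up when the work is done
net use Z: /delete /y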
I was able to resolve this using Credential Manager. I logged in as a domain user and then added the Azure path (like \\server), user ID and password to Windows Credentials. I then started the SQL Server Agent service under the same domain user ID and set the file path in the SSIS package configurations to \\server\share, matching what I had provided in Windows Credentials. After doing this I was able to successfully execute the package through the SQL Agent job.
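If you prefer to script that credential entry instead of using the Credential Manager UI, the equivalent from a command prompt run as that domain user would be something like the following (server, user and password are placeholders):

cmdkey /add:server /user:domain\userid /pass:password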
I was able to solve this issue using a Script Task (C# code) to move the data. I created a project parameter for the target share folder and used it as a ReadOnlyVariable for the Script Task. The target share folder was already mapped as a network drive on my local PC as well as on the application server. A sketch of the idea is shown below.
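A minimal sketch of what the Script Task's Main could look like, assuming a project parameter named TargetShareFolder passed in via ReadOnlyVariables and a hypothetical local staging folder (both names are illustrative only):

public void Main()
{
    // Mapped share folder, passed in through the project parameter
    string targetFolder = Dts.Variables["$Project::TargetShareFolder"].Value.ToString();

    // Hypothetical local folder produced by the Data Flow Task
    string sourceFolder = @"C:\Staging\Output";

    // Make sure the destination folder exists on the share
    System.IO.Directory.CreateDirectory(targetFolder);

    // Copy every data file across, overwriting older copies
    foreach (string file in System.IO.Directory.GetFiles(sourceFolder, "*.csv"))
    {
        string destination = System.IO.Path.Combine(
            targetFolder, System.IO.Path.GetFileName(file));
        System.IO.File.Copy(file, destination, true);
    }

    Dts.TaskResult = (int)ScriptResults.Success;
}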

One parameter file for different repositories in Informatica PowerCenter

I have a parameter file which assigns DB connections for one repository, which stands for test. It refers to the folder where the workflow and session are, like the following:
[ORANGE_REMIGRATION.WF:wf_m_remigration_payments_test.ST:s_m_remigration_payments_test]
I would like to know whether I can use one parameter file to assign DB connections for different environments, e.g. when the repository is PROD, the workflow should write to that environment. I need to know whether we can use repository names in the parameter file, e.g.
[MDM_TEST.ORANGE_REMIGRATION.WF:wf_m_remigration_payments_test.ST:s_m_remigration_payments_test]
where MDM_TEST would be the repository, followed by its list of DB connections, and then another list in the same parameter file for MDM_PROD. Is this possible, or is there another way to do this?
The documentation describing parameter file sections shows that the repository name is not allowed in a section heading.
However, if you migrate workflows between environments and use a different Integration Service, there are different values for $$ParamFileDir (or $PMRootDir in general). So if you refer to your file using the variable, the migrated workflow will use the parameter file for the given environment. Hence the DEV workflow would use DEV connections and the PROD one would use PROD connections. No action needed.
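For illustration, each environment could keep a parameter file at the same relative location (for example $PMRootDir/param/params.txt, referenced from the workflow or session properties via the variable), listing that environment's connections; the connection names below are hypothetical:

[ORANGE_REMIGRATION.WF:wf_m_remigration_payments_test.ST:s_m_remigration_payments_test]
$DBConnection_Source=ORA_SRC_DEV
$DBConnection_Target=ORA_TGT_DEV

The PROD server's copy of the file would carry the same section heading but list the PROD connection names, so the same workflow picks up the right connections after migration.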
I assume you have distinct physical servers for DEV and PROD. If you define folder structures that mirror each other, like
/share/param/Paramfile1 ==> DEV server
/share/param/Paramfile1 ==> PROD server
you can use the same parameter file path. These parameter files can be configured via the respective workflow properties.

How to load a file with a local path vs a path in an Azure DevOps build

I am trying to load a file in the post-deployment script of my DB project:
FROM OPENROWSET(BULK 'C:\Development\MyProject.Db\Scripts\My_Script.sql', SINGLE_CLOB)
Of course my build pipeline in Azure DevOps complains that it cannot find this script. But when I change it to the following, my local build complains:
FROM OPENROWSET(BULK '$(Build.ArtifactStagingDirectory)\MyProject.Db\Scripts\My_Script.sql', SINGLE_CLOB)
What is the right way here? I would like to keep my local path locally and switch to the build one (hoping that one is correct; I didn't even get the chance to try it because the build complains) when the Azure Pipelines build runs. How can I achieve this?
Generally, post-deployment scripts run on the servers once the databases have been deployed to them, so you can't directly use absolute (or relative) paths in the scripts to access local machines.
In Azure Pipelines, if we want to access remote services from local machines, there are many existing methods and tasks to create and use service connections to access those services.
However, for the reverse, accessing local machines from remote services, there is no existing, easy way. As far as I know, you may need to map the IP address and port of the local machine on the remote service, and you may also need to configure proxy and firewall settings on the local machine.
If the files you want to access in the post-deployment script have been deployed to the servers, you can use those file paths on the servers instead of paths on the local machines.
[UPDATE]
Each pipeline is assigned a working directory ($(Pipeline.Workspace)) inside the working directory of the agent ($(Agent.WorkFolder)).
If the files you want to access are source files in the source repository, and the repository has been checked out into the pipeline's working directory, you can access them in the directory $(Build.SourcesDirectory).
$(Build.SourcesDirectory) is the default working directory of the job, so you can access the source files using either relative paths or absolute paths.
For example:
Using a relative path:
MyProject.Db/Scripts/My_Script.sql
OR
./MyProject.Db/Scripts/My_Script.sql
Using an absolute path:
$(Build.SourcesDirectory)/MyProject.Db/Scripts/My_Script.sql
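For example, a quick pipeline step can confirm that the same file is reachable both ways once the repository has been checked out (this YAML is only a sketch, assuming a Windows agent):

steps:
- checkout: self
- script: |
    dir "$(Build.SourcesDirectory)\MyProject.Db\Scripts\My_Script.sql"
    dir "MyProject.Db\Scripts\My_Script.sql"
  displayName: Show the script via absolute and relative paths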

Moving a local GeoServer's data directory to a production server: unable to create a new store

I have a local GeoServer running on Tomcat which uses a PostGIS store to get layers from a PostgreSQL database. There is a production server that runs the same versions of GeoServer and PostgreSQL. In order to apply changes to the layers and layer groups of my local GeoServer, I copied the data directory over the production GeoServer's data directory. After restarting Tomcat on the production server, GeoServer is unable to load the Layers and Layer Preview pages in the web interface. When I try to change the host address of the store or create a new one, it gives me this error:
Error creating data store with the provided parameters: Failed to upgrade lock from read to write state, please re-try the configuration operation
You don't say which OS you are using or how you made the copy, but the most likely error is a permissions or ownership one.
Make sure that the user running GeoServer has permission to read, write and execute on the data directory. On Linux machines I've seen issues with UID and GID differences between machines, depending on how the copy is carried out. On Windows I've seen issues just because Windows and the virus scanner feel like it.
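On Linux, for example, something along these lines usually puts the ownership and permissions right (assuming GeoServer runs as the tomcat user and the data directory sits at /opt/geoserver/data_dir; both are placeholders for your own setup):

# Give the copied data directory to the account running GeoServer/Tomcat
sudo chown -R tomcat:tomcat /opt/geoserver/data_dir
# Owner needs read/write everywhere plus execute on directories to traverse them
sudo chmod -R u+rwX /opt/geoserver/data_dir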
When using the community module JDBCConfig, I hit the same issue.
It seems the error occurs because file locking is performed on the catalog.
Since the data directory is not used when JDBCConfig is enabled, the problem went away after setting file locking to disabled:
https://docs.geoserver.org/stable/en/user/configuration/globalsettings.html#file-locking

Informatica Cloud - Picking up files from SFTP and inserting records in Salesforce

Our objective is as follows:
a) Pick up a file "Test.csv" from a Secure FTP location.
b) After picking up the file we need to insert the contents of the file into an object in Salesforce.
I created the following connection for the remote SFTP location (which will contain "Test.csv"):
Step 1: the SFTP connection (screenshot not reproduced here).
Step 2: I then started to build a Data Synchronization task (screenshot not reproduced here).
What we want is for Informatica Cloud to connect to the secure FTP location and load the contents of the .csv from that location into our object in Salesforce.
But as you can see in step 2, it does not allow me to choose the .csv from that remote location.
Instead, the wizard prompts me to choose a file from a local directory (on my machine, where the Secure Agent is running), which is not what I want.
What should I do in this scenario? Can someone help?
You can write a UNIX script to transfer the file to your Secure Agent and then use Informatica to read the file. Although I have never tried using SFTP in the cloud product, I have used Informatica Cloud and I do know that all files are tied to the location of the Secure Agent (either a server or a local computer).
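For instance, a small script scheduled on the Secure Agent machine could pull the file down before the task runs; the host, user, key and paths below are placeholders:

#!/bin/sh
# Fetch Test.csv from the SFTP server into the Secure Agent's source-file directory
# using key-based authentication (all values here are placeholders)
sftp -i /home/infa/.ssh/id_rsa infauser@sftp.example.com <<'EOF'
get /outbound/Test.csv /opt/infaagent/srcfiles/
bye
EOF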
The local directory is used for template files. The idea is that you set up the task using a local template and then IC will connect to the FTP site when you actually run the task.
This Informatica video explains how it works at around 1:10:
http://videos.informaticacloud.com/2FQjj/secure-ftp-and-salesforececom-using-informatica-cloud/
Can you tell us the Secure Agent OS, Windows or Linux?
For a Windows environment you will have to call the script using the WinSCP or Cygwin utility; I recommend the former.
For Linux, the basic commands in the script should work.
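On Windows, for instance, a WinSCP command-line call of roughly this shape can do the transfer (host, credentials, host key fingerprint and paths are placeholders):

winscp.com /command ^
  "open sftp://infauser:password@sftp.example.com/ -hostkey=""ssh-ed25519 255 xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx""" ^
  "get /outbound/Test.csv C:\InformaticaFiles\" ^
  "exit"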
