Is it possible to start up Adaptive Server in Sybase ASE 15.0 without the .cfg file being present? On each server start-up, the message "reading .cfg file for configuration information" is displayed. So if I delete the .cfg file for the server I'm starting up, will I still be able to start the server, or will start-up always fail?
The server should start fine without a .cfg file. If the file does not exist when the server starts up, it will create it.
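For context, the .cfg file a server reads is the one passed via the -c flag in its RUN_<servername> script. A typical script looks something like this (the paths and server name are illustrative assumptions):

#!/bin/sh
# Typical RUN_MYSERVER start-up script for ASE 15.0 (paths are illustrative)
/opt/sybase/ASE-15_0/bin/dataserver \
  -d/opt/sybase/data/master.dat \
  -sMYSERVER \
  -e/opt/sybase/ASE-15_0/install/MYSERVER.log \
  -c/opt/sybase/ASE-15_0/MYSERVER.cfg

If the file named by -c is missing, the server starts with default values and writes a fresh MYSERVER.cfg, as described above.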
All this information is available in the documentation:
ASE System Admin Guide - V1: Setting Configuration Parameters
I'm in the process of migrating a database from one server to another. When I try to select the backup file (.bak) within Docker, I get an 'Access Denied' error. How do I give the Docker container access permission to the file?
I had the same error, and what I concluded is that the file was not in the format it should be. To be precise: since I have remote access to my Linux server with a GUI, I had simply copied the .bak file from the Windows machine to Linux, which, I repeat, is not the advised way to transfer files from one OS to another. I solved the problem by uploading the .bak to Google Drive and then generating a download link that I used in the terminal later.
To generate a proper link from Google Drive I recommend the following guide:
https://bytesbin.com/skip-google-drive-virus-scan-warning-large-files/
When you get the link, type in the terminal:
$ curl -L -o The_Name.bak "The_link"
For restoring the database follow this tutorial:
https://learn.microsoft.com/en-us/sql/linux/tutorial-restore-backup-in-sql-server-container?view=sql-server-ver16
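If you would rather keep the file local, a typical sequence is to copy it into the container and fix its ownership before restoring, roughly as in the linked tutorial (the container name sql1, the SA password, and the database/file names are illustrative assumptions):

$ docker exec -u root sql1 mkdir -p /var/opt/mssql/backup
$ docker cp The_Name.bak sql1:/var/opt/mssql/backup/
$ docker exec -u root sql1 chown mssql /var/opt/mssql/backup/The_Name.bak
$ docker exec -it sql1 /opt/mssql-tools/bin/sqlcmd -S localhost -U SA -P '<YourPassword>' \
    -Q "RESTORE DATABASE MyDb FROM DISK = '/var/opt/mssql/backup/The_Name.bak'"

The chown step is what addresses the 'Access Denied' part: SQL Server inside the container runs as the mssql user, which must be able to read the file. Depending on where the original data and log files lived, the RESTORE may also need a WITH MOVE clause.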
When one runs a job from the server (by selecting the server, as below), does PDI pick up the kettle.properties file from the server or from the local computer the job is run from? What about the Pentaho User Console portal - where is the file picked up from when one runs jobs from there? Is there any way to tell PDI which kettle.properties file to use?
AFAIK, there is no way to pick a kettle.properties file location from within the Spoon interface right before executing a job/transformation.
The kettle.properties file used is always linked to the instance of Kettle that executes the job/transformation.
When running a job locally with the PDI Client (Spoon), the kettle.properties file used is the one contained in the directory pointed to by the -DKETTLE_HOME JVM option (defined when running the spoon.sh or Spoon.bat launch scripts).
When running a job/transformation on the Pentaho Server (by either scheduling it explicitly on the Server from Spoon, or by running it from the PUC), the kettle.properties file used is the one located in the directory pointed to by the -DKETTLE_HOME JVM option defined when running the start-pentaho.sh or the start-pentaho.bat launch scripts.
By default, KETTLE_HOME is the user's home directory, so both the PDI Client and the Pentaho Server look for kettle.properties in ~/.kettle.
If you want to use a kettle.properties file located somewhere else, you will have to define the location of the Kettle Home directory yourself before starting the PDI Client or the Pentaho Server:
By setting an environment variable called KETTLE_HOME. It has to be set before running the Spoon launch scripts or the Pentaho Server launch scripts.
For the Pentaho Server, you can also add the option -DKETTLE_HOME to CATALINA_OPTS (if the Pentaho Server uses Tomcat) by editing the launch script.
You can find this information on the Customize the Pentaho Server page.
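For example (the directory is an illustrative assumption; Kettle then reads $KETTLE_HOME/.kettle/kettle.properties):

# PDI Client (Spoon) on Linux:
$ export KETTLE_HOME=/opt/pentaho/kettle_config
$ ./spoon.sh

# Pentaho Server on Tomcat, added to CATALINA_OPTS in the launch script:
CATALINA_OPTS="$CATALINA_OPTS -DKETTLE_HOME=/opt/pentaho/kettle_config"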
I have an SSIS package that works fine in Visual Studio and also on the pre-production server, but when I run it on the production server and try to load .xls files, nothing happens; it doesn't see any files. I have the same SQL Server permissions on both servers. Can somebody help me? I would really appreciate any reaction!
Thank you very much for the suggestions!
My package imports .xls files into a SQL Server database.
In the package I have a Foreach Loop container, and for the directory I use the variable "Input Folder Path". In the expression on my connection manager I have the variable "Full Path" (with the file name).
On the pre-production server the package works and loads the .xls files; on the production server it does not, and I get the message: "For each file enumerator, no files were found that matched the file pattern or the specified directory was empty."
I set DelayValidation (on the connection manager and the task) to true. When I deploy my project to the server again, I get the message "another user uses files", but nobody uses this folder, only me.
I use the mapped path \\fileserv\clientspath.filename.
I do not know what else I can write about this that would help.
I have a requirement: I have a text file, and data will always be dumped into that file.
Whenever there is an update to that text file, the same update should happen in SQL Server as well.
Is that possible?
SQL Server Agent jobs with Integration Services packages are not recommended here; is there any other way?
You can write your own application, for example a .NET Windows service with a file watchdog, and monitor file content changes.
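As a rough sketch of that watchdog pattern (shown here as a Linux shell script using inotify-tools and bcp rather than .NET; the file path, table, and credentials are illustrative assumptions):

#!/bin/sh
# Block until the file is modified, then re-load it into SQL Server.
FILE=/data/dump.txt
while inotifywait -e modify "$FILE"; do
    # bcp re-imports the whole file; for an append-only dump you would
    # track the last read offset instead (omitted for brevity).
    bcp MyDb.dbo.MyTable in "$FILE" -S sqlhost -U loader -P 'secret' -c
done

A .NET FileSystemWatcher service would follow the same wait-then-load loop.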
EDIT:
In SSIS you can also try to use File Watcher Task:
Handling file access locks while file is being built
Out of curiosity, why is using SSIS and SQL Server Agent not recommended?
I want to create a Lotus Notes agent that will run on the server to generate a text file. Once the file is created, I need to send it to a remote server.
What is the best/easiest way to send the file to a remote server?
Thanks
If your "remote" server is on a local windows network, you can simply copy the file from the server file system to a UNC path (\myserver\folder\file.txt) using the FileCopy statement. If not, you may want to look at using a Java agent, which would make more file transfer protocols easily accessible.
In either case, be sure to understand the security restrictions on Notes agents - for your agent to run on the server and create a file on the server's file system, the agent will need to be flagged with a runtime security level of 2 or 3, and signed by an appropriately authorized ID.
Sending or copying files to a remote server using OS-level commands requires that the destination server also be mapped as a drive on your source server. As Ed rightly said, security needs to allow you to save files down onto the server before you try to copy them.
You can generate the file locally on the server and then use FTP commands in a script to send the file. Or if you're a Java guru, you can try using Java FTP to send the file as well. I had some trouble with it, but it should be possible provided an FTP account is set up on the destination server. FTP-related stuff by a well-known Notes guy can be found here and here
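A minimal version of such an FTP script might look like this (host, account, and paths are illustrative assumptions; on Windows the same commands can be put in a file and run with ftp -s:commands.txt):

#!/bin/sh
# -n suppresses auto-login so the credentials can be given explicitly.
ftp -n remote.example.com <<'EOF'
user ftpuser secret
binary
put /tmp/export.txt export.txt
bye
EOF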
I have done it using a script, and it's clumsy but effective in simply pushing files around. Ideally, if the server at the other end is a Domino server as well, you could actually attach the file in an email and send it to a mail-in account on the destination server. I have done that before, and it's great as you can just hand the whole problem of moving the file off to the SMTP process.