I have a service running against a specific directory in 5-second intervals. It picks up an XML file created in that directory, sends it to another client for some necessary authorization checks, and then requests a response file.
My issue is that my Z_PROGRAM creating the XML file might take longer than 5 seconds because of the file's size, so creating the file directly in that directory is not an option. My idea is to create a subfolder in that directory called "temporary", create the file inside that folder, and once I'm done with it, move it back outside for the service to pick up.
Is there any way to move files from one directory to another via ABAP code only?
Copying the file manually is not an option, since the problem I have during file creation would still persist. I need two alternatives: one for local directories and one for application server directories. Any ideas?
Generally, we create a second, empty file once the file creation process has finished. The third party must first check that this empty file exists before picking up the data file. Example:
data file.csv
data file.ok
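A minimal sketch of that pattern on the application server (the /usr/sap/out paths and file names are only placeholders):

DATA: lv_data_file TYPE string,
      lv_ok_file   TYPE string.

lv_data_file = '/usr/sap/out/data_file.csv'.
lv_ok_file   = '/usr/sap/out/data_file.ok'.

* Write the data file completely first ...
OPEN DATASET lv_data_file FOR OUTPUT IN TEXT MODE ENCODING UTF-8.
* ... TRANSFER all lines of the data file here ...
CLOSE DATASET lv_data_file.

* ... and only afterwards create the empty "ready" marker the third party polls for.
OPEN DATASET lv_ok_file FOR OUTPUT IN TEXT MODE ENCODING UTF-8.
CLOSE DATASET lv_ok_file.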
If you have already completed your integration and it is not easy to change anything with the third parties, I prefer using OS-level file move commands: mv on a Linux server, move on Windows. If your file is big, you will hit the same problem with anything based on the OPEN DATASET concept. There is the ARCHIVFILE_SERVER_TO_SERVER function module for moving files, but it also uses OPEN DATASET internally.
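If you do go the FM route, a rough sketch of the call (parameter names as I remember them - check the actual signature in SE37; the paths are placeholders):

CALL FUNCTION 'ARCHIVFILE_SERVER_TO_SERVER'
  EXPORTING
    sourcepath = '/usr/sap/temporary/File.xml'
    targetpath = '/usr/sap/final/File.xml'
  EXCEPTIONS
    OTHERS     = 1.
IF sy-subrc <> 0.
  " copy failed - the source file is left untouched
ENDIF.

As far as I know it copies rather than moves, so you may still need a DELETE DATASET on the source afterwards.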
There is no explicit move command in ABAP that moves or copies files between directories on the application server.
There are two tips that can be helpful in your case. If you are writing a big file, you should separate the logic of collecting the data from the logic of writing the file. In other words, don't execute TRANSFER inside your data-collection loop; instead, collect your data into an internal table first, and once you're done, loop over that internal table and write the lines out as plain strings without any delay between them. Written this way, you should be able to write files of up to several hundred MB in under a second.
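A minimal sketch of that first tip (the table and the path are made up for illustration):

DATA: lt_lines   TYPE TABLE OF string,
      lv_line    TYPE string,
      lv_outfile TYPE string.

lv_outfile = '/usr/sap/temporary/File.xml'.

* Phase 1: collect everything in memory - no file access in this phase.
* ... APPEND the finished XML lines to lt_lines while processing your data ...

* Phase 2: stream the buffer out in one tight loop, nothing but TRANSFER inside.
OPEN DATASET lv_outfile FOR OUTPUT IN TEXT MODE ENCODING UTF-8.
LOOP AT lt_lines INTO lv_line.
  TRANSFER lv_line TO lv_outfile.
ENDLOOP.
CLOSE DATASET lv_outfile.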
The next tip is to not modify your program (or the function modules you use to construct the XML) at all: write to a temporary directory, and after finishing, have another program open your file in the source directory with READ DATASET and directly write the data to the new directory, again just strings without interruptions.
You should be fine if you just write strings.
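A sketch of that second, copy-program approach, using the temporary and final directories from the question:

DATA: lv_src  TYPE string,
      lv_dst  TYPE string,
      lv_line TYPE string.

lv_src = '/usr/sap/temporary/File.xml'.
lv_dst = '/usr/sap/final/File.xml'.

OPEN DATASET lv_src FOR INPUT  IN TEXT MODE ENCODING UTF-8.
OPEN DATASET lv_dst FOR OUTPUT IN TEXT MODE ENCODING UTF-8.
DO.
  READ DATASET lv_src INTO lv_line.
  IF sy-subrc <> 0.
    EXIT.                     " end of the source file
  ENDIF.
  TRANSFER lv_line TO lv_dst.
ENDDO.
CLOSE DATASET lv_src.
CLOSE DATASET lv_dst.
DELETE DATASET lv_src.        " optionally drop the temporary copy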
You can simply use a system call command to perform actions on the application server directory:
DATA lv_command(255) TYPE c.

lv_command = 'mv /usr/sap/temporary/File.xml /usr/sap/final/File.xml'.

CALL 'SYSTEM'
  ID 'COMMAND' FIELD lv_command.
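Note that CALL 'SYSTEM' is an undocumented kernel call. If you prefer the released route, define an external command in SM69 and run it via SXPG_COMMAND_EXECUTE; a rough sketch, where Z_MOVE_FILE is a made-up command name:

CALL FUNCTION 'SXPG_COMMAND_EXECUTE'
  EXPORTING
    commandname           = 'Z_MOVE_FILE'
    additional_parameters = '/usr/sap/temporary/File.xml /usr/sap/final/File.xml'
  EXCEPTIONS
    OTHERS                = 1.
IF sy-subrc <> 0.
  " command missing in SM69, not permitted, or failed at OS level
ENDIF.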
I have a couple of lines of code in a batch file in Windows 10 that open a session of Octave and load a script that uses design parameters contained in a .txt file. The batch file is named (for example) "Design123.bat", and when Octave runs, it automatically finds the design parameters in the file "Design123.txt" by simple string manipulation of the file name, i.e. strrep(filename,".bat",".txt"), where filename = '%~dpn0' is passed to Octave from the batch file. This allows the contents of the batch file to stay simple and constant, and the file name of the batch file is the only thing tying it to the .txt file.
I do all of this to allow running the Octave script by double-clicking the batch file for convenience, instead of being forced to use the more tedious process of uigetfile in Octave. This works very well, but the catch is that I have to place a copy of the batch file in the same directory with the design (.txt) files (of which there are thousands, but each within their own directory) and give it the same file name to get it to work. Is there a way to quickly create the batch files somehow? The most ideal situation I can think of is to be able to right-click (or somehow select) a .txt design file, and create a batch file (replacing .txt with .bat) and place my lines of code into it.
Any ideas? I have coding experience, but only in software packages like VBA and Octave, not within operating systems themselves, though I am certainly willing to learn if I can get pointed in the right direction. The design file names follow a distinctive pattern, so they could easily be filtered within an operation on the active File Explorer window in Windows 10, if something like that is possible. Thanks in advance.
You might want to compose the answer to your question from two pieces: calling a script from a right-click (context menu) entry, and running the .m script with command-line arguments.
If that fails, uigetfile is certainly not the only way to get a file. At the very least, you could always copy-paste a folder's path string from Explorer into an Octave function call.
Finally, I guess I'll mention the existence of octave-cli, which runs in a terminal instead of the GUI. It might be better suited for running non-interactive scripts.
I am mounting a folder as a virtual drive and I want to run a .exe file every time the user opens any file present in that folder. To be precise, the folder would contain dummy files for files present on some other machine. By dummy files I mean the file would be listed but would be empty. Whenever the user opens a file, I want the .exe program to download that file from the other machine and display it to the user.
That functionality (remote access on demand) can be implemented using reparse points and file system filters.
You could use hooks to rewrite the jump address of OpenFile and, in the detour function, check the handle type, retrieve its info using GetFileInformationByHandleEx, parse the data, download what you need, open the downloaded file, and then return STATUS_SUCCESS or an appropriate error status in case one occurs.
Note:
This is a bit more complicated, as you also need an auto-inject mechanism to inject the function/library into each process according to its architecture.
This is not a safe procedure, as most AVs will most likely consider your code malware.
I tried to list the files in my local directory as well as to read a file, but was not able to do it. Can anyone give me sample code to get the list of files in a directory using gwt-filesystem?
It depends on where you are reading from. You obviously can't read from the client side due to security restrictions (no browser would let you do that). However, you can obviously read the list of files in a folder on the server side.
I am using the SSIS Foreach Loop Container to iterate through files with a certain pattern on a network share.
I am encountering a malfunction of the Loop Container that I cannot reliably reproduce:
Sometimes the loop is executed twice: after all files have been processed, it starts over with the first file.
Has anyone encountered a similar bug?
Maybe not directly using SSIS, but when accessing files on a Windows share with some other kind of technology?
Could this error relate to some network issues?
Thanks.
I found this to be the case while working with Excel files and using the *.xlsx wildcard to drive the foreach.
Once I put logging in place, I noticed that when the Excel file was opened, Excel produced a file prefixed with ~$. This was picked up by the foreach loop.
So I used a trick similar to http://geekswithblogs.net/Compudicted/archive/2012/01/11/the-ssis-expression-wayndashskipping-an-unwanted-file.aspx to exclude files with ~$ in the filename.
What error messages (SSIS log / Event Viewer messages) do you get?
Similar to @Siva, I've not come across this, but here are some ideas you could use to try to diagnose it. You may be doing some of these already; I've just written them down for completeness, from my own thought process...
Log all files processed: write a line to a log file/table before processing each file and another after it, keeping the full path of each file. This is actually something we do as standard with our ETL implementations, as users often come back to us with questions about when/what has been loaded. It will let you see whether files are actually being processed twice.
Perhaps try moving each file to a different directory after it is processed. That will make it harder for a file to be processed a second time, and the problem may disappear. (If you are processing them from an area that is a "master" area, and so can't move them, consider copying the files to a "waiting" folder, processing them from there, and then moving them to a "processed" folder.)
@Siva's comment is interesting - look at the "traverse subfolders" check box.
Check your Event Viewer for odd network events, or application events (SQL Server restarting?).
Use Perfmon to see if there is anything odd happening in terms of network load on your server (a bit of a random idea!).
Try running your whole process with files on a local disk instead of a network disk. If your mean time between failures is roughly one failure every ten runs, you could run the load locally 20-30 times; if you don't get an error, it may well be a network issue.
Nothing helped, so I implemented the following workaround: a script task in the foreach iterator which tracks all files. If a file was already loaded, a warning is fired and the file is not processed again. Anyway, it seems to be some network-related problem...
Is there any built-in way to read a file with SSIS and, after reading it, clear the file of all content?
Use a File System Task in the Control Flow to either delete or move the file. If you want an empty file, then you can recreate the file with another File System Task after you have deleted it.
My team generally relies on moving files to archive folders after we process a file. The archive folder is compressed whereas the working folder is uncompressed. We set up a process with our Data Center IT to archive the files in those folders to tape on a regular schedule. This gives us full freedom to retrieve any raw files we have processed, while getting them off the SAN without requiring department resources.
What we do is create a template file (that just has headers) and then copy it to a file of the name we want to use for processing.