Can I create a Gatling script that actually downloads a file to my local disk?
I tried to create a script to download a file. When I ran it, it reported success, but it looks like the file wasn't actually downloaded. So I was wondering: can a file actually be downloaded using Gatling?
Gatling doesn't optimise for this use case.
You can capture the whole byte array in memory with a bodyBytes check, and then save it to the filesystem in an exec(function).
But that means you'll retain the whole content in memory at some point, instead of streaming the chunks to the filesystem and discarding them from memory.
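A minimal sketch of the mechanics: in the scenario you would add a check such as .check(bodyBytes().saveAs("fileBytes")) (Gatling Java DSL), then hand the saved byte[] to a plain file write inside an exec(session -> ...) block. The writing step itself needs only stdlib Java; the target path and session key here are assumptions, not Gatling requirements:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class SaveBody {
    // What the exec(function) would do with the byte[] pulled from the session
    // (e.g. byte[] body = session.get("fileBytes")): a single plain write.
    // Note the entire response body is already sitting in memory at this
    // point -- nothing was streamed to disk in chunks.
    static Path saveBytes(byte[] body, Path target) throws IOException {
        return Files.write(target, body);
    }

    public static void main(String[] args) throws IOException {
        Path out = saveBytes("fake response body".getBytes(), Path.of("report.bin"));
        System.out.println(Files.size(out));  // prints 18
    }
}
```

This is exactly the memory trade-off described above: for a large download, the whole body lives on the heap of the load generator before the write happens.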
I have a service running against a specific directory in 5-second intervals: it picks up an XML file created in that directory, sends it to another client for some necessary authorization checks, and then requests a response file.
My issue is that my Z_PROGRAM creating the XML file might take longer than 5 seconds because of the file's size, so creating the file directly in that directory is not preferable. I thought about creating a subfolder called "temporary" in that directory, creating the file inside it, and then, once I'm done with it, moving it back out for the service to pick up.
Is there any way to move files from one directory to another via ABAP code only?
Copying the file manually is not an option, since the problem I have during file creation would still persist. I need two alternatives: one for local directories and one for application server directories. Any ideas?
Generally, we create a second, empty file once the file creation process ends. Third parties must first check that the empty marker file is there. Example:
data file.csv
data file.ok
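The pattern above, sketched in Java rather than ABAP (file names taken from the example, everything else assumed): the writer creates the empty .ok marker only after the data file is fully written, and the polling consumer checks for the marker before touching the data.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class OkMarker {
    // Writer side: create the data file first, then an empty ".ok" marker.
    // The marker only appears once writing has completely finished.
    static void writeWithMarker(Path data, byte[] content) throws IOException {
        Files.write(data, content);              // may take long for a big file
        Files.createFile(Path.of(data + ".ok")); // cheap, effectively instant
    }

    // Consumer side (the 5-second polling service): pick up the data file
    // only when its ".ok" marker exists.
    static boolean readyToPickUp(Path data) {
        return Files.exists(Path.of(data + ".ok"));
    }

    public static void main(String[] args) throws IOException {
        Path dir = Files.createTempDirectory("drop");
        Path data = dir.resolve("data_file.csv");
        System.out.println(readyToPickUp(data)); // false: nothing complete yet
        writeWithMarker(data, "a;b;c\n".getBytes());
        System.out.println(readyToPickUp(data)); // true: safe to process
    }
}
```

The point of the marker is that its creation is near-instant, so the consumer can never observe a half-written data file, no matter how long the data file takes to produce.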
If you have already completed your integration and it is not easy to make changes with the third parties, I prefer using OS-level file move commands: mv on a Linux server, move on Windows. If your file is big, you will hit the same problem with the OPEN DATASET approach. There is also the ARCHIVFILE_SERVER_TO_SERVER function module for moving files, but it uses OPEN DATASET as well.
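The reason an OS-level move sidesteps the size problem: within one filesystem a move is just a directory-entry rename, so no bytes are re-read or re-written, unlike the OPEN DATASET read-and-rewrite approach. A sketch of that semantics in Java (file names assumed); java.nio's Files.move behaves the same way as mv when source and target are on the same filesystem:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;

public class MoveFile {
    // A move within the same filesystem is a rename: constant time
    // regardless of file size, because the file's bytes never move.
    static void move(Path from, Path to) throws IOException {
        Files.move(from, to, StandardCopyOption.REPLACE_EXISTING);
    }

    public static void main(String[] args) throws IOException {
        Path src = Files.writeString(Path.of("big.xml"), "<doc/>");
        move(src, Path.of("final.xml"));
        System.out.println(Files.exists(Path.of("final.xml"))); // true
        System.out.println(Files.exists(src));                  // false
    }
}
```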
There is no explicit move command in ABAP that moves or copies files between directories on the application server.
Two tips can be helpful in your case. First, if you are writing a big file, separate the logic of collecting the data from the logic of writing the file: don't execute TRANSFER inside your collection loop. Instead, collect your data into an internal table; once you're done, loop over that internal table and write the strings directly, without any delay. You should be able to write big files of up to several hundred MB in under a second.
The second tip, if you don't want to modify your program (or the function modules you use to construct the XML): write to a temporary directory, and after finishing, have another program open the file in the source directory with READ DATASET and write the data directly to the new directory, again just strings, without interruptions.
You should be OK if you just write strings.
You can simply use system call commands to perform actions in the application server directory:

CALL 'SYSTEM'
  ID 'COMMAND'
  FIELD 'mv /usr/sap/temporary/File.xml /usr/sap/final/file.xml'.
When I loaded files into a Snowflake stage, I saw a difference in the number of bytes loaded compared to the files on my local system. Does anyone know the reason for this, and how it can be resolved?
The file size on my local machine is 16622146 bytes; after loading into the stage it shows as 16622160 bytes. I have checked with both .csv and .txt file types. (I know the .txt file type is not supported in Snowflake.)
I compressed the file and loaded it into the Snowflake stage using snowsql's PUT command.
When loading small files from the local file system, Snowflake automatically compresses the file. Please try that option.
Refer to this section of the documentation:
https://docs.snowflake.com/en/user-guide/data-load-prepare.html#data-file-compression
This will help you avoid data-corruption issues during compression.
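One hedged illustration of why the staged byte count differs from the local one: gzip adds its own header and trailer, and its output depends on the compression level and implementation, so a gzipped copy essentially never matches the original byte count exactly. In plain Java:

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.util.zip.GZIPOutputStream;

public class GzipSize {
    // Compress a buffer with gzip and return the compressed size in bytes.
    static int gzipSize(byte[] data) throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (GZIPOutputStream gz = new GZIPOutputStream(bos)) {
            gz.write(data);
        }
        return bos.size();
    }

    public static void main(String[] args) throws IOException {
        byte[] csv = "col1,col2,col3\n".repeat(10_000).getBytes();
        System.out.println(csv.length);     // 150000 bytes locally
        System.out.println(gzipSize(csv));  // far fewer bytes once compressed
        // Even for data gzip cannot shrink, its fixed header and trailer
        // still change the size slightly -- which may explain small
        // differences like the 14 bytes observed in the question.
    }
}
```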
I am mounting a folder as a virtual drive, and I want to run an .exe file every time the user opens any file in that folder. To be precise, the folder would contain dummy files for files that actually live on another machine; by dummy files I mean each file would be listed but would be empty. Whenever the user opens a file, I want the .exe program to download that file from the other machine and display it to the user.
That functionality (remote access on demand) can be implemented using reparse points and file system filters.
You could use hooks to rewrite the jump address of OpenFile and, in the detour function, check the handle type, retrieve its info using GetFileInformationByHandleEx, parse the data, download what you need, open the downloaded file, and then return STATUS_SUCCESS or an appropriate error status in case one occurs.
Note:
this is a bit more complicated than it sounds, as you also need an auto-injection mechanism to inject the function/library into each process according to its architecture.
this is not a safe procedure, as most AVs will most likely consider your code malware.
I am recording a script with an HTTP-based protocol, and I am saving a file: a pop-up opens and saves a file (.doc) on my local computer. While replaying the script, I want to check the downloaded file path. Is there any method for this?
LoadRunner 9.52
You will not have a file on replay: your download comes in the context of the HTTP data flow. If you log the information for the request you can check the log, but this still will not be a file that you can open. Your best bet is to use web_reg_find() or web_reg_save_param() to check for the existence of both the file header and the file footer in your HTTP download stream. You may also want to check the size against the previous download. Then add some logic like this (pseudocode):
if ( file_header_exists
     && file_footer_exists
     && file_size > some_minimum_number_of_bytes )
then ( I_have_a_valid_downloaded_file )
Just imagine what you would be doing to your local file system if you required that all of the files be written to the local file system during the performance test: your local hard drive would become a bottleneck for your entire load generator.
You might also consider running a single GUI virtual user (based on the QuickTest Professional technology stack) to check the file download for a single user, as a functional check.
If you are still engaging in functional checks, then you are likely testing too soon for performance: if it does not work for one, it will never work for many.
I want the user to be able to copy a file that's stored on disk from my GTK application to a normal file manager like Nautilus. How can I do that? I would prefer to just write a path into the clipboard and let the file manager take care of actually copying, is that possible?
I just found an example in which it seems as if the actual file data is transferred through the clipboard – but is that the only possible way?
You need CF_HDROP and possibly other shell clipboard formats.
See: http://msdn.microsoft.com/en-us/library/windows/desktop/bb776902(v=vs.85).aspx