I want to migrate files (attachments) from an FTP server to another system (Salesforce). To do that I am going to use Talend. I have no clue which components to use, and in which order, to download the files (multiple formats, but downloadable via an HTTP link) and to insert them into the Salesforce database. I would be grateful if someone could explain how to proceed (which components to use and how to connect them).
Based on the info provided, first you will obtain the files from the remote server, then load them as BLOBs into a database.
A typical FTP flow chains four components. The first is a connection to the server, which allows connection reuse. The second is optional; it lets you get a count of the files prior to your operations (you can use it later to make sure you retrieved all of them). The third component (tFTPGet) is technically all you need; it grabs the files based on the file mask you set. The final component, tFTPDelete, cleans up the remote directory.
Once you have the files locally, see this help link for information on how to insert files as BLOBs into a database. You will have to tweak it for your Salesforce destination.
I'm developing a single-page app for image annotation. Each .jpg file is stored on an S3/MinIO service, coupled with a .xml file (Pascal VOC format) that describes the coordinates and positions of each annotation associated with the image.
I'd like to fetch all the XML data so that I can filter my image results within the web app (based on ReactJS). But thousands of requests to an S3 server directly from a web app seem a bit odd to me; nevertheless, I would prefer to avoid using any "middleware" server (like Python/Flask or Node.js) and rely only on the ReactJS app.
I haven't been able to find any way to download the content of all the XML files with a single AJAX call; do you have any ideas for addressing this kind of issue?
The S3 API doesn't provide a way to fetch multiple objects in a single operation. As you have suggested in your question, your application will need to handle this logic by first getting a list of the objects and then iterating through that list.
Alternatively, you could consider storing the XML files as a single archive so they can be fetched in one request.
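If it helps, here is a minimal TypeScript sketch of that list-then-iterate pattern using the AWS SDK for JavaScript v3. The bucket, prefix, endpoint, and credentials are placeholders, and MinIO usually needs path-style addressing:

```typescript
// Minimal sketch (browser or Node, AWS SDK for JavaScript v3).
// Bucket name, prefix, and endpoint below are placeholders for your MinIO/S3 setup.
import { S3Client, ListObjectsV2Command, GetObjectCommand } from "@aws-sdk/client-s3";

const s3 = new S3Client({
  region: "us-east-1",
  endpoint: "https://minio.example.com",   // hypothetical MinIO endpoint
  credentials: { accessKeyId: "...", secretAccessKey: "..." },
  forcePathStyle: true,                    // usually required for MinIO
});

async function fetchAllAnnotations(bucket: string, prefix: string): Promise<string[]> {
  const xmlDocs: string[] = [];
  let continuationToken: string | undefined;

  do {
    // 1. List a page of keys under the prefix (ListObjectsV2 returns at most 1000 per call).
    const page = await s3.send(new ListObjectsV2Command({
      Bucket: bucket,
      Prefix: prefix,
      ContinuationToken: continuationToken,
    }));

    // 2. Download each .xml object individually -- there is no batch-get in the S3 API.
    for (const obj of page.Contents ?? []) {
      if (!obj.Key?.endsWith(".xml")) continue;
      const res = await s3.send(new GetObjectCommand({ Bucket: bucket, Key: obj.Key }));
      xmlDocs.push(await res.Body!.transformToString()); // available in recent SDK v3 versions
    }

    continuationToken = page.NextContinuationToken;
  } while (continuationToken);

  return xmlDocs;
}
```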
What design is used to store file objects that get uploaded from a website? For instance, suppose I have a website that accepts documents or images.
Use Case 1.
A user logs in, selects an MS Word file on his machine, and uploads it to the website.
Use Case 2.
A user logs in, selects an image on his machine, and uploads it to the website.
How do I store these file objects in the database?
The first step is just getting the file from the AngularJS application to the server. This page talks about sending requests to the server from the client and should get you started.
Once you have done that (assuming you are using PHP), you will need to save the resulting file to the database. This post will get you started with saving files to PostgreSQL, but the details will end up being very specific to your situation.
If you have more questions after reading through those resources please add specific details about your setup to your question.
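For the client-side half, here is a rough, framework-agnostic TypeScript sketch of posting the selected file to the server; the /upload endpoint and the "document" field name are assumptions, and the server would then hand the file to the PHP/PostgreSQL logic from the linked post:

```typescript
// Minimal sketch of sending a user-selected file to the server.
// The "/upload" endpoint and the "document" field name are hypothetical.
const input = document.querySelector<HTMLInputElement>('input[type="file"]');

input?.addEventListener("change", async () => {
  const file = input.files?.[0];
  if (!file) return;

  // FormData produces a multipart/form-data request, which PHP exposes via $_FILES.
  const form = new FormData();
  form.append("document", file, file.name);

  const response = await fetch("/upload", { method: "POST", body: form });
  console.log(response.ok ? "Upload succeeded" : `Upload failed: ${response.status}`);
});
```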
Is it possible to use a local database file with HTML5 without using a server? I would like to create a small application that depends on information from a small database. I do not want to host a server just to pull information. Is it possible to create a database file and pull information from local files?
It depends on the following:
The type of application you want to build:
Normal website with some data being pulled from local storage;
Special purpose hosted website / application with data generated by the user;
Special purpose local application with a dedicated platform (a particular browser) and with access to the browser's non-web API -- in order to access the browser's own persistent storage methods (file storage, SQLite etc.);
Special purpose local application with a dedicated environment -- in order to deploy the application with a local web server and database;
Available options:
Indexed DB
Web Storage
XML files used for storing data and XSLT stylesheets for translating the data into HTML;
Indexed DB and Web Storage are available in some browsers, but you need to make sure the browsers you target support them. Their features aren't quite as complete and flexible as a SQL RDBMS's, but they may fit the bill if your application doesn't need all that flexibility.
XML files can contain the data you want to be shown to the user and they can be updated manually (not by the user) or dynamically (by a server script).
For dynamic updating, the content of the XML is kept in JavaScript and manipulated (using the XML DOM); when the session is over, the XML content is sent to the server to entirely replace the previous XML file. This works OK if individual users each have their own file and never write to each other's files.
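A rough TypeScript sketch of that round trip; the /data.xml path, the /save-xml endpoint, and the <record> element are assumptions:

```typescript
// Minimal sketch of the XML round trip described above.
// "/data.xml" and "/save-xml" are hypothetical server paths.
async function loadEditAndSave(): Promise<void> {
  // Fetch the current XML document from the server.
  const xmlText = await (await fetch("/data.xml")).text();
  const doc = new DOMParser().parseFromString(xmlText, "application/xml");

  // Manipulate it in memory with the XML DOM, e.g. append a new record.
  const record = doc.createElement("record");
  record.textContent = "new entry";
  doc.documentElement.appendChild(record);

  // At the end of the session, send the whole document back to replace the old file.
  await fetch("/save-xml", {
    method: "POST",
    headers: { "Content-Type": "application/xml" },
    body: new XMLSerializer().serializeToString(doc),
  });
}
```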
Reading local files:
Normal file access is prohibited (for security reasons) for in-page JavaScript code, which means that "having" a file locally implies either downloading it from a known source (a server) or asking the user to grant access to a local file.
Asking the user for access to a local file implies offering the user a "file input" -- like for uploads, but without actually uploading the file.
After a file has been selected, using the File API to read it should be fairly simple.
This workflow would involve the user "giving" you the database on every page refresh -- but since it's a one-page app, that means providing the data once per session, as long as your script does not refresh the page.
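A minimal sketch of that File API step in TypeScript; the #db-file input id and the JSON encoding of the "database" are assumptions:

```typescript
// Minimal sketch of reading a user-selected "database" file with the File API.
// Assumes an <input type="file" id="db-file"> element somewhere on the page.
const picker = document.querySelector<HTMLInputElement>("#db-file");

picker?.addEventListener("change", () => {
  const file = picker.files?.[0];
  if (!file) return;

  const reader = new FileReader();
  reader.onload = () => {
    // reader.result holds the file contents; parse it however your "database" is encoded
    // (JSON, XML, CSV, ...). JSON is assumed here purely for illustration.
    const data = JSON.parse(reader.result as string);
    console.log("Loaded records:", data);
  };
  reader.readAsText(file);
});
```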
You can use localStorage, but you can also run a server on your own computer, using WAMP or XAMPP, which bundle Apache and MySQL.
What I'm looking for is a little more robust than a cookie. I am making a web application for a friend that will be one page and have a list of names on it. The person wants to be able to add names to the list; however, they do not want to use a web server. They just want the files locally on a computer: a folder called test-app with index.html, and possibly a database file that can be stored in the web browser, or some way to save information in the web browser for repeated use.
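For a one-page names list like that, the localStorage approach suggested above could look roughly like this sketch; the storage key and element ids are illustrative:

```typescript
// Minimal sketch: persist a list of names in the browser with localStorage.
// The "names" key and the element ids are hypothetical.
const STORAGE_KEY = "names";

function loadNames(): string[] {
  return JSON.parse(localStorage.getItem(STORAGE_KEY) ?? "[]");
}

function addName(name: string): void {
  const names = loadNames();
  names.push(name);
  // localStorage only stores strings, so the array is serialized as JSON.
  localStorage.setItem(STORAGE_KEY, JSON.stringify(names));
}

function render(): void {
  const list = document.querySelector("#name-list");
  if (list) list.innerHTML = loadNames().map((n) => `<li>${n}</li>`).join("");
}

document.querySelector("#add-button")?.addEventListener("click", () => {
  const input = document.querySelector<HTMLInputElement>("#name-input");
  if (input?.value) {
    addName(input.value);
    input.value = "";
    render();
  }
});

render();
```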
I am using Java, GWT and Eclipse. I have a static XML file I want to parse to get certain data that will fill in list boxes and other info. How can I read the static XML file in both the server and client side of the code? Where do I put the XML file? Also, where can I put it if I only want the server to have access to it (since it contains sensitive data)?
If you need it on the server side only, put it in the /war/WEB-INF directory, and you can read it directly in your server code.
You can use a DataResource if you need a file on the client side:
https://developers.google.com/web-toolkit/doc/latest/DevGuideClientBundle#DataResource
If you want this file accessible on the client side, put it in the /war directory.
I would suggest that you parse the file server-side using any good XML parser (for an example see this tutorial) and put the resulting data in POJOs. For the data that you need client-side, you can make an RPC call to the server to retrieve the POJOs previously populated. A good place to put the XML file to prevent it from being directly accessible is under the WEB-INF directory of your webapp.
I'm working on a side project right now for an email client. I'm using a library to handle the retrieval of the messages from the server. However, I have a question on caching.
I don't want to fetch the entire list of headers every time I load the client. Ideally, what I'd like to do is cache them and then update the list with what is on the server.
What's the best way to go about this? Should I store all the header information (including the server's message ID) in a database, load the headers from that DB, and then, as a background task, sync up with the server...
Or is there a better way?
Look at the webmail sample of this open source project, which uses local caching:
http://mailsystem.codeplex.com/
If I remember correctly, it uses a combination of locally stored RFC822 plain-text emails, with the message ID as the filename, and an index file with high-level data.
The messages themselves may be zipped to save disk space.
That's just a sample for the library, so don't expect polished code there, but it's a start.
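As a rough illustration of the index half of that caching idea, here is a TypeScript/Node sketch; the mail-cache directory, the index.json name, and the fetchHeadersSince function are assumptions, not part of the linked project:

```typescript
// Rough sketch: keep a small local index of headers keyed by message ID,
// load it first, then sync only what is newer than the last cached message.
// fetchHeadersSince() stands in for whatever your mail library provides.
import { promises as fs } from "fs";
import * as path from "path";

interface CachedHeader {
  messageId: string;
  subject: string;
  from: string;
  date: string;
}

const CACHE_DIR = "./mail-cache";               // hypothetical local cache directory
const INDEX_FILE = path.join(CACHE_DIR, "index.json");

async function loadIndex(): Promise<CachedHeader[]> {
  try {
    return JSON.parse(await fs.readFile(INDEX_FILE, "utf8"));
  } catch {
    return [];                                  // no cache yet
  }
}

async function syncHeaders(
  fetchHeadersSince: (lastId: string | null) => Promise<CachedHeader[]>
): Promise<CachedHeader[]> {
  await fs.mkdir(CACHE_DIR, { recursive: true });

  // 1. Load cached headers so the UI can show something immediately.
  const index = await loadIndex();

  // 2. Ask the server only for messages newer than the last cached one.
  const lastId = index.length ? index[index.length - 1].messageId : null;
  const fresh = await fetchHeadersSince(lastId);

  // 3. Append the new headers to the index and persist it for the next session.
  const merged = [...index, ...fresh];
  await fs.writeFile(INDEX_FILE, JSON.stringify(merged, null, 2));
  return merged;
}
```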