I am developing an iPad application: a set of forms the user has to fill in (roughly 500 text fields).
The form fields are displayed in a UITableView.
Once the user fills in the forms, he/she can upload the data to the server.
Until it is uploaded, the data should persist in a temporary location on the local disk.
In my application design I am using an NSMutableDictionary, which is updated continuously as the user edits the forms.
What is the best way to save the NSMutableDictionary to the local disk:
SQLite or a plist?
If you are only ever saving one form at a time, a plist would be fine. If you would ever have the need to store multiple instances of the form and need to find them based on user-generated contents, using a more sophisticated storage mechanism like SQLite or CoreData would probably be more appropriate.
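For illustration, a minimal sketch of the plist round-trip, written here in Python with the standard plistlib module (on iOS the equivalent calls would be NSDictionary's writeToFile:atomically: and dictionaryWithContentsOfFile:); the file path and field names are assumptions:

    import plistlib

    DRAFT_PATH = "draft_form.plist"  # hypothetical temporary location

    def save_draft(form_fields):
        # Persist the in-progress form (field name -> value) as a plist.
        with open(DRAFT_PATH, "wb") as f:
            plistlib.dump(form_fields, f)

    def load_draft():
        # Restore the draft on the next launch; start empty if none exists.
        try:
            with open(DRAFT_PATH, "rb") as f:
                return plistlib.load(f)
        except FileNotFoundError:
            return {}

    draft = load_draft()
    draft["first_name"] = "Alice"  # user edits a field
    save_draft(draft)              # re-save after each change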
We are working on surveys using ODK: we create XLS files, transform them into forms, and then collect data offline.
When employees come back from the field, they upload the data.
What we need now is for them to work online from the field, so they can search for a specific ID or name and see the existing data before adding new records.
What I mean is that we need to let them search the database by a specific field, and that is not available in ODK.
We upload data to ONA; the data is then cleaned on the laptop, and the searches are done on the laptop too.
Is there a tool that supports this workflow?
As far as I know, the closest you can get with existing tools is this: https://help.ona.io/faq/filtered-datasets
If you use Enketo (webforms), the webform is updated automatically when the source dataset is updated via new submissions (this may require a page refresh, and there will be a delay). You can use either offline-capable or online-only webforms with this reference to external data, and query it with select_one_from_file or select_multiple_from_file (in XLSForm terminology), with pulldata, or with regular XPath.
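For illustration, a hypothetical XLSForm survey sheet that looks up previously collected records from an attached CSV file (the file name and column names are assumptions):

    type:        select_one_from_file households.csv
    name:        household
    label:       Select an existing record

    type:        calculate
    name:        prev_name
    calculation: pulldata('households', 'name', 'id', ${household})

The first question lets the enumerator pick a record from the external file; the second pulls a matching column value from the same file for display or validation.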
I'm writing contracts by hand, along with itineraries that contain much of the same data. Is there a way to do the following:
• a web form that fills in the fields of my contract and outputs it in .pdf format
• a button that gives the option to also output the data in my itinerary layout as a .pdf
• saving various details to a database
What tutorials might be useful to me? What videos should I be watching to steer me down the right path?
Many thanks!
This looks very much like an application for a server-side form filling tool. The tool of choice would be FDFMerge by Appligent.
A typical workflow would be as follows:
• you have a web form where the user fills in the required information
• the user submits the web form to your server, where the data is written to a database
• selects are run against the database, bringing together the data for a specific document
• the results of the selects are assembled into an FDF file
• the base form is filled with the data, using the above-mentioned tool
• the filled document(s) is/are made available to the user
And that should do it.
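For illustration, a minimal sketch of the FDF-assembly step in Python (the field names, values, and output path are assumptions; FDFMerge would then fill the base form from this file):

    def build_fdf(fields):
        # Build a minimal FDF document from a dict of
        # PDF field name -> value. Real values would need their
        # parentheses and backslashes escaped.
        entries = "".join(
            "<< /T ({0}) /V ({1}) >>\n".format(name, value)
            for name, value in fields.items()
        )
        return (
            "%FDF-1.2\n"
            "1 0 obj\n"
            "<< /FDF << /Fields [\n" + entries + "] >> >>\n"
            "endobj\n"
            "trailer\n"
            "<< /Root 1 0 R >>\n"
            "%%EOF\n"
        )

    # Hypothetical field names; they must match the fields in the base PDF.
    with open("contract.fdf", "w") as f:
        f.write(build_fdf({"client_name": "Alice Smith",
                           "start_date": "2024-06-01"}))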
I don't know the best way to force a browser refresh when using Silverlight.
The scenario is like this:
We have a Silverlight application hosted on IIS.
Two users open the same page, and that page contains a grid with some records.
Only one of the users modifies a record and saves the data to the database.
How can the other user see that the data has been modified, unless he refreshes the page manually?
Should I implement some automatic refresh?
Thanks in advance
I would think long and hard about the requirements here before you open up a can of worms. Is the grid editable? With the automatic-refresh idea, what happens to a user who is in the middle of an edit? Think about alternatives: could you possibly check whether the data has changed at the point of saving, and then show an appropriate message to the user? If you want the data to refresh automatically, you are going to get into server-to-client notifications, e.g. WCF duplex calls, or constant polling and refreshing of the underlying bound ObservableCollection.
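As a sketch of the check-at-save alternative (optimistic concurrency), assuming each row carries a version column; Python and sqlite3 stand in here for the real Silverlight data layer, and the table and column names are made up:

    import sqlite3

    def save_record(conn, record_id, new_value, loaded_version):
        # The update only succeeds if the row still has the version the
        # user originally loaded; otherwise another user saved first.
        cur = conn.execute(
            "UPDATE records SET value = ?, version = version + 1 "
            "WHERE id = ? AND version = ?",
            (new_value, record_id, loaded_version),
        )
        conn.commit()
        if cur.rowcount == 0:
            raise RuntimeError(
                "Record changed by another user; reload before saving."
            )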
Is it possible to store data harvested from a website (Nestoria) through their API using PHP?
I am able to extract the data using PHP and display the results in a web browser, but I need to dump or save them into my PostGIS database. (I am using XAMPP and PostGIS on Windows 7.)
Most companies wouldn't have a problem with you doing that; eBay's API, for instance. However, as Mapperz pointed out, Nestoria's terms require you not to compete with them on the originality of the content. So any data that comes from their API and is stored in your database should not be indexable by search engines.
This isn't as difficult to comply with as you might think. You could load the content through an iframe whose source page carries the "noindex, nofollow" robots meta tag in its HTML, or pull the content from the database into your page's DOM using AJAX/JavaScript after the page has loaded.
I personally would go with the second option (AJAX).
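As a sketch of the second option's server side (here in Python's standard http.server for illustration; the real endpoint would be the PHP script under XAMPP): the page fetches this endpoint with AJAX after it loads, and an X-Robots-Tag header additionally tells crawlers not to index the response. The listing data is a made-up stand-in for a PostGIS query result.

    import json
    from http.server import BaseHTTPRequestHandler, HTTPServer

    def fetch_listings():
        # Hypothetical stand-in for a query against the PostGIS database.
        return [{"title": "2 bed flat", "price_per_month": 1200}]

    class ListingHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            body = json.dumps(fetch_listings()).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            # Belt and braces: ask crawlers not to index this response.
            self.send_header("X-Robots-Tag", "noindex, nofollow")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)

    HTTPServer(("localhost", 8000), ListingHandler).serve_forever()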
I have an application that shows a screen of image thumbnails; each image is around 80 KB, and they are stored in a database. To keep response time reasonable, the application displays a placeholder image when it first starts and later downloads the images from the server. I'm expecting to show around 40 images on screen at once, so that's my batch size. What's the best way to serve these images up to the client? I've got two options in mind.
1. Create an ADO.NET Data Service that exposes the Images database table to the client. The client can asynchronously request the images, one at a time, and display them as they come back from the server. I've implemented this solution and it seems to work OK; the speed isn't great, though, and I feel like I could utilize the HTTP pipe better by requesting maybe three images at a time.
2. Create an HttpModule on the server that looks for requests like /Images/1.jpg, then reads the database and returns the requested data. On the client side I can have many Image objects whose Source points to the virtual URLs on the server. My theory is that by just giving Silverlight many URLs to deal with, it may be able to transfer the images more efficiently than my code in option 1.
Would either of these methods be more efficient or is there another technique for getting this done? Thanks!
I don't know if it's more efficient, but I've accomplished a very similar task using an HTTP handler (ashx). The handler pulls the image in from the database based on a parameter in the URI (the image ID), and Silverlight then fetches the images asynchronously by setting the Source property of an Image control to the handler's URI with the specific ID in the query string. The Image control, in turn, is inside an ItemsControl, which allows me to display multiple images.
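For illustration, a rough equivalent of that handler with Python's standard library standing in for the ashx handler (the table schema and URL shape are assumptions):

    import sqlite3
    from http.server import BaseHTTPRequestHandler, HTTPServer

    # SQLite stands in for the real image table; assumed schema:
    #   CREATE TABLE images (id INTEGER PRIMARY KEY, jpeg BLOB);
    conn = sqlite3.connect("images.db")

    class ImageHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            # Expect URLs of the form /images/123.jpg
            try:
                image_id = int(self.path.rsplit("/", 1)[-1].split(".")[0])
            except ValueError:
                self.send_error(404)
                return
            row = conn.execute(
                "SELECT jpeg FROM images WHERE id = ?", (image_id,)
            ).fetchone()
            if row is None:
                self.send_error(404)
                return
            self.send_response(200)
            self.send_header("Content-Type", "image/jpeg")
            self.send_header("Content-Length", str(len(row[0])))
            self.end_headers()
            self.wfile.write(row[0])

    HTTPServer(("localhost", 8080), ImageHandler).serve_forever()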
We are doing something very similar, and we are just using an ASPX page to serve them up, with a query parameter for the image identifier. We are also caching the images, and the ASPX page will use the cached value if it exists. If not, we pull the image from the data store, cache it, and send it down. It is working really well for us.
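That caching idea is the classic cache-aside pattern; a tiny sketch (a real server would bound the cache and handle invalidation):

    _image_cache = {}

    def get_image(image_id, load_from_store):
        # Serve from the in-memory cache when possible; otherwise load
        # from the data store and remember the result for next time.
        data = _image_cache.get(image_id)
        if data is None:
            data = load_from_store(image_id)
            _image_cache[image_id] = data
        return data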
Have you looked at using Deep Zoom? It's very efficient about progressive image loading, and gives you a nicer user experience when the images are fully loaded.
Examples:
Hard Rock Memorabilia site
Deep Zoom Pix