Show images from InputStream in XHTML page - database

I need to show some images that are queried from a database and placed into an InputStream. My framework is JSF, and I know I can show them by using a servlet. The problem is that there are many images on my page that come from the database, so if I select each image from the database individually to show it on my XHTML page, a lot of queries are needed. In one managed bean, all of the images are placed into a List of InputStreams, and I want to show each element as an image on the page. In short, my requirement is to read an image from an InputStream and show it on an XHTML page. Can anybody guide me?

If you're using RichFaces, you can use <ui:repeat> to iterate over your list of images and <a4j:mediaOutput> to render each one in your XHTML page. For an example, see How to use a4j:mediaOutput correctly and another example.
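As a rough sketch of the bean side (the bean name, the id-to-bytes map, and the single-query idea are my own assumptions, not from the question): a4j:mediaOutput calls a method that takes an OutputStream plus whatever its value attribute passes in, so you can query the database once and then stream each image from memory.

import java.io.IOException;
import java.io.OutputStream;
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import javax.annotation.PostConstruct;
import javax.faces.bean.ManagedBean;
import javax.faces.bean.SessionScoped;

@ManagedBean
@SessionScoped
public class ImageBean {

    // Image bytes keyed by id, filled from the database in one query.
    private Map<Long, byte[]> images = new LinkedHashMap<Long, byte[]>();

    @PostConstruct
    public void init() {
        // Placeholder: run a single query here that fills the map with id -> bytes.
    }

    // Referenced from the page as createContent="#{imageBean.paint}".
    // The second argument is whatever the <a4j:mediaOutput> "value" attribute passes in,
    // e.g. the current id from a surrounding <ui:repeat>.
    public void paint(OutputStream stream, Object imageId) throws IOException {
        stream.write(images.get(imageId));
    }

    public List<Long> getImageIds() {
        return new ArrayList<Long>(images.keySet());
    }
}

On the page, something like <ui:repeat value="#{imageBean.imageIds}" var="id"> wrapping <a4j:mediaOutput element="img" createContent="#{imageBean.paint}" value="#{id}"/> would then render every image without a separate servlet request per query.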

if I select each image from the database individually to show it on my XHTML page, a lot of queries are needed
How exactly is that a problem? Have you measured the performance? Is the bottleneck really in "a lot of queries"? I really don't understand why that would form a bottleneck. It should be blazing fast with a properly designed data model; a self-respecting SQL database is designed for exactly this purpose.
Isn't your bottleneck actually the step of making a DB connection, and the fact that you're doing that on every single query because you aren't using a connection pool? If so, then yes, it is understandable that it performs very slowly. Making a DB connection can take as long as 100~500ms. That's exactly why connection pools were invented a long time ago. Connections are then initialized and cleaned up only during "idle time" and shared/reused in a thread-safe manner, so getting a connection from the pool should take no more than 10ms or so.
If you fix your data layer to utilize a decent connection pool, then you can keep using your servlet, which is already the right tool for this particular job.
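As a minimal sketch, assuming a Servlet 3.0 container with a container-managed pool bound in JNDI (the resource name jdbc/ImageDB and the image table/column names are made up), the servlet borrows connections from the pool instead of opening a new one per request:

import java.io.IOException;
import java.io.OutputStream;
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import javax.annotation.Resource;
import javax.servlet.ServletException;
import javax.servlet.annotation.WebServlet;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import javax.sql.DataSource;

@WebServlet("/images/*")
public class ImageServlet extends HttpServlet {

    // Injected container-managed connection pool; "jdbc/ImageDB" is a placeholder name.
    @Resource(name = "jdbc/ImageDB")
    private DataSource dataSource;

    @Override
    protected void doGet(HttpServletRequest request, HttpServletResponse response)
            throws ServletException, IOException {
        Long id = Long.valueOf(request.getPathInfo().substring(1));

        try (Connection connection = dataSource.getConnection();  // borrowed from the pool, not newly created
             PreparedStatement statement = connection.prepareStatement(
                     "SELECT content, content_type FROM image WHERE id = ?")) {
            statement.setLong(1, id);
            try (ResultSet resultSet = statement.executeQuery()) {
                if (!resultSet.next()) {
                    response.sendError(HttpServletResponse.SC_NOT_FOUND);
                    return;
                }
                response.setContentType(resultSet.getString("content_type"));
                try (OutputStream out = response.getOutputStream()) {
                    out.write(resultSet.getBytes("content"));
                }
            }
        } catch (SQLException e) {
            throw new ServletException("Image query failed", e);
        }
    }
}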

Related

What is the best method to store images in db for email sending?

Hi, I'm a newbie on Stack Overflow!
As mentioned in the question title, I've been storing the email's image path in the db and serving the image from localhost. Once the email is sent and received, Outlook automatically blocks the image download and I need to download it manually (not a big issue here).
Then I started to wonder: what if my website/server is down? If it is down, the email will not be able to locate and download the image at all. So I'm wondering if there is any alternative way to display the image without worrying about the availability of my server.
Thanks in advance for any advice/replies!
Since your primary concern seems to be about failure mitigation and not actual coding, I'll direct you to this question.
Your current method isn't actually embedding the images; it's just linking to them. What you want to do is add the images as linked resources. This WILL make your emails larger and slower to send, but as long as you aren't a spammer, you should be OK.
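The question doesn't say which stack sends the mail, so purely as an illustration of "linked resources", here is a rough JavaMail sketch that embeds the image via a Content-ID instead of pointing at a URL on your server (the session setup, addresses, and file path are placeholders):

import java.io.File;
import java.util.Properties;
import javax.activation.DataHandler;
import javax.activation.FileDataSource;
import javax.mail.Message;
import javax.mail.Session;
import javax.mail.Transport;
import javax.mail.internet.InternetAddress;
import javax.mail.internet.MimeBodyPart;
import javax.mail.internet.MimeMessage;
import javax.mail.internet.MimeMultipart;

public class InlineImageMailer {

    public static void main(String[] args) throws Exception {
        Session session = Session.getInstance(new Properties());  // placeholder: configure SMTP host etc.

        MimeMessage message = new MimeMessage(session);
        message.setFrom(new InternetAddress("noreply@example.com"));
        message.setRecipients(Message.RecipientType.TO, "someone@example.com");
        message.setSubject("Newsletter");

        // The HTML references the image by Content-ID instead of a URL on your server.
        MimeBodyPart htmlPart = new MimeBodyPart();
        htmlPart.setContent("<html><body><img src=\"cid:logo\"></body></html>", "text/html");

        // The image travels inside the mail itself, so it survives your server being down.
        MimeBodyPart imagePart = new MimeBodyPart();
        imagePart.setDataHandler(new DataHandler(new FileDataSource(new File("logo.png"))));
        imagePart.setContentID("<logo>");
        imagePart.setDisposition(MimeBodyPart.INLINE);

        MimeMultipart related = new MimeMultipart("related");
        related.addBodyPart(htmlPart);
        related.addBodyPart(imagePart);
        message.setContent(related);

        Transport.send(message);
    }
}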
Alternatively, you could have an enterprise-level failover plan where, if your server goes offline, a mirrored server in a different location begins serving up the data/images.

Storing Serialized Video files to SQL Server

I am currently faced with the need to host 20 small video files for my website. I know I could just host them in a folder with my project, but I came across this article:
http://www.kindblad.com/2008/04/how-to-store-files-in-ms-sql-server.html
The thought of storing the files in the db had not occurred to me. My question is: would there be a performance increase or decrease in storing the files as binary data in the db versus just streaming them from the file system? I like the idea of having the data in the db for portability and for control over who gets access to the videos. Thanks in advance.
Unless you have a pressing need to store them in a database, I wouldn't, personally. You can still control who gets access to which files by using a handler to validate access to the file. One big problem with the method in that article is that it doesn't support reading a byte range - so if someone wants to seek to the middle of a video, for example, they would have to wait for the whole thing to download. You'd want it to support the Range header, as described in this question.
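The linked question has the details for your stack (likely ASP.NET); just to sketch the mechanics in Java servlet terms as an illustration, supporting Range means honoring a byte window and replying with 206 Partial Content (the file name and content type are placeholders, and a real handler must cover more Range variants):

import java.io.IOException;
import java.io.OutputStream;
import java.io.RandomAccessFile;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

public class VideoServlet extends HttpServlet {

    @Override
    protected void doGet(HttpServletRequest request, HttpServletResponse response) throws IOException {
        // Placeholder: resolve the requested video to a file (or a DB blob stream).
        RandomAccessFile file = new RandomAccessFile("video.mp4", "r");
        long length = file.length();
        long start = 0;
        long end = length - 1;

        // Parse a simple "bytes=start-end" range; suffix ranges etc. are omitted here.
        String range = request.getHeader("Range");
        if (range != null && range.startsWith("bytes=")) {
            String[] parts = range.substring(6).split("-");
            if (!parts[0].isEmpty()) {
                start = Long.parseLong(parts[0]);
            }
            if (parts.length > 1 && !parts[1].isEmpty()) {
                end = Long.parseLong(parts[1]);
            }
            response.setStatus(HttpServletResponse.SC_PARTIAL_CONTENT);
            response.setHeader("Content-Range", "bytes " + start + "-" + end + "/" + length);
        }

        response.setHeader("Accept-Ranges", "bytes");
        response.setContentType("video/mp4");
        response.setHeader("Content-Length", String.valueOf(end - start + 1));

        byte[] buffer = new byte[8192];
        long remaining = end - start + 1;
        file.seek(start);
        try (OutputStream out = response.getOutputStream()) {
            while (remaining > 0) {
                int read = file.read(buffer, 0, (int) Math.min(buffer.length, remaining));
                if (read == -1) break;
                out.write(buffer, 0, read);
                remaining -= read;
            }
        } finally {
            file.close();
        }
    }
}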

How do I elegantly import an Excel file into Sql Server via a Coldfusion HTML form?

Does anyone have an elegant suggestion for how to get the contents of an Excel spreadsheet into SQL Server via a web form? I need to allow our clients to upload modest amounts of structured data, and I need that data to ultimately reside in a sql table. I really can't expect the clientele to produce anything but an Excel file, but I could require that it be an xlsx.
The web app is written in ColdFusion; it doesn't need to handle huge numbers of simultaneous requests, but I don't want to resort to some sort of server-side batch job processing or shunt the user to an ASP.NET page (which is what we are doing now).
Any recommendations (or examples of how others are successfully doing this) would be appreciated. Due to the sensitivity of the data, we really can't do anything to compromise the security of the web or sql servers.
If you are using CF9, then you could easily use the cfspreadsheet tag too. I mention this one specifically because Shawn's link did not (presumably due to its being relatively new on the CF scene). Here's the livedoc link: http://help.adobe.com/en_US/ColdFusion/9.0/CFMLRef/WSc3ff6d0ea77859461172e0811cbec17cba-7f87.html
For full use, I would create a web form with a standard file upload field. On the backend handling the form submission, get a copy of the file with
<cffile action="upload" destination="uploaded.xls".....>
Then use:
<cfspreadsheet action="read" query="myExcelData" src="uploaded.xls" ...>
At that point, your spreadsheet content will be available as a query object. You can then loop over this query, running an insert query against your SQL Server database on each iteration. That should do it.
Here are the most notable options to help point you in the right direction; choose what you are most comfortable with (Source: Charlie Arehart).
CFXL
JXLS
CFX_Excel
My personal recommendation is to go the CFX_Excel route. Although a commercial product, it will grant you the most functionality/flexibility of the options listed.

Need ideas on retrieving data from a website

I'm stumped and need some ideas on how to do this or even whether it can be done at all.
I have a client who would like to build a website tailored to English-speaking travelers in a specific country (Thailand, in this case). The different modes of transportation (bus & train) have good websites providing their respective information, and both are very static in terms of the data they present (the schedules rarely change). Here's one of the sites I would need to get info from: train schedules. The client wants to let users search for a beginning and end location and determine, using the external websites' information, how best to get there, with a route and schedule times for the chosen modes of transport.
Now, in my limited experience, I would think the way to do that would be to retrieve the original schedule info from the external site's server (via API or some other means) and retain the info in a database, which can be queried as needed. Our first thought was to contact the respective authorities to determine how/if this can be done, but this has proven to be problematic due to the language barrier, mainly.
My client suggested what is basically "screen scraping", but that sounds complicated at best: downloading the web page(s) and filtering through the HTML for the relevant data to put into the database. My worry is that these sites are so static that the data isn't even kept in a database to build the pages, and the pages themselves are simply updated (hard-coded) when something changes.
I could really use some help and suggestions here. Thanks!
Screen scraping is always problematic IMO, as you are at the mercy of the person who wrote the page. If the content is static, then I think it would be easier to copy the data manually into your database. If you want to keep up to date with changes, you could snapshot the page when you transcribe the info and run a job that periodically checks whether the page has changed from the snapshot. When it does, it sends you an email so you can update the data.
The above method could also be used in conjunction with some sort of screen scraper, which could fall back to a manual process if the page changes too drastically.
Ultimately, it is a case of how much effort (cost) your client is willing to bear for accuracy.
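A rough sketch of that "has the page changed?" job, assuming you simply hash the fetched HTML and compare it with a stored snapshot hash (the URL, the snapshot file, and the notification step are placeholders):

import java.math.BigInteger;
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.file.Files;
import java.nio.file.Path;
import java.security.MessageDigest;

public class ScheduleChangeChecker {

    public static void main(String[] args) throws Exception {
        String url = "http://www.example.com/train-schedule";  // placeholder URL
        Path snapshotFile = Path.of("schedule.sha256");         // hash of the last transcribed version

        HttpResponse<String> response = HttpClient.newHttpClient().send(
                HttpRequest.newBuilder(URI.create(url)).build(),
                HttpResponse.BodyHandlers.ofString());

        byte[] digest = MessageDigest.getInstance("SHA-256")
                .digest(response.body().getBytes());
        String currentHash = new BigInteger(1, digest).toString(16);

        String previousHash = Files.exists(snapshotFile) ? Files.readString(snapshotFile).trim() : "";
        if (!currentHash.equals(previousHash)) {
            // Placeholder: send yourself an email here, then re-transcribe and update the snapshot.
            System.out.println("Page changed - manual update needed");
            Files.writeString(snapshotFile, currentHash);
        }
    }
}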
I have done this for the following site: http://www.buscatchers.com/, so it's definitely more than doable! A key feature of a web scraping solution for travel sites is that it must send you emails if anything goes wrong during the scraping process. On the site, I use a two-day window so that I have two days to fix the code if the design changes. Only once or twice have I had to change my code, and it's very easy to do.
As for some examples. There is some simplified source code here: http://www.buscatchers.com/about/guide. The full source code for the project is here: https://github.com/nicodjimenez/bus_catchers. This should give you some ideas on how to get started.
I can tell that the data is dynamic; it's too well structured. It's not hard for someone who is familiar with XPath to scrape this site.
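The linked repository shows the real thing; as a toy starting point, here is a sketch using jsoup (CSS selectors rather than the XPath mentioned above; the URL and selectors are made up for a generic timetable table):

import org.jsoup.Jsoup;
import org.jsoup.nodes.Document;
import org.jsoup.nodes.Element;

public class TimetableScraper {

    public static void main(String[] args) throws Exception {
        // Placeholder URL; the selectors below assume a plain <table> of departures.
        Document page = Jsoup.connect("http://www.example.com/train-schedule").get();

        for (Element row : page.select("table tr")) {
            Element origin = row.selectFirst("td:nth-child(1)");
            Element destination = row.selectFirst("td:nth-child(2)");
            Element departure = row.selectFirst("td:nth-child(3)");
            if (origin == null || destination == null || departure == null) {
                continue;  // header row or layout row
            }
            // Placeholder: insert into your own schedule table instead of printing.
            System.out.printf("%s -> %s at %s%n",
                    origin.text(), destination.text(), departure.text());
        }
    }
}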

Silverlight - Unable to load a lot of data

I have a Silverlight application that has a DataGrid and a DataPager. The data source for these controls comes from a database. I am accessing this database through RIA Services.
When I try to load all of the records, I receive an error that says:
"Load operation failed for query 'GetData'. The server did not provide a meaningful reply; this might be caused by a contract mismatch, a premature session shutdown or an internal server error."
By gradually restricting the size of the result set on the server side, I have come to the conclusion that I am getting this error because my data set is too large. My question is, how do I elegantly load large data sets into a DataGrid? I am open to approaches outside of RIA Services.
Thank you!
First off, if you have the means and aren't required to write this code yourself, consider buying a UI component that solves your problem (or finding an open-source solution). For these types of tasks, there's a good chance that someone else has already put a lot of effort into solving problems like this one. For reference, there's a Telerik grid control for Silverlight with some demos.
If you can't buy a component, here are some approaches I've seen:
Set up a paging system where the data for the current page is loaded, and new data isn't loaded until the pages are switched. You could probably cache previous results to make this work more smoothly.
Load data on demand, so that when the user scrolls down/sideways, data is fetched once cells are reached that haven't been populated yet.
One last idea that comes to mind is to gzip the data on the server before sending it. If your bottleneck is transmission time, compression can speed things up for the type of data you're working with.
You should also consider that you may be exceeding the command timeout on your data source. The default for LINQ to SQL, for example, is 30 seconds. If you want to increase this, one option is to go to the DataContext constructor and change it as follows:
public SomeDataClassesDataContext() :
    base(global::System.Configuration.ConfigurationManager.ConnectionStrings["SomeConnectionString"].ConnectionString, mappingSource)
{
    // CommandTimeout is specified in seconds; 1200 = 20 minutes.
    this.CommandTimeout = 1200;
    OnCreated();
}
