I have a Yahoo! Group that has its own database tables for various reasons. I'd like to create a mobile application (or web app) that performs CRUD operations on those tables. Does anyone have any clue as to how I can get started on learning how to do this (if it's possible at all)?
Unfortunately, Yahoo! Groups does not have an external API. But you might be able to use the Export Table functionality, which delivers the database records in CSV format.
We are now developing an application that uses GAE Datastore and trying to implement Multitenancy.
Our customers are companies, so we are going to create namespaces on a per-company basis.
My question is how should we treat company mergers and separations.
For example, when two of our customers merge, the data under their two namespaces should be migrated into a single namespace. When a customer is split into two companies, some of the data should be migrated into another namespace. This takes a lot of effort and we would like to avoid these operations.
How can we handle these cases smoothly? Or are namespaces even suitable on a per-company basis? If not, how should we implement per-company multitenancy?
The general way this is handled is by creating a job that handles mergers as a batch process, reading each entity, writing it back under its new key, and deleting the old one as part of a transaction. Generally you'll have a bunch of business rules you throw in as part of the processing as well as the basic rekeying. For example, how will you handle two users having the same username?
Cloud Dataflow (Java & Python connectors are available) is a good tool for doing this.
Mergers are messy when it comes to data in most cases, so it isn't really namespaces that prevent a simpler solution.
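To make the rekeying idea concrete, here is a minimal sketch in C#, assuming the Google.Cloud.Datastore.V1 client and a hypothetical Customer kind; batching, transactions and a real Dataflow pipeline are left out for brevity, and the project id and namespace names are placeholders:

using Google.Cloud.Datastore.V1;

// Minimal sketch: copy every "Customer" entity from the old namespace into the
// surviving one, then delete the original.
class NamespaceMerge
{
    static void Main()
    {
        var source = DatastoreDb.Create("my-project", "acquired-co");   // old namespace
        var target = DatastoreDb.Create("my-project", "surviving-co");  // new namespace
        var targetKeys = target.CreateKeyFactory("Customer");

        foreach (Entity old in source.RunQuery(new Query("Customer")).Entities)
        {
            // Re-key: keep the original key name/id, but in the target namespace.
            var path = old.Key.Path[0];
            Key newKey = !string.IsNullOrEmpty(path.Name)
                ? targetKeys.CreateKey(path.Name)
                : targetKeys.CreateKey(path.Id);

            var copy = new Entity { Key = newKey };
            foreach (var prop in old.Properties)
                copy[prop.Key] = prop.Value;   // business rules (e.g. username clashes) go here

            target.Upsert(copy);     // write into the new namespace
            source.Delete(old.Key);  // remove from the old one
        }
    }
}

In practice you would wrap the copy/delete pairs in transactions, or run them through Dataflow as suggested above, so a failure mid-migration doesn't leave entities half-moved.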
Are there any resources available that can guide someone on how to 'think' about the various components of a hosted/cloud solution before going ahead and starting to build a hosted application? If that made no sense, what I mean to ask is: are there any books/websites offering guidance on what needs to be considered when making a cloud application?
I am attempting to make a hosted CRM-style software application that will serve many hundreds of customers. The application is powered by a SQL Server database with many tables and a ColdFusion, HTML5, CSS, and JavaScript front-end. If I were installing this application and its components at each client site, then each installation would be unique to that customer. But somehow I have to replicate this uniqueness in the cloud, which is baffling me.
Only two things have come to mind so far:
The need for a unique database per customer in SQL Server
The need to change DB connection strings per customer in the web application
My thought process has come to a block when I am trying to envisage how to design the application to serve so many different customers. Even though the application that all customers use is the same (same DB tables, same front-end), the data that they store and retrieve will be specific to them. So I was thinking that surely each customer needs a separate database created for them? Is it feasible to create a replica database for each customer? If I need to update some tables or add a new table, how would I do this for hundreds of different databases?
From the front-end I guess each customer's log-in would switch the DB connection string so that they can only access their own database. Other than this I can't think of anything else that needs to change on a per-customer basis.
When a new customer wants to sign up, it needs to be clear to me what I need to create for them to have access to the application. I guess this is ultimately what I need to think of but I'm stuck.
If anyone can suggest some things to think of or if there is a book or website on this kind of thing that someone could point me to I'd really be very thankful.
EDIT:
I was looking at an article about Salesforce.com and it says
"In order to ensure privacy of data for each user and give an effect of each having their own database, the data from different users are securely isolated from one another."
Anyone know how this is achieved or how it may be done?
Found some great information here. It is called multi-tenant database design and seems to be a common topic. Once I get the database designed then the application can sit nicely on top.
https://dba.stackexchange.com/questions/1043/what-problems-will-i-get-creating-a-database-per-customer
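To illustrate the database-per-tenant option discussed above, here is a minimal C# sketch, assuming a central "catalog" database that maps each customer to their own database; all server, table and column names are hypothetical:

using System.Data.SqlClient;

// Minimal sketch: after a customer logs in, look up which database belongs to them
// in a central catalog and build the connection string for that database.
public static class TenantConnections
{
    const string CatalogConnection =
        "Server=sqlhost;Database=TenantCatalog;Integrated Security=true;";

    public static string ForCustomer(string customerCode)
    {
        using (var cn = new SqlConnection(CatalogConnection))
        using (var cmd = new SqlCommand(
            "SELECT DatabaseName FROM Tenants WHERE CustomerCode = @code", cn))
        {
            cmd.Parameters.AddWithValue("@code", customerCode);
            cn.Open();
            var dbName = (string)cmd.ExecuteScalar();
            return "Server=sqlhost;Database=" + dbName + ";Integrated Security=true;";
        }
    }
}

Schema changes then become a scripted loop over every database listed in the catalog, which is the operational cost of database-per-customer that the linked discussion weighs against a single shared schema.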
I'm building an application that will use Access as a back end and will rely on importing Excel sheets. There is lots of reporting as well.
The app will only be used by one or two people.
Would you build this in Access forms? I also know Winforms and C#.
What criteria would you use for your decision making? Why would you choose one approach over another? What more information would you need to make a decision?
When considering an Access solution, the number of people using the application is mainly an issue of data storage and retrieval. The front-end pieces should be segregated into a separate db file, and each user should have their own copy of that front-end db file; the back-end db file contains only the data. You should view that type of split as an absolute requirement if there is any possibility that more than one user will ever use the application at the same time.
If a split Access application is unacceptable, use something other than Access for the front-end part.
As far as the suitability of Access for the front-end goes, you haven't described any features which could not be quickly and easily implemented with Access. Note that you didn't provide any details about the purpose of the forms. For all we know, they may be used only to drive imports from Excel and to select among reports, perhaps with form fields to select criteria for those reports. If that's all you need, it is an almost trivial task in Access.
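If the WinForms/C# route were chosen instead, the Excel-import piece could look roughly like the sketch below, assuming the ACE OLE DB provider is installed; the file paths, sheet name, and table/column names are placeholders:

using System.Data.OleDb;

// Minimal sketch: read rows from an Excel sheet and insert them into an Access
// (.accdb) back-end using the same OLE DB provider for both.
static void ImportSheet(string excelPath, string accdbPath)
{
    string excelCn = "Provider=Microsoft.ACE.OLEDB.12.0;Data Source=" + excelPath +
                     ";Extended Properties=\"Excel 12.0 Xml;HDR=YES\";";
    string accessCn = "Provider=Microsoft.ACE.OLEDB.12.0;Data Source=" + accdbPath + ";";

    using (var src = new OleDbConnection(excelCn))
    using (var dst = new OleDbConnection(accessCn))
    {
        src.Open();
        dst.Open();
        using (var reader = new OleDbCommand("SELECT [Name], [Amount] FROM [Sheet1$]", src).ExecuteReader())
        {
            while (reader.Read())
            {
                var insert = new OleDbCommand(
                    "INSERT INTO ImportedRows ([Name], [Amount]) VALUES (?, ?)", dst);
                insert.Parameters.AddWithValue("p1", reader["Name"]);    // OleDb parameters are positional
                insert.Parameters.AddWithValue("p2", reader["Amount"]);
                insert.ExecuteNonQuery();
            }
        }
    }
}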
Apologies for the newbie web service question -
I am trying to create a web service that has a list of methods to perform reads/writes to a database. An example function will be of the form:
CreateNewEmployee(string username, string employeeid, string deptname)
I created a web service in .NET (asmx) that has the above-mentioned web method. In it, I open the connection to the database, do an insert into the database, and then close the connection. Is this the right way to design the web service call?
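For reference, a minimal sketch of the kind of web method I mean (the connection string and table/column names are just placeholders):

using System.Data.SqlClient;
using System.Web.Services;

public class EmployeeService : WebService
{
    [WebMethod]
    public void CreateNewEmployee(string username, string employeeid, string deptname)
    {
        // Open the connection, run a parameterized INSERT, and let the using blocks
        // close and dispose everything.
        using (var cn = new SqlConnection("Server=.;Database=HR;Integrated Security=true;"))
        using (var cmd = new SqlCommand(
            "INSERT INTO Employees (Username, EmployeeId, DeptName) VALUES (@u, @e, @d)", cn))
        {
            cmd.Parameters.AddWithValue("@u", username);
            cmd.Parameters.AddWithValue("@e", employeeid);
            cmd.Parameters.AddWithValue("@d", deptname);
            cn.Open();
            cmd.ExecuteNonQuery();
        }
    }
}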
Should I instead be passing an object instead of multiple parameters?
Any pointers toward best practices when trying to create a web service that writes data into a database?
To add some more information:
We would like to have web services since they might be reused by many different applications within the organization (both web and desktop).
We are also planning to create an environment where users can use these web services to create data mashups.
Thanks,
Nate
Yes - pass objects instead of large parameter sets. Also, have you considered WCF if you're in a .NET environment? If you look at how ADO.NET Data Services (formerly Astoria) works, it will point you in the right direction.
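A minimal sketch of what "pass an object" could look like with WCF contracts; the names here are illustrative only:

using System.Runtime.Serialization;
using System.ServiceModel;

// The request object replaces the long parameter list on the operation.
[DataContract]
public class NewEmployeeRequest
{
    [DataMember] public string Username { get; set; }
    [DataMember] public string EmployeeId { get; set; }
    [DataMember] public string DeptName { get; set; }
}

[ServiceContract]
public interface IEmployeeService
{
    [OperationContract]
    void CreateNewEmployee(NewEmployeeRequest request);
}

Adding a new field later then means adding a DataMember rather than changing every method signature and caller.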
Quoting from the winning answer to this SO question:
Web Services are an absolutely horrible choice for data access.
It's a ton of overhead and complexity for almost zero benefit.
You can read the rest of the discussion there.
Edit: One excellent approach to having common data access functionality that can be shared by multiple applications - web, desktop, service - is to create a Visual Studio project that compiles to a DLL. Each solution that wants to use the data access functionality references the DLL, which can be deployed to the GAC or some other central location, or just added to the project's bin folder. Alternatively, in order to be able to step through the data access code, the data access project can be added to the solution.
This is a very common practice in large enterprises, where many back office applications share common functionality. It is used not just for data access, but also for other services such as logging and authentication/authorization. Some divisions create a set of these DLLs, which they refer to as their "framework". It ensures that every application will have the same functionality and the same business logic, and that there is a single place for revisions to be made that will affect all of the applications. This is a similar benefit to using web services, but it avoids the overhead and performance hit of web services.
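A minimal sketch of that shared-DLL idea, assuming a class library called Company.DataAccess; the names and schema are illustrative only:

using System.Collections.Generic;
using System.Data.SqlClient;

namespace Company.DataAccess
{
    // Compiled into its own DLL and referenced by the web, desktop, and service projects,
    // so they all share the same data access and business logic.
    public class EmployeeRepository
    {
        private readonly string _connectionString;

        public EmployeeRepository(string connectionString)
        {
            _connectionString = connectionString;
        }

        public IList<string> GetUsernamesByDepartment(string deptName)
        {
            var result = new List<string>();
            using (var cn = new SqlConnection(_connectionString))
            using (var cmd = new SqlCommand(
                "SELECT Username FROM Employees WHERE DeptName = @d", cn))
            {
                cmd.Parameters.AddWithValue("@d", deptName);
                cn.Open();
                using (var reader = cmd.ExecuteReader())
                    while (reader.Read())
                        result.Add(reader.GetString(0));
            }
            return result;
        }
    }
}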
There is a website with a server database. I'm building a desktop application which uses data from one of the tables. A hacker could simply take the password from the assembly.
How can I protect the database?
I wouldn't store the database information in the application at all. Instead, I would create an API to the database on the website, perhaps implementing a RESTful interface or having queries that return data in an appropriate format, such as JSON, XML, or even plain text. The application could then call these web services and process the results. All of your database information stays on the server, where it is (hopefully) secure.
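A minimal sketch of what the desktop side of that could look like, assuming a hypothetical https://example.com/api/products endpoint that returns JSON; authentication (API key, OAuth, etc.) is omitted for brevity:

using System;
using System.Net.Http;
using System.Threading.Tasks;

class ApiClient
{
    static readonly HttpClient Http = new HttpClient
    {
        BaseAddress = new Uri("https://example.com/api/")
    };

    static async Task<string> GetProductsJsonAsync()
    {
        // The server-side endpoint runs the actual SQL with credentials that never
        // leave the server; the client only receives the results.
        var response = await Http.GetAsync("products");
        response.EnsureSuccessStatusCode();
        return await response.Content.ReadAsStringAsync();
    }
}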
The API adds a sometimes unnecessary application layer. Not all applications I've been involved with easily convert from using database calls to web service calls. If the application has not been written yet, I guess it would not matter that much.
My alternative implementation is:
Connect to the server using a secure tunnel of some sort.
Save the password encrypted on disk.
This would save me the effort of creating an API, which in most of my projects would be a waste of time.
This alternative is not viable if, say, you want to distribute the application to customers.
You can
A) create a three-tier system. Your client could interface with a server that in turn interfaces with the database. The server stores the access credentials.
B) create personal accounts on the database for your users. This two-tier model is applicable if fine-grained access control to the data is needed, e.g. in an in-house application with different user roles.
Don't let the database user that the application logs in as perform any write or read operations on anything but the application data.
Or, choose a sane architecture, as Thomas mentions above. Databases are for storing and retrieving data; they are not a generic application server.