I develop mobile apps with React Native and have many components that are shared across projects, e.g., <StandardTextInput /> and <LargeHallowButton />.
My clients and I have an agreement under which they own any and all code produced during the project. I would like to extract these shared components into a single npm module, owned by me but open source, and then use it like any other component library. This would cut costs for both parties but would also make the client dependent on the maintenance of this module, i.e., on me.
My question is whether doing the above would violate the agreement between me and my clients. If not, is it ethical to make their applications dependent on my library?
I had planned on speaking with them about this but wanted to consult the community first.
I personally have done exactly the same thing. I manage a project called "react-native-pinch-new", which has been used in numerous projects for different clients, including some major software companies.
However, making those components open source was beneficial for my clients as well, because they could easily reuse the same package in other projects they develop internally. Because of that, my clients were happy with the arrangement and I had no legal issues. But this really depends on your NDA and the other contract terms you have already signed, so I think it is better to talk with your client before publishing the code.
My question is not specific to any particular piece of code; it is about the overall usage of different Python libraries.
My background is in product usage and behavioural analytics in SaaS products.
I was wondering whether any centralised data is available (e.g. from python.org or elsewhere) about the usage of all (or most) Python libraries and the classes specific to those libraries.
The reason I am asking is to get an understanding of:
which libraries are used often
which libraries are frequently used together
new libraries that are becoming popular
interesting combinations of unexpected libraries, which may point to new Python use cases
changes in the usage trends for all of the above
etc.
I hear that many developers want to stay up to date with trends and new libraries, so real usage data would be a great indicator. I'm thinking of building a website around this topic to answer these questions, and I'm looking for the usage data.
Does anyone know of such data?
There are many articles on the web about the most-downloaded packages (e.g. https://betterprogramming.pub/the-22-most-used-python-packages-in-the-world-7020a904b2e), but I have not found anything about usage after the download itself.
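In the meantime, the closest approach I can think of is to gather this data myself by scanning a corpus of open-source projects (e.g. checkouts of popular repositories) and counting imports. Here is a rough sketch of what I have in mind, using only the standard library (the corpus path is just a placeholder):

# Rough sketch: walk a corpus of Python projects and count which top-level
# libraries are imported, and which pairs of libraries are imported together.
import ast
import collections
import itertools
import pathlib

def imported_libraries(source):
    """Return the set of top-level module names imported by one file."""
    names = set()
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.Import):
            for alias in node.names:
                names.add(alias.name.split(".")[0])
        elif isinstance(node, ast.ImportFrom) and node.module and node.level == 0:
            names.add(node.module.split(".")[0])
    return names

def scan(corpus_dir):
    single = collections.Counter()
    pairs = collections.Counter()
    for path in pathlib.Path(corpus_dir).rglob("*.py"):
        try:
            libs = imported_libraries(path.read_text(encoding="utf-8"))
        except (SyntaxError, UnicodeDecodeError):
            continue  # skip files that do not parse cleanly
        single.update(libs)
        pairs.update(itertools.combinations(sorted(libs), 2))
    return single, pairs

if __name__ == "__main__":
    used, used_together = scan("path/to/checked-out-projects")  # placeholder path
    print(used.most_common(20))
    print(used_together.most_common(20))

This would only cover whatever corpus I check out, of course, so centralised data would still be much better.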
Is it permitted to sell multiple copies of a rebranded app in the Play Store or the Apple App Store? Specifically, we would be creating a tailored version of the same app for different companies that incorporates their custom business logos and styling (and perhaps other feature customizations as well).
Is this permitted?
This will only be an issue if Apple decides that you're flooding the App Store, if at all. There are many factors involved in Apple's review process, but they are not public knowledge. As far as Android goes, I don't believe they have a review process as of yet, so theoretically you can put as many versions of the same app on Play as you wish. This may someday change, however, as many apps on Android are dangerous or malicious in nature.
Is it possible to share a model with other apps? If so, how can this be done?
Yeah, maybe we could implement an API for those apps, but it would be cool if apps could share their models with each other without any external libs :-)
You could possibly broaden your definition of "other applications" in the original question to include other versions of the same application (where these versions are really your "other applications"). If so, this could be done by deploying each of your "other applications" as a different version of the same application; that way they should be able to share the same data store. I've not yet attempted this myself, but from what I have read it should be possible. You might get more info if anyone else posts here, or if I end up trying this myself I'll let you know.
Update: I tried this out and it works but with one minor and one possibly significant issue. The minor issue is that you have to work out a way to duplicate the same data model across your two apps (or at least as much as you need). The bigger issue is that datastore commits made in one application may not be visible to the other application for quite some time, and that amount of time varies depending on where/how you're deploying.
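To make the "duplicate the same data model" part concrete, what I mean is keeping an identical model module deployed in both versions of the app. A minimal sketch, assuming the Python runtime and its ndb library (the Note kind and its properties are just placeholders):

# models.py - deployed unchanged in BOTH versions of the application, so both
# read and write the same underlying datastore entities.
from google.appengine.ext import ndb

class Note(ndb.Model):
    title = ndb.StringProperty()
    body = ndb.TextProperty()
    created = ndb.DateTimeProperty(auto_now_add=True)

def recent_notes(limit=20):
    """Fetch the most recently created notes, visible from either version."""
    return Note.query().order(-Note.created).fetch(limit)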
There is no way (yet?) for an App to open its datastore to other Apps, if that is what you mean.
You'd have to go through an HTTP interface (which could probably be derived from the model classes directly, and thus shared). The remote_api standardizes this somewhat.
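For illustration, an HTTP interface derived from such a shared model class could be as small as the following sketch, assuming the Python runtime with webapp2 and ndb (Note is the hypothetical shared model from the previous sketch, and the /api/notes route is made up):

# A minimal read-only JSON endpoint over a datastore model, so that other
# apps can get at the data over HTTP.
import json
import webapp2
from models import Note  # the hypothetical shared model module sketched above

class NotesApi(webapp2.RequestHandler):
    def get(self):
        notes = Note.query().order(-Note.created).fetch(20)
        payload = [
            {"id": note.key.id(), "title": note.title, "body": note.body}
            for note in notes
        ]
        self.response.headers["Content-Type"] = "application/json"
        self.response.write(json.dumps(payload))

app = webapp2.WSGIApplication([("/api/notes", NotesApi)])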
My team maintains a huge Win32 client/server Delphi application (a thick client) that uses DevArt (SDAC) components to connect to SQL Server.
The business logic is often "trapped" in component event handlers; however, with some degree of refactoring it is doable to move the business logic into common units (a big part of this work has already been done during refactoring... maintaining legacy applications someone else wrote is very frustrating, but it is a very common job).
Now there is a request for a web interface. I have several options, of course; in this question I want to focus on the VCL for the Web (IntraWeb) option.
The idea is to use common code (the same .pas files) for both the client/server application and the web application. I have heard of many people who moved legacy apps from Delphi to IntraWeb, but here I am trying to keep the thick client too.
The idea is to use common code, maybe with some compiler directives for the platform-specific parts:
{$IFDEF CLIENTSERVER}
  {here goes the thick client specific code}
{$ELSE}
  {here goes the IntraWeb specific code}
{$ENDIF}
Then another problem is the "migration plan": let's say I have 300 features, and in the first release only 50 of them will be available in the web application. How do I keep track of this? I was thinking of (ab)using Delphi interfaces to handle it. For example, for user authentication I could move all the related code into a procedure and declare an interface like:
type
  IUserAuthentication = interface
    ['{0D57624C-CDDE-458B-A36C-436AE465B477}']
    procedure UserAuthentication;
  end;
This way, once I have implemented the IUserAuthentication interface in both applications (thick client and IntraWeb), I know that the feature has been "ported" to the web. Anyway, I don't know if this approach makes sense. I made a prototype to simulate the whole process; it works for a "Hello world" application, but I wonder whether it makes sense for a large application or whether the interface idea is just counter-productive and could backfire.
My question is: does this approach make sense? (The interface idea is just an extra; it is not as important as the common-code part described above.) Is it a viable option?
I understand it depends a lot on the kind of application; to be concrete, mine is in the CRM/accounting domain, and the number of concurrent users on a single installation is typically fewer than 20, with peaks of 50.
EXTRA COMMENT (UPDATE): I ask this question because, since I don't have an n-tier application, I see IntraWeb as the only option for a web application that shares common code with the thick client. Developing web services from the Delphi code makes no sense in my specific case, so the alternative is to write the web interface in ASP.NET (duplicating the business logic), but then I cannot easily take advantage of the common code. Yes, I could maybe use DLLs, but my code is not suitable for that.
The most important thing that you have to remember is this:
Your thick client .EXE process is used by one person at a time (multiple persons will have multiple instances of that .EXE).
Your IntraWeb .EXE process will be used by many persons at a time; they all share the same instance of the process.
That means your business logic must not only be refactored out into common units; multiple instances of that business logic must also be able to reside in memory at the same time without interfering with each other.
This starts with the business logic that talks to the database: you must be able to have multiple database connections at the same time (in practice, a pool of database connections works best).
In my experience, when you can refactor your business logic into data modules, you have a good starting point for supporting both an IntraWeb and a thick-client version of your app.
You should not forget the user interface:
Thick clients support modal forms, and have a much richer UI
Web browsers support only message dialogs (and even those are very limited); all the fancy UI stuff costs a lot of development time (though TMS, for instance, has some nice components for IntraWeb)
Then, to top it off, you have to cope with the stateless nature of the HTTP protocol. To overcome this, you need sessions; IntraWeb will take care of most of the session handling.
But you need to ask yourself questions like these:
what should happen if a user is idle for XX minutes?
how much session state can I store in memory? and what if it doesn't fit?
what do I do with the session state that does not fit into memory?
This is just a start, so let us know when you need more info.
If it gets very specific to your app, you can always contact me directly: just google me.
--jeroen
I think moving your application to an n-tier architecture would be a better solution; it will then be easier to serve both the desktop and web applications.
You have already done the first part by decoupling the business logic from the presentation; you can use RemObjects SDK or DataSnap, which is bundled with Delphi.
After that you will have a working desktop application, and you can use IntraWeb, ASP.NET, or whatever else for the web part; this way you will not have to duplicate the business logic for the web.
Usually, converting a desktop application to the web is not as easy as you might think, because they run in different environments, and you need to build each one according to its nature.
I see this time and time again. The UAT test manager wants the new build to be ready to test by Friday. One of the first questions asked in the pre-testing meeting is, "What version will I be testing against?" (which is a fair question to ask). The room goes silent, then someone comes back with, "All the assemblies have their own version; just right-click and look at the properties...".
From the testing manager's point of view, this is no use. They want a version/label/tag across everything that tells them what they are working on, and they want this information to be easily available.
I have seen solutions where the versions of different areas of a system are stored in a data store and then shown in the main application's About box. The problem is that this needs to be maintained.
What solutions have you seen that get around this ongoing problem?
EDIT: The distributed system covers VB6, Classic ASP, VB.NET, C#, web services (across departments, so which version are we using?), and SQL Server 2005.
I think the problem is that you and your testing manager are speaking of two different things. Assembly versions are great for assemblies, but your test manager is speaking of a higher-level version, a "system version", if you will. At least that's my read of your post.
What you have to do in such situations is map all of your different component assemblies into a system version. You say something along the lines of "Version 1.5 of the system is composed of Foo.Bar.dll v1.4.6 and Baz.Qux.dll v2.6.7 and (etc.)". Hell, in a distributed system, you may want different versions for each of your services, which may, in and of themselves, be composed of different versions of DLLs. You might say, for example: "Version 1.5 of the system is composed of the Foo service v1.3, which is composed of Foo.dll v1.9.3 and Bar.dll v1.6.9, and the Bar service v1.9, which is composed of Baz.dll v1.8.2 and Qux.dll v1.5.2 and (etc.)".
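For example, such a mapping could be kept as a simple manifest under source control and queried by a script. A hypothetical sketch in Python, reusing the component names and version numbers from the example above:

# Hypothetical system-version manifest: maps a system release to the exact
# component versions it is composed of (structure and values are illustrative).
SYSTEM_VERSIONS = {
    "1.5": {
        "Foo service": {
            "version": "1.3",
            "assemblies": {"Foo.dll": "1.9.3", "Bar.dll": "1.6.9"},
        },
        "Bar service": {
            "version": "1.9",
            "assemblies": {"Baz.dll": "1.8.2", "Qux.dll": "1.5.2"},
        },
    },
}

def describe(system_version):
    """Answer the tester's question: what exactly is in this system version?"""
    components = SYSTEM_VERSIONS[system_version]
    lines = ["System version %s:" % system_version]
    for name, info in sorted(components.items()):
        lines.append("  %s v%s" % (name, info["version"]))
        for dll, dll_version in sorted(info["assemblies"].items()):
            lines.append("    %s v%s" % (dll, dll_version))
    return "\n".join(lines)

if __name__ == "__main__":
    print(describe("1.5"))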
Doing stuff like this is typically the job of the software architect and/or build manager in your organization.
There are a number of tools that you can use to handle this issue that have nothing to do with your language of choice. My personal favorite is currently Jira, which, in addition to bug tracking, has great product versioning and roadmapping support.
You might want to have a look at this page, which explains some ways to integrate consistent versioning into your build process.
There are a number of different things that contribute to the problem. Off the top of my head, here's one:
One of the benefits of a distributed architecture is that we gain huge potential for re-use by creating services and publishing their interfaces in some form or another. What that then means is that releases of a client application are not necessarily closely synchronized with releases of the underlying services. So, a new version of a business application may be released that uses the same old reliable service it's been using for a year. How shall we then apply a single release tag in this case?
Nevertheless, it's a fair question, but one that requires a non-trivial answer to be meaningful.
Not using build-based version numbering for anything but internal references. When the UAT manager asks the question, you say "Friday's*".
The only trick then is to make sure labelling happens reliably in your source control.
* insert appropriate datestamp/label here
We use .NET and Subversion. All of our application assemblies share a version number, which is derived from manually updated major and minor numbers and the Subversion revision number (<major>.<minor>.<revision>). We have a prebuild task that updates this version number in a shared AssemblyVersionInfo.vb file. Then, when testers ask for the version number, we can either give them the full three-part number or just the Subversion revision. The libraries we consume either aren't changing or the changes are not relevant to the tester.
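A prebuild step along those lines could be as simple as a small script that asks Subversion for the working-copy revision and regenerates the shared version file. A rough sketch, assuming the svnversion command is available (the major/minor numbers and the output path are placeholders):

# Rough sketch of a prebuild step: read the Subversion working-copy revision
# and regenerate a shared AssemblyVersionInfo.vb so every assembly gets
# <major>.<minor>.<revision>.
import re
import subprocess

MAJOR, MINOR = 1, 2                       # maintained by hand (placeholders)
OUTPUT = "Shared/AssemblyVersionInfo.vb"  # placeholder path

def svn_revision():
    # svnversion prints e.g. "4168" (or "4168M"/"4165:4168" for modified/mixed
    # working copies); keep only the last run of digits for simplicity.
    raw = subprocess.check_output(["svnversion"]).decode().strip()
    return re.findall(r"\d+", raw)[-1]

def write_version_file(revision):
    version = "%d.%d.%s" % (MAJOR, MINOR, revision)
    content = (
        "' Generated by the prebuild step - do not edit by hand.\n"
        "Imports System.Reflection\n"
        '<Assembly: AssemblyVersion("%s")>\n'
        '<Assembly: AssemblyFileVersion("%s")>\n'
    ) % (version, version)
    with open(OUTPUT, "w") as f:
        f.write(content)
    return version

if __name__ == "__main__":
    print("Stamped version", write_version_file(svn_revision()))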