How do you handle internal systems development? - intranet

We regularly convince our clients of the value of having good-quality intranets and systems, but within my own organisation it doesn't seem that we're "eating our own dogfood".
Our intranet systems are really lacklustre: a hastily thrown-together SharePoint install that no one really oversees, plus a cobbled-together legacy intranet that's been dragged "into" SharePoint by loading it in frames.
The general approach is that the senior developers are "too valuable" to waste on internal development, so responsibility for these systems falls to the least qualified people, the work-experience guys and so on, at least until they're found more lucrative work and shipped off to a client. Hence the cobbled-together nature of our systems.
It's very much a case of the mechanic driving a shoddy car.
So how do you handle it? How can I convince management to spend effort on our internal systems?
Or is it that, within a tech company, it's actually not worth it? Are high-quality systems only required for non-technical users?

Do it the same way that you convince the clients. Show the business value of the investment. If that can't be done, then it would be a waste of resources.

The best choice is adopting common Internet structures, such as blogs and wikis. Wikis, in particular, are very effective.
Having some kind of karma system can also be useful. Many people will do for status what they wouldn't do for money.

Related

Netsuite Salesforce Integration ESB vs Prebuilt Connectors

Is there a benefit to using prebuilt connectors to and from SaaS billing platforms like Aria/Zuora when they sit between Salesforce and Netsuite, used as pure CRM and ERP/Accounting/Finance respectively? That is, versus using an ESB/integration platform like Mulesoft or Boomi?
We are currently looking at changing billing and ERP systems and having them integrate together and with Salesforce CRM. So the chain would look like:
CRM -- Billing Solution -- ERP
Many of the billing systems have prebuilt connectors that work with ERP systems like Netsuite or Fusion, as well as connectors for Salesforce - not to mention web service endpoints/APIs.
But there are integration vendors like Mulesoft and Boomi (basically Enterprise Service Bus PaaS providers) that also allow integration between the services.
I come from an SOA background and tend to favour a standalone ESB to connect the systems, but due to my lack of familiarity with SaaS ERP systems I don't understand the benefits and pitfalls in the prebuilt connector vs ESB debate. I understand the concepts behind avoiding point-to-point integration, which would count as a benefit of using an ESB. But is there a benefit to using the prebuilt connectors within the SaaS platforms ... and are there serious downsides (my main concern)?
Can anyone provide some insight here? I am not asking for "which one is best", just some real world experience good or bad that could help someone make these kinds of decisions.
I cannot provide a comprehensive comparison between the services you plan on using, but your question is quite interesting so I thought I'd share my thoughts and experience and hope you'll benefit from it.
Prebuilt connectors are not something new - they existed long before SaaS and iPaaS became a thing, so their pros and cons are still the same. The main issues you will be looking at are the lack of flexibility you'll be facing and, of course, the shortcomings of point-to-point integration. Things are somewhat refracted through the prism of SaaS/iPaaS, but I believe most aspects are still relevant.
Prebuilt Connector capabilities and support
You need to assess to what extent a prebuilt connector really covers the integration between the two systems. Services like Salesforce take pride in their customisability and extensibility via third-party extensions. In most cases the connector will follow a one-size-fits-all approach that only satisfies the most common and simple integration needs. It's all fun and games until something has to change. It is not possible to know in advance what you might need in the future, but think about it - would you be able to count on your customisations and extensions being covered by the prebuilt connector if you decide to integrate them as well?
Another point you must consider is support - what happens if one of those companies suddenly announces that it will stop supporting integration via the prebuilt connectors you are already using? You should check whether there are any guarantees for you.
Tight Coupling and Service provider lock-in
Using point-to-point connectors will couple the systems to each other, so you'll be severely limiting your options to switch between platforms if you need to at some point. It might seem a fairly simple integration scenario now, but adding more systems to the mix over time generally makes things worse, since you are going to have dependencies here and there, and not every new system will have an out-of-the-box connector to integrate easily with all the others you're already using. Having middleware in between gives you the precious ability to map and transform data as needed, and maybe even apply some business logic, which makes your life much easier (and cheaper). You'd also be able to replace a system without having to replace the others that depend on it.
Consider your scenario: if you decide to change the billing system, you will have to find one that's properly supported by both the CRM and ERP providers. You could thus remain locked into exactly these three, even if they no longer fit your needs or there is something else on the market that would give you a great competitive advantage if only you could integrate with it.
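To make the decoupling argument a bit more concrete, here is a minimal sketch of the "canonical model in the middleware" idea. Python is used purely for illustration, and every field name is made up rather than any vendor's real API: each system is mapped to and from one internal representation, so replacing the billing system only means writing one new adapter.

```python
# Minimal sketch of a canonical data model in the middleware. All field
# names are hypothetical; neither adapter knows anything about the other
# system, which is what keeps a replacement cheap and local.

from dataclasses import dataclass


@dataclass
class CanonicalInvoice:
    """Internal representation shared by all adapters."""
    customer_id: str
    amount_cents: int
    currency: str


def from_billing(record: dict) -> CanonicalInvoice:
    """Map a hypothetical billing-system payload into the canonical model."""
    return CanonicalInvoice(
        customer_id=record["accountRef"],
        amount_cents=int(round(float(record["total"]) * 100)),
        currency=record.get("currency", "USD"),
    )


def to_erp(invoice: CanonicalInvoice) -> dict:
    """Map the canonical model into a hypothetical ERP payload."""
    return {
        "customer": invoice.customer_id,
        "amount": invoice.amount_cents / 100.0,
        "currency_code": invoice.currency,
    }


if __name__ == "__main__":
    billing_payload = {"accountRef": "ACME-001", "total": "129.99"}
    print(to_erp(from_billing(billing_payload)))
```

The point is not the code itself but the shape: swapping the billing provider touches only from_billing, not the ERP side.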
Orchestration and future investments
An important note about the point-to-point scenario is that you will not be able to implement process services that span all the systems if needed. The added flexibility and benefits of even simple forms of orchestration (I'm not even talking about what can be achieved with full-featured business process management) will be out of reach for your business. When the market changes and time to market is the deciding factor, you may not be prepared.
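To illustrate the orchestration point, here is a rough sketch (again in Python, with stand-in client classes rather than real vendor SDKs) of a process service that spans all three systems, including the kind of compensation step that is awkward to express with point-to-point connectors alone.

```python
# Rough sketch of a cross-system process service living in the middleware:
# close the deal in the CRM, raise an invoice in billing, post it to the ERP,
# and compensate (cancel the invoice) if the last step fails. The stub
# classes are placeholders, not real vendor SDKs.

class StepFailed(Exception):
    pass


class StubCRM:
    def close_opportunity(self, opp_id):
        return {"opportunity": opp_id, "status": "closed-won"}


class StubBilling:
    def create_invoice(self, order):
        return {"invoice_for": order["opportunity"], "state": "open"}

    def cancel_invoice(self, invoice):
        invoice["state"] = "cancelled"


class StubERP:
    def post_invoice(self, invoice):
        invoice["state"] = "posted"


def order_to_cash(crm, billing, erp, opportunity_id):
    """Orchestrate one business process across CRM, billing and ERP."""
    order = crm.close_opportunity(opportunity_id)
    invoice = billing.create_invoice(order)
    try:
        erp.post_invoice(invoice)
    except StepFailed:
        # Compensate: undo the billing step so the systems stay consistent.
        billing.cancel_invoice(invoice)
        raise
    return invoice


if __name__ == "__main__":
    print(order_to_cash(StubCRM(), StubBilling(), StubERP(), "OPP-42"))
```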
Thoughts on choosing iPaaS
Using an iPaaS platform looks like a much better decision in the long run. Still, you have to make sure that the platform doesn't just give you a set of predefined connectors and drag-and-drop beauties (they all do), but also the ability to easily implement your very own integrations from scratch while supporting industry standards. I think this kind of flexibility is absolutely crucial when talking about an ESB solution, be it in the cloud or on premises.
The potential cons of the iPaaS approach would be:
you come to depend on yet another service provider, and you will have additional costs because the service is not free;
your data travels to another service provider, so there is additional risk in terms of security, no matter what the service providers may try to tell you;
more upfront effort spent on design and implementation;
an additional burden related to maintaining the integration and accommodating potential changes (however rare they might be) when a new version comes out.
Conclusion
It’s all really a tradeoff between desired flexibility and the investment that you’re willing to make. Your decision will heavily depend on the current state of your business and your growth expectations going forward, rather than the purely technical side of things.
I hope my thoughts gave you some perspective. Please update the question with your decision and reasoning when the time comes. Good luck!

Cross Platform, Open Source, Data Synchronization Strategy

The goal is to have occasionally connected clients belonging to one user, running on an array of different platforms, share a common dataset. The server will be GNU/Linux. Platforms and devices initially targeted include GNU/Linux, Android and Windows on x86_64, i686, ARMv7 and ARMv8, with possible consideration given to OSX and iOS devices. The UI toolkit will likely be Qt5. The non-interface logic will be in C++. My code will be licensed under the GPL or AGPL.
Many applications, Google's Maps for instance, require a reliable connection to work effectively. My design decision is that each client should be fully functional without a network connection; only when the user has changed the data from a different device/platform should a synchronization over the network be necessary.
Some applications are written with hand-rolled data persistence and synchronization layers customized for each platform or subset of platforms. While this is quite effective, it is beyond the skill and manpower available here (just me :) ).
Also, though I'm somewhat conditioned by past experience to think of things from a relational database perspective, the data I'm expecting is probably better suited to a JSON/NoSQL database. There is UNQLite, but it has no more synchronization capabilities than SQLite, and things like MongoDB, CouchDB, etc. are, as far as I have found, solely targeting the data center or cloud at the moment. Really, the same seems to be true of SQL solutions. Are there truly cross-platform databases that have native synchronization/replication capabilities?
I've spent the past weeks looking at a variety of options but have found little that is truly useful. The closest I've seen to something useful is CouchBase; however, it seems rather unstable (from an API standpoint, not in terms of crashes), has a huge set of dependencies, appears to be nothing but a SQLite wrapper on mobile platforms, and has limited developer docs at the moment.
If there is an applicable library or toolkit, please point it out. If not, or if this is excessively "discussion oriented", please kindly point me to an appropriate resource to find help before closing this question.
Notes:
How would you design & implement a Cross Platform synchronization mechanism? Talks about hand-rolling solutions. If this is the only way to go, I would appreciate references specifically applicable to SQLite3, since I know it is stable and available on just about everything but the kitchen sink (see the sketch after these notes).
Cross platform data synchronization Does not really address anything significant. One of my main concerns is synchronizing data effectively while minimizing network bandwidth requirements. Even in the US there are vast areas where high-speed data is simply not available. To me, usability in the widest range of circumstances is important.
How to Synchronize Android Database with an online SQL Server? This post is somewhat helpful but I am a total, absolute novice at developing for anything but *NIX on servers and workstations. Again, if it is necessary to do SQL based persistence for this set of requirements, more pointers to guidance would be helpful.
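For what it's worth, here is a very small sketch of what the hand-rolled SQLite approach mentioned in the notes above could look like: each row carries a change counter, and only rows modified since the last sync cross the wire, which keeps bandwidth low on slow links. Python and its standard sqlite3 module are used purely for brevity; the table, column and function names are invented for illustration, and the same idea ports to C++ via the SQLite C API. Conflict handling here is simple last-writer-wins, which may or may not suit the real data.

```python
# Sketch of change-log style sync over SQLite (hypothetical schema).
import sqlite3


def init_db(conn):
    conn.executescript("""
        CREATE TABLE IF NOT EXISTS notes (
            id      TEXT PRIMARY KEY,   -- globally unique id (e.g. a UUID)
            body    TEXT NOT NULL,
            updated INTEGER NOT NULL    -- monotonically increasing change counter
        );
    """)


def local_changes_since(conn, counter):
    """Rows changed after the last counter the other side is known to have."""
    return conn.execute(
        "SELECT id, body, updated FROM notes WHERE updated > ?", (counter,)
    ).fetchall()


def apply_remote_changes(conn, rows):
    """Last-writer-wins merge: keep whichever copy has the higher counter."""
    for row_id, body, updated in rows:
        current = conn.execute(
            "SELECT updated FROM notes WHERE id = ?", (row_id,)
        ).fetchone()
        if current is None or updated > current[0]:
            conn.execute(
                "INSERT OR REPLACE INTO notes (id, body, updated) VALUES (?, ?, ?)",
                (row_id, body, updated),
            )
    conn.commit()


if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    init_db(conn)
    conn.execute("INSERT INTO notes VALUES ('a', 'hello', 1)")
    apply_remote_changes(conn, [("a", "hello, synced", 2), ("b", "new", 1)])
    print(local_changes_since(conn, 0))
```

Only the rows returned by local_changes_since need to be transferred, so the bandwidth cost is proportional to the change set rather than to the whole dataset.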

Salesforce: a developer's view

We are in the process of deciding a route to take for a new CRM system. We've had Salesforce come in and give us their pitch and the developers have had a little play with it, made it do a few things we need etc...
It's hard for us to get a good idea of the pros and cons until we start to develop with it, but once you start you are tied into a year-long contract for X number of users, and it's pretty expensive as it is.
So, my question: who has developed for the Salesforce platform? How did you find the experience? Would you recommend it as a good solution? Or should we just continue with our Ruby/Rails/Mongo systems?
Thanks!
The good news is the amount of customization you can do via configuration is amazing. The out-of-box functionality is very strong and you get a pretty nice security model and reporting system included.
Having said that, when you do need to do custom development beyond what the configuration can support, the pain can start:
- APEX is the most frustrating (modern?) language I have ever worked with.
- Deployment/migration can be slow and painful (some things cannot be migrated, e.g. approval processes).
- APEX is a rather immature language missing many of the concepts of .NET or Java.
- Debugging is messy (the log actually gets truncated at a certain length, and there's no stepping).
Having said all that, SalesForce.com is a very strong CRM - 90% of the custom work you'll want to do will be really smooth and fast; the remainder will be extremely painful.

What slowed down development on your project and how did you overcome it?

Is there anything that slowed development down on a project you worked on, and if so, how did you improve it?
We recently introduced continuous integration to solve the problem of a constantly broken build.
To increase code quality we introduced code reviews.
The client was constantly changing the static data (lookups) so we introduced a change control process around it.
Communication with our offshore colleagues was difficult, so we introduced Office Communicator.
I would be interested in hearing about things that slowed your team down and how you got round them.
Our biggest productivity loss is when developers don't reach the point where they are "programming in the zone".
Developers can be exponentially more productive if they don't have distractions and just zone into what they are doing.
Reading Stackoverflow.com and trying to figure out answers to users' questions also consumes quite a bit of time. Oh... wait...
Number one far beyond anything else: Failure to adequately or accurately determine requirements.
This cascades into failures to estimate timescales correctly (obviously), an inability to handle change (because you weren't in possession of the full picture at the start), and increased change requests (actually original requirements manifesting as change because you didn't pick them up initially).
A lot of this can be mitigated to an extent by a mutual understanding that you are in an adaptive, formative cycle (i.e. agile); the really destructive thing is when you think you have good information when you don't.
Personally, I have found that overzealous project managers have caused very slow development in the past. PMs who need very accurate specs written, plus meetings to cover the project, etc., cause lots of problems, and sometimes you just need to tell them how little you are getting done. I have also found that clients changing their minds has lost me a lot of time in the past, so I am working on a sign-off process where clients get charged for wasted time when they change their minds.
Some things slowing down the project:
Multi-site (offshore) communication (sometimes even a distributed team). I tried to set up time-boxed conferences (status, requirements clarification) with strict control over what gets discussed - and, of course, meeting minutes at the end, so there are no further discussions about what has already been decided.
Continuous changes from the customer. They tend to be verbal and addressed directly to the development team, fragmenting the development/testing team. I usually enforce a single point of communication when it comes to changes - the change control board. The handling of the changes (analysis, technical solution, planning, etc.) is done in this control board and the conclusions are documented. Small changes are planned and handled as a batch, for the sake of efficiency.
Updating technical documentation - this looks like a slowdown from a development perspective, but it usually pays off in other activities (handling changes, discussions with the customer, onboarding, etc.), so it must be done :). What shouldn't be done is putting in too many details that add little value. As for the right degree of detail... there's no rule for finding it :).
I almost forgot: "analysis paralysis" - thinking too much about a technical solution, having second thoughts, etc. This will definitely slow down development as a whole. Adopting a pragmatic attitude might help.
If a large chunk of development happens offshore, then:
Make sure your offshore colleagues have the best hardware/software resources available. I have seen this be a serious cause of productivity loss: the offshore contractor will provide its developers with outdated versions of development tools, and their development machines will have poor configurations (RAM size etc.), causing serious productivity losses. Based on the requirements, ensure that both offshore and onsite developers have the same software and hardware available for development.
Moreover, network issues across continents will really slow down development. In many projects I have worked on, offshore teams were connecting to databases located in the US, which slowed down development/testing dramatically. It's a big demotivator when, for example, a single SELECT query takes several seconds to complete.

Is it feasible to support multiple applications of the same type that are all written in different languages?

As much as we would all like to say it is a benefit to programmers to be language agnostic, is it really feasible to support multiple enterprise Web applications of the same type all written in different languages? Think about how complicated a CMS or e-commerce system can be -- now imagine supporting three different CMS platforms, each written in a different language. I would hate to be known as a .NET or Java or PHP shop, but I also don't want to be the vendor who says they can support a solution they have never worked with, upsetting a client who wonders why we can't get something done right on time. Can anyone speak from experience on this? Does your company usually just suck it up and try to learn a new platform on the fly? Do you bill for the time spent getting up to speed, or eat those costs?
I think it all depends on who your clients are and what they expect. I think knowing about different technologies is good, but really when you're hired by someone, they expect you to know what you are doing. Personally, I would much rather be known that I do a really good job with a certain type of technology and when hired, I get the job done well.
If you try and go after every contract without regard to what your core competencies are, you aren't going to succeed. You'll anger the people who do hire you and make mistakes, and you'll potentially miss opportunities where you can really shine. Sometimes you have to make compromises to pay the bills, but if you aren't careful, it can bite you in the end.
The large consulting firms I've worked with throw resources at it and hope they don't anger too many people. They mainly do this because they know that the people who work with the consultants, and get angry when they don't get the job done, aren't the ones making the decisions to keep them hired. Some of them (not all of them, I know, but some definitely) don't care if they screw up, because they ultimately know they can convince the VPs and SVPs to keep them around.
To be honest, I think you tend to see this kind of thing happen over time, no matter how disciplined the organization is. It's natural for new methodologies to come bundled in the form of new libraries, frameworks, or even languages. Keep in mind that a .NET shop may well have been an ASP/VB shop at one time. They'll probably still maintain older systems for clients, because there's little benefit to rewriting everything from scratch.
I'm not sure anyone has the luxury to keep everything "the same," because language issues are minor compared to library or framework issues -- especially the ones you build yourself.

Resources