
Monday, 22 February 2016

The Mobile App Development Platform Decision:
Cross-compiled vs Hybrid vs Native



The objective of this report is to help developers and companies decide on the best platform for building mobile apps. It is based on a compilation of research, market trends and ratings from different sources, as well as our own technical evaluations of the various cross-platform mobile development tools available in the market.


There were two drivers within our company leading to the work in this report.

  1. Keeping Current with the Latest Trends: We need to keep abreast of the latest trends and technologies in mobile app development, and hence choose the best platform for developing our mobile apps.
  2. Rationalize the Development Cost: Given that we develop web-based apps, server-side apps, and mobile apps, it was becoming difficult to manage the diverse set of technology skills required within our development team.  The number of skills each developer needs to master, as well as the number of developers required to manage different apps, was fast becoming unsustainable.


Switching app development platforms is expensive and often entails re-developing some of the existing apps.  The decision on which platform to use for developing mobile apps can also be affected by the other types of applications and technologies used within the company.  This report focuses on the mobile app development platform options.


After performing individual comparisons between different Hybrid development platforms and selecting the top few, an overarching comparison table was produced covering three different mobile development approaches - Hybrid, Cross-Compiled and Native. This table compared tools across these approaches, such as Apache Flex/ActionScript, Ionic, React Native, Appcelerator Titanium, Xamarin, and native Android/iOS development tools, on metrics such as Ease of Development, Learning Curve, Time-to-market, Device Access, Performance, User Experience, Development Cost, Access to Features and Future Scope.

Each platform option was scored on each metric, with a lower number signifying a better score. The scores were based on various comparison and research reports that highlighted specific aspects of each cross-platform tool against similar metrics.  The scores remain subjective assessments made by us, and other experts in this area could arrive at different rankings and conclusions.  At the very least, we hope readers will get a good idea of the different factors to consider when picking their mobile app development platform.
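
To make the scoring mechanics concrete, here is a small hypothetical sketch (in Python) of how per-metric scores can be rolled up into an overall ranking. The platform and metric names are taken from the comparison above, but the numbers are placeholders for illustration only and are not the report's actual scores.

    # Illustration of the scoring approach: platform and metric names come
    # from the report, but the scores are made-up placeholders, NOT the
    # report's actual figures.
    metrics = ["Ease of Development", "Learning Curve", "Time-to-market",
               "Device Access", "Performance", "User Experience",
               "Development Cost", "Access to Features", "Future Scope"]

    # One score per metric per platform; lower = better.
    scores = {
        "Ionic":              [2, 1, 1, 3, 3, 3, 1, 3, 2],
        "React Native":       [2, 2, 2, 2, 2, 2, 2, 2, 1],
        "Xamarin":            [2, 2, 2, 2, 2, 2, 2, 2, 2],
        "Native iOS/Android": [3, 3, 3, 1, 1, 1, 3, 1, 1],
    }
    assert all(len(v) == len(metrics) for v in scores.values())

    # Rank the options by total score (lower total = better overall fit).
    totals = {platform: sum(vals) for platform, vals in scores.items()}
    for platform, total in sorted(totals.items(), key=lambda kv: kv[1]):
        print(f"{platform:20s} total: {total}")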

Read more about InterACT2Go - Our Mobility Platform 

InterACT2Go from CRMantra


To download the full report, including our final recommendations, please fill in your details below.






Friday, 12 February 2016

Using Siebel’s new Usage Pattern Tracking




With Siebel 15.5 came the new Usage Pattern Tracking (UPT): an improved way to get usage statistics out of Siebel.  Siebel has always had the (little known) ability to provide usage statistics by recording a time stamp for every view that a user visits.

We have been using this method for many years to provide in-depth analysis of user behaviour within Siebel.


With the new Usage Pattern Tracking we are now seeing additional events that can be tracked.  Before UPT we could only monitor view visits and had to infer the Application, Business Component or Applet from that.  UPT now makes it a lot easier to get that information.

UPT also provides information on script events, for example Browser Scripts, JavaScript and Server Scripts. This adds a lot of value to the information that can be gathered.

So all in all UPT brings a lot of additional information that can be analysed, but what it doesn't do is perform the analysis.

I can see that the Usage Pattern Tracking view within Siebel may be of use if a specific view or script is being closely monitored - maybe there is a performance problem within a view or a script is causing issues - it's like a mini SARM.  But I can’t see that it would be of any use for one of our typical engagements that require us to analyse millions of records.


And that for me is the missing piece. After a quick Google it’s obvious that I wasn’t the only one hoping that UPT would provide the capability to analyse the data that has been captured – but sadly no – that task is still left to the Administrator.
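
As a rough illustration of the kind of first-pass analysis that is left to the Administrator, here is a short Python/pandas sketch that assumes the captured usage records have been exported to a flat file with user, event type, object name and time stamp columns. The file name and column names are assumptions made for the example; they are not the actual Siebel UPT schema.

    import pandas as pd

    # Assumed export of the captured usage records; file and column names
    # are illustrative only, not the real Siebel UPT table layout.
    usage = pd.read_csv("upt_export.csv", parse_dates=["EVENT_TIME"])

    # Most frequently hit views/objects across the capture window.
    top_objects = (usage.groupby(["EVENT_TYPE", "OBJECT_NAME"])
                        .size()
                        .sort_values(ascending=False)
                        .head(20))

    # Daily active users - a quick view of adoption over time.
    daily_active_users = usage.groupby(usage["EVENT_TIME"].dt.date)["USER_ID"].nunique()

    print(top_objects)
    print(daily_active_users)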



Siebel Usage Analytics from CRMantra


As I mentioned earlier, we have been performing Siebel Usage Analytics for a long time now, and we have always had, and still get, the same 'knee jerk' reaction to our solution: that it is very easy for the Siebel Administrator to turn on the usage service and do the analysis themselves.  I agree with the first part, yes it's very easy to turn on. But the second part? Absolutely not.  I've lost count of the number of Siebel Administrators who have come back after a few weeks and asked for help to analyse the enormous amount of data that had been captured.

So whilst turning on the Usage logging and capturing the data is easy – actually analysing the data and turning it into actionable insights is no trivial task.

This is how we typically work on one of these engagements.

1 – Work with the business and the Siebel team to discover the key aim of the analysis.
For example: Is it to find out who is, or is not, using Siebel and overcome user adoption issues?  Is it to find out which parts of Siebel are heavily used, or never used?  Is it to find out the common paths users take through the application, and improve training?

2 – Turn on the Usage logging (that’s about a 10 minute job!)

3 – Work with the business to gather additional user demographic information.
What’s a user’s role, their skill level, their job title, their location, their business unit – everything we need to further enrich the data.

4 – We leave the logging to run for around 2-3 weeks and then get to work on the analysis.

5 – We merge in the user demographic profiles and start running through our standard set of analyses (a short sketch of this step follows the list).

6 – We run any additional analyses that the client has asked for and pull together a full set of reports.

7 – We then look for actionable insights within the results.
Are there a few heavily used Views that could do with some UI improvements? Is there a set of users in a remote locale who just aren't using Siebel?  Is there a Siebel 'SuperStar' who is using Siebel more than anyone else, and what can we learn from them?

8 – Finally we package up the analysis, generate the necessary graphics and charts and present our findings back to the client, with clear action statements that we gathered from the analysis.
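
As an example of steps 3, 5 and 7 above, the sketch below merges a hypothetical usage export with a user profile file and looks for locations with unusually low activity. The file names and column names are assumptions made for illustration; they are not part of Siebel or of any specific toolset.

    import pandas as pd

    # Illustrative inputs; file and column names are assumptions only.
    usage = pd.read_csv("upt_export.csv", parse_dates=["EVENT_TIME"])
    profiles = pd.read_csv("user_profiles.csv")  # e.g. USER_ID, ROLE, LOCATION, BUSINESS_UNIT

    # Step 5: enrich the usage records with the demographic profiles.
    enriched = usage.merge(profiles, on="USER_ID", how="left")

    # Step 7 (one example): locations with very low activity may point to
    # a user adoption problem worth investigating.
    by_location = (enriched.groupby("LOCATION")["USER_ID"]
                           .agg(events="count", active_users="nunique")
                           .sort_values("events"))
    print(by_location)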

Typically, projects would then be kicked off based on these findings, and they aren't always Siebel-based: it could be new training material or a re-training programme.

But most important of all is the need to re-analyse Siebel once any improvement measures have been put in place; without that there is no way of knowing whether the actions that were taken had any effect!

So, hopefully it’s now clear that whilst turning on the Usage logging has to be done, it is only a very small part of the process.

Get in touch if you’d like to run one for your organisation!
Read about some of our findings in this blog post > Siebel-usage-analytics

Or read more about all our Services - Improving Siebel Usability

David Moorman
David.moorman@crmantra.com