Wednesday, October 23, 2013

Questions from Presentation: Using Analytics to Improve Customer Communication

      
 
The best part of our presentation to the ABA on 10/22 was not the presentation itself but the really good questions that followed:
 
 
Can you elaborate on what impact privacy laws might have on using publicly available social media and other content as your source for customer messaging?  My only answer at the time was that we are not compliance or banking-law experts.  In talking with a compliance vendor afterward, he saw no issue with this, because as individuals we opt in to disclosing all manner of personal information to Google, Facebook, etc.  Bottom line: if banks decide not to take advantage of this publicly available information, somebody else may do so instead.
 
Any ideas on how to identify who is clicking on the specific ads you embed?  That’s a darn good question, and I wonder whether the e-Ad could somehow capture session information.  The question was posed by the SVP of Marketing at a 20-branch bank that is doing very well embedding ads in its real estate channel to drive sales to its lending channel.  They are getting a lot of good clicks, but how can they ever tell who it is they need to follow up with?  More to follow, as we could not come up with a complete answer to this very good question.

Is it possible to do Communication Management completely independent of Analytics?  We answered yes: one can definitely find low-hanging fruit in the customer communication area, especially in correspondence, without having to tie in analytics.  We explained that many banks in particular are getting a better handle on the way they manage correspondence with their clients to ensure that they leverage their brand, keep messaging consistent, and even stay compliant.  We mentioned the example of the bank we met at the Customer Experience Exchange that found countless examples of letters to customers stating that they "did not meet their standards" (cue the fingernails on the chalkboard).  We do believe, however, that analytics helps you embed much more personalization and power into your messaging, and that most banks will at some point couple analytics to their communication hubs. 

Link to presentation: http://www.linkedin.com/profile/view?id=23220005&trk=nav_responsive_tab_profile
 
 
 

Tuesday, October 22, 2013

ABA 2013: Banks Bounce Back


The ABA 2013 conference in New Orleans has just drawn to a close.  Banks are bouncing back!

 
Most of the attendees were C-level execs from small community banks with assets under $1B, and they were mostly from the Southeast, given that the venue was New Orleans this year.

 
Dodd-Frank is in the midst of being fully implemented, with tighter lending guidelines for QM (Qualified Mortgages) hitting in January 2014; however, the head of the CFPB stated that the guidelines would be eased in gradually, with little to no litigation in the first few months.  The bottom line for the small banks was that if they have a traditionally good track record making loans in their markets with acceptable loss rates, this new legislation should have little or no impact.  The reactions of the CEOs ranged, however, from the activist "we should show up with thousands to march on Washington" to the more resigned "let's not dwell on the regulation but focus on our business." 

 
The attitudes of the CEOs matched almost exactly the geographies from which they hail: the West Coast CEO focusing on a vibrant, young workforce; the East Coast bank implementing new technology, but only to the extent that it would pay off with its specific customer profile.  The CEO of a deep-South bank explained how his bank's branches had been "slabbed" during Katrina, a verb meaning completely demolished save for the cement slab upon which the building used to sit.  But the employees were at work the next day at fold-out tables doing business, ultimately handing out up to $100M total in cash to bank customers needing money to evacuate immediately.  I'm not sure what was more moving: the image of employees at those tables in front of their erstwhile branches, or the news that the bank experienced less than 1% loss from this cash handout despite the complete lack of ID validation.  "How can you force someone to produce ID when you can plainly see that they 'swam' out of their house this morning?"  This, said the CEO, is proof positive that we are still a nation of trustworthy individuals.  This particular bank has expanded rapidly since Katrina, capturing a double-digit increase in market share right after the storm.  How's that for a positive customer experience?

 

 

Wednesday, September 11, 2013

Migrating-in-Place for ECM Consolidation



We recently covered why organizations are moving swiftly to Next Gen Repositories.  No longer are modern organizations satisfied with the old “system of record,” with all of its upkeep, overhead, and user charges.  They want something from which relevant, actionable data can be derived and pulled out by Gen X and Y users as they demand it.  A perfect example of this new “system of engagement” is being delivered as part of the actual migration from the old repository to the new.  Let me explain.

Instead of simply mass-migrating the existing data to the new target system, many are choosing to “Migrate in Place.”  In other words, they leave the existing archive where it is and allow users to continue to pull and view content.  What results is a process by which users dictate the content that is actually required in the new system.  Behind the scenes, the data being actively selected gets migrated dynamically to the new environment.  If you are from the document scanning world, this is similar to “scan on demand,” which left paper archives in place until a document was retrieved and scanned.  Once retrieved, documents live again as digital assets and are ingested into workflow and repository systems with specific retention periods.
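To make the mechanics concrete, here is a minimal sketch (in Python) of what a migrate-on-access layer might look like.  The Document, DictRepository, and MigrateOnAccessProxy names are purely illustrative, not a real product API; in a real deployment the two adapters would wrap your legacy archive and your next-gen repository.

from dataclasses import dataclass, field

@dataclass
class Document:
    content: bytes
    metadata: dict = field(default_factory=dict)

class DictRepository:
    """Stand-in adapter; a real one would wrap a vendor API, CMIS, or a custom connector."""
    def __init__(self, docs=None):
        self._docs = dict(docs or {})

    def get(self, doc_id):
        return self._docs.get(doc_id)

    def ingest(self, doc_id, content, metadata):
        self._docs[doc_id] = Document(content, dict(metadata))

class MigrateOnAccessProxy:
    """Serves user retrievals and migrates documents as a side effect."""
    def __init__(self, legacy, target):
        self.legacy = legacy      # read-only view of the legacy archive
        self.target = target      # the next-gen repository

    def fetch(self, doc_id):
        doc = self.target.get(doc_id)        # already migrated?
        if doc is None:
            doc = self.legacy.get(doc_id)    # fall back to the legacy archive
            if doc is None:
                raise KeyError(f"{doc_id} not found in either system")
            # The act of retrieval migrates the document, so the next
            # request is served entirely from the new system.
            self.target.ingest(doc_id, doc.content, doc.metadata)
        return doc

# Users keep retrieving as they always have; migration happens behind the scenes.
legacy = DictRepository({"POL-123": Document(b"...", {"type": "policy"})})
target = DictRepository()
proxy = MigrateOnAccessProxy(legacy, target)
proxy.fetch("POL-123")          # served from legacy, copied to the new repository
assert target.get("POL-123")    # a second retrieval never touches the legacy system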

One big assumption here is that the organization has decided that keeping the legacy system in place for some period is desirable.  This may be due to factors such as:

·        The legacy system can still be accessed, with or without APIs

·        Legacy system maintenance is no longer required to keep the system accessible

·        Knowledgeable staff are no longer around to properly administer the legacy system (“How can we shut down what we don’t even understand?”)

What happens to data that never gets selected?  It simply ages off, or is migrated based upon specific, configurable requirements.  For example, a life insurance company may have policies in force for the life of the insured (100 years?) plus seven years, so those policy documents would obviously be chosen to move to the new system even if never chosen by user retrievals.  This type of “Migrate Based on Retention” process can be configured and automated.
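A retention-driven sweep can be as simple as a table of rules applied against the legacy index.  The sketch below is an assumption-laden example: the content types, retention periods, and shutdown date are made up for illustration.

from datetime import date

# Hypothetical retention rules: years to keep past each document's trigger date
# (issue date, termination date, statement date, and so on).
RETENTION_YEARS = {
    "life_policy": 100 + 7,   # life of the insured (~100 years) plus seven years
    "statement": 7,
    "marketing": 2,
}

LEGACY_SHUTDOWN = date(2015, 1, 1)   # assumed decommission date for the old archive

def retention_end(doc):
    """Date after which the document may be disposed of."""
    years = RETENTION_YEARS.get(doc["type"], 0)
    trigger = doc["trigger_date"]
    return trigger.replace(year=trigger.year + years)

def must_migrate(doc):
    """Migrate anything whose retention outlives the legacy system."""
    return retention_end(doc) > LEGACY_SHUTDOWN

docs = [
    {"id": "POL-123", "type": "life_policy", "trigger_date": date(2010, 6, 1)},
    {"id": "LTR-456", "type": "marketing",   "trigger_date": date(2011, 3, 1)},
]
print([d["id"] for d in docs if must_migrate(d)])   # ['POL-123']; the marketing letter simply ages off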

One other important note: it’s not necessary to own APIs to the legacy system, thanks to the advent of new access methods such as the CMIS interface (LINK TO INFO ON CMIS), an open access protocol supported by many ECM systems, and to repository adapters, which can be easily built by savvy integrators with or without a CMIS interface.
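As a rough illustration of API-free access, the snippet below reads a document over CMIS using the open-source Apache Chemistry cmislib client.  It assumes the legacy system exposes a CMIS AtomPub endpoint; the URL, credentials, and document path are placeholders.

from cmislib import CmisClient   # pip install cmislib

client = CmisClient('http://legacy-ecm.example.com/cmis/atom', 'user', 'password')
repo = client.defaultRepository

# Pull a document the moment a user asks for it...
doc = repo.getObjectByPath('/Policies/POL-123.pdf')
content = doc.getContentStream().read()

# ...and hand the bytes plus properties (cmis:name, cmis:creationDate, etc.)
# to the new repository's ingest step.
properties = doc.getProperties()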
One of the dangers of any mass data migration is finding entire libraries of data that don’t appear to have any disposition requirements.  “What is this data, and to whom does it belong?”  Worse than that: “Who do we even ask to find out who owns the data?”  Vital to any successful migration is maintaining a regular communication channel with the key user communities that access this data.  Think about setting up a spreadsheet or table with content type, user group, and “go-to” individuals’ contact details so that you can make these important disposition decisions expeditiously.  The cost of this process is not in running it but in deciding what to do with “orphan” data, and in bringing the business and IT together long enough to decide.  At the end of the day, you have far fewer orphan decisions to make when you migrate in place.
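If it helps, that register can literally start life as a spreadsheet exported to CSV; the sketch below (with made-up groups and contacts) shows that even something this simple is enough to route a disposition question or flag a library as orphaned.

import csv

# Illustrative "who owns what" register; content types, groups, and contacts are invented.
REGISTER_CSV = """content_type,user_group,go_to_contact,email
life_policy,Policy Services,J. Smith,jsmith@example.com
statement,Retail Banking,A. Jones,ajones@example.com
"""

owners = {row["content_type"]: row for row in csv.DictReader(REGISTER_CSV.splitlines())}

def disposition_contact(content_type):
    """Return who to ask about a content library, or flag it as orphaned."""
    row = owners.get(content_type)
    return row["go_to_contact"] if row else "ORPHANED - escalate to records management"

print(disposition_contact("life_policy"))   # J. Smith
print(disposition_contact("claims_notes"))  # ORPHANED - escalate to records management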
“Isn’t setting all this up really expensive?” is one obvious question.  The answer is: not really.  The key is deploying a relatively lightweight repository with excellent process flow and decision capabilities.  If done right, you can easily access both the legacy and new systems with zero coding.

Here are the Do’s and Don’ts for Migrating-in-Place:
·        Implement a lightweight repository with excellent process flow capabilities so that decision making can be automated
·        Don’t worry about using or buying APIs to the legacy system; there are other ways to access the content
·        Avoid orphan-document situations by establishing regular touch points with business users
·        Set up simultaneous access to both systems, making it transparent to users