
Archive for the ‘cloud’ Category

The poisoned chalice of #EFSS and cloud shared drives

September 23, 2017

(Original article on LinkedIn June 29, 2017)

“Box can do what network shares do,” claims a recent email campaign for Box Drive.

“Noooooo!!!!!!” echoes the collective scream of #ECM and #InformationGovernance practitioners, who have been trying to wean users away from the nightmare of network shares for the last 25 years.

Just to be clear, my warning is not specifically about Box. Take Box Drive, Google Drive, OneDrive, Dropbox, or any of the recent offerings that define the EFSS market.

The idea of replicating the functionality of shared network drives in a cloud environment is a really, really bad idea.

It propagates silos, lack of organisation, lack of governance standards, lack of consistency, lack of security and, ultimately, loss of control and accountability.

It’s not a technology issue. I know that Box Drive, for example, can offer much richer security and better document management capabilities than standard network shares.

But the users don’t.

And advertising these capabilities as a “better network share” which “allows them to use the same workflows they use today”, reinforces all the bad behaviours that we have been trying to eradicate all these years.

I get it: It makes sense to move your unstructured content from your expensive on-premises storage disks to a managed, scalable, and significantly cheaper cloud alternative, where you don’t have to think about backups and disaster recovery, rack space, air-conditioning, data-centre managers with night shifts, system upgrades, etc. I understand all that.

But taking your existing content mess and moving it wholesale to the cloud is not the right answer. It may be quick and easy, but that doesn’t make it right. You are just delaying the inevitable. If you do want to move your content to the cloud, think VERY carefully about what you are doing and why you are doing it:

  • How do you assess what content you actually have and what risk it carries? (see the sketch after this list)
  • What needs to be preserved and what needs to be thrown away?
  • Who needs access and how will you protect and monitor security and privacy?
  • What do you need to encrypt?
  • How are you going to organise and classify what you are keeping?
  • How will you avoid unnecessary duplication and understand whose version is the right one?
  • How will you teach your users to stop emailing 85MB PowerPoint files to each other for review?
  • How will you teach them to stop downloading GDPR sensitive information into spreadsheets and sharing them out with partners and third parties over email?
  • How will you ensure that when an employee leaves, his cloud drive does not become a black hole for critical business information?
  • How will you apply AI and Analytics across your whole corporate knowledge base, if it’s scattered across thousands of personal silos?
  • Etc., etc., etc.

The list is endless…
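
To make the very first question on that list concrete, here is a minimal sketch of a pre-migration content assessment: a crawl over a network share that counts files by type, totals the volume, and flags content untouched for years. The share path and the five-year staleness threshold are invented for illustration; a real assessment would also look at ownership, duplication and sensitive data.

```python
# Hypothetical pre-migration assessment crawl. The ROOT path and the
# five-year "stale" cutoff are invented for illustration only.
import os
import time
from collections import Counter

ROOT = r"\\fileserver\shared"                 # hypothetical network share
CUTOFF = time.time() - 5 * 365 * 86400        # untouched for ~5 years

by_ext: Counter = Counter()
stale = 0
total_bytes = 0

for dirpath, _, filenames in os.walk(ROOT):
    for name in filenames:
        try:
            st = os.stat(os.path.join(dirpath, name))
        except OSError:
            continue        # unreadable files are themselves a finding
        total_bytes += st.st_size
        by_ext[os.path.splitext(name)[1].lower()] += 1
        if st.st_mtime < CUTOFF:
            stale += 1

print(f"{sum(by_ext.values())} files, {total_bytes / 1e9:.1f} GB, {stale} stale")
print("Top file types:", by_ext.most_common(5))
```

Even a crude inventory like this usually surfaces the ROT (redundant, obsolete, trivial content) that should never make the trip to the cloud.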

You can argue whether “ECM is dead” or whether it should be called “Content Services” or “Intelligent Information”, or whatever. It will not make the problem go away. The reason ECM became a multi-billion software market is that companies realised the risks that network file-shares had created, and the need to add a layer of governance and control, classification, metadata, and automation above the standard uncontrolled “file sharing” that the operating system offered.

Caveat Emptor – Buyer Beware!

Please don’t take us back 25 years by re-creating the same nightmare: taking one of the least disciplined Information Management practices and replicating it in the cloud.

“Appification” Part 2 – What is the impact to ECM?

In my previous post (Part 1), we looked at the appeal of Apps, and why we grew to love them. Now, let’s look specifically at the impact that Apps have on the ECM software industry.

Impact on the ECM Industry

With over 25 years under its belt, the ECM industry (with its Document Management precursor) is a relative dinosaur in enterprise software terms. It was established as an industry at about the same time as ERP, and before CRM, BPM or eCommerce. So, as is the case with any other respectable octogenarian, we are pretty set in our ways. Yes, we may introduce new functionality or attach another technology segment under the ECM moniker every now and then, and we will endlessly debate whether ECM, EIM or Process Services is the right name for it, but fundamentally we are still delivering software the same way we always did.

But change is afoot: Whether we like it or not, the “Appification” culture described in [Part 1] is challenging the fundamentals of how the software market works, including ECM, and how relevant it remains to the enterprise of the future. In Darwinian terms, we will have to either evolve to survive, or face extinction.

There are two main areas where “Appification” has a profound impact on the way we operate today: the way we design products, and the way we take products to market.

Impact on ECM Product Design

“Appification” brings fundamental cultural change to the way software is conceived, designed and delivered. Every core design element is challenged, as well as every classical development and delivery methodology.

  • The “Apple” effect: Apple’s “Design Thinking” principle threw the rulebook away when designing the first iPods and the first iPhones. They became icons of usability where less was more: who would have envisaged an electronic device with no buttons, just a glass slab? Where ALL functionality and behaviour is software controlled? Where accelerometers, proximity sensors, hand gestures and voice commands became the interaction controls, instead of a mouse, a keyboard, switches, levers and knobs? How many enterprise, and in particular ECM, solutions offer similar UI experiences?
  • The “Singularity” principle: For years, enterprise software vendors prided themselves on the functional breadth of their offerings: the more features and capabilities, the better. Apps’ “task-oriented design” challenges that principle: “Do one thing, very well”. What is the most critical element of software design? The user interface and usability? The richness of functionality? The quality of information? Apps are designed to remove complexity, isolate distinct elemental functions, and then deliver these in the most intuitive manner possible. What ECM functionality do you include or exclude from an App? How many Apps would you need to provide a complete “set”?
  • The “EFSS” effect: “Enterprise File Sync & Share” has been one of the most disruptive apps in the ECM field. Even though there are clear overlaps, it did not set out to challenge the traditional ECM vendors as such; it created a completely new market of its own by addressing just four fundamental requirements that traditional ECM couldn’t: free, backed-up storage space in the cloud; content accessibility from everywhere; the ability to share files too large for email, outside the firewall; and transparent synchronisation of content across multiple devices (desktop and mobile). Box, Dropbox, Evernote, OneDrive, Google Drive, Picasa, iCloud, etc., all became a thorn in ECM’s side, because users liked the functionality they enjoyed in their personal space and wanted to bring the same capability to their corporate environment. Now ECM has to step up and deliver that.
  • The “Nightly Update” effect: I have about forty apps on my smartphone, and it seems that at least half of them get updated on a weekly basis. Some, more often than that. Updates just happen, without any involvement from me, without any need for IT support calls, without scheduled downtime, without any need for training. App users are not only looking for the same experience from their Enterprise software, they are also looking for the tools to offer this experience to their own customers. The days of the 18-month deployment cycle are truly over. ECM needs to support similarly fast, agile development cycles and continuous improvement, just as apps do.
  • The “Device” conundrum: Vendors can no longer dictate the device that their software will run on. Apps have completely transformed user expectations around being device agnostic: They expect the same App to behave appropriately, whether they are using it on a smart phone, a tablet, a desktop browser, or their hotel room TV. The name of the game is “Responsive Design”, where apps understand what device they are deployed on and adjust according to the operating system, the device format, the interaction capabilities, the connectivity bandwidth, security, etc. Enterprise software has a long way to go before it’s truly device agnostic, and introducing device independence in large suites of functionality, like ECM, does not come cheap.

Impact to ECM Go-to-Market Strategy

Whilst the ECM engineering teams are grappling with the product design changes described above, Sales and Marketing also need to completely re-think their approach to the market, in order to move into the Apps market space. Here are some of the key decisions they will need to make:

  • Who is our “App” customer? ECM vendors have to consider two distinct App target audiences. One audience consists of the internal Enterprise users, for whom the ECM vendor will have to provide specific apps and UIs, in order to access the repository and its services. The second audience relates to their client organisations’ own customers. The ECM vendor will either have to supply App development SDKs and tooling for the organization to create their own customer-facing apps, or work with channel partners and integrators to deliver specific vertical line-of-business apps on top of their platform.
  • Who is our buyer? IT is no longer the default target buyer for ECM platforms. The Apps culture has created a whole new set of buyers who are empowered to make purchasing decisions outside the constraints of IT procurement. Human Resources, Marketing, Operations, Risk Officers, Compliance, etc., all expect to choose their own tools, just as they choose to download a new app on their smartphones. Talking to these new buyers involves learning a whole new set of vocabularies, and a business-outcome-focused dialogue that does not rely on feature and function details. Few ECM vendors today have the capacity and the vertical domain expertise to conduct these business conversations in a credible way. As a result, developing partner ecosystems with the relevant granular domain expertise has to be a key component of the new go-to-market strategy.
  • How do we license apps? Most ECM vendors have grown up in the era of perpetual, inflexible, buy-once licensing. App users expect significantly more flexible licensing terms, which are mostly subscription based. And, while it’s relatively simple to come up with a subscription-based licensing structure, it will still require fundamental changes around invoicing, revenue recognition, renewals, compensation strategy, etc. On top of that, Apps are not designed for the high-value, low-volume models that Enterprise software was established on. That model will need to be turned on its head: keep ECM components relatively inexpensive, and finance the product through volume sales.
  • Try-before-you-buy? App users are spoilt for choice: they are used to downloading an app, trying it out, testing it and, if it is perceived as adding value, they may decide to license it. ECM vendors need to start offering wider choices if they are going to compete: free trial downloads (the Open Source market has a distinct advantage here); more proofs-of-concept that let users explore the value and complexity in business terms; real live pilots that continue into production or can be safely scrapped; and agile development cycles that allow customers to fail often and fail fast.
  • How do we scale down? Traditional ECM markets are all about scalability: ever-increasing content, ever-increasing user bases, ever-increasing processing capacity. Of course you can buy more capacity, more storage, more throughput. Music to our ears. But that is a one-way street: these systems were never designed to support flexible scaling. The new market models, primarily established by the Cloud providers but also evidenced in Apps, expect the ability to scale up or down on demand. In ECM terms, it’s very unlikely that content volumes will scale up and down (except in the case of major ROT clean-ups, or periodic Records Management dispositions). But scaling processing capacity to accommodate seasonal fluctuations, scaling user numbers to accommodate temporary workers and third parties, and scaling infrastructure to accommodate migrations, testing and consolidations, are all unpredictable usage models. How will ECM vendors split their monolithic “All-in-One” pricing to allow for “Pay-As-You-Use” revenue models? And how will they reconcile traditional investment and R&D budgeting with unpredictable and varying revenue streams?

Adopting an “Appification” strategy

ECM vendors have some tough decisions to make, if they decide to play in the App economy. They need to determine, strategically, what their end-goal is and whether this is a viable market model for them.

  • Just because you can, does not mean you should! That’s the first step: Will vendors really commit to moving into the App market, or will they choose to remain a more “traditional” ECM vendor? There is certainly enough market scope for both, at least for the next few years, but the gap will get bigger and the window of opportunity smaller. To borrow a phrase from a colleague of mine: “You cannot be a little bit pregnant”, when it comes to “Appification”. It’s either all-in or all-out.
  • If vendors choose to follow the “Appification” path, which App game do they want to be in? There are three fundamental variants, and they will have to decide which combination to invest in: App Interfaces requires the vendor to deliver their products to end-users through apps. That typically means providing access to the ECM platform and its functionality through a set of native “App” user interfaces. A lot of ECM vendors already have mobile interfaces to complement their standard desktop UIs. App Solutions fits vendors that want to target specific business areas with line-of-business apps. Few ECM vendors have been able to play that game successfully: it requires deep investment in vertical or horizontal domain expertise; it creates a very complex maintenance model, since they have to keep a large portfolio of Apps in sync with changes to the core platforms; and they are in constant competition with System Integrators and customers’ own IT groups, who think they should have ultimate control of the business users. The final App space is App Tooling: giving the market the services, development environments, integration capabilities and necessary tooling to develop their own Apps on top of a core platform. This targets a more modern, microservices-based, component-based architecture which supports agile development models, but it also creates a whole new ecosystem of buyers, mainly architects and mobile app developers, which does not represent the traditional sales market, or support infrastructure, of ECM vendors.
  • Whichever App game they choose to play, ECM vendors will need to build an ecosystem of App expertise: User Experience designers, solution domain experts, agile developers and project managers, distribution partners, DevOps support, Cloud-first architects, digital marketers, SaaS finance experts, etc. Every segment in the business will need a set of new or complementary skills, and they are not skills that can be learned easily. They will need to be acquired.
  • And the final strategy point is relatively unique to ECM vendors: The information we keep is hardly ever transient. It’s volume-driven, regulatory-controlled, security-sensitive and persistence-critical. None of these characteristics are native to the world of Apps. That makes the separation of functional strategy and Information Management strategy critical: any level of functional innovation provided at the App level has to remain cognisant of the Information architectures it needs to access, reference and maintain.

Bottom line

There is no doubt that “Appification” is affecting not just ECM vendors but the whole of the Enterprise software market. ECM vendors have a choice: Either they will adopt an App strategy that fits their profile, or they will languish in the ever-decreasing pool of “legacy vendors”.

“Appification” is not a project, it’s a fundamentally different way of software life. The problem, however, for all of us in the ECM market, is that it’s a new, scary life, and it requires fundamental changes across the whole of the organisation: cultural, financial, and operational. Every department, from product design to sales and marketing, to HR, to support, needs to take a leap head-first into a massive transformational exercise. How many ECM vendors have the capacity and the investment capital, and are committed enough, to undertake that transformation?

“Appification” Part 1 – Why did we fall in love with Apps?

January 25, 2017

I was privileged once again to participate in AIIM’s Executive Leadership Council in December, and this time the theme was “the ‘Appification’ of the ECM Industry”. Given that “appification” is not yet a word in the Oxford English Dictionary, that was always going to be a challenging discussion! I will leave it to AIIM’s paper (which I will link here once it’s available) to align the different interpretations of the theme from the multiple contributors at the ELC event, but I offer here my own contribution.

Definitions

Let’s start with the basics. The closest I found to a reasonable definition of “Appification” was provided by the IGI Global Dictionary:

“The replacement of Websites and Web pages with programs that run on mobile operating systems and mobile devices. With appification, instead of the Web being a user’s primary user interface, it becomes an underlying service layer for apps, which become the new user interface.”

It’s an OK definition, but it does not go far enough for me. “Apps”, in the form of readily downloadable, simple task applications, mostly on mobile devices, have become a phenomenon that has dramatically impacted buying behaviours in the software market: everyone who has ever owned a smartphone has downloaded an App at some point in time. This is not a phenomenon exclusive to “millennials” or Generation Z; most of us use a smartphone. These same users are looking for similar experiences in their corporate environment.

So, for me, “Appification” looks at the impact of the “App” cultural phenomenon on the software industry and, in the context of this forum, on the ECM software market in particular.

Why did we fall in love with Apps?

It’s difficult to understand the impact of Apps on the enterprise software market without first understanding why they became so universally successful in the consumer market. What were the reasons that the mass population of smartphone and tablet users fell in love with mobile apps?

  • Availability: “There is an App for that” is the defining slogan of the App generation. With over 2.5 million apps to choose from on each of the main platforms (iOS/Android), users are spoilt for choice. A simple search and one button give you access to exactly the functionality you need.
  • Portability: We carry Apps with us all the time. From the handy units conversion app, to our banking services, to maps and GPS, to our digital darkroom. Everything is readily available wherever we happen to be.
  • Self-service: We don’t need to ask permission from anyone, especially from IT, to install a new app. We just do. We don’t need any special skills, we don’t need training programs, we don’t need elaborate configurations. 30 seconds later, it just works.
  • Price: We also don’t need permission from anyone, to spend $1.50 to buy an app, let alone install a free one. Not even our spouses would bat an eyelid at the typical App price. Contrast that with a typical IT budgeting and procurement cycle for enterprise software.
  • Usability: App designers thrive on usability. The fact that most apps need no training, and intuitively deliver value through an interface that is constantly improved, has dramatically challenged traditional software design by putting the user right at the centre of the design.
  • “Ghost” contracts: When was the last time you read the terms & conditions of an App? We are so used to just clicking the “Accept” button, that the small print has completely disappeared from the App experience. Press the install button and use it. Acceptance of contractual terms & conditions is implicit!
  • Provider vetting: There is an underlying assumption that when we download an App, someone has vetted that app for security and malicious code. Rightly or wrongly, we very rarely agonise about installing a new app on our phone or tablet. We just assume that it will mostly play nicely with the other apps on our device, and that it will not suddenly take over the device to cause World War III on our behalf.
  • Continuous improvement: The overnight, unsupervised software update. Unlike enterprise software, it just happens and we mostly let it. No planned downtime, no regression planning, no trial runs. New features just appear on our little screens and we (usually) welcome them.
  • Device proliferation: “I want it on my desktop / web / tablet / iPhone / Android / Xbox / TV / Fridge”. Apps are ubiquitous. Chances are that the app which securely holds all your passwords and bank details, synchronises them between your iPad, Android phone and your desktop. And you have instant access to your banking app from all three. And when you run BBC iPlayer or Netflix, you expect to continue your movie where you left off, even if you are watching it on your brand new refrigerator.

Appifying your fridge…

I don’t think anybody would suggest that these are the defining characteristics of the average Enterprise software suite. Enterprise software, including ECM, fails on each and every one of these aspects. It just can’t deliver this experience today. We tend to attribute a lot to the “millennial” generation and the expectations they bring to the corporate environment, but it’s probably fair to assume that all of us would like to enjoy this “App” experience in our working environment.

Following soon: “Appification” Part 2 – How do Apps influence the ECM market?

IOS – A comet of Jurassic proportions

Every so often, an idea comes along that stops you in your tracks.

Innovation is happening at the speed of light all around us, but most of the time it consists only of incremental, evolutionary thinking, which takes us a little bit further in the same direction we were going all along. We have become fairly blasé about innovation.

And then you spot something that makes you sit up, pay attention, change direction, and re-think everything. I had one of these moments a few weeks back.

The name “EpyDoc” will probably mean nothing to most of you. Even looking at their existing website, I would have dismissed it as a second or third-rate Document Management wannabe. Yet EpyDoc is launching a new concept in April that potentially re-defines the whole Data / Content / Information / Process Management industry as we know it today. You know what happens when you mix comets and dinosaurs? It is that revolutionary.

I have lost track of the number of times over the years that I’ve moaned about the constraints that our current infrastructure is imposing on us:

  • The arbitrary segregation of structured and unstructured information [here]
  • The inherent synergy of Content and Process management [here]
  • The content granularity that stops at the file level [here]
  • The security models that protect the container rather than the information [here]
  • The lack of governance and lifecycle management of all information, not just records [here]
  • The impossibility of defining and predicting information value [here]

…etc. The list goes on. EpyDoc’s “Information Operating System” (a grand, but totally appropriate title) seeks to remove all of these barriers by re-thinking the way we manage information today. Not in small incremental steps, but in a giant leap.

Their approach is so fundamentally different that I would not do it justice by trying to summarise it here. And if I’m honest, I am still discovering more details behind it. But if you are interested in having a taste of what the future of information management might look like in 5-10 years, I would urge you to read this 10-segment blog set which sets the scene, and let me know your thoughts.

And if, while you are reading through, you are, like me, sceptical about the applicability or commercial viability of this approach, I will leave you with a quote that I saw this morning on the tube:

“The horse is here to stay but the automobile is only a novelty – a fad”
(President of the Michigan Savings Bank, 1903)

 

P.S. Before my pedant friends start correcting me: I know that dinosaurs became extinct at the end of the Cretaceous period, not the Jurassic… 😉

No more “On-premise vs SaaS”, please!

Call me OCD, if you want, or a pedant: Am I the only one annoyed by the “On-premise vs SaaS” question? It makes as much sense as asking “Indoors vs Credit card”.

On-Premise is an architectural deployment decision (on-premise vs. cloud). It defines where your software will physically be deployed, and the access and connectivity options available to you. It is a decision that has to be taken in the context of the rest of your enterprise architecture landscape and your long-term design strategies.

Software-as-a-Service (SaaS) is a licensing model and, if you are comparing it with anything, it would be with Perpetual Licensing, which has been the traditional IT licensing model for many years. This relates to how you are going to pay for using the solution: pay a large sum (usually) up front as a capital expense (CapEx), and you own a perpetual license to access the solution forever. It is then your choice whether you also pay an annual support fee as an operational expense (OpEx). But the license to use the software is yours forever. Alternatively, with SaaS, you pay a much smaller amount per user, per month (all OpEx), which flexes as your requirements change. In the SaaS model you don’t actually own any licenses; you are effectively “paying rent”, only for as long as you are using the solution. This has nothing to do with where the solution is physically deployed.
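
To see the difference in cash-flow terms, here is a toy comparison of the two models; every number below is invented purely for illustration.

```python
# Toy cost comparison: perpetual (CapEx + support OpEx) vs SaaS (all OpEx).
# All figures are invented for illustration only.
perpetual_license = 100_000      # one-off CapEx
annual_support = 20_000          # optional yearly support fee (OpEx)
saas_per_user_month = 25         # subscription price (OpEx)
users = 100

for year in range(1, 6):
    perpetual_total = perpetual_license + annual_support * year
    saas_total = saas_per_user_month * users * 12 * year
    print(f"Year {year}: perpetual = ${perpetual_total:,}, SaaS = ${saas_total:,}")
```

Depending on the numbers, either model can win over a given horizon, which is exactly why this is a commercial decision, not a deployment one.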

And just to confuse the definitions even further, SaaS is also sometimes used to refer to the responsibility for administering the systems and supporting the solution. Typically, in a perpetually licensed environment, the license owner is responsible for the administration of the solution (or a third-party, if Application Management has been outsourced). In the SaaS model, the administration burden typically lies with the solution provider, not the organisation paying for the services.

The confusion has come about because, most commonly, perpetually licensed software tends to be deployed on-premise and managed by the license owner, whereas SaaS software tends to be deployed in the cloud and administered by the service provider. But it does not have to be that way: theoretically at least, there is nothing stopping you from deploying your perpetually licensed software on a private cloud instead of on-premise. There is also nothing stopping you from negotiating a SaaS payment model with your software vendor, even if the software is deployed on-premise.

So the question of “On-Premise vs SaaS” usually implies: “on-premise, perpetually licensed, self-administered VS cloud-hosted, pay-as-you-use, provider-managed”.

And I’m not even going to start talking about what this implies for private vs public vs hybrid clouds and Single instance vs Multi-tenant architectures, which are also often lumped under the “SaaS” moniker, even though they have nothing to do with SaaS.

I know the differences are semantic but, as Information Management professionals, we have a duty to be clear about the terminology we use. Our clients have more than enough to be confused about, we don’t need to make it any worse.

P.S. As my good friend and fellow pedant, Chris Walker reminded me, the correct term is “On-Premises” not “On-Premise”. He is right of course. There is no excuse for bad English either! 🙂

When, not if, the EFSS market dies

Unless you have spent the last couple of years under a rock, you will have come across EFSS as the latest and greatest fad to hit the ECM and collaboration market. Discussions on EFSS abound amongst the ECM and Social Collaboration blogs.

Analysts legitimised EFSS as a separate technology marketspace: Forrester published The Forrester Wave™: File Sync And Share Platforms at the end of 2013, followed by Gartner’s Magic Quadrant for Enterprise File Synchronisation and Sharing (EFSS) in July 2014. They define EFSS as products that allow secure file synchronisation, access and sharing across diverse devices, and position vendors like Box, Citrix, EMC, IBM and Accellion as leaders, with Microsoft, Dropbox, Google, Apple and others as challengers.

The EFSS market is already a dying market

Alas! All is not well in the state of Denmark: the EFSS market is not going to be with us for long as a separate market segment. Don’t get me wrong, EFSS functionality has been around for years and will continue to be around for many more years to come. But its product transition from niche, to mainstream, to commodity will be very fast.

Secure sharing of files, small and large, has been around for ages in the form of the mature MFT (Managed File Transfer) market, which is used extensively by large financial organisations, Engineering firms, etc. On the flip side, on-line/off-line synchronisation of files across devices has also been around for a long time, used in both ECM and Collaboration platforms. What has changed, which brought EFSS to the fore, is that (a) SaaS and cloud have added an additional layer of accessibility and (b) companies like Box and Dropbox stepped in to fill a gap in the market by providing easily consumable, standalone products that consumers can buy without involving IT. Adopting a Freemium licensing model helped too.

Move forward a couple of years to today, and numerous major vendors across multiple technology sectors offer EFSS products: IBM ECM, OpenText, VMware, Oracle, Microsoft, Salesforce.com, etc. IBM alone markets at least four different EFSS products that I’m aware of.

I wouldn’t be surprised if there are even more, disguised and embedded into other platforms such as Asset Management.

And therein lies the problem. If all of these vendors, from different disciplines, are offering either embedded or explicit EFSS capabilities within their core product licensing, it means that the EFSS market is already commoditised. Enterprises will not invest in dedicated EFSS products or licenses, when they can have comparable functionality for free within their existing investments.

Interestingly, Gartner’s own Hype Cycle for Digital Workplace Software, which was published in the same month as their MQ paper, already positions EFSS in the “Trough of Disillusionment”, which creates an interesting contradiction. IDC, in their Worldwide File Synchronization and Sharing 2014–2018 Forecast and 2013 Vendor Shares report, also agree that EFSS is a rapidly commoditising market, although they predict that the market will continue to grow in revenue.

There’s another, perhaps even more important, reason why EFSS is not a sustainable market: as BYOD and platform-agnostic applications develop, the core principle behind EFSS – the need to share and move content transparently and securely – becomes fundamental to too many different business functions. Companies cannot afford to have multiple and conflicting EFSS tools. EFSS does not lend itself to multiplicity: sooner or later CIOs will need to converge on a single common EFSS platform shared by all employees, otherwise it serves very little purpose, the relative cost of ownership becomes extravagant, and the security risk unmanageable. And that means that unified standards and common protocols for EFSS will prevail. I don’t know yet whose standards – that battle is yet to be fought – but a fearsome battle it will be.

Where next for EFSS?

My prediction is that within 2-3 years, the EFSS market will be completely subsumed into one or more other technology segments. If I were a gambling man (I’m not), my money would be on the Collaboration (aka Digital Workplace) platform becoming the natural “home” for EFSS functionality. At the end of the day, EFSS is primarily a catalyst for exchanging information within the organisation and with third parties. In other words, collaborating.

In an ideal world however, I personally would like to see EFSS become (together with most other collaboration platform features) a native feature of the Operating System’s file system, unified across different O/S platforms. But maybe that’s just wishful thinking!

What does that mean for independent EFSS vendors? They have a very short window of opportunity in which they will have to either transform into a bigger platform (e.g. become ECM or Collaboration vendors), get acquired and assimilated (into a bigger platform vendor, perhaps CRM) or get out (i.e. change technology focus). EFSS vendors without a 3-year exit strategy will just disappear. Today, pure-play EFSS vendors enjoy an undeniably large market share. That’s because the product marketing teams of established B2B Enterprise Software vendors have been asleep and missed the consumer calling. These vendors are now paying attention, and the clock is ticking. Watch this space…

CMaaS – Content Management as a Service

I haven’t written much about cloud because, frankly, I don’t think it’s as revolutionary as people think, and because the demand for it has been largely vendor-induced. Whatever you think about cloud, however, it is here, it is a driving force, and it will continue to be a conversation topic for a while.

I wrote in a previous article (Cloud and SaaS for dummies) that cloud is like a train: someone else has to maintain it and make sure it is there on time; all you have to do is buy a ticket and hop on it when you need it. At least that’s the oversimplified theory… For Content Management, however, the reality is a bit different: when you get on the train, you don’t carry your bookcase, your briefcase and your children’s photo albums with you, and you certainly don’t leave them there expecting them to be available and intact next time you hop on the train. You take the train to go from A to B, and you keep your personal belongings with you.

The train analogy works well for Software as a Service (SaaS) cloud models, but not for Content.

The financial argument for SaaS is compelling: buying software capabilities on demand moves the financial needle from CapEx to OpEx; the total cost of ownership reduces, as the burden of support costs and administration skills shifts to the provider; technology refresh secures ubiquitous access; and economies of scale dramatically reduce infrastructure costs.

Microsoft, Google, Apple, Box, Dropbox and every other ECM and Collaboration vendor are offering content storage in the cloud – often free – to entice you to move your content off your premises, or off your personal laptop, to a happier, more abundant and more resilient place, which is all good and worthwhile. What isn’t good is the assumption that providing storage in the cloud (or, as I’ve seen it incorrectly called recently, “CaaS – Content as a Service”) is the same as providing Content Management in the cloud. It is not!

We (the ECM industry) have fought for years to establish the idea that managing content goes a lot further than just storing documents in a file system. It requires control: security, versions, asynchronous editing, metadata, taxonomies, retention, integration, immutable flags, workflow, etc. Unfortunately, the new fad of EFSS (Enterprise File Synching and Sharing) systems is turning the clock back: standalone EFSS environments are just another way for users to bypass IT and Security controls (Chris Walker articulates this very well in his article You’re out of your mind).

Now, before you jump down my throat and tell me that EFSS came about exactly because of the straitjacket that compliance, governance and ECM have put organisations in, let me say, “I know!”. I’ve lived and breathed this industry since it was born, so I understand the issues. However, we (ECM and IG practitioners) risk throwing out the baby with the bathwater: ignoring EFSS and all other external file-sharing mechanisms is dangerous, at best. Blocking them is impractical and unenforceable. Institutionalising them (as Chris suggests) adds a layer of governance over them, but it does not solve the conflict with the need for secure internal repositories and regulatory control.

So, what if you could have your cake and eat it too? Here’s a revolutionary idea: instead of accepting EFSS as an externally imposed inevitability, why not embrace EFSS within the ECM environment? An ECM repository that provides the full ECM control environment we know and love, while also keeping content synchronised across all your mobile and desktop devices, so that you can work wherever you happen to be.

I try to stay impartial on my blog and refrain from plugging IBM products, but in this case I cannot avoid the inevitable: IBM Content Navigator offers this today (I don’t doubt that other ECM vendors are or will be offering it soon).

What we are starting to see is the evolution of proper “Content Management as a Service” (CMaaS): not only storing content in the cloud and retrieving or sharing it, but the complete ECM capability, including sync & share, offered as a cloud-based, on-demand, scalable and secure service.

Why should organisations settle for either an on-premise heavy-weight ECM platform, or a light-weight low-compliance cloud-based sharing platform, when they can combine both?


Cloud and SaaS for dummies…

I had to explain Cloud and SaaS to a (non-IT) friend recently. It had to be quick and simple…

On-premise/Licensed: You buy a car and you drive it to work whenever you want. You pay for insurance, service, MOT, tyres and petrol. You can tweak it or add “go faster” stripes if you like. If it breaks down, you pay to have it fixed.

Cloud: The government buys a train and pays for its maintenance. You hop on it when you need it and pay for a ticket. If you are going to use it regularly, you buy an annual pass. If the train breaks down, the company sends another one to pick you up and refunds your ticket.

Hybrid: You drive your own car to the station and then take a train to work.

Simple enough?

2020 and beyond… The mother of all Information Management predictions

January 30, 2014

I’ve been wanting to write this article for a while, but I thought it would be best to wait for the deluge of 2014 New Year predictions to settle down, before I try and look a little bit further over the horizon.

The six predictions I discuss here are personal, do not have a specific timescale, and are certainly not based on any scientific method. What they are based on, is a strong gut feel and thirty years of observing change in the Information Management industry.

Some of these predictions are more fundamental than others. Some will have immediate impact (1-3 years), some will have longer term repercussions (10+ years). In the past, I have been very good at predicting what is going to happen, but really bad at estimating when it’s going to happen. I tend to overestimate the speed at which our market moves. So here goes…

Behaviour is the new currency

Forget what you’ve heard about “information being the new currency”, that is old hat. We have been trading in information, in its raw form, for years. Extracting meaningful value from this information, however, has always been hard, repetitive, expensive and most often a hit-or-miss operation. I predict that with the advance of analytics capabilities (see Watson Cognitive), raw information will have little trading value. Information will be traded already analysed, and nowhere more so than in the area of customer behaviour. Understanding of lifestyle models, spending patterns and decision-making behaviour will become the new currency exchanged between suppliers. Not the basic, high-level, over-simplified demographic segmentation that we use today, but a deep behavioural understanding of individual consumers that will allow real-time, predictive and personal targeting. Most of the information is already being captured today, so it’s a question of refining the psychological, sociological and commercial models around it. Think of it this way: How come Google and Amazon know (instantly!) more about my on-line interactions with a particular retailer than the retailer’s own customer service call centre does? Does the frequency of logging into online banking indicate that I am very diligent in managing my finances, or that I am in financial trouble? Does my Facebook status reflect my frustration with my job, or my euphoric pride in my daughter’s achievement? How will that determine whether I decide to buy that other lens I have been looking at for my camera, or not? Scary as the prospect may be from a personal privacy perspective, most of that information is in the public domain already. What is the digested form of that information worth to a retailer?

Security models will turn inside out

Today, most security systems, algorithms and analysis are focused on the device and its environment. Be it the network, the laptop, the smartphone or the ECM system, security models are there to protect the container, not the content. This has not only become a cat-and-mouse game between fraudsters and security vendors, it is also becoming virtually impossible to enforce at enterprise IT level. With BYOD, a proliferation of passwords and authentication systems, cloud file-sharing, and social media, users are opening up security holes faster than the IT department can close them. Information leakage is an inevitable consequence. I can foresee the whole information security model turning on its head: if the appropriate security becomes deeply embedded inside the information itself (down to the file, paragraph or even individual word level), we will start seeing self-describing and self-protecting granular information that is only accessible to an authenticated individual, regardless of whether that information is in a repository, on a file system, in the cloud, at rest or in transit. Security protection will become device-agnostic and infrastructure-agnostic. It will become a negotiating handshake between the information itself and the individual accessing that information, at a particular point in time.

Oh, and while we are assigning security at this granular self-contained level, we might as well transfer retention and classification to the same level as well.
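
As a thought experiment, here is a minimal sketch of what such self-protecting information might look like: the content travels with its own embedded policy, and decryption only happens after the “handshake” succeeds. The policy fields and the key-escrow comment are my own invention, not a description of any existing product.

```python
# Hypothetical sketch: a self-describing, self-protecting content envelope.
# The policy vocabulary is invented; a real system would escrow the key
# with a rights service and release it only after the handshake succeeds.
from cryptography.fernet import Fernet

def seal(content: bytes, policy: dict) -> tuple:
    """Encrypt content and embed its access/retention policy alongside it."""
    key = Fernet.generate_key()
    envelope = {
        "policy": policy,                       # travels with the information
        "ciphertext": Fernet(key).encrypt(content),
    }
    return envelope, key

def unseal(envelope: dict, key: bytes, user: str) -> bytes:
    """The 'negotiating handshake': the embedded policy is evaluated first."""
    if user not in envelope["policy"]["allowed_users"]:
        raise PermissionError(f"{user} is not authorised for this content")
    return Fernet(key).decrypt(envelope["ciphertext"])

envelope, key = seal(b"Q3 board minutes", {
    "allowed_users": ["alice"],
    "retention": "7y",                  # retention carried at the same level
    "classification": "confidential",
})
print(unseal(envelope, key, "alice"))   # b'Q3 board minutes'
```

The point of the sketch is that the repository, file system or network carrying the envelope is irrelevant: the policy check happens wherever the information happens to be.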

The File is dead

In a way, this prediction follows on from the previous one, and it is also a prerequisite for it. It is also a topic I have discussed before [Is it a record, who cares?]. Information Management, and in particular Content Management, has long been constrained by the notion of the digital file. The file has always been the singular granular entity at which security, classification, version control, transportation, retention and all other governance stops. Even relational databases ultimately live in files, because that’s what Operating Systems have to manage. However, information granularity does not stop at the file level. There is structure within files, and a lot of information lives outside the realm of files (particularly in social media and streams). If Information Management is a living organism (and I believe it is), then files are its organs. But each organ has cells, each cell has molecules, and there are atoms within those molecules. I believe that innovation in Information Management will grow exponentially the moment we stop looking at managing files and start looking at elementary information entities or segments at a much more granular level. That will allow security to be embedded at a logical information level; value to grow exponentially through intelligent re-use; storage costs to be reduced dramatically through entity-level de-duplication; and analytics to explode through much faster and more intelligent classification. The file is an arbitrary container that creates bottlenecks, unnecessary restrictions and a very coarse level of granularity. Death to the file!
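
A hedged sketch of what that granularity could look like as a data model: each “information entity” carries its own security, retention and classification, and the file becomes a mere aggregation. All names below are invented for illustration.

```python
# Hypothetical sub-file "information entities": governance attributes live
# on each entity, not on the file. Field names are invented for illustration.
from dataclasses import dataclass, field

@dataclass
class InformationEntity:
    entity_id: str
    content: bytes
    classification: str                  # e.g. "public", "confidential"
    retention: str                       # e.g. "7y", "permanent"
    allowed_users: set = field(default_factory=set)

@dataclass
class Document:
    """A 'file' becomes just a loose aggregation of governed entities."""
    doc_id: str
    entities: list = field(default_factory=list)

    def visible_to(self, user: str) -> list:
        # Access is evaluated per entity, not per file.
        return [e for e in self.entities if user in e.allowed_users]

doc = Document("contract-42", [
    InformationEntity("p1", b"Standard terms", "public", "7y", {"alice", "bob"}),
    InformationEntity("p2", b"Pricing schedule", "confidential", "10y", {"alice"}),
])
print([e.entity_id for e in doc.visible_to("bob")])   # -> ['p1']
```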

BYOD is just a temporary aberration

BYOD is just a transitional phase we’re going through today. The notion of bringing ANY device to work is already becoming outdated. “Bring Work to Your Device” would have been a more appropriate phrase, but then BWYD is a really terrible acronym. Today, I can access most of the information I need for my work, through mobile apps and web browsers. That means I can potentially use smart phones, tablets, the browser on my smart television, or the Wii console at home, or my son’s PSP game device to access work information. As soon as I buy a new camera with Android on it, I will also be able to access work on my camera. Or my car’s GPS screen. Or my fridge. Are IT organisations going to provide BYOD policies for all these devices where I will have to commit, for example, that “if I am using that device for work I shall not allow any other person, including family members, to access that device”? I don’t think so. The notion of BYOD is already becoming irrelevant. It is time to accept that work is no longer tied to ANY device and that work could potentially be accessed on EVERY device. And that is another reason, why information security and governance should be applied to the information, not to the device. The form of the device is irrelevant, and there will never be a 1:1 relationship between work and devices again.

It’s not your cloud, it’s everyone’s cloud

Cloud storage is a reality, but sharing cloud-level resources is yet to come. All we have achieved is to move information storage outside the data centre. Think of this very simple example: let’s say I subscribe to Gartner, or AIIM, and I have just downloaded a new report or white paper to read. I find it interesting and I share it with some colleagues and (if I have the right to) with some customers, through email. There is every probability that I have created a dozen instances of that report, most of which will end up being stored or backed up in a cloud service somewhere. Quite likely on the same infrastructure from which I downloaded the original paper. And so will many others who have downloaded the same paper. This is madness! Yes, it’s true that I should have been sending out the link to that paper instead but, frankly, that would force everyone to create accounts, etc., and it’s so much easier to attach it to an email, and I’m too busy. Now, turn this scenario on its head: what if the cloud infrastructure itself could recognise that the original of that white paper is already available on the cloud, and transparently maintain the referential integrity, security and audit trail of a link to the original? This is effectively cloud-level, internet-wide de-duplication. Resource sharing. Combine this with the information granularity mentioned above, and you have massive storage reduction, cloud capacity increase, simpler big-data analytics and an enormous amount of statistical audit-trail material available to analyse user behaviour and information value.
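
In essence, this is content-addressable storage. A minimal sketch of the idea, with the store and audit-trail structures invented for illustration:

```python
# Cloud-level de-duplication via content addressing: each unique blob is
# stored once, keyed by its hash, and everything else is a reference.
import hashlib

BLOB_STORE: dict = {}      # content hash -> single stored copy
AUDIT_TRAIL: list = []     # (user, content hash) reference records

def put(user: str, blob: bytes) -> str:
    """Return a reference; identical content is never stored twice."""
    digest = hashlib.sha256(blob).hexdigest()
    BLOB_STORE.setdefault(digest, blob)
    AUDIT_TRAIL.append((user, digest))
    return digest

report = b"AIIM white paper, 2014 edition"
ref1 = put("alice", report)    # the original download
ref2 = put("bob", report)      # the 'emailed copy' resolves to the same blob
assert ref1 == ref2 and len(BLOB_STORE) == 1
print(f"{len(AUDIT_TRAIL)} references, {len(BLOB_STORE)} stored copy")
```

The audit trail falls out for free: every reference to the blob is recorded, which is precisely the statistical material mentioned above.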

The IT organisation becomes irrelevant

The IT organisation as we know it today is arguably the most critical function, and the single largest investment drain, in most organisations. You don’t have to go far to see examples of the criticality of the IT function and the dependency of an organisation on IT service levels. Just look at the recent impact that simple IT malfunctions have had on banking operations in the UK [Lloyds Group apologies for IT glitch]. My prediction, however, is that this mega-critical organisation called IT will collapse in the next few years. A large IT group – as a function, whether it’s outsourced or not – is becoming an irrelevant anachronism, and here’s why: 1) IT no longer controls the end-user infrastructure; that battle is already lost to BYOD. The procurement, deployment and disposition of user assets is no longer an IT function; it has moved to the individual users, who have become a lot more tech-savvy and self-reliant than they were 10 or 20 years ago. 2) IT no longer controls the server infrastructure: with the move to cloud and SaaS (or its many variants: IaaS, PaaS, etc.), keeping the lights on, the servers cool, the backups running and the cables networked will soon cease to be a function of the IT organisation too. 3) IT no longer controls the application infrastructure: business functions are buying capabilities directly at the solution level, often as apps, and these departments are maintaining their own relationships with IT vendors. CMOs, CHROs, CSOs, etc. are the new IT buyers. So, what’s left for the traditional IT organisation to do? Very little else. I can foresee IT becoming an ancillary coordinating function and a governance body. Its role will be to advise the business and define policy, and maybe manage some of the vendor relationships. Very much like the role that the Compliance department, or Procurement, has today, and certainly not wielding the power and the budget that it currently holds. That is actually good news for Information Management! Not because IT is an inhibitor today, but because the responsibility for Information Management will finally move to the business, where it always belonged. That move, in turn, will fuel new IT innovation that is driven directly by business need, without the interim “filter” that IT groups inevitably create today. It will also have a significant impact on the operational side of the business, since groups will have more immediate and agile access to new IT capabilities, enabling them to service new business models much faster than they can today.

Personally, I would like all of these predictions to come true today. I don’t have a magic wand, and therefore they won’t. But I do believe that some, if not all, of these are inevitable, and it’s only a question of time and priority before the landscape of Information Management, as we know it today, is fundamentally transformed. And I believe that this inevitable transformation will help to accelerate both innovation and value.

I’m curious to know your views on this. Do you think these predictions are reasonable, or are they a lot of wishful thinking? If you agree with me, how soon do you think they can become a reality? What would stop them? And what other fundamental changes could be triggered as a result?

I’m looking forward to the debate!

A clouded view of Records and Auto-Classification

When you see Lawrence Hart (@piewords), Christian Walker (@chris_p_walker) and Cheryl McKinnon (@CherylMcKinnon) involved in a debate on Records Management, you know it’s time to pay attention! 🙂

This morning, I was reading Lawrence’s blog titled “Does Records Management Give Content Management a Bad Name?”, which picks up on one of the points in Cheryl’s article “It’s a Digital-First World: Five Trends Reshaping Records Management As You Know It”, with some very insightful comments added by Christian. I started leaving a comment under Lawrence’s blog (which I will still do, pointing back to this) but there were too many points I wanted to add to the debate and it was becoming too long…

So, here is my take:

First of all, I want to move away from the myth that RM is a single requirement. Organisations look to RM tools as the digital equivalent of a Swiss Army knife, to address multiple requirements:

  • Classification – Often, the RM repository is the only definitive Information Management taxonomy managed by the organisation. Ironically, it mostly reflects the taxonomy needed by retention management, not by the operational side of the business. Trying to design a taxonomy that serves both masters leads to the huge granularity issues that Lawrence refers to.
  • Declaration – A conscious decision to determine what is a business record and what is not. This is where both the workflow integration and the auto-classification have a role to play, and where in an ideal world we should try to remove the onus of that decision from the hands of the end-user. More on that point later…
  • Retention management – This is the information governance side of the house. The need to preserve the records for the duration that they must legally be retained, move them to the most cost-effective storage medium based on their business value, and actively dispose of them when there is no regulatory or legal reason to retain them any longer.
  • Security & auditability – RM systems are expected to be a “safe pair of hands”. In the old world of paper records management, once you entrusted your important and valuable documents to the records department, you knew that they were safe. They would be preserved and looked after until you ask for them. Digital RM is no different: It needs to provide a safe-haven for important information, guaranteeing its integrity, security, authenticity and availability. Supported by a full audit trail that can withstand legal scrutiny.

Auto-categorisation, or auto-classification, relates to both the first and the second of these requirements: Classification (using linguistic, lexical and semantic analysis to identify what type of document it is, and where it should fit into the taxonomy) and Declaration (deciding if this is a business document worthy of declaration as a record). Auto-classification is not new; it has been available both as a standalone product and integrated within email and records capture systems for several years. But its adoption has been slow, not for technological reasons, but because, culturally, both compliance and legal departments are reluctant to accept that a machine can be good enough to be allowed to make this type of decision. And even though numerous studies have proven that machine-based classification can be far more accurate and consistent than a room full of paralegals reading each document, it will take a while before the cultural barriers are lifted. Ironically, much of the recent resurgence and acceptance of auto-classification is coming from the legal field itself, where the “assisted review” or “predictive coding” (just a form of auto-classification to you and me) wars between eDiscovery vendors have brought the technology to the fore, with judges finally endorsing its credibility [Magistrate Judge Peck in Moore v. Publicis Groupe & MSL Group, 287 F.R.D. 182 (S.D.N.Y. 2012), approving use of predictive coding in a case involving over 3 million e-mails].
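
For illustration only, here is a toy version of that kind of lexical classification, using an off-the-shelf scikit-learn pipeline. The training snippets are invented, and real systems use far richer linguistic and semantic features than bag-of-words.

```python
# Toy lexical auto-classification: learn document types from labelled
# examples. Training data is invented for illustration only.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

train_docs = [
    "invoice payment due net 30 total amount",
    "please find attached our invoice for services rendered",
    "employment agreement between the company and the employee",
    "this contract sets out the terms and conditions of supply",
]
train_labels = ["invoice", "invoice", "contract", "contract"]

classifier = make_pipeline(TfidfVectorizer(), MultinomialNB())
classifier.fit(train_docs, train_labels)

print(classifier.predict(["attached invoice, payment due on receipt"]))
# -> ['invoice']  (a candidate node in the taxonomy, not yet a declaration)
```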

The point that Christian Walker makes in his comments, however, is very important: auto-classification can help, but it is not the only, or even the primary, mechanism available for auto-declaration. They are not the same thing. Taking the records declaration process away from the end-user requires more than understanding the type of document and its place in a hierarchical taxonomy. It needs the business context around the document, and that comes from the process. A simple example to illustrate this would be a document containing a pricing quotation. Auto-classification can identify what it is, but not whether it has been sent to a client or formed part of a contract negotiation. It’s that latter contextual fact that makes it a business record. Auto-declaration from within a line-of-business application or a process management system is easy: you already know what the document is (whether it has been received externally, or created as part of the process), you know who it relates to (client id, case, process) and you know what stage in its lifecycle it is at (draft, approved, negotiated, signed, etc.). These give enough definitive context to accurately identify and declare a record, without the need to involve the users or resort to auto-classification or any other heuristic decision. That’s assuming, of course, that there is an integration between the LoB/process and the RM system, to allow that declaration to take place automatically.
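
A minimal sketch of that contrast: auto-declaration driven purely by process context, with no text analysis at all. The field names and rules below are invented for illustration.

```python
# Hypothetical context-driven auto-declaration: the process context, not
# the document text, decides whether a document becomes a record.
from dataclasses import dataclass
from typing import Optional

@dataclass
class ProcessContext:
    doc_type: str                 # known from the LoB application
    client_id: Optional[str]      # who it relates to
    lifecycle_state: str          # e.g. "draft", "sent", "signed"

def should_declare(ctx: ProcessContext) -> bool:
    """A quotation becomes a record once it has actually been sent to a client."""
    if ctx.doc_type == "quotation":
        return ctx.lifecycle_state == "sent" and ctx.client_id is not None
    if ctx.doc_type == "contract":
        return ctx.lifecycle_state == "signed"
    return False

print(should_declare(ProcessContext("quotation", "C-1001", "draft")))  # False
print(should_declare(ProcessContext("quotation", "C-1001", "sent")))   # True
```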

The next point I want to pick up is the issue of Cloud. I think cloud is a red herring in this conversation. Cloud should be an architecture/infrastructure and procurement/licensing decision, not a functional one. Most large ECM/RM vendors can offer similar functionality hosted on- and off-premises, and offer SaaS payment terms rather than perpetual licensing. The cloud conversation around RM, however, gets into its own sticky mess when you start looking at guaranteeing location-specific storage (a critical issue for a lot of European data protection and privacy regulation) and at the integration between on-premise and off-premise systems (as in the examples of auto-declaration above). I don’t believe that auto-classification is a significant factor in the cloud decision-making process.

Finally, I wanted to bring another element to this discussion. There is another disruptive RM trend that is not explicit in Cheryl’s article (but it fits under point #1), and it addresses the third RM requirement above: “in-place” retention management. If you extract the retention schedule management from the RM tool and architect it at a higher logical level, then retention and disposition can be orchestrated across multiple RM repositories, applications, collaboration environments and even file systems, without the need to relocate the content into a dedicated, traditional RM environment. It’s early days (and probably a step too far, culturally, for most RM practitioners) but the huge volumes of currently unmanaged information are becoming a key driver for this approach. We had some interesting discussions at the IRMS conference this year (triggered partly by IBM’s recent acquisition of StoredIQ, into their Information Lifecycle Governance portfolio) and James Lappin (@JamesLappin) covered the concept in his recent blog here: The Mechanics on Manage-In-Place Records Management Tools. Well worth a read…

So, to summarise my points: RM is a composite requirement; auto-categorisation is useful and is starting to become legitimate, but even though it can participate, it should not be confused with auto-declaration of records; and “cloud” is not a functional decision, it’s an architectural and commercial one.
