Archive for the ‘history’ Category

“Your baby is ugly!” – The schizophrenic world of overlapping software portfolios

No mother will ever say “My baby is the ugliest”. And no product manager will allow their brainchild to commit harakiri, following a software company acquisition.

I had the dubious pleasure of living and breathing this paranoid madness for over a decade, and I can tell you it’s neither pretty nor dignified.

Just look at our little ECM corner of the world:

  • FileNet ECM vs. IBM Content Manager vs. Content Manager on Demand
  • FileNet BPM vs. WebSphere vs. Lombardi
  • IBM Records Manager (Tarian) vs. IBM Enterprise Records (FileNet)
  • Metastorm vs. Global 360 vs. Cordys
  • Tibco vs. Staffware
  • LiveLink vs. Hummingbird vs. Documentum
  • Vignette vs. RedDot
  • Etc. vs. Etc. vs. Etc.

Look carefully at the most acquisitive companies in the sector: It’s always a bloodbath.

Today the latest victims entered the fray: OnBase vs. Perceptive vs. Saperion or OnBase VNA vs. Acuo VNA, etc.

Naturally, the acquiring CEO – usually shoulder to shoulder with the incoming comrade – will issue reassuring PR statements to appease the acquired user-base: “Welcome to our happy family, we love you too! It’s going to be great!” (except in the case of FileNet and Lombardi where IBM’s message was more targeted to the existing user base: “Yes, we bought a prettier child, but we will never stop loving you”). Today’s example by Hyland is no exception…

And with the pleasantries completed, the gruesome reality starts to creep in: innovators and thought-leading executives are either leaving in droves, or patiently waiting for their gardening leave or golden-handcuff term to expire. Marketing will talk about “coexistence” and “interoperation” and “unifying functionality” and “rationalising capabilities” and “ecosystems” across the portfolio. In the back room, skeletal engineering resources will be tearing each other’s hair out, scrounging for scraps of headcount to keep up with just the most basic bug-fixes on totally incompatible architectures, creating the QA matrices from hell. Meanwhile, salesmen in the field will try to pinch each other’s deals and upsell incompatible “extensibility” features, creating Frankenstein implementation monsters that will never see the light of production, or another version update. Ever!

You think I’m exaggerating? Just ask any pre-sales support engineer who has had to live through these acquisitions… Pure madness!

The legend of ancient Spartans throwing their disabled and diseased children off a cliff, in order to maintain a superior warrior race, may have been disputed by archaeologists, but the software industry could take a lesson and apply some of the same rationale: less emotional attachment to the lesser products, and a more honest – if harsh – reality check for the customers: “Sorry, we cannot afford to maintain your ugly investment forever. Let’s come to an arrangement on how you move to our single, best, invested, up-to-date product portfolio, before you start running off to our competitors in despair”.

I sincerely hope that I’m proven wrong in my cruel cynical assessments, and I wish my Hyland and Perceptive colleagues a long and happy marriage, once the honeymoon period is over…

(VERY IMPORTANT: These are my personal opinions and are not necessarily representing the opinions of my current, previous or future employers. Phew, that was close…)

3 steps to a Compliance Strategy – As valid now, as ever!

Some of my old FileNet friends reading this article will smile… I realised today, to my surprise, that it is over 11 years since this simple concept was first articulated. It went on to form the basis of our compliance messaging, transitioned into IBM after the acquisition, and was presented in many conferences and briefings. It was the result of a quick brainstorm before a breakfast briefing for BearingPoint, at an off-site annual kick-off session; the picture on the left is a scan from my original notebook, where it first appeared in January 2004. I have evidence of it still being included in presentations as late as 2011. In the world of PowerPoint slides, does that make it a classic?

Now, it may be an old message, but it is as valid today as it ever was. And since I’ve never written about it in this blog, I thought it was worth re-introducing it to a whole new audience.

What does a company need to do, to be compliant?

There are three very fundamental and very explicit stages for an organisation to achieve a “compliant” status. These apply equally to every vertical industry, be it Banking, Insurance, Telco, Retail, Pharmaceutical, etc. And they apply equally whether “compliance” refers to regulatory compliance in a nuclear plant, financial compliance, or Health & Safety at a local school.

Step 1 – The Present: Become compliant

What do you need to do today to comply with the rules and meet the regulations? What changes in procedure, what risk controls, what equipment checks, what training? This stage includes designing and implementing everything that a company needs to put in place to be able to certify that, today, it is compliant with each regulation the law currently subjects it to. Implementing this stage requires the company to (a) identify and understand which regulations are relevant and what they expect, (b) identify the areas and processes where the company is at risk of not complying with those regulations, and (c) implement any changes necessary to remove those compliance risks.
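For the technically minded, the Step 1 exercise boils down to a gap analysis. Here is a minimal sketch in Python – the regulation references and control names are chosen purely for illustration, not taken from any real compliance system – of mapping each relevant regulation to the controls that address it, and flagging whatever is left uncovered:

```python
# Hypothetical Step 1 gap analysis: map regulations to the controls
# that address them, and flag any regulation left uncovered.
# All identifiers below are invented for illustration.

regulations = {
    "SOX-404": "Internal controls over financial reporting",
    "GDPR-17": "Right to erasure of personal data",
    "HSE-RIDDOR": "Reporting of workplace incidents",
}

# Controls the organisation has actually implemented, keyed by the
# regulation they are meant to satisfy.
implemented_controls = {
    "SOX-404": ["quarterly-controls-review", "segregation-of-duties"],
    "GDPR-17": ["data-deletion-procedure"],
}

def compliance_gaps(regulations, controls):
    """Steps (a) and (b): list the regulations with no implemented
    control. Closing each gap is the Step 1 remediation work (c)."""
    return {reg: desc for reg, desc in regulations.items()
            if not controls.get(reg)}

for reg, desc in compliance_gaps(regulations, implemented_controls).items():
    print(f"GAP: {reg} ({desc}) has no control in place")
```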

Step 2 – The Future: Remain compliant

This is the part that is often forgotten, and it ends up costing organisations millions in fines: looking at the future. Becoming compliant is not enough; it’s just the first step. As an organisation, you need to ensure that compliance is sustained consistently in the future – that every system, every procedure and every employee remains within the controls and guidelines specified by the legal regulations or the company policies. At a manual level, this involves regular training for employees and regular testing of all the various controls and devices implemented in Step 1. The best way to implement Step 2, however, is automation: putting in place systems and processes that not only monitor the company’s compliance, but enforce it. The less a company relies on individual employees to maintain compliance, the less likely it is to fall foul of compliance breaches through human error. Automation reduces training requirements, reduces management overheads, and reduces the operational cycles wasted on testing and reporting.
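To make “enforce, don’t just monitor” concrete, here is a toy sketch – the retention policy, record identifiers and function names are my own assumptions, not any real product’s API – of an automated control that blocks a non-compliant action outright, instead of reporting it after the fact:

```python
# Hypothetical Step 2 automation: a control that *enforces* policy
# rather than merely monitoring it. The policy values are invented.
from datetime import date, timedelta

RETENTION_YEARS = 7  # assumed policy: keep records for 7 years

class ComplianceError(Exception):
    pass

def delete_record(record_id: str, created: date) -> None:
    """Refuse deletion while the record is still inside its retention
    period: the system, not the employee, enforces the rule."""
    retention_end = created + timedelta(days=365 * RETENTION_YEARS)
    if date.today() < retention_end:
        raise ComplianceError(
            f"{record_id}: retained until {retention_end}, deletion blocked")
    # ... actual deletion would happen here ...
    print(f"{record_id}: deleted")

delete_record("INV-2003-0042", date(2003, 1, 15))  # past retention: deleted
try:
    delete_record("INV-2024-0007", date(2024, 6, 1))  # still retained
except ComplianceError as e:
    print("BLOCKED:", e)
```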

Step 3 – The Past: Demonstrate compliance

The final part of the process is looking at compliance retrospectively: are you able to go back to a specific point in time and demonstrate to a regulator, an auditor, or even a customer, that you operated compliantly? Are you able to show what decisions were made, what policies were in force, who made the decisions and what information they had available to support those decisions? This is all about Records Management and audit trails. It’s about maintaining evidence of your compliance that is complete, accurate and irrefutable. Preparing for that retrospective compliance review should be a core part of the design of any compliance system implemented today.
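As an illustration of what “complete, accurate and irrefutable” can mean in system terms, here is a minimal sketch of a hash-chained audit trail – the field names and chaining scheme are assumptions for the example, not a prescription. Each entry records who did what, when, and under which policy version, and chains to the previous entry so that tampering is detectable:

```python
# Hypothetical Step 3 audit trail: an append-only, hash-chained log.
# Each entry includes the previous entry's hash, so any after-the-fact
# alteration breaks the chain. Field names are invented.
import hashlib, json
from datetime import datetime, timezone

audit_log = []

def record_event(actor: str, action: str, policy_version: str) -> dict:
    prev_hash = audit_log[-1]["hash"] if audit_log else "genesis"
    entry = {
        "when": datetime.now(timezone.utc).isoformat(),
        "actor": actor,
        "action": action,
        "policy_version": policy_version,
        "prev_hash": prev_hash,
    }
    entry["hash"] = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()).hexdigest()
    audit_log.append(entry)
    return entry

def verify_chain() -> bool:
    """Recompute every hash; False means the trail was tampered with."""
    prev = "genesis"
    for e in audit_log:
        body = {k: v for k, v in e.items() if k != "hash"}
        recomputed = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()).hexdigest()
        if body["prev_hash"] != prev or e["hash"] != recomputed:
            return False
        prev = e["hash"]
    return True

record_event("j.smith", "approved loan #123", "credit-policy-v4")
record_event("a.jones", "archived case #456", "retention-policy-v2")
print("audit trail intact:", verify_chain())
```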

So the meme Become – Remain – Demonstrate (or “Achieve – Sustain – Prove”, the alternative version that our U.S. marketing folk seemed to favour) summarises the three key steps to remember when structuring a compliance programme. If you are faced with a new regulation, new management, or even a new mandate to create or replace IT systems for compliance, use these three steps to validate whether your compliance strategy is complete.

It’s Knowledge Management, Jim, but not as you know it

March 19, 2015

A recent conversation with a colleague sent me searching back through my archives for a conference presentation I gave nearly 16 years ago. The subject of the conference was the impact of Document Management as an enabler for knowledge sharing in the enterprise.

Driven by three different technology sectors of the time – Document Management, Search and Portals – Knowledge Management was all the rage back then. No good deed goes unpunished, however, and after several massive project failures and even more non-starter projects, Knowledge Management lost its shine and became a dirty phrase that no self-respecting consultant wanted to be associated with.

Why did Knowledge Management fail in the ‘90s?

They say 20/20 hindsight is a wonderful thing… Reading again through my slides and notes made me realise how different this market has become since the late ‘90s. There were a number of factors at the time that ensured Knowledge Management never took off as a viable approach but, in my view, two were dominant:

The first was the much-used phrase “Knowledge is power”. Leaving aside the fact that knowledge in and of itself very rarely has intrinsic value – it’s the application of knowledge that creates the power – the phrase was quickly misconstrued by users to mean: “I have knowledge, therefore I have power”. Guess what? Who wants to dilute their power by selflessly sharing out knowledge? Not many users felt altruistic enough to share their prized knowledge possessions, their crown jewels, for the greater good of the organisation. “As long as I hold on to the knowledge, I hold on to the power, and therefore I am important, valuable and irreplaceable”. Nobody said so, of course, but everyone was thinking it.

The second was the incessant focus on the information itself as the knowledge asset. Technology was focused almost exclusively on extracting tacit knowledge from individuals, encapsulating it in explicit documents, categorising it, classifying it, archiving it and making it available to anyone who could possibly need it. There were two problems with this approach: the moment tacit knowledge became explicit, it lost its owner and curator, and it started aging and becoming obsolete. Quite often it also lost its context, making it not only irrelevant but often dangerous.

Why are we talking again about Knowledge Management in 2015?

The last decade has brought a silent cultural revolution in knowledge sharing. We have all learned to actively share! Not only did we become a lot less paranoid about sharing our “crown jewels”, but we actively enjoy doing so, inside and outside the work environment: Wikipedia, blogs, Twitter, self-publishing, Facebook, Pinterest, LinkedIn, SlideShare, open source, crowdsourcing, etc. – all technologies that the millennium (and the millennials) have brought to the fore. All of these are platforms for sharing information and knowledge. The stigma and the paranoia of “Knowledge is Power” have transformed into “Sharing is Power”. The more we share, the more we are valued by our networks; and the bigger the network grows, the more power we wield as individuals. And, surprise-surprise, it’s reciprocal: the bigger the network we create, the bigger the pool of knowledge we can draw upon.

What couldn’t have been envisioned in the late ‘90s or early ‘00s is that by 2015 the power of knowledge would be contained in the relationships and the connections, not in the information assets. Not just connections between knowledge gurus inside an enterprise, but amongst individuals in a social environment, between companies and consumers, and amongst professional organisations.

Social Media and Collaboration environments have proven to us that the value of sharing knowledge is significantly higher than the value of holding on to it. We may or may not see the term “Knowledge Management” resurrected as an IT concept, but the reality is that knowledge sharing has now become an integral part of our daily life, professional and personal, and it’s not likely to change any time soon.

The mobile (R)evolution – A historical review

Unless you live in a cave, you will not have failed to notice that mobility has taken over our lives. As I write this, I’m sitting in a train full of commuters who, almost to a man, are holding a smartphone, a tablet or a laptop. The odd ones out are reading a book… on a Kindle.

There is no denying that mobility is an established phenomenon, and it’s here to stay. The IT industry is actively embracing it as the new Amalthean horn (alongside that other nebulous revolution – the Cloud). With Mobile First (IBM), The Mobile Imperative (Gartner), Enterprise Mobility (Accenture), 3rd Platform (IDC), etc., one by one every major vendor and analyst is releasing a “mobile” strategy that will drive growth over the next 3, 5 or 10 years. And undoubtedly, it will.

But is our current obsession with mobility really that revolutionary? Is the change in our culture and behaviour really so sudden and dramatic? Prompted by a very stimulating conversation at AIIM’s Executive Leadership Council (see the recent paper: The Mobile Reality), I decided to look at the historical milestones of computer mobility. Its heritage, if you like. The picture it paints is very interesting.

Mobile Evolution

Let’s look at the impact of mobility on a decade-by-decade basis.

1960

The starting point: computer access was restricted to a single physical location, determined by the location of the computer machines themselves. Access was granted to a few selected, highly trained computer boffins, who were responsible for allocating the computing resource on a time-share basis and delivering the results to the outside world. There was zero mobility at this stage.

1970

The ‘70s introduced the first layer of mobility to the organisation, and it had a transformational impact. “Dumb” terminals could be distributed across the organisation, connected over RS-232 serial lines. Mobility was location-based: connectivity was hard-wired, and employees had to physically go to wherever the terminal was in order to access it. Systems became multi-user, giving selected, trained, specialist users simultaneous access to computing power on demand. Suddenly, computing power and business applications were no longer constrained by the physical location of the computer, but were distributed to core departments across the organisation.

1980

The ‘80s saw the introduction of PCs: a hub-and-spoke revolution, where autonomous business machines could execute tasks locally, wherever they were located, and could communicate transparently with each other and with centralised servers. More “intelligent” connectivity through network cables introduced the client-server and email era. Mobility moved outside the constraints of the physical building. With the advent of “a PC on every desk”, users could work anywhere within the organisation and communicate with each other, from building to building and from town to town – or copy their work onto a floppy disk and continue on their PC at home.

1990

In the ‘90s, mobility went through another revolutionary phase. PCs gave way to laptops, work could be taken anywhere, and modems allowed dial-up connectivity back to the office. For users who had been issued with a company laptop and modem access, location was no longer constrained to the confines of the organisation: they could easily work connected from home, or from a customer site anywhere in the world. Mobile phones became a corporate tool, eventually obliterating phonecards and phoneboxes, and wireless handsets brought telephone mobility within the home. All that mobility created its own cultural revolution, bringing faster on-site customer support, home-working and flexible hours. At the same time, the Internet and the World Wide Web broke out of the military and academic domains, and the first commercial internet applications started appearing.

2000

With the millennium Y2K scare out of the way, mobility re-invented itself again. Website access and intranets meant that every employee could access the corporate environment regardless of the physical machine they were using: a corporate notebook, home PC, internet café or hotel lobby would be equally useful for checking emails, writing the odd MS-Office document, or finishing the latest marketing presentation. Virtually every employee had remote access to the organisation, and was actively encouraged to use it to reduce travelling and office space. Internet commerce became universally accepted, transforming the retail market. Computer form factors started shrinking: lighter notebooks, and PDAs with styluses, touch screens and handwriting recognition (remember Palm and Psion?), became the first truly portable devices. Mobile phones penetrated the personal consumer market, while email and text messaging (SMS) started replacing phone calls as the preferred media for short conversations. ADSL networks brought affordable broadband connectivity to the home, and the first 3G networks and devices allowed internet connection “on the go”.

2010

Which brings us to today: enter the iPhone and iPad generation, where the preferred device form factor is smaller (smartphones), more portable (tablets, phablets) and more universal (smart TVs, WiFi cameras, etc.). Mobile connectivity became a bit more reliable and a bit faster, using 3G and 4G networks on the street, while fibre-optic broadband and WiFi – at home, in fast-food restaurants and at coffee chains – brought faster downloads and HD streaming. Consumers are moving to apps (rather than websites) as the preferred interface, internet access has become available to almost everyone, and it is the preferred customer interaction medium for many businesses. The delineation between personal computing and work computing has more or less disappeared, and the internet (as well as the office) can be accessed almost anywhere, by everyone. SMS text messaging is still prevalent (but now virtually instant and virtually free), while asynchronous email communication has declined in favour of synchronous social network access, instant messaging (Skype, Twitter, FB Messaging, WhatsApp) and video chats (Skype, Lync, FaceTime, Hangouts).

Ubiquity

But we’re not quite there yet! The much heralded “ubiquitous” access to information, or “24×7” connectivity, is still a myth for a lot of us. While I constantly have to worry whether my phone should connect via 3G or WiFi (a decision driven by cost and availability); while I can have internet access on a transatlantic flight, but not in a commuter train; while my broadband at home drops the line every 20 minutes because it’s too far from the telephone exchange; while my WiFi router signal at one end of the house does not reach the dining room at the opposite end; and while I need a 3G signal booster at home (in a town of 450,000 people) because none of the mobile networks around me have a strong enough signal – mobile connectivity is not “ubiquitous”, it’s laboured.

Having lived and worked through 30 years of mobility transformation, I would argue that today’s “mobile revolution” is more evolutionary than revolutionary. What we are experiencing today is just another step in the right direction. Mobility will continue to have a transformational effect on businesses, consumers and popular culture, just as computer terminals transformed the typical desktop environment in the ‘70s and ‘80s, and as modems enabled home-working and flexible hours in the ‘90s and ‘00s. I expect that in the next 5 years we will see true “permanently on” connectivity and even more internet-enabled devices communicating with each other. I also expect that businesses will become a lot more clever and creative in leveraging mobility.

Nevertheless, I don’t expect a mobile revolution.

My first DMS kiss…

September 22, 2011

A recent tweet exchange with @pmonks and @pelujan (legends amongst the ECM Twitterati…) prompted me to dig deep into my past to find my first flirting with Document Management, a relationship that has lasted over 25 years.

The year: 1984

The venue: London, offices of a Greek shipping company

The actor: An impoverished first-year BSc student

The platform: Perkin-Elmer (later Concurrent) super-minis, 32-bit architecture

The language: COBOL with a proprietary RDBMS and transaction processing

The screen: Green on Black

The medium: X.25 network, over a private leased London-to-Athens line

The gig: Long-distance telephone calls between London and Athens offices were costing the company a fortune. Also, the timezone difference reduced the effective daily communication window by 4 hours. The company was looking for a way to leverage their existing technology platform, to exchange messages between offices synchronously or asynchronously, without incurring additional telephone costs.

The solution: A database system written in COBOL, which allowed terminal users at either end to pick a recipient from a list of registered users, leave a message for the opposite party, and receive a message back. Since it showed a history of the messages exchanged between the parties, if both parties were online you could have a dialogue in real time (line by line). If not, the other party would pick up the message when they logged in and respond. This used a temporary database table. If either party wanted to keep a permanent record of the conversation, they would “archive” it in a separate table, holding metadata like start time, end time, from, to, a subject description, location, etc. And since I wanted to be able to exchange messages about code with other programmers in the head office, it had a primitive system for referencing external files on shared disks.
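The original was COBOL against a proprietary RDBMS, which I won’t attempt to reproduce here. But as a rough modern sketch of the same two-table idea – Python and SQLite, with table and column names that are my reconstruction rather than the original schema – the design was essentially this:

```python
# A rough modern sketch of the 1984 design: a temporary table for the
# live message exchange, and an "archive" table holding the conversation
# plus its metadata. Names are reconstructions, not the original schema.
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE messages (          -- temporary, live exchange
    sender TEXT, recipient TEXT, sent_at TEXT, body TEXT);
CREATE TABLE archive (           -- permanent record of a conversation
    from_user TEXT, to_user TEXT, started TEXT, ended TEXT,
    subject TEXT, location TEXT, transcript TEXT);
""")

def send(sender, recipient, body, ts):
    db.execute("INSERT INTO messages VALUES (?,?,?,?)",
               (sender, recipient, ts, body))

def history(a, b):
    """Both parties see the running history, so if both are logged in
    the exchange works line by line, in real time."""
    return db.execute(
        "SELECT sent_at, sender, body FROM messages"
        " WHERE (sender=? AND recipient=?) OR (sender=? AND recipient=?)"
        " ORDER BY sent_at", (a, b, b, a)).fetchall()

def archive_conversation(a, b, subject, location):
    """'Archive' a conversation: copy it to the permanent table with its
    metadata, then clear the temporary messages."""
    msgs = history(a, b)
    transcript = "\n".join(f"{s}: {m}" for _, s, m in msgs)
    db.execute("INSERT INTO archive VALUES (?,?,?,?,?,?,?)",
               (a, b, msgs[0][0], msgs[-1][0], subject, location, transcript))
    db.execute("DELETE FROM messages WHERE (sender=? AND recipient=?)"
               " OR (sender=? AND recipient=?)", (a, b, b, a))

send("london", "athens", "Did the charter papers arrive?", "09:00")
send("athens", "london", "Yes - see the shared disk.", "13:05")
archive_conversation("london", "athens", "Charter papers", "London")
print(db.execute("SELECT subject, transcript FROM archive").fetchone())
```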

In today’s terminology, this was email, instant messaging, micro-blogging and a Document Management system rolled into one – an early form of social collaboration. I designed and built it in about two weeks, and it was used daily. It was simple and crude, but effective.

[A side note for the pedants: I know email systems were already around by then in the Unix community, but they were not commonplace, and they certainly were not available on a business platform like the Perkin-Elmer. Remember, this was 1984: no TCP/IP, no Internet, no Windows, no PCs, no files]

Since then, I’ve worked on many more weird DMS implementations, before the Document Management market was even identified as such: a hand-crafted invoice processing system written in VB, with Kofax cards and massive Cornerstone monitors on OS/2 machines; a bespoke DMS for commercial property agents, with distributed desktop scanning (property images) attached to workflow (rental review) cases; a bespoke DMS for lawyers based on Uniplex and Informix 4GL; and a fully fledged DMS with version control and content searching on NeXT machines, using C, Informix and BRS-Search (a free-text database), later ported to a disastrous Ingres implementation on Windows 3.11.

By then, Documentum had come on the scene, and I remember writing VB for a very early implementation of version 1 (effectively just a set of APIs) for a pharmaceutical company. FileNet was already around, with the first notion of Imaging+Workflow as a single integrated platform, but our paths were not to cross until a decade later.

Now, there is a point to this inane drivel, beyond self-indulgence…

In today’s confused ECM market, none of these early bespoke implementations would qualify as proper “Document Management”. Yet at the time they were all innovative and trailblazing, and large companies would pay good money to implement them. They created the legitimate (if schizophrenic) ECM market space that we live in and love today.

When I launched “Document Management Avenue” in 1995 – the first independent online community forum for DMS, for those old enough to remember – we were tracking over 300 products in this space. I still have the list somewhere. Today, most of us can only point at a dozen or so major ECM / EDRMS vendors.

There you have it. My own short history of watching the birth of ECM – The bespoke became product, which became open-source, which became commodity. The rest, as they say, is history… And some of us are still arguing what to call the baby 🙂
