Archive

Posts Tagged ‘IT culture’

Cloud and SaaS for dummies…

I had to explain Cloud and SaaS to a (non-IT) friend recently. It had to be quick and simple…

On-premise/Licensed: You buy a car and you drive it to work whenever you want. You pay for insurance, servicing, the MOT, tyres and petrol. You can tweak it or add “go faster” stripes if you like. If it breaks down, you pay to have it fixed.

Cloud: The government buys a train and pays for its maintenance. You hop on it when you need it, and pay for a ticket. If you are going to use it regularly, you buy an annual pass. If the train breaks down, the operator sends another one to pick you up and refunds your ticket.

Hybrid: You drive your own car to the station and then take a train to work.

Simple enough?


The mobile (R)evolution – A historical review

Unless you live in a cave, you will not have failed to notice that mobility has taken over our lives. As I write this, I’m sitting on a train full of commuters who, almost to a man, are holding a smartphone, a tablet or a laptop. The odd ones out are reading a book… on a Kindle.

There is no denying that mobility is an established phenomenon, and it’s here to stay. The IT industry is actively embracing it as the new Amalthean horn (alongside that other nebulous revolution, the Cloud). With Mobile First (IBM), The Mobile Imperative (Gartner), Enterprise Mobility (Accenture), the 3rd Platform (IDC), and so on, one by one every major vendor and analyst is releasing a “mobile” strategy that will drive growth for the next 3, 5 or 10 years. And undoubtedly, it will.

But is our current obsession with mobility really that revolutionary? Is the change in our culture and behaviour really so sudden and dramatic? Prompted by a very stimulating conversation at AIIM’s Executive Leadership Council (see the recent paper The Mobile Reality), I decided to look at the historical milestones of computer mobility. Its heritage, if you like. The picture it paints is very interesting.

Mobile Evolution

Let’s look at the impact of mobility decade by decade.

1960

The starting point. Computer access was restricted to a single physical location, determined by where the computer machines themselves lived. Access was granted to a select few highly trained computer boffins, who were responsible for allocating the computing resource on a time-share basis and delivering the results to the outside world. There was zero mobility at this stage.

1970

The ’70s introduced the first layer of mobility to the organisation, and it had a transformational impact. “Dumb” terminals could be distributed across the organisation, connected over RS-232 serial links. Mobility was location-based: connectivity was hard-wired, and employees had to physically go to wherever the terminal was in order to access it. Systems became multi-user, giving selected, trained, specialist users simultaneous access to computing power on demand. Suddenly, computing power and business applications were no longer constrained by the physical location of the computer, but were distributed to core departments across the organisation.

1980

The ’80s saw the introduction of PCs: a hub-and-spoke revolution, where autonomous business machines could execute tasks locally, wherever they were located, and could communicate transparently with each other and with centralised servers. More “intelligent” connectivity through network cables ushered in the client-server and email era. Mobility moved outside the constraints of the physical building. With the advent of “a PC on every desk”, users could work anywhere within the organisation and could communicate with each other, from building to building and from town to town. Or they could copy their work onto a floppy disk and continue it on their PC at home.

1990

In the ’90s, mobility went through another revolutionary phase. PCs gave way to laptops, work could be taken anywhere, and modems allowed dial-up connectivity back to the office. Location, for users who had been issued with a company laptop and modem access, was no longer constrained to the confines of the organisation. They could easily work connected from home, or from a customer site anywhere in the world. Mobile phones became a corporate tool, eventually obliterating phonecards and phoneboxes, and wireless handsets brought telephone mobility within the home. All that mobility created its own cultural revolution, bringing faster on-site customer support, home-working and flexible hours. At the same time, the internet and world-wide web broke out of the military and academic domains, and the first commercial internet applications started appearing.

2000

With the millennium Y2K scare out of the way, mobility re-invented itself again. Website access and intranets meant that every employee could access the corporate environment regardless of the physical machine they were using: a corporate notebook, home PC, internet café or hotel lobby would be equally useful for checking emails, writing the odd MS-Office document, or finishing the latest marketing presentation. Virtually every employee had remote access to the organisation, and was actively encouraged to use it to reduce travelling and office space. Internet commerce became universally accepted, transforming the retail market. Computer form factors started shrinking: lighter notebooks, and PDAs with styluses, touch screens and handwriting recognition (remember Palm and Psion?), became the first truly portable devices. Mobile phones penetrated the personal consumer market, while email and text messaging (SMS) started replacing phone calls as the preferred media for short conversations. ADSL networks brought affordable broadband connectivity to the home, and the first 3G networks and devices allowed internet connection “on the go”.

2010

Which brings us to today: enter the iPhone and iPad generation, where the preferred device form factor is smaller (smartphones), more portable (tablets, phablets) and more universal (smart TVs, WiFi cameras, etc.). Mobile connectivity has become a bit more reliable and a bit faster, using faster 3G and 4G networks on the street, while WiFi and fibre-optic broadband at home, in fast-food restaurants and at coffee chains have brought faster downloads and HD streaming. Consumers are moving to apps (rather than websites) as the preferred interface, the internet has become accessible to everyone, and it is now the preferred customer interaction medium for many businesses. The delineation between personal computing and work computing has more or less disappeared, and the internet (as well as the office) can be accessed almost anywhere, by everyone. SMS text messaging is still prevalent (now virtually instant and virtually free), but asynchronous email communication has declined in favour of synchronous social network access, instant messaging (Skype, Twitter, FB Messaging, WhatsApp) and video chats (Skype, Lync, FaceTime, Hangouts).

Ubiquity

But we’re not quite there yet! The much-heralded “ubiquitous” access to information, or “24×7” connectivity, is still a myth for a lot of us: while I constantly have to worry whether my phone should connect via 3G or WiFi (a decision driven by cost and availability), while I can have internet access on a transatlantic flight but not on a commuter train, while my broadband at home drops the line every 20 minutes because it’s too far from the telephone exchange, while my WiFi router’s signal at one end of the house does not reach the dining room at the opposite end, and while I need a 3G signal booster at home (in a town of 450,000 people) because none of the mobile networks around me has a strong enough signal, mobile connectivity is not “ubiquitous”; it’s laboured.

Having lived and worked through 30 years of mobility transformation, I would argue that today’s “mobile revolution” is more evolutionary than revolutionary. What we are experiencing today is just another step in the right direction. Mobility will continue to have a transformational effect on businesses, consumers and popular culture, just as computer terminals transformed the typical desktop environment in the ’70s and ’80s, and as modems enabled home-working and flexible hours in the ’90s and ’00s. I expect that in the next five years we will see true “permanently on” connectivity and even more internet-enabled devices communicating with each other. I also expect that businesses will become a lot more clever and creative at leveraging mobility.

Nevertheless, I don’t expect a mobile revolution.

2020 and beyond… The mother of all Information Management predictions

January 30, 2014

I’ve been wanting to write this article for a while, but I thought it would be best to wait for the deluge of 2014 New Year predictions to settle down before I try to look a little further over the horizon.

The six predictions I discuss here are personal, have no specific timescale, and are certainly not based on any scientific method. What they are based on is a strong gut feel and thirty years of observing change in the Information Management industry.

Some of these predictions are more fundamental than others. Some will have immediate impact (1-3 years), some will have longer-term repercussions (10+ years). In the past, I have been very good at predicting what is going to happen, but really bad at estimating when it’s going to happen: I tend to overestimate the speed at which our market moves. So here goes…

Behaviour is the new currency

Forget what you’ve heard about “information being the new currency”; that is old hat. We have been trading in information, in its raw form, for years. Extracting meaningful value from that information, however, has always been hard, repetitive, expensive and, most often, a hit-or-miss operation. I predict that with the advance of analytics capabilities (see Watson Cognitive), raw information will have little trading value. Information will be traded already analysed, and nowhere more so than in the area of customer behaviour. Understanding of lifestyle models, spending patterns and decision-making behaviour will become the new currency exchanged between suppliers. Not the basic, high-level, over-simplified demographic segmentation that we use today, but a deep behavioural understanding of individual consumers that will allow real-time, predictive and personal targeting. Most of the information is already being captured today, so it’s a question of refining the psychological, sociological and commercial models around it. Think of it this way: how come Google and Amazon know (instantly!) more about my online interactions with a particular retailer than the retailer’s own customer service call centre? Does the frequency of my logging into online banking indicate that I am very diligent in managing my finances, or that I am in financial trouble? Does my Facebook status reflect my frustration with my job, or my euphoric pride in my daughter’s achievement? How will that determine whether I decide to buy that other lens I have been looking at for my camera? Scary as the prospect may be from a personal privacy perspective, most of that information is in the public domain already. What is the digested form of that information worth to a retailer?
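To make the idea more concrete, here is a minimal, deliberately naive sketch in Python (all event names and thresholds are hypothetical) of the difference between raw information and its “digested”, tradable form:

```python
from datetime import datetime, timedelta

# Hypothetical raw event stream for one customer: on its own, this raw
# information has little trading value.
events = [(datetime(2014, 1, d), "bank_login") for d in (2, 3, 5, 6, 7, 9, 10)]

def login_frequency(events, window_days=30):
    """Count logins within the most recent window."""
    latest = max(ts for ts, _ in events)
    cutoff = latest - timedelta(days=window_days)
    return sum(1 for ts, kind in events if kind == "bank_login" and ts >= cutoff)

def behaviour_signal(freq):
    """Reduce the raw count to a behavioural interpretation: the digested,
    tradable form. A real model would weigh many correlated signals;
    a single threshold is purely for illustration."""
    if freq > 20:
        return "possible financial stress"  # obsessive balance-checking?
    if freq > 4:
        return "diligent and engaged"
    return "disengaged"

print(behaviour_signal(login_frequency(events)))  # -> diligent and engaged
```

The point is not the toy logic, but where the value sits: the event log is a commodity, while the interpretation, multiplied across millions of consumers, is the currency.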

Security models will turn inside out

Today most security systems, algorithms and analysis are focused on the device and its environment. Be it the network, the laptop, the smartphone or the ECM system, security models are there to protect the container, not the content. This has not only become a cat-and-mouse game between fraudsters and security vendors, it is also becoming virtually impossible to enforce at enterprise IT level. With BYOD, a proliferation of passwords and authentication systems, cloud file-sharing and social media, users are opening up security holes faster than the IT department can close them. Information leakage is an inevitable consequence. I can foresee the whole information security model turning on its head: if the appropriate security becomes deeply embedded inside the information itself (down to the file, paragraph or even individual word level), we will start seeing self-describing, self-protecting granular information that is only accessible to an authenticated individual, regardless of whether that information sits in a repository, on a file system or in the cloud, at rest or in transit. Security protection will become device-agnostic and infrastructure-agnostic: a negotiating handshake between the information itself and the individual accessing it, at a particular point in time.

Oh, and while we are assigning security at this granular self-contained level, we might as well transfer retention and classification to the same level as well.
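For illustration only, here is a minimal sketch of what such a self-describing, self-protecting fragment might look like. Every name is hypothetical, and the toy XOR cipher merely stands in for real cryptography (an authenticated cipher such as AES-GCM):

```python
import hashlib
from dataclasses import dataclass, field
from datetime import date

def xor_cipher(data: bytes, key: str) -> bytes:
    # Toy stand-in for real encryption, purely for illustration.
    kb = hashlib.sha256(key.encode()).digest()
    return bytes(b ^ kb[i % len(kb)] for i, b in enumerate(data))

@dataclass
class SelfProtectingFragment:
    """A granular unit of information (a file, a paragraph, even a word)
    that carries its own governance, independent of container or device."""
    ciphertext: bytes                      # content is never held in the clear
    allowed_identities: set = field(default_factory=set)
    retention_until: date = date.max       # retention travels with the content
    classification: str = "internal"

    def access(self, identity: str, key: str) -> str:
        """The 'handshake': the fragment itself decides whether to open,
        wherever it lives - repository, file system or cloud."""
        if identity not in self.allowed_identities:
            raise PermissionError(f"{identity} is not authorised for this fragment")
        if date.today() > self.retention_until:
            raise ValueError("past retention date - due for disposal")
        return xor_cipher(self.ciphertext, key).decode()

fragment = SelfProtectingFragment(
    ciphertext=xor_cipher(b"Q3 forecast: cautious", "shared-secret"),
    allowed_identities={"alice@example.com"},
)
print(fragment.access("alice@example.com", "shared-secret"))  # Q3 forecast: cautious
```

Note that nothing here depends on which network, device or repository the fragment happens to sit on; the policy and the protection travel with the information itself.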

The File is dead

In a way, this prediction follows on from the previous one, and it’s also a prerequisite for it. It is also a topic I have discussed before [Is it a record, who cares?]. Information Management, and in particular Content Management, has long been constrained by the notion of the digital file. The file has always been the singular granular entity at which security, classification, version control, transportation, retention and all other governance stops. Even relational databases ultimately live in files, because that’s what operating systems have to manage. However, information granularity does not stop at the file level. There is structure within files, and a lot of information lives outside the realm of files altogether (particularly in social media and streams). If Information Management is a living organism (and I believe it is), then files are its organs. But each organ has cells, each cell has molecules, and there are atoms within those molecules. I believe that innovation in Information Management will grow exponentially the moment we stop managing files and start managing elementary information entities, or segments, at a much more granular level. That will allow security to be embedded at a logical information level; value to grow exponentially through intelligent re-use; storage costs to be reduced dramatically through entity-level de-duplication; and analytics to explode through much faster and more intelligent classification. The file is an arbitrary container that creates bottlenecks, unnecessary restrictions and a very coarse level of granularity. Death to the file!
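As a thought experiment, here is a minimal sketch of content-addressed storage below the file level. The segmentation is naively paragraph-based and all names are hypothetical; the point is that security, classification and de-duplication could operate on entities rather than files:

```python
import hashlib

def split_into_entities(document: str) -> list:
    """Naive segmentation: one entity per paragraph. A real system would
    segment semantically (clauses, tables, images, message fragments)."""
    return [p.strip() for p in document.split("\n\n") if p.strip()]

class EntityStore:
    """Content-addressed storage: identical entities are stored exactly once,
    however many 'files' reference them."""
    def __init__(self):
        self.entities = {}    # entity hash -> text
        self.documents = {}   # document name -> ordered list of entity hashes

    def add_document(self, name: str, text: str) -> None:
        refs = []
        for entity in split_into_entities(text):
            digest = hashlib.sha256(entity.encode()).hexdigest()
            self.entities.setdefault(digest, entity)  # de-duplication happens here
            refs.append(digest)
        self.documents[name] = refs

store = EntityStore()
boilerplate = "This document is confidential.\n\n"
store.add_document("contract_a.txt", boilerplate + "Payment terms: 30 days.")
store.add_document("contract_b.txt", boilerplate + "Payment terms: 60 days.")
# Two 'files', but only three stored entities: the shared paragraph is stored once.
print(len(store.documents), len(store.entities))   # -> 2 3
```

Attach a security label, a classification or a retention rule to each entity hash, and the file becomes what it arguably always was: just a view over more granular information.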

BYOD is just a temporary aberration

BYOD is just a transitional phase we’re going through. The notion of bringing ANY device to work is already becoming outdated. “Bring Work to Your Device” would have been a more appropriate phrase, but BWYD is a really terrible acronym. Today, I can access most of the information I need for my work through mobile apps and web browsers. That means I can potentially use a smartphone, a tablet, the browser on my smart television, the Wii console at home or my son’s PSP to access work information. As soon as I buy a new camera with Android on it, I will be able to access work on my camera too. Or on my car’s GPS screen. Or my fridge. Are IT organisations going to provide BYOD policies for all these devices, where I will have to commit, for example, that “if I am using that device for work I shall not allow any other person, including family members, to access that device”? I don’t think so. The notion of BYOD is already becoming irrelevant. It is time to accept that work is no longer tied to ANY device and that work could potentially be accessed on EVERY device. And that is another reason why information security and governance should be applied to the information, not the device. The form of the device is irrelevant, and there will never be a 1:1 relationship between work and devices again.

It’s not your cloud, it’s everyone’s cloud

Cloud storage is a reality, but sharing cloud-level resources is yet to come. All we have achieved so far is to move information storage outside the data centre. Think of this very simple example: let’s say I subscribe to Gartner or AIIM, and I have just downloaded a new report or white paper to read. I find it interesting and I share it with some colleagues and (if I have the right to) with some customers, through email. In all probability I have created a dozen instances of that report, most of which will end up being stored or backed up in a cloud service somewhere, quite likely on the same infrastructure I downloaded the original paper from. And so will many others who have downloaded the same paper. This is madness! Yes, it’s true that I should have sent everyone the link to the paper instead, but frankly that would force them all to create accounts, etc., and it’s so much easier to attach it to an email, and I’m too busy. Now, turn this scenario on its head: what if the cloud infrastructure itself could recognise that the original of that white paper is already available on the cloud, and transparently maintain the referential integrity, security and audit trail of a link to the original? This is effectively cloud-level, internet-wide de-duplication. Resource sharing. Combine this with the information granularity mentioned above, and you have massive storage reduction, cloud capacity increase, simpler big-data analytics, and an enormous amount of statistical audit-trail material available for analysing user behaviour and information value.
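Here is a minimal sketch of what a dedup-aware sharing service might look like, under the assumption of a single content-addressed store; all names are hypothetical:

```python
import hashlib
from datetime import datetime

class DeduplicatingCloud:
    """Sketch of cloud-level, internet-wide de-duplication: an upload that
    matches an existing original becomes a governed reference, not a copy."""
    def __init__(self):
        self.blobs = {}        # content hash -> bytes, stored exactly once
        self.references = []   # who shared what with whom, and when (audit trail)

    def share(self, sender: str, recipient: str, content: bytes) -> str:
        digest = hashlib.sha256(content).hexdigest()
        if digest not in self.blobs:
            self.blobs[digest] = content   # first appearance: store the original
        # Every subsequent 'attachment' is only a reference to the original,
        # preserving referential integrity and leaving an audit trail.
        self.references.append((sender, recipient, digest, datetime.now()))
        return digest                      # the 'link' that actually gets shared

cloud = DeduplicatingCloud()
report = b"AIIM white paper: The Mobile Reality"
cloud.share("me", "colleague_1", report)
cloud.share("me", "colleague_2", report)
cloud.share("colleague_1", "customer", report)
print(len(cloud.blobs), len(cloud.references))   # -> 1 3 (one copy, three shares)
```

The audit trail is the bonus: because every share is a recorded reference to one original, the infrastructure accumulates exactly the kind of statistical material about user behaviour and information value mentioned above.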

The IT organisation becomes irrelevant

The IT organisation as we know it today is arguably the most critical function, and the single largest investment drain, in most organisations. You don’t have to go far to see examples of the criticality of the IT function and the dependency of an organisation on IT service levels. Just look at the recent impact that simple IT malfunctions have had on banking operations in the UK [Lloyds Group apologies for IT glitch]. My prediction, however, is that this mega-critical organisation called IT will collapse in the next few years. A large IT group, as a function, whether it is outsourced or not, is becoming an irrelevant anachronism, and here’s why:

1) IT no longer controls the end-user infrastructure; that battle is already lost to BYOD. The procurement, deployment and disposition of user assets is no longer an IT function. It has moved to individual users, who have become a lot more tech-savvy and self-reliant than they were 10 or 20 years ago.

2) IT no longer controls the server infrastructure. With the move to cloud and SaaS (or its many variants: IaaS, PaaS, etc.), keeping the lights on, the servers cool, the backups running and the cables networked will soon cease to be a function of the IT organisation too.

3) IT no longer controls the application infrastructure. Business functions are buying capabilities directly at the solution level, often as apps, and these departments are maintaining their own relationships with IT vendors. CMOs, CHROs, CSOs, etc. are the new IT buyers.

So, what’s left for the traditional IT organisation to do? Very little else. I can foresee IT becoming an ancillary coordinating function and a governance body. Its role will be to advise the business and define policy, and perhaps manage some of the vendor relationships; very much like the role that the Compliance or Procurement department has today, and certainly not wielding the power and the budget that it currently holds. That is actually good news for Information Management! Not because IT is an inhibitor today, but because the responsibility for Information Management will finally move to the business, where it always belonged. That move, in turn, will fuel new IT innovation driven directly by business need, without the interim “filter” that IT groups inevitably create today. It will also have a significant impact on the operational side of the business, since groups will have more immediate and agile access to new IT capabilities, enabling them to service new business models much faster than they can today.

Personally, I would like all of these predictions to come true today. I don’t have a magic wand, so they won’t. But I do believe that some, if not all, of these are inevitable, and it’s only a question of time and priority before the landscape of Information Management as we know it today is fundamentally transformed. And I believe that this inevitable transformation will help accelerate both innovation and value.

I’m curious to know your views. Do you think these predictions are reasonable, or are they a lot of wishful thinking? If you agree with me, how soon do you think they can become a reality? What would stop them? And what other fundamental changes could be triggered as a result?

I’m looking forward to the debate!

Seven even deadlier sins of Information Governance

October 7, 2012

Devin Krugly published a very interesting article describing “The 7 Deadly Sins of Information Governance”. I enjoyed it, and I can’t find anything to disagree with, but I have to admit that it left me wanting… The seven sins Devin presents are well-known and very common problems that plague most enterprise-scale projects, as he points out within the article itself. They could equally apply to HR, supply chain, claims processing or any other major IT implementation. Devin has done a great job of projecting these pitfalls onto an Information Governance program.

For me, however, what is really missing from the article is a list of “sins” that are unique to Information Governance projects. So let me try to add some specific Information Governance colour to the picture. Here is my list of seven even deadlier sins:

Governance needs a government. Information Governance touches the whole of the organisation: every system, every employee and every process. Decisions that govern information must therefore be taken by a well-defined governance body that accurately represents, at the very least, the business, compliance, legal, audit and IT. You cannot solve the Information Governance problem by throwing technology at it. Sure, technology plays a key part as an enabler, a catalyst and an automation framework. But technology cannot determine policy, priorities, responsibility and accountability. Nor can it decide the organisation’s appetite for risk, or changes in strategic direction. For that, you need a governing body that defines and drives the implementation of governance.

Information does not mean data. I have talked about this in an earlier blog (Data Governance is not about Data). We often see Information Governance projects that focus primarily (or even exclusively) on transactional data, or data warehousing, or records management, or archiving. Information Governance should be unified and consistent. There isn’t a different regulator for data, for documents, for emails or for Twitter messages. ANY information that enters, leaves or stays in the organisation should be subject to a common set of governance policies and guidelines. The technical implementation may be different, but the governance should be consistent.

It is a marathon, not a sprint. You can never run an “Information Governance Project”: that would imply a defined set of deliverables and a completion point at some specific date. As long as your business changes (new products, new suppliers, new customers, new employees, new markets, new regulations, new infrastructure, etc.), your Information Governance needs will also change. Policies will need revising, responsibilities will need adjusting, information sources will need adding and processes will need re-evaluating. Constantly! If your Information Governance project is “finished”, then frankly, so is your business.

Keep it lean and clean. Information Governance is the only cure for content obesity. Organisations today are plagued by information ROT (information that is Redundant, Outdated or Trivial). A core outcome of any Information Governance initiative should be the regular disposal of redundant information, done consistently, defensibly and with the right level of controls around it. It is a key deliverable, and it requires both the tools and the commitment of the governing body.
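As a minimal illustration (the retention classes and rules are hypothetical), a defensible disposal pass might look something like this: every decision, dispose or retain, is logged together with the rule that justified it:

```python
from datetime import date, timedelta

# Hypothetical inventory: (item id, last accessed, retention class).
inventory = [
    ("invoice_2004.pdf",  date(2005, 3, 1), "financial-7y"),
    ("lunch_menu.docx",   date(2011, 6, 1), "trivial"),
    ("contract_live.pdf", date.today(),     "contract-active"),
]

# Rules agreed by the governing body; None means never auto-dispose.
RETENTION_RULES = {
    "financial-7y": timedelta(days=7 * 365),
    "trivial": timedelta(days=365),
    "contract-active": None,
}

disposal_log = []  # the 'defensible' part: every decision is recorded

def disposal_pass(today=None):
    today = today or date.today()
    for item_id, last_accessed, r_class in inventory:
        rule = RETENTION_RULES[r_class]
        expired = rule is not None and last_accessed + rule < today
        decision = "dispose" if expired else "retain"
        disposal_log.append((today, item_id, r_class, decision))

disposal_pass()
for entry in disposal_log:
    print(entry)
```

The logic is trivially simple; the discipline of running it regularly, consistently and with an auditable trail is what most organisations lack.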

Remember: not who or how, but why. Information Governance projects often get tangled up in the details. Tools, formats, systems, volumes, stakeholders, stewards, regulators, litigators, etc. become the focus of the project and, more often than not, people forget the main driver: businesses need good, clean and accessible information to operate. The primary role of Information Governance is to deliver accurate, timely and reliable information to the business, for making decisions, for creating products and for delivering services. Every other issue must come second in priority.

The ministry of foreign affairs. Just as a country cannot be governed without due consideration of its relationships with its neighbours, Information Governance does not stop at the company firewall. Your organisation continuously trades information with suppliers, customers, partners, competitors and the wider community. Each of these exchanges has value and carries risks. Monitoring and managing the quality, trustworthiness, volume and frequency of the information exchanged is a core part of Information Governance, and it should be clearly articulated in the relevant policies and implemented in the relevant systems.

This is not a democracy, it’s a revolution. Implementing Information Governance is not an IT project; it is a business transformation project. Not only because of its scope and the potential benefit and risk it represents, but also because of the level of commitment and engagement it requires from every part of the organisation. Ultimately, Information Governance has a role in enforcing information quality and regulatory and legal controls, and it contributes to the organisation’s accountability. The purpose of an Information Governance implementation is not to ensure that everyone is happy and has an equal voice at the table. The purpose is to ensure that the organisation does the right thing and behaves responsibly. And that may require significant cultural change and a few ruffled feathers…

If you don’t already have an Information Governance initiative in your organisation, now is the time to raise the issue with the board. If you do, then you should carefully consider whether the common pitfalls presented here are addressed by your program, or whether you are in danger of committing one or more of these sins.

So what? Who cares? – The art of being relevant

October 2, 2012

When I first joined FileNet, in 2003, all new recruits attended a two-week intensive training course which, for the largest part, was a sales skills course. For those of us hired into marketing or technical roles, that part of the course had relatively little direct relevance, other than helping us empathise with the sellers and teaching the general skill of communicating with customers.

Nevertheless, everyone took away something extremely useful from the course: a prop! It was a simple piece of card (imagine an A4 cut in half lengthwise) which said on one side “So What?” and on the other “Who Cares?”, in bold red letters. The purpose of the card was simple: as we listened to various role-playing presentations, we could hold it up whenever the presenter was making irrelevant points or describing product functionality without relating it to the client’s problems. It was a signal to re-think the message and cut the unnecessary waffle.

“So What?” i.e. What is the point you are trying to make? How is this relevant to a business problem? What would the outcome be?

“Who Cares?” i.e. Why is this relevant to the person you are talking to? How does it relate to their work, or their own personal targets or ambitions? Who in the organisation feels the pain of the problem that you are trying to resolve? Why should they care?

Surprisingly, nearly ten years later, I still find myself regularly using this simple mental test, both for my own presentation content and when reviewing others’. I find myself applying the principle to presentations, marketing material, website designs, and even customer requirements reviews. And I often introduce it into conversations with colleagues and clients. For most of my FileNet colleagues the principle is very clear and familiar, and just mentioning “So what? Who cares?” raises a knowing smile, and often a review of the task at hand. When it is introduced to other people, the first response is usually one of shock: “How can you be so rude?” But a quick explanation makes them realise that I’m not being impertinent; the questions are quite literal and should be answered. And, usually, they take the principle on board, which allows for a much more productive dialogue.

Try it for yourself! Next time you are reading a white paper, a marketing brochure, an RFI/RFP/proposal, or even a newspaper article (especially a newspaper article!), check each of the points made: do they pass the “So what? Who cares?” test? If not, either they are irrelevant waffle and should not be there, or they are valid points that should be articulated differently. I promise you, it will make for much clearer, more concise and more effective communication.

Looking for Mr. Right – Revisited

I was reading a recent article by Chris Dale, in which he gave an overview of Debra Logan‘s “Why Information Governance fails and how to make it succeed” keynote speech. It’s difficult to disagree with most points made in the session, but one in particular caught my attention. Chris transcribes Debra’s thoughts as:

“…we are at the birth of a new profession, with hybrid players who have multiple strands of skills and experience. You need people with domain expertise, not just about apps and servers but data and information. The usual approach is to take people who already have jobs and give them something else to do on top or instead. You need to find people who understand the subject and teach them to attach metadata to their material, to understand document retention, perhaps even send them to law school to turn them into a legal/IT/subject matter expert hybrid.”

In parallel, I have recently had several conversations relating to AIIM‘s new “Certified Information Professional” accreditation (which I am proud to possess, having passed their stringent exam). It is a valiant attempt to recognise individuals who have enough breadth of skills in Information Management to cover most of the requirements of Debra’s “new profession”.

These two – relatively unrelated – events prompted me to go and re-discover an article I wrote for AIIM’s eDoc online magazine, published sometime around June 2005. Unfortunately the article is no longer online, so apologies for embedding it here in its entirety:

Looking for Mr. Right

Why advances in ECM technology have generated a serious skills gap in the market.

ECM technologies have advanced significantly in the last ten years. The convergence of Document/Content Management, Workflow, Searching, web technologies, records management, email capture, imaging and intelligent forms processing has created a new information management environment that is much more aware of the value of information assets.

Most analysts agree that we are entering a new phase in ECM, where medium and large size organizations are looking to invest in ECM as a strategic enterprise deployment in order to leverage their investment in multiple business areas – especially where improving operational efficiencies and compliance are the key drivers, as these tend to have a more horizontal appeal across the organization.

But as ECM technologies start to become pervasive, there is a lot of confusion about the operational management of these systems. Technically, the IT department is responsible for ensuring the systems are up and running as optimally as the technology permits. But whose responsibility is it to make sure that these systems are configured appropriately, and that the information held within them is managed correctly, as a valuable asset?

Think about your own company: Who decides how information is managed across your organization? With ECM, you are generating a virtual library of information that should be used and leveraged consistently across departments, geographical boundaries, organizational structures and individual responsibility areas. And if you include Business Process Management in the picture, you are also looking for common, accountable and integrated business practices across the same boundaries. Does this responsibility sit within the business community, the IT department or as a separate internal service function? And what skills would be required to support this?

There is a new role requirement emerging, which is not yet very well defined or understood. There is a need for an individual or a group, depending on the size of the organization, who can combine the following capabilities:

  • identify what information should be managed and how, based on its intrinsic value and legal status
  • implement mechanisms for filtering and purging redundant information
  • design and maintain information structures
  • define metadata and classification schemes and policies
  • design folder structures and record management file plans
  • define indexing topologies, thesauri and search strategies
  • implement policies and timelines for content lifecycle management
  • devise and implement record retention and disposition strategies
  • define security models, access controls and auditing requirements
  • devise schemes for the most efficient location of information across distributed architectures
  • devise content and media refresh strategies for long-term archiving
  • consolidate information management practices across multiple communication channels: e.g. email, web, fax, instant messaging, SMS, VoIP
  • consolidate taxonomies, indexing schemes and policies across organizational structures
  • etc.

And all of this for different business environments and different vertical needs, with a good understanding of both the business requirements and the capabilities offered by the technology: someone who can comfortably bridge the gap between the business and the IT department.

People who can effectively combine the skills of librarian, administrator, business analyst, strategist and enterprise architect are extremely rare. If you can find one, hire them today!

The closest title one can use for this role today is “Information Architect”, although job descriptions with that title differ significantly. More importantly, people with this collective skill set are very difficult to find today, and even more difficult to train, since many of the “best practices” in this area are not yet established or documented.

This is a wake-up call for universities, training agencies, consultants and people wanting to re-skill: while the ECM technology itself is being commoditised, more and more application areas are opening up that will require these specialist skills. Companies need more people with these capabilities, and they need them today. Without them, successful ECM deployments will remain difficult and expensive to achieve.

The more pervasive ECM becomes as an infrastructure discipline, the bigger the skill gap will become, unless we start addressing this today.

Apart from feeling slightly proud that I highlighted in June 2005 something that Gartner is raising as an issue today, this doesn’t reassure me at all: seven years have passed and Debra Logan is (and organisations are…) still looking for Mr. Right!

I am happy that Information Governance has finally come to the forefront as an issue, and that AIIM’s CIP certification is making some strides in helping the match-making process.

But I really hoped we would have come a bit further by now…

Is it Art, is it Science or is it the Art of Science?

I don’t often cross-post between my ECM and my Photography blogs, but this is definitely a post that is relevant to both…

I was having one of those left-brain vs. right-brain discussions with a friend of mine who works in IT and also happens to be a keen photographer, as I am. He asked me: “Do you consider yourself primarily a technologist or an artist?”

I could not answer the question. The obvious answer is “both”, but the more I think about it, the less sense the question makes. Is there really a distinction between these two? I don’t believe so. They are certainly not mutually exclusive.

Consider a software developer alongside a painter, a photographer or a writer: they all have to start with a vision, they all have to innovate, and they all have to be problem solvers. Just imagine yourself in an artist’s studio, a photographer’s studio or your IDE, and compare each process:

In painting, you choose your canvas, depending on the final purpose of the painting. In photography, you choose your format and your output medium, based on the audience. In software, you choose the operating system and the market your solution is intended for.

Then you choose your primary crafting tool: your paintbrushes or pencils, your cameras and lenses, or your coding language. And you start the creative process. Your lines of code are your brushstrokes; the same lights, shadows and colours make up your composition.

In art, you use a palette of colours and combine them to create new ones. In photography, you have exposure techniques and filters, and in coding you use code libraries.

You step back and look at your masterpiece, or test your code, and then you use turpentine, an eraser, debugging tools or Photoshop to correct the minor mistakes.

I believe that not only software development but most scientific undertakings are a form of art. Whether you are experimenting in a chemistry lab, designing a marketing campaign or designing a new electronic device, you will have to use tools and imagination to create something new. You will use subjective judgements to determine whether it is good or bad. And once you deliver it, you will be critiqued by other people.

So, as a solutions architect, I use artistic processes to bring my visions to life. As a photographer, I use both technology and science to create new art. Can I ever de-couple art from science? No. If I did, I would end up being bad at both.
