
Posts Tagged ‘implementation’

[Part one of this discussion looked at different definitions of BI, and a very salient example of how it can be done well.]

When I’ve presented to people on the opportunities inherent in business intelligence, they marvel at seeing information directly relevant to their work cast in a new and meaningful light: summarised, for example, or detailed, or given direct visual impact that prompts new insights.

That’s the easy part.  Delivery is harder.

When I need to take a step back and assess what I am doing, I ask:

What does business want out of business intelligence?

This is particularly pertinent if a BI implementation is less than successful – and I’ve never seen an implementation that really, I mean really, delivers.  I’m not talking about simply analysing business requirements, but about understanding what is needed to deliver effectively.

There are many different ways of answering this question.

1) The anecdotal

My experience is probably not too different from many others.  In general, the feedback I’ve had from business stakeholders is:

  • They don’t know what they want; and/or
  • They want you to do it all for them

That’s a bit glib, but later I’ll extract some value from it.  In fact, as long as you’re delivering tangible value, I’ve found the business information consumers are reasonably happy.  It’s easy enough to rest on that, but as a professional it pays to think ahead.  Unfortunately, there remains a need for a level of business commitment to information issues – and I’m not talking about getting training in the tools or the data qua data, more about adopting an information-as-strategic-resource mindset.

2) The statistical

In a recent survey run by BeyeNetwork, the top two desires of business for BI are:

  • Better access to data
  • More informed decision-making

Axiomatic, no?  These effectively say the same thing, but there is nuance in each.

On the one hand, can business get whatever information they can possibly envisage, and in a format (whether presentation or analytical input) they can use effectively?  Clearly not – that’s a moving target.  But it’s also a goal to constantly strive for.

On the other hand, for business decisions to be made well, the question needs to be asked: what would support decision-makers in that process?  That’s too high-level for an immediate answer from most people.  Drilling into the detail of the processes is business analysis.  Maintaining such an understanding of business processes should rightly belong with the business, who should be fully on top of what they do and how they do it.  In practice, it’s often only when prompted by necessity – such as analysing information needs – that the exercise is done with much rigour.

3) The ideal

In an ideal world we would provide the knowledge base for a worker to be truly effective – which includes not just the passive support information, but the active intelligence that can generate useful new insights.  There’s a lot that can go into this, but the wishlist includes fuller realisation of:

  • Data integration: combining information from disparate sources (not just databases)
  • Transformation: from data to business meaning (a small sketch of these first two follows this list)
  • Presentation: insightful representation of information (the current buzzword being visualisations)
  • Discovery: the opportunity to explore the information
  • Timeliness: information when they need it, where they need it, with no delays
  • Control: the ability to save (and share) meaning that they encounter
  • Tools: a good, intuitive user experience – no learning hurdle, no techy barrier
  • Technical integration: seamless integration with the software and hardware environment (applications and devices respectively)
  • Autonomy: the ability to do it themselves
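
As a toy illustration of the first two items, the sketch below merges two hypothetical extracts – a CRM export and a billing feed; the file names, columns and the “revenue per active customer” measure are all invented here, not drawn from any real system – and derives a business-level figure from them, assuming Python and pandas are to hand.

```python
# A toy sketch only: file names, columns and the derived measure are
# invented for illustration, not taken from any real environment.
import pandas as pd

# Two disparate sources: a CRM export (CSV) and a billing feed (JSON).
crm = pd.read_csv("crm_customers.csv")        # customer_id, region, status
billing = pd.read_json("billing_feed.json")   # customer_id, amount

# Integration: bring the sources together on a common key.
merged = billing.merge(crm, on="customer_id", how="left")

# Transformation: from raw rows to business meaning --
# revenue per active customer, by region.
active = merged[merged["status"] == "active"]
summary = (active.groupby("region")
                 .agg(revenue=("amount", "sum"),
                      customers=("customer_id", "nunique")))
summary["revenue_per_customer"] = summary["revenue"] / summary["customers"]
print(summary.sort_values("revenue_per_customer", ascending=False))
```

The point isn’t the dozen lines of code; it’s that each step – integrate, filter, aggregate, derive – maps onto an item in the wishlist.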

That last one is interesting: it’s the exact opposite of what I said I’d experienced.  But the gap there is in the toolset – the environment in which the information is presented.  If it’s something they can intuitively explore for themselves, extracting meaning without a painful learning curve, they will want to do it themselves.

This can’t be achieved by the data professional in isolation.  Achieving the above requires collaborative effort: with business stakeholders, other IT professionals, and software vendors.

I don’t think there’s any BI implementation out there that delivers to the ideal.  Better business engagement, better business commitment, more resources for BI, better software tools, better integration: these would help.

We will get a lot closer to the delivery ideal.  But by then, BI will look rather different from today’s experience.

The dangling question: are new paradigms needed for BI to be fully realised?  If it is so hard to properly achieve the potential of BI today, there must be ways of working better.


Read Full Post »

“we had the data, but we did not have any information”
– CIO to Boris Evelson (Forrester), on the global financial crisis.

Vendor marketing messages are said to contend that only 20% of employees in BI-using organisations actually consume BI technologies (“and we’re going to help you break through that barrier”).

Why is the adoption of BI so low?

That was my original question, brought about by a statistic from this year’s BI Survey (8).  As discussed in a TDWI report, in any given organisation that uses business intelligence, only 8% of employees are using BI tools.

But does it matter?  Why should we pump up the numbers?  It should not be simply because we have a vested interest.

Two questions follow:

What is BI, and why is it important?
BI is more than querying, analysing and reporting from a database:

“Business intelligence (BI) refers to skills, technologies, applications and practices used to help a business acquire a better understanding of its commercial context” – Wikipedia

It’s a very broad definition.  A rather more technical one from Forrester:

“Business intelligence is a set of methodologies, processes, architectures, and technologies that transform raw data into meaningful and useful information used to enable more effective strategic, tactical, and operational insight and decision-making. . . .”

But it can be explained more simply as:

data -> information -> knowledge -> insight -> wisdom

Data can be assembled into information.  Information provides knowledge.  Knowledge can lead to insights (deeper knowledge), which can beget wisdom.  Is there any part of an organisation that would not benefit from that process?  If there are roles sufficiently mundane that insights won’t help improve the job or the service delivered, then I guess those roles would not benefit from BI.  Yet I would suggest they are few and far between – and they should be automated as soon as possible, because you can bet the employees filling them won’t feel fulfilled or motivated.

Business intelligence has a part to play in that whole process.  At the lowest level, it can provide data for others to analyse; but at every step of generating wisdom from data, it contributes.  In that sense, it is intrinsic to an organisation’s aims, and everyone has a part to play in it.
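
To make the start of that chain concrete, here is a deliberately tiny, hypothetical sketch in Python – the sales figures are made up for the example – showing raw rows rolled up into information, with a first hint of insight flagged on top.

```python
# A contrived, minimal illustration of "data -> information -> insight".
# The sales figures are invented; nothing here comes from the post itself.
daily_sales = [
    ("2009-06-01", "north", 1200), ("2009-06-01", "south", 400),
    ("2009-06-02", "north", 1150), ("2009-06-02", "south", 380),
    ("2009-06-03", "north", 1300), ("2009-06-03", "south", 150),
]

# Data -> information: aggregate raw rows into totals per region.
totals = {}
for _date, region, amount in daily_sales:
    totals[region] = totals.get(region, 0) + amount

# Information -> (a hint of) insight: flag regions trailing the average.
average = sum(totals.values()) / len(totals)
for region, total in sorted(totals.items()):
    flag = "  <- below average, worth a closer look" if total < average else ""
    print(f"{region}: {total}{flag}")
```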

I started into this subject aiming to canvass the reasons behind poor BI takeup.  After some research and reflection on my own experiences, though, I found a whole book’s worth of material in that simple question.  So it’s not something I can lay out simply, in one take.

Demonstration
First, let’s see an example of good use of data – one, in fact, that demonstrates both the adding of value to data and the presentation and imparting of insight.

That wonderful organisation TED (“ideas worth spreading”) has a presentation by Hans Rosling, a Swedish professor of International Health.  Start with Rosling’s entry at TED, and look at any one of the presentations there.  The first has the most oomph, but they are all good.  Why?  Meaningful data, good presentation tools and a Subject Matter Expert.  (Thanks to Mike Urbonas for the reference).

Rosling’s presentations are a prime example of business intelligence done right.  The data was gathered from multiple sources, its quality assessed, and it was assembled and presented in a fashion that gave its audience insights.  In fact, the presentation tool he uses, Trendalyzer, although later bought by Google, was originally developed by his own foundation, Gapminder.org.  (There are similar tools, such as Epic System‘s Trend Compass; MicroStrategy also offers something comparable.)
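
For a rough feel of what a Trendalyzer-style view involves, something similar can be mocked up with off-the-shelf tools.  The sketch below is only an assumption on my part – it uses plotly express and its bundled Gapminder sample data, not anything Rosling actually used – to draw an animated bubble chart of life expectancy against income.

```python
# A rough, Trendalyzer-like animated bubble chart using plotly express.
# Assumes plotly is installed; the bundled Gapminder sample data is used
# purely for illustration.
import plotly.express as px

df = px.data.gapminder()  # country, continent, year, lifeExp, pop, gdpPercap
fig = px.scatter(
    df, x="gdpPercap", y="lifeExp",
    animation_frame="year", animation_group="country",
    size="pop", color="continent", hover_name="country",
    log_x=True, size_max=55, range_y=[25, 90],
)
fig.show()
```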

Much as it might look like it, I wouldn’t say the job began and ended with Rosling.  Whatever other parts he played, here his role is SME.  Yet his presentations clearly demonstrate the involvement of other roles, from data analyst to system integrator to vendor/software developer.

Barriers to BI takeup

So where to start?  Everyone has an opinion.

Rosling: “people put prices on [the data], stupid passwords, and boring statistics”.  In other words, he wanted data to be free, searchable, and presentable.  Integration and system issues aside, he found his barriers to be data availability and the expressiveness of his tools.

Pendse:  he gave a number of barriers, including “security limitations, user scalability, and slow query performance… internal politics and internal power struggles (sites with both administrative and political issues reported the narrowest overall deployments)… hardware cost is the most common problem in sites with wide deployments; data availability and software cost;… software [that] was too hard to use…”

In grouping together the issues, I found the opportunity to apportion the responsibility widely.  All roles are important to the successful dissemination of a business’ intelligence: CEO, CIO, CFO, IT Director, IT staff, BI manager, BI professional (of whatever ilk), implementation consultant, vendor, SME (too often underrated, or not rated at all!), all the way down to the information consumer.

Comments welcome.  See part two for some discussion about gaps that exist in the delivery of BI.

Read Full Post »

Interesting to read of an I.T. “glitch” in healthcare in Canada. Reported here, no doubt it made the rounds as one more example of computers gone wrong.

The story:

Recently, a doctor called Saskatoon Health Region’s medical imaging department to find out why he hadn’t received his patient’s test results. After some investigation, it was found that a fax machine had not sent them out. That machine was part of their automated Radiology Information System. Further investigation revealed that that particular machine [or the system’s link to it] had been inoperative for about ten years. Result: at least 1,380 test results had not been sent out.

On the one hand, it’s possible to say that the failure rate is low: some 1,400 out of roughly two million communications missed, a rate of around 0.07%. And doubtless a good number of those missed communications would have resulted in follow-ups, closing the loop the second time around.

But on the other hand, this is a medical situation, with health and lives at stake. There should be greater certainty – particularly where manual error is ostensibly eliminated.

Sure, the doctors involved are likely all highly-paid professionals. But I’ve already heard that line, and my experience working with a community of highly-paid professionals demonstrates that that does not ipso facto lead to quality outcomes.

What does this have to do with business intelligence?

BI has by now evolved to cover a wide variety of information solutions, with data and its distribution managed centrally (in theory at least), and available via a plethora of channels. What was once reporting now encompasses alerts and dashboards in particular, as well as varied options for visualisation and analysis, and tools that put information directly in the hands of business stakeholders.

But how do we know what should be there a) is there, and b) gets communicated to the right people?

When reading the above medical anecdote, I am immediately reminded of alerts in BI systems. Alerts only sound when there is something wrong. Yet the above is a good illustration of an automated system that is validated so infrequently that a full decade can pass before a breakdown is discovered. To the business manager: how do you know you will be alerted as you expect – as you originally specified when the BI system was implemented?

My experience with databases is that information that is frequently touched (made use of) at the BI consumer end is more likely to be exposed to quality management than data that is, say, languishing unexplored until the odd occasion that it is called up – and too frequently found to be lacking.

Likewise at a meta level. Alerts are a great way of automating – outsourcing – the checking of problem situations. But for a business manager to rely on the absence of alerts raises the question: how confident can you be that the automation is working as expected?

This level of confidence is generally outsourced – to those who assure the manager they have the answer to their needs: the BI manager (and/or implementer).

It is incumbent on the BI implementer to test their build sufficiently. Yet business changes, data changes, and it remains incumbent on the BI manager to ensure their systems maintain that degree of integrity.

Especially for alerts, which only show when there is an issue, the BI professional needs to be vigilant.
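
One pragmatic hedge is a “heartbeat” check that runs on a schedule and always reports something, so that silence itself becomes a signal that the alerting pipeline is broken. The sketch below is illustrative only – the database, table and threshold are invented, not taken from any particular BI product.

```python
# A minimal heartbeat sketch: paths, table and threshold are hypothetical --
# the point is the pattern, not the specifics.
import sqlite3
from datetime import datetime, timedelta

ALERT_LOG_DB = "bi_alerts.db"      # assumed log of every alert the BI system sent
MAX_SILENCE = timedelta(days=7)    # assumed: at least one alert expected weekly

def check_heartbeat() -> str:
    """Return a status line that is ALWAYS emitted, even when nothing is wrong."""
    conn = sqlite3.connect(ALERT_LOG_DB)
    row = conn.execute("SELECT MAX(sent_at) FROM alert_log").fetchone()
    conn.close()

    last_sent = datetime.fromisoformat(row[0]) if row and row[0] else None
    if last_sent is None or datetime.now() - last_sent > MAX_SILENCE:
        return "HEARTBEAT: no alerts recorded recently -- check the alerting pipeline itself"
    return f"HEARTBEAT: alerting pipeline last fired {last_sent:%Y-%m-%d %H:%M}"

if __name__ == "__main__":
    # Scheduled (e.g. via cron) so someone sees this line every week regardless.
    print(check_heartbeat())
```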

Read Full Post »

Lord knows everyone wants it at the moment.  Enterprises want to cut costs, and small businesses – although they don’t usually appreciate it – want affordable BI (which is a whole other story).

There are plenty of open source products to cover the range of tools used to effect business intelligence, from MySQL to Mondrian, Pentaho, Jaspersoft, and the likes of Talend for ETL.  Open source in itself does not necessarily constitute free BI, as labour, infrastructure and support costs remain.  Depending on requirements and existing resources, open source can even be more costly than paid-for tools.

There’s also Microsoft.  If you happen to have an enterprise copy of SQL Server lying around (as many companies do), then you have out-of-the-box availability of its BI products: SQL Server Integration Services (ETL), Reporting Services, and Analysis Services (cubes).

That’s the theory.  It’s tempting… too tempting, sometimes.

Recently, I talked to a government department who wanted to implement a data warehouse/business intelligence system from ETL to cubes and reports.  They had a business analyst, a consultancy lined up for the ETL, licensing for SQL Server, and… a few spare people they had lying around who could be brought up to speed on the toolset.  All they needed was someone to realise the DW/cube/reporting environments… in a few months… and train up those people to take over.  Oh, and did I mention the spare staff was non-technical?

I have nothing against non-technical people.  Particularly those with an analytical temperament.  And everyone else just represents a challenge to deliver for.  But of all the BI tools I’ve worked with, Microsoft’s would be the last toolset I would foist on the unsuspecting.  They’re just not that geared to the business user – nor the neophyte.  Even apart from the need to come to grips with the development environment, BIDS (a version of Visual Studio, MS’ IDE), there are conceptual and practice-based experience hurdles to overcome.  Oh, and did I mention the capacity of BIDS to overwhelm a non-technical user?

All this because they were looking for a quick and inexpensive route to implementation of their BI/DW project – and they happened to have SQL Server lying around.

The two biggest risks I saw were bringing the putative BI staff up to speed – and the ETL project.

ETL can account for maybe 80% of a BI project, and there may be virtue in sequestering the complexities to a consultancy.  On the other hand, the consultants will merrily achieve their task and come up with a theoretically pristine ETL solution… one that works on paper, but leaves the client with a front end that only performs its task well in theory.  I’ve seen this happen at least twice before (and I’ve picked up the pieces) – a consultancy built a structure and left behind a theoretically accomplished mission.  In each case they left no documentation, and a system that a) could not easily be adapted to changing business conditions, b) may not have sufficiently engaged the source business stakeholders, and c) may handle only the majority of the data – if that – while leaving in limbo the non-conforming data, i.e. a large part of the ETL project.  Consultants paid, unquantified work still remaining.
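
One small discipline that guards against that last failure mode is to route non-conforming rows to a visible rejects file or table rather than silently dropping them, so the “limbo” data stays quantified.  The sketch below is illustrative only – the column names and validation rules are made up, and it isn’t tied to any of the toolsets mentioned above.

```python
# A toy validation/routing step: column names and rules are invented,
# not taken from any real ETL toolset.
import pandas as pd

source = pd.read_csv("extract.csv")  # assumed raw extract: customer_id, amount, order_date

# Define what "conforming" means for this feed.
valid = (
    source["customer_id"].notna()
    & (source["amount"] >= 0)
    & pd.to_datetime(source["order_date"], errors="coerce").notna()
)

clean, rejects = source[valid], source[~valid]

# Load the clean rows; keep the rejects visible rather than dropping them.
clean.to_csv("load_ready.csv", index=False)
rejects.to_csv("rejects.csv", index=False)
print(f"loaded {len(clean)} rows, quarantined {len(rejects)} non-conforming rows")
```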

Other challenges abounded, but they were just challenges.  Possible ways through may involve at least some of the following:
a) Have a tech- and business-savvy person (ideally) work alongside the ETL consultants, then take over that aspect of the work on an ongoing basis;
b) Choose a more business-friendly toolset, or hire some people who are already not too far off being able to do the BI/DW work on an ongoing basis;
c) Hire a relatively experienced BI person to run with the project and then stay on to manage the system and the changing demands on it, mentoring existing staff to the best of their capabilities;
d) Allow for a longer implementation schedule;
e) Narrow the scope of the project;
f) Accept the need for a bigger budget for the requisite outcomes.

It pains me to have the size of a project blow out – I would like to deliver streamlined, effective BI, with versions thereof available to all levels of need.  On the other hand, it’s too easy for business managers to grit their teeth and say “this is what we are going to achieve” without either building slack into the project or at least injecting some relevant expertise at the project planning phase.

As proposed, they would end up with a half-working ETL process and a bunch of people who would struggle to maintain the initial environment, let alone meet evolving business needs.

Last heard, that government department was still looking for someone to run the project.

What do you think?

Read Full Post »