Archive for the ‘business intelligence’ Category

Yesterday I was involved in a few discussions about meeting business needs.

Well, that covers a multitude of sins.

Someone said that in his experience, gathering business requirements for BI yields either “give me exactly what I have already” or blue sky, ie “everything”.  That’s been pretty much my experience too, and it can signal that the stakeholder isn’t successfully engaged, perhaps because they don’t know what they can get, or because they don’t prioritise the exercise highly enough to put in the requisite effort.

Managing scope is another issue.  BI projects are especially susceptible to scope creep, for a number of reasons.  In particular, business stakeholders often engage only belatedly with the fuller range of opportunities presented to them.  This can be for rational reasons: early deliveries often trigger further ideas and needs – not to mention the realisation that you can deliver them something meaningful, cool even.

Still, scope needs managing one way or another.  Formalised signoffs are common, but what do you do with enhancement requests or incremental changes?  A trickle can become a steady stream.  In some situations I’ve seen a very strict policy taken: any further requirements can only be admitted via a subsequent project.  The most extreme case was a project underquoted by an external supplier, where the cost was fixed.  Black-letter adherence to a document can lead to poisonous – or at least cold – relationships, so usually there’s some tolerance allowed or built in.  Ideally, you’d quote in a bit of slack, over-deliver, make everyone happy, and generate further collaboration.

Then there’s business-as-usual BI.

Identifying opportunities for further BI development is not usually high on the agenda.  This is because of a familiar experience that was voiced yesterday: the six-month queue for new development.  Delivering business intelligence is more a matter of managing what’s being requested than drumming up work (how to get a six-month queue: drum up work).

Prioritising is necessary, but it isn’t the ultimate answer: it doesn’t shorten the queue, and you can guarantee that as a result some worthy requests will end up languishing in permanent limbo; somebody will be put offside.

Another common approach, which I favour wherever possible, is to foster skills loci in individual business units.  It’s often possible to identify someone in a given business area who has an analytical bent – who, by temperament, interest or both, is not only open to the idea but keen for the opportunity to extract and analyse themselves.

That’s a two-edged sword for several reasons.  Primarily: unfettered access can result in people building non-conforming versions of commonly-used metrics; some sort of auditing or filtering process needs to take place.

Mentioned yesterday was a forum of such power users, meeting monthly under the auspices of a BI professional.  Sharing experience and best practice is one aim, but it also helps to stay aware of the directions people are headed, their training needs, and resourcing levels.  I don’t think control should be an issue per se, but with workload decentralisation it’s easy to lose sight of how both toolsets and resources are being used, an understanding that is necessary when planning updates or changes to the environment or data.  Again, it remains important to keep an eye on the use of metrics, where possible via published – and updated – standards, with acknowledged business owners.  This model can become unwieldy when there is not at least centralised insight into the use of the data resources provided.
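One way to support that kind of oversight is a published registry of metric definitions, each with an acknowledged business owner, that power users compute against rather than re-deriving their own non-conforming variants.  A minimal Python sketch of the idea – all names here are hypothetical, not drawn from any product or organisation mentioned:

```python
# Hypothetical central registry of published metric definitions.
# Each metric carries an acknowledged business owner, a canonical
# formula, and a version so updates to the standard are visible.
METRIC_REGISTRY = {
    "churn_rate": {
        "owner": "Customer Operations",
        "formula": lambda closed, opening: closed / opening,
        "version": "2010-06",
    },
}

def conforming_metric(name, **inputs):
    """Compute a metric only via its published definition."""
    entry = METRIC_REGISTRY.get(name)
    if entry is None:
        raise KeyError(f"'{name}' is not a published metric; see its business owner")
    return entry["formula"](**inputs)

# A power user computes churn from the standard definition,
# rather than building a local, non-conforming variant.
rate = conforming_metric("churn_rate", closed=25, opening=500)
```

The point of the sketch is the discipline, not the code: the definition lives in one place, with an owner, and decentralised users consume it.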

I don’t think any of this is particularly new, but for various reasons it’s not always pursued with sufficient enthusiasm – on either side.  While it’s important to ensure people are reading off the same script, I don’t think that either business or IT interests are served by keeping BI skills within IT – with or without business analysts interfacing.  Even if there’s pushback from the business units, they have to acknowledge that they are their own subject matter experts, and shouldn’t abdicate that knowledge by delegating to those without a direct interest.

Read Full Post »

It’s hard to get through 2010 without stumbling across the term ‘agile’, which is being spilt everywhere.  Like most bandwagonesque ideas, the exact meaning is by turns carelessly mislaid, blithely trampled on, or deliberately stolen.

The origins of “agile software development” go right back to 2001 and the publication of the Agile Manifesto.  In theory, anything not referencing it is wilful, ignorant, or indifferent.  But language is organic; these things will happen.

The Wikipedia definition of agile software development accords with the Manifesto.  And an example of the breakout process comes from Maureen Clarry of CONNECT: “Confusing Agile with a capital A and agility is a common mistake. Agile as a methodology is a small piece compared to organizational agility. Closely related to that, we sometimes see BI organizations that use Agile methodology as an excuse so that they don’t have to define standards or document anything. This is another example of trading speed and adaptability for standardization and reuse. It does not need to be an either/or proposition.”

Ouch.  The battle lines are clearly drawn; it can’t be surprising to see it in the business intelligence arena.

This discussion will look at the capital A, which does have a definition.  As such, the Agile Manifesto is not for everyone.  Up front, its authors say:

“we have come to value:

Individuals and interactions over processes and tools
Working software over comprehensive documentation
Customer collaboration over contract negotiation
Responding to change over following a plan”.

That’s not motherhood – and it’s obviously not universally applicable.  Enterprise-level organisations will necessarily favour processes and tools, simply because they need good communication and integration between parts of the body to make it work – and grow – well.  In that context, the Manifesto could be seen as permission for a cancer to grow: it may be successful, but out of step with the rest of the body.  On the other hand, it may be good for pilot projects that don’t need tight integration with the body corporate.

The Agile Principles should be viewed in full here, but a short version could be summarised as:

  1. Highest priority: to satisfy the customer through early and continuous delivery
    of valuable software.
  2. Embrace changing requirements, even late in development.
  3. Deliver working software frequently.
  4. Business people and developers must work together daily.
  5. Build projects around motivated individuals, and resource them.
  6. Face-to-face meetings!
  7. Working software is the primary measure.
  8. Sustainable development: the ability to maintain a constant pace indefinitely.
  9. Continuous attention to good design.
  10. Simplicity: maximise the amount of work not done.
  11. Self-organising teams.
  12. Reflect as a team at regular intervals, on how to be more effective.

MIP, an Australian data management consultancy (the company that first brought MicroStrategy, Brio and Informatica to Australia), recently gave a presentation on Agility in its formal sense, in the context of presenting RED, a data warehouse development tool from the New Zealand company WhereScape.

WhereScape RED has:

– automatic creation of surrogate keys, load timestamps, etc;

– code generation, execution and re-execution;

– a source repository;

– change management/version control, including comparison tools;

– generated user and technical documentation, with auto-commenting, diagrams and glossaries;

– prototyping facilities;

– notification of data issues (although it is not a data quality tool per se, it uses an error mart).
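To make the first two features concrete: data warehouse code generators of this kind typically stamp each dimension table with a surrogate key and a load timestamp automatically, rather than leaving them to be hand-coded.  The following Python sketch is purely illustrative of that idea – it is not WhereScape’s actual output, and all names (the `dss_` column, the function) are hypothetical:

```python
def generate_dim_table_ddl(table, business_columns):
    """Generate DDL for a dimension table, in the style of a DW code
    generator: a surrogate key and a load timestamp are added for free."""
    cols = [
        f"{table}_key INTEGER IDENTITY(1,1) PRIMARY KEY",   # surrogate key
        *[f"{name} {dtype}" for name, dtype in business_columns],
        "dss_load_datetime TIMESTAMP",                      # load timestamp
    ]
    joined = ",\n  ".join(cols)
    return f"CREATE TABLE {table} (\n  {joined}\n);"

ddl = generate_dim_table_ddl(
    "dim_customer",
    [("customer_code", "VARCHAR(20)"),
     ("customer_name", "VARCHAR(100)")],
)
print(ddl)
```

The developer supplies only the business columns; the housekeeping columns arrive consistently across every table, which is where such tools earn their keep.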

MIP presented WhereScape RED as inextricably linked to Agile development: a simpler IDE than Microsoft’s Visual Studio, and an intuitive ETL tool.  Customers have quoted it as a “perfect complement” to SQL Server technology (though I can’t say how well it fits with other database technologies).

What I saw did look good.  It makes sense that it would suit an Agile development project.  I noted one caveat at the time: with such tools and methodology, it would be easy to lose the development thread in the process of rapid iteration.  A challenge, but not an insurmountable one, for the data professional.

Update 05-Aug-10: The Data Warehousing Institute’s World Conference has Agile as its theme.  Some of the adjunct discussions can be seen to muddy the waters somewhat (is it a methodology? a process? a style? – it depends on who’s talking, and how loose their language is).  An earlier discussion – “Being” Agile vs. “Doing” Agile – is salient, especially the comments.  One of the author’s own comments is worth repeating: that promoting Agile specifically on the basis of speed is “wrong-headed”:

“When speed is the primary objective, quality and value often suffer. But when the focus is on incremental value delivery (of high quality); increased productivity occurs naturally”.

Read Full Post »

A quick listing of HP’s latest analysis of trends within Business Intelligence:

1.  Data and BI program governance

– ie managing BI [and especially data] more strategically.

2. Enterprise-wide data integration

– recognising the value of such investment.

3. (the promise of) semantic technologies

– especially taking taxonomical (categorising) and ontological (relating) approaches to data.

4. Use of advanced analytics

– going beyond reporting/OLAP, to data mining, statistical analysis, visualisation, etc.

5. Narrowing the gap between operational systems and data warehouses

6. New generation, new priorities in BI and DW – ie updating BI/DW systems

– HP identifies renewals of systems, greater investment in new technology – perhaps in an emerging economic recovery context.

7. Complex event processing

– correlating many and varied base events to infer meaning (especially in the financial services sector).

8. Integrating/analysing content

– including unstructured data and external sources.

9. Social Computing [for BI]

– yet at the moment it takes great manual effort to incorporate such technology into BI.

10. Cloud Computing [for BI]
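Of these trends, complex event processing (point 7) lends itself to a concrete sketch: many low-level events are correlated, typically within a time window, to infer a single higher-level event.  A minimal illustration in Python – the event types, accounts and thresholds are all hypothetical:

```python
from collections import defaultdict

def detect_suspicious(events, threshold=3, window=60):
    """Correlate individual failed-login events per account, raising a
    higher-level alert when `threshold` failures fall within `window`
    seconds - the basic shape of complex event processing."""
    by_account = defaultdict(list)
    alerts = []
    for ts, account, kind in sorted(events):
        if kind != "login_failed":
            continue                     # ignore events we don't correlate
        times = by_account[account]
        times.append(ts)
        # keep only the failures inside the sliding window
        by_account[account] = times = [t for t in times if ts - t <= window]
        if len(times) >= threshold:
            alerts.append((ts, account))
    return alerts

events = [(0, "acct1", "login_failed"), (10, "acct1", "login_failed"),
          (20, "acct1", "login_failed"), (300, "acct2", "login_failed")]
alerts = detect_suspicious(events)
# acct1 has three failures within 60 seconds; acct2 only one
```

Production CEP engines stream rather than batch, but the inference step – many small events in, one meaningful event out – is the same.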

You can find the full 60-minute presentation here.  HP noted that these points are very much inter-related.  I would also add a general tenor that I got from the discussion: that these are clearly more aspirational trends than widespread current initiatives.  HP’s research additionally highlighted the four most important current BI initiatives separately:

– data quality

– advanced analytics [again]

– data governance

– Master Data Management

Other current buzzwords, such as open source, Software as a Service, and outsourcing, didn’t emerge at the forefront of concerns.  For the first two, the comment was made that these were more background enabling technologies.  As for outsourcing, it looked like those who were going to do it had largely done it, and there was current stability around that situation.

Business Intelligence has obviously moved away from simple reporting from a single repository.  Concerns are now around data quality, integration and management – and making greater sense of it all, particularly for decision-making.  Those trends are clear and current.  But I’d also like to note one small point almost buried in the above discussion: the use of external data sources.  The business value of data must inevitably move away from simple navel-gazing towards facing the whole of the world, and making business sense of it.  That’s a high mountain, and we’re only just becoming capable of moving towards that possibility in a meaningful way.

Read Full Post »

Coincidence: a scant twelve days after I discussed the contribution BI can make to process improvement, I found myself listening to Olivera Marjanovic drawing a similar connection between the two – from a more structured, organisational perspective.

Dr Marjanovic, an academic with the University of Sydney, has a focus on the integration of Business Process Management with Business Intelligence (a paper of hers on that subject can be found here).  At a recent TDWI meeting (abstract here), she aimed to present a roadmap for this integration.

Countering the business/data technology community’s traditional cynicism of academia, her talk was wide-ranging and stimulating.  I can only summarise some of what she said, because she raised many more discussion points than can be covered in a brief post – or captured in hurried notes.

BPM suffers a variety of both definitions and practices – and has changed over time – so it’s important to put the term in context.  Dr Marjanovic says in her abstract that BPM “has evolved beyond technologies for process automation and methodologies for process efficiency improvement”.  Her definitional synthesis (based on one I have seen usually attributed to the Aberdeen Group) is:

Business Process Management is the “identification, comprehension, management and improvement of business processes that involves people and systems, both within and across organisations”.

It’s a process-driven management philosophy, one where effectiveness is more crucial than efficiency (which is pointless if a process is not effective).  Technology is, of itself, insufficient: BPM arises from the interaction of strategy, people, processes and systems.  From the HR perspective, this should include training in:
– design thinking
– process innovation
– lateral thinking; but in particular
– learning how to learn.

Within knowledge management, Dr Marjanovic emphasised an organisation’s tacit knowledge – that is, experiential knowledge, that which conveys competitive advantage – over explicit knowledge, something that can easily be replicated [in other organisations].  This is the difference between

  • procedural business processes: highly structured decisions, tight information/decision coupling, and decision-centred BP improvement


  • practice-oriented business processes: unstructured decision-making, loose information/decision coupling, and collaborative knowledge-based improvement.

In this sense “knowledge management repositories don’t work” for process improvement – in that they date too rapidly (they are better suited for business continuity and risk management functions).

“The greatest value of BI comes from being increasingly embedded within the business processes themselves” – TDWI’s Best of BI, 2008.

Dr Marjanovic offered some lessons for Business Intelligence, which included:
– BI practitioners are not [sufficiently] trained in process improvement or process thinking in general
– BI training methods are still very traditional, and skill-based BI practitioners need advanced training methods to help them learn how to learn.

When I outlined my experience with process improvement through BI or data quality initiatives (mentioned a couple of weeks ago), Dr Marjanovic suggested this was not common practice.  She clarified: “What is not common is a systematic (rather than ad-hoc) application of BP improvement methodologies in the context of BPM/BI integration.”  That does not surprise me: it accords with my (anecdotal) experience that the two disciplines don’t often meet.  But as I’ve said before, if BI practitioners, by intention or direction, retain a narrow focus on BI-specific projects, both they and their organisation risk forgoing the value they could add in both process and data improvement.

Read Full Post »

This week I have pointers to three discussions I’ve been reading.

BI Workspace: another ‘future of BI’ is discussed here, traceable back to a report from industry analysts Forrester (executive summary here).  What it is: the concept of a fully navigable data environment, geared specifically to the power user who has sufficient understanding of the data and its business context to make rational use of the full extent of data exploration.

Data Quality as an issue of context: A discussion at the always-useful OCDQ on data quality being a wider issue than simply accuracy.  Data accuracy was fully acknowledged, but other dimensions were raised.  My contribution to the discussion focused (as usual) on the quality – fitness – of the data as a business resource: including timeliness, format, usability, relevance – and delivery mechanisms.  (To give the discussion its due, it was prompted by Rick Sherman’s report on a TDWI Boston meeting.)

Quality Attributes as a point of architecture: An ambitious point was raised for discussion on LinkedIn’s TDWI group.  The essence was a suggestion that data quality dimensions be defined as standards or architectural criteria when designing repositories.  Should standards such as ‘availability’, ‘portability’ and ‘recovery’ be built into a data repository’s initial design?  It sounds laudable, but how practical is it to define them in measurable detail?  How intrinsic should such measures be to such a project’s SLAs?
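As a thought experiment on the “measurable detail” question: a quality dimension such as timeliness can at least be expressed as a concrete check against an SLA deadline.  A hypothetical Python sketch – the table names and deadline are invented for illustration, not taken from any discussion above:

```python
from datetime import datetime

def timeliness_breaches(load_times, expected_by):
    """Check the 'timeliness' quality dimension: return the tables whose
    latest load arrived after the SLA deadline for the day."""
    return [table for table, loaded in load_times.items()
            if loaded > expected_by]

deadline = datetime(2010, 8, 5, 6, 0)   # data due in the repository by 06:00
loads = {
    "fact_sales": datetime(2010, 8, 5, 5, 40),     # on time
    "dim_customer": datetime(2010, 8, 5, 7, 15),   # late
}
late = timeliness_breaches(loads, deadline)
```

Whether dimensions like ‘portability’ or ‘recovery’ can be reduced to checks this crisp is exactly the practical doubt raised above; timeliness happens to be one of the easier ones.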

Finally, a comment by Atif Abdul-Rahman (blog Knowledge Works) on my previous post linking business intelligence to business process improvement.  Atif effectively said BI+EPM=BPM.  My first reaction was to treat it as spam 🙂   – what do you think?

Read Full Post »

What does Business Process Improvement have to do with BI?

Not much, if you read Wikipedia.  But I’m beginning to suspect that a large number of the Wikipedia articles that stand at the confluence of business and technology are written by management ‘experts’ practising for their next book, based on their part-time management studies.  Certainly most of those articles are arcane enough for those of us steeped in the practical application of technology.

Yet there are other ways to bring about improvement in business processes than by following a rigorous methodology imposed from on high.

On the one hand, it’s often possible to just walk into a business and identify candidates for improvement, if not processes that are thoroughly broken.  That’s not just because a fresh set of eyes helps, but because experience – enough experiences in different workplaces – helps to quickly identify both the work practices that are worth repeating and those that are plainly broken.

But that’s not what I’m talking about either.

It is this: a business’ data is a model of the business and its practices.  In turn, business intelligence is a process of accurately reflecting that business and its processes.  In endeavouring to do so, a good amount of business analysis is called for, to understand the business you are reflecting.  And that holistic engagement process has a habit of uncovering what is not working as expected in business processes, both in practice (when analysing what people are doing) and in virtualisation – because when the data is shown to be incorrect and/or not as expected, that mismatch tends to reflect business processes that are awry.

That is not necessarily a part of the brief of a business intelligence professional.  Yet with forward-thinking management, it can be.

But at the very least, business intelligence professionals are ideally placed to gain insight into both a business and the model of the business and, in identifying mismatches, to foster improvements in business processes.  It would be negligent to waste such opportunities.

Read Full Post »

Older Posts »