Archive for October, 2009

Again this week, I am gathering together a few reads that have stuck in my mind, for one reason or another.

The future of Analytics
The Data Warehousing Institute (TDWI) has a series of “Best Practice Reports”; a recent one is called Delivering Insights with Next-Generation Analytics.  It provides an analysis of the future of analytics, backed up with some survey results.  It characterises BI as central to analytics in a business context (and it’s hard to say what part of business analytics BI would not be involved in).  Reporting and monitoring remain crucial components of such activity, but TDWI places an emphasis on differentiating users of information and analytics, from production report consumers (wide in scope but terse in analytical focus) to the power-user analysts and managers concerned with forecasting and modelling.  The essence of its recommendations is to provide appropriate tools to the differentiated users, and to keep an eye on technology.  Although at a top level this isn’t exactly news, the report is packed with useful detail for those making an effort to keep on top of the intersection between business and technology.

The future of Data Warehouses
Although I had a look at some new technology in data warehousing recently, this second TDWI report (Next Generation Data Warehouse Platforms) is necessarily more systematic.  It models the DW technology stack, outlines new technology and business drivers, intersperses user stories, and outlines emerging trends (e.g. appliances, in-memory, cloud/SaaS, columnar, open source) not too different from my list.  Recommendations include: focusing on the business drivers; moving away from expensive in-house development; preparing for high-volume data; and anticipating multiple-path solutions, including open source.

In-memory databases
The TDWI report above treats in-memory DWs seriously, without going into much detail on feasibility.  This is odd, given that one of its recommendations involves preparing for an explosion in data to be stored.  I read a discussion on this technology (TDWI again: Q&A: In-memory Databases Promise Faster Results), which still doesn’t convince me that this isn’t a cat chasing its own tail.  The only realistic way forward I can see is to develop a dichotomy between core and peripheral data and functionality.  Haven’t seen that discussed.  Yet.
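To make that core/peripheral dichotomy concrete, here is a minimal sketch (in Python) of one way it might work: hot “core” rows stay in memory, colder rows are demoted to disk and promoted back on access.  The class, the capacity threshold and the use of the standard-library shelve module are all my illustrative assumptions – nothing here comes from the TDWI material.

    import shelve
    from collections import OrderedDict

    class TieredStore:
        """Keep hot 'core' rows in memory; demote cold 'peripheral' rows to disk."""
        def __init__(self, path="peripheral.db", core_capacity=10000):
            self.core = OrderedDict()        # in-memory tier (core data)
            self.disk = shelve.open(path)    # on-disk tier (peripheral data)
            self.core_capacity = core_capacity

        def put(self, key, row):
            self.core[key] = row
            self.core.move_to_end(key)       # mark as most recently used
            while len(self.core) > self.core_capacity:
                cold_key, cold_row = self.core.popitem(last=False)
                self.disk[str(cold_key)] = cold_row   # demote the coldest row

        def get(self, key):
            if key in self.core:
                self.core.move_to_end(key)   # keep hot data hot
                return self.core[key]
            row = self.disk.pop(str(key))    # fault in from disk...
            self.put(key, row)               # ...and promote back to core
            return row

The hard part – the bit I haven’t seen discussed – is deciding, per business, which data and functionality count as core.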

Forrester on trends and spotting them
Forrester has a new report aimed at Enterprise Architects: The Top 15 Technology Trends EA Should Watch.  These are grouped into five themes: “social computing for enterprises, process-centric information, restructured IT service platforms, Agile applications, and mobile as the new desktop”.  Some of it is discussed here, by Bill Ives.  Further, Forrester outlines the criteria it uses for deciding whether a technology deserves attention: how meaningful it is in the near term, its business impact, its game-changing potential, and its integration complexity.

Vendor news: Oracle and bulk financials
Finally, news that Oracle has been buying again, this time taking over HyperRoll, whose software is geared to analysing “large amounts of financial data”.  Sounds a sensible move.


Coincidence: a scant twelve days after I discussed the contribution BI can make to process improvement, I found myself listening to Olivera Marjanovic drawing a similar connection between the two – from a more structured, organisational perspective.

Dr Marjanovic, an academic at the University of Sydney, focuses on the integration of Business Process Management (BPM) with Business Intelligence (a paper of hers on that subject can be found here).  At a recent TDWI meeting (abstract here), she aimed to present a roadmap for this integration.

Countering the business/data technology community’s traditional cynicism towards academia, her talk was wide-ranging and stimulating.  I can only summarise some of what she said, because she raised many more discussion points than can be covered in a brief post – or captured in hurried notes.

BPM suffers from a variety of definitions and practices – and has changed over time – so it’s important to put the term in context.  Dr Marjanovic says in her abstract that BPM “has evolved beyond technologies for process automation and methodologies for process efficiency improvement”.  Her definitional synthesis (based on one I have seen usually attributed to the Aberdeen Group) is

Business Process Management is the “identification, comprehension, management and improvement of business processes that involves people and systems, both within and across organisations”.

It’s a process-driven management philosophy, one where effectiveness is more crucial than efficiency (efficiency is pointless if a process is not effective).  Technology is, of itself, insufficient: BPM’s value comes from the interaction of strategy, people, processes and systems.  From the HR perspective, this should include training in:
– design thinking
– process innovation
– lateral thinking; but in particular
– learning how to learn.

Within knowledge management, Dr Marjanovic emphasised an organisation’s tacit knowledge – that is, experiential knowledge, which conveys competitive advantage – over explicit knowledge, which can easily be replicated [in other organisations].  This is the difference between

  • procedural business processes: highly structured decisions, tight information/decision coupling, and decision-centred BP improvement

and

  • practice-oriented business processes: unstructured decision-making, loose information/decision coupling, and collaborative knowledge-based improvement.

In this sense “knowledge management repositories don’t work” for process improvement – in that they date too rapidly (they are better suited for business continuity and risk management functions).

“The greatest value of BI comes from being increasingly embedded within the business processes themselves” – TDWI’s Best Of BI, 2008.

Dr Marjanovic offered some lessons for Business Intelligence, which included:
– BI practitioners are not [sufficiently] trained in process improvement or process thinking in general
– BI training methods are still very traditional, and skill-based BI practitioners need advanced training methods to help them learn how to learn.

When I outlined my experience with process improvement through BI or data quality initiatives (mentioned a couple of weeks ago), Dr Marjanovic suggested this was not common practice.  She clarified: “What is not common is a systematic (rather than ad-hoc) application of BP improvement methodologies in the context of BPM/BI integration.”  That does not surprise me: it accords with my (anecdotal) experience that the two disciplines don’t often meet.  But as I’ve said before, if BI practitioners, by intention or direction, retain a narrow focus on BI-specific projects, both they and their organisation risk forgoing the value they could deliver in both process and data improvement.


This week I have pointers to three discussions I’ve been reading.

BI Workspace: another ‘future of BI’ is discussed here, traceable back to a report from industry analysts Forrester (executive summary here).  What it is: the concept of a fully navigable data environment, geared specifically to power users who understand the data and its business context well enough to make rational use of full data exploration.

Data Quality as an issue of context: A discussion at the always-useful OCDQ on data quality being a wider issue than simply accuracy.  Data accuracy was fully acknowledged, but other dimensions were raised.  My contribution to the discussion focused (as usual) on the quality – fitness – of the data as a business resource: including timeliness, format, usability, relevance – and delivery mechanisms.  (To give the discussion its due, it was prompted by Rick Sherman’s report on a TDWI Boston meeting.)
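As a toy illustration of that fitness framing: a record can be perfectly accurate yet still fail on timeliness, format or usability.  The field names and rules below are invented for the example, not drawn from the OCDQ discussion.

    import re
    from datetime import datetime, timedelta

    def quality_report(record, max_age_days=30):
        """Pass/fail checks on fitness dimensions beyond accuracy."""
        return {
            "timeliness": datetime.now() - record["updated"] <= timedelta(days=max_age_days),
            "format": bool(re.fullmatch(r"\d{4}-\d{2}", record["period"])),
            "usability": bool(record.get("customer_name", "").strip()),
        }

    record = {"updated": datetime(2009, 10, 1),
              "period": "2009-09",
              "customer_name": "Acme Ltd"}
    print(quality_report(record))   # an accurate record still fails timeliness as it ages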

Quality Attributes as a point of architecture: An ambitious question was raised for discussion on LinkedIn’s TDWI group.  The essence was a suggestion that data quality dimensions be defined as standards or Architectural Criteria when designing repositories.  Should standards such as ‘availability’, ‘portability’ and ‘recovery’ be built into a data repository’s initial design?  Sounds laudable, but how practical is it to define them in measurable detail?  How intrinsic should such measures be to such a project’s SLAs?
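As a sketch of what defining such standards “to measurable detail” might look like: give each quality attribute a numeric target up front, so the SLA question becomes a pass/fail check against measurements.  The attribute names and targets below are illustrative assumptions only, not any published standard.

    # Illustrative architectural quality targets for a data repository.
    TARGETS = {
        "availability": 0.999,     # minimum fraction of successful queries
        "recovery_mins": 60,       # maximum minutes to restore after a failure
        "portability_pct": 90,     # minimum % of schema in portable ANSI SQL
    }

    def evaluate(measured):
        """Compare measured values against targets (recovery: lower is better)."""
        return {
            "availability": measured["availability"] >= TARGETS["availability"],
            "recovery_mins": measured["recovery_mins"] <= TARGETS["recovery_mins"],
            "portability_pct": measured["portability_pct"] >= TARGETS["portability_pct"],
        }

    print(evaluate({"availability": 0.9995, "recovery_mins": 45, "portability_pct": 85}))
    # {'availability': True, 'recovery_mins': True, 'portability_pct': False}

Whether numbers like these belong in a project’s SLAs is exactly the open question.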

Finally, a comment by Atif Abdul-Rahman (blog Knowledge Works) on my previous post linking business intelligence to business process improvement.  Atif effectively said BI+EPM=BPM.  My first reaction was to treat it as spam 🙂   – what do you think?
