
Archive for May, 2009

Interesting to read of an I.T. “glitch” in healthcare in Canada. Reported here, no doubt it made the rounds as one more example of computers gone wrong.

The story:

Recently, a doctor called Saskatoon Health Region’s medical imaging department to find out why he hadn’t received his patient’s test results. After some investigation, it was found that a fax machine had not sent them out. That machine was part of their automated Radiology Information System. Further investigation revealed that that particular machine [or the system’s link to it] had been inoperative for about ten years. Result: at least 1,380 test results had not been sent out.

On the one hand, it’s possible to say that the failure rate is low: nearly 1,400 out of about two million communications missed, a rate of only about 0.07%. And doubtless a good number of those missed communications would have resulted in follow-ups, closing the loop the second time around.

But on the other hand, this is a medical situation, with health and lives at stake. There should be greater certainty – particularly where manual error is ostensibly eliminated.

Sure, the doctors involved are likely all highly-paid professionals, who might be expected to chase up missing results themselves. But I’ve heard that line before, and my experience working with a community of highly-paid professionals demonstrates that high pay does not ipso facto lead to quality outcomes.

What does this have to do with business intelligence?

BI has by now evolved to cover a wide variety of information solutions, with data and its distribution managed centrally (in theory at least), and available via a plethora of channels. What was once simply reporting now encompasses alerts and dashboards in particular, as well as varied options for visualisation, analysis, and tools of information empowerment for business stakeholders.

But how do we know what should be there a) is there, and b) gets communicated to the right people?

When reading the above medical anecdote, I am immediately reminded of alerts in BI systems. Alerts only sound when there is something wrong. Yet the above is a good illustration of an automated system that is validated so infrequently that a full decade can pass before a breakdown is discovered. To the business manager: how do you know you will be alerted as you expect – as you originally specified when the BI system was implemented?

My experience with databases is that information that is frequently touched (made use of) at the BI consumer end is more likely to be exposed to quality management than data that is, say, languishing unexplored until the odd occasion that it is called up – and too frequently found to be lacking.

Likewise at a meta level. Alerts are a great way of automating – outsourcing – the checking of problem situations. But for a business manager to take the absence of alerts as assurance that all is well really raises the question: how confident can you be that the automation is working as expected?

This level of confidence is generally outsourced – to those who assure the manager they have the answer to their needs: the BI manager (and/or implementer).

It is incumbent on the BI implementer to test their build sufficiently. Yet business changes, data changes, and it remains incumbent on the BI manager to ensure their systems maintain that degree of integrity.

Especially for alerts, which only show when there is an issue, the BI professional needs to be vigilant.
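
By way of illustration only – a rough Python sketch, with hypothetical table names standing in for whatever the real pipeline uses – one way to stay vigilant is a scheduled “heartbeat” check: inject a synthetic breach and confirm that an alert actually comes out the other end, so that silence can be trusted.

    # Sketch of a "heartbeat" check for an alerting pipeline (illustrative only;
    # the metrics/alert_queue tables are hypothetical stand-ins).
    import sqlite3
    from datetime import datetime

    def evaluate_alerts(conn):
        """Stand-in for the real alert engine: flag any metric breaching its threshold."""
        conn.execute("""
            INSERT INTO alert_queue (metric, observed, recorded_at)
            SELECT name, value, ?
            FROM metrics
            WHERE value > threshold
        """, (datetime.utcnow().isoformat(),))
        conn.commit()

    def heartbeat_check(conn):
        """Inject a synthetic breach and confirm the pipeline actually raises an alert."""
        conn.execute("INSERT INTO metrics (name, value, threshold) VALUES (?, ?, ?)",
                     ("__heartbeat__", 999, 1))  # guaranteed to breach its threshold
        conn.commit()
        evaluate_alerts(conn)
        count = conn.execute(
            "SELECT COUNT(*) FROM alert_queue WHERE metric = '__heartbeat__'"
        ).fetchone()[0]
        if count == 0:
            # Silence here means the machinery is broken, not that all is well.
            raise RuntimeError("Heartbeat alert never arrived: check the alert pipeline")
        print("Alert pipeline verified at", datetime.utcnow().isoformat())

    if __name__ == "__main__":
        conn = sqlite3.connect(":memory:")
        conn.execute("CREATE TABLE metrics (name TEXT, value REAL, threshold REAL)")
        conn.execute("CREATE TABLE alert_queue (metric TEXT, observed REAL, recorded_at TEXT)")
        heartbeat_check(conn)

Run something like this on a schedule, and the absence of genuine alerts becomes meaningful again.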



There’s been some fierce publicity (try here or here, eg) around the release of Wolfram Alpha, touted as a new kind of search engine.

But is it a search engine?

Perhaps it’s being marketed as such to appeal to the average internet user’s comfort zone – perhaps also in an attempt to redirect traffic from one of the web’s most popular sites (Google, of course).

But in concept, it would seem to be more of an intersection between Google and Wikipedia – an attempt to provide answers from its own compendium of knowledge (certified in a way that Wikipedia’s store is not).

The answers it provides often come through simple database query. Yet it intends to integrate (potentially disparate) information from different data sources, giving presentation-level answers to the user.

Sound like business intelligence?

My common understanding of business intelligence is a toolset that allows users to query, analyse, and report from a data source. Is this not what Wolfram Alpha does?

Usually BI is understood as the navigation of an organisation’s own, internal data – but that is really only an issue of access and availability. I for one would like the opportunity to integrate internal information with any available external information (with the caveat of confidence in the quality of the source data). Wolfram clearly does not provide hermetic company information – but in a conceptual sense, could it provide what would be asked of business intelligence tools?…
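
To make that concrete, here is a toy sketch (the table, the sales figures and the population numbers are all invented for illustration) of blending an internal sales table with an externally sourced data set to produce a presentation-level answer:

    # Toy sketch: blend internal data with an external source (all figures invented).
    import sqlite3

    # "Internal" data: a small in-memory sales table.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE sales (region TEXT, revenue REAL)")
    conn.executemany("INSERT INTO sales VALUES (?, ?)",
                     [("NSW", 1200.0), ("VIC", 950.0), ("QLD", 610.0)])

    # "External" data: population figures as they might be pulled from a public source.
    population = {"NSW": 7.0e6, "VIC": 5.2e6, "QLD": 4.4e6}  # hypothetical figures

    # Presentation-level answer: internal revenue joined with external population.
    for region, revenue in conn.execute("SELECT region, revenue FROM sales"):
        per_thousand = revenue / (population[region] / 1000)
        print(f"{region}: ${per_thousand:.2f} revenue per 1,000 residents")

Whether the external half comes from a public dataset or something like Wolfram is, conceptually, beside the point – the caveat about confidence in the source still applies.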

Performance? Always an issue – unless the toolset is tuned well enough that there are no complaints from business stakeholders. In this case, the matter is fully externalised… rather like cloud computing, except that the end-user has no input into response times. This could be an issue for complex requirements.

The interface. The end user would seem to have little to no control over the presentation layer – a definite minus. As for query structure, it’s intended to be natural language – which can be a boon or a barrier, depending on the user. I would not be surprised to find Wolfram permitted a more syntactically pedantic query language…

And therein maybe lies a departure: we are exposed neither to a rigorous syntactical query language nor – more importantly, in some ways – to the format of the source data: its extent and its limitations. Thereby it becomes more an information-discovery process than a deterministic information provider: if we don’t know with precision the extent or dynamism of the source data, we cannot necessarily know we will get like information from like queries.

A devil’s advocate could accuse me of nit-picking. As a BI professional, I have a strong preference for understanding the source data in the development of any information system. But I find it hard to argue that that preference is a rigid necessity for the provision of business information/intelligence.

At first glance, Wolfram doesn’t seem to be geared for the strict world of BI. More broadly, I can see a space for a Wolfram appliance, sold for internal use, but with access to the external Wolfram data stores.

Why not? Such an appliance wouldn’t fill every business need. But it would address a few that are not currently met. A bit more access to data structures, query process, and presentation level, and it could compete admirably in the BI market.

So yes, in a sense I do see it as a BI tool – of sorts, and in potential. But more than that, it’s in danger of expanding – exploding – our understanding of business intelligence. Perhaps taking it more to where it should be.

Postscript: throughout this discussion I declined to mention Wolfram’s limitations. It is rather appalling as a generalised search engine – unless your queries are very standard high school fare (and you’re American). Yet I believe it has a sweet spot, which again would come from having a reasonable understanding (even just a listing) of its data sources.


This year’s CeBIT exhibition was a bit more staid than it had been in the past.

One year, BlackBerrys were the buzz, and there were lots of giveaways. Another year, gadgets abounded. Last year, every second stall seemed to have a prize giveaway in exchange for a business card. But this year, it was just business; companies may have been seeking genuine leads rather than a scatter-gun collection of cards.

But some things will never change. Like the specious claims on business intelligence.

I’d say each exhibitor was asked to tick a set of boxes to indicate which category/categories they wanted to appear under. This year, 34 exhibitors ticked the ‘Business Intelligence’ box, which wouldn’t have been much different from previous years.

This year, I sought to test their claims somewhat more systematically. And largely, I found the BI connection tenuous at best – as expected. On the one hand, it could be said to reflect the level of understanding of business intelligence among businesses at a broad level – alternatively, it could have been more a marketing effort than accurate categorising.

There were a number of general (or specific) I.T./web business service companies making the claim, along with the CRM/ERP vendors – i.e. the usual suspects.

Which might seem to raise the question ‘what is business intelligence?’ But at the very least, one should expect the ability to query/report/analyse the business data that their software collected. In most cases, for non-out-of-the-box reporting it was a matter of “come back to us post-sales, and we’ll customise to your requirements”. Sorry, that ain’t it.

Those in the BI category could be broken down into companies with no connection with BI, those with a tenuous relationship, and a small handful for whom the term was genuinely meaningful.

Frequently, those that did understand BI simply pointed to APIs or an ability to configure/integrate with some of the more common BI toolsets. And there were few that could boast a genuine connection to BI.

Pronto was one of them. An Australian vendor of a nicely configurable ERP suite, they said they slotted into the marketplace between MS Dynamics and SAP – i.e. the gap between small business and enterprise-level organisations.

That segment would seem to be a good earner – which Pronto’s competitors at each end are doubtless also gunning for.

Pronto’s analytics section was actually an OEM of another, more squarely BI product, Panorama (which seems to have next to no direct presence in Australia). But fair cop, the person I was talking to did understand BI, and how their product related to it, even at the back end.

Then there was Integeo, which links GIS software with BI. One Source is a content provider: their service aggregates data from a wide variety of sources, but as they described the interface – a manual logon/search and extract to Excel format – it sounded much too clunky to be integrated into BI solutions. Sensis, of course, provides directory/information services, but I’m not aware of any capability of theirs to tightly integrate with BI solutions (admittedly I didn’t get to talk to them). Wave seems to be a consultancy that provides end-to-end IT solutions (in partnership with vendors) including, yes, BI – via Calumno.

Maybe I should have spent more time on this – but I don’t know how much more wading through a morass of claims it would have taken to gain a few more grains of insight. One can investigate company after company to find which ones do not encompass BI – but that’s the wrong end of the stick. It would be helpful to have an industry-specific event – but we don’t really get further than the vendor-specific.

Sadly, it’s too much of a niche for there to be a BI equivalent of CeBIT, in Australia at least.


I mentioned before that my first exposure to business intelligence was via the Brio toolset. I certainly had my frustrations with it – crashing, not being able to achieve what I wanted with the tools, and the sometimes slow response.

Crashing became far less of a problem from version 6.4. Slow query response is often enough an issue with the data source – optimising the database for OLAP is a good start. Ample memory is also an enormous help in this area.
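
One common first step – a sketch only, with hypothetical table names – is to pre-aggregate the detail (fact) data into a summary table, so that analytical queries read a handful of rows rather than scanning every transaction:

    # Sketch: pre-aggregate a detail table into a summary for OLAP-style queries.
    # Table and column names are hypothetical; the same idea applies in any warehouse.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE fact_sales (sale_date TEXT, product TEXT, amount REAL)")
    conn.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)", [
        ("2009-05-01", "widget", 10.0),
        ("2009-05-01", "gadget", 25.0),
        ("2009-05-02", "widget", 12.5),
    ])

    # Reports then read the summary instead of scanning every sale.
    conn.execute("""
        CREATE TABLE summary_sales AS
        SELECT sale_date, product, SUM(amount) AS total_amount, COUNT(*) AS sale_count
        FROM fact_sales
        GROUP BY sale_date, product
    """)
    # An index on the grouping columns keeps lookups against the summary fast.
    conn.execute("CREATE INDEX ix_summary ON summary_sales (sale_date, product)")

    for row in conn.execute("SELECT * FROM summary_sales ORDER BY sale_date, product"):
        print(row)

In a real warehouse the equivalent would be materialised views or aggregate tables refreshed on a schedule.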

But then, I longed to try some of the other tools that were out there and buzzy – Cognos, for example. Little did I appreciate that it had its own frustrations. And that there were several virtues to the Brio tool: although geared only to ROLAP (relational) querying, it had all the various conceptual layers (user interface, data model, query, data, analysis) in a clean interface, and a pivot section that was very useful for data analysis. It is particularly useful for exploring relational tables and the data therein, although – unsurprisingly – it helps to have a good understanding of the schema under navigation.

Brio had a healthy market presence in Australia, but not as strong in its home base, the US. In the frenzy of BI consolidations over the past 10 years, it got swallowed up by Hyperion, which in turn was bought by Oracle.

Therein lies the rub. Oracle is the software equivalent of an industrial conglomerate, and has made quite a few purchases over the years. These included, notably, PeopleSoft (a particularly hostile takeover), Siebel, Hyperion, BEA, and the newest takeover, Sun Microsystems.

Once the mergers down the line are taken into account, it should be apparent that Oracle would find itself with a lot of duplication of functionality – and it did, several times over.

Along the way, Brio was successively renamed and wrapped into other products. It found its way into Hyperion Performance Suite (as Hyperion Intelligence), then to become buried within OBIEE (Oracle Business Intelligence Enterprise Edition), as Hyperion Interactive Reporting.

Although as HIR it faces both obscurity and competition from its OBIEE stablemate, the erstwhile Siebel Analytics, the old Brio is still getting decent write-ups. The OLAP Report, referring to Brio as a “pioneer of interactional relational analysis”, praises its “ease-of-use on the one hand, and basic-level security features on the other”, positioning it at a departmental level, as an “ideal smaller scale solution”. HIR/Brio also gets mentions in IT-Toolbox from time to time, particularly in some of the discussion groups (IT-Toolbox also has a Wiki section on Brio, mainly a set of How-Tos). There is also a BI blog with a useful post or two on it.

Indeed, the market has moved on, and BI has expanded in a number of directions. But the core of Brio, as Oracle’s Hyperion Interactive Reporting, is a zippy little tool that will certainly maintain its fans.
