Archive for the ‘trends’ Category

Yesterday I took the opportunity to attend CA Expo 10 in Sydney – CA’s annual talkfest.  Having been there in the past, I knew there would be some pearls to be gleaned on current and future directions in IT.

CA, once known as Computer Associates, now wants to be known as CA Technologies – returning the core emphasis to their brand name.  They’re one of the largest software and services organisations in the world, traditionally aimed at system management in the mainframe and enterprise market.

Their theme this year was cloud technology, which they took pains to portray as a step up from virtualisation.  The primary target of their presentations seemed to be Chief Information Officers – and anyone else influencing the IT spend.  Although they emphasised their point that The Cloud was an evolutionary move (not revolutionary!), their aim was clear: they wanted to scare the bejesus out of the CIO.

CA wants to ensure the CIO is aware of the large-scale changes in the wind, but not to worry: CA are there with the solutions.

In fairness, they drove home a number of meaningful points…

The keynote speech was given by Peter Hinssen, presented as a European technology writer/lecturer/strategist.  As a keynote, the intention was to both entertain and to flag CA’s themes. One of them was the cloud as a change of strategic emphasis: from efficiency to agility.  As context, he depicted a cultural trend from digital as a novelty to digital as the norm.  We’re “halfway there” he said, somewhat arbitrarily, but while we have been beavering away in the background over the past 15 years, digital has infiltrated the mainstream consumer end of society.  He rightly reminded us (who were mostly old enough to remember Pong and Space Invaders) that current entrants into the workforce had grown up surrounded by digital – and [some] could claim to have better technology at home than at work.

Ah, but I digress.  Hinssen depicted the cloud as not only Software as a Service, but Platform [and infrastructure] as a service.  Security issues require a change in thinking: from firewalling people out to a “conditional yes” – we will need partners and customers to integrate with our systems.  And he, too, wanted to scare the CIO: “You will now be known as the Cloud Interface Officer” (a “joke” repeated elsewhere).

But beneath the words, the structural changes are both daunting and complex.  CA were sometimes roundabout, sometimes direct.  EVP Ajei Gopal: there will be a transition from the “IT [department] as a monolithic supplier of services, to manager of a supply chain”.  (Therein lies the challenge.  It won’t be immediate, and it will never be comprehensive, but technology management will inevitably be forced to grapple with that changing role.  Is this the same as outsourcing?  In some ways yes, in some ways no – the change is likely to be far more gradual, as new functionality is quietly placed in a cloud domain, for example.)  CA’s Chris Dickson said competitive advantage comes from managing external resources.  (But that raises the question: how good is your capacity to manage external resources?)

But back to scaring the CIO. Gopal: “You will get a call from the CEO: What are we doing in the cloud?”  (The absolutely natural response to hearing that is to ensure there’s a proof-of-concept pilot in place, just to show IT has the concept on the map.)  Gopal wanted to demonstrate that CA had the answers, by listing a number of CA acquisitions in recent times, including Oblicore, Nimsoft, NetQoS, and 3tera, which he characterised as strategic to CA’s cloud focus.

The proof of the pudding, however… CA want to prove their capability, with their knowledge base, their software, their expertise.  They have security products, which they presented at length.  But a question from the floor stymied their chief security architect: on managing social networks spilling out from the workplace.  It’s a known challenge, and they’re working on it.

On networking: CA have set up a “Cloud Commons” community, where experiences can be exchanged and best practice shared.  They developed the infrastructure, but communities only work when a critical mass is reached.  Various product (and other) scores can be aggregated in this Commons, for example – but, as we see all the time, such features are only useful when enough people participate.

CA went into more detail on managing infrastructure, on security, on transition, all of which were meaningful, while at the same time saying “here is your problem, and we are your solution”.

Conceptually, their vision captures what the cloud is about.  The more your capacity is successfully abstracted into the cloud, the fewer points of failure there are to affect business with your partners and your customers.  But CA’s value propositions are large-scale, high-cost projects.  They’re reaching for the sky.  Yet in the short term, they may have to settle for hand-holding exercises in proof-of-concept.

All presentations from the day can be viewed here.

Addendum:  Gartner has just put out a release on SaaS.  Inter alia, it forecasts the SaaS market to grow to US$8.5 billion in 2010, from $7.5 billion in 2009.  Interestingly, they estimate that “75 percent of the current SaaS delivery revenue could be considered as a cloud service”, but that share will increase “as the SaaS model matures and converges with cloud services models”.  Further, they expect SaaS to comprise 26% of the CRM market in 2010 (due in no small measure to Salesforce.com, I’d say).  That’s likely to be the easiest route to a pilot cloud project at the moment.


Read Full Post »

A quick listing of HP’s latest analysis of trends within Business Intelligence:

1.  Data and BI program governance

– ie managing BI [and especially data] more strategically.

2. Enterprise-wide data integration

– recognising the value of such investment.

3. (the promise of) semantic technologies

– especially taking taxonomical (categorising) and ontological (relating) approaches to data.

4. Use of advanced analytics

– going beyond reporting/OLAP, to data mining, statistical analysis, visualisation, etc.

5. Narrowing the gap between operational systems and data warehouses

6. New generation, new priorities in BI and DW – ie updating BI/DW systems

– HP identifies renewals of systems, greater investment in new technology – perhaps in an emerging economic recovery context.

7. Complex event processing

– correlating many, varied base events to infer meaning (especially in the financial services sector)

8. Integrating/analysing content

– including unstructured data and external sources.

9. Social Computing [for BI]

– yet at the moment it takes great manual effort to incorporate such technology into BI

10. Cloud Computing [for BI]

You can find the full 60-minute presentation here.  HP noted that these points are very much inter-related.  I would also add a general tenor that I got from the discussion: that these are clearly more aspirational trends than widespread current initiatives.  HP’s research additionally highlighted the four most important current BI initiatives separately:

– data quality

– advanced analytics [again]

– data governance

– Master Data Management

Other current buzzwords, such as open source, Software as a Service, and outsourcing, didn’t emerge at the forefront of concerns.  For the first two, the comment was made that these were more background enabling technologies.  As for outsourcing, it looked like those who were going to do it had largely done it, and there was current stability around that situation.

Business Intelligence has obviously moved away from simple reporting from a single repository.   Concerns are now around data quality, integration/management – and making greater sense of it, particularly for decision-making.  Those trends are clear and current.  But I’d also like to note one small point almost buried in the above discussion: the use of external data sources.  Business value of data must inevitably move away from simple navel-gazing towards facing the whole of the world, and making business sense of it.  That’s a high mountain, and we’re only just becoming capable of moving towards that possibility in a meaningful way.

Read Full Post »

Again, this week I am gathering together a few reads that I have found to stick in my mind, for one reason or another.

The future of Analytics
The Data Warehouse Institute has a series of “Best Practice Reports”; a recent one is called Delivering Insights with Next-Generation Analytics.  It provides an analysis of the future of analytics, backed up with some survey results.  It characterises BI as central to analytics in a business context (and it’s hard to say what part of business analytics BI would not be involved in).  Reporting and monitoring remain crucial components of such activity, but TDWI places an emphasis on differentiating users of information and analytics, from production report consumers (wide in scope but terse in analytical focus) to the power-user analysts and managers concerned with forecasting and modelling.  The essence of its recommendations is to provide appropriate tools to the differentiated users, and to keep an eye on technology.  Although at a top level this isn’t exactly news, the report is packed with useful detail for those making an effort to keep on top of the intersection between business and technology.

The future of Data Warehouses
Although I had a look at some new technology in data warehousing recently, this second TDWI report (Next Generation Data Warehouse Platforms) is necessarily more systematic.  It models the DW technology stack, outlines new technology and business drivers, intersperses user stories, and outlines emerging trends (eg appliances, in-memory, cloud/SaaS, columnar, open source) not too different from my list.  Recommendations include: focusing on the business drivers; moving away from expensive in-house development; preparing for high-volume data; and anticipating multiple-path solutions, including open source.

In-memory databases
TDWI’s above report treated in-memory DWs seriously, without going into much detail on feasibility.  This is odd, given one of their recommendations involves preparing for an explosion in data to be stored.  I read a discussion on this technology (TDWI again: Q&A: In-memory Databases Promise Faster Results), which still doesn’t convince me that this isn’t a cat chasing its own tail.  The only realistic way forward I can see is by developing a dichotomy between core and peripheral data and functionality.  Haven’t seen that discussed.  Yet.
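To make that core/peripheral dichotomy concrete, here’s a minimal Python sketch of how such a split might look: frequently-used “core” data held in memory, with “peripheral” data fetched lazily from slower storage and promoted on first use.  The class and names are purely illustrative assumptions, not any vendor’s design.

```python
# Hypothetical sketch of a core/peripheral data split for an
# in-memory store. "Peripheral" storage is simulated by a callable
# (standing in for a disk or database fetch).

class TieredStore:
    def __init__(self, peripheral_lookup):
        self.core = {}                              # in-memory hot set
        self.peripheral_lookup = peripheral_lookup  # slow-path fetch

    def get(self, key):
        if key in self.core:
            return self.core[key]           # fast path: already in memory
        value = self.peripheral_lookup(key) # slow path: peripheral storage
        self.core[key] = value              # promote into the core
        return value

# Usage: a plain dict stands in for the peripheral (disk) tier.
disk = {"q1_sales": 1200, "q2_sales": 1350}
store = TieredStore(disk.get)
first = store.get("q1_sales")   # fetched from "disk", cached in memory
second = store.get("q1_sales")  # now served from the in-memory core
```

The open question the article raises still stands: deciding *which* data belongs in the core is the hard part, and none of the reports address it.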

Forrester on trends and spotting them
Forrester has a new report aimed at Enterprise Architects: The Top 15 Technology Trends EA Should Watch.  These are grouped into five themes: “social computing for enterprises, process-centric information, restructured IT service platforms, Agile applications, and mobile as the new desktop”.  Some of it is discussed here, by Bill Ives.  Further, Forrester gives an outline of the criteria it uses for paying attention to a technology: how meaningful it is in the near term, its business impact, its game-changing potential, and its integration complexity.

Vendor news: Oracle and bulk financials
Finally, news that Oracle has bought up again, this time taking over HyperRoll, whose software is geared for analysing “large amounts of financial data”.  Sounds a sensible move.

Read Full Post »

Why the buzz over columnar databases recently?  They’ve been around since the 1970s at least.  The field remains the realm of niche players, although Sybase has had such a product for years in Sybase IQ.

As far back as 2007, Gartner has been giving it a big tick.

Yet for some reason, I’ve been assailed by the concept from several disparate sources over the past month or so, some of which are heavy on the blah blah blah advantages but light on specifics such as reasons for those advantages.

I don’t pretend to have done the research, so I’ll just present a quick overview and links.  (At the very least, you can revert to Wikipedia’s article at top.)

In a nutshell, it is as the name says: data is stored by column rather than by row (though retrieval implementations seem to vary, with both row- and column-based querying variously supported).  Typified as more meaningful in an OLAP than an OLTP context, it is said to be particularly beneficial when frequently working with a small subset of columns in a table that has a large number of them.  And, you guessed it, aggregation particularly thrives.
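A toy Python illustration of the layout difference (not any vendor’s engine): the same small table stored row-wise and column-wise, and a sum over just one column of many.

```python
# Row-oriented layout: one record per row, all columns together.
rows = [
    {"id": 1, "region": "NSW", "sales": 100.0, "cost": 60.0},
    {"id": 2, "region": "VIC", "sales": 150.0, "cost": 90.0},
    {"id": 3, "region": "NSW", "sales": 120.0, "cost": 70.0},
]

# Column-oriented layout: one contiguous list per column.
columns = {
    "id": [1, 2, 3],
    "region": ["NSW", "VIC", "NSW"],
    "sales": [100.0, 150.0, 120.0],
    "cost": [60.0, 90.0, 70.0],
}

# Row store: every whole row is touched even though only 'sales' matters.
total_row_store = sum(r["sales"] for r in rows)

# Column store: only the 'sales' column is scanned; the rest is never read.
total_col_store = sum(columns["sales"])

assert total_row_store == total_col_store == 370.0
```

In a real column store the per-column layout also tends to compress well (long runs of similar values), which compounds the scan advantage for aggregation.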


  • There’s a simple overview in a blog by Paul Nielsen, who puts his hand up for an implementation in SQL Server;
  • There’s a small simulation in Oracle, in a blog by Hemant Chitale (with caveat in a comment);

Read Full Post »

For my money, Gartner’s and Forrester’s depictions of tools have broad equivalence. Their x-axes are Completeness of Vision and Strategy respectively; their y-axes are Ability to Execute and [strength of] Current Offering. Additionally, Forrester’s Wave helpfully spells out equivalence (of a sort), and sizes up market presence.

To compare, I looked at Gartner BI Q1 2008 and Forrester Q2 08 (periods which, to my knowledge, should not exhibit marked change). One should expect their analyses to have congruencies, but they do differ, sometimes significantly. They both accorded leadership to IBM [Cognos], SAP [Business Objects], Oracle [Hyperion/Siebel products], and SAS, but Gartner included Microsoft, which Forrester downgraded to second tier, along with MicroStrategy, Information Builders, and SAP Netweaver. Forrester had them rather clustered, whereas Gartner differentiated more strongly between current execution and vision, interestingly ranging them from current to future order as Microsoft, Cognos, BO, Oracle, SAS.

Gartner's BI magic quadrant, Q1 2009

Gartner’s Q1 2009 (summary and better quality image here) has them more clustered, yet differentiated. Cognos, Oracle, and SAS are ahead, with Microsoft and SAP back. On this take, Cognos has the best current offering, while SAS has better vision. (I note here that MicroStrategy, placed in the second tier, tends to perform particularly well in The BI Survey [OLAP Report] on customer satisfaction, which must count for something.)

Gartner observes, inter alia, a flattening of the market in terms of ROI and offering: bigger spends don’t yield greater satisfaction, and BI is becoming more accessible through open source, SaaS, and Microsoft.  But there’s a split in the market, between those going for a middleware solution (it will fit here) and those seeking a vendor capable of providing a fully integrated product set – which puts a context on those market consolidations.

The overall impression I get is that one cannot guarantee a clear leader as the manufacturers attempt to leapfrog each other with each new release. It’s also worth mentioning the change in the BI market over the past five years or so, from straightforward query/analysis/report to a plethora of tools (dashboards, scorecards, etc) for conveying that intelligence to the right people.

Of other interest for BI is: database, data integration, data quality, and collaboration tools.

In Data Warehousing, current analysis has few surprises. Forrester puts Teradata, IBM, and Oracle at the forefront, with Teradata slightly ahead due to the strength of its current offering. Standing back is Microsoft, still depicted as a leader due to their strategy more than their current offering. That takes us to collaborative tools, which go some of the way to explaining Microsoft’s strength, particularly due to their Sharepoint products. This week, someone from a Microsoft-focused shop told me he was not selling on the basis of SQL Server products so much as Sharepoint – because of its presentation presence, albeit back-ended by SQL Server (an analysis can be read with the graph here at Intelligent Enterprise).

Data quality tools? Gartner’s analysis has rather changed in the past six months, now putting DataFlux clearly ahead, with Informatica and IBM [DataStage] bunched behind (viewed here).

Data Integration? As of Sept-08, IBM/DataStage were at the front, with SAP/BO trailing, followed by SAS, with Microsoft and Oracle surprisingly far back in the field, due both to current offering and vision. Simple picture here.

Gartner’s BI-specific page has a lot of information to absorb; Forrester’s page is mostly just links to reports.

Accompanying analyses are intrinsic, but as Get Elastic points out, there’s no “one size fits all”.  You have to assess tools on their ability to meet business needs; choosing on the basis of leadership alone can burn fingers.

My experience is that every tool has its pain points, in terms of both capability and usability. The quality of implementation is probably a bigger determinant of success than toolset (amongst the general leaders, at least; niche players such as QlikView do not have the broad capabilities called for in a comprehensive tool). Like data mining, successful implementation is more likely with experienced implementers – that is, consultants. Yet there are traps there, too. I’ve seen consultant installs that have shown insufficient business insight, and/or have left behind insufficient documentation or transference of skills – either of which can deflate an initiative.

Moreover, it has to be accepted that BI is an ongoing project. If a consultant sets and an enterprise forgets, a couple of years down the track there will be significant atrophy of relevance. Business needs, expectations, and technological possibilities are constantly evolving. That latter is where product leadership has the most significance.

Read Full Post »

Computer Associates is one of the world’s largest providers of software and services (number 7 by software revenue, in the Software Top 100).

At this month’s day-long CA forum in Sydney (CA Expo 09), their CEO John Swainson gave us his vision of the future.  Following on from the last post, I thought it might be salient to drop this in, although it’s not aimed specifically at BI.

If you tried to guess the future of IT, you might think services-everything and/or convergent/pervasive systems/devices – and you’d not be far off.

Swainson’s list was:
1.  Virtualisation
2.  Convergent networks
3.  SOA
4.  Social networking in the enterprise
5.  Cloud computing
6.  Networked devices  [“Scott McNealy: The network is the computer”*.]

This is pretty much a vision of pervasive, service-oriented computing.  Infrastructure becomes sufficiently commodified that it’s more cost-effective to outsource most of it, and both enterprise and individual come to value their digital information more than anything else.  A dichotomisation, if you will, of Information and Technology.

But social networking in the enterprise?  Let’s not go overboard – viz the recent Harvard study of Twitter, for example, that found this buzz-of-the-month is far less used than its hype would suggest (the median Twitter user would tweet once – ever).

Yet the human networking paradigm is certainly spreading, and we can all envisage a place for it in the enterprise, for example:
– socialising knowledge (ie knowledge management);
– facilitating existing channels (ie Business As Usual improved);
– facilitating transparency (spreading information and understanding, which can greatly reduce the meetings treadmill).

So, it could be said that there’s nothing startling here.  Still, there is some fascination in experiencing trends converging in this information age.

*However, that phrase is credited originally to John Gage, and popularised by McNealy.

Read Full Post »