From: JTP PR on
If you have been tracking the BI industry for a while, you will have
noticed that in-memory analysis is all the rage. That is because it is
more than a techno fad. In-memory analysis really does matter.

And here are 5 reasons why.

[1] Fast BI deployment at low cost

Let's face it: most organizations are not blessed with plenty of
resources (time and money) for BI projects. Traditional data
warehousing can be resource intensive – with a dismal ROI. What in-
memory analysis allows is the rapid development and deployment of
analytical data sets that can be used for reporting and analysis.

The speed of development means that you can get reporting quickly into
the hands of those who need it to take action – whilst the issues
uncovered are still fresh and relevant for the organization. This
agility means that as the organization's competitive environment
changes, the reporting can quickly adapt to new needs, without the
burden of high development costs and long delivery cycles.

[2] Disposable BI – the art of analysis

One of the dichotomies we are seeing in the BI space is the
distinction between reporting and analysis. Most vendors use the two
terms interchangeably – but they are very different beasts.

Reporting is an ongoing and regimented process. Performance measures
are defined, targets set, then reports written to track these over
time. And let's face it, there can be hundreds or thousands of these
in an organization. Reports are used to drive actions within an
organization; they help people fulfill the obligations of their role.

Analysis, on the other hand, is a creative exercise. Whether they
intend it or not, people who analyze data are looking to uncover new
insights, discover correlations and use data to assist strategic
decision making: what to do next, where to focus attention.

The interesting part of analysis is that in reality it does not have
to be repeated ad infinitum. Once an insight has been gained, and
proved, that analysis can be relegated to history. A new strategy is
created and the organization moves on.

What this means is that most true analysis is disposable. You want to
build a data set, test your theories, make decisions and then move
on to the next problem at hand. In-memory analysis is great for this,
since you do not have to build a data warehouse or invest in massive
amounts of infrastructure to get your data into shape for the
analysis.

[3] Proof of Concepts and Iterative / Agile Development Cycles

One of the great issues in BI is the risk of building a massive data
warehouse before you actually write a single report. This means that
business users have little visibility into what they will get from a
BI project until it is delivered.

It’s really hard to change a data warehouse. It is not hard to change
an in-memory database. Why? You do not have to aggregate your data
(see below).

In-memory analysis means that IT can work side by side with the
business to develop a BI solution. It is so fast that a business user
can see reports and then pose more questions, pull in more data, test
reports, change logic and so on.

This iterative and agile development means that lengthy up-front
analysis is no longer needed, since the downstream risk is so low. So
get out there, and start building, showing and improving
continuously.

[4] Great for large data volumes

OK, so time for some tech stuff. In-memory solves the problem of slow
database response times caused by large data volumes. Basically, not
many people have all day for a query to run – so traditionally you
had to build a data warehouse: extract the data from your source
system, aggregate it (that is, roll it up into manageable volumes),
then report on the aggregate. Wow, that was a lot of work just to
describe. With in-memory you can be lazy. Load the lot into memory and
report off that. That's what in-memory databases are for. They love
lots and lots of data.
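
To make that concrete, here is a minimal sketch of the load-it-all-and-
query-it idea. It uses Python's built-in sqlite3 module in :memory:
mode purely as a stand-in (not any particular vendor's engine), and the
sales table and figures are made-up examples:

  import sqlite3

  con = sqlite3.connect(":memory:")   # the whole database lives in RAM
  con.execute("CREATE TABLE sales (region TEXT, product TEXT, amount REAL)")

  detail_rows = [                     # stand-in for a bulk extract of detail data
      ("EMEA", "Widgets", 120.0),
      ("EMEA", "Gadgets",  75.5),
      ("APAC", "Widgets",  60.0),
  ]
  con.executemany("INSERT INTO sales VALUES (?, ?, ?)", detail_rows)

  # No pre-built aggregate tables: roll-ups are computed at query time
  # against the full detail data held in memory.
  for region, total in con.execute(
          "SELECT region, SUM(amount) FROM sales GROUP BY region"):
      print(region, total)

The point is simply that the roll-up happens at query time against the
full detail data, instead of being pre-built in a warehouse.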

[5] Great for complex data queries

Part 2 of the tech stuff. Apart from lots of data, the traditional
database designs used for your core business activities hate complex
queries. Why are BI queries complex? Because they join multiple tables
together to make sense of all the data you have collected in your CRM
or ERP, and view it over time.

These are not the single-transaction queries for which your
application (and its database) were designed.

So now you have a large volume of data and a complex query – another
reason you need a data warehouse, right? Wrong: another key reason
that in-memory matters. OK, there is a trick: your data still comes
out of your transactional system, but only once. Then it is loaded
into your in-memory db and is ready for analysis.
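
To show the contrast, here is a rough sketch of a single-row
transactional lookup versus the kind of multi-table, over-time join a
BI query needs. The CRM-style tables and columns are hypothetical, and
sqlite3 in :memory: mode is again just a stand-in for an in-memory
database:

  import sqlite3

  # Stand-in in-memory copy of data extracted once from the source system.
  con = sqlite3.connect(":memory:")
  con.executescript("""
      CREATE TABLE customers   (customer_id INTEGER, region TEXT);
      CREATE TABLE orders      (order_id INTEGER, customer_id INTEGER,
                                order_date TEXT, status TEXT);
      CREATE TABLE order_lines (order_id INTEGER, product_id INTEGER,
                                quantity INTEGER, unit_price REAL);
      CREATE TABLE products    (product_id INTEGER, category TEXT);
      INSERT INTO customers   VALUES (1, 'EMEA');
      INSERT INTO orders      VALUES (10, 1, '2010-05-01', 'shipped');
      INSERT INTO order_lines VALUES (10, 100, 2, 9.50);
      INSERT INTO products    VALUES (100, 'Widgets');
  """)

  # Transactional query: one indexed row, the work the source system is tuned for.
  print(con.execute("SELECT status FROM orders WHERE order_id = ?", (10,)).fetchone())

  # BI query: a multi-table join aggregated over time, the kind of work that
  # hurts a live transactional database but runs happily on the in-memory copy.
  print(con.execute("""
      SELECT c.region, p.category,
             strftime('%Y', o.order_date) AS year,
             SUM(l.quantity * l.unit_price) AS revenue
      FROM   orders o
      JOIN   customers c   ON c.customer_id = o.customer_id
      JOIN   order_lines l ON l.order_id    = o.order_id
      JOIN   products p    ON p.product_id  = l.product_id
      GROUP BY c.region, p.category, year
  """).fetchall())

Extract once, load into memory, and the join-heavy questions stop
competing with your live transactional workload.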

In-Memory Brochure
http://yellowfin.com.au/Document.i4?DocumentId=104877

In-Memory Whitepaper
http://yellowfin.com.au/Document.i4?DocumentId=104879
From: John G Harris on
On Wed, 16 Jun 2010 at 05:11:31, in comp.lang.javascript, JTP PR wrote:
>If you have been tracking the BI industry
<snip>

Apparently BI stands for 'Business Intelligence', but if you replace BI
by BS everywhere in this article then it makes just as much sense.

John
--
John Harris