Thursday, May 27, 2010

The Power of BI Implementations with an In-Memory Database

A few years ago, if you had told me that a business intelligence platform could be built for hundreds of active users, and that the querying, analysis and reporting environment could be opened up to such a large audience in a matter of weeks, I don't think I would have believed you. Technological advances have brought the industry to a point where, even after roughly 15 years in the business intelligence world, it still manages to surprise me a little more every day. Yes, the main reason I wanted to take an active role in a project rolled out to hundreds of users with QlikView in just 4 weeks was to see first-hand, with real field experience, just how far this technological change/leap in the BI world has come.


If we had taken on a project covering all the components (ETL, querying, reporting, OLAP analysis, management dashboards) with traditional business intelligence tools, we would most likely still be sitting in meetings. That with QlikView, in just 4 weeks, we could build an environment where several years of data can be analyzed in seconds says something astonishing about where BI technology stands today... When I also consider the reality that BI projects, turning into never-ending symphonies over the years, rarely reach the desired level of satisfaction, I am not sure whether, as a BI service provider, I would ever use a tool other than QlikView in future implementations, if only to avoid the exhausting processes that come with traditional tools.

From now on, my favorite business intelligence platform is: QlikView!

At least until I see a new technology capable of making the next leap...

Saturday, November 21, 2009

Revolution in BI - A Dream or a Reality

For many years, I have been discussing with business people how a BI front-end tool should be used to bring intelligence into their business. The problem stems mainly from the IT infrastructure and BI architecture of the organization, and from the usage behaviour of the analysts who want the BI platform to support their decision-making cycle.

Typically, IT has to be very careful about the compliance, integration and security aspects of a BI deployment to provide a healthy, well-performing query and analysis environment. To achieve this, IT always has to impose constraints on the end users, such as setting maximum limits on query time or on the number of rows in a result set, or preventing users from taking certain actions such as refreshing a report or emailing it to other people in the organization.
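As a concrete illustration, here is a minimal sketch of what such constraints could look like in a thin query layer. It is not taken from any real BI product; the limit values and function names are my own, and SQLite merely stands in for the query engine:

```python
import sqlite3
import time

# Hypothetical limits an IT team might enforce (values are illustrative).
MAX_ROWS = 10_000        # maximum rows returned to the analyst
MAX_QUERY_SECONDS = 30   # maximum wall-clock time for one query

def run_constrained_query(conn, sql, params=()):
    """Run a query, aborting it if it breaks the row or time limit."""
    start = time.monotonic()

    # SQLite invokes this handler periodically while a statement runs;
    # returning a truthy value aborts the statement.
    def too_slow():
        return time.monotonic() - start > MAX_QUERY_SECONDS

    conn.set_progress_handler(too_slow, 100_000)
    try:
        cursor = conn.execute(sql, params)
        rows = cursor.fetchmany(MAX_ROWS + 1)
        if len(rows) > MAX_ROWS:
            raise RuntimeError(f"result set exceeds the {MAX_ROWS} row limit")
        return rows
    finally:
        conn.set_progress_handler(None, 0)
```

From the analyst's point of view, of course, every one of these guard rails is exactly the kind of obstacle described next.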

However, typical analysts do not like having such constraints. In any BI project, the usual approach of the project team is to first understand what kind of information the business people would like to have in their reports. Then there is an analysis period to find out where the data for those reports comes from and what the calculations are. Then the next steps follow. On the other hand, the real benefit of a BI solution is that you extract and analyse information to reach conclusions you would never have known otherwise. What you need is to extract and work on information that you could not get without such a BI implementation. But if what you are after is hidden information you do not yet know about, how can you explain to the project team what kind of reports you would like to get at the end of the BI implementation?

This is one of the main problems in the BI world. But you should not think that I'm telling you there's no real benefit to BI solutions. Of course, when you have a BI platform you can access data that you could not reach otherwise, analyze it and share it with other people. What I'm trying to tell you is that, most of the time, business people do NOT know what they are looking for. What they usually tell IT is: "Give me all the data I have the right to see, and I will do all my analysis on it."

However, traditional BI tools mainly rely on the "report" or "document" concept when providing data to end users. What I have seen is that analysts sometimes try to create reports containing millions of rows. Since such tools do NOT have the DNA to handle that volume of data in a single report, IT constraints come into the picture. On top of those constraints, some IT people will also try to "train or teach" the analyst that it is NOT meaningful to pull that much data and view it on screen. Why would such a requirement even exist? How could an analyst possibly look at that much data on a screen? Surely what they need is a summarized set of data! Your source data can be very big, but the result set should not be.
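For what it's worth, the summarization IT has in mind is roughly the following (a toy sketch with invented column names): millions of detail rows collapse into a handful of aggregate rows.

```python
import pandas as pd

# Toy detail data: imagine millions of transaction rows like these.
detail = pd.DataFrame({
    "region": ["North", "South", "East", "West"] * 500_000,
    "amount": [10.0, 20.0, 30.0, 40.0] * 500_000,
})

# The "report-sized" result set IT expects: one row per region,
# no matter how many detail rows feed into it.
summary = detail.groupby("region", as_index=False)["amount"].sum()
print(summary)  # 4 rows
```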

But after many years of experience in BI and after listening to what business analysts are trying to achieve and how they work, I have understood that this is NOT always correct.

This situation usually creates a political fight in the organization between the business analysts, who would like to get all the data, and IT, who would like to impose constraints on its usage.

Do you know what happens in such scenarios? Usually the analysts will open the reports created by a BI tool and immediately save them as Excel, producing a bunch of Excel files. If such an analyst is experienced enough, he/she will transfer this Excel data into an Access database and accumulate historical data there. In short, they build a kind of "Personal/Desktop Datamart", along the lines of the sketch below.
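For illustration, such a "Personal/Desktop Datamart" amounts to something like the following. The file, table, and column names are invented, and SQLite stands in for Access:

```python
import sqlite3
import pandas as pd

def load_export(db_path: str, export_file: str) -> None:
    """Append one exported report to the personal datamart."""
    df = pd.read_excel(export_file)       # the report saved as Excel
    df["loaded_at"] = pd.Timestamp.now()  # tag each load, so history accumulates
    with sqlite3.connect(db_path) as conn:
        # Append, never overwrite: that is how the history builds up.
        df.to_sql("sales_report", conn, if_exists="append", index=False)

# Each month, the analyst saves the report as Excel and runs e.g.:
# load_export("personal_datamart.db", "sales_2009_11.xlsx")
```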

But why? Why have we in IT always insisted that pulling all the data for analysis is NOT the correct way, even though business analysts have always wanted to see all the data? To me, the answer is quite simple:

It was the core architecture: how traditional BI tools handle data at their core. Traditional BI tools, with their existing approach, simply would not let you work on such big data sets. What happens in many organizations is that, instead of spending time and effort on the intelligent side of the business, most people end up trying to figure out how to use the BI tools more effectively. Instead of understanding the organization through analysis on a BI platform, they are learning the functions and features of a BI tool, or trying to work out the tool's particular approach, so that they can do what they really want to do. If such an analyst is lucky enough to spend enough time and understand the dynamics of such a BI tool, he/she suddenly becomes the star of the organization, because he/she can create any report an executive might ask for.

Is this Business Intelligence?

It depends; but there certainly have to be other approaches to improve this cycle, and what I see now is that much better solutions exist, because some BI tools solve this kind of problem from scratch. One major category that I have seen solving it is BI tools built on in-memory databases with advanced calculation and visualization engines.

Since an in-memory database is able to store huge volumes of data through high compression techniques, end users can work with data volumes they would never be able to see in reports prepared with traditional BI tools. Such tools provide:

  1. More data in a BI workspace,

  2. Faster query results,

  3. Much richer visualization in the user interface,

  4. A much easier analysis environment.

...And, of course, they let end users spend their time on analysis rather than on learning the BI tool itself.
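To give a feel for why compression is the enabler here, below is a minimal sketch of dictionary encoding, one common columnar compression technique. This is my own simplification, not QlikView's actual storage format: each distinct value in a column is stored once in a symbol table, and the column itself becomes a list of small integer indexes.

```python
def dictionary_encode(column):
    """Compress a column by storing each distinct value only once.

    Returns (symbols, indexes); the original column can be rebuilt
    as [symbols[i] for i in indexes].
    """
    symbols = []    # distinct values, each stored once
    positions = {}  # value -> its index in `symbols`
    indexes = []    # one small integer per row
    for value in column:
        if value not in positions:
            positions[value] = len(symbols)
            symbols.append(value)
        indexes.append(positions[value])
    return symbols, indexes

# A low-cardinality column (e.g. city names) repeated over a million rows
# shrinks to a handful of symbols plus a stream of tiny integers.
column = ["Istanbul", "Ankara", "Izmir", "Istanbul"] * 250_000
symbols, indexes = dictionary_encode(column)
assert [symbols[i] for i in indexes] == column
print(len(symbols), "distinct values for", len(indexes), "rows")
```

In a real engine the index stream would additionally be bit-packed (two bits per row are enough for the three distinct values here), which is where most of the memory savings come from.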

Between 2000 and 2005, many BI vendors claimed that, with the "latest version" of their BI platform, a "revolution" was taking place in the BI landscape, and when I saw those features and functionalities at the time, I really thought so too. However, what I see now is that the real revolution may only just be arriving, with these next-generation in-memory BI tools...