One of the highlights of Microsoft’s Business Intelligence 2008 Conference in October was the announcement of “Project Gemini”. As Forrester reported, “With its just-announced Project Gemini, Microsoft aims to bring an Excel-based user analytics mashup tool into the core of Microsoft’s BI and data warehousing product portfolio. What is now only in the hands of OLAP data modelers and other highly trained staff will — as Community Technology Previews roll out to public beta testers late next year — become available to any company employee as an in-memory, drag-and-drop, pivot-table-enabled dashboard.”
According to Forrester, Microsoft’s ubiquitous spreadsheet, Excel, is already the most popular front-end program used by business analysts and others who want to analyze and display the results of their BI queries. This announcement of Project Gemini shows that Microsoft wants to accelerate the use of Excel as the ubiquitous front end for business intelligence dashboarding.
So what is Project Gemini? It’s an Excel add-in planned to ship with Kilimanjaro (a business-intelligence-focused release of SQL Server) that brings column-based, in-memory business intelligence to the spreadsheet.
If you’d like to hear how Microsoft explained it, let’s go back in time to the Microsoft Business Intelligence 2008 Conference keynote speeches.
To view Donald Farmer introducing Project Gemini, fast forward to about 1 hour and 26 minutes. Also worth watching is the “BI Fairy Tale” at 1 hour and 17 minutes.
But what is the real impact of Project Gemini for us Dashboard Spies? Well, thanks to the good folks at SiSense, makers of a business intelligence tool called Prism, we have a contributed article that makes sense of Project Gemini.
Thanks goes out to Roni Floman and Eldad Farkash for this article:
What does Microsoft’s Project Gemini announcement mean for the business intelligence world?
By Eldad Farkash
Microsoft announced its Project Gemini on October 6th, 2008. In essence, Project Gemini adds to Excel the ability to perform column-based, in-memory business intelligence without much of the terminology today’s BI consultants need to master.
Why does in-memory matter when it applies to business intelligence?
Traditional business intelligence solutions are OLAP-centric. OLAP was developed to rapidly answer multidimensional analytical queries (to paraphrase Nigel Pendse). Since disks aren’t that quick, OLAP systems pre-compute and pre-aggregate the data.
OLAP’s pre-calculation and pre-aggregation of business intelligence metrics enabled the early success of business intelligence and powered its growth. It is what made business intelligence so important for the large corporations that could afford to maintain it. But it also points to the weak spot: OLAP cubes create a lot of new data, require a data warehouse, and involve long projects. This is another way of saying a high total cost of ownership.
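To make the "OLAP creates a lot of new data" point concrete, here is a minimal sketch of cube-style pre-aggregation. The fact rows and dimension names are invented for illustration; the point is that pre-computing every rollup combination produces more stored entries than the raw data itself, which is what buys the fast lookups.

```python
from collections import defaultdict

# Raw fact rows: (region, product, sales). Illustrative data only.
facts = [
    ("East", "Widget", 100),
    ("East", "Gadget", 150),
    ("West", "Widget", 200),
    ("West", "Gadget", 250),
]

# Pre-aggregate every (region, product) combination, cube-style.
# "*" stands for "all values" along that dimension.
cube = defaultdict(int)
for region, product, sales in facts:
    cube[(region, product)] += sales   # finest grain
    cube[(region, "*")] += sales       # rollup by region
    cube[("*", product)] += sales      # rollup by product
    cube[("*", "*")] += sales          # grand total

# Queries are now instant dictionary lookups...
print(cube[("East", "*")])             # 250

# ...but the cube already holds more entries than the raw table,
# and the gap explodes as dimensions are added.
print(len(cube), ">", len(facts))      # 9 > 4
```

With only two dimensions of two values each the overhead is modest; with the dozens of dimensions found in real warehouses, this pre-computed data, plus the warehouse to hold it, is where the cost of ownership comes from.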
This is where in-memory business intelligence comes in. It takes advantage of column-based data structures (as opposed to row-based tables or pre-aggregated cubes) and uses already-available, super-fast RAM to aggregate and calculate millions of cells on your regular desktop (or on any cheap data server, for that matter), without the long-running table scans and indexing techniques required by traditional OLAP and OLTP systems. Coupled with modern, intuitive user interfaces, it lets users slice, dice, and filter data in a way that is easy to learn.
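The contrast with the pre-aggregated approach can be sketched in a few lines. In a column store, each attribute lives in its own array, so an aggregation touches only the columns it needs and computes results on the fly in RAM. The column names and values below are invented for illustration:

```python
# Column layout: one array per attribute, not one record per row.
regions = ["East", "East", "West", "West"]
sales   = [100, 150, 200, 250]

# Aggregate on demand, entirely in memory: no pre-built cube,
# no index, and only the two relevant columns are read.
totals = {}
for region, amount in zip(regions, sales):
    totals[region] = totals.get(region, 0) + amount

print(totals)  # {'East': 250, 'West': 450}
```

Because nothing is pre-computed, there is no extra data to store and no cube to rebuild when the question changes; the trade-off is that every query does its own (RAM-speed) scan of the columns it touches.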