Saturday, January 5, 2013

COGNOS Report with Dynamic Columns

This is my first blog post of 2013. I want to be a little more active with blog writing this year, and my goal is to write at least 20 posts in 2013. Primarily I am targeting articles related to COGNOS, Informatica, Microsoft Business Intelligence, SharePoint 2013, and data analytics.

The first one covers COGNOS, where I am going to showcase the use of macros and conditional statements in COGNOS Report Studio. The requirement is to develop a report that displays its columns dynamically based on the option selected. For example, suppose a report offers three options:

     Option A – Show Quantity Detail
     Option B – Show Revenue Detail
     Option C – Show Sales Detail

The report should show the following columns based on the selected option:

Option A – Year, Order Method (dynamic column) and Quantity (dynamic column)
Option B – Year, Product Line (dynamic column) and Revenue (dynamic column)
Option C – Year, Product Description (dynamic column) and Total Sales Amount (dynamic column)

The simplest but dirtiest way to achieve this would be to place three list objects within the same report and hide two of them based on the prompt selection.

This can be done dynamically without needing three separate list objects. We just need a macro and Query Calculation columns containing conditional statements to make the columns dynamic.

Here are the steps to make this work:

a) Drag a List object onto the report page.

b) Place the “Year” data item in the first column of the List object. I am using the sample “Sales (query)” package.

c) Now create a Query Calculation column as the next column in the report. This column will display the second column (Order Method, Product Line, or Product Description) based on the selected prompt. The macro expression should look like this:

#prompt('Report Type','token','[Sales (query)].[Order method].[Order method type]')#

Please note that we are creating a new parameter called “Report Type”, which we will reference from the Value Prompt.

d) Create another Query Calculation column for the measures (third column). Here we need a Case expression that picks the measure matching the selected prompt. The expression should look like this:

case
when (?Report Type? = '[Sales (query)].[Order method].[Order method type]') then ([Sales (query)].[Sales fact].[Quantity])
when (?Report Type? = '[Sales (query)].[Products].[Product line]') then ([Sales (query)].[Sales fact].[Planned revenue])
else ([Sales (query)].[Sales fact].[Product cost])
end

e) Now we need to create a prompt page. Go to the prompt page and drag a “Value Prompt” onto it. Change the “Select UI” property of the Value Prompt to “Radio Button Group”.

f) Assign the values Option A, Option B, and Option C under “Static Choices” in the Value Prompt, with the following Use values:

Option A - [Sales (query)].[Order method].[Order method type]
Option B - [Sales (query)].[Products].[Product line]
Option C - [Sales (query)].[Products].[Product Description]

That’s all; run the report and you should be good to go. If we select “Option B”, we should see the Year, Product Line, and Revenue columns in the report.
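To make the mapping explicit, here is a purely illustrative Python sketch. Cognos itself evaluates the macro and case expression at run time; the dictionaries below simply restate the pairings set up in steps c), d), and f).

```python
# Illustrative only: mirrors the prompt-token -> column mappings
# configured in the report. Not Cognos syntax.
PROMPT_TOKENS = {
    "Option A": "[Sales (query)].[Order method].[Order method type]",
    "Option B": "[Sales (query)].[Products].[Product line]",
    "Option C": "[Sales (query)].[Products].[Product Description]",
}

MEASURES = {
    "[Sales (query)].[Order method].[Order method type]":
        "[Sales (query)].[Sales fact].[Quantity]",
    "[Sales (query)].[Products].[Product line]":
        "[Sales (query)].[Sales fact].[Planned revenue]",
    "[Sales (query)].[Products].[Product Description]":
        "[Sales (query)].[Sales fact].[Product cost]",
}

def report_columns(option):
    """Return the three columns the report renders for a prompt option."""
    token = PROMPT_TOKENS[option]   # value passed by the Value Prompt
    return ["Year", token, MEASURES[token]]

print(report_columns("Option B"))
```

One prompt selection drives both Query Calculation columns, which is exactly why a single list object is enough.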


Friday, October 5, 2012

What is new in COGNOS 10.2?

COGNOS 10.2 was released on 10th Sep 2012. Over the last few weeks I have had the opportunity to explore it, and it has been a very interesting journey so far. It’s amazing to see how IBM has taken this tool forward and aligned it well with the future needs of Mobile BI and Self Service Business Intelligence. On top of that, COGNOS 10.2 now has connectivity for Big Data as well, which I feel is going to play a very important role in the near future.

In this blog post I am going to focus on the new enhancements in IBM COGNOS 10.2. Please note that I am not talking about COGNOS 10 features in general; this is specific to COGNOS 10.2, which was released last month in Sep-12.
Some of the features are listed below.

Dynamic Cubes

I think this is the most significant feature of the COGNOS 10.2 release. Importantly, there is no additional licensing involved for this feature; customers need not buy anything separately.

Dynamic cubes bring the concept of in-memory analysis, very similar to what other major BI vendors like SAP, Microsoft, SAS, and Oracle offer.

- The dynamic cube model is built using the Dynamic Query Mode (DQM) functionality. These cubes also use extensive caching capabilities to efficiently store data in memory.

- Dynamic cubes support relational Star or Snowflake schemas and provide an OLAP experience without sacrificing high performance. Some of the key highlights are mentioned below.

- 100% in-memory engine with high performance and low data latency.

- Aggregate recommendations allow routing a query to in-memory aggregates or to physical aggregate tables in the data warehouse.

- A new OLAP modeling tool called Cube Designer provides a better OLAP modeling experience and is integrated with COGNOS Framework Manager.

- Multiple cubes can share conformed dimensions.

- Dynamic cubes can connect to relational databases, including aggregated tables, to provide OLAP-style analytics.

- Once dynamic cubes are created and published as a package, they can be accessed like any other OLAP package.

Now the question is what is going to happen to the existing Transformer and PowerCubes. I believe they are still going to be there, and Dynamic Cubes will complement the existing environment. The decision to go with Dynamic Cubes or PowerCubes will depend purely on the business need in each scenario.

COGNOS Workspace (AKA Business Insight)

IBM COGNOS 10.2 has renamed Business Insight and Business Insight Advanced to Cognos Workspace and Cognos Workspace Advanced. I think the new naming convention is clearer, reduces confusion, and should be a welcome change. Some enhancements in the Cognos 10.2 Workspace are listed below.

- Multi-Tab Support

I believe this is the coolest enhancement in the Workspace.

Now we do not need to worry about shuffling between horizontal and vertical scrolling. We can easily expand our visual footprint across multiple tabs without the need to scroll.

- Crosstab columns and rows can now be frozen and unfrozen.

- Support for the Google Chrome and Safari browsers.


Cognos Mobile

Cognos Mobile was one of the major enhancements in the 10.1 release. The 10.2 release adds some further mobile enhancements:

- Cognos Mobile now supports push notifications. Users will be notified when a new version of a report is available.

- A page streaming feature has been added, which reduces the load time before viewing reports.

- Administrators can now secure mobile access, which helps organizations implement a BYOD (Bring Your Own Device) policy.

Report Studio

Report Studio has fewer, but higher-impact, enhancements.

- Prompt API

A fully supported API is available in the 10.2 release, which helps extend the prompting capabilities. This JavaScript-based Prompt API can be used for reading, deleting, and validating prompt values, which means we can now add validations to the various prompts before users run reports. The Prompt API is supported only in:

  - Report Studio for authoring

  - Cognos Viewer for consumption

  - Cognos Workspace/Advanced


- Excel Output Changes

Excel 2010 is still not an output format, but the Excel 2007 output capabilities have been expanded. Cognos reports can now export data with an increased limit of up to 16,384 columns and about 1 million rows.


Architecture Enhancements

- Support for new Data Sources

  - SQL Server and Analysis Services 2012
  - Salesforce.com
  - Siebel
  - Big Data support: Hadoop and Hive

As you can see above, the support for Big Data is very significant. I believe this is going to be one of the key features in the coming years as Big Data becomes a reality.

Also, apart from support for other ERP vendors like SAP and Oracle, notice that 10.2 can now host the content store on SQL Server 2012 and can also connect to SQL Server 2012 cubes.

- 64 Bit: All remaining 32-bit components have been updated to 64-bit, which means Cognos 10.2 is now fully 64-bit.


- Multi Tenancy: Cognos 10.2 now supports native multi-tenancy.

Saturday, September 1, 2012

Column Store Database Technology

Column store database technology is creating a buzz these days in the world of Business Intelligence, as the era of Self Service Business Intelligence is coming soon and performance is going to play a major role in deciding which Self Service BI tool to adopt. Column store technology is not new; it has been in the market for a long time. I believe Sybase IQ, Teradata, and now SQL Server 2012 are the major players in this field.

As the adoption of self service Business Intelligence tools grows day by day, which I am sure it will, it is highly important for these tools to store large amounts of data effectively and to work with that data efficiently at run time. Unlike OLAP, where pre-aggregated values are stored within the cells, self-service BI tools aggregate in real time based on the request.

SAP HANA, a self-service Business Intelligence platform, leverages Sybase column store technology for in-memory data storage. Similarly, Microsoft's self service BI tool PowerPivot uses the xVelocity in-memory analytic engine, and the same technology powers the columnstore index in SQL Server 2012.

What is a column store, and how does it drastically improve the speed of data warehouse queries?

In a traditional table the data is stored in a row-based fashion. A column store is something like rotating that row-based storage by 90 degrees: all the values of a single column are stored together in compressed form. The column store keeps each column in a separate set of disk pages, rather than storing multiple rows per page.
Let us take an example of the Employee Table to evaluate the different ways of storing the data.
The Employee Table data stored in the two formats:

Row Store (each page holds complete rows)

1   14417807    adventure-works\guy1       Production Technician - WC60   M   M
2   253022876   adventure-works\kevin0     Marketing Assistant            S   M
3   509647174   adventure-works\roberto0   Engineering Manager            M   M
4   112457891   adventure-works\rob0       Senior Tool Designer           S   M
5   480168528   adventure-works\thierry0   Tool Designer                  M   M

Column Store (each set of pages holds the values of a single column)

1  2  3  4  5
14417807  253022876  509647174  112457891  480168528
adventure-works\guy1  adventure-works\kevin0  adventure-works\roberto0  adventure-works\rob0  adventure-works\thierry0
Production Technician - WC60  Marketing Assistant  Engineering Manager  Senior Tool Designer  Tool Designer

As you can see, the column store groups and stores the data for each column together, while the row store groups the data for each row.

Now that we understand how data is stored in row-based and column-based layouts, let us try to understand how the column store significantly improves query speed.

The first and biggest reason is that the values within a single column share many more similar characteristics than the values stored together in a row. Because of this, data in a column store database can be compressed far more.

When the data is compressed, queries require less IO because the amount of data that needs to be transferred from disk is significantly reduced.

The compression also lets columnar operations like MIN, MAX, AVG, and COUNT perform quickly.

In the case of SQL Server 2012, the columnstore index uses the xVelocity compression algorithm, which compresses the data far better than the row- and page-based compression techniques.
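To see why the columnar layout compresses so well, here is a small, purely illustrative Python sketch. It serializes the same synthetic employee-style rows two ways and compresses both with zlib; no real engine stores pages this way, but the effect of grouping similar values is the same.

```python
import zlib

# Synthetic employee-like rows: an increasing id plus a few columns
# with heavily repeated values, loosely mirroring the table above.
rows = [
    (i, "adventure-works\\emp%d" % (i % 7),
     ("Technician", "Manager", "Designer")[i % 3], "MS"[i % 2])
    for i in range(1, 1001)
]

# Row store layout: values from different columns are interleaved.
row_layout = "\n".join(",".join(map(str, r)) for r in rows).encode()

# Column store layout: all values of one column are contiguous.
col_layout = "\n".join(",".join(map(str, col)) for col in zip(*rows)).encode()

row_compressed = zlib.compress(row_layout, 9)
col_compressed = zlib.compress(col_layout, 9)

print("row layout compressed:", len(row_compressed), "bytes")
print("column layout compressed:", len(col_compressed), "bytes")
```

On this synthetic data the column-ordered stream compresses noticeably smaller than the row-ordered one, because each column's values closely resemble their neighbours.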
Column store technology is used to speed up data warehousing queries, and we are likely to see many enhancements and improvements in this area in the near future.

Sunday, August 12, 2012

Get your own SAP HANA DB Server using Amazon Web Services

Anyone who is interested in exploring SAP HANA and does not have access to a HANA DB server can use Amazon Web Services to create their own instance in the cloud. Please keep in mind that this is not a free service; after exceeding the free trial limit you will be charged by Amazon based on usage and the volume of data.

The steps to create our own HANA instance using Amazon cloud services are very simple. We need to follow the steps below to get access to a SAP HANA instance running in the cloud.

1. Create an account on Amazon Web Services. We need an AWS account as a first step. Please note that while creating the account you will be asked to enter credit card information, which is mandatory to complete the registration.

2. Once we have the account set up, we need to generate a “Key Pair”. To create the key pair, go to the “Amazon Management Console” → EC2 → on the left-hand side we should see the option called “Key Pairs”; give a key pair name. You will be asked to save the key pair file, which we should keep in a safe place as it will be used later.

3. Once we have the key pair ready, we can go to the SAP site to register for HANA. You need to log in and accept the license agreement. If you do not have an account on the SAP SCN, you need to create one first in order to move further.

4. After accepting the license agreement, you will be asked to enter the Amazon account number, which you can find in Amazon Web Services.

5. Enter the Amazon account number, and on the next page you will be asked to enter the key pair name you already created in the Amazon Management Console. Enter the key pair name and click Continue. Regarding the HanaInstanceSize, use the default option “m2.xlarge” unless you need extra memory.



6. In the final step, you will be asked to enter the stack name, and the new stack will be created in a few minutes.

7. Now log in to Amazon Web Services again, go to the Management Console, click on EC2, and you should see the option called “Launch Instance”. After clicking “Launch Instance”, the SAP HANA instance will be initiated and ready to use. We will give the IP address of this instance to the HANA client in order to connect to the HANA DB hosted on Amazon Web Services.

8. To install the SAP HANA client, download the Windows (32- or 64-bit) or Linux version of the client from the SAP site.

9. As mentioned at the start, this is not a free service and you should be careful about usage. To reduce cost, start the instance only when you are going to use it. The rest of the time you can easily stop it, and it takes less than a minute to restart. To stop the instance, go to the Amazon Management Console, go to EC2, and on the home page, on the left-hand side under Instances, you should see the option called “Stop”.

For more details on Amazon Web Services and the SAP HANA DB server, you can visit the SAP Community Network.

Sunday, August 5, 2012

SQL Server Analysis Services Server Memory Issue

Sometimes we may notice that SSAS is consuming too much memory on the server, creating a deadlock-like situation where the only solution is to restart the SSAS service or the server itself.

This situation arises when bulky MDX queries are running and trying to fetch a large volume of data. Since SSAS resolves MDX queries in memory, an MDX query requesting a large amount of data can be very memory intensive. We can avoid these situations by changing some default memory settings on the server. Before we talk about the solution, it is important to understand some of the memory setting properties within SSAS. I have captured their details below, as described in MSDN.


Memory\LowMemoryLimit

Specifies the amount of memory allocated by Analysis Services at startup. When this limit is reached, the instance will start to slowly clear memory out of caches by closing expired sessions and unloading unused calculations. The server will not release memory below this limit. The default value is 65, which indicates the low memory limit is 65% of physical memory or the virtual address space, whichever is less.


Memory\TotalMemoryLimit

Defines a threshold that, when reached, causes the server to deallocate memory more aggressively. The default value is 80% of physical memory or the virtual address space, whichever is less.

Note that TotalMemoryLimit must always be less than HardMemoryLimit.


Memory\HardMemoryLimit

Specifies a memory threshold after which the instance aggressively terminates active user sessions to reduce memory usage. All terminated sessions will receive an error about being cancelled by memory pressure. The default value, zero (0), means the HardMemoryLimit will be set to a value midway between TotalMemoryLimit and the total physical memory of the system; if the physical memory of the system is larger than the virtual address space of the process, the virtual address space is used instead to calculate HardMemoryLimit.

To fix the memory freezing problem, we should change the default Memory\HardMemoryLimit setting. We should keep this value between Memory\TotalMemoryLimit and 100. Since TotalMemoryLimit is 80 by default, we can set HardMemoryLimit to 90. Then, any time memory usage exceeds 90%, SSAS will start cancelling queries, and users may see the error message “The operation has been cancelled due to memory pressure”.
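The default behaviour described above can be expressed as a tiny helper. This is an illustrative Python sketch, not part of SSAS; the midway rule for a 0 value comes from the MSDN description quoted earlier, and it assumes physical memory does not exceed the process virtual address space.

```python
# Illustrative helper, not part of SSAS: applies the documented default
# rule for Memory\HardMemoryLimit. A value of 0 means "midway between
# TotalMemoryLimit and 100% of physical memory".
def effective_hard_memory_limit(hard_limit, total_limit=80):
    """Return the effective HardMemoryLimit as a percentage."""
    if hard_limit == 0:
        return (total_limit + 100) / 2  # midway default
    return hard_limit

# With the defaults (HardMemoryLimit=0, TotalMemoryLimit=80) the
# effective threshold works out to 90%.
print(effective_hard_memory_limit(0))
```

This also shows why explicitly setting 90 matches the implicit default: midway between 80 and 100 is exactly 90, but setting it explicitly makes the behaviour predictable if TotalMemoryLimit ever changes.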

To set the Memory\HardMemoryLimit property to 90, follow the steps below:

1. Log in to the SSAS instance using SQL Server Management Studio.

2. Right-click on the server name and go to Properties.

3. Look for the “Memory\HardMemoryLimit” property and set its value to 90. By default it is 0.