Wednesday, 31 January 2018

Using Custom Visuals in Power BI

Microsoft Power BI offers the option of using various Custom Visuals available in Microsoft AppSource. This feature opens up endless options for data visualization.

To start with, let me build a simple Clustered Bar Chart showing Total Sales by Ship Mode and Product Category.


My objective is to rebuild the same report using a Custom Visual that represents the data in a better way.

First, I go to Microsoft AppSource and select the Infographic Designer, which is appropriate in this case. However, you can explore other Custom Visuals as well.

Once I click 'Get Now', it takes me to the next page, where I can download either the Custom Visual plugin or a sample report containing the plugin.

The plugin gets downloaded in .pbiviz format.


Now I import the custom visual into Power BI Desktop.


It shows a message that the import was successful.

And Infographic Designer is now available in the Visualizations pane.


Let me select it and use Total Sales, Ship Mode and Product Category. By default, it creates a normal bar chart.


I go to Format and change the Chart Type to Bar, which converts it into a horizontal bar chart.


Next, to edit the custom visual, I click the pencil-shaped 'Edit mark' icon.

This opens the Mark Designer window on the right. From here, I can select different types of custom shapes for the chart.

I select the file shape and enable Multiple Units, which adds multiple copies of the shape within a single bar.


However, I have 3 Product Categories, viz. Furniture, Office Supplies and Technology. It would be great if I could have a different shape for each category. To achieve that, I have to enable Data Binding.


In Data Binding, I select Product Category.


And assign a different shape to each of the categories.


The final chart looks like this:


Enabling Logging in Oracle BI Publisher

Unlike OBIEE, there is no straightforward way in Oracle BI Publisher to view the log and the SQL generated for a report. However, there is a workaround to achieve this, and in this post I am going to demonstrate it.

1. Open a text file and paste the code below:


LogLevel=STATEMENT
LogDir=D:\Tilak_Work\OBI11G\BIP_Debug

I have set the LogDir path according to my system; this needs to be customized based on the environment.

For reference, there are 7 log levels in BIP:
  • UNEXPECTED
  • ERROR
  • EXCEPTION
  • EVENT
  • PROCEDURE
  • STATEMENT
  • OFF


2. Save this file as xdodebug.cfg and place it at <OBIEE Home>\Oracle_BI1\jdk\jre\lib



3. Restart the services. Now, when I run some BIP reports and go to the log directory ('D:\Tilak_Work\OBI11G\BIP_Debug' in my case), I see that multiple log files have been generated.




4. Among these files, if I check the xdo.log file, I can find the backend SQL for the BIP reports.
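Since xdo.log can get quite large at STATEMENT level, a small script helps to pull out just the SQL. Below is a minimal sketch, assuming my log directory from above and that the statements show up as lines containing common SQL keywords; adjust both for your environment.

import os

LOG_FILE = r"D:\Tilak_Work\OBI11G\BIP_Debug\xdo.log"
SQL_KEYWORDS = ("SELECT", "INSERT", "UPDATE", "DELETE")

# Print any log line that looks like part of a SQL statement.
with open(LOG_FILE, encoding="utf-8", errors="ignore") as log:
    for line_no, line in enumerate(log, start=1):
        if any(kw in line.upper() for kw in SQL_KEYWORDS):
            print(f"{line_no}: {line.rstrip()}")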


Configuring Oracle Database Cloud Service and Accessing It Using SQL Developer

I have used Oracle DBCS (Database Cloud Service) as part of BICS (Business Intelligence Cloud Service) before, where it comes preconfigured. But to use DBCS as part of Oracle Analytics Cloud, we need to configure it first. Now that I have got my trial OAC account, I have 3 objectives:
  1. Create and configure an Oracle Database Cloud Service instance
  2. Access the DBCS using SQL Developer on my local machine
  3. Access the DBCS EM online

First, I have to create a DBCS instance. To do that, I go to the Oracle Cloud My Services Dashboard and click 'Create Instance'.


And click on the Database option in the popup.


Next, I have to select the preferences / configurations I want for my DBCS. I am taking Oracle DB 12c Release 1 - Enterprise Edition, which is also compatible with the Analytics service.


On the Details page, I set the DB name and the Administrator / SYS password. I have not chosen any Backup and Recovery option, as that would mean additional cost, and I don't need it for this demo.

There is an option for an SSH Public Key, which you can edit. I prefer to create a fresh public-private key pair using PuTTY Key Generator; this will also be useful for connecting to Oracle DBCS from PuTTY.


Once done, I confirm and create my DBCS instance.


It takes some time for Oracle to create and configure the DBCS instance; the status shows 'Creating service ..', and if I hover my mouse over it, it shows me the details.

Once the status shows completed, I have achieved my 1st objective. But what about the 2nd and 3rd? The problem is that if I now try to connect to the DBCS from my local SQL Developer using details such as the Public IP and Port, it does not connect, and I am also unable to open the DBCS EM in the browser.
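Before digging into access rules, a quick TCP check can confirm whether the listener port is reachable at all (rather than the credentials being wrong). A minimal sketch, assuming the default listener port 1521 and a placeholder public IP:

import socket

HOST = "203.0.113.10"  # placeholder: use your DBCS Public IP
PORT = 1521            # default Oracle listener port

try:
    with socket.create_connection((HOST, PORT), timeout=5):
        print(f"{HOST}:{PORT} is reachable")
except OSError as exc:
    # A timeout or refusal here points to a blocked access rule,
    # not to wrong credentials.
    print(f"Cannot reach {HOST}:{PORT} -> {exc}")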

I have actually spent quite some time on this bottleneck, and what I have learnt is that there are 2 major ways to connect to the DBCS using SQL Developer, viz.

  1. Using SSH when the Listener Port is Blocked
  2. Without using SSH when the Listener Port is Unblocked

For now, I am focusing only on #2. I will try to cover #1, along with connecting from PuTTY, in a later post.

Now I need to use the Oracle Database Cloud Service console to enable one of the automatically created access rules.

I click the navigation menu icon in the top corner of the My Services Dashboard and then click Database. The Oracle Database Cloud Service console opens.

From the hamburger menu for the database deployment, I select Access Rules, and the Access Rules page is displayed. It contains the following access rules, set to a disabled status:
  • ora_p2_dbconsole, which controls access to port 1158, the port used by Enterprise Manager 11g Database Control.
  • ora_p2_dbexpress, which controls access to port 5500, the port used by Enterprise Manager Database Express 12c.
  • ora_p2_dblistener, which controls access to the port used by SQL*Net.
  • ora_p2_http, which controls access to port 80, the port used for HTTP connections.
  • ora_p2_httpssl, which controls access to port 443, the port used for HTTPS connections, including Oracle REST Data Services, Oracle Application Express, and Oracle DBaaS Monitor.
I enable ora_p2_dbconsole and ora_p2_dblistener, which are required for EM and SQL Developer respectively.


And now I can use the details below to connect to the DBCS using SQL Developer on my local machine.
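The same connection can also be scripted. Here is a minimal sketch using the python-oracledb driver (the successor of cx_Oracle); the IP, service name and password are placeholders, so copy the real values from the DBCS instance details page:

import oracledb  # pip install oracledb

conn = oracledb.connect(
    user="system",
    password="YourAdminPassword",  # placeholder
    dsn="203.0.113.10:1521/PDB1.example.oraclecloud.internal",  # placeholder host/service
)

with conn.cursor() as cur:
    cur.execute("SELECT banner FROM v$version")
    for (banner,) in cur:
        print(banner)

conn.close()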


And similarly, I am able to open the EM using a web browser.


This confirms the closure of my 2nd and 3rd objectives.

Monday, 8 January 2018

Load data using Data Flow in Oracle DV

This time I explore the inbuilt ETL feature called Data Flow in Oracle DV (DVD, i.e. Data Visualization Desktop, in my case).

To start with, I go to Data Sources and click Create Data Flow.


It prompts for a Data Source. We can select an existing Data Source, or alternatively add a new source by clicking either the '+' icon at the top or the 'Add Data' option in the left-hand side menu.


Now let me explore the left-hand side menu. These are the basic ETL activities that we can perform.


I start by selecting the Sample Order Lines Excel file as a source.


My objective is to load the Customer data into a Customer dimension.


Therefore, I first need to select the Customer columns, and for that I use 'Select Columns'.


And I select 'Customer ID', 'Customer Name', 'Customer Segment'.


Next, I use 'Rename Columns' to match the target column names.


And now I define the target table.


I save and execute the Data Flow.


Now I open the table in the database and check the loaded data. It has loaded 9000 rows.


However, there is a problem: there are duplicate records. As I have stated earlier, my objective was to load data for a Customer dimension, so I need distinct rows only.
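To confirm the duplication, a quick check against the source data works as well. A minimal pandas sketch, assuming a local copy of the Sample Order Lines Excel file (placeholder path) with these exact column names:

import pandas as pd

orders = pd.read_excel("Sample Order Lines.xlsx")  # placeholder path
cust = orders[["Customer ID", "Customer Name", "Customer Segment"]]

print("Total rows:   ", len(cust))
print("Distinct rows:", len(cust.drop_duplicates()))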


To do that, I add one more column, 'CUST_COUNT', which is nothing but a copy of 'Customer ID'.


And I add the 'Count' aggregation on CUST_COUNT.


I do not make any further changes, and as I do not have a target column named CUST_COUNT, it will not get loaded into the target. But by adding an aggregation, I am effectively grouping by the other columns, and that in turn eliminates the duplicate rows.
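To make the group-by intuition concrete, here is the same trick expressed in pandas terms (a sketch with the same assumed file and column names as above, not what Data Flow runs internally):

import pandas as pd

orders = pd.read_excel("Sample Order Lines.xlsx")  # placeholder path
cust = orders[["Customer ID", "Customer Name", "Customer Segment"]].copy()
cust["CUST_COUNT"] = cust["Customer ID"]  # helper column, as in the Data Flow

# Counting CUST_COUNT forces a group-by on the remaining columns,
# which collapses the duplicates to one row per customer.
dedup = (
    cust.groupby(["Customer ID", "Customer Name", "Customer Segment"], as_index=False)
        ["CUST_COUNT"]
        .count()
)
print(len(dedup), "distinct customer rows")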


Now I truncate the target table and rerun the Data Flow, and this time the result is as desired.


There are multiple other options, and we can also use expressions, which I might cover in a later post.

Implementing & Testing Row Level Security in Power BI

I have suffered a great deal of pain while implementing and more so while validating Row Level Security in Power BI. Let me try to capture a...