Thursday, 30 November 2017

Oracle BICS / OAC - using Data Sync to upload data into DBCS

I have subscribed to a free trial of Oracle BICS (Oracle's cloud BI service) and am exploring ways to put my data into Oracle DBCS (Oracle's cloud database service).


Now there are multiple ways to do this, such as Data Sync, RDC and the RESTful API; however, for this blog I am going to keep my focus on Data Sync only. Before moving to Data Sync, let me check the connection details for businessintelldb (my Oracle DBCS instance).
I have downloaded and unzipped Oracle Data Sync into a local folder. No installation is required; however, there are certain prerequisites prescribed by Oracle, such as having a compatible Java JDK on the machine.


To start Data Sync, run datasync.bat (on Windows) or datasync.sh (on Linux/UNIX) from the directory where you unzipped Data Sync.

Once it starts, it will ask if you want to create a new project or use an existing one. In this case I am going to create a new one and name it 'BISample', as I am going to load the BISAMPLE schema from my local DB to the cloud.


In Data Sync there are three tabs:

1. Connections - create connections for the source and targets
2. Project - manage mappings, definitions, parameters, etc.
3. Jobs - create and schedule jobs

And our action items follow the same order. So first we define and test the target, which is DBCS.



And then the source, which is the BISAMPLE schema in my local database.
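
Before wiring these into Data Sync, it is worth sanity-checking that both databases are reachable from this machine. Below is a minimal sketch using the cx_Oracle Python driver (my own choice, not part of Data Sync); the hostnames, service names and credentials are placeholders for my environment:

import cx_Oracle

def check_connection(user, password, host, port, service_name):
    # Build a DSN and run a trivial round-trip query.
    dsn = cx_Oracle.makedsn(host, port, service_name=service_name)
    with cx_Oracle.connect(user, password, dsn) as conn:
        cursor = conn.cursor()
        cursor.execute("SELECT 1 FROM dual")
        cursor.fetchone()
    print(f"Connected to {host}:{port}/{service_name} as {user}")

# Local source: the BISAMPLE schema.
check_connection("bisample", "password", "localhost", 1521, "orcl")

# DBCS target: host and service name of the cloud DB instance.
check_connection("bics_user", "password",
                 "my-dbcs-host.oraclecloud.com", 1521,
                 "pdb1.example.oraclecloud.internal")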



Next I go to the Project tab. As our source is relational data, I click on that tab and choose Discover Objects.


I select the BISample source and import all tables with their definitions.
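
Under the hood, Discover Objects reads table metadata from the source. Roughly the same information can be pulled straight from the Oracle data dictionary; here is a sketch against the local BISAMPLE schema (connection details are placeholders, as before):

import cx_Oracle

# Connect to the local source and list every table and column the
# BISAMPLE schema owns, much like Discover Objects does.
dsn = cx_Oracle.makedsn("localhost", 1521, service_name="orcl")
with cx_Oracle.connect("bisample", "password", dsn) as conn:
    cursor = conn.cursor()
    cursor.execute("""
        SELECT table_name, column_name, data_type
        FROM user_tab_columns
        ORDER BY table_name, column_id
    """)
    for table_name, column_name, data_type in cursor:
        print(table_name, column_name, data_type)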


Once the table definitions are in place, it is time to create the mapping. In my case, as I want to replicate everything in the cloud, it is pretty straightforward.


Next it is time to define the load strategy. Data Sync offers options such as replacing the data in the table, appending to it, or updating it incrementally based on a user key and a date column.


Next I move to the Jobs tab and create a job with the source and target. If required, we can schedule multiple jobs, send email notifications, etc.

In this case, however, I just create, save and run the job.


Now I go to Current Jobs to find the job in a running state. We can also check past jobs in the History tab.


Simultaneously, I log into DBCS to find the recently created tables, which further confirms that the job is running and creating the tables in the cloud.
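
While the job runs, the target schema can also be polled for the tables Data Sync has created so far. A sketch of such a check (again with placeholder DBCS connection details):

import cx_Oracle

# Poll the DBCS target for tables ordered by creation time; newly
# created tables show up at the top while the job is loading.
dsn = cx_Oracle.makedsn("my-dbcs-host.oraclecloud.com", 1521,
                        service_name="pdb1.example.oraclecloud.internal")
with cx_Oracle.connect("bics_user", "password", dsn) as conn:
    cursor = conn.cursor()
    cursor.execute("""
        SELECT object_name, created
        FROM user_objects
        WHERE object_type = 'TABLE'
        ORDER BY created DESC
    """)
    for name, created in cursor:
        print(name, created)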


Once the job finishes with 100% success, it is time to validate the data in the cloud.


I go to the DBCS Object Browser to find the tables and data.


This matches my local DB.
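
For a more systematic check than eyeballing the Object Browser, row counts per table can be compared between the two schemas. A sketch, with the same placeholder connection details as before:

import cx_Oracle

def row_counts(user, password, host, port, service_name):
    # Return a dict of table name -> row count for the given schema.
    dsn = cx_Oracle.makedsn(host, port, service_name=service_name)
    counts = {}
    with cx_Oracle.connect(user, password, dsn) as conn:
        cursor = conn.cursor()
        cursor.execute("SELECT table_name FROM user_tables")
        for (table,) in cursor.fetchall():
            cursor.execute('SELECT COUNT(*) FROM "{}"'.format(table))
            counts[table] = cursor.fetchone()[0]
    return counts

local = row_counts("bisample", "password", "localhost", 1521, "orcl")
cloud = row_counts("bics_user", "password",
                   "my-dbcs-host.oraclecloud.com", 1521,
                   "pdb1.example.oraclecloud.internal")

# Report OK when the source and target counts agree.
for table in sorted(local):
    status = "OK" if cloud.get(table) == local[table] else "MISMATCH"
    print(table, local[table], cloud.get(table), status)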


Pretty simple. Obviously, there is huge scope to explore more complex scenarios.

