C. Nagarjuna
Email: nagarjunac08@gmail.com
Mobile no: +91-9676331468

Career Objective
To further my career with a growth-oriented firm that will allow me to utilize my experience and knowledge of ODI, Cognos, Hadoop, Pig, Hive and Informatica implementation, support and maintenance.

Professional Summary:
• 4+ years of experience in ETL methodology, including data extraction.
• Knowledge of configuring load balancing through agents and of scheduling through the ODI scheduler and Hadoop.
• Extensive knowledge of Informatica 7.x.
• Knowledge of data warehousing concepts.
• Excellent interpersonal and communication skills; experienced in working with senior-level managers, business people and developers across multiple disciplines.
• ODI implementation, support and maintenance.
• Strong team player, core contributor and quick self-learner.
• Good knowledge of Cognos Report Studio.
• Good understanding of ELT design and development with Oracle Data Integrator 11g.
• Able to set up connection parameters in the Topology Manager.
• Sound knowledge of developing ODI interfaces, packages, models and datastores across different technologies.
• Good understanding of ODI components such as Designer, Topology, Operator and Security Navigator.
• Able to load metadata and data from different source systems into target systems (Oracle, File, XML, SQL Server, DB2).
• Able to export a flat-file source to different target systems.

Professional Experience:
• Working at Infostairs Technologies since 2010.

Educational Qualification:
• B.Tech in Computer Science from JNTU.

Technical Skills:
• OS: Windows, Unix/Linux (Red Hat Linux, Ubuntu)
• ETL Tools: ODI 11.1.x, 10.1.3.x, Informatica 7.1.x
• Reporting Tools: Cognos, Siebel Analytics
• Databases: Oracle, SQL Server, MySQL
• Languages: C, Java
• Web Technologies: HTML, XML, DHTML
• Database Tools: PL/SQL
Experience Details:

Project #1
• TITLE: Sales Data Mart for BI
• CLIENT: Argos
• ROLE: ODI Consultant

Description:
The company sells consumer products through stores, catalogs and an e-commerce site; product categories include electronics, furniture, bed and bath, games and arcade, and kitchen and bar. The company's portfolio includes office furniture, furniture systems, interior architectural products, technology products and related products and services. Extracting data from Oracle and writing it back to Oracle takes a long time to load, so Hadoop was used to transform the data on HDFS in less time. This data mart was developed to facilitate better decision making and to optimize inventory levels. ODI was used to design mappings and to load into the target.
1. Create the logical design for the data mart star schema.
2. Map the logical design to a physical design.
3. Generate code to create the objects for the data mart.
4. Create a process flow for populating the data mart.
5. Execute the process flow to populate the data mart.
In addition to a powerful graphical interface, OWB provides a metadata repository that holds detailed design information about your databases. The repository is implemented as a set of tables in an Oracle Database, and the data you enter in the repository is available to any user who has at least read access to the repository application system. Using the OWB Design Center, you import data source definitions, define target structures, validate the structures, generate and deploy the code that implements the structures, and execute the process flows that run the ETL mappings loading data into the target structures. Once deployed, Warehouse Builder assists in the daily maintenance and monitoring of the deployed system.

Responsibilities:
• Created several reverse-engineering models as per the requirements.
• Set up connection parameters in the Topology Manager of ODI.
• Created physical schemas and logical schemas in different contexts.
• Mapped logical resources to physical resources in different contexts.
• Developed several interfaces as per the requirements.
• Developed interfaces to union multiple tables of the same structure using datasets.
• Created and scheduled ODI scenarios as per the requirements.
• Created ELT workflows in packages using ODI tools.
• Monitored interfaces and execution logs as per the requirements.
• Developed packages as per the requirements.
• Developed a package to automate the work repository and master repository using ODI tools.
• Exported scenarios from the DEV environment and imported them into Production.
• Wrote PL/SQL functions as per the requirement.
• Developed interfaces for incremental loading of dimensions (see the sketch after this list).
• Imported KMs as per the requirement.
• Debugged sessions by examining the session logs.
• Performed unit testing at various levels of the ELT flows.
• Developed business reports using Cognos.

Environment: Windows XP, ODI 11g, Oracle 11g, EditPlus, SQL*Loader, Hadoop, Pig, Cognos.
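The incremental dimension loading mentioned above follows a standard insert/update (upsert) pattern. As a minimal sketch only, the MERGE statement below shows the kind of Oracle SQL an ODI interface using an incremental-update knowledge module would produce; the table and column names (stg_product, dim_product, load_date and so on) are hypothetical and not taken from the project.

-- Minimal, hypothetical sketch of an incremental dimension load (upsert).
-- stg_product, dim_product and their columns are illustrative names only.
MERGE INTO dim_product tgt
USING (
        SELECT product_id,
               product_name,
               category,
               unit_price
          FROM stg_product                    -- staging table filled from the source system
         WHERE load_date = TRUNC(SYSDATE)     -- restrict to the current day's delta
      ) src
   ON (tgt.product_id = src.product_id)
WHEN MATCHED THEN
  UPDATE SET tgt.product_name = src.product_name,
             tgt.category     = src.category,
             tgt.unit_price   = src.unit_price,
             tgt.last_updated = SYSDATE
WHEN NOT MATCHED THEN
  INSERT (product_id, product_name, category, unit_price, last_updated)
  VALUES (src.product_id, src.product_name, src.category, src.unit_price, SYSDATE);

In ODI this kind of statement is generated from the interface and knowledge-module definitions rather than written by hand, and it is then run and scheduled through scenarios.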
Project #2
• TITLE: Health Care Development and Dept.
• CLIENT: Medibank Health Care (USA)
• ROLE: Software Engineer

Description:
Medibank is an integrated healthcare company providing private health insurance and health solutions to 3.9 million people in Australia and New Zealand. The main module covered region-wise retrieval of insurance claims data and its loading through Informatica. There are various databases of historical data, and data warehousing plays a major role in enabling the various stores to view data at the lowest level, helping them make decisions that bring sales improvement to the company through new policies.

Responsibilities:
• Worked with multiple sources such as relational databases and flat files, extracting data using Source Qualifier and Joiner.
• Developed various mappings to load data from various sources using transformations such as Joiner, Router, Aggregator, Lookup, Update Strategy, Source Qualifier and Sequence Generator to populate the target tables.
• Involved in developing the mapping document indicating the source tables, columns, data types, required transformations, business rules, target tables, columns and data types.
• Created sessions and ETL workflows for carrying out data loads.
• Performed error validation of the data moving from the source system to the staging area.
• Tested the mappings and the quality of the deliverables.
• Used Informatica Workflow Manager and Monitor to create, schedule and monitor workflows.

Environment: Informatica PowerCenter 7.1.x, Oracle 9i, Windows 2000 Professional.

Personal Details
• Present Location: Hyderabad
• Willing to Relocate: Yes
• Languages Known: English, Telugu, Hindi
• Passport: Yes
• Notice Period: 30 Days

(Nagarjuna.C)
