Nagesh Hadoop Profile
Published on: Mar 3, 2016
NAGESH M    E-Mail : email@example.com
Ph. No : +91 7416453763
CAREER OBJECTIVE :
To excel as a software professional and hold a position in the corporate world through diligence
and dedication, and to ensure my highest contribution towards the organization I work with.
CAREER SUMMARY :
Experienced Hadoop Developer with a strong background in distributed file systems in the big-data
arena. Understands the complex processing needs of big data and has experience developing code
and modules to address those needs. Familiar with Hadoop information architecture and ELT.
• 2+ years of experience in the IT industry that includes Hadoop Developer (with knowledge of
• Experience in developing Pig Latin scripts and using Hive Query Language for data
• Good working experience using Sqoop to import data into HDFS from RDBMS and vice versa.
• Providing hardware architectural guidance, planning and estimating cluster capacity, and
creating roadmaps for Hadoop cluster deployment.
• Maintaining and monitoring clusters. Loaded data into the cluster from dynamically
generated files using Flume and from relational database management systems using Sqoop.
• Supporting Hadoop developers and assisting in optimization of MapReduce jobs, Pig Latin
scripts, Hive scripts, and HBase ingest as required.
• Experience in NoSQL column-oriented databases like HBase and its integration with Hive.
• Used Sqoop extensively to ingest data from various source systems into HDFS.
• Assisted in loading large sets of data (structured, semi-structured, unstructured).
• Managed Hadoop clusters, including adding and removing cluster nodes for maintenance and
• Experience in Information Technology including PL/SQL, System Analysis, Development,
AREAS OF EXPERTISE :
Big Data Ecosystems : Map Reduce, HDFS, HBase, Hive, Pig, Sqoop, Oozie and Flume.
Programming Languages : Pig, Hive, SQL and Java.
Databases : Oracle, MySQL and NoSQL.
Tools : Eclipse, NetBeans.
Platforms : Windows (2000/XP), Ubuntu and CentOS.
Title : Data processing using Map Reduce in NoSQL databases
Environment : Hadoop, Apache Pig, Hive, Oozie, Sqoop, UNIX, HBase and MySQL.
Role : Hadoop Developer
Project Description :
The purpose of the project is to perform analysis on the effectiveness and validity of
controls, to store terabytes of log information generated by the source providers as part of the
analysis, and to extract meaningful information out of it. The solution is based on the open-source Big Data
software Hadoop. The data will be stored in the Hadoop file system and processed using Map Reduce jobs,
which in turn include getting the raw data, processing the data to obtain controls and redesign/change
history information, extracting various reports out of the controls history, and exporting the information.
Worked on setting up Pig, Hive and HBase on multiple nodes and developed using Pig, Hive and HBase.
Developed MapReduce application using Hadoop, MapReduce programming and HBase.
Developed the Sqoop scripts in order to make the interaction between Pig and MySQL Database.
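The controls analysis described in this project is the kind of group-and-aggregate work typically expressed as Pig Latin or Hive scripts. Purely as an illustrative sketch (plain Python, with a hypothetical record layout, not the project's actual scripts), grouping log records per control and counting outcomes would look like:

```python
from collections import defaultdict

# Hypothetical log records: (control_id, status) pairs standing in
# for the raw source-provider log information described above.
records = [
    ("C1", "PASS"), ("C1", "FAIL"), ("C2", "PASS"),
    ("C1", "PASS"), ("C2", "PASS"),
]

# Python equivalent of a Pig GROUP ... FOREACH ... COUNT pipeline:
# group by control id, then count each status within the group.
summary = defaultdict(lambda: {"PASS": 0, "FAIL": 0})
for control, status in records:
    summary[control][status] += 1

report = {control: counts for control, counts in summary.items()}
```

In the real pipeline the grouped output would be exported (e.g. via Sqoop) rather than kept in memory.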
Title : Sentiment Analysis on customer evolution in banking domain
Environment : Hadoop, Apache Pig, Sqoop, Java, LINUX, and MySQL.
Role : Hadoop Developer.
Project Description :
The purpose of the project is to store information generated by the bank's historical data,
extract meaningful information out of it and, based on that information, predict the customer's category.
The solution is based on the open-source Big Data software Hadoop. The data will be stored in the Hadoop file system.
Developed the Sqoop scripts in order to make the interaction between Pig and MySQL Database.
Built an analytics model with R, used for group expressions and box plots, and used to read files
with the read.table() and scan() functions; R with HDFS and R with HBase.
Did EDA on the data set and carefully removed the irrelevant data items.
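The EDA step above was done in R with read.table()/scan(); as an illustration of the same filtering idea only (Python here, with entirely hypothetical sample data, not the project's data set), dropping records with missing fields might look like:

```python
import csv
import io

# Hypothetical customer records; rows missing a field are treated as
# irrelevant and removed, mirroring the EDA clean-up described above.
raw = "age,category\n34,A\n29,\n41,B\n,A\n"
rows = list(csv.DictReader(io.StringIO(raw)))
clean = [r for r in rows if r["age"] and r["category"]]
```

In R the analogous step would be reading the file with read.table() and subsetting with complete.cases().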
ACADEMIC QUALIFICATION :
Graduation in Computer Science Engineering from JNTU Hyderabad with 75%.
Intermediate in MPC from SREE NIDHI Junior College with 87%.
High school passed from VAGDEVI HIGH SCHOOL with 86%.
I hereby state that all the above furnished information is true and correct to the best of my knowledge.
PLACE : Chennai ( NAGESH )