Primary technical skills in HDFS, YARN, Pig, Hive, Sqoop, HBase, Flume, Oozie, and ZooKeeper.
Skilled in Hadoop architecture and its components, including HDFS, JobTracker, TaskTracker, NameNode, and DataNode.
Hands-on experience with ecosystem tools such as Hive, Pig, Sqoop, Flume, and Oozie, including Hive and Pig analytical functions and extending Hive and Pig core functionality with custom UDFs.
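The core of a custom Hive UDF like those mentioned above can be sketched as follows. This is a minimal illustration, not code from any actual project: the class name and the email-masking logic are hypothetical, and only the `evaluate` method is shown; in a real deployment the class would extend `org.apache.hadoop.hive.ql.exec.UDF` (from the `hive-exec` dependency), be packaged into a jar, and be registered in Hive with `ADD JAR` and `CREATE TEMPORARY FUNCTION`.

```java
// Sketch of the core logic of a custom Hive UDF
// (hypothetical example: masking the local part of an email address).
// Hive invokes a method named evaluate() by reflection, so the real UDF
// class would expose exactly this signature.
public final class MaskEmail {

    // "alice@example.com" -> "a****@example.com"
    public static String evaluate(String email) {
        if (email == null) {
            return null;                 // Hive UDFs must tolerate NULL inputs
        }
        int at = email.indexOf('@');
        if (at <= 1) {
            return email;                // nothing meaningful to mask
        }
        return email.charAt(0) + "****" + email.substring(at);
    }

    public static void main(String[] args) {
        System.out.println(evaluate("alice@example.com")); // a****@example.com
    }
}
```

Once packaged, the function would be used from HiveQL like any built-in, e.g. `SELECT mask_email(customer_email) FROM customers;` (function name assumed here).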
Hands-on experience developing Teradata stored procedures and functions.
Experience working with databases such as Teradata; proficient in writing complex SQL and PL/SQL to create tables, views, indexes, stored procedures, and functions.
Imported and exported terabytes of data between HDFS and RDBMSs using Sqoop.
Experience with job workflow scheduling (Oozie) and cluster monitoring (Ganglia) tools.
Experience with NoSQL databases such as HBase and Cassandra.
Hadoop cluster installation, node configuration, administration, and commissioning/decommissioning of nodes.
Experience with Hadoop ecosystem components such as Flume, Oozie, Hive, and Pig.
Experience with XML technologies, including the Informatica XML Parser and XML Writer.
Performed performance tuning at the source, target, and DataStage job levels using indexes, hints, and partitioning.
Performance-tuned ETL/ELT jobs in Hive and the wider Hadoop ecosystem on live production systems.
Good knowledge of PL/SQL, with hands-on experience writing moderately complex SQL queries.
Knowledge of Impala, Spark/Scala, Shark, Storm, and Ganglia.
Prepared test cases, and documented and performed unit and integration testing.
In-depth understanding of data structures, algorithms, and optimization.
Self-motivated, excellent team player with a positive attitude, able to work to strict deadlines.