
Skills

  • Hadoop
  • AIX
  • Apache
  • Apache Hive
  • Asterisk
  • Bash
  • Big Data
  • C
  • DHCP
  • DNS
  • Firewalls
  • FTP
  • LAMP Stack
  • Linux
  • MATLAB


Services

  • Server Administration

    $10/hr Starting at $150 Ongoing

    Dedicated Resource

    10 years of experience managing large infrastructure, up to a 2000-node cluster. Exposure to Linux (Red Hat, SUSE, Ubuntu), Solaris, AIX, and HP-UX. Worked with IT leaders like IBM and TATA. Nominated as Technical...

    AIX, Apache, Asterisk, Bash, C
  • System Administration

    $10/hr Starting at $150 Ongoing

    Dedicated Resource

    10 years of experience managing large infrastructure, up to a 2000-node cluster. Exposure to Linux (Red Hat, SUSE, Ubuntu), Solaris, AIX, and HP-UX. Worked with IT leaders like IBM and TATA. Nominated as Technical...

    AIX, Apache, Asterisk, Bash, C
  • Hadoop and Big Data Platform Setup

    $25/hr Starting at $500 Ongoing

    Dedicated Resource

    Install and set up Hortonworks, Cloudera, and HDInsight Hadoop. Design security platforms in Hadoop using Knox, Ranger, and AD. High availability of NameNode, ResourceManager, HBase Master, Hive Server 2...

    Amazon AWS, Apache HBase, Apache Hive, Azure, Big Data
  • Development in Spark/Scala

    $35/hr Starting at $500 Ongoing

    Dedicated Resource

    I have been working on Hadoop and Big Data for the last 7 years. Experienced programmer and architect in Hadoop and Spark. I have 2 years of experience working with Scala, Spark SQL, ML, and the R language.

    Apache Hive, Data Science, Hadoop, Machine Learning, ML
  • Google Cloud Platform

    $35/hr Starting at $500 Ongoing

    Dedicated Resource

    Data engineering on GCP using BigQuery, Dataproc, and Dataflow. Data platform and enterprise data hub on GCS.

    Big Data, Data Analysis, Google Cloud, Hadoop

About

I do Hadoop better on the cloud

For a detailed resume, mail me.

Overview
I have sound experience in the design and development of Big Data applications and solutions on the AWS/Azure/GCP public clouds as well as in-house data centers, using open-source Hadoop, MapReduce, Pig, Hive, HBase, R, Linux systems, Core Java, Spark, Scala, and Python. As an architect, my expertise lies in building loosely coupled components that scale out horizontally to address exponential data growth in both volume and velocity. I have developed a metadata-driven transformation framework that can successfully replace a traditional ETL tool in the EDW world. I have been working on Big Data analytics (market modeling, SADM, NLP classifiers for spam filtering, sentiment analysis) and cloud computing for the last 7 years. For the last two years I have been working on Hortonworks/Cloudera Hadoop, GCP BigQuery, and open-source analytics tools like R and Spark/Scala.
Education
MBA (Finance), Master of Computer Application

Experience
Total 19 years of experience working with organizations like
1. IBM Software Lab
2. Hortonworks Data Platform
3. Wipro Technologies

Certification
1. HDPCA (Hortonworks)
2. ITIL v2 Foundation
3. MCTS (Microsoft Windows HPC Server Development)

Projects Related to Big Data (7 years’ experience)
1. Big Data platform engineering for large production systems (1000-node clusters).
2. Multi-tenancy, role-based access control (Ranger), and security (Knox, Kerberos) on a Data Lake (Hortonworks).
3. Spark/Scala programming on IoT operational data for a digital data hub on AWS with Cloudera Hadoop.
4. Retail data analytics on Hadoop (HDP) using Spark, Zeppelin, and R.
5. Enterprise Data Warehouse (EDW) in the Big Data ecosystem using Hive.
6. ETL in Spark/Scala and Spark SQL using AWS EMR.
7. Integration of Tableau, SAS, and R with Hadoop for visualization.
8. Migration of a traditional EDW to the AWS cloud using EMR and Redshift.
9. Hive query optimization on Hive 2/Tez for a 40% performance gain.

Work Terms

Open
