Skills

  • Hadoop
  • AIX
  • Apache
  • Apache Hive
  • Asterisk
  • Bash
  • Big Data
  • C Programming Language
  • Dhcp
  • DNS
  • Firewalls
  • FTP
  • LAMP Stack
  • Linux
  • MATLAB

Services

  • Server Administration

    $10/hr Starting at $150 Ongoing

    Dedicated Resource

    10 years of experience managing large infrastructure, up to a 2,000-node cluster. Exposure to Linux (Red Hat, SUSE, Ubuntu), Solaris, AIX, and HP-UX. Worked with IT leaders like IBM and TATA. Nominated as Technical...

    AIX, Apache, Asterisk, Bash, C Programming Language
  • System Administration

    $10/hr Starting at $150 Ongoing

    Dedicated Resource

    10 years of experience managing large infrastructure, up to a 2,000-node cluster. Exposure to Linux (Red Hat, SUSE, Ubuntu), Solaris, AIX, and HP-UX. Worked with IT leaders like IBM and TATA. Nominated as Technical...

    AIX, Apache, Asterisk, Bash, C Programming Language
  • Hadoop and Big Data Platform Setup

    $25/hr Starting at $500 Ongoing

    Dedicated Resource

    Install and set up Hortonworks, Cloudera, and HDInsight Hadoop. Designing a security platform in Hadoop using Knox, Ranger, and AD. High availability of NameNode, ResourceManager, HBase Master, Hive Server 2...

    Amazon Web Services, Apache HBase, Apache Hive, Azure, Big Data
  • Development in Spark/Scala

    $35/hr Starting at $500 Ongoing

    Dedicated Resource

    I have been working on Hadoop and Big Data for the last 7 years. Experienced programmer and architect in Hadoop and Spark. I have 2 years of experience working with Scala, Spark SQL, ML, and the R language.

    Apache Hive, Data Science, Hadoop, Machine Learning, ML
  • Google Cloud Platform

    $35/hr Starting at $500 Ongoing

    Dedicated Resource

    Data engineering on GCP using BigQuery, Dataproc, and Dataflow. Data platform and enterprise data hub on GCS.

    Big Data, Data Analysis, Google Cloud, Hadoop

About

Looking for Data Engineering, AI/ML and DevOps work

For a detailed resume, mail me.

Overview
I have sound experience in the design and development of Big Data applications and solutions on the AWS/Azure/GCP public clouds as well as in in-house data centers, using open-source Hadoop, MapReduce, Pig, Hive, HBase, R, Linux systems, Core Java, Spark, Scala, and Python. As an architect, my expertise lies in building loosely coupled components that scale out horizontally to address exponential data growth in both volume and velocity. I have developed a unique metadata-driven transformation framework that can successfully replace traditional ETL tools in the EDW world. I have been working on Big Data analytics (market modeling, SADM, NLP classifiers for spam filtering, sentiment analysis) and cloud computing for the last 7 years. For the last two years I have been working on Hortonworks/Cloudera Hadoop, GCP BigQuery, and open-source analytics tools like R and Spark/Scala.
Education
MBA (Finance), Master of Computer Application

Experience
Total 22 years of experience working with organizations like
1. IBM Software Lab
2. Hortonworks Data Platform
3. Wipro Technologies

Certification
1. HDPCA (Hortonworks)
2. ITIL v2 Foundation
3. Python for Data Science (IBM)
4. Google Cloud Architect
5. Google Cloud Data Engineer
6. Advanced Data Analytics with R (NPTEL)

Projects Related to Big Data (13 years' experience)
1. Big Data platform engineering for large production systems with 1,000-node clusters.
2. Multi-tenancy, role-based access control (Ranger), and security (Knox, Kerberos) on a Data Lake (Hortonworks).
3. Spark/Scala programming on IoT operational data for a digital data hub on AWS with Cloudera Hadoop.
4. Retail data analytics on Hadoop (HDP) using Spark, Zeppelin, and R.
5. Enterprise Data Warehouse (EDW) in the Big Data ecosystem using Hive.
6. ETL on Spark/Scala and Spark SQL using AWS EMR.
7. Integration of Tableau, SAS, and R with Hadoop for visualization.
8. Migration of a traditional EDW to the AWS cloud using EMR and Redshift.
9. Hive query optimization on Hive 2/Tez for a 40% performance gain.

Projects Related to AI/ML (10 years' experience)
1. HP Billiton, Australia (Live4D, Truck360): NiFi, PySpark, H2O machine learning, HBase, D3; Data Lake using HDP.
2. Product price analysis to detect fraud in a product catalog (Random Forest model).
3. Enhanced Service-Matters with the Google Gemini LLM.
4. Smart contract management framework (Gen AI).
5. Google IVN solution using BigQuery, Vertex AI, CRM/CDP, Denodo (data virtualization), and Looker.

Work Terms

Open
