Skills

  • Amazon Web Services
  • AWS
  • GCP
  • Google Cloud Platform
  • Python
  • Spark
  • API Development
  • Application Migration
  • Athena
  • Cloud Migration
  • Data Architect
  • Data Engineer
  • Data Lake
  • Data Migration
  • Data Pipeline

Services

  • Data Engineer (Python/Spark) - (AWS/GCP)

    $30/hr Starting at $100 Ongoing

    Dedicated Resource

    You will get support for writing highly scalable ETL and data processing jobs, data migration (homogeneous and heterogeneous), real-time data streaming, and analytics dashboards.

    Amazon Web Services, Application Migration, Athena, AWS, Cloud Migration
  • Machine Learning | Deep Learning Expert

    $30/hr Starting at $100 Ongoing

    Dedicated Resource

    You will get support in developing and implementing machine learning models, from data ingestion and engineering through model deployment into production.

    Amazon Web Services, API Development, AWS, Data Science, Deep Learning

About

Data Engineer (Python/Spark) - ETL Pipeline & Machine Learning (AWS/GCP)

Do you want to get insights from huge volumes of data? Migrate a database because of cost or performance issues? Apply data engineering techniques to raw data? Or apply machine learning to big data?

====I have solutions to all of your data-related problems====
My name is Chandu Parmar and I am a technology enthusiast. My core areas of expertise are writing highly scalable ETL jobs, creating ETL pipelines, training production-ready large-scale machine learning models, data migration (homogeneous and heterogeneous), and working with cloud technologies (Amazon Web Services - AWS and Google Cloud Platform - GCP).

Below is my detailed skill set. Click the "Invite to Job" or "Hire Freelancer" button so that we can quickly get started on solving your problem.


===Data Engineering===
• Creating a Data Pipeline on Cloud Platforms like Amazon Web Services (AWS) and Google Cloud Platform (GCP)
• Writing Extract-Transform-Load (ETL) jobs for data processing using technologies like AWS Glue, PySpark, GCP DataProc, AWS EMR, and AWS Lambda
• Building real-time data pipelines for streaming data using Apache Kafka, AWS Kinesis, and GCP Dataflow
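
As an illustration of the extract-transform-load flow named in the bullets above, here is a minimal, hedged sketch. A real job would be a PySpark or AWS Glue script; plain Python with the standard library stands in here so the three stages are easy to follow, and all table and column names are hypothetical.

```python
import csv
import io
import sqlite3

# Hypothetical raw input, as it might arrive from an upstream source.
RAW_CSV = """user_id,amount
1,10.50
2,invalid
3,7.25
"""

def extract(raw):
    """Extract: parse raw CSV text into dict rows."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows):
    """Transform: drop malformed rows and cast fields to typed values."""
    clean = []
    for row in rows:
        try:
            clean.append((int(row["user_id"]), float(row["amount"])))
        except ValueError:
            continue  # skip rows that fail validation
    return clean

def load(rows, conn):
    """Load: write the cleaned rows into a target table."""
    conn.execute("CREATE TABLE IF NOT EXISTS payments (user_id INTEGER, amount REAL)")
    conn.executemany("INSERT INTO payments VALUES (?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_CSV)), conn)
total = conn.execute("SELECT SUM(amount) FROM payments").fetchone()[0]
print(total)  # 17.75 — the invalid row was filtered out during transform
```

In a Spark job the same shape applies: `extract` becomes a DataFrame read, `transform` becomes column expressions and filters, and `load` becomes a write to S3, Redshift, or BigQuery.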

========= Data Engineering Skills =========
Expertise:
Amazon Web Services:
RDS, EC2, Glue, Lambda, Data Migration Service (DMS), S3, Sagemaker, Batch, ECS, ECR, Athena, Redshift, QuickSight, Kinesis.

Google Cloud Platform:
DataFlow, Pub/Sub, BigQuery, DataProc, Google Storage, Google Data Studio, Cloud Functions, Google AutoML, Google DataLab.

Tools & Libraries:
PySpark, Spark, Scala, Python, Hadoop, Hive, SparkML, AirFlow.

Database:
Postgres, MySQL, Oracle, DynamoDB, MongoDB, AWS Aurora, MSSQL

========= Machine Learning Skills =========
Expertise:
Machine Learning, Deep Learning, Natural Language Processing, Data Science, Predictive Modeling, Unsupervised Learning, Text Extraction, Data Mining, Sentiment Analysis

Tools & Libraries:
Keras, Jupyter Notebook, Python, TensorFlow, Pandas, Numpy, SQL, JS.
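
As a minimal illustration of the train-and-predict workflow behind these skills, here is a hedged sketch of a predictive model — linear regression fit by least squares — using NumPy from the tools listed above. A production model would typically use Keras or TensorFlow with the full ingestion-to-deployment pipeline; the data here is synthetic and purely illustrative.

```python
import numpy as np

# Synthetic training data with a known linear relationship: y = 3x + 2.
rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(100, 1))  # one feature
y = 3.0 * X[:, 0] + 2.0

# Fit: append a bias column and solve the least-squares problem.
X_b = np.hstack([X, np.ones((X.shape[0], 1))])
coef, *_ = np.linalg.lstsq(X_b, y, rcond=None)

# Predict for a new input x = 4.0.
pred = coef[0] * 4.0 + coef[1]
print(round(pred, 2))  # 14.0, recovering y = 3*4 + 2
```

The same fit/predict split generalizes directly to deep learning frameworks, where `lstsq` is replaced by gradient-based training and the prediction step by a forward pass.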