Big Data & Data Warehousing Architect

$15/hr Starting at $25

1. Expertise in HQL, SQL, VQL, ETL (SSIS 2008/2012, Informatica PowerCenter 10.0, Pentaho, Talend for Big Data), data virtualization (Denodo 5/6/7), the Dremio data-as-a-service platform, and data visualization with Tableau 10.5.

2. Developed an approach that not only retrieves real-time data from APIs but also exposes Denodo views as REST APIs for consumption by desktop applications. Also built base views, cached derived views, flattened views, projections, selections, unions, minus operations, interfaces, associations, and web services in Denodo.

3. Used Snowflake to copy data from cloud storage such as Amazon S3 into Snowflake tables (see the Snowflake sketch after this list). Gathered best practices for using Snowflake with BI and middleware tools, including cloning and combining data from multiple databases.

4. Analyzed requirements and functional specifications, and built mappings, sessions, and workflows in Informatica PowerCenter using data provided by banking clients.

5. Developed a customer service information system for various products supplied to customers using SQL Server 2008 and SSIS packages.

6. Built Tableau dashboards on top of a data lake, tuned workbooks, published them to the server, and applied best practices.

7. Helped customers build Dremio reflections, perform maintenance, and optimize data preparation queries.

8. Developed complex jobs and transformations in Pentaho Data Integration 6.1/7.0/7.1 to extract data from sources such as JSON files and REST APIs (e.g., Kibana, Elasticsearch) and load it into RDBMSs such as Oracle and SQL Server.

9. Created Sqoop scripts to import data from various databases into the Hadoop environment. Developed performance-tuned Parquet and ORC Hive tables with Snappy compression (see the Hive sketch after this list). Created Pig scripts to transform data to user needs.

10. Developed jobs in Talend for Big Data to extract data from delimited sources and load it into HDFS and into dynamically partitioned and bucketed Hive tables.
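A minimal Snowflake SQL sketch of the S3-to-Snowflake loading pattern in item 3. The bucket, stage, and table names (example-bucket, raw_events_stage, analytics.public.events) are purely illustrative, not taken from the profile:

```sql
-- Hypothetical names throughout; assumes an existing Snowflake database/schema
-- and AWS credentials with read access to the bucket.
CREATE OR REPLACE STAGE raw_events_stage
  URL = 's3://example-bucket/events/'
  CREDENTIALS = (AWS_KEY_ID = '<key>' AWS_SECRET_KEY = '<secret>')
  FILE_FORMAT = (TYPE = CSV FIELD_DELIMITER = ',' SKIP_HEADER = 1);

-- Bulk-load all CSV files from the stage into the target table,
-- skipping rows that fail to parse instead of aborting the load.
COPY INTO analytics.public.events
  FROM @raw_events_stage
  PATTERN = '.*\.csv'
  ON_ERROR = 'CONTINUE';
```

And a minimal HiveQL sketch of the partitioned, bucketed, Snappy-compressed Parquet tables described in items 9 and 10. The table and column names (sales_parquet, sales_staging, order_date, customer_id) are hypothetical:

```sql
-- Allow Hive to create partitions on the fly during the insert.
SET hive.exec.dynamic.partition = true;
SET hive.exec.dynamic.partition.mode = nonstrict;

CREATE TABLE IF NOT EXISTS sales_parquet (
  order_id    BIGINT,
  customer_id BIGINT,
  amount      DECIMAL(12,2)
)
PARTITIONED BY (order_date STRING)
CLUSTERED BY (customer_id) INTO 16 BUCKETS
STORED AS PARQUET
TBLPROPERTIES ('parquet.compression' = 'SNAPPY');

-- Load from a staging table; the partition column goes last in the SELECT.
INSERT OVERWRITE TABLE sales_parquet PARTITION (order_date)
SELECT order_id, customer_id, amount, order_date
FROM sales_staging;
```

Snappy trades a slightly larger file size for fast decompression, which is why it is a common default for frequently queried analytic tables.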

About

$15/hr Ongoing

Skills & Expertise

API Development, Big Data, Concept Development, Customer Service, Data Management, Database Development, Denodo, Dremio, ETL, Import/Export Operations, Informatica, Layout Design, Pentaho, Performance Engineering, RDBMS, Server Administration, Snowflake, SQL, SSIS, Storage Engineering, Systems Engineering, Tableau, Talend Open Studio, Virtualization

0 Reviews

This Freelancer has not received any feedback.