- Designing, developing, implementing, and maintaining data flow processes using Apache NiFi and Kafka. 
- Collaborating with stakeholders to understand data and system requirements, analyzing data sources and schemas, and designing data processing pipelines, workflows, and templates.
- Building and configuring NiFi and Kafka processors, connectors, and supporting components such as filters, splitters, and routers to enable data collection, processing, and transformation.
- Developing and implementing real-time data ingestion pipelines, automating data transfers and workflows, and designing strategies for handling diverse data formats and APIs (see the producer sketch after this list).
- Leveraging NiFi and Kafka's real-time analysis and monitoring capabilities to observe data sources, processing, and performance, optimizing workflows and safeguarding data quality.
- Integrating NiFi and Kafka with complementary technologies such as Hadoop, Spark, and other commonly used big data tools to create a comprehensive data pipeline system.
- Developing custom NiFi and Kafka processors, templates, and workflow elements to meet specific business needs and data requirements (a custom-processor skeleton appears after this list).
- Troubleshooting and debugging issues related to NiFi and Kafka data flows, processors, and workflows to ensure high uptime and reliability. 
- Coordinating across cross-functional teams, including data scientists, data analysts, and other stakeholders, to ensure smooth delivery of data-related solutions.
- Staying up-to-date with advancements in related data technologies and identifying new ways to apply NiFi and Kafka to the organization's evolving data requirements and data sources.
- Developing API and UI components to interface with and monitor Apache NiFi and Kafka workflows and data flows, providing other teams and stakeholders with self-service access to those workflows and data (a REST status-check sketch appears after this list).
- Creating scripts and code to automate the deployment and scaling of NiFi and Kafka in production environments.
- Ensuring compliance with data governance, data protection, and security rules and regulations, including auditing data flows, tracking lineage, and verifying that encryption and access controls are in place.
- Collaborating with other developers and technical staff to design and implement production-level data processing workflows that handle high data volumes and scale to meet business needs.
- Testing and deploying production-level solutions developed with NiFi and Kafka. 
- Mentoring and coaching other team members in NiFi and Kafka development and usage. 
- Documenting the code and development processes for future reference.
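For illustration, here is a minimal sketch of the kind of real-time ingestion code this role involves, using the standard Kafka Java producer client. The broker address, topic name (`ingest.events`), key, and JSON payload are placeholders, not part of any actual flow:

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class EventProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Placeholder broker address; real deployments list several brokers.
        props.put("bootstrap.servers", "localhost:9092");
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());
        // acks=all trades a little latency for durability on ingestion paths.
        props.put("acks", "all");

        // try-with-resources closes the producer and flushes pending records.
        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            ProducerRecord<String, String> record =
                    new ProducerRecord<>("ingest.events", "sensor-42", "{\"temp\": 21.5}");
            // send() is asynchronous; the callback surfaces broker-side errors.
            producer.send(record, (metadata, exception) -> {
                if (exception != null) {
                    exception.printStackTrace();
                } else {
                    System.out.printf("Delivered to %s-%d@%d%n",
                            metadata.topic(), metadata.partition(), metadata.offset());
                }
            });
        }
    }
}
```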
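Similarly, a custom NiFi processor typically starts from a skeleton like the one below, built on NiFi's `AbstractProcessor` API. The class name, attribute key, and behavior (stamping a timestamp attribute on each FlowFile) are hypothetical examples; a real processor would also declare property descriptors and be packaged as a NAR:

```java
import java.util.Set;
import org.apache.nifi.annotation.documentation.CapabilityDescription;
import org.apache.nifi.annotation.documentation.Tags;
import org.apache.nifi.flowfile.FlowFile;
import org.apache.nifi.processor.AbstractProcessor;
import org.apache.nifi.processor.ProcessContext;
import org.apache.nifi.processor.ProcessSession;
import org.apache.nifi.processor.Relationship;
import org.apache.nifi.processor.exception.ProcessException;

@Tags({"example", "attribute"})
@CapabilityDescription("Stamps each FlowFile with a processing timestamp attribute.")
public class StampTimestampProcessor extends AbstractProcessor {

    public static final Relationship REL_SUCCESS = new Relationship.Builder()
            .name("success")
            .description("FlowFiles stamped successfully")
            .build();

    @Override
    public Set<Relationship> getRelationships() {
        return Set.of(REL_SUCCESS);
    }

    @Override
    public void onTrigger(ProcessContext context, ProcessSession session) throws ProcessException {
        FlowFile flowFile = session.get();
        if (flowFile == null) {
            return; // nothing queued on the input connection
        }
        // putAttribute returns the updated FlowFile reference.
        flowFile = session.putAttribute(flowFile, "processed.at",
                String.valueOf(System.currentTimeMillis()));
        session.transfer(flowFile, REL_SUCCESS);
    }
}
```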
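Finally, a monitoring component might poll NiFi's REST API for controller-level status. The sketch below, using Java's built-in `HttpClient`, assumes an unsecured NiFi instance at a placeholder URL; secured clusters would additionally need TLS and authentication headers:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class NiFiStatusCheck {
    public static void main(String[] args) throws Exception {
        // Placeholder base URL for a local, unsecured NiFi instance.
        String url = "http://localhost:8080/nifi-api/flow/status";

        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder(URI.create(url)).GET().build();

        // The response body is JSON with controller-level counts (queued
        // FlowFiles, active threads, bulletins); parse it with your preferred
        // JSON library before feeding dashboards or alerts.
        HttpResponse<String> response =
                client.send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println("HTTP " + response.statusCode());
        System.out.println(response.body());
    }
}
```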