Cloud Engineer

$6,000 - $12,000 / month

Role Summary

This position works closely with Technical Architects to evaluate, design, and implement cloud solutions; performs studies and proofs of concept (POCs); and delivers guidelines and sample code to development and production teams, ensuring systems are secure and reliable.

Main Responsibilities

  • Administer, configure and manage Kafka clusters.
  • Ensure high availability and reliability of Kafka services.
  • Guide iOPS engineers in creating and maintaining Kafka topics and configurations, including partitions, replication factors, and Access Control Lists (see the topic-creation sketch after this list).
  • Configure & utilize monitoring tools to track cluster performance, throughput, and latency.
  • Optimize cluster performance through tuning of configurations and resource allocation.
  • Troubleshoot and resolve issues related to Kafka clusters, including connectivity issues, message delivery failures, and performance bottlenecks.
  • Collaborate with development teams to address application-level issues related to Kafka.
  • Maintain accurate documentation of cluster configurations, operational procedures, and incident resolutions.
  • Generate reports on cluster performance and usage for stakeholders.
  • Proactively identify opportunities for process improvement, optimization, and innovation.
  • Design, document, and execute proofs of concept, and carry out exploratory studies on cloud services, technologies, and architectures.
  • Draft and update procedures and instructions regarding operations.
  • Lead the analysis of the current technology environment to detect critical deficiencies and recommend solutions for improvement.
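
For illustration only: the topic-creation work referenced above might look like the following minimal sketch, using the confluent-kafka Python AdminClient. The broker address, topic name, partition count, replication factor, and retention value are placeholder assumptions, not details from this posting.

```python
# Minimal, illustrative sketch: create a Kafka topic with explicit partition,
# replication, and retention settings via the confluent-kafka AdminClient.
# "broker-1:9092" and "orders.events" are placeholder assumptions.
from confluent_kafka.admin import AdminClient, NewTopic

admin = AdminClient({"bootstrap.servers": "broker-1:9092"})

topic = NewTopic(
    "orders.events",                       # illustrative topic name
    num_partitions=6,                      # partition count for parallelism
    replication_factor=3,                  # copies per partition for availability
    config={"retention.ms": "604800000"},  # per-topic retention: 7 days
)

# create_topics() is asynchronous; it returns a dict of topic name -> future.
for name, future in admin.create_topics([topic]).items():
    try:
        future.result()                    # block until the broker confirms
        print(f"Created topic {name}")
    except Exception as exc:
        print(f"Failed to create {name}: {exc}")
```

The ACL setup mentioned in the same bullet follows the same pattern; recent confluent-kafka releases expose it through the AdminClient's create_acls call.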

Qualifications & Experience

  • Bachelor’s Degree in Information Technology or a related field.
  • More than 4 years of overall IT experience.
  • At least 2-3 years of demonstrated proficiency in administering, implementing, monitoring, and troubleshooting Kafka messaging infrastructure.
  • In-depth knowledge of Apache Kafka architecture, including brokers, topics, producers, and consumers.
  • Experience with Kafka's core concepts such as partitioning, replication, and retention policies.
  • Familiarity with IBM Event Streams or other managed Kafka services.
  • Experience with monitoring and observability tools (e.g., Grafana, Prometheus) to track Kafka cluster performance.
  • Understanding of security best practices related to data streaming, including encryption, access management, and compliance frameworks.
  • Experience with data ingestion and ETL processes, along with familiarity with tools that integrate with Kafka (e.g., Kafka Connect, data pipelines).
  • Proficiency in at least one programming or scripting language (e.g., Java, Python) for automation and integration tasks.
  • Strong critical thinking, analytical, and problem-solving skills.
  • Strong communication skills to work effectively with cross-functional teams, including developers, data engineers, and DevOps personnel.
  • Experience with and strong knowledge of the Kafka Producer and Consumer APIs, ZooKeeper, KSQL, Kafka Connect, and Kafka Streams processing (see the producer/consumer sketch after this list).
  • Working experience with Avro schemas.
  • Experience with Terraform & Ansible.
  • Experience with modern application development and DevOps practices, including CI/CD, containerization (Docker, Kubernetes).
  • Strong understanding of IT system infrastructure solutions.
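
As a rough illustration of the producer/consumer experience listed above, the round trip below uses the confluent-kafka Python client. The broker address, topic, key/value payload, and consumer group id are illustrative assumptions only.

```python
# Minimal, illustrative produce-then-consume round trip with confluent-kafka.
# Broker, topic, payload, and group id are placeholder assumptions.
from confluent_kafka import Consumer, Producer

producer = Producer({"bootstrap.servers": "broker-1:9092"})
producer.produce("orders.events", key="order-42", value=b'{"status": "created"}')
producer.flush()                            # wait for broker acknowledgement

consumer = Consumer({
    "bootstrap.servers": "broker-1:9092",
    "group.id": "reporting-service",        # consumer group for offset tracking
    "auto.offset.reset": "earliest",        # read from the start if no offset
})
consumer.subscribe(["orders.events"])

msg = consumer.poll(timeout=10.0)           # returns a Message or None on timeout
if msg is not None and msg.error() is None:
    print(msg.key(), msg.value())
consumer.close()
```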

Good To Have:

  • Knowledge of or experience with Red Hat AMQ Streams.
  • Strong knowledge of API management and integration using the Apigee API Management platform.
  • Private Banking or Asset Management knowledge/experience.
  • Cloud architect certifications.
  • Kubernetes Certification.
  • Working knowledge of the Scrum framework.
