
Kafka Architect

Location: United States
Date Posted: Oct 26, 2021

Job Description

Duties:

Manage the hardware and software for Kafka and its ecosystem components.
Work with application teams to gather future requirements, plan infrastructure growth, and expand as needed.
Implement Disaster Recovery (DR) for Kafka.
Build automation capabilities using tools like Terraform (AWS), Ansible, Git runners, Jenkins, etc.
Manage Kafka security (Kerberos, ACLs, SSL, SASL, SCRAM, etc.); a client security configuration sketch follows this list.
Research and implement new capabilities for Enterprise Messaging Services.
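
To make the security expectations concrete, here is a minimal sketch of what a SCRAM-over-TLS client configuration might look like using the confluent-kafka Python client; the broker address, credentials, certificate path, and topic name are placeholders, not details from this role.

```python
# Hypothetical SASL_SSL + SCRAM client configuration (all values are placeholders).
from confluent_kafka import Producer

conf = {
    "bootstrap.servers": "broker-1.example.com:9093",  # placeholder broker endpoint
    "security.protocol": "SASL_SSL",                    # TLS transport + SASL authentication
    "sasl.mechanism": "SCRAM-SHA-512",                  # SCRAM credentials managed on the cluster
    "sasl.username": "app-producer",                    # placeholder principal
    "sasl.password": "change-me",                       # placeholder secret (use a vault in practice)
    "ssl.ca.location": "/etc/kafka/certs/ca.pem",       # CA bundle used to verify broker certificates
}

producer = Producer(conf)
producer.produce("orders", key="order-42", value=b'{"status":"created"}')
producer.flush()
```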

Requirements

Kafka Admin experience in managing critical 24/7 applications and optimizing Kafka for variable workloads
Lead messaging modeling, persistent data modeling, normalization, transformation, migration, cleansing, and all other aspects relating to data
Understand implications of data upstream and downstream
Make key decisions on technology and tools to implement from a data platform technology standpoint
Thorough understanding of Kafka producer, consumer, and topic technologies, and the ability to drive their implementation
Design, build, assemble, and configure application or technical architecture components using business requirements.
Hands-on experience with Kafka clusters hosted in AWS and on an on-premises Kubernetes (OpenShift) platform / Spark
Experience building Kafka pipelines using Terraform, Ansible, CloudFormation templates, shell scripts, etc.
Experience implementing security and permission-based authorization on Kafka clusters.
System administration experience setting up the Kafka platform, including provisioning, access control lists (ACLs), Kerberos, and SSL configurations.
Experience setting standards to automate deployments using Spark, Kubernetes, Docker, Chef, or Jenkins
Experience with open-source and Confluent Kafka, ZooKeeper, Kafka Connect, Schema Registry, KSQL, REST Proxy, and Confluent Control Center.
Experience with Kafka MirrorMaker or Confluent Replicator
Experience with high-availability cluster setup, maintenance, and ongoing 24/7 support
Establish best-practice standards for configuring source and sink connectors (a connector registration sketch appears at the end of this posting).
Hands-on experience standing up and administering the Kafka platform from scratch, including backup and mirroring of Kafka cluster brokers, broker sizing, topic sizing, hardware sizing, performance monitoring, broker security, topic security, and consumer/producer access management (ACLs); a topic and ACL provisioning sketch appears at the end of this posting.
Knowledge of Kafka API (development experience is a plus)
Knowledge of best practices related to security, performance, and disaster recovery.
Ability to work through a wide range of loosely defined, complex situations that require creativity and originality, where guidance and counsel may be unavailable.
Demonstrate a product mindset with the ability to set forward-thinking direction.
Ability to synthesize large amounts of complex data into meaningful conclusions and present recommendations.
Ability to maintain a positive attitude while working under high demands and short deadlines, which may require working after hours.
Must have excellent communication and interpersonal skills.

Not Required, but Preferred:

Experience with Oracle Streams and/or another real-time streaming solution is a plus.
Experience setting up Prometheus, Grafana, or ELK monitoring tools is a plus.
Experience as a Linux (RHEL)/Unix administrator is a huge plus.
Experience with PostgreSQL, SQL Server, NoSQL (HBase), Oracle, and AWS Cloud is a plus.
Understanding of or experience supporting .NET/Java applications.
Understanding of or experience with programming languages such as Python is a plus.
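
As a concrete illustration of the topic-sizing and access-management duties above, the following is a minimal sketch, assuming the confluent-kafka AdminClient and a cluster with an ACL authorizer enabled; the broker address, topic name, partition count, retention, and principal are illustrative only.

```python
# Hypothetical topic provisioning and ACL setup via the Kafka Admin API.
from confluent_kafka.admin import (
    AdminClient, NewTopic, AclBinding, AclOperation,
    AclPermissionType, ResourceType, ResourcePatternType,
)

admin = AdminClient({"bootstrap.servers": "broker-1.example.com:9093"})

# Topic sizing: partitions, replication factor, and retention chosen per workload.
topic = NewTopic("orders", num_partitions=12, replication_factor=3,
                 config={"retention.ms": "604800000"})  # 7-day retention
for name, future in admin.create_topics([topic]).items():
    future.result()  # raises if creation failed
    print(f"created topic {name}")

# Consumer/producer access management: allow one principal to read the topic.
read_acl = AclBinding(
    ResourceType.TOPIC, "orders", ResourcePatternType.LITERAL,
    "User:analytics-consumer", "*",
    AclOperation.READ, AclPermissionType.ALLOW,
)
for binding, future in admin.create_acls([read_acl]).items():
    future.result()
    print(f"applied ACL {binding}")
```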
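
As a concrete illustration of source and sink connector standards, the sketch below registers a FileStreamSinkConnector (which ships with Apache Kafka) through the Kafka Connect REST API; the Connect worker URL, connector name, topic, and output file path are assumed placeholders.

```python
# Hypothetical sink connector registration against the Kafka Connect REST API.
import json
import urllib.request

connect_url = "http://connect.example.com:8083/connectors"  # placeholder Connect worker
connector = {
    "name": "orders-file-sink",
    "config": {
        "connector.class": "org.apache.kafka.connect.file.FileStreamSinkConnector",
        "tasks.max": "1",
        "topics": "orders",
        "file": "/tmp/orders.out",
    },
}

request = urllib.request.Request(
    connect_url,
    data=json.dumps(connector).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
with urllib.request.urlopen(request) as response:
    print(response.status, response.read().decode())
```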