Jobvertise

Kafka Architect
Location: US-IL-Deerfield
Apply Online


  • 3+ years of hands-on Kafka/Confluent data-streaming development and operational platform experience

  • Experience working with Kafka connectors and the producer and consumer APIs

  • Strong understanding of Kafka architecture, including offset management, partition strategy, and disaster recovery (DR) requirements

  • Good understanding of streaming message delivery semantics (at-most-once, at-least-once, exactly-once)

  • Good understanding of the Spark framework

  • Strong understanding of streaming message formats such as Avro

  • Strong understanding of data architecture, data integration, and data quality management; able to design a secure, performance-centric architecture based on client requirements

  • Ability to articulate design standards and patterns

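The partition-strategy point above comes down to how keyed records are routed: Kafka's default partitioner hashes the record key and takes it modulo the partition count, so every record with the same key lands on the same partition, which preserves per-key ordering. A minimal sketch of the idea in plain Python (md5 is an illustrative stand-in; Kafka itself uses murmur2):

```python
import hashlib

def choose_partition(key: bytes, num_partitions: int) -> int:
    """Route a keyed record the way Kafka's default partitioner does:
    hash the key, then take the hash modulo the partition count.
    (Kafka uses murmur2; md5 here is a stand-in for illustration.)"""
    digest = hashlib.md5(key).digest()
    return int.from_bytes(digest[:4], "big") % num_partitions

# Records with the same key always map to the same partition,
# which is what guarantees per-key ordering across consumers.
assert choose_partition(b"order-42", 6) == choose_partition(b"order-42", 6)
```

Note that changing the partition count changes the key-to-partition mapping, which is why partition strategy is usually decided up front.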
  • Application integration experience: event-driven processing with Kafka, Kafka with Flink, and sink connectors

  • Provide expertise in Kafka brokers, ZooKeeper, Kafka Connect, Schema Registry, KSQL, REST Proxy, and Confluent Control Center

  • Ensure optimal performance, high availability, and stability of solutions

  • Create topics, set up cluster redundancy, deploy monitoring tools and alerts, and apply best practices

  • Administer and operate the Kafka platform, including provisioning, access control lists (ACLs), and Kerberos and SSL configurations

  • Strong written and oral communication and presentation skills

  • Strong analytical and problem-solving skills

Roles & Responsibilities:

  • Build Kafka/Confluent data-streaming applications and related operational tooling

  • Develop Kafka connectors and applications using the producer and consumer APIs

  • Design Kafka architecture, including offset management, partition strategy, and disaster recovery (DR) requirements

  • Build Spark Streaming applications using Scala or Python

  • Communicate and explain the design and architecture to the client

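On the delivery-semantics point: the practical difference between at-most-once and at-least-once is simply whether the consumer commits its offset before or after processing a record. A broker-free sketch in plain Python (the `consume` helper and its crash model are hypothetical, purely for illustration):

```python
def consume(messages, commit_before, fail_once_at=None):
    """Simulate a consumer that crashes once while processing message
    index `fail_once_at`, then restarts from its last committed offset.

    commit_before=True  -> at-most-once: a crash after the commit but
                           before processing silently drops the message.
    commit_before=False -> at-least-once: nothing is lost, but a message
                           may be processed again after a restart.
    """
    processed, offset, crashed = [], 0, False
    while offset < len(messages):
        i = offset
        if commit_before:
            offset = i + 1             # commit BEFORE processing
        if not crashed and i == fail_once_at:
            crashed = True             # simulated crash mid-processing;
            continue                   # restart from last committed offset
        processed.append(messages[i])  # "process" the message
        if not commit_before:
            offset = i + 1             # commit AFTER processing
    return processed

# At-most-once loses the message that was in flight during the crash:
assert consume(["a", "b", "c"], commit_before=True, fail_once_at=1) == ["a", "c"]
# At-least-once resumes from the last committed offset instead:
assert consume(["a", "b", "c"], commit_before=False, fail_once_at=1) == ["a", "b", "c"]
```

Exactly-once semantics require making the processing step and the offset commit atomic, which is what Kafka's transactional producer/consumer machinery is for.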


Varite
