LatentView Analytics is a leading global analytics and decision sciences provider, delivering solutions that help companies drive digital transformation and use data to gain a competitive advantage. With analytics solutions that provide a 360-degree view of the digital consumer, fuel machine learning capabilities, and support artificial intelligence initiatives, LatentView Analytics enables leading global brands to predict new revenue streams, anticipate product trends and popularity, improve customer retention rates, optimize investment decisions, and turn unstructured data into valuable business assets.
Designation: Manager
Level: L4
Experience: 10 to 15 years
Location: Dallas, Texas, United States
Job Description:
We are seeking a highly skilled Kafka Integration Engineer to design, implement, and maintain data streaming solutions that enable seamless integration between enterprise systems. The role requires expertise in Apache Kafka (self-managed or Confluent Cloud) and strong knowledge of data integration, messaging, and distributed systems. You will collaborate with data engineers, architects, and business stakeholders to build reliable, scalable, and real-time data pipelines.
Key Responsibilities:
- Design, develop, and support Kafka-based data pipelines for real-time and batch data integration.
- Implement and manage Kafka Connect connectors (source/sink), Kafka Streams, and/or ksqlDB applications.
- Integrate Kafka with a variety of systems (databases, data lakes, cloud services, analytics platforms, APIs).
- Optimize Kafka topics, partitions, retention policies, and consumer group configurations for performance and reliability.
- Monitor and troubleshoot Kafka clusters, connectors, producers, and consumers to ensure high availability and low-latency processing.
- Implement schema management using Confluent Schema Registry (Avro/Protobuf/JSON).
- Automate deployment and configuration management of Kafka components using CI/CD pipelines and Infrastructure-as-Code tools (Terraform, Ansible, Helm, etc.).
- Work closely with Data Engineers and Application Developers to ensure smooth onboarding of new Kafka use cases.
- Establish best practices for error handling, data quality, scalability, and observability in streaming applications.
- Collaborate with DevOps and SRE teams to ensure security, scalability, and disaster recovery of Kafka environments.
Required Skills & Qualifications:
- Strong hands-on experience with Apache Kafka or Confluent Platform/Confluent Cloud.
- Proficiency with Kafka Connect, Kafka Streams, ksqlDB, and related ecosystem components.
- Experience with real-time data integration from RDBMS (SQL Server, Oracle, Postgres, MySQL), NoSQL databases, cloud storage (AWS S3, GCS, Azure Blob), and SaaS APIs.
- Solid understanding of distributed systems, event-driven architecture, and messaging patterns.
- Proficiency in one or more programming languages (Java, Python, Scala, or Go) for building Kafka producers/consumers.
- Familiarity with Docker, Kubernetes, and CI/CD practices for deploying and scaling Kafka applications.
- Knowledge of monitoring tools (Prometheus, Grafana, Confluent Control Center, Datadog, Splunk, etc.) to manage Kafka environments.
- Experience with schema management (Avro/Protobuf/JSON) and data serialization formats.
- Understanding of security concepts (SSL, SASL, RBAC, ACLs, Kerberos, OAuth).
- Excellent problem-solving, troubleshooting, and communication skills.
Preferred Qualifications:
- Experience with Debezium or other CDC tools for change data capture.
- Hands-on knowledge of Snowflake, BigQuery, Redshift, Databricks, or other cloud data warehouses.
- Exposure to microservices and event-driven application architecture.
- Experience working in multi-cloud or hybrid environments.
- Confluent Kafka Developer or Administrator certification is a plus.
Education & Experience:
- Bachelor’s or Master’s degree in Computer Science, Information Systems, or related field.
- 3–7 years of experience in data integration, streaming, or distributed systems.
- 3+ years of hands-on Kafka experience in a production environment.
At LatentView Analytics, we value a diverse, inclusive workforce and provide equal employment opportunities for all applicants and employees. All qualified applicants for employment will be considered without regard to an individual's race, color, sex, gender identity, gender expression, religion, age, national origin or ancestry, citizenship, physical or mental disability, medical condition, family care status, marital status, domestic partner status, sexual orientation, genetic information, military or veteran status, or any other basis protected by federal, state, or local laws.