
Senior Backend Engineer – Distributed Systems (Kafka, Data Ingestion)

Agivant Technologies
Full Time · Senior
Hyderabad, Telangana, IN · Posted April 22, 2026


Job Description

About The Role

We are seeking a Senior Backend Engineer to design and build high‑performance distributed data ingestion systems. You will work on parallel processing pipelines that integrate with cloud storage, relational databases, and streaming platforms. This is a core product development role in which you will apply best practices in distributed system design to ensure scalability, reliability, and performance.

Key Responsibilities

  • Design and implement data ingestion pipelines with parallel processing in Java, C++, or Golang.
  • Ingest data from Amazon S3, Azure Blob Storage, Google Cloud Storage, Snowflake, BigQuery, PostgreSQL, files, Kafka, and Apache Iceberg lakehouse tables.
  • Build high‑availability (HA) loading systems with cross‑region replication.
  • Develop monitoring and error reporting for ingestion pipelines.
  • Integrate with Spark connectors and manage third‑party systems (Kafka, Kafka Connect).
  • Collaborate with cross‑functional teams to ensure scalable, reliable, and performant distributed systems.
  • Follow Agile development practices and contribute to CI/CD pipelines.
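
As a rough illustration of the first responsibility above, here is a minimal sketch of a parallel ingestion step in Java. The class name, the in‑memory record source, and the partitioning scheme are all assumptions for illustration; a real pipeline would read from Kafka or object storage and write to a warehouse or data lake.

```java
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.stream.Collectors;
import java.util.stream.IntStream;
import java.util.ArrayList;

// Hypothetical sketch: split a batch of records across worker threads,
// mirroring how a Kafka consumer group spreads partitions across consumers.
public class ParallelIngest {

    // Simulated "ingest" step: in a real pipeline this would write the
    // records to object storage or a warehouse table.
    static int ingestPartition(List<String> records) {
        return records.size(); // pretend each record was loaded
    }

    // Partition records round-robin over numWorkers threads and ingest each
    // partition in parallel; returns the total number of records ingested.
    static int ingestAll(List<String> records, int numWorkers) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(numWorkers);
        try {
            List<Future<Integer>> futures = new ArrayList<>();
            for (int w = 0; w < numWorkers; w++) {
                final int worker = w;
                List<String> slice = IntStream.range(0, records.size())
                        .filter(i -> i % numWorkers == worker)
                        .mapToObj(records::get)
                        .collect(Collectors.toList());
                futures.add(pool.submit(() -> ingestPartition(slice)));
            }
            int total = 0;
            for (Future<Integer> f : futures) {
                total += f.get(); // propagate any worker failure
            }
            return total;
        } finally {
            pool.shutdown();
        }
    }

    public static void main(String[] args) throws Exception {
        List<String> batch = IntStream.range(0, 10)
                .mapToObj(i -> "record-" + i)
                .collect(Collectors.toList());
        System.out.println(ingestAll(batch, 4));
    }
}
```

Summing per‑worker results through `Future.get()` also surfaces failures from any worker, which is one simple way to get the error reporting mentioned above.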

Requirements

  • Strong programming skills in Java, C++, or Golang.
  • Hands‑on experience with Kafka, ZooKeeper, Spark, or stream processing frameworks.
  • Expertise in Kafka Connect, Kafka Streams, Kafka security, and customization.
  • Experience with Spark connectors and event‑driven architectures.
  • Familiarity with Agile development and CI/CD workflows.
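
To give a concrete flavor of the Kafka Connect expertise listed above, here is an illustrative connector configuration for Confluent's S3 sink connector. The topic, bucket, and connector names are assumptions for illustration only.

```json
{
  "name": "ingest-events-to-s3",
  "config": {
    "connector.class": "io.confluent.connect.s3.S3SinkConnector",
    "tasks.max": "4",
    "topics": "ingest.events",
    "s3.bucket.name": "example-ingest-bucket",
    "s3.region": "ap-south-1",
    "storage.class": "io.confluent.connect.s3.storage.S3Storage",
    "format.class": "io.confluent.connect.s3.format.json.JsonFormat",
    "flush.size": "1000"
  }
}
```

Here `tasks.max` controls the connector's parallelism, analogous to the parallel ingestion responsibilities described earlier in the posting.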

Nice to Have

  • Experience with gRPC protocol and multi‑threading.
  • Exposure to ZooKeeper, etcd, or Consul.
  • Understanding of distributed consensus algorithms (Paxos/Raft).
  • Knowledge of Docker and Kubernetes for containerized deployments.
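
The consensus point above rests on simple quorum arithmetic, which can be sketched in a few lines of Java. This is not a Raft or Paxos implementation, only the majority rule both families share: a leader is elected, or an entry commits, once a strict majority of the cluster has acknowledged it.

```java
// Illustrative sketch of quorum math used by Raft/Paxos-style consensus.
public class Quorum {

    // Smallest number of nodes that forms a strict majority of clusterSize.
    static int majority(int clusterSize) {
        return clusterSize / 2 + 1;
    }

    // True once the number of acknowledgements reaches the majority
    // threshold, i.e. the decision can no longer be lost to a competing
    // minority partition.
    static boolean hasQuorum(int acks, int clusterSize) {
        return acks >= majority(clusterSize);
    }

    public static void main(String[] args) {
        System.out.println(majority(5));      // 3
        System.out.println(hasQuorum(2, 5));  // false
        System.out.println(hasQuorum(3, 5));  // true
    }
}
```

This is also why such clusters are typically deployed with an odd number of nodes: a 4‑node cluster needs 3 acks for a majority, so it tolerates no more failures than a 3‑node cluster.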
