
Ninja Van

Staff Software Engineer (Data Analytics Platform)

Posted 11 Days Ago
Hyderabad, Telangana
Expert/Leader
As a Staff Software Engineer, you will design and develop data infrastructure using modern Lakehouse architecture, lead architectural decisions, and collaborate with stakeholders to deliver scalable data solutions. You'll also foster a data-driven culture and ensure robust data governance, while leading a small team of data engineers.

Ninja Van is a late-stage logtech startup that is disrupting a massive industry with innovation and cutting-edge technology. Launched in 2014 in Singapore, we have grown rapidly to become one of Southeast Asia's largest and fastest-growing express logistics companies. Since our inception, we've delivered to 100 million different customers across the region with added predictability, flexibility and convenience. Join us in our mission to connect shippers and shoppers across Southeast Asia to a world of new possibilities.


More about us: 

- We process 250 million API requests and 3TB of data every day.

- We deliver more than 2 million parcels every day.

- 100% network coverage with 2600+ hubs and stations in 6 SEA markets (Singapore, Malaysia, Indonesia, Thailand, Vietnam and Philippines), reaching 500 million consumers.

- 2 million active shippers across all e-commerce segments, from the largest marketplaces to individual social commerce sellers.

- Raised more than US$500 million over five rounds.


We are looking for world-class talent to join our crack team of engineers, product managers and designers. We want people who are passionate about creating software that makes a difference to the world. We like people who are brimming with ideas and who take initiative rather than wait to be told what to do. We prize team-first mentality, personal responsibility and tenacity to solve hard problems and meet deadlines. As part of a small and lean team, you will have a very direct impact on the success of the company.

Roles & Responsibilities

  • Design and develop a sophisticated data infrastructure for streaming, processing, and storage, aligned with a modern Lakehouse architecture.
  • Build and maintain tools for effective data monitoring and management using best practices in data engineering.
  • Lead key architectural decisions and implement significant data pipeline initiatives using the latest technologies.
  • Collaborate with various stakeholders to deliver scalable and performant data solutions, supporting all aspects of data handling from extraction to loading.
  • Promote knowledge sharing within the team, encouraging best practices for a data-driven culture.
  • Ensure robust data governance and compliance through well-defined data retention policies, backup strategies, and secure storage solutions.
  • Lead a team of 4–5 data engineers.

Required Qualifications

  • Bachelor’s or Master’s degree in Computer Science or related field from a top university.
  • Over 8 years of experience in data infrastructure, with a proven track record in building scalable and high-performance data systems.
  • Deep expertise in SQL and extensive experience with relational and NoSQL databases.
  • Advanced proficiency in Apache Kafka, Hadoop ecosystem, and stream-processing systems like Spark Streaming.
  • Familiarity with big data tools (Pig, Hive, Spark), data serialization frameworks (Protobuf, Thrift, Avro), and workflow management using Apache Airflow.
  • Knowledge of data catalog tools such as OpenMetadata and Unity Catalog is highly desirable.
  • Experience with infrastructure as code (e.g., Terraform, Ansible) and CDC technologies (e.g., Maxwell, Debezium) is advantageous.
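The qualifications above mention CDC (change data capture) tools such as Maxwell and Debezium. As a rough illustration only (not Ninja Van's actual code), a Debezium-style change event carries "before" and "after" row images plus an op code ("c" create, "u" update, "d" delete, "r" snapshot read), which a downstream consumer can route to upserts or deletes:

```python
import json

# Hypothetical sketch: route a Debezium-style change event to an
# upsert or delete, based on the event's op code.
def route_change_event(raw_event):
    payload = json.loads(raw_event)["payload"]
    op = payload["op"]
    if op in ("c", "u", "r"):
        # Creates, updates, and snapshot reads all carry the new row
        # image in "after"; apply it as an upsert downstream.
        return ("upsert", payload["after"])
    if op == "d":
        # Deletes carry only the old row image in "before".
        return ("delete", payload["before"])
    raise ValueError(f"unknown op: {op}")

event = json.dumps({
    "payload": {
        "before": {"id": 42, "status": "PENDING"},
        "after": {"id": 42, "status": "DELIVERED"},
        "op": "u",
        "source": {"table": "parcels"},
    }
})
print(route_change_event(event))  # → ('upsert', {'id': 42, 'status': 'DELIVERED'})
```

In practice such events would arrive via Kafka and land in Delta Lake tables, but the envelope shape is the core idea behind CDC-driven pipelines.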

Tech Stack

Backend: Play (Java 8+), Golang, Node.js, Python, FastAPI

Frontend: AngularJS, ReactJS

Mobile: Android, Flutter, React Native

Cache: Hazelcast, Redis

Data storage: MySQL, TiDB, Elasticsearch, Delta Lake

Infrastructure monitoring: Prometheus, Grafana

Orchestrator: Kubernetes

Containerization: Docker, Containerd

Cloud Provider: GCP, AWS

Data pipelines: Apache Kafka, Spark Streaming, Maxwell/Debezium, PySpark, TiCDC

Workflow manager: Apache Airflow

Query engines: Apache Spark, Trino


Submit a job application

By applying to the job, you acknowledge that you have read, understood and agreed to our Privacy Policy Notice (the “Notice”) and consent to the collection, use and/or disclosure of your personal data by Ninja Logistics Pte Ltd (the “Company”) for the purposes set out in the Notice. In the event that your job application or personal data was received from any third party pursuant to the purposes set out in the Notice, you warrant that such third party has been duly authorised by you to disclose your personal data to us for the purposes set out in the Notice.



