
Payoneer

Senior Data Engineer

Posted 12 Days Ago
Hybrid
Gurugram, Haryana
Senior level

About Payoneer

Founded in 2005, Payoneer is the global financial platform that removes friction from doing business across borders, with a mission to connect the world's underserved businesses to a rising global economy. We're a community of over 2,500 colleagues all over the world, working to serve customers and partners in over 190 countries and territories.

By taking the complexity out of financial workflows, from global payments and compliance to multi-currency and workforce management to working capital and business intelligence, we give businesses the tools they need to work efficiently worldwide and grow with confidence.

Role summary 

We’re looking for a Senior Data Engineer with a drive for excellence and an ownership mindset who can lead the design and delivery of scalable, secure, and highly reliable data platforms in a complex payments and fintech environment. You set the technical bar for your team: you architect systems, make sound trade-off decisions, unblock cross-team delivery, and mentor engineers. 

You’re deliberate about how AI-assisted development is adopted on your team, setting guardrails that prevent shortcuts from becoming long-term cognitive debt, while actively using AI to solve real engineering and business problems. 

AI-first mindset: We value engineers who can incorporate AI and agentic development practices into how we build data systems, setting patterns for responsible AI-assisted engineering across design reviews, code quality, testing, and documentation, while delivering data engineering-led AI use cases such as intelligent data quality and observability, anomaly detection, automated alert triage, and governance. 
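To make the anomaly-detection use case above concrete, here is a minimal sketch in Python, assuming the signal is a trailing window of daily row counts; the function name and threshold are hypothetical, and a production system would likely use seasonality-aware models rather than a plain z-score:

```python
import statistics

def flag_anomalous_run(history: list[int], latest: int, threshold: float = 3.0) -> bool:
    """Flag a pipeline run whose row count deviates sharply from recent history.

    Computes a z-score of the latest run against the trailing window; runs
    beyond `threshold` standard deviations are surfaced for alert triage.
    """
    mean = statistics.mean(history)
    stdev = statistics.stdev(history)
    if stdev == 0:
        return latest != mean
    return abs(latest - mean) / stdev > threshold

# A run that loads far fewer rows than usual is flagged for triage.
history = [10_200, 9_900, 10_050, 10_300, 9_980]
print(flag_anomalous_run(history, 4_000))   # anomalous drop
print(flag_anomalous_run(history, 10_100))  # within the normal range
```

The same gating shape generalises to other observability signals (null rates, freshness lag, schema drift counts) feeding an automated triage step.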

What You’ll Do 

  • Own the technical architecture for large-scale batch and streaming data pipelines that power product, risk, and reporting use cases, using frameworks such as Apache Beam, Spark, or Flink with managed runners like Google Cloud Dataflow. 
  • Lead data warehouse and lakehouse design for analytical and operational use cases, setting modelling standards and driving performance and cost optimisation. 
  • Design event-driven and streaming architectures with strong correctness guarantees: schema evolution, replay and backfill strategies, late-data handling, idempotency, and operational safety. 
  • Build and operate storage patterns for operational and analytical workloads using wide-column stores (Bigtable, Cassandra, HBase, or equivalents), including capacity planning and SLO definition. 
  • Establish orchestration and operational excellence using tools like Airflow, Composer, Dagster, or Prefect, including CI/CD strategy, automated testing, pipeline observability, and incident response practices. 
  • Drive data quality, governance, and auditability through automated controls, lineage/metadata practices, and secure-by-default access patterns. 
  • Lead through technical influence: mentor engineers, run design reviews, maintain decision records, unblock cross-team delivery, and shape roadmaps through clear technical reasoning. 
  • Set the standard for how your team uses AI-assisted development, ensuring AI-generated code meets the same review, testing, and documentation bar as any other code. Identify and deliver AI-driven solutions for data engineering problems such as intelligent data profiling, anomaly detection, and automated root-cause analysis, with a focus on reproducibility and governance. 
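Two of the correctness properties listed above, idempotency and late-data handling, can be sketched in plain Python independent of any particular streaming framework. All class and field names here are hypothetical illustrations, not part of any library API:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Event:
    event_id: str   # unique key: the basis for idempotent writes
    ts: int         # event time (epoch seconds)
    amount: int

class IdempotentSink:
    """Toy sink showing two guarantees from the bullets above: replays of
    the same event are no-ops (idempotency), and events arriving behind
    the watermark are routed to a side output (late-data handling)."""

    def __init__(self, allowed_lateness: int):
        self.allowed_lateness = allowed_lateness
        self.watermark = 0
        self.seen: set[str] = set()
        self.total = 0
        self.late: list[Event] = []

    def write(self, event: Event) -> None:
        self.watermark = max(self.watermark, event.ts)
        if event.ts < self.watermark - self.allowed_lateness:
            self.late.append(event)   # too late: side output for backfill
            return
        if event.event_id in self.seen:
            return                    # replayed event: safe no-op
        self.seen.add(event.event_id)
        self.total += event.amount

sink = IdempotentSink(allowed_lateness=60)
sink.write(Event("a", 100, 5))
sink.write(Event("a", 100, 5))      # replay after a retry: ignored
sink.write(Event("b", 200, 7))
sink.write(Event("c", 100, 9))      # 100 < 200 - 60: routed to late
```

Frameworks like Beam and Flink provide these semantics natively (keyed state, allowed lateness, side outputs); the sketch only shows the invariants a design review would check for.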

Who You Are 

  • You are a seasoned data engineer with a strong sense of ownership and accountability, comfortable operating in complex domains and driving end-to-end delivery from design through production operations. 
  • You balance long-term platform thinking with pragmatic execution, and you know how to raise reliability and quality without slowing teams down. 
  • You thrive in cross-team environments and influence through technical leadership, clarity, and strong execution. 
  • You are impact-driven and measure success by how effectively you enable teams, improve data trust, and accelerate product outcomes. 
  • You think critically about AI adoption in engineering workflows: you see the leverage it provides, but you also understand the risks of unchecked reliance, such as reduced understanding, hidden errors, and maintenance burden. You set patterns that capture the upside while protecting quality. 

Key skills and competencies 

  • Strong track record delivering production-grade data pipelines and datasets end-to-end: design, implementation, deployment, and operations. 
  • Deep experience with distributed data processing frameworks such as Apache Beam, Spark, or Flink. 
  • Expertise with cloud data warehouses (BigQuery, Snowflake, Redshift, or Databricks) including strong dimensional modelling, query optimisation, and cost management skills. 
  • Hands-on experience designing systems on event streaming platforms including schema management, delivery semantics trade-offs, and operational patterns like replay and backfill. 
  • Experience with operational and wide-column stores (Bigtable, Cassandra, HBase, or equivalents) with a strong understanding of access-pattern-driven design and capacity planning. 
  • Strong orchestration and platform engineering experience with Airflow, Dagster, or Prefect, including CI/CD, automated testing, observability, and incident response. 
  • Demonstrated technical leadership: mentoring, design reviews, architectural decision records, and the ability to influence stakeholders and align teams around technical direction. 
  • A considered approach to AI-assisted engineering: you use AI tools to improve throughput and quality, but you also set guardrails, review standards, testing expectations, and documentation requirements. 
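The automated-testing expectation above can be illustrated with a small, framework-agnostic unit test for a pipeline transform; the transform and its field names are hypothetical, standing in for the kind of pure function a CI gate would cover:

```python
def normalise_currency(records: list[dict]) -> list[dict]:
    """Hypothetical transform: convert amounts from minor units (cents)
    to major units and drop malformed records."""
    out = []
    for r in records:
        if "amount_minor" not in r or r["amount_minor"] < 0:
            continue  # malformed input: excluded rather than propagated
        out.append({**r, "amount": r["amount_minor"] / 100})
    return out

def test_normalise_currency():
    records = [
        {"id": 1, "amount_minor": 1250},
        {"id": 2, "amount_minor": -5},   # negative amount: dropped
        {"id": 3},                       # missing field: dropped
    ]
    result = normalise_currency(records)
    assert [r["id"] for r in result] == [1]
    assert result[0]["amount"] == 12.5

test_normalise_currency()
```

Keeping transforms as pure functions like this is what makes them cheap to test in CI, before any orchestrator or cluster is involved.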

Preferred 

  • Prior experience in fintech, payments, lending, or broader financial services (e.g., reconciliation, settlement, risk and fraud data, regulatory reporting). 
  • Experience operating data systems with defined SLOs/SLAs and governance in cloud environments. 
  • Familiarity with data governance and compliance standards: PII handling, access controls, auditing, and policy-as-code patterns. 
  • Experience partnering with ML/DS teams to productionise features, training datasets, and monitoring infrastructure. 

The Payoneer Ways of Working 

Act as our customer’s partner on the inside
Learning what they need and creating what will help them go further. 

Do it. Own it.
Being fearlessly accountable in everything we do. 

Continuously improve
Always striving for a higher standard than our last. 

Build each other up 
Helping each other grow, as professionals and people. 

If this sounds like a business, a community, and a mission you want to be part of, apply today.

We are committed to providing a diverse and inclusive workplace. Payoneer is an equal opportunity employer, and all qualified applicants will receive consideration for employment regardless of race, color, ancestry, religion, sex, sexual orientation, gender identity, national origin, age, disability status, protected veteran status, or any other characteristic protected by law. If you require reasonable accommodation at any stage of the hiring process, please speak to the recruiter managing the role. Decisions about requests for reasonable accommodation are made on a case-by-case basis.

Top Skills

Airflow
Apache Beam
BigQuery
Bigtable
Cassandra
Dagster
Databricks
Flink
Google Cloud Dataflow
HBase
Prefect
Redshift
Snowflake
Spark


