Company Description
We are looking for a Data Platform Engineer to design, build, and optimize the data infrastructure that powers bot defense and anti-malvertising solutions for AdTech.
You will design and maintain a scalable data platform that processes large volumes of AdTech traffic data, balancing efficient signal collection with performance while adapting to evolving privacy frameworks.
Does this sound like an interesting opportunity? Keep reading, and let’s discuss your future role!
CUSTOMER
Our client provides comprehensive security solutions designed to protect individuals, organizations, and communities. They operate across various sectors to address vulnerabilities and ensure protection against a wide range of risks, especially in the digital space. Their products focus on bot defense and anti-malvertising solutions specifically for AdTech companies. This includes onboarding and integration processes for different partners such as Supply-Side Platforms (SSPs), Demand-Side Platforms (DSPs), agencies, and publishers. They offer protection solutions for a variety of use cases, along with a customizable reporting console and APIs that seamlessly integrate into clients' systems.
Job Description
- Support existing processes, improve them, and develop new solutions, with a focus on platform reliability and scalability
- Enhance and maintain Python services
- Contribute to the design of new architecture components and implement them
- Migrate and refactor components, and introduce new tools such as Flink
- Optimize system architecture; manage file transfers between local storage, AWS S3, and Snowflake; and review and optimize code in Python and Java
- Migrate and consolidate three Airflow instances into a single instance
- Set up a proof of value (PoV) with a third-party hosted version of Airflow to support a go/no-go decision
Qualifications
- 5+ years of overall experience in data and software engineering
- Strong experience with Java
- Strong SQL skills
- Hands-on experience with Apache Airflow
- Experience working with AWS and S3
- Proven ability to build and maintain data pipelines
WOULD BE A PLUS:
- Experience with Python
- Knowledge of Snowflake
- Familiarity with Flink or Spark