
dentsu

Lead Enterprise Architect

Posted 21 Days Ago
In-Office
5 Locations
Senior level
The purpose of this role is to understand, model, and facilitate change in a significant area of the business and technology portfolio, whether by line of business, geography, or specific architecture domain, while building the company's overall Architecture capability and knowledge base.

Job Description:

Role Overview:
We are seeking a highly skilled and motivated Cloud Data Engineering Manager to join our team. The role is critical to the development of a cutting-edge reporting platform designed to measure and optimize online marketing campaigns.

The GCP Data Engineering Manager will design, implement, and maintain scalable, reliable, and efficient data solutions on Google Cloud Platform (GCP). The role focuses on enabling data-driven decision-making by developing ETL/ELT pipelines, managing large-scale datasets, and optimizing data workflows. The ideal candidate is a proactive problem-solver with strong technical expertise in GCP, a passion for data engineering, and a commitment to delivering high-quality solutions aligned with business needs.

Key Responsibilities:

Data Engineering & Development:

  • Design, build, and maintain scalable ETL/ELT pipelines for ingesting, processing, and transforming structured and unstructured data (a minimal pipeline sketch follows this list).
  • Implement enterprise-level data solutions using GCP services such as BigQuery, Dataform, Cloud Storage, Dataflow, Cloud Functions, Cloud Pub/Sub, and Cloud Composer.
  • Develop and optimize data architectures that support real-time and batch data processing.
  • Build, optimize, and maintain CI/CD pipelines using tools like Jenkins, GitLab, or Google Cloud Build.
  • Automate testing, integration, and deployment processes to ensure fast and reliable software delivery.
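
As a hedged illustration of the pipeline work above, here is a minimal Apache Beam job (the SDK that Dataflow runs) reading click events from Cloud Pub/Sub and appending them to BigQuery. The project, topic, table, and field names are illustrative assumptions, not details from this posting.

    import json

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    def parse_event(message: bytes) -> dict:
        """Decode one Pub/Sub message into the row shape BigQuery expects."""
        event = json.loads(message.decode("utf-8"))
        return {"campaign_id": event["campaign_id"], "clicks": int(event["clicks"])}

    def run() -> None:
        # streaming=True because Pub/Sub is an unbounded source; Dataflow
        # runner flags (project, region, temp_location) would be passed here.
        options = PipelineOptions(streaming=True)
        with beam.Pipeline(options=options) as pipeline:
            (
                pipeline
                | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                    topic="projects/my-project/topics/clickstream")  # hypothetical topic
                | "ParseJSON" >> beam.Map(parse_event)
                | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                    "my-project:marketing.campaign_clicks",  # hypothetical table
                    write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND)
            )

    if __name__ == "__main__":
        run()

Because parse_event is a pure function, it is also the natural target for the automated tests the CI/CD bullets call for.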

Cloud Infrastructure Management:

  • Manage and deploy GCP infrastructure components to enable seamless data workflows.
  • Ensure data solutions are robust, scalable, and cost-effective, leveraging GCP best practices.

Infrastructure Automation and Management:

  • Design, deploy, and maintain scalable and secure infrastructure on GCP.
  • Implement Infrastructure as Code (IaC) using tools like Terraform (see the sketch after this list).
  • Manage Kubernetes clusters (GKE) for containerized workloads.
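
The posting names Terraform, whose configurations are written in HCL; to keep every example here in Python, the same declarative idea is sketched below with Pulumi's Python SDK (a swapped-in tool, not one the posting requires). Resource names, region, and sizing are assumptions.

    import pulumi
    import pulumi_gcp as gcp

    # Staging bucket for pipeline artifacts (name and region are hypothetical).
    staging = gcp.storage.Bucket(
        "etl-staging",
        location="ASIA-SOUTH1",
        uniform_bucket_level_access=True,
    )

    # Small GKE cluster for containerized workloads.
    cluster = gcp.container.Cluster(
        "data-platform",
        location="asia-south1",
        initial_node_count=1,
    )

    pulumi.export("bucket_name", staging.name)
    pulumi.export("cluster_endpoint", cluster.endpoint)

Declaring resources this way gives the same review-and-plan workflow as Terraform: the diff between declared and actual state is computed before anything changes.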

Collaboration and Stakeholder Engagement:

  • Work closely with cross-functional teams, including data analysts, data scientists, DevOps, and business stakeholders, to deliver data projects aligned with business goals.
  • Translate business requirements into scalable, technical solutions while collaborating with team members to ensure successful implementation.

Quality Assurance & Optimization:

  • Implement best practices for data governance, security, and privacy, ensuring compliance with organizational policies and regulations.
  • Conduct thorough quality assurance, including testing and validation, to ensure the accuracy and reliability of data pipelines (a sample freshness check follows this list).
  • Monitor and optimize pipeline performance to meet SLAs and minimize operational costs.
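
One lightweight way to back the QA and SLA bullets above is a freshness gate that fails loudly when a table stops receiving rows. This sketch uses the google-cloud-bigquery client; the table name, the ingested_at column, and the two-hour threshold are assumptions for illustration.

    from google.cloud import bigquery

    def check_freshness(table: str, max_lag_hours: int = 2) -> None:
        """Raise if the newest row in `table` is older than the SLA allows."""
        client = bigquery.Client()
        sql = f"""
            SELECT TIMESTAMP_DIFF(CURRENT_TIMESTAMP(), MAX(ingested_at), HOUR) AS lag_hours
            FROM `{table}`
        """
        row = list(client.query(sql).result())[0]
        if row.lag_hours is None or row.lag_hours > max_lag_hours:
            raise RuntimeError(
                f"{table} is stale: lag={row.lag_hours}h exceeds {max_lag_hours}h SLA")

    if __name__ == "__main__":
        check_freshness("my-project.marketing.campaign_clicks")  # hypothetical table

A check like this runs naturally as the final task of each pipeline, or on its own schedule in Cloud Composer.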

Qualifications and Certifications:

  • Education:
    • Bachelor’s or Master’s degree in Computer Science, Information Technology, Engineering, or a related field.
  • Experience:
    • 7 to 9 years of experience in data engineering, with at least 4 years working on GCP.
    • Proven experience designing and implementing data workflows using GCP services such as BigQuery, Dataform, Cloud Dataflow, Cloud Pub/Sub, and Cloud Composer.
  • Certifications:
    • Google Cloud Professional Data Engineer certification preferred.

Key Skills:

  • Mandatory Skills:
    • Advanced proficiency in Python for data pipelines and automation.
    • Strong SQL skills for querying, transforming, and analyzing large datasets.
    • Strong hands-on experience with GCP services, including Cloud Storage, Dataflow, Cloud Pub/Sub, Cloud SQL, BigQuery, Dataform, Compute Engine, and Kubernetes Engine (GKE).
    • Hands-on experience with CI/CD tools such as Jenkins, GitHub, or Bitbucket.
    • Proficiency in Docker, Kubernetes, and Terraform or Ansible for containerization, orchestration, and infrastructure as code (IaC).
    • Familiarity with workflow orchestration tools like Apache Airflow or Cloud Composer (see the DAG sketch after this list).
    • Strong understanding of Agile/Scrum methodologies.
  • Nice-to-Have Skills:
    • Experience with other cloud platforms like AWS or Azure.
    • Knowledge of data visualization tools (e.g., Power BI, Looker, Tableau).
    • Understanding of machine learning workflows and their integration with data pipelines.
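
To ground the orchestration bullet above, here is a minimal Airflow DAG of the kind Cloud Composer schedules. The DAG id, schedule, and task bodies are illustrative placeholders.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract() -> None:
        print("pull raw campaign data from the source API")

    def load() -> None:
        print("load transformed rows into BigQuery")

    with DAG(
        dag_id="campaign_reporting_daily",  # hypothetical name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",  # 'schedule_interval' on Airflow < 2.4
        catchup=False,
    ) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        load_task = PythonOperator(task_id="load", python_callable=load)
        extract_task >> load_task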

Soft Skills:

  • Strong problem-solving and critical-thinking abilities.
  • Excellent communication skills to collaborate with technical and non-technical stakeholders.
  • Proactive attitude towards innovation and learning.
  • Ability to work independently and as part of a collaborative team.

Location:

Bengaluru

Brand:

Merkle

Time Type:

Full time

Contract Type:

Permanent

Top Skills

Ansible
Apache Airflow
BigQuery
Cloud Composer
Cloud Functions
Cloud Pub/Sub
Cloud Storage
Dataflow
Dataform
Docker
GitLab
Google Cloud Platform
Jenkins
Kubernetes
Python
SQL
Terraform

