We are seeking a highly skilled and motivated Senior GCP Data Engineer to join our team. The role is critical to the development of a cutting-edge data platform and product for a Fortune 50 company.
The GCP Data Engineer will design, implement, and maintain scalable, reliable, and efficient data solutions on Google Cloud Platform (GCP).
The role focuses on enabling data-driven decision-making by developing ETL/ELT pipelines, managing large-scale datasets, and optimizing data workflows. The ideal candidate is a proactive problem-solver with strong technical expertise in GCP, a passion for data engineering, and a commitment to delivering high-quality solutions aligned with business needs.
Job Description:
Key Responsibilities:
Data Engineering & Development:
- Design, build, and maintain scalable ETL/ELT pipelines for ingesting, processing, and transforming structured and unstructured data.
- Implement enterprise-level data solutions using GCP services such as BigQuery, Cloud Storage, Dataflow, Cloud Functions, Cloud Pub/Sub, and Cloud Composer.
- Develop and optimize data architectures that support real-time and batch data processing.
- Develop RESTful APIs using Python and Flask to expose data services and integrate with upstream/downstream systems (see the sketch after this list).
- Enable secure API-based interactions between data pipelines, applications, and external interfaces.
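To make the API responsibility above concrete, here is a minimal sketch of a Flask endpoint exposing BigQuery data as a read-only service. It is an illustration only: the project, dataset, table (`my-project.analytics.daily_orders`), and route are hypothetical placeholders, not details from this role.

```python
# Minimal sketch of a Flask data-service endpoint backed by BigQuery.
# Project, dataset, table, and route names are illustrative placeholders.
from flask import Flask, jsonify, request
from google.cloud import bigquery

app = Flask(__name__)
bq = bigquery.Client()  # uses Application Default Credentials

@app.route("/api/v1/daily-orders")
def daily_orders():
    # A parameterized query guards against SQL injection from user input.
    day = request.args.get("date", "2024-01-01")
    sql = """
        SELECT order_id, total_amount
        FROM `my-project.analytics.daily_orders`
        WHERE order_date = @day
        LIMIT 1000
    """
    job = bq.query(
        sql,
        job_config=bigquery.QueryJobConfig(
            query_parameters=[bigquery.ScalarQueryParameter("day", "DATE", day)]
        ),
    )
    rows = [
        {"order_id": row["order_id"], "total_amount": float(row["total_amount"])}
        for row in job.result()
    ]
    return jsonify({"date": day, "rows": rows})

if __name__ == "__main__":
    app.run(port=8080)
```

Passing user input through query parameters rather than string interpolation is the usual design choice when a data-service endpoint accepts filters from callers.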
Cloud Infrastructure Management:
- Manage and deploy GCP infrastructure components to enable seamless data workflows.
- Ensure data solutions are robust, scalable, and cost-effective, leveraging GCP best practices.
Collaboration and Stakeholder Engagement:
- Work closely with cross-functional teams, including data analysts, data scientists, DevOps, and business stakeholders, to deliver data projects aligned with business goals.
- Translate business requirements into scalable, technical solutions while collaborating with team members to ensure successful implementation.
Quality Assurance & Optimization:
- Implement best practices for data governance, security, and privacy, ensuring compliance with organizational policies and regulations.
- Conduct thorough quality assurance, including testing and validation, to ensure the accuracy and reliability of data pipelines (a validation sketch follows this list).
- Monitor and optimize pipeline performance to meet SLAs and minimize operational costs.
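As a hedged sketch of what pipeline validation can look like in practice, the check below queries a freshly loaded BigQuery partition and fails loudly on empty loads or NULL business keys. The table and column names (`my-project.analytics.daily_orders`, `order_date`, `order_id`) are assumptions for illustration, not specifics of this role.

```python
# Minimal sketch of a post-load data-quality check against BigQuery.
# Table, column names, and thresholds are illustrative assumptions.
from google.cloud import bigquery

def validate_partition(table: str, date_col: str, key_col: str, day: str) -> None:
    """Raise if the given day's partition is empty or has NULL business keys."""
    bq = bigquery.Client()
    sql = f"""
        SELECT
          COUNT(*) AS row_count,
          COUNTIF({key_col} IS NULL) AS null_keys
        FROM `{table}`
        WHERE {date_col} = @day
    """
    job = bq.query(
        sql,
        job_config=bigquery.QueryJobConfig(
            query_parameters=[bigquery.ScalarQueryParameter("day", "DATE", day)]
        ),
    )
    stats = list(job.result())[0]
    if stats.row_count == 0:
        raise ValueError(f"{table}: no rows loaded for {day}")
    if stats.null_keys > 0:
        raise ValueError(f"{table}: {stats.null_keys} NULL {key_col} values on {day}")

if __name__ == "__main__":
    validate_partition("my-project.analytics.daily_orders",
                       "order_date", "order_id", "2024-01-01")
```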
Qualifications and Certifications:
- Education:
- Bachelor’s or Master’s degree in Computer Science, Information Technology, Engineering, or a related field.
- Experience:
- 3 to 6 years of experience in data engineering, including at least 2 years working on Google Cloud Platform.
- Proven experience designing and implementing data workflows using GCP services like BigQuery, Cloud Dataflow, Cloud Pub/Sub, and Cloud Composer.
- Experience developing and deploying RESTful APIs using Flask for data integration and system interoperability.
- Certifications:
- Google Cloud Professional Data Engineer certification preferred.
Key Skills:
- Mandatory Skills:
- Advanced proficiency in Python for data pipelines and automation.
- Strong SQL skills for querying, transforming, and analyzing large datasets.
- Expertise in GCP services such as BigQuery, Cloud Functions, Cloud Storage, Dataflow, and Kubernetes (GKE).
- Hands-on experience with CI/CD and version control tools such as Jenkins, Git, and Bitbucket.
- Familiarity with workflow orchestration tools like Apache Airflow or Cloud Composer (a DAG sketch follows the skills lists below).
- Nice-to-Have Skills:
- Experience with other cloud platforms like AWS or Azure.
- Knowledge of data visualization tools (e.g., Looker, Tableau).
- Understanding of machine learning workflows and their integration with data pipelines.
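For the orchestration skills listed above, a minimal Airflow DAG of the kind Cloud Composer executes might look like the sketch below: a GCS-to-BigQuery load followed by a SQL transform. Bucket, dataset, table names, and the schedule are illustrative assumptions, not requirements of this role.

```python
# Minimal sketch of an Airflow DAG as deployed on Cloud Composer.
# Bucket, dataset/table names, and the schedule are illustrative assumptions.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)

with DAG(
    dag_id="daily_orders_elt",
    schedule_interval="0 2 * * *",  # daily at 02:00
    start_date=datetime(2024, 1, 1),
    catchup=False,
) as dag:
    # Load the day's raw files from Cloud Storage into a staging table.
    load_raw = GCSToBigQueryOperator(
        task_id="load_raw",
        bucket="my-raw-bucket",
        source_objects=["orders/{{ ds }}/*.json"],
        destination_project_dataset_table="my-project.staging.orders",
        source_format="NEWLINE_DELIMITED_JSON",
        autodetect=True,
        write_disposition="WRITE_TRUNCATE",
    )

    # Transform staging rows into the curated reporting table.
    transform = BigQueryInsertJobOperator(
        task_id="transform",
        configuration={
            "query": {
                "query": """
                    INSERT INTO `my-project.analytics.daily_orders`
                    SELECT order_id, order_date, total_amount
                    FROM `my-project.staging.orders`
                    WHERE order_date = DATE('{{ ds }}')
                """,
                "useLegacySql": False,
            }
        },
    )

    load_raw >> transform
```

Templating on `{{ ds }}` (the logical run date) is what lets a single DAG definition backfill or reprocess any day's partition.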
Soft Skills:
- Strong problem-solving and critical-thinking abilities.
- Excellent communication skills to collaborate with technical and non-technical stakeholders.
- Proactive attitude towards innovation and learning.
- Ability to work independently and as part of a collaborative team.
Location: Mumbai
Brand: Merkle
Time Type: Full time
Contract Type: Permanent