Job Title: GCP & Big Data Engineer
Experience Required: 4–7 Years
Location: Indore, Raipur, Bangalore, Gurugram
Role Overview
We are looking for a skilled GCP & Big Data Engineer with strong expertise in cloud data platforms and large-scale data processing. The ideal candidate will design, build, and optimise scalable data pipelines using Google Cloud services and Big Data technologies, ensuring efficient data processing, storage, and analytics.
Key Responsibilities
->Design, develop, and maintain scalable data pipelines on Google Cloud Platform.
->Work extensively with Google Cloud Storage (GCS), Dataproc, BigQuery, and Cloud Composer (Airflow) for data ingestion, processing, orchestration, and analytics.
->Develop and optimise PySpark-based data processing jobs for large datasets (an illustrative sketch follows this list).
->Ensure data quality, reliability, and performance through monitoring and optimisation.
->Collaborate with data analysts, architects, and business stakeholders to understand requirements and deliver data solutions.
->Implement best practices in data engineering, including data governance, security, and cost optimisation.
->Troubleshoot performance bottlenecks and provide scalable solutions.
->Maintain documentation for pipelines, workflows, and system architecture.
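For illustration only, here is a minimal sketch of the kind of PySpark pipeline this role involves: reading raw data from GCS, applying a simple transformation, and loading the result into BigQuery. All bucket, dataset, and table names are hypothetical, and the sketch assumes the spark-bigquery connector is available on the Dataproc cluster (recent Dataproc images bundle it).

# Hypothetical example: read CSV events from GCS, aggregate, write to BigQuery.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("gcs-to-bigquery-sketch").getOrCreate()

# Read raw events from a GCS bucket (hypothetical path).
events = spark.read.option("header", True).csv("gs://example-bucket/raw/events/*.csv")

# Example transformation: daily event counts per user.
daily_counts = (
    events.withColumn("event_date", F.to_date("event_timestamp"))
    .groupBy("user_id", "event_date")
    .agg(F.count("*").alias("event_count"))
)

# The spark-bigquery connector stages data through a temporary GCS bucket.
(
    daily_counts.write.format("bigquery")
    .option("table", "example_dataset.daily_event_counts")
    .option("temporaryGcsBucket", "example-temp-bucket")
    .mode("overwrite")
    .save()
)

spark.stop()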
Required Skills & Qualifications
->4–7 years of experience in Big Data engineering and cloud-based data platforms.
->Strong hands-on experience with Google Cloud Platform (GCS, Dataproc, BigQuery, Composer/Airflow).
->Proficiency in PySpark and distributed data processing frameworks.
->Solid understanding of ETL/ELT processes, data warehousing, and data modelling.
->Experience with workflow orchestration tools and pipeline automation (a minimal Composer DAG sketch follows this list).
->Good knowledge of SQL, scripting languages, and performance tuning.
->Strong analytical, problem-solving, and communication skills.
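As a further illustration of the orchestration experience mentioned above, the following is a minimal Cloud Composer (Airflow) DAG that submits the PySpark job sketched earlier to an existing Dataproc cluster on a daily schedule. Project, region, cluster, and file names are hypothetical; the operator comes from the apache-airflow-providers-google package, which Composer bundles, and the schedule argument assumes Airflow 2.4 or newer.

# Hypothetical example: schedule the PySpark job above via Composer/Airflow.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.dataproc import DataprocSubmitJobOperator

# Job spec pointing at the (hypothetical) PySpark script staged in GCS.
PYSPARK_JOB = {
    "reference": {"project_id": "example-project"},
    "placement": {"cluster_name": "example-cluster"},
    "pyspark_job": {"main_python_file_uri": "gs://example-bucket/jobs/gcs_to_bq.py"},
}

with DAG(
    dag_id="daily_event_counts",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    DataprocSubmitJobOperator(
        task_id="submit_pyspark_job",
        job=PYSPARK_JOB,
        region="us-central1",  # hypothetical region
        project_id="example-project",
    )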
Preferred / Good to Have
->Experience with CI/CD pipelines and DevOps practices.
->Exposure to other cloud platforms or modern data stack tools.
->Knowledge of data security, governance, and compliance standards.