
Nexaminds

Senior Databricks Engineer (Exp: 8-12 yrs | Databricks, PySpark, Apache Spark, Python, SQL, Azure, DevOps)

In-Office or Remote
4 Locations
Senior level

Unlock Your Future with Nexaminds!

At Nexaminds, we're on a mission to redefine industries with AI. We're passionate about the limitless potential of artificial intelligence to transform businesses, streamline processes, and drive growth.

Join us on our visionary journey. We're leading the way in AI solutions, and we're committed to innovation, collaboration, and ethical practices. Become a part of our team and shape the future powered by intelligent machines. If you're driven by ambition, success, fun, and learning, Nexaminds is where you belong.

 

🚀 Are you a PRO at developing Databricks pipelines and at enhancing or supporting existing ones?

Are you strong in Databricks, PySpark, Apache Spark, SQL, and Python, with sharp debugging skills for business-critical data and extensive exposure to Azure?

Have you worked hands-on in E-Commerce or Retail domains?

Then don't wait any further: your next big opportunity is here! 🌟

Join us at Nexaminds and be part of an exciting journey where innovation meets impact. The benefits are unbelievable, and so is the experience you'll gain!


Role Summary

We are seeking a Data Engineer with strong Databricks expertise to design, build, and maintain scalable, high-performance data pipelines on cloud platforms. The role focuses on developing production-grade ETL/ELT pipelines, enabling data modernization initiatives, and ensuring data quality, governance, and security across enterprise data platforms.

You will work closely with data engineers, analysts, and business stakeholders to deliver reliable, cost-efficient, and scalable data solutions, primarily on Azure.

Key Responsibilities

  • Design, build, and maintain scalable data pipelines using Databricks (PySpark, Apache Spark, Delta Lake, SQL, Python); a brief sketch of such a pipeline follows this list.
  • Develop and optimize ETL/ELT workflows for performance, reliability, and cost efficiency.
  • Implement data quality, data profiling, governance, and security best practices.
  • Design and maintain data models to support analytics, reporting, and downstream consumption.
  • Collaborate with data engineers, analysts, and business stakeholders to define and implement data requirements.
  • Troubleshoot and resolve issues across data workflows, Spark jobs, and distributed systems.
  • Support cloud data platform modernization and migration initiatives.
  • Automate workflows using Databricks Workflows / Jobs and scheduling tools.
  • Participate in code reviews and contribute to engineering best practices.
  • Work within Agile/Scrum teams to deliver data solutions iteratively.
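
To give a concrete flavour of this kind of work, here is a minimal PySpark sketch of an ETL step that reads raw data, cleanses it, and writes a Delta Lake table. It is an illustration only: the storage path, column names, and table name are hypothetical, and the spark session is the one Databricks provides on a cluster.

  # Illustrative ETL step: raw JSON -> cleansed Delta table (path, columns, and table name are hypothetical)
  from pyspark.sql import functions as F

  # Read raw order events from a landing zone in ADLS
  raw = spark.read.json("abfss://landing@examplestorage.dfs.core.windows.net/orders/")

  # Basic cleansing: drop rows missing key fields and normalise types
  orders = (
      raw.dropna(subset=["order_id", "order_ts"])
         .withColumn("order_ts", F.to_timestamp("order_ts"))
         .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
  )

  # Persist as a Delta Lake table for downstream analytics and reporting
  orders.write.format("delta").mode("overwrite").saveAsTable("analytics.orders_curated")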

Must-Have Skills & Experience

  • 10+ years of experience in Data Engineering
  • Solid understanding of the Azure cloud platform.
  • Strong hands-on expertise with Databricks, PySpark, Apache Spark, Delta Lake, and Databricks SQL.
  • Excellent programming skills in Python and SQL.
  • Experience building production-grade ETL/ELT pipelines.
  • Experience with data modeling, data profiling, data warehousing, and distributed computing concepts; a short profiling sketch follows this list.
  • Working knowledge of Shell Scripting for automation.
  • Experience with Azure Event Hubs, GitHub, and Terraform.
  • Experience using JFrog Artifactory or a similar artifact repository for artifact management.
  • Understanding of cloud security and access controls.
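
As a small illustration of the data-profiling side of the role, the sketch below counts rows, duplicate business keys, and nulls per column for a Delta table. The table and key names are hypothetical; the checks themselves are plain PySpark.

  # Illustrative profiling checks on a Delta table (table and key names are hypothetical)
  from pyspark.sql import functions as F

  df = spark.table("analytics.orders_curated")

  # Row count and duplicate check on the assumed business key
  total_rows = df.count()
  duplicate_keys = df.groupBy("order_id").count().filter("count > 1").count()

  # Null count per column, collected into a single summary row
  null_counts = df.select(
      [F.sum(F.col(c).isNull().cast("int")).alias(c) for c in df.columns]
  )

  print(f"rows={total_rows}, duplicate order_ids={duplicate_keys}")
  null_counts.show(truncate=False)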

Nice-to-Have Skills

  • Exposure to CI/CD pipelines for data engineering workloads.
  • Knowledge of streaming data processing.
  • Familiarity with Azure DevOps or similar tools.
  • Experience supporting large-scale analytics or enterprise data platforms.

Soft Skills

  • Strong analytical and problem-solving skills.
  • Ability to work independently and in cross-functional teams.
  • Excellent communication skills to interact with technical and non-technical stakeholders.
  • Proactive mindset with attention to data accuracy and reliability.

 

What you can expect from us

Here at Nexaminds, we're not your typical workplace. We're all about creating a friendly and trusting environment where you can thrive. Why does this matter? Well, trust and openness lead to better quality, innovation, commitment to getting the job done, efficiency, and cost-effectiveness.

  • Stock options 📈
  • Remote work options 🏠
  • Flexible working hours 🕜
  • Benefits above the legal minimum
  • But it's not just about the work; it's about the people too. You'll be collaborating with some seriously awesome IT pros.
  • You'll have access to mentorship and tons of opportunities to learn and level up.

Ready to embark on this journey with us? 🚀🎉 If you're feeling the excitement, go ahead and apply!

Top Skills

Spark
Azure
Azure DevOps
Azure Event Hubs
Databricks
Delta Lake
Git
JFrog Artifactory
PySpark
Python
Shell Scripting
SQL
Terraform


