
Fractal

Data Engineer

Reposted Yesterday
In-Office
5 Locations
Senior level

It's fun to work in a company where people truly BELIEVE in what they are doing!

We're committed to bringing passion and customer focus to the business.

If you are an extraordinary developer who loves to push the boundaries to solve complex business problems with creative solutions, then we would like to talk with you. As a Data Engineer - Azure, you will work in the Technology team that delivers our Data Engineering offerings at large scale to our Fortune clients worldwide. The role is responsible for innovating, building, and maintaining technology services.
Experience required: 5-10 years
RESPONSIBILITIES:
  • Be an integral part of large-scale client business development and delivery engagements
  • Develop the software and systems needed for end-to-end execution on large projects
  • Work across all phases of SDLC, and use Software Engineering principles to build scaled solutions
  • Build the knowledge base required to deliver increasingly complex technology projects
  • Bring team leadership, problem-solving, project management, communication, and creative-thinking skills
QUALIFICATIONS:
  • A bachelor’s degree in Computer Science or related field with 6-12 years of technology experience
  • Strong experience in System Integration, Application Development or Data-Warehouse projects, across technologies used in the enterprise space
  • Software development experience using object-oriented languages and frameworks (e.g., Python, PySpark)
  • Database programming using any flavor of SQL
  • Expertise in relational and dimensional modelling, including big data technologies
  • Exposure to all phases of the SDLC process, including testing and deployment
  • Expertise in Microsoft Azure is mandatory, including components such as Azure Data Factory, Azure Data Lake Storage, Azure SQL, Azure Databricks, HDInsight, ML Service, etc.
  • Good knowledge of Python and Spark is required
  • Good understanding of how to enable analytics using cloud technology and ML Ops
  • Experience with Azure Infrastructure and Azure DevOps is a strong plus
  • Proven track record of maintaining existing technical skills and developing new ones, enabling strong contributions to deep architecture discussions around systems and applications in the cloud (Azure)
  • Characteristics of a forward thinker and self-starter
  • Ability to work with a global team of consulting professionals across multiple projects
  • Knack for helping an organization to understand application architectures and integration approaches, to architect advanced cloud-based solutions, and to help launch the build-out of those systems
  • Passion for educating, training, designing, and building end-to-end systems for a diverse and challenging set of customers to success
  • Experience with GenAI is an added advantage
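To give a flavor of the SQL and dimensional-modelling skills the qualifications above call for, here is a minimal sketch using Python's built-in sqlite3 module. The star schema, table names, and data are hypothetical and purely illustrative; on the job, equivalent queries would run against Azure SQL or Databricks rather than SQLite.

```python
import sqlite3

# Hypothetical star schema: one fact table joined to a date dimension.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, year INTEGER, month INTEGER);
CREATE TABLE fact_sales (date_key INTEGER REFERENCES dim_date(date_key), amount REAL);
INSERT INTO dim_date VALUES (20240101, 2024, 1), (20240201, 2024, 2);
INSERT INTO fact_sales VALUES (20240101, 100.0), (20240101, 50.0), (20240201, 75.0);
""")

# Aggregate fact rows by a dimension attribute -- the typical
# query pattern over a dimensional model.
rows = cur.execute("""
    SELECT d.month, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_date d ON d.date_key = f.date_key
    GROUP BY d.month
    ORDER BY d.month
""").fetchall()
print(rows)  # [(1, 150.0), (2, 75.0)]
conn.close()
```

The same fact/dimension split scales directly to big-data tooling: in PySpark the join-and-aggregate would be expressed over DataFrames instead of SQLite cursors.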

If you like wild growth and working with happy, enthusiastic over-achievers, you'll enjoy your career with us!

Not the right fit? Let us know you're interested in a future opportunity by clicking Introduce Yourself in the top-right corner of the page, or create an account to set up email alerts as new job postings that meet your interests become available!

Top Skills

Azure
Azure Data Factory
Azure Data Lake Storage
Azure Databricks
Azure SQL
HDInsight
ML Service
PySpark
Python
SQL

Fractal Chennai, Tamil Nadu, IND Office

Chennai, Tamil Nadu, India, 600034


