
Fluxon

Data Engineer

Posted 15 Days Ago
Remote
Hiring Remotely in India
Mid level
Who we are

We are Fluxon, a product development team founded by ex-Googlers and startup founders. We offer full-cycle software development: from ideation and design to build and go-to-market. We partner with visionary companies, ranging from fast-growing startups to tech leaders like Google and Stripe, to turn bold ideas into products with the power to transform the world.

This is a remote position, with a preference for candidates located in Hyderabad, Bangalore, or Gurgaon, India.

About the role

As the first Data Engineer at Fluxon, you’ll take the lead in designing, building, and maintaining the data infrastructure that powers our products and enables data-driven decision-making for our clients.

You'll be responsible for:

  • Designing and implementing data models and warehouse schemas to support analytics and reporting needs
  • Building and maintaining reliable data pipelines to ingest, transform, and load data from various sources (a minimal pipeline sketch follows this list)
  • Collaborating with product and engineering teams to understand data requirements and deliver scalable solutions
  • Ensuring data quality, integrity, and accessibility across the organization
  • Optimizing query performance and improving the efficiency of existing data infrastructure
  • Maintaining comprehensive documentation for data models, pipelines, and processes for team reference
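To make the pipeline responsibility above concrete, here is a minimal, hypothetical sketch of a daily ingest-transform-load job written with Apache Airflow (one of the orchestration tools listed below). The DAG id, task names, sample records, and the warehouse target are illustrative placeholders, not part of Fluxon's actual stack.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator


    def extract(**context):
        # Pull raw records from a source system (placeholder data for the sketch).
        return [{"order_id": 1, "amount": 120.0}, {"order_id": 2, "amount": 75.5}]


    def transform(**context):
        # Reshape the extracted records pulled from XCom.
        rows = context["ti"].xcom_pull(task_ids="extract")
        return [{**row, "amount_usd": round(row["amount"], 2)} for row in rows]


    def load(**context):
        # In a real pipeline this step would write to the warehouse (e.g. BigQuery).
        rows = context["ti"].xcom_pull(task_ids="transform")
        print(f"Loading {len(rows)} rows into the warehouse")


    with DAG(
        dag_id="orders_daily_sketch",   # hypothetical name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",              # Airflow 2.4+; older versions use schedule_interval
        catchup=False,
    ) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        transform_task = PythonOperator(task_id="transform", python_callable=transform)
        load_task = PythonOperator(task_id="load", python_callable=load)

        extract_task >> transform_task >> load_task

In practice the load step would use a warehouse operator, dbt, or Dataflow rather than a print statement; the sketch only shows the extract → transform → load dependency pattern.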

You'll work with technologies including:

Data & Analytics

  • Data Warehouse: Google BigQuery, Snowflake, AWS Redshift, Databricks
  • ETL/Pipeline Tools: Apache Spark, Apache Airflow, dbt
  • Streaming & Queuing: Apache Kafka, Pub/Sub, RabbitMQ

Languages

  • SQL
  • Python (good to have)

Cloud & Infrastructure

  • Platforms: Google Cloud Platform (GCP) or Amazon Web Services (AWS)
  • Storage: Google Cloud Storage (GCS) or AWS S3
  • Orchestration & Processing: Cloud Composer (Airflow), Dataflow, Dataproc

Data Stores

  • Relational: PostgreSQL

Monitoring & Observability

  • GCP Cloud Monitoring Suite

Qualifications
  • 3–5 years of industry experience in data engineering roles
  • Strong proficiency in SQL and experience with data warehousing concepts (dimensional modeling, star/snowflake schemas; see the star-schema sketch after this list)
  • Experience building and maintaining ETL/ELT pipelines
  • Familiarity with cloud data platforms, preferably GCP and BigQuery
  • Understanding of data modeling best practices and data quality principles
  • Solid understanding of software development practices including version control (Git) and CI/CD
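As a quick illustration of the star-schema and SQL skills above, the hypothetical snippet below joins an invented fact table to two dimension tables in BigQuery (the posting's preferred warehouse) using the google-cloud-bigquery client. All dataset, table, and column names are made up for the example.

    from google.cloud import bigquery

    # Hypothetical star-schema query: one fact table joined to two dimensions.
    STAR_SCHEMA_QUERY = """
    SELECT
      d.order_date,
      c.customer_region,
      SUM(f.amount_usd) AS revenue
    FROM analytics.fct_orders AS f
    JOIN analytics.dim_date AS d ON f.date_key = d.date_key
    JOIN analytics.dim_customer AS c ON f.customer_key = c.customer_key
    GROUP BY d.order_date, c.customer_region
    ORDER BY d.order_date
    """


    def run_revenue_report() -> None:
        # Client() picks up application-default GCP credentials; the tables
        # above would have to exist in your project for this to return rows.
        client = bigquery.Client()
        for row in client.query(STAR_SCHEMA_QUERY).result():
            print(row.order_date, row.customer_region, row.revenue)


    if __name__ == "__main__":
        run_revenue_report()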
Nice to have:
  • Experience with Python for data processing and automation
  • Experience with Apache Spark or similar distributed processing frameworks
  • Familiarity with workflow orchestration tools (Airflow, Prefect)
  • Exposure to dbt or similar transformation tools
What we offer
  • Exposure to high-profile Silicon Valley startups and enterprise companies
  • Competitive salary
  • Fully remote work with flexible hours
  • Flexible paid time off
  • Profit-sharing program
  • Healthcare
  • Parental leave that supports all paths to parenthood, including fostering and adopting
  • Gym membership and tuition reimbursement
  • Hands-on career development

Top Skills

Amazon Web Services
Apache Airflow
Apache Kafka
Apache Spark
AWS Redshift
AWS S3
Cloud Composer
Databricks
Dataflow
Dataproc
dbt
GCP Cloud Monitoring Suite
Google BigQuery
Google Cloud Platform
Google Cloud Storage
PostgreSQL
Pub/Sub
Python
RabbitMQ
Snowflake
SQL
