Rackspace Technology

MLOps Engineer (AWS / Azure / GCP)

Posted 14 Days Ago
Bangalore, Bengaluru, Karnataka
Mid level
As an MLOps Engineer, you will build ETL code and design data engineering workflows to facilitate AI and ML pipelines. Your role includes automating ML infrastructure, collaborating with Data Science teams, and implementing best practices for scalable deployment and monitoring of ML systems.

What you will be doing:

• Build complex ETL code to enable AI and ML pipelines (see the sketch below)
• Design and implement complex data engineering, alerting, and monitoring workflows
• Design and execute automation strategies to holistically manage ML infrastructure, encompassing DevOps pipelines, scripting, and Infrastructure as Code
• Establish credibility and build impactful relationships with our customers
• Capture and share industry best practices within the MLOps community
• Productionize Data Science models to serve at scale
• Assist Data Science teams in experimentation
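As a concrete illustration of the ETL and productionization duties above, here is a minimal, hypothetical sketch of an ETL step feeding a scikit-learn training run with MLflow experiment tracking. The file path, column names, and parameters are illustrative assumptions, not part of the actual role or Rackspace's stack.

```python
# Minimal, hypothetical sketch: ETL -> train -> track. Assumes pandas,
# scikit-learn, and MLflow are installed; paths and column names are made up.
import numpy as np
import pandas as pd
import mlflow
import mlflow.sklearn
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split


def extract_transform(path: str) -> pd.DataFrame:
    """Toy ETL: load raw records, drop incomplete rows, derive one feature."""
    df = pd.read_csv(path)
    df = df.dropna(subset=["amount", "label"])
    df["amount_log"] = np.log1p(df["amount"])
    return df


def train(df: pd.DataFrame) -> None:
    """Train a baseline classifier and log params, metrics, and the model to MLflow."""
    X = df[["amount", "amount_log"]]
    y = df["label"]
    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)
    with mlflow.start_run():
        model = RandomForestClassifier(n_estimators=100, random_state=42)
        model.fit(X_train, y_train)
        accuracy = accuracy_score(y_test, model.predict(X_test))
        mlflow.log_param("n_estimators", 100)
        mlflow.log_metric("accuracy", accuracy)
        mlflow.sklearn.log_model(model, "model")


if __name__ == "__main__":
    train(extract_transform("raw_events.csv"))  # hypothetical input file
```

In a production setting, steps like these would typically run as orchestrated pipeline components (for example in Vertex AI or Kubeflow Pipelines) rather than as a single script.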


Experience - 4+ years

Skills Required - MLOps + Azure / AWS / GCP skillset.

Location - Bangalore


Requirements:

• Expertise in public cloud services, particularly in GCP.

• Experience working with Vertex AI / Kubeflow Pipelines or other MLOps tools like MLflow.

• Experience with machine learning frameworks (TensorFlow, PyTorch) and libraries (e.g., scikit-learn, HuggingFace).

• Knowledge of prompt engineering methods, LangChain, Vector databases and machine learning algorithms.

• Knowledge of LLMOps concepts and practices, including model versioning, deployment, monitoring, and efficient resource utilization for large language models.

• Experience in Infrastructure and Applied DevOps principles utilizing tools for continuous integration and continuous deployment (CI/CD), and Infrastructure as Code (IaC) like Terraform to automate and improve development and release processes.

• Knowledge of containerization technologies such as Docker and Kubernetes to enhance the scalability and efficiency of applications.

• Proven experience in engineering machine learning systems at scale.

• Strong programming abilities in SQL, Java and Python.

• Proven experience with the entire ML lifecycle (processing, training, evaluation, deployment, serving, monitoring).

• Familiarity with model performance monitoring, data drift detection, and anomaly detection (hands-on experience preferred; see the sketch after this list).

• Strong understanding of IT infrastructure and operations, with the ability to set up monitoring/alerting tools and access controls for both infrastructure and applications.

• Strong communication skills to collaborate effectively with cross-functional teams.

• Willingness to adapt to new technologies and methodologies in the ever-evolving MLOps landscape.
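To make the drift-detection requirement above concrete, here is a minimal, hypothetical sketch that flags data drift by comparing a live feature distribution against its training baseline with a two-sample Kolmogorov-Smirnov test from SciPy. The p-value threshold and the synthetic data are assumptions for illustration only.

```python
# Minimal, hypothetical drift check: compare serving-time feature values against
# the training baseline with a two-sample Kolmogorov-Smirnov test (scipy).
# The p-value threshold and synthetic data below are illustrative assumptions.
import numpy as np
from scipy.stats import ks_2samp


def detect_drift(baseline: np.ndarray, live: np.ndarray, p_threshold: float = 0.05) -> bool:
    """Return True if the live distribution differs significantly from the baseline."""
    _statistic, p_value = ks_2samp(baseline, live)
    return p_value < p_threshold


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    baseline = rng.normal(loc=0.0, scale=1.0, size=5_000)  # training-time feature values
    live = rng.normal(loc=0.4, scale=1.0, size=1_000)      # shifted serving-time values
    if detect_drift(baseline, live):
        print("Drift detected: trigger the retraining or alerting workflow")
    else:
        print("No significant drift detected")
```

In practice such a check would run on a schedule against monitored feature streams and feed into the alerting and monitoring workflows described earlier.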


Must Have:


• Overall 5+ years of experience in MLOps.

• 2+ years of experience building end-to-end data pipelines.

• 2+ years of experience building and deploying ML models in GCP.

• 3+ years of experience working with SQL, Java, and Python for data analysis and programming.

• Technical degree: Computer Science, Software Engineering, or a related field.


• Based in Bangalore and willing to work from the office 3 days a week.

Top Skills

Java
Python
SQL
