
OpenGov

Sr. AI Platform and Orchestration Engineer

Posted 5 Days Ago
In-Office or Remote
2 Locations
Senior level

OpenGov is the leader in AI and ERP solutions for local and state governments in the U.S. More than 2,000 cities, counties, state agencies, school districts, and special districts rely on the OpenGov Public Service Platform to operate efficiently, adapt to change, and strengthen the public trust. Category-leading products include enterprise asset management, procurement and contract management, accounting and budgeting, billing and revenue management, permitting and licensing, and transparency and open data. These solutions come together in the OpenGov ERP, allowing public sector organizations to focus on priorities and deliver maximum ROI with every dollar and decision in sync. Learn about OpenGov’s mission to power more effective and accountable government and the vision of high-performance government for every community at OpenGov.com.

Job Summary

OpenGov is seeking a highly technical AI Platform & Orchestration Engineer to build and scale our Snowflake-native AI execution layer. You will be among the first members of our Pune AI engineering team, with significant ownership and influence over how we build and scale AI systems globally.

This role is responsible for transforming AI use cases into durable, production-ready systems using Snowflake AI features, agent orchestration platforms, and modern LLM frameworks.

You will work at the intersection of data, process orchestration, and applied AI. This is not a research role. It is a hands-on engineering role focused on building reliable pipelines, agent workflows, retrieval systems, and context-engineering frameworks that power intelligent automation across GTM, G&A, and internal enterprise operations.

You will partner closely with a global team to operationalize AI workflows that are scalable, observable, and governed. This role requires strong Snowflake expertise, comfort with Python, SQL, and working with DBT data models, as well as experience building orchestrated AI systems that can scale across an enterprise.

Core Responsibilities
  • Design and implement Snowflake-native AI pipelines to support enterprise automation and intelligent workflows.

  • Build Retrieval-Augmented Generation (RAG) systems, embedding pipelines, and vector-based retrieval architectures.

  • Configure and manage AI orchestration workflows using Airia or similar agent frameworks.

  • Develop scalable context-routing and orchestration logic to support multi-step agentic workflows.

  • Partner with Data Engineering and Enterprise Systems teams to integrate both structured and unstructured enterprise data into AI systems in a cost-effective and performant manner.

  • Implement monitoring, evaluation, and feedback mechanisms to track agent reliability and adoption, as well as continuously learn from accuracy logs.

  • Optimize Snowflake AI feature usage for performance, cost-efficiency, and scalability.

  • Support development of prompt libraries, context schemas, and reusable AI workflow components.

  • Collaborate on architectural standards for Model Context Protocol (MCP), agent governance, and data security.

  • Continuously evaluate emerging AI frameworks and tooling to enhance OpenGov’s AI platform capabilities.
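To make the retrieval responsibilities above concrete, here is a minimal, self-contained sketch of the vector-retrieval and prompt-assembly steps behind a RAG pipeline. It is illustrative only: the embeddings are hand-made stand-ins (a production system would generate them with Snowflake Cortex embedding functions or an LLM API), and `build_prompt` is a hypothetical helper, not part of any named framework.

```python
import math

# Toy RAG retrieval sketch: rank documents by cosine similarity to a
# query embedding, then assemble the top hits into an LLM prompt.
# Embeddings here are hand-made 3-d vectors, not real model output.

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def retrieve(query_vec, corpus, k=2):
    """Return the k corpus entries most similar to the query embedding."""
    ranked = sorted(
        corpus,
        key=lambda doc: cosine_similarity(query_vec, doc["embedding"]),
        reverse=True,
    )
    return ranked[:k]

def build_prompt(question, docs):
    """Assemble retrieved context and the user question into one prompt."""
    context = "\n".join(f"- {d['text']}" for d in docs)
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"

corpus = [
    {"text": "Invoices are approved in the procurement module.", "embedding": [0.9, 0.1, 0.0]},
    {"text": "Permit fees are set by the licensing team.",        "embedding": [0.1, 0.9, 0.0]},
    {"text": "Budget amendments require council approval.",      "embedding": [0.2, 0.1, 0.9]},
]

# Pretend embedding of the question "How are invoices approved?"
query_embedding = [0.85, 0.15, 0.05]
top_docs = retrieve(query_embedding, corpus, k=1)
prompt = build_prompt("How are invoices approved?", top_docs)
```

In the Snowflake-native version of this pattern, the similarity ranking would typically run in SQL over VECTOR columns rather than in application code.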

Required Experience
  • Bachelor’s degree in Computer Science, Engineering, or a related field.

  • 5–8 years of experience in software engineering, data engineering, applied AI engineering, platform engineering, or related roles.

  • Strong hands-on experience with Snowflake, including AI functions, semantic modeling, and Cortex.

  • Experience building or supporting Retrieval-Augmented Generation (RAG) pipelines.

  • Experience working with Large Language Models (LLMs) and LLM APIs (e.g., OpenAI, Anthropic, Gemini, etc.).

  • Hands-on experience with orchestration frameworks (Airia, LangChain, CrewAI, or similar).

  • Strong proficiency in SQL/Python and comfort working directly with enterprise data warehouses.

  • Experience building API integrations and managing structured and unstructured data pipelines.

  • Understanding of agent lifecycle management, reliability, and evaluation best practices.

  • Ability to independently own and deliver end-to-end AI workflow implementations.

  • Strong problem-solving skills with a bias toward scalable system design.
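The agent lifecycle and evaluation experience listed above can be illustrated with a minimal sketch of the kind of reliability tracking involved: each agent run is logged with a pass/fail outcome, and per-agent reliability is computed from the log. The class and agent names are hypothetical, not a real framework API.

```python
from collections import defaultdict

# Illustrative agent-reliability log: record pass/fail outcomes per agent
# and compute a simple success rate. Names are made up for this sketch.

class AgentEvalLog:
    def __init__(self):
        self._runs = defaultdict(list)

    def record(self, agent, success):
        """Log one agent run as a boolean success/failure outcome."""
        self._runs[agent].append(bool(success))

    def reliability(self, agent):
        """Fraction of successful runs for the agent, or None if unseen."""
        runs = self._runs[agent]
        return sum(runs) / len(runs) if runs else None

log = AgentEvalLog()
log.record("invoice-router", True)
log.record("invoice-router", True)
log.record("invoice-router", False)
```

A production system would persist these outcomes (e.g. to a warehouse table) and feed them back into prompt and workflow iteration, but the core bookkeeping looks like this.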

Nice to Have
  • Experience surfacing AI systems via lightweight web applications (React, Streamlit, FastAPI, Flask, Node.js, etc.).

  • Experience implementing enterprise AI systems in SaaS environments.

  • Exposure to event-driven architectures and workflow engines.

  • Experience with vector databases and embedding models.

  • Understanding of CI/CD in data engineering contexts.

  • Experience working in compliance-heavy or public sector environments.

  • Experience collaborating with cross-functional, non-technical stakeholders.

Why OpenGov?

A Mission That Matters.

At OpenGov, public service is personal. We are passionate about our mission to power more effective and accountable government: government that operates efficiently, adapts to change, and strengthens public trust. Some people say this is boring. We think it’s the core of our democracy.

Opportunity to Innovate

The next great wave of innovation is unfolding with AI, and it will impact everything—from the way we work to the way governments interact with their residents. Join a trusted team with the passion, technology, and expertise to drive innovation and bring AI to local government. We’ve touched 2,000 communities so far, and we’re just getting started.

A Team of Passionate, Driven People

This isn’t your typical 9-to-5 job; we operate in a fast-paced, results-driven environment where impact matters more than simply clocking in and out. Our global team of 800+ employees is united in our commitment to challenge the status quo. OpenGov is headquartered in San Francisco and has offices in Atlanta, Boston, Buenos Aires, Chicago, Dubuque, Plano, and Pune.

A Place to Make Your Mark

We pride ourselves on our performance-based culture, where every employee is encouraged to jump in head-first and take action to help us improve. If you have a great idea, we want to hear it. Excellent performance is recognized and rewarded, and we love to promote from within.

Benefits That Work for You

Enjoy an award-winning workplace with the benefits to match, including:

  • Comprehensive healthcare options for individuals and families

  • Flexible vacation policy and paid company holidays

  • 401(k) with company match

  • Paid parental leave, wellness stipends, and HSA contributions

  • Professional development and growth opportunities

  • A collaborative office environment with weekly catered lunches

Top Skills

Airia
Cortex
CrewAI
dbt
Embedding Pipelines
LangChain
LLM APIs (OpenAI, Anthropic, Gemini)
Model Context Protocol (MCP)
Python
Retrieval-Augmented Generation (RAG)
Snowflake
Snowflake AI
SQL
Vector-Based Retrieval


