As a Data Architect, define data requirements, design application structures, and ensure alignment with business goals while managing data effectively.
Project Role : Data Architect
Project Role Description : Define the data requirements and structure for the application. Model and design the application data structure, storage, and integration.
Must have skills : PySpark
Good to have skills : AWS Architecture
Minimum 12 years of experience required
Educational Qualification : 15 years of full-time education
Summary:
As a Data Architect, you will define the data requirements and structure for the application. Your typical day will involve modeling and designing the application data structure, storage, and integration, ensuring that the data architecture aligns with the overall business objectives and technical specifications. You will collaborate with various teams to ensure that the data architecture is robust, scalable, and efficient, while also addressing any challenges that arise during the development process. Your role will be pivotal in shaping the data landscape of the organization, enabling data-driven decision-making and fostering innovation through effective data management practices.
Job Description – Data Architect
Experience: 10+ years
Mandatory Skills: PySpark, AWS
Desirable Skills: SQL (ETL/ELT), Teradata, Lakehouse (Iceberg/Delta), Databricks/Snowflake, DevOps/CI-CD
Key Responsibilities
1. Architecture & Solution Design
Design and maintain scalable, reliable, secure enterprise data architectures (Lake, Warehouse, Lakehouse).
Lead solution design for client RFPs and change requests; provide design assurance.
Define strategic roadmaps and align architectures with business goals.
Create architectural artifacts (conceptual/logical/physical models, lineage, integration flows).
Evaluate integration approaches; ensure cost-effectiveness, scalability, and compliance.
Identify implementation risks and recommend mitigations.
2. Project & Delivery Support
Provide technical leadership in PySpark, SQL, and AWS-native patterns.
Ensure adherence to enterprise architecture, security, governance, and regulatory standards.
Collaborate with cross-functional teams and support design/code reviews.
Guide teams during planning and execution phases for delivery success.
3. Competency & Thought Leadership
Build POCs, accelerators, and case studies for emerging areas.
Author white papers/blogs and present solution viewpoints.
Mentor engineers and junior architects; conduct selection interviews.
Represent practice in client workshops, forums, and industry events.
Attributes
Strong client-facing, communication, and stakeholder management skills.
Ability to balance strategic vision with hands-on delivery.
Passion for continuous learning and modern data/cloud innovations.
Additional Information:
- The candidate should have a minimum of 12 years of experience in PySpark.
- This position is based at our Pune office.
- 15 years of full-time education is required.
About Accenture
Accenture is a leading global professional services company that helps the world’s leading businesses, governments and other organizations build their digital core, optimize their operations, accelerate revenue growth and enhance citizen services—creating tangible value at speed and scale. We are a talent- and innovation-led company with approximately 791,000 people serving clients in more than 120 countries. Technology is at the core of change today, and we are one of the world’s leaders in helping drive that change, with strong ecosystem relationships. We combine our strength in technology and leadership in cloud, data and AI with unmatched industry experience, functional expertise and global delivery capability. Our broad range of services, solutions and assets across Strategy & Consulting, Technology, Operations, Industry X and Song, together with our culture of shared success and commitment to creating 360° value, enable us to help our clients reinvent and build trusted, lasting relationships. We measure our success by the 360° value we create for our clients, each other, our shareholders, partners and communities. Visit us at www.accenture.com
Equal Employment Opportunity Statement
We believe that no one should be discriminated against because of their differences. All employment decisions shall be made without regard to age, race, creed, color, religion, sex, national origin, ancestry, disability status, military veteran status, sexual orientation, gender identity or expression, genetic information, marital status, citizenship status or any other basis as protected by applicable law. Our rich diversity makes us more innovative, more competitive, and more creative, which helps us better serve our clients and our communities.
Top Skills
AWS Architecture
CI/CD
Databricks
DevOps
Lakehouse
PySpark
Snowflake
SQL
Teradata