TECHNICAL ARCHITECT
Pune, IN
Eviden, part of the Atos Group, with an annual revenue of circa €5 billion, is a global leader in data-driven, trusted and sustainable digital transformation. As a next-generation digital business with worldwide leading positions in digital, cloud, data, advanced computing and security, it brings deep expertise for all industries in more than 47 countries. By uniting unique high-end technologies across the full digital continuum with 47,000 world-class talents, Eviden expands the possibilities of data and technology, now and for generations to come.
Role Overview:
The Senior Tech Lead - Snowflake leads the design, development, and optimization of advanced data warehousing solutions. The role requires extensive experience with Snowflake, data architecture, and team leadership, along with a proven ability to deliver scalable and secure data systems.
Responsibilities:
- Lead the design and implementation of Snowflake-based data architectures and pipelines.
- Provide technical leadership and mentorship to a team of data engineers.
- Collaborate with stakeholders to define project requirements and ensure alignment with business goals.
- Ensure best practices in data security, governance, and compliance.
- Troubleshoot and resolve complex technical issues in Snowflake environments.
- Stay updated on the latest Snowflake technologies and industry trends.
Key Technical Skills & Responsibilities:
- Minimum 7+ years of experience designing and developing data warehouse/big data applications
- Must be able to lead data product development using Streamlit and Cortex
- Deep understanding of relational and NoSQL data stores, and of data modeling methods and approaches (star and snowflake schemas, dimensional modeling)
- Good communication skills
- Must have experience in solution architecture using Snowflake
- Must have experience working with the Snowflake data platform, its utilities (SnowSQL, Snowpipe, etc.) and its features (time travel, support for semi-structured data, etc.)
- Must have experience migrating on-premises data warehouses to the Snowflake cloud data platform
- Must have experience working with at least one cloud platform (AWS, Azure, or GCP)
- Experience developing accelerators (using Python, Java, etc.) to expedite migration to Snowflake
- Must be proficient in Python and PySpark (including Snowpark) for building data pipelines
- Must have experience working with streaming data sources and Kafka
- Extensive experience developing ANSI SQL queries and Snowflake-compatible stored procedures
- Snowflake certification is preferred
Eligibility Criteria:
- Bachelor’s degree in Computer Science, Data Engineering, or a related field.
- Extensive experience with Snowflake, SQL, and data modeling.
- Snowflake certification (e.g., SnowPro Core Certification).
- Experience with cloud platforms like AWS, Azure, or GCP.
- Strong understanding of ETL/ELT processes and cloud integration.
- Proven leadership experience in managing technical teams.
- Excellent problem-solving and communication skills.
Our Offering:
- Global, cutting-edge IT projects that shape the future of digital and have a positive impact on the environment.
- Wellbeing programs and work-life balance, including integration and passion-sharing events.
- Attractive salary and company initiative benefits.
- Courses and conferences.
- Hybrid work culture.
Let’s grow together.