TECHNICAL ARCHITECT

Publication Date: May 15, 2025
Ref. No: 531386
Location: Pune, IN

Eviden, part of the Atos Group, with an annual revenue of circa €5 billion, is a global leader in data-driven, trusted and sustainable digital transformation. As a next-generation digital business with worldwide leading positions in digital, cloud, data, advanced computing and security, it brings deep expertise for all industries in more than 47 countries. By uniting unique high-end technologies across the full digital continuum with 47,000 world-class talents, Eviden expands the possibilities of data and technology, now and for generations to come.

Role Overview:
The Senior Tech Lead - GCP Data Engineering leads the design, development, and optimization of advanced data solutions. The role requires extensive experience with GCP services, data architecture, and team leadership, along with a proven ability to deliver scalable and secure data systems.

Responsibilities:

  • Lead the design and implementation of GCP-based data architectures and pipelines.
  • Architect and optimize data solutions using GCP services such as BigQuery, Dataflow, Pub/Sub, and Cloud Storage.
  • Provide technical leadership and mentorship to a team of data engineers.
  • Collaborate with stakeholders to define project requirements and ensure alignment with business goals.
  • Ensure best practices in data security, governance, and compliance.
  • Troubleshoot and resolve complex technical issues in GCP data environments.
  • Stay updated on the latest GCP technologies and industry trends.

Key Technical Skills & Responsibilities:

  • 10+ years of overall experience with GCP and data warehousing concepts, including coding, reviewing, testing, and debugging.
  • Experience as an architect on GCP implementation and/or migration data projects.
  • Strong understanding of data lakes and data lake architectures, including best practices for storing, loading, and retrieving data.
  • Experience developing and maintaining pipelines on the GCP platform, with an understanding of best practices for bringing on-premises data to the cloud: file loading, compression, parallelization of loads, optimization, etc.
  • Working knowledge of and/or experience with Google Data Studio, Looker, and other visualization tools.
  • Working knowledge of Hadoop and Python/Java is an added advantage.
  • Experience designing and planning BI solutions; debugging, monitoring, and troubleshooting BI solutions; creating and deploying reports; and writing relational and multidimensional database queries.
  • Experience in a NoSQL environment is a plus.
  • Strong proficiency in Python and PySpark for building data pipelines.
  • Experience working with streaming data sources and Kafka.
  • GCP services: Cloud Storage, BigQuery, Bigtable, Cloud Spanner, Cloud SQL, Datastore/Firestore, Dataflow, Dataproc, Data Fusion, Dataprep, Pub/Sub, Data Studio, Looker, Data Catalog, Cloud Composer, Cloud Scheduler, Cloud Functions.

Eligibility Criteria:

  • Bachelor’s degree in Computer Science, Data Engineering, or a related field.
  • Extensive experience with GCP data services and tools.
  • GCP certification (e.g., Professional Data Engineer, Professional Cloud Architect).
  • Experience with machine learning and AI integration in GCP environments.
  • Strong understanding of data modeling, ETL/ELT processes, and cloud integration.
  • Proven leadership experience in managing technical teams.
  • Excellent problem-solving and communication skills.

Our Offering:

  • Global, cutting-edge IT projects that shape the future of digital and have a positive impact on the environment.
  • Wellbeing programs and work-life balance, including integration and passion-sharing events.
  • Attractive salary and company initiative benefits.
  • Courses and conferences.
  • Hybrid work culture.

Let’s grow together.