Technical Architect-Databricks

Publication Date: Jun 16, 2025
Ref. No: 531371
Location: Pune, IN

Role Overview:

The Technical Architect - Databricks designs and implements scalable data architectures and solutions. The role requires expertise in the Databricks Lakehouse, data modeling, and cloud integration to ensure high performance, security, and reliability.

Responsibilities:

  • Design and implement Databricks-based data architectures to meet business requirements.
  • Develop and optimize data pipelines using PySpark, Scala, or SQL.
  • Establish the Databricks Lakehouse architecture for batch and streaming data.
  • Collaborate with cross-functional teams to integrate Databricks with cloud platforms (e.g., AWS, Azure, GCP).
  • Ensure data security and compliance with best practices.
  • Monitor and troubleshoot Databricks environments for performance and reliability.
  • Stay updated on Databricks advancements and industry trends.

Key Technical Skills & Responsibilities:

  • 12+ years of experience in data engineering using Databricks or Apache Spark-based platforms.
  • Proven track record of building and optimizing ETL/ELT pipelines for batch and streaming data ingestion.
  • Hands-on experience with Azure services such as Azure Data Factory, Azure Data Lake Storage, Azure Synapse Analytics, or Azure SQL Data Warehouse.
  • Proficiency in programming languages such as Python, Scala, or SQL for data processing and transformation.
  • Expertise in Spark (PySpark, Spark SQL, or Scala) and Databricks notebooks for large-scale data processing.
  • Familiarity with Delta Lake, Delta Live Tables, and medallion architecture for data lakehouse implementations.
  • Build and query Delta Lake storage solutions (see the first sketch after this list).
  • Process streaming data with Azure Databricks Structured Streaming (second sketch below).
  • Design Azure Databricks security and data protection solutions.
  • Flatten nested structures and explode arrays with Spark (third sketch below).
  • Transfer data out of Spark pools using the PySpark connector.
  • Optimize Spark jobs for performance (fourth sketch below).
  • Implement Spark and Databricks best practices.
  • Experience with orchestration tools like Azure Data Factory or Databricks Jobs for scheduling and automation.
  • Knowledge of Git for source control and CI/CD integration for Databricks workflows, along with experience in cost optimization and performance tuning.
  • Familiarity with Unity Catalog, RBAC, or enterprise-level Databricks setups.
  • Ability to create reusable components, templates, and documentation to standardize data engineering workflows.
  • Solutioning and presales: architecting frameworks, defining roadmaps, and engaging with stakeholders.
  • Experience in defining data strategy, evaluating new tools/technologies, and driving adoption across the organization.
  • Experience working with streaming data sources is required; Kafka experience is preferred.
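
A minimal sketch of the first item above, building and querying a Delta Lake table from a Databricks notebook; the table, column names, and sample data are illustrative, not part of the role's actual environment:

    from pyspark.sql import SparkSession

    # Databricks notebooks provide a session named `spark`; this line only
    # makes the sketch self-contained outside that environment.
    spark = SparkSession.builder.getOrCreate()

    # Write a small DataFrame as a managed Delta table (illustrative names).
    df = spark.createDataFrame([(1, "alice"), (2, "bob")], ["id", "name"])
    df.write.format("delta").mode("overwrite").saveAsTable("demo_users")

    # Query it back with Spark SQL.
    spark.sql("SELECT id, name FROM demo_users WHERE id > 1").show()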
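
A minimal Structured Streaming sketch for the streaming-ingestion item; the source path, schema, checkpoint location, and target table are all placeholder assumptions:

    from pyspark.sql.types import LongType, StringType, StructField, StructType

    # Assumed shape of the incoming JSON events.
    event_schema = StructType([
        StructField("event_id", LongType()),
        StructField("payload", StringType()),
    ])

    # Read newline-delimited JSON files as they arrive (placeholder path);
    # `spark` is the Databricks-provided session, as in the previous sketch.
    events = (
        spark.readStream.format("json")
        .schema(event_schema)
        .load("/mnt/raw/events/")
    )

    # Append the stream to a Delta table; the checkpoint tracks progress
    # so the stream can restart without reprocessing.
    query = (
        events.writeStream.format("delta")
        .option("checkpointLocation", "/mnt/checkpoints/events/")
        .outputMode("append")
        .toTable("bronze_events")
    )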
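
A minimal sketch of flattening a nested struct and exploding an array with Spark, per the item above; the input shape is assumed purely for illustration:

    from pyspark.sql import functions as F

    # Toy input: one struct column and one array column (assumed shape).
    nested = spark.createDataFrame(
        [({"city": "Pune", "country": "IN"}, ["spark", "delta"])],
        "address struct<city:string,country:string>, tags array<string>",
    )

    flat = nested.select(
        F.col("address.city").alias("city"),        # promote struct fields
        F.col("address.country").alias("country"),
        F.explode("tags").alias("tag"),             # one row per array element
    )
    flat.show()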
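
Two common optimizations implied by the Spark-tuning items, sketched under assumed table names: broadcasting a small dimension table to avoid a shuffle join, and partitioning a Delta write so downstream queries can prune files:

    from pyspark.sql import functions as F

    facts = spark.table("silver_sales")      # hypothetical large fact table
    dims = spark.table("silver_products")    # hypothetical small dimension

    # Broadcasting the small side avoids shuffling the large table.
    joined = facts.join(F.broadcast(dims), "product_id")

    # Partitioning by a common filter column enables partition pruning.
    (
        joined.write.format("delta")
        .partitionBy("sale_date")
        .mode("overwrite")
        .saveAsTable("gold_sales_enriched")
    )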

Eligibility Criteria:

  • Bachelor’s degree in Computer Science, Information Technology, or a related field
  • Proven experience as a Databricks Architect or similar role
  • Comprehensive knowledge of Azure Databricks platform architecture
  • Databricks certification (e.g., Databricks Certified Data Engineer Associate or Certified Associate Developer for Apache Spark)
  • Expertise in Python, Scala, SQL, or R
  • Experience with cloud platforms like AWS, Azure, or GCP
  • Strong understanding of data modeling and cloud integration
  • Experience with cluster sizing and security implementation
  • Excellent problem-solving and communication skills