Databricks Architect/Solution Architect
The Solution Architect is responsible for designing and delivering scalable data and AI solutions on the Databricks Lakehouse platform with Delta Lake. The role involves leading end-to-end project delivery; guiding customers through architecture design, migrations, and deployment; and ensuring best practices in performance, security, and governance (Unity Catalog). The Solution Architect also supports production operations, implements CI/CD pipelines, and enables customer teams through training and reusable assets.
Key Requirements:
- 8+ years in data engineering, analytics, and big data technologies
- Hands-on experience with cloud platforms: AWS, Azure, or GCP
- Strong programming skills in Python (including PySpark), SQL, and Scala
- Expertise in Databricks (Delta Lake, MLflow, Delta Live Tables, Unity Catalog)
- Experience with data migration, streaming (Kafka), and Spark internals
- Knowledge of deployment tools (Terraform, CI/CD)
- Proven project delivery and stakeholder management skills
- Relevant Databricks certifications preferred