Specialist Solutions Architect
You will guide customers in building big data solutions on Databricks that span a large variety of use cases.
These are customer-facing roles in which you work with and support Solution Architects; they require production experience with Apache Spark and expertise in other data technologies.
SSAs help customers through the design and successful implementation of essential workloads while aligning on a technical roadmap for expanding their use of the Databricks Lakehouse Platform.
As a go-to expert reporting to the Specialist Field Engineering Manager, you will continue to strengthen your technical skills through mentorship, learning, and internal training programs, and establish yourself in an area of specialty - whether that is performance tuning, machine learning, industry expertise, or something else.
The impact you will have:
Provide technical leadership to guide strategic customers to successful implementations on big data projects, ranging from architectural design to data engineering to model deployment
Architect production-level workloads, including end-to-end pipeline load performance testing and optimisation
Provide technical expertise in an area such as data management, cloud platforms, data science, machine learning, or architecture
Assist Solution Architects with more advanced aspects of the technical sale, including custom proof-of-concept content, workload sizing estimates, and custom architectures
Improve community adoption through tutorials, training, hackathons, and conference presentations
Contribute to the Databricks Community
What we look for:
Pre-sales or post-sales experience working with external clients across a variety of industry markets
You will have experience in a customer-facing technical role with expertise in at least one of the following:
Software Engineer/Data Engineer:
Query tuning, performance tuning, troubleshooting, and debugging Spark or other big data solutions.
Data Scientist/ML Engineer:
Model selection, model lifecycle, hyperparameter tuning, model serving, deep learning.
Data Applications Engineer:
Build data-driven use cases such as risk modelling, fraud detection, and customer lifetime value.
Cloud Architect:
Design, deploy, and automate cloud architecture, including security, infrastructure, and identity management.
Production programming experience in one of the following: Python or Scala
Deep specialty expertise in at least one of the following areas:
Experience scaling big data workloads that are performant and cost-effective.
Experience with development tools for CI/CD, unit and integration testing, automation and orchestration, REST APIs, BI tools, and SQL interfaces.
Experience designing data solutions on cloud infrastructure and services such as AWS, Azure, or GCP, using best practices in cloud security and networking.
Experience with ML concepts covering model tracking, model serving, and other aspects of productionizing ML pipelines in distributed data processing environments such as Apache Spark, using tools like MLflow.
Experience with the design and implementation of big data technologies such as Spark/Delta, Hadoop, NoSQL, MPP, OLTP, and OLAP.
Travel can be up to 30% regionally.
Benefits
Complementary health insurance
Complementary life and disability coverage
Complementary pension
Equity awards
Paid parental leave
Gym reimbursement
Annual personal development fund
Work headphones reimbursement
Business travel accident insurance
Mental wellness resources