Databricks Cloud Engineer - REMOTE

Contract Type:

Full Time

Location:

Off-Site / Remote

Industry:

Engineering

Contact Name:

Tim Lane

Contact Email:

tim@plr.ltd

Date Published:

15-Dec-2025

Databricks Cloud Engineer - REMOTE
 
We are seeking a skilled professional to help us build, guide, and scale our next-generation data & ML platform. This role offers a unique opportunity to combine deep DevOps expertise with hands-on work on the Databricks platform within Azure.
 
You will play a critical role in shaping architectural direction, establishing best practices, and implementing the core infrastructure that drives data intelligence and predictive modeling capabilities. We value stability, scalability, and efficiency in the production-grade data platforms we create.
 
This is a fully remote role, open only to candidates based in the UTC+2 to UTC+4 time zones (CEE, CIS).
 
 
Key responsibilities:
  • Design, build, and automate end-to-end Databricks infrastructure (workspaces, clusters, jobs) using Terraform for reliable and repeatable deployments.
  • Implement and maintain robust CI/CD pipelines (using Azure DevOps or GitHub Actions) to enable continuous integration and delivery for all data and ML code.
  • Provide senior-level guidance on the platform's architecture, security, and governance, specifically focusing on the integration of Azure and Databricks.
  • Deploy and manage environments for model training, versioning, and serving utilizing Databricks services and MLflow (a brief illustrative sketch follows this list).
  • Securely integrate Databricks with core Azure components such as Key Vault, Storage, Azure Data Factory (ADF), and Synapse.
  • Help establish access controls, data lineage, and governance policies within Unity Catalog.
  • Work closely with Data Scientists and Data Engineers to operationalize models and enforce DevOps best practices for security and infrastructure management.
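
For context only, here is a minimal sketch of the kind of MLflow training-and-registration workflow the model lifecycle responsibility above touches, assuming a scikit-learn model and an MLflow tracking backend (such as a Databricks workspace); the experiment path and model name are illustrative placeholders, not references to any real assets.

# Minimal MLflow sketch: track one training run and register the resulting model.
# The experiment path and registered model name below are placeholders.
import mlflow
import mlflow.sklearn
from sklearn.datasets import load_diabetes
from sklearn.linear_model import Ridge
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

mlflow.set_experiment("/Shared/demo-experiment")  # illustrative experiment path

X, y = load_diabetes(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

with mlflow.start_run(run_name="ridge-baseline"):
    model = Ridge(alpha=0.5)
    model.fit(X_train, y_train)

    # Record the hyperparameter and an evaluation metric for this run.
    mse = mean_squared_error(y_test, model.predict(X_test))
    mlflow.log_param("alpha", 0.5)
    mlflow.log_metric("mse", mse)

    # Log the model artifact and register it so it can be versioned and served later.
    mlflow.sklearn.log_model(
        model,
        artifact_path="model",
        registered_model_name="demo-ridge-model",  # placeholder name
    )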
 
Qualifications:
Required experience:
  • 6+ years of hands-on experience in DevOps.
  • Proven, hands-on professional experience with the Databricks platform, including workspaces, clusters, jobs, and Delta Lake.
  • Solid operational experience with Azure Cloud services and their secure integration with Databricks.
  • Expertise in using Terraform for provisioning complex cloud and Databricks resources.
  • Deep practical knowledge of CI/CD automation using tools like Azure DevOps or GitHub Actions.
  • Proficiency in scripting with Python and/or Bash.
  • Experience managing data/ML governance and versioning within Unity Catalog and MLflow.
  • Fluent communication in English and the ability to work effectively in a distributed, senior-level team.
Nice to have:
  • Experience with containerization tools like Docker or orchestration using Kubernetes.
  • Contextual understanding of popular ML frameworks (e.g., TensorFlow, PyTorch, Scikit-learn).
  • Familiarity with industry-specific compliance standards (e.g., GDPR, financial sector regulations).
Why is this a great opportunity?
What we offer:
  • A pleasant, professional, and supportive work environment.
  • Exceptional career development opportunities through working on challenging and innovative projects.
  • The opportunity to drive technology choices and take ownership of a modern cloud infrastructure stack in a data-intensive sector.
  • A culture that values technical excellence, autonomy, and continuous learning.