Hiring Now

Senior DevOps Engineer - Data Focus

Commitment type: Full-time

Location: Albania, Bosnia and Herzegovina, Bulgaria, Greece, North Macedonia, Serbia

Salary: $4,000 - $5,500/month

Job Description

Adeva is a global talent network that enables work without boundaries by connecting tech professionals with top companies worldwide. 

As a DevOps Engineer, you will be responsible for the reliability, security, and automation of the software delivery lifecycle. You will not write data transformation logic; instead, you will build the deployment pipelines that enable Data Engineers to deliver their code safely and efficiently.

Your primary goal is to implement a "Zero-Touch Deployment" model. You will navigate a hybrid environment: actively collaborating with the internal data transformation framework team on standardized CI/CD pipelines while also supporting projects that require custom CI/CD implementations (using Databricks Asset Bundles). A critical part of the role is to drive the strategic unification of these diverse deployment patterns into a single, standardized, and compliant delivery model.
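
For orientation, bundle-based projects declare their deployment targets in a databricks.yml file at the repository root. Below is a minimal sketch of what such a bundle can look like; every name, host, and path in it is an illustrative placeholder, not Adeva's or the client's actual configuration:

```yaml
# databricks.yml -- minimal illustrative bundle; all names, hosts, and paths
# below are placeholders
bundle:
  name: example_data_product

resources:
  jobs:
    nightly_load:
      name: nightly_load
      tasks:
        - task_key: run_etl
          notebook_task:
            notebook_path: ./notebooks/etl
          new_cluster:
            spark_version: 15.4.x-scala2.12
            node_type_id: Standard_DS3_v2
            num_workers: 1

targets:
  acc:
    workspace:
      host: https://adb-1111111111111111.1.azuredatabricks.net
  prod:
    mode: production
    workspace:
      host: https://adb-2222222222222222.2.azuredatabricks.net
```

A CI/CD pipeline can then promote the same bundle through environments with "databricks bundle deploy -t acc" and, after sign-off, "databricks bundle deploy -t prod".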

You will configure strict Pull Request policies that trigger automated testing suites (Unit, Integration, Data Quality) and manage a gated release process where Production deployments require specific approvals from Leadership. 

Responsibilities

Pipeline Architecture & Release Management 

  • Design and maintain end-to-end deployment pipelines in Azure DevOps using YAML, completely removing the need for manual deployments. 
  • Support and Unify Deployment Standards: Manage CI/CD for both HITT-framework-based projects and custom implementations using Databricks Asset Bundles (DABs). You will lead initiatives to unify these workflows, reducing fragmentation and establishing a common standard across the domain. 
  • Extend the HITT framework, identifying gaps in internal tooling and implementing new features to support complex testing and compliance requirements. 
  • Configure Pull Request automation: Ensure that every PR automatically triggers a comprehensive test suite (Unit Tests, Integration Tests, and Data Quality checks) before it can be merged. 
  • Implement a Gated Release Strategy (see the YAML sketch after this list): 
      • Auto-deploy to Acceptance: Automatically deploy code to the Acceptance environment upon successful merge. 
      • Gated Production Deployment: Configure strict approval gates for Production, requiring explicit sign-off from the Data Engineering Lead, Staff Engineer, or Manager before deployment proceeds. 
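
A trimmed Azure Pipelines YAML shape for this flow might look as follows. This is a sketch only: stage, environment, and test-path names are hypothetical; the approval check itself is configured on the Azure DevOps environment (Environments > Approvals and checks) rather than in the YAML; and PR validation is wired up through a branch policy on the target branch.

```yaml
# azure-pipelines.yml -- illustrative sketch, not the team's actual pipeline.
# A branch policy on main runs the Validate stage for every Pull Request;
# merges to main then flow through Acceptance into the gated Production stage.
trigger:
  branches:
    include: [main]

stages:
  - stage: Validate
    jobs:
      - job: Tests
        steps:
          - script: pip install -r requirements.txt
            displayName: Install dependencies
          - script: pytest tests/unit
            displayName: Unit tests
          - script: pytest tests/integration
            displayName: Integration tests
          - script: pytest tests/data_quality
            displayName: Data quality checks

  - stage: Acceptance
    dependsOn: Validate
    # skip deployment for PR validation runs; deploy only after merge
    condition: and(succeeded(), ne(variables['Build.Reason'], 'PullRequest'))
    jobs:
      - deployment: DeployAcc
        environment: acceptance        # no checks attached: auto-deploys
        strategy:
          runOnce:
            deploy:
              steps:
                - script: databricks bundle deploy -t acc
                  displayName: Deploy to Acceptance

  - stage: Production
    dependsOn: Acceptance
    jobs:
      - deployment: DeployProd
        # approvals live on this environment in Azure DevOps
        # (Environments > Approvals and checks): the Data Engineering Lead,
        # Staff Engineer, or Manager must approve before this stage runs
        environment: production
        strategy:
          runOnce:
            deploy:
              steps:
                - script: databricks bundle deploy -t prod
                  displayName: Deploy to Production
```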

Infrastructure & Platform Operations 

  • Manage cloud infrastructure using Infrastructure as Code (Bicep/ARM/Terraform) to ensure environment consistency across Dev, Acc, and Prod (see the sketch after this list). 
  • Administer Azure Databricks workspaces, ensuring correct configuration of Databricks Workflows, cluster policies, and library dependencies. 
  • Monitor pipeline performance and stability, troubleshooting failures in the CI/CD process (not the data logic itself). 
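
To make "environment consistency" concrete, one common pattern (purely a sketch; the service connection, resource group, and file names are made up) is a single parameterized steps template that deploys the same Bicep for every environment:

```yaml
# deploy-infra.yml -- hypothetical steps template; the service connection,
# resource group, and file names are placeholders
parameters:
  - name: environment
    type: string
    values: [dev, acc, prod]

steps:
  - task: AzureCLI@2
    displayName: Deploy Bicep (${{ parameters.environment }})
    inputs:
      # service connection backed by a per-environment service principal
      azureSubscription: svc-conn-${{ parameters.environment }}
      scriptType: bash
      scriptLocation: inlineScript
      inlineScript: |
        az deployment group create \
          --resource-group rg-data-${{ parameters.environment }} \
          --template-file infra/main.bicep \
          --parameters @infra/params.${{ parameters.environment }}.json
```

Reusing one template for Dev, Acc, and Prod means the only thing that varies between environments is a parameter file, which is what keeps them consistent.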

Security & Collaboration 

  • Enforce Least Privilege Access: Ensure that developers have read-only access to Production, with all changes applied strictly via service principals through the pipeline (see the sketch after this list). 
  • Collaborate with QA Engineers to integrate new testing frameworks into the CI/CD loop. 
  • Work with the Staff Data Engineer to align pipeline capabilities and the unification roadmap with evolving architectural standards. 
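
On the least-privilege point above, one way to express the intent (a sketch with placeholder identifiers, not a prescribed setup) is to grant the developer group Reader on the Production scope while write access remains with the pipeline's service principal:

```yaml
# Illustrative one-time setup step; the group ID, subscription ID, service
# connection, and resource group names are all placeholders
steps:
  - task: AzureCLI@2
    inputs:
      azureSubscription: svc-conn-admin
      scriptType: bash
      scriptLocation: inlineScript
      inlineScript: |
        az role assignment create \
          --assignee-object-id "$DATA_ENGINEERS_GROUP_ID" \
          --assignee-principal-type Group \
          --role Reader \
          --scope "/subscriptions/$SUBSCRIPTION_ID/resourceGroups/rg-data-prod"
```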

Requirements

  • Azure DevOps (Expert Level): Mastery of Pipelines, Repos, Branch Policies, and Release Gates. Deep proficiency in YAML for defining complex pipeline logic and infrastructure configurations. 
  • Databricks Ecosystem: Strong experience in administering Azure Databricks workspaces. Proficiency in Databricks Asset Bundles (DABs) for managing and deploying data products, as well as Databricks Workflows (Jobs) orchestration. 
  • Scripting & Automation: Strong proficiency in Python (essential for automation and DABs), as well as Bash and/or PowerShell for operational tasks. 
  • Azure Cloud Platform: Deep knowledge of Azure services, including ADLS Gen2, Key Vault, Virtual Networks, and private endpoints. Strong understanding of RBAC and Managed Identities. 
  • Infrastructure as Code (IaC): Solid understanding of IaC principles. Experience with ARM Templates, Bicep, or Terraform for provisioning and managing cloud resources. 

Nice to have:

  • Experience integrating Data Quality tools (e.g., Great Expectations, Soda) into CI/CD.
  • Docker / Containerization experience.
  • Experience with migrating legacy pipelines to modern standards (e.g., to DABs).

About Adeva

Adeva is an exclusive network of engineers, product and data professionals that connects consultants with leading enterprise organizations and startups. Our network is distributed all over the world, with engineers in more than 35 countries. Our company culture builds connections, careers, and employee growth. We are creating a workplace from the future that values flexibility, autonomy, and transparency. If that sounds like something you’d like to be part of, we’d love to hear from you.

Required skills
  • Azure DevOps
  • Azure Databricks
  • Python
  • IaC
