📍 In Poland 💰 Salary by agreement 🏢 Square One Resources

📝 Job offer details

  • 3+ years of experience in DevOps or Data Platform operations with cloud technologies.
  • Hands-on experience with Databricks environment administration.
  • Proficiency in Python (automation, scripting; see the sketch after this list).
  • Familiarity with BI/analytics tool integration via Databricks connectors.
  • Solid knowledge of SQL and data engineering fundamentals.
  • Experience with orchestration tools (Databricks Workflows, Airflow, Azure Data Factory).
  • Understanding of Identity & Access Management in cloud environments.
  • Experience with Terraform.
  • English at B2 level.
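
As an illustration of the Python automation and scripting this role calls for, below is a minimal sketch that lists the clusters in a Databricks workspace through the REST API. It is not from the posting: the DATABRICKS_HOST and DATABRICKS_TOKEN environment variables are assumptions, and authentication details will vary by workspace.

```python
# Minimal sketch of Python automation against the Databricks REST API.
# DATABRICKS_HOST / DATABRICKS_TOKEN are assumed environment variables,
# not values from the job posting.
import os

import requests

host = os.environ["DATABRICKS_HOST"]    # e.g. the workspace URL
token = os.environ["DATABRICKS_TOKEN"]  # a personal access token

# Clusters API 2.0: list every cluster in the workspace.
resp = requests.get(
    f"{host}/api/2.0/clusters/list",
    headers={"Authorization": f"Bearer {token}"},
    timeout=30,
)
resp.raise_for_status()

# Print id, name, and state for each cluster (useful for cost audits).
for cluster in resp.json().get("clusters", []):
    print(cluster["cluster_id"], cluster["cluster_name"], cluster["state"])
```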

We are looking for an experienced DevOps Engineer to set up, configure, and operationalize a new Databricks environment focused on business intelligence (BI), analytics, and data engineering workflows.

Working closely with an ML Ops Engineer, you will ensure the Databricks platform supports both traditional BI/data processing use cases and AI workloads. This includes secure access for data analysts, seamless integration with downstream AI/BI tools, and optimized data pipelines.

Work mode: Hybrid (3 days/week in office — Warsaw, Poznań, or Lublin)

Responsibilities

  • Deploy and configure Databricks workspaces for multi-team usage.
  • Set up resource management policies for shared clusters, automated job clusters, and interactive analytical clusters.
  • Configure role-based access controls aligned with data governance standards.
  • Establish secure connectivity to on-premise and cloud data sources (SQL Server, data lake, APIs, etc.).
  • Build shared data ingestion pipelines for BI and analytics teams.
  • Automate daily and weekly data refresh schedules (see the sketch after this list).
  • Integrate Databricks with BI platforms (e.g., Power BI).
  • Configure and optimize JDBC/ODBC connectors to ensure performance and reliability.
  • Implement monitoring and logging for Databricks jobs and pipelines.
  • Define backup and disaster recovery processes for key data sets.
  • Apply cost tracking, budgeting, and optimization practices for cluster usage.
  • Set up CI/CD pipelines for data engineering code and deployments.
  • Manage deployment workflows for notebooks, SQL queries, and data models.
  • Work with ML Ops Engineers to maintain shared infrastructure (storage, Delta Lake tables) supporting both BI and ML use cases.
  • Partner with Data Engineers to maintain central data sources in Databricks.
  • Collaborate with security teams to implement access controls for sensitive data.
  • Enforce data governance (GDPR and internal compliance), including workspace auditing and logging.
  • Document configuration, usage, and operations procedures for all teams.

Requirements: Python, Terraform
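
One responsibility above, automating daily data refresh schedules, is commonly handled through the Databricks Jobs API. The following sketch creates a job that runs a notebook every day at 03:00; the notebook path, cluster spec, and environment variables are hypothetical examples, not details from the posting.

```python
# Sketch: schedule a daily data-refresh notebook via the Databricks Jobs API 2.1.
# The notebook path and cluster spec are hypothetical; adjust for your workspace.
import os

import requests

host = os.environ["DATABRICKS_HOST"]
token = os.environ["DATABRICKS_TOKEN"]

job_spec = {
    "name": "daily-bi-refresh",
    "tasks": [
        {
            "task_key": "refresh",
            "notebook_task": {"notebook_path": "/Shared/etl/daily_refresh"},
            "new_cluster": {
                "spark_version": "15.4.x-scala2.12",  # example runtime version
                "node_type_id": "Standard_DS3_v2",    # example Azure node type
                "num_workers": 2,
            },
        }
    ],
    # Quartz cron expression: every day at 03:00 local time.
    "schedule": {
        "quartz_cron_expression": "0 0 3 * * ?",
        "timezone_id": "Europe/Warsaw",
    },
}

resp = requests.post(
    f"{host}/api/2.1/jobs/create",
    headers={"Authorization": f"Bearer {token}"},
    json=job_spec,
    timeout=30,
)
resp.raise_for_status()
print("Created job:", resp.json()["job_id"])
```

Databricks Workflows can express the same schedule natively in the UI; the API route is useful when job definitions live in version control and are deployed through the CI/CD pipelines this role also owns.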

Category

devops

  • 📍 Location: Warsaw
  • ⏱️ Shift: full-time, 40 hours per week
  • 📆 Start date: immediately
  • 🏢 Company: Square One Resources
  • ❓ Everything you need to know about this job

    👉 Where is this job located?

    The job is located in Warsaw.

    👉 Who is hiring for this position?

    This position is offered by Square One Resources.

    👉 What are the working hours?

    Full-time, 40 hours per week.

    👉 When is the start date?

    The start date is immediate.

Reply to this job ad
    Be the first to apply for this job offer!