Data Engineer Hadoop @ Antal
Must-have qualifications:
- Minimum 5 years of experience as a Data Engineer / Big Data Engineer
- Hands-on expertise in Hadoop, Hive, HDFS, Apache Spark, Scala, SQL
- Solid experience with GCP and services like BigQuery, Dataflow, DataProc, Pub/Sub, Composer (Airflow)
- Experience with CI/CD processes and DevOps tools: Jenkins, GitHub, Ansible
- Strong data architecture and data engineering skills in large-scale environments
- Experience working in enterprise environments and with external stakeholders
- Familiarity with Agile methodologies such as Scrum or Kanban
- Ability to debug and analyze application-level logic and performance
Nice to have:
- Google Cloud certification (e.g., Professional Data Engineer)
- Experience with Tableau, Cloud DataPrep, or Ansible
- Knowledge of cloud design patterns and modern data architectures
Hadoop Data Engineer (GCP, Spark, Scala) – Kraków / Hybrid
We are looking for an experienced Hadoop Data Engineer to join a global data platform project built in the Google Cloud Platform (GCP) environment. This is a great opportunity to work with distributed systems, cloud-native data solutions, and a modern tech stack. The position is based in Kraków (hybrid model – 2 days per week in the office).
Work model:
- Hybrid – 2 days per week from the Kraków office (rest remotely)
- Opportunity to join an international team and contribute to global-scale projects
Job offer details:
- Company: Antal
- Location: Poland (Kraków)
- Industry: data
- Position: Data Engineer Hadoop @ Antal
- Working hours: full-time, 40 hours per week
- Start date: immediate
- Salary: not specified
- Offer posted: 31 May 2025