Big Data Engineer @ Antal

Poland

Requirements

  • Degree in Computer Science, IT, or a related field
  • Fluent in English, with strong communication and problem-solving skills
  • Hands-on experience with big data solutions and distributed systems (e.g., Apache Spark)
  • Strong backend development skills in Java 11+, Python, and Groovy
  • Experience in building REST APIs, microservices, and integrating with API gateways
  • Exposure to public cloud platforms, especially GCP or AWS
  • Familiarity with Spring (Boot, Batch, Cloud), Git, Maven, Unix/Linux
  • Experience with RDBMS (e.g., PostgreSQL) and data orchestration tools (e.g., Apache Airflow)
  • Solid understanding of test automation tools like JUnit, Cucumber, Karate, Rest Assured

Desirable Skills

  • Knowledge of financial or traded risk systems
  • Experience with UI/BI tools and streaming solutions
  • OLAP and distributed computation platforms such as ClickHouse, Druid, or Pinot
  • Familiarity with data lakehouse technologies (e.g., Dremio, Trino, Delta Lake, Iceberg)
  • Exposure to technologies like Apache Flink, Beam, Samza, Redis, Hazelcast
  • Containerization and orchestration tools: Docker, Kubernetes
  • Certifications: Scrum Master, PMP, FRM, or CFA
  • Knowledge of RPC frameworks (e.g., gRPC)

About the Role

You will work as part of a newly established engineering team in Kraków, responsible for the development, enhancement, and support of high-volume data processing systems and OLAP solutions used in global traded risk management.

Responsibilities

  • Design, develop, test, and deploy scalable IT systems to meet business objectives
  • Build data processing and calculation services integrated with risk analytics components
  • Collaborate with BAs, business users, vendors, and IT teams across regions
  • Integrate with analytical libraries and contribute to overall architecture decisions
  • Apply DevOps and Agile methodologies, focusing on test-driven development
  • Provide production support, manage incidents, and ensure platform stability
  • Contribute to both functional and non-functional aspects of delivery

Category

data

  • Job offer details
    Company: Antal
    Location: Work in Poland
    Industry: data
    Position: Big Data Engineer @ Antal
    Working hours: full-time, 40 hours per week
    Start date: immediately
    Offered salary: not specified
    Offer posted: 18 June 2025
    Position active

Big Data Engineer @ Antal: Frequently Asked Questions

👉 In which city is the Big Data Engineer @ Antal position offered?

The job is offered in Kraków.

👉 Which company is hiring for this position?

This job offer is with Antal.
