Data Engineer @ Harvey Nash Technology
We’re seeking a Data Engineer with strong experience designing and building robust data pipelines and performing ETL/ELT processes. You will work on data integration, modeling, analytics, and visualization while ensuring data quality and supporting customer-facing solution engineering activities.
Must-Have Skills:
- Proven experience building data pipelines and performing ETL/ELT (see the sketch after this list)
- Proficiency in Python and SQL
- Experience with dbt, Airbyte, Kafka, PostgreSQL, Databricks
- Familiarity with data modeling, integration, and visualization tools (e.g., Tableau, Qlik Sense, Superset, Splunk)
- Experience working in AWS environments (Glue, Lambda, S3, RDS)
- Solid understanding of data architecture, system integration, and data quality best practices
- Customer-facing experience in solution engineering, including assessments, solution proposals, and architectural design
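Purely as an illustration of the kind of pipeline work this role involves, the sketch below shows a minimal Python ETL flow: extract a CSV from S3, apply a light transformation, and load it into PostgreSQL. It is not part of the offer itself, and the bucket, key, connection string, and table names are hypothetical placeholders.

```python
# Minimal ETL sketch: S3 extract -> pandas transform -> PostgreSQL load.
# All names (bucket, key, DSN, table) are hypothetical placeholders.
import io

import boto3
import pandas as pd
from sqlalchemy import create_engine


def run_pipeline() -> None:
    # Extract: read a raw CSV object from S3 (hypothetical bucket/key).
    s3 = boto3.client("s3")
    obj = s3.get_object(Bucket="raw-data-bucket", Key="orders/2025-07-01.csv")
    orders = pd.read_csv(io.BytesIO(obj["Body"].read()))

    # Transform: normalise column names and drop incomplete rows.
    orders.columns = [c.strip().lower() for c in orders.columns]
    orders = orders.dropna(subset=["order_id", "amount"])

    # Load: append into a PostgreSQL staging table (hypothetical DSN).
    engine = create_engine("postgresql+psycopg2://user:password@host:5432/analytics")
    orders.to_sql("stg_orders", engine, schema="staging", if_exists="append", index=False)


if __name__ == "__main__":
    run_pipeline()
```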
Tech Stack:
- Relational databases: PostgreSQL, SQL Server, Oracle, MySQL
- Cloud: AWS (Glue, Lambda, S3, RDS, SageMaker, Secrets Manager), basic Azure knowledge
- Data platforms: Databricks
- ETL/ELT tools: Fivetran, Airbyte, dbt
- Orchestration: Airflow (a minimal orchestration sketch follows this list)
- Version control & DevOps: GitLab, CI/CD
- Microservices & containers: Docker, Docker Compose
- Big data & processing: Hadoop, Spark
- NoSQL: Redis, Cassandra, MongoDB, Hive/HBase, Neo4j

Additionally: Private healthcare, International projects, Small teams.
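As a rough illustration of how Airflow and dbt typically fit together in a stack like this (assuming a recent Airflow 2.x release), the sketch below wires an ingestion step ahead of a dbt run. The DAG id, commands, and project path are hypothetical, not details of this position.

```python
# Minimal Airflow DAG sketch: ingestion task followed by a dbt run.
# DAG id, bash commands, and paths are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_elt",
    start_date=datetime(2025, 7, 1),
    schedule="@daily",   # requires Airflow 2.4+; earlier releases use schedule_interval
    catchup=False,
) as dag:
    # Ingest raw data (stand-in for triggering an Airbyte/Fivetran sync).
    ingest = BashOperator(
        task_id="ingest_raw_data",
        bash_command="echo 'trigger ingestion sync here'",
    )

    # Build dbt models on top of the freshly loaded raw tables.
    transform = BashOperator(
        task_id="run_dbt_models",
        bash_command="cd /opt/dbt/analytics && dbt run --profiles-dir .",
    )

    ingest >> transform
```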
Job offer details:
- Company: Harvey Nash Technology
- Location: Remote
- Industry: data
- Position: Data Engineer @ Harvey Nash Technology
- Working hours: full-time, 40 hours per week
- Start date: immediately
- Offered salary: not specified
- Offer posted: 2 July 2025
Data Engineer @ Harvey Nash Technology: Frequently Asked Questions
👉 In which location is the Data Engineer @ Harvey Nash Technology position offered?
The position is offered remotely.
👉 Which company is hiring for this position?
This job offer is with Harvey Nash Technology.
Interested in the Data Engineer @ Harvey Nash Technology position (Remote)?
Send your CV to Harvey Nash Technology today.
If you are looking for other similar job offers, have a look at the current Remote data job listings.