Vacancies 51 to 60 of 624
Sr. Java Developer (Dutch-speaking)
Cognizant - Amsterdam
… relational databases (experience with Cassandra is a plus). · Good knowledge of TDD (Test Driven Development) and ATDD (Acceptance Test Driven Development). · Knowledge of messaging middleware such as Kafka and MQ. · Experience with the ELK …
Eneco - Rotterdam
… experience using Snowflake Snowpipe and Databricks Autoloader. Is this about you? Must-have: 3+ years of experience with distributed data processing: Apache Spark, Beam, Flink, Kafka. Hands-on experience with Databricks Autoloader and …
Data Engineer Apeldoorn
Akos - Apeldoorn
… and distributed systems such as Spark, Snowflake and Kafka. Finally, you analyse and interpret raw data to gain insight into, for example, variables (Data Exploration). You combine data from various …
Senior Core Integration Specialist
Schiphol Group via Talent - Schiphol
… Quarkus framework. Hands-on experience with Camel and container technology such as OpenShift, Docker or AKS. Messaging (AMQ, Kafka, JMS, and AMQP) (preferred). CI/CD (Maven, Azure DevOps, and ArgoCD). Enterprise Integration Patterns …
CBS - Den Haag
… In addition, experience with Git and PostgreSQL is a nice extra. Knowledge of Apache Spark, Apache Iceberg, Apache Kafka and Argo is a strong plus. We preferably look for a Dutch-speaking data engineer, since the …
Eneco via Adzuna - Rotterdam
… experience using Snowflake Snowpipe and Databricks Autoloader. Is this about you? Must-have: 3 years of experience with distributed data processing: Apache Spark, Beam, Flink, Kafka. Hands-on experience with Databricks Autoloader and …
Senior Core Integration Specialist
Schiphol Group via Talent - Amsterdam
… framework. Hands-on experience with Camel and container technology such as OpenShift, Docker or AKS. Messaging (AMQ, Kafka, JMS, and AMQP) (preferred). CI/CD (Maven, Azure DevOps, and ArgoCD). Enterprise Integration Patterns (Hohpe & Woolf). You get …
Senior Core Integration Specialist
Schiphol Group via Talent - Schiphol
… framework. Hands-on experience with Camel and container technology such as OpenShift, Docker or AKS. Messaging (AMQ, Kafka, JMS, and AMQP) (preferred). CI/CD (Maven, Azure DevOps, and ArgoCD). Enterprise Integration Patterns (Hohpe & Woolf). You …
Data Engineer Consultant (cloud computing)
CareerValue - Utrecht
Python, Pipelines, CI/CD, Cloud, Azure, Data, Data Engineer, BigData, Internal position Data, Data Science, Machine Learning, Deep Learning, Predictive, Artificial Intelligence, BI, Hadoop, Kafka, NiFi, Mesos or Spark, Cloud, R …
Germany Senior Java Backend Developer
ING Deutschland via Adzuna - Veldhoven
… Boot), Hibernate, JDBC and synchronous/asynchronous communication protocols (HTTP/REST, Kafka). Confident in dealing with build tools, version management systems and CI/CD processes. Practice in agile, international and …
Eneco via Adzuna - Rotterdam
… initiatives. Our platforms are meticulously constructed using .NET Core and Azure DevOps, offering you the freedom to leverage a range of tools that best suit the task at hand – think Kubernetes, Kafka, and more. By immersing yourself …
TenneT 807992 via Magnet.me - Arnhem
… big data technologies such as Hadoop, Spark or Kafka. Proficiency in Python and maybe other programming languages like Java or Scala. Experience with data modeling, data warehousing and ETL processes. Knowledge of …