
Job description
Frontex – the European Border and Coast Guard Agency – is looking for a Data Flow Engineer to support the integration of the Frontex Data Platform with internal applications and external data sources.
Job requirements
Knowledge and skills:
[01] Expert knowledge of defining, designing, implementing and maintaining complex data flows in Apache NiFi (Cloudera DataFlow)
[02] Advanced Python programming skills (data processing, NiFi custom logic, flow automation, integrations)
[03] Advanced skills in building REST API integrations (endpoint calling, OAuth/JWT authentication, rate limiting, error recovery)
[04] Hands-on experience building CDC (Change Data Capture) data flows, using native NiFi processors/connectors as well as SQL Builder
[05] Good knowledge of Apache Iceberg (tables, schema evolution, partitioning)
[06] Knowledge of data governance and cataloguing in CDP: Apache Atlas (metadata, lineage, tagging) and Apache Ranger (security policies, authorization)
[07] Experience with Apache Kafka as a message broker (topics, producers/consumers, schema registry, NiFi integration) and Apache Avro as a serialization standard (schema evolution, compatibility, etc.)
Specific requirements:
[01] Hands-on experience of daily work with Apache NiFi, preferably in a CDP environment (design, deployment, monitoring and troubleshooting of advanced flows) - minimum 2-3 years of hands-on experience
[02] Documented experience delivering at least one large integration project with NiFi as the central tool (API calling, database integrations, transformations, data routing and delivery)
[03] Practical knowledge of and experience with Apache Iceberg, preferably in a CDP environment (table creation/management, NiFi/Spark/Flink integration, etc.)
[04] Experience implementing CDC pipelines to/from relational databases
[05] Practical knowledge of configuring and managing governance/lineage with Apache Atlas and Ranger in the context of NiFi flows (tagging, policies, audit)
[06] Experience working with Apache Kafka in the CDP ecosystem (NiFi -> Kafka -> downstream consumers integration, schema management with Avro)
Location: Warsaw, Poland

