Data Engineer (Senior)

155 - 165 PLN/hour, B2B (net)
Senior · Full-time · B2B
#320835 · Added 28 days ago
Source: nofluffjobs.com

Tech Stack / Keywords

GCP · Cloud · ETL · Apache Airflow · Kafka · Pub/Sub · Informatica · Oracle · PostgreSQL · REST API · Python · PySpark · Rust · Microservices · Java · Linux · Git · Grafana · JMeter

Company and position

Scalo works on data integration processes on Google Cloud Platform (GCP) and in on-premises environments.


Requirements

  • At least 5 years of experience as a Data Engineer on GCP in data integration processes
  • Very good knowledge of GCP databases: BigQuery, Bigtable, Scylla Cloud
  • Good knowledge of ETL tools on GCP: Apache Airflow, Dataflow, Dataproc
  • Experience with Kafka queues and GCP Pub/Sub
  • At least 4 years of experience as an ETL Developer in on-premises data integration processes
  • Advanced skills with on-premises ETL tools: Informatica PowerCenter, Apache NiFi
  • Expert knowledge of relational databases Oracle, PostgreSQL, ScyllaDB
  • Very good knowledge of the AutomateNOW! scheduling tool by InfiniteDATA
  • Ability to expose REST API services
  • At least 3 years of experience as a programmer
  • Proficient in Python for data integration and analysis, and in the PySpark framework
  • Knowledge of Rust language for writing tools and frameworks supporting efficient data loading

Nice to have:

  • Basic knowledge of microservices programming in Java
  • Advanced user skills in Linux operating system
  • Experience working with large data volumes (~100TB)
  • Analytical thinking and quick learning ability
  • Independence and creativity in problem solving
  • Timeliness and reliability in project execution
  • Knowledge of auxiliary tools: the Git version control system, monitoring tools such as Grafana, and performance-testing tools such as Apache JMeter
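As a rough illustration of the Python-for-data-integration skills the requirements list asks for (the function names and sample data here are hypothetical, not taken from the posting), a minimal extract-transform-load step in plain Python might look like:

```python
import csv
import io
import json

def extract(csv_text):
    """Extract phase: parse raw CSV input into a list of row dicts."""
    return list(csv.DictReader(io.StringIO(csv_text)))

def transform(rows):
    """Transform phase: normalize field names, cast amounts, drop bad rows."""
    return [
        {"customer_id": r["id"].strip(), "amount": float(r["amount"])}
        for r in rows
        if r["amount"]  # skip rows with a missing amount
    ]

def load(records):
    """Load phase: serialize to newline-delimited JSON, a format many
    cloud loaders (e.g. BigQuery) accept."""
    return "\n".join(json.dumps(r) for r in records)

raw = "id,amount\n 42 ,19.99\n7,\n8,5.00\n"
ndjson = load(transform(extract(raw)))
print(ndjson)
```

In a real GCP pipeline the same three phases would typically be orchestrated as Airflow tasks or Dataflow/PySpark transforms rather than plain functions; the sketch only shows the shape of the work.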

Responsibilities

  • Work in the area of data integration processes on GCP
  • Work in the area of data integration processes on-premises
  • Design, build, and tune databases
  • Use ETL tools on GCP and on-premises
  • Use Kafka queues and GCP Pub/Sub
  • Expose REST API services
  • Work with Python and the PySpark framework
  • Use Rust language to write tools and frameworks supporting efficient data loading
  • Over time, take ownership of a selected area of data processing and service exposure under high-performance requirements
  • Work in a hybrid model with one day per week in the Warsaw office

Offer

  • Stable cooperation with technological challenges and work with modern solutions
  • Internal mobility allowing project changes without changing the company
  • Opportunity to develop technical and presales competencies and influence organizational development
  • Building personal brand by creating valuable content and participating as an expert in events
  • Benefit box including full medical care, MultiSport card, and a wide range of Motivizer offers
  • Referral program with bonuses
  • Company integrations and events to foster good team relationships
Healthcare
Sports card
Bonuses
Training subsidies
Team events