#300194 • Source: nofluffjobs.com
Senior Data Engineer (Databricks)
21 000 - 28 560 PLN (normalized)
Experience
Senior
Location
Work mode
Remote
Employment type
Full-time
Python, SQL, ETL, Azure, Airflow, Databricks, Spark, Docker, CI/CD, Kubernetes, Kafka, Power BI, Dagster, dbt
About the offer
Addepto is a leading AI consulting and data engineering company that builds scalable, ROI-focused AI solutions for large enterprises and startups, including Rolls Royce, Continental, Porsche, ABB, and WGU. The company focuses exclusively on Artificial Intelligence and Big Data, helping organizations unlock the potential of their data through systems designed for measurable business impact and long-term growth. Addepto is part of KMS Technology, a US-based global technology group, combining AI specialization with enterprise-scale delivery capabilities.
Requirements
- At least 5 years of commercial experience implementing, developing, or maintaining Big Data systems.
- Strong programming skills in Python including writing clean code and OOP design.
- Strong SQL skills including performance tuning, query optimization, and experience with data warehousing solutions.
- Experience designing and implementing data governance and data management processes.
- Deep expertise in Big Data technologies including Apache Airflow, Dagster, Databricks, Spark, and dbt.
- Experience implementing and deploying solutions in cloud environments, preferably Azure.
- Knowledge of building and deploying Power BI reports and dashboards for data visualization.
- Excellent understanding of dimensional data and data modeling techniques.
- Consulting experience with ability to guide clients through architectural decisions, technology selection, and best practices.
- Ability to work independently and take ownership of project deliverables.
- Master’s or Ph.D. in Computer Science, Data Science, Mathematics, Physics, or related field.
Responsibilities
- Design and optimize scalable data processing pipelines for streaming and batch workloads using Big Data technologies such as Databricks, Apache Airflow, and Dagster.
- Architect and implement end-to-end data platforms ensuring high availability, performance, and reliability.
- Lead development of CI/CD and MLOps processes to automate deployments, monitoring, and model lifecycle management.
- Develop and maintain applications for aggregating, processing, and analyzing data from diverse sources ensuring efficiency and scalability.
- Collaborate with Data Science teams on Machine Learning projects including text/image analysis, feature engineering, and predictive model deployment.
- Design and manage complex data transformations using Databricks, dbt, and Apache Airflow, ensuring data integrity and consistency.
- Translate business requirements into scalable and efficient technical solutions while ensuring optimal performance and data quality.
- Ensure data security, compliance, and governance best practices are followed across all data pipelines.
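As an illustration only (not part of the posting): the data-quality and consistency duties listed above might look, in a minimal stdlib-Python sketch, like the transformation below. All names here (`clean_orders`, the order schema) are invented for the example; real work of this kind would typically run inside Spark, Databricks, or a dbt model rather than plain Python.

```python
import csv
import io

# Hypothetical raw feed: order rows that may contain nulls and duplicates.
RAW = """order_id,amount
1,100.0
2,
1,100.0
3,250.5
"""

def clean_orders(raw_csv: str) -> list[dict]:
    """Drop rows with a missing amount and deduplicate on order_id,
    mirroring the 'data integrity and consistency' duty above."""
    seen, out = set(), []
    for row in csv.DictReader(io.StringIO(raw_csv)):
        if not row["amount"]:           # data-quality check: reject null amounts
            continue
        if row["order_id"] in seen:     # consistency: drop duplicate keys
            continue
        seen.add(row["order_id"])
        out.append({"order_id": int(row["order_id"]),
                    "amount": float(row["amount"])})
    return out

print(clean_orders(RAW))  # only the two valid, unique orders survive
```

In a production pipeline the same checks would usually be expressed declaratively (e.g. dbt tests or Spark constraints) and scheduled by an orchestrator such as Airflow or Dagster.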
Benefits
- Work remotely or from modern offices and coworking spaces.
- Career paths, knowledge-sharing initiatives, language classes, and sponsored training or conferences including Databricks certifications.
- Choice of cooperation form: B2B or contract of mandate with 20 fully paid days off.
- Team-building events and integration budget.
- Celebration of work anniversaries, birthdays, and milestones.
- Access to medical and sports packages, eye care, psychotherapy, and coaching.
- Full work equipment including laptop and necessary devices.
- Opportunities to boost your personal brand by speaking at conferences, writing for the blog, or participating in meetups.