27 June 2025
Data/BI Engineer (vacancy inactive)
Dnipro, Warsaw (Poland), remote
We’re looking for a Data/BI Engineer to join a fast-growing SaaS product team revolutionizing how marketing decisions are made. This project focuses on business process optimization and brings together a modern data stack to unlock human creativity by turning complex data into clear insights.
Responsibilities:
- Development and support of data-collection pipelines from numerous third-party REST APIs, including authorisation, monitoring, retries, and caching.
- Processing, transformation, and cleansing of structured and semi-structured data (CSV, Parquet) for further use in AI/ML models and analytics.
- Development and support of data models taking into account business requirements and technical constraints (using dbt, SQL, schema design).
- Integration and optimisation of ETL/ELT pipelines in the AWS environment (using AWS Glue, S3, Lambda, etc.).
- Building and monitoring data orchestration using Apache Airflow.
- Data quality assurance: validation, normalisation, mapping, error logging, consistency control.
- Close collaboration with analytics and data science teams to deliver clean, verified and accessible data.
- Integration with internal analytics systems (Metabase), preparation of datasets for visualisation.
Requirements:
- 4+ years of Python development experience, with a special focus on data processing and API integration.
- Deep knowledge of SQL — writing complex queries, performance optimisation.
- Experience with REST APIs: integration, token management, handling unstable APIs.
- Practical experience with AWS (S3, Glue, Lambda, IAM, CloudWatch).
- Strong Python skills (pandas, NumPy).
- Working with Snowflake: deploying data models, writing SQL, integrating with pipelines.
- Skills in building pipelines with Airflow, knowledge of orchestration concepts.
- Experience with ETL/ELT: data collection, processing, storage, and quality assurance.
- Knowledge of dbt for building data models.
- Proficiency in working with CSV/Parquet files.
- Understanding of the needs of analytical and AI/ML teams, ability to provide clean, consistent datasets.
- Experience with BI tools such as Metabase, Superset, Tableau, or QuickSight.
Nice-to-Have:
- Experience with CI/CD for pipelines (e.g. GitHub Actions, Jenkins).
- Knowledge or experience with ClickHouse.
- Understanding of data governance principles, data access control, quality monitoring.
Our Benefits:
- Professional growth: Individual development plan, mentorship, reimbursement for professional certifications and English lessons, and access to professional courses in our Corporate Learning Management System.
- Community: Tech community and knowledge-sharing events, English speaking club, corporate library and book club, volunteering and charity initiatives.
- Wellbeing: Medical insurance, regular medical check-ups, sport reimbursement, paid vacation and sick leave, mental health support and events.
- Work environment: Fully-equipped offices, top-notch equipment, flexible work format, activities both in-office and online, Y-bucks and access to the Yalantis store.
Before sending us your CV, you may read our Privacy Notice.