August 14, 2025
DataOps Engineer (vacancy inactive)
Bulgaria, Portugal, Romania, Spain, Kyiv, Lviv, Kraków (Poland), Warsaw (Poland), remote
Description
Join our team in building a modern, high-impact Analytical Platform for one of the largest integrated resort and entertainment companies in Southeast Asia. The platform will serve as a unified environment for data collection, transformation, analytics, and AI-driven insights, powering decisions across marketing, operations, gaming, and more.
You’ll work closely with Data Architects, Data Engineers, Business Analysts, and DevOps Engineers to design and implement scalable data solutions.
Requirements
- Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
- 3+ years of experience in DataOps, DevOps, or Data Engineering roles.
- Proficiency in scripting languages (Python, Bash, etc.).
- Strong experience with orchestration tools (e.g., Apache Airflow, Prefect, or Dagster).
- Hands-on experience with cloud platforms (e.g., AWS, GCP, Azure) and cloud-native data tools.
- Familiarity with CI/CD tools (e.g., GitLab CI, Jenkins, CircleCI).
- Knowledge of containerization and orchestration technologies (Docker, Kubernetes).
- Experience with infrastructure-as-code tools (Terraform, CloudFormation).
- Strong understanding of data privacy, security, and compliance practices.
- Experience with modern data warehouses (e.g., Snowflake, Redshift, Yellowbrick) and ETL/ELT tools.
- Understanding of data governance, metadata management, and data cataloging tools.
- Experience collaborating in Agile/Scrum teams and working with version-controlled data models (e.g., via Git).
Nice to have skills
- Experience with real-time data processing (e.g., Kafka, Spark Streaming).
- Familiarity with data observability platforms (e.g., Monte Carlo, Datadog, Great Expectations).
- Experience working in regulated industries (e.g., gaming, finance, hospitality).
Responsibilities
- Design, build, and manage CI/CD pipelines for data applications, models, and pipelines.
- Develop and maintain infrastructure-as-code (IaC) for data platform components.
- Automate data quality checks, validation, and monitoring processes.
- Collaborate with data engineers and analysts to optimize data ingestion and transformation pipelines.
- Implement robust logging, alerting, and observability tools for data pipelines.
- Manage orchestration frameworks (e.g., Airflow) and ensure timely execution of workflows.
- Maintain compliance with data governance, privacy, and security policies.
- Support and troubleshoot production data issues and infrastructure outages.
Benefits
- 35 absence days per year for work-life balance
- Udemy courses of your choice
- English courses with native speakers
- Regular soft-skills training sessions
- Excellence Centers meetups
- Online and offline team-building events
- Business trips