Job Description
Key Responsibilities:
- Design, develop, and implement efficient ELT/ETL processes for large datasets.
- Build and optimize data processing workflows using Apache Spark.
- Utilize Python for data manipulation, transformation, and analysis.
- Develop and manage data pipelines using Apache Airflow.
- Write and optimize SQL queries for data extraction, transformation, and loading.
- Collaborate with data scientists, analysts, and other engineers to understand data requirements and deliver effective solutions.
- Work within an on-premise computing environment for data processing and storage.
- Ensure data quality, integrity, and performance throughout the data lifecycle.
- Participate in the implementation and maintenance of CI/CD pipelines for data processes.
- Utilize Git for version control and collaborative development.
- Troubleshoot and resolve issues related to data pipelines.
Ready to Apply?
Submit your application today and join our talented team at Inetum Polska.
Job Details
- Location Warsaw, Masovian Voivodeship
- Job Type Full-time
- Category Computer Occupations
- Posted Date March 02, 2026
- Application Deadline April 11, 2026