Work with data and analytics experts to improve the functionality of our data systems.
Build and maintain end-to-end data pipelines from various data sources.
Implement data modeling concepts driven by business requirements.
Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and 'big data' technologies.
Research innovative technologies and make continuous improvements.
BS/MS/PhD in Data Science, Computer Science, Mathematics, or another Science or Engineering discipline.
Minimum of 3-5 years of working experience.
Experience with data pipeline and workflow management tools (Airflow).
Experience with SQL and NoSQL databases.
**Experience with one of the following languages**: Python, R, Java, C/C++, or Scala.
Familiarity with containerisation (Docker) and orchestration tools (Kubernetes).
**Familiarity with one of the following cloud-based services**: GCP, AWS, or Azure.
Good working knowledge of productivity tools such as G Suite, Git, Jira, and Confluence.
Experience in SSIS is a plus.