Primary responsibilities:
- Assist in developing and maintaining ETL pipelines, working on tasks from data source integration to visualization.
- Collaborate with cross-functional teams in an Agile environment, following Scrum principles to deliver high-quality solutions incrementally.
- Support the design and implementation of scalable data workflows and ensure efficient data processing.
- Work closely with senior engineers to learn and apply best practices for data security, governance, and performance optimization.
- Create and maintain dashboards and visualizations using tools like Power BI to deliver actionable insights.
- Stay current with the latest tools and technologies to contribute to the team’s innovation and efficiency.
Required Qualifications:
- 2-4 years of software development experience, including exposure to building ETL pipelines.
- Basic understanding of tools and concepts such as Spark, Kafka, and data warehousing.
- Familiarity with data visualization tools like Power BI.
- Exposure to Agile methodologies and a willingness to learn and apply Scrum principles in day-to-day work.
- Strong problem-solving skills and a keen interest in learning new tools and frameworks such as StarRocks, MinIO, or Iceberg.
- Good communication skills and the ability to work collaboratively in a team environment.
Preferred Qualifications:
- Prior experience in a product development or data engineering environment is a plus.
- Basic understanding of data security concepts such as Kerberos authentication.