Description:
To help maintain the Big Data platform and further enhance our data and analytics practices, we are looking for a developer analyst to support all facets of the platform’s development and analysis, including designing data products; administering data tools such as Airflow and Databricks; and building data ingest/transformation workflows or other data engineering solutions affecting the platform.
Your key responsibilities:
- Assist in implementing the architecture, new tools, processes and models to build a sustainable Big Data platform.
- Analyze, document and program the extraction, standardization, transformation and integration of multiple data sources within the Big Data platform.
- Help administer the Big Data platform by applying access controls to data and tools, monitoring the platform’s usage and costs, and defining its configuration.
- Review and develop a variety of data engineering solutions that enable new use cases for deriving value from our data.
- Support the continuous improvement of development practices within the team by participating in code reviews, developing tests and sharing innovative practices with your developer colleagues.
- Assist in setting up a data governance plan in compliance with our privacy policy.
- Provide technical expertise, advice and support to the BI team.
- Help advance the media and entertainment industries by collaborating with universities on research projects.
What you bring to our team:
- You have applied Big Data processing techniques in a professional setting and are familiar with the tools involved.
- You have experience in collecting, standardizing and transforming raw data from all kinds of data sources (ETL/ELT).
- You have data modelling experience, specifically with data warehouses and data lakes (or lakehouses).
- You appreciate the importance of sound data governance, quality and security.
- You have experience in SQL/Python/Java/Scala development and care about quality assurance and writing clean code.
- You are keen on automation and optimization.
- You have experience identifying client needs and translating them into technical solutions in a data and analytics environment.
- You are autonomous and like to step out of your comfort zone.
- You like to work collaboratively, enjoy sharing your knowledge and are always open to ideas from others.
The bonus skills:
- You have experience in the media and entertainment industries.
- You have experience with Databricks, Delta Lake or Iceberg.
- You have experience with a Big Data processing system (ideally, Apache Spark).
- You have experience with one of the cloud service providers (ideally, Azure).
- You have experience with real-time data streaming, cloud messaging and event streaming tools (ideally, Kafka, Azure Service Bus and Azure Event Hubs).
- You are familiar with one or more workflow orchestration tools (ideally, Airflow).
- You have experience with content development/rollout tools and processes.
- You are ideally proficient in spoken and written English.