The Data Analytics team comprises Python Data Engineers and DevOps specialists. The team is part of the technology department and works closely with business users across the business. The data engineers collaborate with software engineers, analytics teams, data scientists, and DBAs to understand and implement innovative analytics solutions across the business.
The team is responsible for building and maintaining a Python-based platform for data analysis. This includes creating scalable and resilient data pipelines, developing new and better analytical capabilities, and leveraging a range of technologies and methods to enable our dynamic business to make better decisions.
This is an ideal role for an enthusiastic data engineer with strong Python development skills who wants to work on a greenfield project. The team is expanding its responsibilities across the organisation and scaling the existing data platform using cloud technologies. The data engineering team will innovate through exploration, design, and benchmarking, with the goal of making technology recommendations and implementing a scalable data analysis platform in the cloud. The role also involves the development and implementation of a range of infrastructure and automation projects.
Our goal is to provide the world’s best sports betting data science platform.
The role involves working with users to understand requirements as well as actively promoting the team's technology capabilities. The key responsibilities of the Data Analytics team are:
- To build and maintain a scalable data platform.
- To use the platform to build scalable data pipelines and ETL workflows.
- To support other areas of the business with analysis, visualisations, automation, and resolving performance bottlenecks.
Essential Skills and Experience
- Strong Python development skills, including experience working with NumPy and Pandas.
- Understanding of and experience working with data-driven architectures.
- Proficiency with Linux and Bash scripting.
- Experience building and using cloud-based solutions on AWS.
- Experience with Docker and Docker Compose.
- Experience working with efficient data storage formats such as Parquet, Avro or ORC.
- Management of virtual environments using Anaconda and pip.
- Interest and willingness to teach and learn from others.
Beneficial Skills and Experience
- Experience working with C/C++.
- Hands-on experience with NiFi, Airflow, and Kubernetes.
- Familiarity with visualisation tools such as Plotly or Bokeh.
- Understanding of data platforms and resilient architectures.