Easyfairs is working towards improving its data-driven decision making capabilities. To this end, we are looking for a Data Engineer with the ability to write, maintain and administer cloud-based and managed ETL and reverse ETL flows into and out of our BigQuery-based data warehouse.
Working remotely and from our head office in Brussels and reporting to the Head of Data Intelligence, you – as our new Data Engineer – will be responsible for:
* Building custom ETL and reverse ETL flows when necessary using modern Python.
* Deploying these custom flows to the cloud, and maintaining and securing them.
* Expanding and maintaining Python-based API deployments that provide programmatic access to the contents of our data warehouse, among other uses.
* Deploying and monitoring managed ETL flows, ensuring they remain fit for purpose and keeping costs under control.
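To give a flavour of the custom flows mentioned above, here is a minimal, illustrative transform step in plain Python (the field names and cleaning rules are hypothetical, not a description of our actual pipelines):

```python
# Toy "transform" step of an ETL flow: normalise raw event records
# before loading them into a warehouse table. All names are illustrative.
from datetime import datetime, timezone

def transform(raw_rows):
    """Clean raw rows: drop incomplete ones, normalise key fields."""
    cleaned = []
    for row in raw_rows:
        if not row.get("email"):
            continue  # skip rows missing the key field
        cleaned.append({
            "email": row["email"].strip().lower(),
            "event": row.get("event", "unknown"),
            "loaded_at": datetime.now(timezone.utc).isoformat(),
        })
    return cleaned

rows = transform([
    {"email": " Visitor@Example.com ", "event": "badge_scan"},
    {"email": ""},  # dropped: no email
])
```

In a real flow, a step like this would sit between an extract from a source system and a load into BigQuery, typically orchestrated by a scheduler.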
What technical skills do you need to be successful in this role?
* You write Python with confidence, or are willing to embrace it enthusiastically.
* You are excited about, and have some experience with, deploying workloads in a public cloud such as Google Cloud Platform (GCP), for example via serverless container services or managed Kubernetes.
* You are no stranger to implementing REST APIs and have considered their security aspects as well (Basic Auth, JWT, etc.).
* Perhaps you have had exposure to scraping frameworks such as Scrapy for obtaining API data.
* As you are all about data, you can write queries in SQL. We work with BigQuery.
* Git is your friend and you have some idea of how CI/CD can help you and your team make fewer mistakes.
* You know how containers help you deliver your code to your users and how to use them in CI/CD.
* You know that data needs to be fresh, and hence have considered how to keep it so using a scheduling framework such as Apache Airflow.
* You care about keeping your infrastructure comprehensible to teammates and have therefore looked into infrastructure-as-code approaches such as Terraform.
* You may have heard of dbt and see the problem it is trying to solve.
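On the API-security point above, here is a minimal sketch of the HMAC-signing idea that underpins JWTs, using only the Python standard library (in a real service you would use a vetted library such as PyJWT; the secret and payload are purely illustrative):

```python
# Simplified illustration of token signing and verification, the core idea
# behind JWTs. Not production code: use a vetted JWT library in practice.
import base64
import hashlib
import hmac
import json

SECRET = b"demo-secret"  # illustrative; load from a secret manager in practice

def sign(payload: dict) -> str:
    """Encode a payload and append an HMAC-SHA256 signature."""
    body = base64.urlsafe_b64encode(json.dumps(payload).encode())
    sig = hmac.new(SECRET, body, hashlib.sha256).digest()
    return body.decode() + "." + base64.urlsafe_b64encode(sig).decode()

def verify(token: str):
    """Return the payload if the signature checks out, else None."""
    body, _, sig = token.partition(".")
    expected = hmac.new(SECRET, body.encode(), hashlib.sha256).digest()
    if not hmac.compare_digest(base64.urlsafe_b64encode(expected).decode(), sig):
        return None  # signature mismatch: reject the request
    return json.loads(base64.urlsafe_b64decode(body))

token = sign({"sub": "data-api-client"})
```

The same verify-before-trust pattern applies whatever scheme an API uses, from Basic Auth to full OAuth flows.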
What personal qualities would help you succeed?
* You are a strong communicator, written and verbal, and enjoy collaboration, making it part of your daily process by actively communicating with your team members on Slack and in code reviews.
* You do not hold back from turning to your teammates for help if you are blocked or unclear on something.
* If you identify a gap in a teammate's knowledge, you see it as an opportunity to teach and to improve the overall knowledge of the team.
* When things break, you do not rush to blame; instead you focus on addressing the problem and on setting yourself and the team up to avoid repeating the mistake in the future.