Thndr is an investment platform that aims to democratize access to investing for everyday individuals in MENA. Our app offers a seamless way to achieve financial independence and growth, without requiring prior financial knowledge or access to large amounts of capital.
The company was formed primarily to address irrelevant existing products and low financial literacy. We aim to solve this by focusing on education, offering a seamless and intuitive product, removing barriers, and building an investment supermarket.
We launched in Egypt in late 2020 and currently allow our users to learn, connect & invest in multiple investment products.
It’s not every day that you solve a basic societal necessity and at the same time change cultural norms. But the reward will be priceless. In our short journey we’ve validated this, as illustrated by these key figures:
96% of our investors are investing for the 1st time through Thndr
54% come from outside of capital cities and have previously had limited access to financial institutions
86% of new stock market investors in Egypt during 2022 came through Thndr
#1 platform in terms of local trades with 25% of EGX trades happening through Thndr
We’re still in the very early stages of our story, but we know for a fact that we won’t stop until everyone in MENA has an equal opportunity to generate and grow their wealth in an ethical manner.
About The Role
This role reports directly to the VP of Data. We are looking for a Senior Data Engineer with:
BS/BE/MS degree in Computer Science, Applied Mathematics, Statistics, Information Systems, or equivalent experience
5+ years of industry experience in Data Engineering
Strong programming skills in Python and experience with ETL tools such as Airflow, Flume, and Fivetran
Strong technical leadership skills and the ability to work in cross-functional teams
Hands-on experience in creating and managing an enterprise-scale Data Platform, while setting best practices for security, privacy, monitoring & alerting, and CI/CD
Familiarity with analytics libraries (e.g. pandas, numpy, matplotlib), distributed computing frameworks (e.g. Spark, Dask) and cloud platforms (e.g. AWS, Azure, GCP)
Hands on experience with modern cloud data warehouses/platforms like Snowflake, Redshift or Databricks is a plus
Exposure to software engineering concepts and best practices, including DevOps, DataOps, and MLOps, is beneficial
Building and optimizing data pipelines, architectures, and data sets in a cloud environment
Strong SQL knowledge and experience working with relational and non-relational databases such as PostgreSQL, Redshift, Aurora, MongoDB, and Elasticsearch
Assemble large, complex data sets that meet functional / non-functional business requirements
Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources
Build analytics tools and APIs that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics
Work with stakeholders including the Growth and BI teams to assist with data-related technical issues and support their data infrastructure needs
Strong knowledge of database design and data modeling
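To make the pipeline-building responsibilities above concrete, here is a minimal extract-transform-load sketch. It is illustrative only, using the Python standard library in place of the production tools named above (Airflow, Fivetran, etc.); the `trades` table, its fields, and the sample payload are all hypothetical, not Thndr's actual schema:

```python
import json
import sqlite3

# Hypothetical raw payload, standing in for an upstream source such as an API dump.
RAW_TRADES = json.dumps([
    {"user_id": 1, "ticker": "COMI", "qty": 10, "price": "52.30"},
    {"user_id": 2, "ticker": "SWDY", "qty": 5,  "price": "11.80"},
    {"user_id": 1, "ticker": "COMI", "qty": -4, "price": "53.10"},
])

def extract(payload: str) -> list:
    """Extract: parse the raw source into Python records."""
    return json.loads(payload)

def transform(records: list) -> list:
    """Transform: cast string prices to floats and derive each trade's value."""
    return [
        (r["user_id"], r["ticker"], r["qty"], float(r["price"]),
         round(r["qty"] * float(r["price"]), 2))
        for r in records
    ]

def load(rows: list, conn: sqlite3.Connection) -> None:
    """Load: write the cleaned rows into a warehouse-style table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS trades "
        "(user_id INTEGER, ticker TEXT, qty INTEGER, price REAL, value REAL)"
    )
    conn.executemany("INSERT INTO trades VALUES (?, ?, ?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_TRADES)), conn)
total = conn.execute("SELECT ROUND(SUM(value), 2) FROM trades").fetchone()[0]
```

In a real deployment each step would be an orchestrated task (e.g. an Airflow operator) writing to a cloud warehouse rather than an in-memory SQLite database, but the extract/transform/load separation is the same.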
ACQUISITION & PREPARATION
Proven hands-on experience with Big Data and cloud environments and tools such as AWS, GCP, RDS, Aurora, PostgreSQL, Metabase, Airflow, and Fivetran
Proven hands-on experience with large structured and unstructured data sets: SQL and relational databases, and data warehouses/data lakes such as Snowflake and Redshift
Experience with NoSQL databases such as MongoDB and Elasticsearch
Proven hands-on experience in API integration using Python for extracting data from different sources
Proven hands-on experience with container technologies such as Docker and Kubernetes
Proven hands-on experience developing and deploying solutions in cloud ecosystems and their associated services (AWS, Azure, Google Cloud, IBM Cloud)
Proven hands-on experience building robust data pipelines in Python using ETL techniques and frameworks such as Airflow, Flume, and Spark
Experience with messaging systems such as Kafka
Proven hands-on experience working with large volumes of structured and unstructured data, and leveraging it for AI/ML model deployment, monitoring, and enhancement, whether for standalone solutions or through end-to-end automated data pipelines
Ability to step back, analyze problems, and find solutions, with the drive to implement them
Ability to work and collaborate with a variety of stakeholders and clients throughout the data project lifecycle
Strong interpersonal and organizational skills, high motivation, attention to detail, flexibility, the ability to cope under stress, and a focus on identifying solutions to problems
Strong communication skills and the ability to translate complex solutions into business implications, while also being able to explain mathematical concepts when required
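The API-integration requirement in the list above often amounts to paging through a REST endpoint until it is exhausted. A minimal sketch of that pattern follows, with the HTTP call stubbed out so the example is self-contained; `fetch_page`, its page size, and the record shape are all hypothetical:

```python
from typing import Callable, Iterator, List

def fetch_page(page: int, page_size: int = 2) -> List[dict]:
    """Stub for an HTTP GET against a hypothetical paginated endpoint.
    A real implementation would issue the request with urllib.request
    or a similar HTTP client here."""
    data = [{"id": i} for i in range(5)]  # pretend the API holds 5 records
    start = page * page_size
    return data[start:start + page_size]

def extract_all(fetch: Callable[[int], List[dict]]) -> Iterator[dict]:
    """Page through the source until an empty page signals exhaustion."""
    page = 0
    while True:
        batch = fetch(page)
        if not batch:
            return
        yield from batch
        page += 1

records = list(extract_all(fetch_page))
```

Yielding records lazily rather than accumulating them keeps memory flat when the upstream source is large, which matters for the data volumes described above.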
At Thndr, we’re looking for people invigorated by our mission, not just those who simply check off all the boxes. We’re looking for people that are hungry to become agents of change and that understand the huge responsibility associated with dealing with people’s money.