Data Platform (DataOps) Engineer

Work Type: Full Time
At Thndr we believe that investing should be easy and accessible to everyone. We started our journey to democratize investment in the region by removing commissions and account opening minimums, and by launching an intuitive platform with a focus on education. This way, investing opportunities are attainable to all, no matter their income level or expertise.

Thndr is an investment platform that aims to democratize access to investing for everyday individuals in MENA. For the people who use Thndr, our app represents a seamless way to achieve financial independence and growth, without needing prior financial knowledge or access to huge capital.

The company was formed primarily to address two problems: 1) existing products are not relevant, and 2) financial literacy is low. We’re looking to solve these by focusing on education, offering a seamless and intuitive product, removing barriers, and building an investment supermarket.

We launched in Egypt in late 2020 and currently allow our users to learn, connect & invest in the Egyptian Exchange, Egyptian mutual funds and the US Stock Market.

The journey ahead will be long and painful - it’s not every day that you solve a basic societal necessity while also changing cultural norms. But the reward will be priceless. In our short journey we’ve validated this, as illustrated by these key figures:
  • 96% of our investors are investing for the 1st time through Thndr.
  • 54% come from outside of capital cities and have previously had limited access to financial institutions.
  • 86% of new stock market investors in Egypt during 2022 came through Thndr.
  • #1 platform in terms of local trades with 25% of EGX trades happening through Thndr.
We recently raised our Series A and our next steps as a company include the following:
  • Continuing to focus on building the infrastructure. Sadly, existing solutions are outdated and don’t cater to scalable businesses.
  • Expanding beyond Egypt and into the rest of MENA. We’d like to extend our impact to all Arabic speakers.
  • Adding more products for people to invest in. To be relevant, we believe that we need to cater to all walks of life.
We’re still in the very early stages of our story, but we know for a fact that we won’t stop until everyone in MENA has an equal opportunity to generate and grow their wealth in an ethical manner.

What You'll Do

  • Build tools that enable the team to scale, such as a self-service console and monitoring.
  • Install, deploy, and manage tools built by us and the open source community.
  • Identify technical obstacles early and work closely with the team to find creative solutions.
  • Grasp new technologies rapidly as needed to progress varied initiatives.
  • Break down data issues and resolve them.
  • Build and optimize data pipelines, architectures, and data sets in a cloud environment.
  • Design and implement end-to-end data solutions for business opportunities.
  • Tenaciously keep the data platform operational.
  • Stand up and configure AWS environments.
  • Apply strong SQL knowledge and experience working with relational and non-relational databases such as PostgreSQL, Redshift, Aurora, Metabase, MongoDB, and Elasticsearch.
  • Assemble large, complex data sets that meet functional / non-functional business requirements.
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources.
  • Build analytics tools and APIs that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.
  • Work with stakeholders including the Growth and BI teams to assist with data-related technical issues and support their data infrastructure needs.
  • Apply strong knowledge of database design and data modeling.
  • Exposure to data quality and profiling tools is a plus.
  • Apply working knowledge of message queuing, stream processing, and highly scalable big-data stores.

What You'll Need

  • BS/BE/MS degree in Computer Science, Applied Mathematics, Statistics, Information Systems, or equivalent experience.
  • 5+ years of industry experience in Data Engineering.
  • Strong programming skills in Python and experience with ETL tools like Airflow, Flume, and Fivetran.
  • Strong technical leadership skills and ability to work in cross functional teams.
  • Hands-on experience in creating and managing an enterprise-scale Data Platform, while setting best practices for security, privacy, monitoring & alerting, and CI/CD.
  • Familiarity with analytics libraries (e.g. pandas, numpy, matplotlib), distributed computing frameworks (e.g. Spark, Dask) and cloud platforms (e.g. AWS, Azure, GCP).
  • Hands on experience with modern cloud data warehouses/platforms like Snowflake, Redshift or Databricks is a plus.
  • Exposure to software engineering concepts and best practices, inc. DevOps, DataOps and MLOps is beneficial.
  • Proven hands-on experience in Big Data and cloud environments, with tools such as AWS, GCP, RDS, Aurora, PostgreSQL, Metabase, Airflow, and Fivetran.
  • Proven hands-on experience in Python, coupled with experience in additional languages where possible (e.g. SAS, R, JavaScript).
  • Proven hands-on experience with large structured and unstructured data sets: Snowflake, SQL and relational databases, data warehouses, and data lakes such as Redshift.
  • Experience in NoSQL databases, such as MongoDB and Elasticsearch.
  • Proven hands-on experience in API integration using Python for extracting data from different sources.
  • Proven hands-on experience with container technologies such as Docker and Kubernetes.
  • Proven hands-on experience in solutions development and deployment in cloud ecosystems and their associated services (AWS, Azure, Google Cloud, IBM Cloud).
  • Proven hands-on experience building robust data pipelines using ETL techniques and frameworks such as Airflow, Python, Flume, and Spark.
  • Experience with messaging systems such as Kafka.
  • Proven hands-on experience working with large volumes of structured and unstructured data, and leveraging it to build, deploy, monitor, and enhance AI/ML models, either as standalone solutions or through end-to-end automated data pipelines.
  • Ability to step back, analyze problems, and find solutions, with the drive to implement them.
  • Ability to work and collaborate with a variety of stakeholders and clients throughout the data project lifecycle.
  • Strong interpersonal and organizational skills, high motivation, attention to detail, flexibility, the ability to cope under stress, and a focus on identifying solutions to problems.
  • Strong communication skills and the ability to translate complex solutions into business implications, while also being able to explain mathematical concepts when required.

At Thndr, we’re looking for people invigorated by our mission, not just those who simply check off all the boxes. We’re looking for people that are hungry to become agents of change and that understand the huge responsibility associated with dealing with people’s money.

Submit Your Application
