STEWART FOHLO
SENIOR DATA ENGINEER
Midrand, South Africa
PROFESSIONAL SUMMARY
A versatile and impact-driven cloud and data engineer with over
a decade of hands-on experience spanning AWS architecture,
software engineering, big data platforms, and DevOps. I
specialize in designing and optimizing scalable cloud-native
solutions using AWS & Azure services, with proven ability to jump
into complex problems and deliver rapid, high-quality results. As
a certified AWS Developer and Microsoft Azure Data Engineer, I
bring cross-functional expertise in ETL/ELT pipeline
development, distributed systems, automation, and end-to-end
solution delivery. Known for my agility and problem-solving
acumen, I thrive in fast-paced, multi-team environments where
high-impact contributions are expected. I am deeply committed
to data security, governance, and operational excellence across
all stages of development and deployment.
WEBSITES, PORTFOLIOS, PROFILES
• https://www.linkedin.com/in/stewart-fohlo-b-/
• https://www.credly.com/users/stewart-fohlo/badges
• https://bold.pro/my/stewart-fohlo-
EDUCATION
PhD Candidate, Computer Science
University of KwaZulu-Natal
Pietermaritzburg, South Africa
04/2023 – 12/2025
Master of Science in Computer Science
& Engineering
Parul University
Gujarat, India
05/2016 – 08/2018
Bachelor of Science: Computer Science
Honors Degree
Midlands State University
Gweru, Zimbabwe
01/2011 – 12/2014
CERTIFICATIONS
AWS Certified Data Engineer Associate
Microsoft Certified Azure Data Engineer
DAMA Certified Data Management Professional (CDMP)
Multi-Cloud Aviatrix Certified Engineer
Neo4j Certified Professional
Hortonworks Big Data Hadoop Certificate
Leading Teams Certificate - University of Michigan
CNSS Certified Network Security Specialist
Hadoop Programming, Administration & Apache Spark
Enterprise Deep Learning with TensorFlow
SOFTWARE
PySpark: Advanced
Python: Advanced
AWS Services: Upper intermediate
Azure Cloud: Upper intermediate
Hadoop: Advanced
Spark: Advanced
SQL: Advanced
NoSQL: Advanced
Snowflake: Intermediate
Microsoft Fabric: Advanced
OneLake: Advanced
SKILLS
Data pipeline development
ETL processes
Data warehousing
Cloud architecture
Data governance
Big data technologies
SQL programming
Data visualization
WORK HISTORY
Analytics Group, Sub-Contracting to ABSA Bank - Lead Data Engineer
Sandton, South Africa • 07/2025 - Current
• Lead Data Engineer in the Fraud and Credit Risk Division at ABSA Bank, reporting to the division's executive head.
• Align business objectives with technical requirements through close collaboration with product owners, providing insights based on available data sources.
• Identify areas for improvement within existing ETL frameworks to maximize efficiency without sacrificing the accuracy or integrity of results produced by transformations applied to ingested datasets.
• Ensure compliance with regulations governing data storage, privacy, security and transmission standards by staying updated on legal developments impacting the industry.
• Lead the implementation of Generative Business Intelligence Reporting (Gen-BI) on the AWS platform, leveraging services including Redshift, Glue, S3 and QuickSight.
• Collaborate with the platform teams in developing data pipelines and assist with architecture design and implementation.
• Assist other team members with data engineering tasks and support whenever required.
Analytics Group, Sub-Contracting at BMW IT Hub - Senior Data Engineer
Pretoria, South Africa • 01/2019 - 07/2025
• Design and implement scalable and robust processes for ingesting and transforming complex datasets.
• Design, develop, construct, maintain, and support data
pipelines for ETL from a multitude of sources.
• Create blueprints for data management systems to centralize,
protect, and maintain data sources.
• Focused on data stewardship and curation; as the Senior Data Engineer, enabled the data scientists to run their models and achieve the desired business outcomes.
• Ingest large, complex data sets that meet functional and non-functional requirements.
• Enable the business to solve the problem of working with large
volumes of data in diverse formats, and in doing so, enable
innovative solutions.
• Design and build bulk and delta data lift patterns for optimal
extraction, transformation, and loading of data.
• Development of APIs for returning data to Enterprise Applications.
• Debugged data pipelines when batch or streaming loads failed, affecting downstream reports and users.
• Migrated legacy ETL jobs without breaking downstream
dashboards.
• Designed and developed EMR clusters that recover gracefully when S3 times out or when Spark clusters unexpectedly die.
• Mentored junior data engineers and assisted the Operations team in debugging data-related issues and incidents.
• Developed CI/CD pipelines using GitHub Actions and Terraform on AWS.
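The resilience work described above, EMR jobs recovering gracefully from transient S3 timeouts, typically rests on a retry-with-backoff pattern. The sketch below is a generic illustration in plain Python; the function names, parameters, and the simulated flaky read are assumptions for demonstration, not the actual production code:

```python
import random
import time


def retry_with_backoff(fn, retryable=(TimeoutError,), max_attempts=4, base_delay=0.5):
    """Call fn(); on a retryable error, wait with exponential backoff plus
    jitter and try again, up to max_attempts total attempts."""
    for attempt in range(1, max_attempts + 1):
        try:
            return fn()
        except retryable:
            if attempt == max_attempts:
                raise  # out of attempts: surface the error to the caller
            # exponential backoff with jitter to avoid synchronized retries
            time.sleep(base_delay * (2 ** (attempt - 1)) * random.uniform(0.5, 1.0))


# Hypothetical flaky read that fails twice before succeeding,
# standing in for an S3 GET that intermittently times out.
calls = {"n": 0}


def flaky_read():
    calls["n"] += 1
    if calls["n"] < 3:
        raise TimeoutError("simulated S3 timeout")
    return "object-bytes"
```

Here `retry_with_backoff(flaky_read)` succeeds on the third attempt; a non-retryable error, or exhausting `max_attempts`, propagates to the caller so the job can fail visibly rather than hang.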
Analytics Group, Sub-Contracting at Capitec Bank - Senior AWS
Data Engineer
Sandton, South Africa • 06/2022 - 12/2022
• Company Overview: Capitec Bank's credit products include
Home Loans/Bonds, Personal Loans, Credit Cards, Vehicle
Financing and Business Financing.
• Designing and implementing an end-to-end, cost-effective data
architecture spanning data ingest from disparate sources,
storage, scaling and vending to business stakeholders.
• Retrieve and analyze data using a broad set of AWS
technologies (e.g. Athena, Aurora, Redshift and S3) and
resources, knowing how, when, and which to use.
• Extract, transform, and load data from many data sources
using SQL, Scripting and other ETL tools like AWS Glue and
PySpark.
• Facilitate the selection, licensing and implementation of Power BI as the data visualization product.
• Lead deep-dive analysis of customer utilization behaviours, surface key insights, and identify high-value actions the growth team can take in product and marketing strategies.
• Managing the availability, usability, integrity and security of the data for Capitec Bank's credit products, based on internal data standards and policies that also control data usage.
• Partner closely with Product, Tech, and Operations leaders to
define and analyze experiments, including any lessons learned
and recommended changes to approach.
• Define, measure and present metrics / automated reports on
multiple products to senior leadership.
• Implementing effective data governance standards, ensuring that data is consistent and trustworthy for accurate analytics and reporting.
• Mentoring junior engineers and conducting performance reviews
• Sprint planning and work allocation
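Governance duties like those above, ensuring data is consistent and trustworthy, usually reduce to automated quality checks gating publication. This is a minimal sketch in plain Python with hypothetical rule and field names, not Capitec's actual framework:

```python
def check_completeness(rows, required_fields):
    """Return the fraction of rows in which every required field is present
    and non-empty. 1.0 means fully complete."""
    if not rows:
        return 1.0
    ok = sum(
        1 for row in rows
        if all(row.get(f) not in (None, "") for f in required_fields)
    )
    return ok / len(rows)


def check_uniqueness(rows, key_field):
    """Return True when key_field is unique across all rows,
    a basic primary-key integrity check."""
    keys = [row[key_field] for row in rows]
    return len(keys) == len(set(keys))


# Hypothetical loan records used only to illustrate the checks.
loans = [
    {"loan_id": "L1", "product": "Home Loan", "amount": 950000},
    {"loan_id": "L2", "product": "Credit Card", "amount": None},
    {"loan_id": "L3", "product": "Vehicle Financing", "amount": 310000},
]
```

With these records, `check_completeness(loans, ["loan_id", "amount"])` scores 2/3 and `check_uniqueness(loans, "loan_id")` passes; a pipeline would compare such scores against agreed thresholds before releasing data downstream.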
Analytics Group, Sub-Contracting at Vodacom - Senior Data
Engineer
Midrand, Gauteng • 10/2021 - 05/2022
• AWS Cloud Data Engineering
• Vodacom International Markets Data Engineering
• AWS data lake development and support
• Design and develop highly performant, scalable and stable Big Data cloud-native applications
• Source data from a variety of different sources, in the correct format, meeting data quality standards and assuring timeous access to data and analytical insights
• Build batch and real-time data pipelines with automated testing and deployment, using AWS Glue, Step Functions, Lambda, Apache Airflow and custom Python/PySpark code
• Build transformations to produce enriched data insights on AWS Athena and Redshift
• Integrate applications with business systems to enable value from analytic models and enable decision making
• Define and implement best practices relating to cloud economics, software engineering and data engineering to ensure a well-architected cloud framework with data management practices built into each design
• Work with the architecture team to evolve the Big Data capabilities (reusable assets/patterns) and components to support the business requirements/objectives
• Research, investigate and evaluate new technologies and methods to improve the delivery and sustainability of data applications and services
• Develop dynamic application health and performance dashboards sent to relevant stakeholders on a daily, weekly and monthly basis
• Contribute to defining best practices for the agile development of applications to run on the Big Data Platform
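A common building block behind batch pipelines like these is watermark-based incremental extraction followed by an idempotent upsert. The sketch below is a generic plain-Python illustration; the record shapes, field names, and timestamps are hypothetical, not the Vodacom implementation:

```python
def extract_increment(records, watermark):
    """Select records updated after the last watermark and compute the new
    watermark, so each batch run only processes what changed."""
    delta = [r for r in records if r["updated_at"] > watermark]
    new_watermark = max((r["updated_at"] for r in delta), default=watermark)
    return delta, new_watermark


def merge_upsert(target, delta, key="id"):
    """Upsert delta rows into target keyed by `key` (last write wins),
    keeping the load idempotent if the same batch is replayed."""
    merged = {row[key]: row for row in target}
    for row in delta:
        merged[row[key]] = row
    return list(merged.values())


# Hypothetical source rows with integer event-time stamps.
source = [
    {"id": 1, "updated_at": 10, "v": "a"},
    {"id": 2, "updated_at": 25, "v": "b"},
    {"id": 3, "updated_at": 30, "v": "c"},
]
```

Starting from a watermark of 10, `extract_increment(source, 10)` yields only the rows for ids 2 and 3 with a new watermark of 30, and replaying `merge_upsert` with the same delta leaves the target unchanged, which is what makes failed-batch reruns safe.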
Analytics Group, Sub-Contracting at Standard Bank - Data
Engineer
Johannesburg, South Africa • 10/2019 - 09/2021
• Automating the manual upload of employee data files from SAP HC into the Clearview app on Azure Cloud, keeping the two applications in real-time sync.
• Collaborate with different teams in the bank, i.e. the IT Security, DNS, F5 Load Balancer, Internet Services, Oracle, Linux, Wintel and Firewall teams, to come up with the various data solutions and automation approaches required by the bank.
• Responsible for maintaining quality reference data in Oracle by
performing operations such as cleaning, transformation and
ensuring integrity in a relational environment for Group
Internal Audit Network Tool (GIANT) application.
• Developed SQL scripts and optimized SQL queries for pulling
out reports to be utilized by business.
• Initiated fine-tuning of the database and its queries so that given jobs and tasks complete in optimal time.
• Reports development using PowerBI
• Develop and maintain the data recovery environment for on-premises applications, i.e. One World, GIANT and Enterprise Risk Manager.
• Developing, implementing and supporting GIANT application
running on Tomcat middleware.
• Ensuring that applications are on the latest supported
infrastructure and versions.
• Set up and maintain application monitoring using AppDynamics, Azure Application Insights and Splunk, integrating email, MS Teams and Kaizala alerts for notifications.
• Develop dynamic applications health and performance
dashboard to be sent out to relevant stakeholders on a daily,
weekly and monthly basis.
Analytics Group, Sub-Contracting at SkX Protiviti - Data
Analytics Consultant
Rosebank, South Africa • 03/2019 - 02/2020
• ETL development of data from various Big Data sources, i.e. SAP, MSSQL, OracleDB, MySQL, Teradata, Hive and NoSQL DBs
• Develop SQL and ACL CAATs for clients like Transnet, Telkom,
Airports Company of South Africa, National Treasury, South
African Airways and Department of Labour (Compensation
Fund)
• Identify, design, and develop data analytics extract routines to
support audit activities performed by the Internal Audit team.
• Manage data extraction, storage, transformation, and
processing through data analytics routines, and generate
output for visualization/analysis by the Internal Audit team.
• Use data analysis tools to automate audit testing and develop
techniques for continuous auditing and analyzing large volumes
of data.
• Interact with management and business partners to identify
appropriate data sources and data elements required for
analytics, applying professional scepticism when assessing data
sources and validating the completeness and accuracy of data
received.
• Interact and collaborate with Internal Audit team members in
working towards departmental goals.
• Perform audit tests of controls by obtaining and analyzing
audit evidence, preparing audit working papers, evaluating
test results, and drawing conclusions on the adequacy and
effectiveness of controls.
Analytics Group, Sub-Contracting at Absa Bank - Big Data
Developer
Johannesburg, South Africa • 11/2018 - 10/2019
• Big Data Development for Financial Crime and Conduct Risk
(Hortonworks)
• Tools: HDFS, Apache Spark, Scala, HiveQL, Beeline, Sqoop,
Oozie, Zookeeper, Java
• Hadoop development and implementation.
• Loading from disparate data sets.
• Pre-processing using Hive and Pig
• Perform data quality checks in a methodical manner to
understand how to accurately utilize client data.
• Expert level programming skills using Hadoop to meet the
challenges of advanced data manipulation, complicated
programming logic, and large data volumes.
• Communicate results and methodology with the project team
and clients with the ability to work in offshore/onshore model.
• Provide solutions for data driven applications involving large
and complex data and providing reconciliation and test cases.
• Understand customers' business processes and the pain areas which need attention.
• Solution Architecture for the entire flow from source to end
reporting data marts.
• Design Conceptual and physical data model for a global data
warehouse in the Hadoop world (ETL versus ELT).
• High Level & Low Level design for ETL Components in Hadoop.
• Test prototypes and oversee handover to operational teams.
• Propose best practices/standards.
• Hands-on work on Sqoop and Hive transformations; build monitoring and testing mechanisms around Sqoop jobs and data transformations.
• Continuous improvements to the current Hadoop set up in
terms of scalability, reliability and monitoring.
Escrow Group (EFS) - Software Developer
Harare, Zimbabwe • 05/2015 - 08/2018
• Banking & Microfinance software development
• Developing web-based software using programming languages such as VB.NET, C# and Java with MSSQL.
• Performing extensive software testing and stringent quality checks per the QC guidelines to ensure an error-free, optimal final product for the user.
• Creating and maintaining the ASP.NET based websites and
their web applications as per the initial guidelines of the
client.
• Making changes to existing web applications according to the
feedback received from the end users or clients.
• Testing the applications and websites on different web
browsers to ensure a standard user experience for the clients
across all platforms.
• Developing documentation throughout the software
development life cycle (SDLC).
• Training the end user or client on the final product as well as
providing technical support whenever required, weekends
included.
Steward Bank - Software Developer
Harare, Zimbabwe • 01/2014 - 05/2015
• Executed full software development life cycle (SDLC)
• Developed flowcharts, layouts and documentation to identify
requirements and solutions
• Wrote well-designed and testable code
• Troubleshot, debugged and upgraded existing systems
• Integrated software components into a fully functional
software system
• Documented and maintained software functionality
• Deployed programs and evaluated user feedback
• Ensured software was updated with the latest features.
Steward Bank - Associate Software Developer (Intern)
Harare, Zimbabwe • 01/2013 - 12/2013
• Collaborated with cross-functional teams for seamless
integration of new features and functionalities.
• Kept up-to-date with industry trends and emerging
technologies, applying relevant insights to ongoing projects as
needed.
• Worked with developers to identify and remove software bugs.
• Conducted module and regression tests.
• Discussed issues with team members to provide resolution and
apply best practices.
• Supported Data Center Infrastructure issues
PUBLICATIONS
• Google Scholar Reference
• The Role of Artificial Intelligence and Expert Systems in the
Implementation of ZimASSET
• Empirical Review Paper on Voice Recognition & Feature
Extraction Techniques