Zhiwei (Jerry) Yang
Curriculum Vitae
56-60 Gordon Crescent
Lane Cove North, NSW 2066
Australia
zhiwei-jerry-yang
DATA ANALYSIS
Experienced Data Analyst specialising in statistical modelling, comfortable working with large
and complex datasets, meeting tight deadlines and delivering superior performance. Possesses solid statistical
and machine learning knowledge and extensive experience in programming and databases. Operates with
a substantial sense of urgency and thrives in a fast-paced setting. Native Mandarin speaker, proficient in
English. Core competencies include:
Statistics • Bayesian in the making
Regression Analysis • Time Series • XGBoost • Expectation Maximisation
Business Intelligence • Programming & Algorithms • Databases & SQL • Project Management & Agile
EDUCATION
- Master of Statistics, University of New South Wales, Sydney.
The Master of Statistics program covers a wide range of statistical theory and practice and
provides advanced training for those who are currently, or are aiming to become, practising
statisticians. In addition, the program also provides a means of obtaining the necessary
preparation for further research in Statistics.
- Master of Information Technology, University of New South Wales, Sydney.
The aim of this program is to provide students with a broad-based IT education, and more
specialised knowledge in up to two areas, enabling them to work in a range of positions in the
IT industry.
- Bachelor of Engineering (BE), Nanjing Tech University, Nanjing, China.
Majored in Computer Science, Department of Computer Science and Technology
PROFESSIONAL EXPERIENCE
2018.12–Present Data Analyst, Data Centre of Excellence (DCoE), Optus.
(1 year, 3 months) As a strong contributor to the analytics strategy, capability and delivery of insight
and analytics programs of work required by specific business units, the Data Analyst
supports and enables the business to make smarter decisions, achieve financial
targets and explore incremental growth opportunities. The Data Analyst engages
and collaborates with stakeholders to understand the key business problems to solve
and the business metrics to impact through business analysis and intelligence, supporting
the translation of requirements into analytical tasks and ensuring accuracy and
completeness in the relevant data.
• Perform analysis on Customer Lifetime Value (CLV) data to discover actionable insights.
• Prototyped and implemented the financial data model for the CLV engine proof of concept.
• Refined Holistic Customer Management (HCM) through feature and insight discovery, and
contributed to business adoption.
• Provided full support for the Customer NPS program by collecting a wide variety of
organisational data and mapping between legacy and current systems.
- Data Analyst - Predictive Modelling, Studiosity, Sydney.
(1 year, 7 months) The Data Analyst is principally responsible for analysing historical data and user
trends to monitor and inform the quality, consistency and efficiency of the Studiosity
service as well as providing insights into client interactions and service usage, in order
to set appropriate staffing levels, and to ensure the provision of the highest quality
educational experiences to students.
• Built machine learning models using algorithms such as LSTM and XGBoost to predict
service demands and guide scheduling (an illustrative sketch follows this list).
• Used Python, R, Excel and SQL to provide a variety of analysis support to units across the
business, e.g. measuring specialist engagement and student satisfaction.
• Streamlined and automated routine reporting tasks.
• Assisted the organisation in becoming more data-savvy.
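The demand-forecasting bullet above only names the techniques; the snippet below is a minimal, self-contained Python sketch of the XGBoost half of that approach, using synthetic data and made-up feature names (hour_of_day, day_of_week, sessions_last_week) in place of the real service-usage dataset, so it illustrates the method rather than the production model.

# Illustrative sketch: forecasting hourly service demand with XGBoost.
# Features and data are synthetic stand-ins; the LSTM counterpart is not shown.
import numpy as np
import pandas as pd
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split
from xgboost import XGBRegressor

rng = np.random.default_rng(0)
n = 2000
df = pd.DataFrame({
    "hour_of_day": rng.integers(0, 24, n),
    "day_of_week": rng.integers(0, 7, n),
    "sessions_last_week": rng.poisson(30, n),
})
# Synthetic target: demand tracks recent usage and peaks in the evening.
df["sessions"] = (
    0.8 * df["sessions_last_week"]
    + 10 * np.exp(-((df["hour_of_day"] - 20) ** 2) / 18)
    + rng.normal(0, 2, n)
)

X, y = df.drop(columns="sessions"), df["sessions"]
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

model = XGBRegressor(n_estimators=300, max_depth=4, learning_rate=0.05)
model.fit(X_train, y_train)
print("MAE:", mean_absolute_error(y_test, model.predict(X_test)))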
- Statistical Data Analyst, Business Reporting and Intelligence, and Data Governance (BRIDG), University of New South Wales, Sydney.
(2 years, 2 months) Collects, organises and interprets quantifiable data and uses statistical methodologies
to produce reports and analyses to support BRIDG operations. Administers surveys and
data collections for approximately 55,000 enrolled students and over 7,200 university
professional staff.
• Provided data input for the University’s 2025 strategy benchmark analysis by conducting
the UniForum Staff and Contractor activity collections.
• Built regression models with shrinkage in R to better explain student satisfaction levels.
• Used R, SAS and other statistical analysis and modelling tools to perform a variety of
ad-hoc explorations, e.g. testing student load calculations.
• Identified possible areas for improvement and streamlining, making data collection processes
more efficient.
• Automated laborious tasks by running VBA code (i.e. auto-filtering), cutting administrative
overhead.
• Customised the online survey instrument written in Oracle PL/SQL and JavaScript to
cater for roughly 3,300 paper form responses collected at graduation ceremonies under
exceptionally tight deadlines.
• Wrote regular expressions in SQL to cleanse tens of thousands of emails and phone numbers
stored as free text, reducing the number of email bounce-backs by more than 40% and
potentially saving hundreds of man-hours (an illustrative sketch follows below).
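The last bullet above describes cleansing done with regular expressions inside SQL; the snippet below is only a minimal Python sketch of the same kind of pattern logic, with the patterns, function names and sample values being illustrative assumptions rather than the original Oracle code.

# Illustrative sketch of regex-based contact-data cleansing (the original work
# used SQL regular expressions; patterns and examples here are made up).
import re
from typing import Optional

EMAIL_RE = re.compile(r"[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}")

def extract_email(free_text: str) -> Optional[str]:
    """Pull the first well-formed email address out of a free-text field."""
    match = EMAIL_RE.search(free_text)
    return match.group(0).lower() if match else None

def normalise_phone(free_text: str) -> Optional[str]:
    """Keep digits only and require a plausible length (8-15 digits)."""
    digits = re.sub(r"\D", "", free_text)
    return digits if 8 <= len(digits) <= 15 else None

if __name__ == "__main__":
    print(extract_email("Contact: John Smith <john.smith@example.edu.au> (work)"))
    print(normalise_phone("ph: (02) 9385 0000 after hours"))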
During the workplace change starting in April 2014, I took up the following position in the newly
established work unit through an EOI process.
- Business Intelligence Developer & Analyst, Business Reporting and Intelligence, and Data Governance (BRIDG), University of New South Wales, Sydney.
(9 months) Responsible for working on the business intelligence (BI) and analytics strategy for
the University and is involved at some level throughout the entire BI development
life cycle, but primarily focused on front end reporting development and user support.
Sources data from the University’s data warehouse to answer a variety of ad hoc and
analysis requests in a timely manner.
• Within the first week in this new role, gained a working knowledge of modern data
warehousing and Kimball's dimensional modelling methodology, and began responding to
ad hoc requests.
• Juggled multiple student survey data collection projects on tight deadlines ranging in cohort
size from 60 to 11,500.
• Designed and implemented end-user reports with the suite of SAS Enterprise BI products
to facilitate information transfer.
• Employed data visualisation tools including Tableau, R packages and SAS Visual Analytics
(VA) to unearth insights from survey datasets.
- Database Programmer, Institutional Analysis and Reporting Office, University of New South Wales, Sydney.
(5 years, 6 months) Develops and maintains web applications, online survey instruments, underlying
database structures and reporting functionality using Oracle PL/SQL and SAS Enterprise BI.
• Implemented project plans, developed and maintained a dozen web applications utilising
PL/SQL and jQuery to accommodate a variety of needs across the University. These
include: research publication management, a future student application and leadership
program, among other small-to-medium-sized systems.
• Proposed a paradigm shift towards employing PHP as the development tool in an effort to
improve code maintainability: after briefly self-studying the Yii PHP Framework, built a
pilot project for managing student misconduct records, which received positive feedback for
its usability and ease of maintenance.
• Injected critical thinking and practical coding styles into daily work by swiftly absorbing new
technologies and concepts (e.g. new jQuery methods, events and third-party plugins)
and applying them to application development.
• Volunteered to write an official document describing the development work request process.
• Took on tasks from others, technical or non-technical, that did not normally fall within my
skill set; on multiple occasions filled in as an administrative assistant to take minutes for
Friday meetings.
- Web Developer, Ausnik IT, Sydney.
(6 months) Develops and maintains websites using PHP/MySQL to satisfy customers with diverse
business needs.
TECHNICAL SKILLS
Proficient in Microsoft Office Suite: advanced Excel • Word • PowerPoint
R • Python • SQL • PL/SQL • SAS • PHP • JavaScript • HTML/CSS • Tableau • SAS VA
MATLAB • Mathematica • Linux Shell • LaTeX • Markdown
CERTIFICATIONS
Coursera Deep Learning Specialisation
• Improving Deep Neural Networks: Hyperparameter tuning, Regularization and
Optimization
• Neural Networks and Deep Learning
• Structuring Machine Learning Projects
INDEPENDENT PROJECTS
Kaggle Titanic: Machine Learning from Disaster.
Model selection and validation method: cross-validation.
Missing value imputation: mean substitution.
Modelling methods (Python):
• Random Forest.
• Stochastic Gradient Descent (SGD).
• Nearest Neighbours (NN) - Centroid.
• Support Vector Machine (SVM).
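Below is a minimal, illustrative Python sketch of one combination from the list above (mean substitution, 5-fold cross-validation and a Random Forest); it assumes the standard train.csv from the Kaggle competition is available locally, and the feature choice and scores are illustrative only, not the submitted solution.

# Minimal sketch: cross-validated Random Forest on the Kaggle Titanic data,
# with mean substitution for missing values.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.impute import SimpleImputer
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

df = pd.read_csv("train.csv")  # standard Kaggle Titanic training file

# A handful of basic features; Sex is encoded as 0/1, Age and Fare contain NaNs.
X = df[["Pclass", "Sex", "Age", "SibSp", "Parch", "Fare"]].copy()
X["Sex"] = (X["Sex"] == "female").astype(int)
y = df["Survived"]

model = make_pipeline(
    SimpleImputer(strategy="mean"),                       # mean substitution
    RandomForestClassifier(n_estimators=200, random_state=0),
)

scores = cross_val_score(model, X, y, cv=5)               # 5-fold cross-validation
print("CV accuracy: %.3f +/- %.3f" % (scores.mean(), scores.std()))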