Aditya
Email:

Professional Summary:
• Experience of working with giants of three major domains:
  o Morgan Stanley [Banking Domain]
  o Asurion [Insurance Domain]
  o Eli Lilly [Healthcare Domain]
• Working as Senior Associate Technology at Synechron Technologies since November 2017.
• Worked as System Engineer at TCS from July 2014 till Oct 2017.
• Consistently a top performer on the Morgan Stanley account in TCS, holding an 'A' band throughout, the highest possible performance rating.
• Experience in developing applications on AWS using Python, S3, EMR, Lambda functions, DynamoDB, Redshift, SNS, SQS, and Kinesis.
• Experience in creating various dashboards on TIBCO Spotfire and Kibana.
• Experience in developing data lake applications using Hadoop, Sqoop, Hive, Spark, Spark Streaming, Impala, YARN, and Flume.
• Experience in developing applications using UNIX and shell scripting.
• Experience in working with multiple schedulers, including Active Batch, CloudWatch, cron, Autosys, TWS, and Oozie.
• Implemented LDAP and Secure LDAP authentication on Hadoop, Hive, Presto, and Starburst Presto.
• Experience in working with different authentication mechanisms such as LDAP, BATCH, and Kerberos.
• Good understanding of Software Development Life Cycle phases such as requirement gathering, analysis, design, development, and unit testing.
• Strong ideation and conceptualization skills; a sought-after person for many POCs.
• Developed multiple utilities, such as auto cleanup, Workbench, ILM, logging integration, Hadoop job automation, mainframe interaction, and procedure automation, using Python and Unix shell scripting; these are used throughout the account.
• Self-motivated and a team motivator, with proficient communication skills and a positive, ready-to-learn attitude.
• Goal-oriented, autonomous when required, with a good learning curve and an appreciation for technology.
Skills Snapshot:
Technology Stack:
• AWS: Lambda Function, Redshift, Glue, Kinesis, EMR, DMS, S3, Glacier storage, DynamoDB, TTL, Lifecycle, SQS, SNS, SageMaker, API Gateway, RDS, Elasticsearch/Kibana, QuickSight, Athena, Cognito, etc.
• Python: Python 3, Boto3, Pandas, asyncio, OpenCV, etc.
• Big Data & Others: Hive, Active Batch, Unix, Informatica, shell scripting, Sqoop, Presto, Impala, Sentry, Ranger, Java, Teradata, Spark, Oozie, Pig, Flume, Autosys, TWS, DB2, MF, Greenplum, MySQL, TIBCO Spotfire, LDAP, Kerberos, CA certificates, SSL, etc.
Professional Achievements:
• Received the Innovator Award for driving an innovative solution for Solution Accelerator.
• Received the Star Team Award for our contribution to the Eli Lilly project.
• Received the 'On the Spot' award twice for outstanding performance and single-handed achievements on Asurion at Synechron Technologies.
• Received an 'A' band consistently for two years in the annual appraisal process for my performance at Morgan Stanley in TCS.
• Received many client appreciations for on-time delivery in the project.
• Winner of the IIT Bombay zonal round and secured 5th position in the final round of Grid Master Robot.
• Participated in the IIT Kharagpur Line Follower Robot competition.
• Secured 2nd position in Line Follower at a city-level technical fest.
Education:
Bachelor of Engineering - EC
Professional Experience:
AWS Architect — Migration Accelerator [Synechron] — From Oct 2020 till Present
Migration Accelerator is a one-of-a-kind platform enabling users/customers to migrate big data applications to public clouds such as AWS and Azure. Our accelerator supports different aspects of migration, such as Data, Workload, Metastore, Security/Governance, and Orchestration migration.
Accomplishments include:
• Architecture design for each section to ensure a smooth migration, considering the big data scenario.
• Played a key role in designing the pipeline and its components.
• Integrated all the different divisions into a web UI to ensure one-click operation.
AWS Architect — Solution Accelerator [Synechron] — Feb 2020 till May 2020
For a banking-domain use case, we needed to build a platform sufficient to ingest, transform, train, and deploy models to predict outcomes. The Solution Accelerator platform was created with complete DataOps and MLOps capabilities, to be used for the execution and automation of all data pipelines and ML flows.
Accomplishments include:
• Architecture design to implement the data platform, DataOps, and MLOps. Incorporated newly launched SageMaker features such as Autopilot, Notebook Exp, Experiments, etc.
• Established auto model creation and deployment functionality to enable MLOps (see the sketch after this list).
• Enabled a one-click flow for dataset upload, transformation, prediction, and visualization, provisioned with the layman customer's needs in mind.
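The auto model creation above maps naturally onto SageMaker Autopilot. A minimal sketch of that idea via Boto3 follows; the job name, S3 paths, role ARN, and target column are hypothetical placeholders, not the platform's actual configuration.

    import boto3

    sm = boto3.client("sagemaker")

    # Launch an Autopilot job that explores candidate models for a tabular dataset.
    sm.create_auto_ml_job(
        AutoMLJobName="solution-accelerator-demo",  # hypothetical name
        InputDataConfig=[{
            "DataSource": {"S3DataSource": {
                "S3DataType": "S3Prefix",
                "S3Uri": "s3://accelerator-data/train/",  # hypothetical path
            }},
            "TargetAttributeName": "outcome",  # hypothetical label column
        }],
        OutputDataConfig={"S3OutputPath": "s3://accelerator-data/models/"},
        RoleArn="arn:aws:iam::123456789012:role/SageMakerRole",  # hypothetical
    )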
Senior Data Engineer — Eli Lilly DDR [Synechron] — July 2019 till December 2020
Eli Lilly is one of the healthcare giants. They came up with an idea to execute clinical trials for different studies (e.g., migraine, diabetes, COVID-19). Our main objective was to capture a patient's sensor details, such as heart rate, accelerometer, gyroscope, and migraine readings, and send them over streams to the DDR admins and data scientists for monitoring and study purposes.
Accomplishments include:
• Designed a data pipeline to ingest and process clinical trial study data arriving in both batch and streaming form.
• Created a streaming flow sending data to Kinesis and displaying it in Kibana.
• Enabled data analysts' and data scientists' access to cleansed study data by creating a Python utility with which data scientists can access S3 data from on-premise.
• Created a data generator that sends dummy data into Kinesis and can upload one or multiple files into S3 from on-premise (a sketch follows below).
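A minimal sketch of such a data generator, assuming hypothetical stream and bucket names and a dummy heart-rate payload; not the project's actual code.

    import json
    import random
    import time

    import boto3

    kinesis = boto3.client("kinesis")
    s3 = boto3.client("s3")

    STREAM_NAME = "ddr-sensor-stream"  # hypothetical stream name
    BUCKET = "ddr-study-landing"       # hypothetical bucket name

    def send_dummy_records(n):
        # Push n fake sensor readings into the Kinesis stream.
        for _ in range(n):
            record = {
                "patient_id": random.randint(1, 100),
                "heart_rate": random.randint(55, 120),
                "ts": time.time(),
            }
            kinesis.put_record(
                StreamName=STREAM_NAME,
                Data=json.dumps(record).encode("utf-8"),
                PartitionKey=str(record["patient_id"]),
            )

    def upload_files(paths):
        # Upload one or more local files into the S3 landing bucket.
        for path in paths:
            s3.upload_file(path, BUCKET, "incoming/" + path.split("/")[-1])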
Senior Data Engineer — Information Lifecycle Management [Synechron] — March 2018 till May 2019
When GDPR compliance was made mandatory in the European countries, no firm was allowed to keep data older than 7 years. To make our vertical compliant with GDPR standards, ILM was introduced: a framework that keeps track of objects/files uploaded to AWS S3. Each and every object on S3 was assigned an archival and purge deadline; with this framework, objects were automatically archived and purged, and lineage was maintained for all application objects (a sketch of the core idea follows the list below).
Accomplishments include:
• Designed the architecture to make the process interactive, flexible, and secure.
• As the framework dealt with every single unit, established a failure-handling framework and prepared it for disaster recovery.
• Created a single point of logging to maintain the lineage of each object's lifecycle.
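A minimal sketch of the archive-then-purge idea using a native S3 lifecycle rule; the real ILM framework tracked per-object deadlines, so the bucket name, prefix, and day counts here are hypothetical placeholders.

    import boto3

    s3 = boto3.client("s3")

    s3.put_bucket_lifecycle_configuration(
        Bucket="ilm-governed-bucket",  # hypothetical bucket name
        LifecycleConfiguration={"Rules": [{
            "ID": "archive-then-purge",
            "Filter": {"Prefix": "app-data/"},  # hypothetical prefix
            "Status": "Enabled",
            # Archive to Glacier after roughly one year...
            "Transitions": [{"Days": 365, "StorageClass": "GLACIER"}],
            # ...and purge before the 7-year retention limit.
            "Expiration": {"Days": 2555},
        }]},
    )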
Data Engineer — ATLAS Redshift MCL [Synechron] — Nov 2017 till Feb 2018
Data Platform: Asurion jobs run in multiple zones/servers across the world. Our main objective was to ensure that the data flow is end-to-end and accurate. We built a framework that automatically reloads the data to our target even after failure and provides all possible logging. The same has been used throughout the horizontal.
Accomplishments include:
• It was a team effort, wherein my contribution was sending data from multiple sources to AWS.
• Worked extensively with AWS DynamoDB.
• Incorporated asyncio functionality to enable concurrent processing in Python (see the sketch below).
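A minimal sketch of the asyncio pattern, with a hypothetical send_to_aws coroutine standing in for the real upload: the sends are fanned out concurrently on the event loop instead of running one at a time.

    import asyncio

    async def send_to_aws(source):
        # Placeholder for a real upload; the sleep simulates network I/O.
        await asyncio.sleep(1)
        return source + ": done"

    async def main():
        sources = ["teradata", "db2", "mysql", "greenplum"]
        # gather() awaits all coroutines concurrently.
        results = await asyncio.gather(*(send_to_aws(s) for s in sources))
        print(results)

    asyncio.run(main())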
Data Engineer — DIH-RAR Morgan Stanley [TCS] — Sept 2015 till Oct 2017
Morgan Stanley is one of the banking-domain leaders. To overcome disasters and make the applications more security compliant, the Wealth Management group of MS came up with the idea of RAR (Ra-Remediation). The objective was to make the applications more parameterized by removing hardcodings, revamping the traditional architectures, etc.
We also created a single storage zone to make sure that all the data has a single-stop shop, using the Data Integration Hub framework.
Datalake Integration Hub: data ingestion of Teradata/DB2/MF/GP/MySQL wealth management applications into Hadoop. The main objective of the project was to keep processing in the same zone and make Hadoop the dumping/analysis ground for all the WM applications, so that the business is positively impacted.
Accomplishments include:
• As part of the RAR framework, worked on remediating multiple applications, which included Unix shell scripts, Python, Java files, Informatica, etc.
• Worked on the integration of various applications: RNC, PADT, NBA, and Advisory.