Reshmi Kakkattuchalil
Cloud-Certified Hadoop, Java & Shell Script Developer / Business Analyst
Summary:
3 years of experience in developing technical solutions to business problems: defining, analyzing and documenting requirements, and involvement in all phases of development, maintenance and support of applications, which has given me the opportunity to work with different programming languages and tools.
Professional Experience:
Company: Tata Consultancy Services, Kochi
Role: Assistant Systems Engineer
Duration: 3 years
Experience Summary:
Over 3 years of experience in the Big Data team, executing projects and client PoCs in Big Data & Hadoop technologies such as MapReduce, Spark, Hive, Pig, Impala, Sqoop, Oozie, Flume, Solr and YARN.
Making technical decisions within the Business Intelligence arena and ensuring solutions are delivered to quality, on time and in line with client requirements.
Providing recommendations based on business intelligence analysis.
Strong technical knowledge of Hadoop solution development; involved in requirements analysis, design, build and development.
Good understanding of SDLC.
Working knowledge of Cloudera 5 and Hortonworks 2.2.
Experience working on Linux (CentOS), Windows XP and Windows 7 operating systems.
Completed Azure Cloud Developer and MongoDB certifications.
Confident, responsible, highly proactive and a quick learner.
Possess good interpersonal skills and am very goal-oriented.
Able to handle development and support of any application.
Very good team player, keenly interested in learning new technologies.
Interacted with European and American clients on a regular basis through WebEx and Lync.
Technical Skills:
Programming Languages/Skills: Core Java, Shell Scripting, MapReduce, Spark, Hive, Oozie, Apache Sqoop, Solr, Flume, YARN
Scripting Languages: Unix Shell, JavaScript
Web Technologies: JSP, Servlets, SQL; working knowledge of Hibernate and Struts
Databases: Hive, Oracle 10g with SQL, MySQL
Tools: Tableau, Control-M, R statistics, Eclipse IDE for Java, SQL Developer, SQL Workbench
Servers: Apache Tomcat 6
Clusters: Cloudera distribution for Hadoop, Hortonworks Data Platform, Azure Cloud Services
Process/Methodologies: Agile, Waterfall
Awards and Recognition:
Below are my major achievements during the last one year:
Received the Nielsen Buy CTO award for Performance Excellence for outstanding contribution to NDX (Nielsen Data Exchange).
Selected as the best performer in the Analytics Platform & NDX.
Received individual recognition from the client for outstanding contribution to NDX.
Our team was selected as the star team for successfully onboarding 98 countries to production.
Awarded the "On the Spot" award for leading CRs from offshore.
Won first prize in the TCS-Nielsen Technology Studio Hadoop quiz, which was held globally.
Key Projects:
1. Nielsen ETL for Analytics
Description:
Nielsen is a leading global information and measurement company that enables companies to understand consumers and consumer behavior. Nielsen measures and monitors what consumers watch (programming, advertising) and what consumers buy (categories, brands, products) on a global and local basis. 
Client: Nielsen, US
Duration: 22 months
Role: BA/Developer and Team Lead
Here we mainly handle the ETL part of the analytics process, wherein we have developed a framework that pulls data from different sources and makes it available in the cluster's HDFS for further processing and analytics. As a whole, our team has onboarded 97 countries onto our system.
Starting as a developer and then moving into the role of Business Analyst has helped me understand the product/system at the root level.
Worked closely with Factory SMEs to understand the data and to provide business solutions.
Played a major role in making technical decisions on various critical issues.
Responsible for capturing, aggregating and transforming vast amounts of complex data into meaningful reports.
Documented the analyzed business requirements and proposed solutions.
Developed MapReduce code, shell scripts and HQL to pull data and populate Hive tables.
Technologies Used:
MapReduce, Spark, Oozie, Hive, Java and shell script
2. Web Analytics Reporting Tool
Description:
Web data analytics was earlier done with a paid Adobe tool. The customer wanted to migrate this to the Hadoop platform. Reporting metrics were calculated in Hadoop and the reports were generated in Tableau. Our tool generated 187 reports weekly and also gave the customer the added advantage of creating new reports as per their needs.
Client: The Hartford Financial Services Group, US
Duration: 7 months
Role: Lead Developer
We receive web data (both event- and interaction-level data) on a weekly basis, which is ingested into our cluster. On top of this data, different normalized tables are created, which allow full slicing and dicing of the data. I was involved in all phases of development, from creating the data model to scheduling the whole process.
Technologies Used:
MapReduce, Oozie, Hive, Java, Tableau and shell script
3. TCL ETL Migration
Description:
As the first part of call log analysis, log data (CDRs) was classified into complete and incomplete on the basis of certain columns. A complete transaction is passed directly to the Enrichment phase. On incomplete CDRs, correlation logic is applied to make the transaction complete: according to certain conditions, the Request and Response CDRs are merged into a complete transaction, which is then passed to the Enrichment phase. This matching process continues for a threshold time; if a match is found within the threshold time limit, the Request and Response are merged into a complete transaction and passed to Enrichment.
Client: Tata Communications Ltd
Duration: 2 months
Role: Developer
Involved in requirements gathering and analysis, understanding and designing the architecture, and preparing the high-level and low-level design documents. Stored data in HDFS using Flume, processed data using MapReduce, and scheduled the workflow using Oozie.
4. Prudential KMT CSR Tool
Description:
The tool was developed for call center representatives of Prudential.
The tool indexes data from multiple sources such as MySQL, .html and .doc files. The indexing is done through Solr. On the indexed data, the representative performs a search with the plan ID and intent, and the related block of information is displayed in a UI, thus saving search time and cost. This was done as a PoC and delivered to the customer.
Client: Prudential Financial, Inc., US
Duration: 2 months
Role: Developer
Analyzed customer data loaded from different sources.
Involved in design and development.
Involved in developing an interactive UI.
Technologies Used:
Solr, Java, JSP, Servlets, MySQL
Education Details:
Particulars | School/College | Score | Year
BTech (IC) | NSS College of Engineering, Palakkad, Kerala | 85% | 2013
12th HSC | Presentation Higher Secondary School, Kozhikode, Kerala | 89% | 2009
10th SSLC | Providence Girls Higher Secondary School, Kozhikode, Kerala | 100% | 2007
Contact Details:
Name:  Reshmi Kakkattuchalil.
Status:  Married
Email: -
Passport:  P-
Cell Phone:  - 
Address: Kerala, India