Kris Ram

Data Engineering, Data Migration, Data Warehouse, Azure & AWS
Availability:
Hourly ($/hour)
Location:
Harrison, ACT, Australia
Experience:
15 years
SUMMARY:
- 15+ years of experience as a Data Architect, Data Analyst and Data Modeler with a solid understanding of business requirements gathering, business process mapping, evaluating data sources, data mapping, data profiling, the Hadoop ecosystem, AWS, data analytics, data warehousing and ETL.
- Experienced in dimensional data modeling, star/snowflake schemas, and fact and dimension tables.
- Solid experience developing conceptual, logical and physical models for Online Transaction Processing and Online Analytical Processing (OLTP & OLAP), including with PowerDesigner.
- Experienced in data modeling with NoSQL databases such as MongoDB (document model) and with AWS Redshift.
- Excellent experience troubleshooting SQL queries, ETL jobs, and data warehouse/data mart/data store models.
- Experienced with distributed data warehousing and data mining systems using one or more Big Data/NoSQL technologies (Hadoop, Hive, HBase, Pig, Cassandra, MongoDB).
- Hands-on modeling experience with ERwin, ER/Studio and MS Visio in both forward and reverse engineering, and skilled in data analysis using SQL on Oracle, MS SQL Server, Netezza, DB2 and Teradata.
- Well versed in normalization/denormalization techniques for optimum performance in relational and dimensional database environments.
- Experience with Business Intelligence and Enterprise Data Warehouse (EDW) tools including SSAS, Pentaho, Cognos, OBIEE, QlikView, Greenplum and Amazon Redshift (AWS).
- Good experience with and understanding of Teradata SQL Assistant, Teradata Administrator and data load/export utilities such as BTEQ, FastLoad, MultiLoad and FastExport.
- Hands-on experience with data architecture and ETL architecture subsystems and patterns, including Change Data Capture, Slowly Changing Dimensions, data cleansing, auditing and validation.
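The dimensional modeling experience above (star schemas, fact and dimension tables, OLAP roll-ups) can be illustrated with a minimal sketch. The tables, column names and figures below are made-up illustrative data, not taken from any actual project:

```python
# Minimal star-schema sketch: one fact table with foreign keys into two
# dimension tables, plus a typical OLAP roll-up along dimension attributes.

dim_product = {  # dimension: product_key -> descriptive attributes
    1: {"name": "Widget", "category": "Hardware"},
    2: {"name": "Gadget", "category": "Hardware"},
    3: {"name": "Manual", "category": "Docs"},
}

dim_date = {  # dimension: date_key -> calendar attributes
    20240101: {"year": 2024, "quarter": "Q1"},
    20240401: {"year": 2024, "quarter": "Q2"},
}

fact_sales = [  # fact rows hold only foreign keys and additive measures
    {"product_key": 1, "date_key": 20240101, "amount": 100.0},
    {"product_key": 2, "date_key": 20240101, "amount": 50.0},
    {"product_key": 1, "date_key": 20240401, "amount": 75.0},
    {"product_key": 3, "date_key": 20240401, "amount": 20.0},
]

def sales_by_category_and_quarter(facts, products, dates):
    """Roll up the fact measure by two dimension attributes."""
    totals = {}
    for row in facts:
        category = products[row["product_key"]]["category"]
        quarter = dates[row["date_key"]]["quarter"]
        key = (category, quarter)
        totals[key] = totals.get(key, 0.0) + row["amount"]
    return totals
```

Keeping measures in the fact table and descriptive attributes in the dimensions is what makes this kind of slice-and-dice aggregation cheap, which is the core design choice behind star and snowflake schemas.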
- Solid experience with data governance, data stewardship and data quality concepts and implementations; expertise in performing User Acceptance Testing (UAT) and conducting end-user training sessions.
- Strong background in data modeling tools: ERwin, ER/Studio and PowerDesigner.
- Experience integrating various relational and non-relational sources such as DB2, Teradata, Oracle, Netezza, SQL Server and NoSQL databases.
- Extensive knowledge and experience producing tables, reports, graphs and listings, and handling large databases to perform complex data manipulations.
- Experienced in data transformation, source-to-target data mapping and data cleansing procedures using Informatica PowerCenter, Talend and Pentaho.
- Skilled in data warehouse loads, determining hierarchies, and building logic to handle Slowly Changing Dimensions.
- Strong experience in normalization (1NF, 2NF, 3NF and BCNF) and denormalization techniques for effective and optimum performance in OLTP and OLAP environments.
- Expertise in Informatica PowerCenter and Informatica Data Quality (IDQ).
- Expertise in SQL Server Analysis Services (SSAS) and SQL Server Reporting Services (SSRS).
- Trained end users, prepared related documentation (requirements, training materials, process and data flows, use cases, functional design, etc.), and addressed critical questions from the user base involving the CRM.
- Good experience with Access queries and Excel functions: VLOOKUP, formulas, pivot tables, etc.
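The Slowly Changing Dimension work mentioned above can be sketched with a minimal Type 2 merge: rather than updating a dimension row in place, the current version is expired and a new versioned row is inserted. The record shape (customer_id, city, valid_from, valid_to, is_current) is illustrative, not from any specific project:

```python
from datetime import date

def apply_scd2(dimension, incoming, load_date):
    """Merge one source record into an SCD Type 2 dimension (list of dicts)."""
    for row in dimension:
        if row["customer_id"] == incoming["customer_id"] and row["is_current"]:
            if row["city"] == incoming["city"]:
                return dimension  # attribute unchanged: keep current version
            row["is_current"] = False   # expire the old version
            row["valid_to"] = load_date
            break
    dimension.append({                  # insert the new current version
        "customer_id": incoming["customer_id"],
        "city": incoming["city"],
        "valid_from": load_date,
        "valid_to": None,               # open-ended validity
        "is_current": True,
    })
    return dimension

dim = []
apply_scd2(dim, {"customer_id": 7, "city": "Sydney"}, date(2024, 1, 1))
apply_scd2(dim, {"customer_id": 7, "city": "Canberra"}, date(2024, 6, 1))
# dim now holds two rows for customer 7: an expired Sydney version and
# a current Canberra version.
```

Retaining expired rows is what lets fact rows loaded in Q1 keep joining to the Sydney version while later facts join to the Canberra one, preserving history for point-in-time reporting.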
TECHNICAL SKILLS:
Data Warehousing Tools: Business Objects, Cognos, MicroStrategy, Tableau, Informatica, Talend, Pentaho, DataStage
Hadoop Ecosystem: Hadoop framework, HiveQL, Pig, HBase, MongoDB, Sqoop, Spark, Scala
Modeling Tools: ERwin r9.6/r9.5/r9.1, Aqua Data Studio, Embarcadero ER/Studio, MS Visio, Sybase PowerDesigner
Databases: Big Data/Cloudera Hadoop, MongoDB, DynamoDB, Neptune, HDFS, Hive, Aster Data, Teradata, Oracle, DB2, AWS EMR and Redshift, Azure, MS SQL Server, Sybase, Progress, PostgreSQL, Essbase and NoSQL databases
AWS: data lake, Hadoop, Athena, Redshift, Redshift Spectrum, Elasticsearch, RDS, S3, Glacier, Glacier Deep Archive, Lambda, DynamoDB, Cloud Basic, DMS, Google BigQuery, Snowflake, ELK stack, EMR using JSON/Ruby, Glue, Kinesis, Airflow, Data Pipeline, SQS, SNS, MuleSoft, Kibana, QuickSight and other BI tools
Methodologies: RAD, JAD, RUP, UML, System Development Life Cycle (SDLC), Waterfall model
ETL Tools: Informatica, Cloud Basic, AWS SQS, MuleSoft, WhereScape RED, Talend, Matillion, SSIS, IBM InfoSphere (DataStage, Business Glossary & QualityStage), customised ETL for QAD, SOA, API and Cognos Data Manager
Reporting Tools: Crystal Reports XI, Business Intelligence, SSRS, Business Objects 5.x/6.x, Tableau