Claudia Bamba
(240)--
OBJECTIVE:
QA Test Analyst/Software Test Engineer with 10 years of experience in manual and automated testing of web-based, client/server, and mobile applications across various industries, projects, and environments. Extensive experience in requirements-driven and Agile testing, User Acceptance Testing, production shakeouts, developing Requirement Traceability Matrices (RTM), defect tracking, and test result analysis.
Seeking a challenging position that will enable me to apply my experience and IT skills in systems quality assurance (SQA) to deliver quality software.
METHODOLOGIES, FRAMEWORKS & CONCEPTS
Capability Maturity Model Integration (CMMI) Level 3, Service-Oriented Architecture (SOA)
Software Development Lifecycle (SDLC), Regression Testing, Integration Testing
Unit Testing, Functional Testing, Acceptance Testing, Load Testing
Performance Testing, Stress Testing, Alpha Testing, Beta Testing
Change Control Process, Change Control Board Governance
Rational Unified Process (RUP), Agile Testing Methodology
SOFTWARE/HARDWARE SKILLS
Testing Tools: HP Mercury Quality Center, Rational Functional Tester, Rational ClearQuest, ClearCase, Team Foundation Server, GitHub, Postman, Android Studio 2.x
CM & Requirements Tools: DOORS, Serena TeamTrack, and Rational ClearCase
Office Tools: MS-Word, MS-Visio, Excel, MS-Project, PowerPoint, Outlook, MS-Access, Google Sheets, Google Docs
Operating Systems: Windows 2000/XP/Vista/7, Unix Solaris, Linux
Databases: SQL Server 2005, Oracle 10g, MS Access, MySQL 5.x
Programming Languages: Java (J2EE), Visual Basic, VB.NET, T-SQL, PL/SQL, ANSI SQL
Other Technologies: Business Objects Crystal Reports, MS-SharePoint
KEY QUALIFICATIONS
Experienced in defining and documenting functional and business requirements, requirement specification, data analysis/models, and design documents.
Expertise in writing system integration test plans and test cases, and in generating software defect reports using ClearQuest, ClearCase, and Team Foundation Server.
Experience in managing and defining configuration management processes, change control board and governance.
Broad experience with testing techniques such as regression, user acceptance, compliance, positive and negative, white-box, and black-box testing.
Experienced working in CMMI Level 3 and ISO 9001 process environments.
Knowledgeable in testing Enterprise Resource Planning (ERP) and Customer Relationship Management (CRM) systems.
Strong knowledge of various SDLC methodologies (Waterfall and Agile)
Strong leadership skills and a good team player, with excellent problem-solving, technical, and oral and written communication skills.
Strong ability to adapt to new technologies quickly.
EDUCATION
Master of Science in Information Systems (Information/Data Management) - University of Maryland
Bachelor of Science in Information Systems - Salisbury University
PROFESSIONAL EXPERIENCE
InfoView Data Solutions – Highland MD
Software Test Manager – June 2015 to Present
Roles and Responsibilities:
Interact with offshore developers daily to troubleshoot and resolve issues.
Plan, monitor, and track test execution progress on a day-to-day basis.
Create and design test scenarios and test cases.
Estimate test effort and coordinate testing with testers and project managers to ensure work is consistent with development and deployment schedules.
Develop and design test plans tailored to different computer applications.
Conduct daily standup meetings for the testing team.
Perform mobile app testing and defect analysis using these testing tools: Apple Xcode, Android Studio, Microsoft Visual Studio, and Appery.io.
Report and track defects on a day-to-day basis using GitHub.
Create mobile app UI designs/mock-ups based on requirements using Appery.io and the Android Studio IDE.
Utilize MySQL Workbench to perform back-end testing against the app's front end.
Conduct web service testing using Postman (REST client) to test APIs (Application Programming Interfaces).
Perform testing on various mobile platforms (Android, iOS, Windows).
Identify and prepare test data for manual and mobile testing.
Work with clients to determine business requirements and priorities, and support BI and data warehouse (DW) strategy.
Northrop Grumman – Baltimore, MD
Center for Medicare and Medicaid Services – Fraud Prevention System
Sr. Software Tester – July 2011 to November 2014
Developed use case scenarios and test cases based on business requirements
Produced UAT (User Acceptance Testing) strategies and detailed UAT test scripts
Facilitated the UAT process, supported nominated testers, and obtained successful sign-off from key business stakeholders
Tested end-to-end business process flows
Utilized HP Quality Center to document test cases and defects discovered during the testing phase
Proposed and designed root cause analyses, retested defects, and contributed to defect management
Interacted with software developers to analyze defects found during testing
Created a Requirement Traceability Matrix (RTM) by mapping test cases to requirements in Quality Center
Created Test Case Specification documents, designed test cases, and conducted test script walkthroughs with leads
Ensured content and structure of all testing documents/artifacts were documented and maintained
Led and supported testing processes such as functional, smoke, and regression testing throughout the development cycle
Defined the testing methodology for the testing team for new projects and programs
Reviewed the solution architecture of all releases to derive the testing approach and test cases
Collaborated with QA analysts to elaborate and validate test cases
Gathered, validated, analyzed, and documented business requirements, project use cases, visual models, requirements specifications, and reports
Collaborated with customers and developers in developing system requirements and performing analysis activities
Articulated requirements and made recommendations for process and business flows
Acted as communication liaison between business representatives and the technical team
Worked with business stakeholders and Business Analyst teams to design and document dashboards and report on project metrics
Assisted with ad-hoc inquiries as required
Highland Technology Services Inc. – Gaithersburg, MD
Department of Energy (DOE) – Strategic Integrated Procurement Enterprise System (STRIPES)
Configuration Management Lead/Software Tester – April 2010 to July 2011
Performed and participated in all aspects of verifying system and functional requirements.
Performed feature and functional testing, unit testing, regression, load, and performance testing.
Designed test plans, including developing scripts to enhance regression and feature testing.
Prioritized test requirements and organized test cases according to release cycles.
Communicated testing status to the project manager and the client.
Designed, developed, and deployed automated system tests to replicate real-world scenarios and user defects.
Ensured detailed metrics were recorded and monitored to track the progress and outcomes of the testing process.
Developed analyses to identify root causes of defects while developing resolutions.
Reviewed and presented new issues to the Change Control Board for proper approval and communication through the change management process.
Conducted reviews and evaluations of current configuration management processes.
Provided ongoing management of the configuration management plan to ensure information was controlled.
Instituted naming conventions for the configuration items to be managed and controlled.
Monitored the lifecycle of configuration items, conducted meetings, and provided updates and notifications to users of all system meetings/updates.
Optimal Solutions and Technologies Inc. (OST) – Washington DC
Federal Aviation Administration (FAA) – Delphi Project
System Analyst/Software QA Tester – January 2010 to April 2010
Performed in-depth analysis of Delphi assets against Automated Inventory Tracking System transactions
Documented analyzed findings and provided supporting proof
Reviewed and approved OST work paper analyses for accuracy and submission
Participated in documentation peer reviews for the Delphi Project
Created test cases and test plans based on the project/application requirements
Federal Emergency Management Agency (FEMA) – National Flood Insurance Program Project
NFIP IT Service Desk/Tester – November 2008 to January 2010
Analyzed and tested NextGen applications (Simple and Quick Access (SQANet), F2M, EzClaims) by comparing data against the current system of record (Bureau and Statistical Agent data). As an IT service helpdesk consultant, ensured that FEMA and WYO companies received support and assisted with any inquiries they had.
Responsibilities as a Software QA/Tester included:
Worked with developers and business analysts to analyze business/functional requirements
Initiated and managed Requirement Peer Reviews based on customer implementations or changes
Participated in various meetings on enhancements and modifications of applications, following client change requests
Created test plans based on project/application requirements
Created and maintained a Requirement Traceability Matrix (RTM) to correlate test cases to requirements
Actively tested NextGen applications and Flood Financial Management web applications by performing functional testing throughout the entire software development lifecycle
Detected and tracked defects using IBM Rational Clear Quest bug tracking tool
Updated testing reports, improved testing processes, and continued to communicate effectively with developers
Assisted in training on the Verification and Validation, Requirements Development, and Configuration Management process areas of CMMI Level 3.
Utilized the Rational automation test tool to execute test scripts for Flood Financial Management (Java-based application)
Responsibilities as an IT Service Desk consultant included:
Provided support to over 200 users in person and 1,000 clients by telephone.
Logged and tracked issues into OTRS (Open-Source Ticket Request System)
Tracked the number of Service Desk calls/emails in OTRS
Played a key role in ensuring a proficient workforce and significantly reducing system downtime.
AWS Convergence Technologies Inc. – Germantown, MD
Software Test Engineer I – July 2007 to October 2008
Performed quality control testing to support WeatherBug mobile products
Performed validation of system requirements with product, development and requirements teams.
Tested mobile applications using WAP, BREW, and DeviceAnywhere.
Utilized network sniffers such as CommView to capture network traffic on the system
Created Test plans based on requirement documents
Utilized web browsers such as Internet Explorer, Firefox, and Safari to test widgets and gadgets on XP and Vista PC clients
Used IBM Rational ClearQuest and Microsoft Visio for entering bugs and enhancements and verifying issues.
Conducted back-end testing using SQL commands and CommView
Created test cases that helped detect bugs early in the life cycle on Sprint J2ME applications
Environment: .NET 2.0, IIS 6.0, Rational ClearQuest, VSS, Excel, CommView, SQL Server, SQL Management Studio, SQL Profiler, Java/J2EE, BREW App Loader, MIF editor, BREW Simulator, GIS ArcMap