
** Symantec, Inc (4/2009 - 12/2009)

SQA Analyst (Software Quality Assurance Analyst)


*Testing Methods: Black Box, Gray Box
*Testing Types: Functional, Acceptance, Smoke Test, Compatibility, Alpha, Beta, Regression, Ad-hoc
*Non-Functional Testing: Usability, Internationalization/Localization, Security Testing
*Web Testing: Login Authentication, Browser Attack Handling
*Development Process: Custom Waterfall
*Defect DB: eTrack
*Scripts DB: Toro, a custom DB that supported on-the-fly editing, test class categorization, severity grouping, and pass/fail reporting with dashboard integration of metrics
*Tools: Ghost, VMware, Perforce, MS Office, Wiki
*Environment: Microsoft Windows 98, XP, Vista, Notebook, Media Center, Windows 7
    I applied a functional testing methodology in a black box and gray box environment on a day-to-day basis for the security-rich products N360, N360 Netbook Edition, Norton AntiVirus, and their OEM variations. Non-functional testing was applied throughout the testing phases. Using web browsers (IE, Firefox), I tested the web features of the product for credential authentication and attack handling. I tested in a Windows environment using Ghost and VMware, documenting my defects in eTrack and maintaining my scripts in the Toro DB. Alpha testing consisted of smoke testing the build for feature integration and usability enhancements.
    A typical beta testing cycle included: receive a new QA build, verify prior defects, continue script progression and defect submission, then receive the next QA build. Prior to test completion, I would have 100% script completion, all defects verified, ad-hoc and edge case tests documented, and as many enhancement requests submitted as possible. At RM time, all defects went through regression prior to production release. Production release testing included end-to-end smoke testing and sign-off. Additionally, I participated in GOLD CD acceptance testing and external beta tester defect regression. Post-production tasks included script maintenance, review and creation of scripts for new project requirements, and PC reimaging.
*******************************************************************
Contract Work History
** Symantec 11/25/08 - 04/05/09 SQA (SaiPeople Solutions Contract)
Software Quality Analyst
*Testing Methods: Black Box, Gray Box
*Testing Types: Functional, Acceptance, Smoke Test, Compatibility, Alpha, Beta, Regression, Ad-hoc
*Non-Functional Testing: Usability, Internationalization/Localization, Security Testing
*Web Testing: Browser Login Authentication, Browser Attack Handling
*Development Process: Custom Waterfall
*Defect DB: eTrack
*Scripts DB: Toro, a custom DB that supported on-the-fly editing, test class categorization, severity grouping, and pass/fail reporting with dashboard integration of metrics
*Tools: Ghost, VMware, Perforce, MS Office, Wiki
*Environment: Microsoft Windows 98, XP, Vista, Notebook, Media Center, Windows 7
    As a contract tester, my main focus was functional and smoke testing of the product. A typical testing cycle included: receive a new QA build, verify prior defects, continue script progression and defect submission, then receive the next QA build. Prior to test completion, I would have 100% script completion, all defects verified, ad-hoc and edge case tests documented, and as many enhancement requests submitted as possible. At RM time, all defects went through regression prior to production release. Production release testing included end-to-end smoke testing and sign-off.
** YellowPages.com, an AT&T Company 2/11/08 - 3/11/08 SQA (TenTek Contract)
Quality Experience Analyst
*Testing Method: Black Box
*Testing Types: Functional, Ad-hoc
*Non Functional Testing: Usability
*Web Testing: Browser Login Authentication
*Development Process: Custom Waterfall
*Defect DB: Jira
*Scripts DB: Custom DB
*Tools: IE, Firefox, MS Office, Wiki
*Environment: Microsoft Windows Vista
    This position utilized both functional and usability methodologies to form the "Customer Experience QA" skill set. Functional black box testing included variable field input and validation of output data. Verification of output data relevance was a primary testing point, making sure data was categorized and listed correctly. For example, a customer searching for Mother's Day flowers would not see a listing for gardening tools unless the enhanced feature was selected. A typical testing day included: notification that the new build was installed, regress previous defects, submit new defects, and complete script progression and ad-hoc testing as time permitted.
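A relevance check like the Mother's Day example above can be sketched as a simple automated assertion. This is a hypothetical illustration only; the listing data, the `search` function, and the "enhanced" flag are invented stand-ins, not the actual YellowPages.com test scripts.

```python
# Hypothetical sketch of a search-relevance check: a search in one
# category should not surface listings from a related category unless
# the enhanced cross-category feature is selected.

LISTINGS = [
    {"name": "Rose Bouquet Shop", "category": "florist"},
    {"name": "Garden Tools Depot", "category": "gardening"},
]

def search(query_category, enhanced=False):
    """Return listings in the queried category; include related
    categories only when the enhanced feature is enabled."""
    if enhanced:
        return list(LISTINGS)
    return [l for l in LISTINGS if l["category"] == query_category]

# Default search: only florists appear.
results = search("florist")
assert all(l["category"] == "florist" for l in results)

# With the enhanced feature selected, related listings may appear too.
enhanced_results = search("florist", enhanced=True)
assert any(l["category"] == "gardening" for l in enhanced_results)
```

A test script along these lines turns the manual verification described above into a repeatable pass/fail check.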
*******************************************************************
** EarthLink, Inc (3/2004 - 9/2007)
SQA Analyst (Software Quality Assurance Analyst)
*Testing Methods: Black Box, Gray Box, White Box (with a development-created harness)
*Testing Types: Functional, Acceptance, Smoke Test, Compatibility, System, Boundary, Alpha, Beta, Regression, Ad-hoc
*Non-Functional Testing: Usability, Security Testing
*Mobile/Wireless Testing: Treo 600/650, BlackBerry RIM 7250, Symbian, handset simulators (Palm OS, BBOS, Symbian), WAP 1.1, CDMA Sierra Wireless AirCard 555, Wi-Fi (Boingo)
*Web Testing: XML API, Mobile PSP Portal, HTML, Login Authentication, Authentication token handling
*Automation Testing: None
*Voice Testing: VoIP, SIP/Proxy, Home Networking, Web Integration
*Development Process: Custom Waterfall and V-model
*Defect DB: TestTrack Pro, Jira
*Scripts DB: MS Excel and a customized FileMaker Pro DB
*Tools: Ghost, VMware, MS SharePoint, MS Office, Wiki
*Environment: Microsoft Windows 98, XP, NT, 2000, Vista, Notebook, Media Center, Windows 7
*Additional Know-How: DSL, Satellite, Cable, Linux CLI, Home Router Configurations, Personal Networking, 5+ years of Tier 1/2/3 technical and customer support
    My main projects included the Mobile PSP, Mobile Total Access, desktop Total Access, and the voice/chat client MindSpring. The Mobile PSP work included testing the interface, authentication servers, content servers, and the vendor API service. Custom HTML code and Perl scripts were created to test the XML from the vendor API service. Non-functional testing was used to verify the authentication process and token passing that supported "one-login" across all the integrated components. Handheld simulators were heavily used to speed up defect detection. The Total Access desktop version included testing the connection portion of the product with DSL, Wi-Fi (Boingo), cable, satellite, and personal home networking. Additional testing covered email, FTP, and web-based integrated components such as password wallet and child safe.
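The XML checks described above can be sketched in outline. The original scripts were custom HTML and Perl; the following is a hypothetical Python equivalent, and the element names, attributes, and sample response are invented for illustration, not the actual vendor API.

```python
# Hypothetical sketch of validating an XML response from a vendor API,
# in the spirit of the checks described above: confirm overall status,
# the presence of an auth token, and well-formed content entries.
# All field names here are made up for illustration.
import xml.etree.ElementTree as ET

SAMPLE_RESPONSE = """\
<response status="ok">
  <auth token="abc123"/>
  <content id="42" type="news"/>
</response>"""

def validate_response(xml_text):
    """Parse the XML and return a list of validation errors
    (empty list means the response passed every check)."""
    root = ET.fromstring(xml_text)
    errors = []
    if root.get("status") != "ok":
        errors.append("bad status")
    auth = root.find("auth")
    if auth is None or not auth.get("token"):
        errors.append("missing auth token")
    for item in root.findall("content"):
        if not item.get("id", "").isdigit():
            errors.append("content id not numeric")
    return errors

# A well-formed response produces no errors.
assert validate_response(SAMPLE_RESPONSE) == []
```

Returning a list of errors rather than failing on the first problem lets one test run report every defect in a response, which suits the defect-submission workflow described above.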
    Acceptance testing was used on a Mobile TA build that was embedded in device firmware. Integration testing verified the seamless interaction between mobile, desktop, and web environments. Team meetings and bug scrubs were held weekly, promoting good project handling. Leading the bug scrubs consisted of discussing currently open defects by severity, change requests, and defect postponements. Leading the QA effort in team meetings allowed me to present QA status and discuss additional defects or process handling. A typical beta testing cycle included: receive a new QA build, verify prior defects, continue script progression, then receive the next QA build. Prior to test completion, I would have 100% script completion, all defects verified, ad-hoc and edge case tests documented, and as many enhancement requests submitted as possible. Following a coordinated battle plan, QA worked with system administrators, project managers, development, and other QA staff to transition the project from the QA environment into production with minimal residual consequences. Once complete, a post-mortem meeting was held, and SQA presented its status and feedback.