Job Description
About Walker Digital Table Systems (WDTS)
Walker Digital Table Systems (WDTS) is the global leader in smart table gaming technology. Our patented PJM3.0™ RFID platform, combined with real-time software, analytics, and AI-driven capabilities, transforms live casino table games into secure, data-rich environments. Our products — Perfect Pay™, Perfect Table™, and Perfect Cage™ — are deployed in high‑volume, regulated casino environments worldwide. These systems operate in mission‑critical conditions where accuracy, reliability, and release confidence are non‑negotiable.
Role Purpose
The Director of Quality Engineering is a senior technology leader accountable for product quality, release confidence, and quality‑related risk across the WDTS platform. This role exists to address systemic quality challenges and scale quality through platform evolution and AI‑driven automation, not to manage test execution alone. The Director defines and leads a modern Quality Engineering operating model that:
scales with architectural change,
supports complex release types (major releases, feature sets, patches, hotfixes), and
treats quality as a business and operational risk, not merely a testing activity.
A core mandate of this role is to transform Quality Engineering into an AI‑first discipline, where intelligent agents generate the majority of automated test cases and QE leadership focuses on risk modeling, test intent, governance, and release decisioning.
The Director partners closely with Architecture, Engineering, Product, DevOps, PMO, and Support to ensure quality readiness is designed in, measured consistently, and clearly communicated at release decision points.
Key Responsibilities
1. Quality Strategy, Operating Model & Governance
Define and own the Quality Engineering operating model across all release types, ensuring it scales with architectural change and platform evolution.
Establish a shared, defensible definition of quality readiness, incorporating change impact, regression risk, defect exposure, environment readiness, and architectural dependencies.
Shift quality from reactive testing to a risk‑based, preventative discipline, treating quality as a business and operational risk.
What success looks like:
Quality decisions are predictable, risk‑informed, and consistently understood across the organization.
2. AI‑First Quality Engineering Transformation
Lead the transformation of QE into an AI‑first discipline, with agent‑driven test generation producing the majority of new automated coverage.
Define where AI is trusted end‑to‑end versus where human oversight and explicit quality gates remain required.
Establish governance standards for AI‑generated tests, including framework conformity, correctness, coverage, stability, and maintainability.
Champion agentic testing workflows (planning, generation, execution, healing) as a first‑class QE capability.
What success looks like:
QE effort shifts from test execution to test intent, risk analysis, and quality governance, while automation scales sustainably.
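For illustration only, the agentic workflow named above (planning, generation, execution, healing) can be sketched as a simple loop. Every function name and data shape below is hypothetical; this is not a description of any WDTS system, just a minimal skeleton of the plan → generate → execute → heal cycle:

```python
# Hypothetical sketch of an agentic testing cycle:
# plan -> generate -> execute -> heal. Nothing here is a real
# framework API; it only shows the shape of the workflow.

def plan(change_summary):
    """Derive test intents from a description of what changed."""
    return [f"verify {area}" for area in change_summary]

def generate(intent):
    """Stand-in for agent-driven test generation: map an intent to a test record."""
    return {"intent": intent, "passed": None}

def execute(test):
    """Run the test; in this sketch every generated test trivially passes."""
    test["passed"] = True
    return test

def heal(test):
    """Repair and re-run a failing test (a no-op re-execution in this sketch)."""
    return execute(test)

def agentic_cycle(change_summary):
    """One full pass: plan intents, generate tests, execute, heal failures."""
    results = []
    for intent in plan(change_summary):
        test = execute(generate(intent))
        if not test["passed"]:
            test = heal(test)
        results.append(test)
    return results

results = agentic_cycle(["payout calculation", "RFID read path"])
print([t["intent"] for t in results])
```

In a real deployment each stage would be backed by an agent or service; the point of the sketch is that QE leadership governs the intents and gates, while generation and healing are automated.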
3. Test Architecture, DevOps & Continuous Quality
Define and oversee end‑to‑end test strategies spanning embedded RFID hardware, firmware, real‑time backend systems, data integrity, UI/UX, and cross‑system integrations.
Ensure test architecture is stable, scalable, and agent‑friendly, with automation treated as a platform capability rather than a coverage metric.
Embed quality into CI/CD pipelines, defining release‑readiness standards, quality KPIs, and regression frameworks.
Own production quality feedback loops, including observability, post‑release metrics, incident RCA, and continuous improvement.
4. Release Readiness, Compliance & Executive Leadership
Act as the single accountable leader for quality readiness at release time, ensuring decisions are driven by risk and impact rather than test counts.
Identify and escalate quality risks early, clearly articulating business, customer, and operational impact.
Partner with Compliance and Regulatory teams to ensure readiness for gaming lab certifications and regional regulatory requirements, with full traceability from requirements to validation.
Lead and develop the Quality Engineering organization as a strategic partner, representing quality and release risk credibly at senior leadership and executive forums.
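As an illustration of "risk and impact rather than test counts," a readiness decision can be framed as a weighted score over the risk factors named in the quality-readiness definition (change impact, regression risk, defect exposure, environment readiness). The weights and threshold below are hypothetical, not WDTS policy:

```python
# Hypothetical risk-weighted release-readiness check. Factor names
# come from the quality-readiness definition in this posting; the
# weights and threshold are illustrative assumptions only.

RISK_WEIGHTS = {
    "change_impact": 0.3,
    "regression_risk": 0.3,
    "defect_exposure": 0.25,
    "environment_readiness": 0.15,
}

def release_risk(factors):
    """Weighted risk score in [0, 1]; each factor is scored 0 (low) to 1 (high)."""
    return sum(RISK_WEIGHTS[name] * score for name, score in factors.items())

def ready_to_release(factors, threshold=0.4):
    """Readiness is a risk decision, not a count of passing tests."""
    return release_risk(factors) <= threshold

factors = {
    "change_impact": 0.2,
    "regression_risk": 0.3,
    "defect_exposure": 0.1,
    "environment_readiness": 0.2,
}
print(release_risk(factors), ready_to_release(factors))
```

The design point is that the gate is defensible and explainable: each factor, weight, and threshold can be reviewed and escalated, which is what makes the release decision communicable at executive forums.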
What This Role Is Not:
Not a manual testing manager role
Not a tooling‑only automation lead
Not a role measured by test counts alone
This role is accountable for:
quality outcomes,
AI‑first QE transformation,
automation as an engineering system,
and release confidence.
Ideal Candidate Profile
Required Experience
10+ years in Quality Engineering, with 5+ years in senior leadership roles.
Proven experience leading quality in complex, distributed, high‑reliability systems.
Strong understanding of:
release management and regression strategy
automation frameworks and CI/CD integration
architecture‑driven quality risk
Experience operating credibly with architects, senior engineers, and executives.
Strongly Preferred
Experience operationalizing AI‑assisted or agent‑driven test generation beyond experimentation.
Experience defining governance and evaluation frameworks for AI outputs.
Background in regulated or high‑stakes environments (gaming, fintech, medical, aerospace, defense).
Familiarity with embedded systems, real‑time transaction platforms, or hybrid edge environments.
Why This Role Exists
WDTS is evolving its platform architecture while supporting complex, regulated deployments worldwide. This role ensures that quality scales with that change, leveraging AI to improve coverage and speed without increasing risk, and that release decisions are made with clarity, consistency, and executive confidence.