Job Description
About the Role
We are looking for a Knowledge Base Operations Engineer to own the preparation, testing, deployment, and maintenance of monthly and quarterly software composition analysis (SCA) Knowledge Base updates. This role operates downstream of the data ingestion pipeline: you will consume data from PostgreSQL, run Go-based mining tools to generate data layers (snippets, licenses, copyrights, cryptography, code quality), orchestrate multi-server deployments, create backups, and deliver updates to customers via SFTP.
Responsibilities
• Execute the monthly/quarterly KB update cycle
• Run and maintain Ansible playbooks
• Write and maintain Python scripts
• Operate and maintain a Go-based mining tool
• Manage large-scale data transfers
• Test and validate KB integrity
• Create and verify MD5 hash manifests
• Manage backup rotations
• Deliver updates to customers
• Prepare release documentation
• Monitor and troubleshoot disk space, failed transfers, stale processes, and data integrity across the fleet, responding to automated alerts
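To give a flavor of the manifest step above, here is a minimal sketch of creating and verifying an MD5 hash manifest using only Python's standard library. Function names and directory layout are illustrative, not SCANOSS's actual tooling:

```python
import hashlib
from pathlib import Path

def md5_of(path: Path, chunk_size: int = 1 << 20) -> str:
    """MD5 of a file, read in chunks so multi-GB files don't fill memory."""
    h = hashlib.md5()
    with path.open("rb") as f:
        while block := f.read(chunk_size):
            h.update(block)
    return h.hexdigest()

def build_manifest(root: str) -> dict[str, str]:
    """Map each file's path (relative to root) to its MD5 hex digest."""
    base = Path(root)
    return {str(p.relative_to(base)): md5_of(p)
            for p in sorted(base.rglob("*")) if p.is_file()}

def verify_manifest(root: str, manifest: dict[str, str]) -> list[str]:
    """Return the relative paths whose current digest no longer matches."""
    base = Path(root)
    return [rel for rel, digest in manifest.items()
            if md5_of(base / rel) != digest]
```

In practice the same check is typically done with `md5sum` / `md5sum -c` on the servers; the Python version above is just the equivalent logic in script form.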
Required Skills
Linux & Shell Scripting (Advanced)
• Daily SSH work across a large server fleet
• Proficient with rsync, find, awk, md5sum, tar, cron, tmux/byobu
• Comfortable managing multi-TB datasets and long-running processes
Ansible (Intermediate–Advanced)
• Writing and running complex multi-play playbooks with inventory groups, tags, rsync-based transfers, error handling, and conditional execution
• Variable management and multi-host orchestration
SQL / PostgreSQL (Intermediate)
• Writing queries with JOINs, subqueries, aggregations, and upserts
• Bulk data operations (COPY, batch inserts, temp tables)
• Database backup and restore
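The upsert pattern mentioned above can be sketched with Python's bundled `sqlite3` module, which accepts the same `INSERT ... ON CONFLICT` syntax PostgreSQL uses. The `components` table and its columns are illustrative only:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE components (purl TEXT PRIMARY KEY, version TEXT)")

# Two rows for the same purl: the second should update, not duplicate.
rows = [("pkg:npm/left-pad", "1.3.0"), ("pkg:npm/left-pad", "1.3.1")]
for purl, version in rows:
    conn.execute(
        "INSERT INTO components (purl, version) VALUES (?, ?) "
        "ON CONFLICT(purl) DO UPDATE SET version = excluded.version",
        (purl, version),
    )
```

After the loop the table holds a single row with the latest version, which is the behavior a KB refresh needs when re-ingesting data that partially overlaps an earlier load.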
Python (Intermediate)
• Maintaining scripts that interact with PostgreSQL (psycopg2), process CSV/JSON, export to SQLite, and handle batch operations on large datasets
• Familiarity with argparse, concurrent.futures, and standard data processing libraries
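As a small, self-contained example of the CSV-to-SQLite batch work described above (the `licenses` table, its columns, and the batch size are all hypothetical), rows can be streamed into SQLite in fixed-size batches rather than loaded all at once:

```python
import csv
import sqlite3
from itertools import islice

def export_csv_to_sqlite(csv_path: str, db_path: str,
                         batch_size: int = 1000) -> int:
    """Stream CSV rows into SQLite in batches; return the row count.

    Batching keeps memory flat on large inputs, which matters when the
    source files run to millions of rows.
    """
    conn = sqlite3.connect(db_path)
    conn.execute("CREATE TABLE IF NOT EXISTS licenses (purl TEXT, spdx_id TEXT)")
    total = 0
    with open(csv_path, newline="") as f:
        reader = csv.reader(f)
        next(reader)  # skip the header row
        while batch := list(islice(reader, batch_size)):
            conn.executemany("INSERT INTO licenses VALUES (?, ?)", batch)
            total += len(batch)
    conn.commit()
    conn.close()
    return total
```

The same batching shape carries over to PostgreSQL with `psycopg2.extras.execute_batch`, or to `COPY` for the largest loads.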
Go (Basic–Intermediate)
• Reading and navigating a mid-size Go codebase
• Fixing small bugs, adding minor features
• Building and deploying Go binaries (go build)
Nice to Have
• Experience with SCA (Software Composition Analysis) tools or license compliance
• SFTP server administration
• Familiarity with Package URLs (purls), SPDX licenses, vulnerability databases (NVD, OSV/GHSA)
• Telegram Bot API or similar alerting integrations
Qualifications
• 3+ years of experience in DevOps, SRE, or data pipeline operations
• Strong experience managing Linux servers and large-scale data transfers
• Comfortable working with multi-TB datasets and long-running processes
• Experience with automation tools (Ansible preferred)
• Able to follow detailed multi-step procedures with strict ordering dependencies across distributed infrastructure
• Self-motivated, detail-oriented, and comfortable working independently
• Able to troubleshoot issues across a distributed system (disk space, failed transfers, data integrity, process management)
Ready to Apply?
Don't miss this opportunity! Apply now and join our team.
Job Details
Publication Date:
March 8, 2026
Job Type:
Business and Operations
Location:
Spain
Company:
SCANOSS