Position Description
U.S. Citizen: Active Clearance or Clearance Eligibility REQUIRED
Full-Time – Hybrid – Leesburg, VA (with rotational international travel as required)
Duty Sites: Leesburg, VA (Hybrid) and Amman, Jordan
Division: Cybersecurity Services
FLSA Status: Exempt
Job Description:
As a Data Engineer within SASSI’s Cybersecurity Services Division, you will design, build, and maintain data pipelines and analytical systems supporting U.S. Government and defense missions overseas. This position combines data engineering and software development expertise to deliver secure, efficient, and scalable data solutions that operate within resource-constrained environments.
This is a hybrid position based out of Leesburg, VA, with rotational international travel to Amman, Jordan. The standard rotation includes three consecutive weeks of on-site work every three months, followed by three to four weeks working remotely from the U.S. The role requires flexibility to travel internationally to support mission needs, team collaboration, and data system deployment and maintenance activities.
The ideal candidate will demonstrate proficiency in Python, Docker, and Kubernetes, possess a strong full-stack development background, and have experience in data engineering and cloud environments. An innovative mindset and ability to adapt to limited computational resources are essential.
SASSI is a Woman-Owned Small Business (WOSB) headquartered in Leesburg, VA, supporting the U.S. Government and National Security sectors since 1989. Visit www.teamsassi.com for more information.
Primary Duties and Responsibilities:
- Design, build, and maintain secure data pipelines to collect, transform, and deliver data from multiple sources.
- Develop and manage databases, warehouses, and cloud data lakes for structured and unstructured data.
- Apply full-stack development principles to integrate data workflows with applications and visualization tools.
- Utilize Python for automation, scripting, and backend development of scalable data processes.
- Use Docker and Kubernetes to containerize and deploy applications across development and production environments.
- Implement and optimize data transformation and enrichment workflows to support analytics, machine learning, and mission objectives.
- Collaborate with data scientists, developers, and analysts to deliver reliable and validated datasets.
- Engineer efficient solutions for environments with limited network bandwidth or computing resources.
- Implement and maintain data security and access controls in compliance with federal and DoD policies.
- Troubleshoot and resolve issues related to data integration, delivery, or system configuration.
- Document engineering workflows, configuration baselines, and standard operating procedures.
- Continuously identify and implement process improvements and automation opportunities to enhance system performance and reliability.
Minimum Requirements:
- U.S. Citizenship and an active clearance or clearance eligibility.
- Bachelor’s degree in Computer Science, Data Science, Information Technology, or a related discipline (or equivalent professional experience).
- Demonstrated experience in data engineering, ETL development, or full-stack software engineering.
- Proficiency in Python for scripting, automation, and backend development.
- Proficiency in Docker and Kubernetes for containerized application deployment and orchestration.
- Strong full-stack development background, including API integration and data delivery.
- Demonstrated skill in data engineering and data pipeline design (SQL, NoSQL, or cloud-native tools).
- Familiarity with cloud platforms (AWS, Azure, Oracle Cloud, or similar).
- Strong analytical and problem-solving skills with a focus on performance and scalability.
- Excellent written and verbal communication skills and the ability to collaborate across technical teams.
Preferred Qualifications:
- AWS or Azure professional certification.
- Experience supporting DoD or U.S. Government programs in overseas or OCONUS environments.
- Experience with big data technologies (Hadoop, Spark, Airflow).
- Knowledge of Infrastructure as Code (Terraform, CloudFormation) and CI/CD pipelines (Git, Jenkins, Artifactory).
- Experience implementing STIGs, hardening systems, or securing cloud-based data infrastructure.
- Familiarity with data visualization tools (Power BI, Tableau, or D3.js).
- Demonstrated ability to innovate and optimize performance in limited-resource or high-latency environments.
- Master’s degree in Computer Science, Data Engineering, or a related field.
Travel Requirement:
This position requires rotational international travel to Amman, Jordan. The standard rotation includes three consecutive weeks of on-site work approximately every three months, followed by three to four weeks working remotely from the United States. Additional travel may be required based on mission or customer needs.