• Location: Chantilly
  • Type: Contract
  • Job #1969

Title: Big Data Engineer

Location: Chantilly, VA

Clearance: Active TS/SCI w/ Polygraph needed to apply

 

Company Overview:

Cornerstone Defense, in partnership with our military, intelligence, and civil government customers, supports U.S. operations worldwide through the use of many different types of intelligence, satellite, and cyber technologies. Cornerstone’s Intelligence Sector provides solutions to the United States Government for information collection, operations, exploitation and dissemination, and research activities. Our Team specializes in software development, cloud architecture, systems and network engineering, systems integration, agile management, as well as targeting operations and intelligence analysis. Our support to our mission customers includes cyber network operations, exploitation and defense, signals intelligence, human intelligence, and critical missions and networks.

We are seeking a Big Data Engineer to work in the Chantilly, Virginia area. Areas of responsibility will include:

  • Big data engineering and management
  • Data analytics, data modeling and data wrangling
  • Cloud automation engineering in an AWS environment

The Big Data Engineer will engineer, analyze, integrate, test, debug, and monitor data processing workflows to ensure accurate and efficient Extract, Transform, Load (ETL) of data.  This may require:

  • Writing, editing, and testing scripts, parsers, and small programs to process, load, or analyze data (a minimal sketch follows this list)
  • Analyzing databases and datasets to ensure proper handling of the data
  • Performing end-to-end data quality assessments focused on analytical accuracy and data integrity
  • Coordinating and resolving data quality issues
  • Helping gather metrics and contributing status updates as required by the customer and management
  • Interfacing with customers electronically, on the phone, and in person to ensure data requirements are met
  • Implementing data flows using approved SOPs and patterns to ensure consistency and maintainability
  • Identifying areas of improvement and presenting innovative solutions to increase the efficiency and quality of data flows
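
For illustration only (not a requirement of this posting), a minimal Python sketch of the kind of small ETL script referenced in the first bullet above: it extracts records from a CSV file, applies a simple transform with a basic data quality check, and loads the result into SQLite. The input file name, column names, and quality rule are hypothetical.

  import csv
  import sqlite3

  def extract(path):
      # Read raw records from a CSV file (hypothetical input file).
      with open(path, newline="") as f:
          return list(csv.DictReader(f))

  def transform(rows):
      # Normalize fields and drop rows that fail a basic quality check.
      clean = []
      for row in rows:
          try:
              value = float(row["value"])  # hypothetical column name
          except (KeyError, ValueError):
              continue  # data quality: skip malformed records
          clean.append((row.get("id", "").strip(), value))
      return clean

  def load(rows, db_path="example.db"):
      # Write transformed records into a SQLite table.
      with sqlite3.connect(db_path) as conn:
          conn.execute("CREATE TABLE IF NOT EXISTS records (id TEXT, value REAL)")
          conn.executemany("INSERT INTO records VALUES (?, ?)", rows)

  if __name__ == "__main__":
      load(transform(extract("input.csv")))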

 

Basic Qualifications:

  • Bachelor’s degree (or higher) in an IT-related field and a minimum of 3 years of experience in Systems Engineering. In lieu of a technical degree, a Bachelor’s degree in a non-technical field with a minimum of 5 years of experience performing data engineering functions, or a minimum of 15 years of experience performing data engineering functions, will be considered.
  • Understanding of at least one programming language
  • Familiarity with AWS cloud environments (a brief sketch follows this list)
  • Proficiency with Linux (e.g. file/permission manipulation, directory navigation, executing scripts, etc.)
  • Agile Development experience
  • Ability to work with data modelers, data architects, software developers, and DBAs to achieve project objectives
  • Active TS/SCI with polygraph clearance.
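
As context for the AWS familiarity noted above, a minimal sketch (assuming the boto3 SDK is installed and AWS credentials are configured; the bucket name is hypothetical) that lists the object keys in an S3 bucket:

  import boto3  # AWS SDK for Python

  def list_bucket_keys(bucket_name):
      # Return all object keys in an S3 bucket, handling pagination.
      s3 = boto3.client("s3")
      keys = []
      paginator = s3.get_paginator("list_objects_v2")
      for page in paginator.paginate(Bucket=bucket_name):
          for obj in page.get("Contents", []):
              keys.append(obj["Key"])
      return keys

  if __name__ == "__main__":
      for key in list_bucket_keys("my-example-bucket"):  # hypothetical bucket name
          print(key)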

 

Preferred Qualifications:

  • Familiarity with open source software packages used to perform data engineering in both Python and Java
  • Familiarity with open source ETL frameworks such as NiFi
  • Familiarity with unstructured and structured data analysis techniques
  • Familiarity with open source streaming and batch processing techniques for big data
  • Scripting experience (e.g. bash, Perl, Python, etc.); a short permission-audit sketch follows this list
  • Implementing Linux security best practices (e.g. STIGs, CIS Benchmarks, etc.)
  • Experience in the Intelligence Community
  • Experience working in a Customer Facility
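
As a rough illustration of the scripting and Linux hardening items above (not drawn from any specific STIG or CIS Benchmark; the target directory is hypothetical), a short Python script that flags world-writable regular files under a directory:

  import os
  import stat
  import sys

  def find_world_writable(root):
      # Yield paths of regular files under root that are world-writable.
      for dirpath, _dirnames, filenames in os.walk(root):
          for name in filenames:
              path = os.path.join(dirpath, name)
              try:
                  mode = os.lstat(path).st_mode
              except OSError:
                  continue  # skip unreadable or vanished files
              if stat.S_ISREG(mode) and mode & stat.S_IWOTH:
                  yield path

  if __name__ == "__main__":
      root = sys.argv[1] if len(sys.argv) > 1 else "/etc"  # hypothetical default target
      for path in find_world_writable(root):
          print(path)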