Candidate MUST HAVE a TS/SCI clearance with polygraph in order to be considered.
This program is responsible for designing, building, and maintaining a technical environment in which the customer can run its analytics tools. The major effort is focused on migrating everything to the AWS Cloud.
- Responsible for ingesting large amounts of data for Analyst consumption across multiple consumers.
- Production Release and Technical Lead team member (5-member team)
- Review ETL work using formal peer reviews, UATs, and release steps.
- Created a formal process by inserting robust business rules and practices into several use cases. This expedited data ingestion and created ETL developer accountability. The process doubled our productivity once ETL developers took ownership of it and became more experienced.
- Java Developer Role
- Assist with adding functionality to custom Pentaho Data Integration steps
- Assist with testing Pentaho 7 functionality
- Eclipse / IntelliJ IDEA IDEs
- Familiar with Maven builds
- Ingest large amounts of data for legacy and new datasets using Pentaho Data Integration
- Fix production Tier 1 through Tier 4 datasets when errors occur.
- Work in a Linux environment to ingest data using Linux scripts and Pentaho Data Integration.
- Linux scripting includes logic and source file extraction (e.g., .zip, .csv, Oracle exports, .xml, .txt)
- JavaScript and User Defined Java Classes used for regex-related tasking.
- Responsible for correcting legacy data ingestion issues in Java code.
- Assist junior ETL developers with ingestion by reviewing code, source files, and requirements.
- Provide support to end users and technical personnel within the project
- LINUX / UNIX
- Eclipse IDE
- Building use case scenarios
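The Linux extraction and regex duties above can be sketched as a small shell script. This is a minimal illustration only: the directory names, sample file, and date pattern are hypothetical placeholders, not the project's actual scripts or data.

```shell
#!/bin/sh
# Sketch of source file extraction plus regex filtering, assuming
# hypothetical ./incoming and ./staging directories.
set -eu

SRC_DIR="./incoming"    # hypothetical source drop directory
STAGE_DIR="./staging"   # hypothetical staging area
mkdir -p "$SRC_DIR" "$STAGE_DIR"

# Create a sample source file so the sketch runs end to end.
printf '2024-01-02,record one\nno date here\n' > "$SRC_DIR/sample.csv"

# Unpack or pass through the common source formats mentioned above.
for f in "$SRC_DIR"/*; do
  case "$f" in
    *.zip) unzip -o "$f" -d "$STAGE_DIR" ;;        # zipped bundles
    *.csv|*.xml|*.txt) cp "$f" "$STAGE_DIR"/ ;;    # plain files pass through
    *) echo "skipping unrecognized source: $f" >&2 ;;
  esac
done

# Regex-related tasking, done here with grep rather than a Pentaho
# User Defined Java Class: keep only rows containing an ISO-style date.
grep -Eh '[0-9]{4}-[0-9]{2}-[0-9]{2}' "$STAGE_DIR"/*.csv > "$STAGE_DIR/dated_rows.out" || true
```

In practice the same filtering would live inside a Pentaho Data Integration transformation (a JavaScript or User Defined Java Class step); the shell version shown is just the simplest stand-in for the idea.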
Job Type: Full Time