Job Url: https://www.glassdoor.com/Job/united-states-python-engineer-jobs-SRCH_IL.0,13_IN1_KO14,29.htm?remoteWorkType=1&fromAge=1

Maximus
Data Pipeline Engineer (Mid-Level)
Remote · $135K - $155K (Employer provided)

General Information
Job Posting Title: Data Pipeline Engineer (Mid-Level)
Date: Monday, July 7, 2025
City: Remote
Country: United States
Working time: Full-time

Description & Requirements
We are seeking a Data Pipeline Engineer to join our team supporting an Internal Revenue Service (IRS) client. The Data Pipeline Engineer will work with a team of data engineers and ETL specialists and conduct hands-on development as a member of an integrated Agile project team. The position is remote within the US. Ideal candidates will be in the Lanham, MD; Farmers Branch, TX; or Austin, TX areas. Local candidates may occasionally attend meetings at the IRS facility.

Essential Duties and Responsibilities:
- Lead the design and development of complex software applications from business requirements in collaboration with other team members.
- Investigate new solutions, tools, products, and techniques to incorporate into coding standards.
- Troubleshoot complex issues and produce detailed proposals to resolve them.
- Support testing and remediate defects.
- Guide, coach, and mentor entry- and mid-level developers.

Job-Specific Essential Duties and Responsibilities:
- Maintain direct responsibility and ownership in developing and deploying components or complete application functionality using Databricks, Informatica, SQL, and other tools.
- Design software components from business requirements in collaboration with other team members.
- Support the ETL and BI functionality of the application.
- Provide support to the technical team, writing the documentation required under the IRS development methodology for ETL integration.
- Coordinate, plan, communicate, develop, and review client presentations and documentation required under the IRS Enterprise Lifecycle.

Additional Requirements as per Contract/Client:
- Candidates must meet requirements to obtain and maintain an IRS Minimum Background Investigation (MBI) clearance (an active IRS Moderate Risk MBI clearance is a plus).
- Candidates must be a US Citizen or a Legal Permanent Resident (Green Card status) for at least three (3) years, and Federal Tax compliant.

Minimum Requirements
- Bachelor's degree in a related field.
- 7-10 years of relevant professional experience required.
- An equivalent combination of education and experience is considered in lieu of a degree.

Job-Specific Minimum Requirements (required skills that align with the contract LCAT, verifiable and measurable):

Project Specific Qualifications:
- Bachelor's degree from an accredited college or university required; an additional four (4) years of related work experience can substitute for a degree.
- At least seven (7) years of relevant experience required.
- Possess an active IRS MBI.
- Minimum of two (2) years' experience working with IRS data such as IRTF, IRMF, CADE2, IMF, BMF, BRTF, and others.
- Experience integrating a range of technologies in a large federal IT environment such as the IRS.
- Knowledge of cloud-based technologies and of integrating on-premises-to-cloud migrations.
- Implementation knowledge of Java, Python, and REST APIs.
- Experience with performance optimization of data pipelines.
- Expertise in building CLI-based solutions using a variety of available tools.
- Knowledge of Databricks and the ability to write SQL queries and Python scripts to query and manipulate data.
- Knowledge of ETL and BI tools, and the ability to understand and support the ETL and BI functionality of the application.
- Good problem-solving skills with architectural, design, and hands-on development experience.
- Experience working in, and a general understanding of, Red Hat Linux or another Unix-like OS.
- Experience with shell scripting.
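As a hypothetical illustration of the "SQL queries and Python scripts to query and manipulate data" skill listed above (this sketch is not part of the posting; the table, column names, and data are invented, and an in-memory SQLite database stands in for a Databricks SQL endpoint):

```python
# Hypothetical sketch: combine SQL and Python to query and manipulate data.
# Table and column names are invented; SQLite stands in for Databricks SQL.
import sqlite3

def top_filers_by_amount(rows, limit=2):
    """Load (filer_id, amount) rows, then return the top filers by total amount."""
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE returns (filer_id TEXT, amount REAL)")
    con.executemany("INSERT INTO returns VALUES (?, ?)", rows)
    cur = con.execute(
        "SELECT filer_id, SUM(amount) AS total "
        "FROM returns GROUP BY filer_id "
        "ORDER BY total DESC LIMIT ?",
        (limit,),
    )
    result = cur.fetchall()
    con.close()
    return result

rows = [("A", 100.0), ("B", 250.0), ("A", 50.0), ("C", 75.0)]
print(top_filers_by_amount(rows))  # [('B', 250.0), ('A', 150.0)]
```

In a Databricks notebook the same aggregate-then-rank pattern would typically run through `spark.sql` against a managed table rather than SQLite; the SQL itself is largely unchanged.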
Preferred Skills and Qualifications:
- Prior experience working with IRS systems and data.
- Experience building data pipelines that move data from on-premises systems to Databricks running on the AWS cloud.
- Hands-on implementation experience with data migration, data wrangling, and data manipulation.
- Proven experience using COTS products and custom scripting to build ETL pipelines.
- Prior implementation and development knowledge of Bash scripts.
- Excellent Python programming skills in a Linux environment.
- Understanding of application- and transport-layer security.
- Knowledge of SQL and performance-tuning techniques.
- Experience with Jenkins and automated testing techniques.
- Familiarity with Agile/Scrum.
- Prior implementation of containerization using Docker, OpenShift, or AWS EC2 instances.

EEO Statement
Maximus is an equal opportunity employer. We evaluate qualified applicants without regard to race, color, religion, sex, age, national origin, disability, veteran status, genetic information, and other legally protected characteristics.

Pay Transparency
Maximus compensation is based on various factors including, but not limited to, job location; a candidate's education, training, and experience; expected quality and quantity of work; required travel (if any); external market and internal value analysis, including seniority and merit systems; and internal pay alignment. Annual salary is just one component of Maximus's total compensation package. Other rewards may include short- and long-term incentives as well as program-specific awards. Additionally, Maximus provides a variety of benefits to employees, including health insurance coverage, life and disability insurance, a retirement savings plan, paid holidays, and paid time off. Compensation ranges may differ based on contract value but will be commensurate with job duties and relevant work experience. An applicant's salary history will not be used in determining compensation.
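The Preferred Skills section above mentions custom scripting to build ETL pipelines. As a minimal, hypothetical sketch of that kind of task (not part of the posting; the file format and field names are invented):

```python
# Hypothetical extract-transform-load sketch for a custom-scripted pipeline.
# Field names and the filtering rule are invented for illustration.
import csv
import io

def etl(source_csv):
    """Extract rows from CSV text, transform them (normalize the state code,
    drop non-positive amounts), and return records ready to load downstream."""
    reader = csv.DictReader(io.StringIO(source_csv))
    records = []
    for row in reader:
        amount = float(row["amount"])
        if amount <= 0:
            continue  # filter out empty or reversed entries
        records.append({"state": row["state"].strip().upper(), "amount": amount})
    return records

sample = "state,amount\n md ,100.5\ntx,0\n ny,42\n"
print(etl(sample))  # [{'state': 'MD', 'amount': 100.5}, {'state': 'NY', 'amount': 42.0}]
```

In a real on-premises-to-Databricks pipeline the "load" step would write to cloud storage or a Delta table instead of returning a list, but the extract/transform/load separation is the same.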
Maximus will comply with regulatory minimum wage rates and exempt salary thresholds in all instances.

Minimum Salary: $135,000.00
Maximum Salary: $155,000.00