Data Engineer – Contract

Pacific Junction, IA (Remote)

Job Requisition: 6991

Description for Data Engineer: 

The client is seeking an experienced data engineer to help modernize its data pipelines as its products enter an aggressive growth phase. This position will be critical to the successful launch and adoption of the client's new faculty record system, and to laying the foundation of its long-term strategy of becoming the enterprise platform of choice in higher education.

You May Be A Great Fit If:

  1. You love data and you really like code
  2. You’re a self-starter who dislikes micromanagement
  3. You want your work to make a noticeable impact
  4. You’re constantly on the lookout for better ways to do things

You Probably Aren’t a Great Fit If:

  1. You need lots of structure to perform optimally
  2. You aren’t comfortable taking the initiative or you require lots of instruction
  3. You don’t like change
  4. You aren’t available during traditional business hours for team collaboration (daily standup at 9 a.m.)

Responsibilities for Data Engineer: 

What you would do in this role:

  1. Propose, architect, and implement ETL (Extract-Transform-Load) pipelines to ingest heterogeneous data from a variety of sources. An ETL pipeline is a set of processes that extracts data from sources such as transactional databases, APIs, marketing tools, or other business systems, transforms it, and loads it into a cloud-hosted database or data warehouse for deeper analytics and business intelligence (see the sketch after this list).
  2. Create strategies to reduce complexity and increase scale for existing internal data processes.
  3. Collaborate with other data engineers and developers to execute modernization strategies.
  4. Develop cloud data platform solutions for external consumption.
  5. Work directly with client engineers to facilitate integration and implementation.
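
For context, the short Python sketch below illustrates the Extract-Transform-Load pattern described in item 1 in its simplest form. It is illustrative only and not the client's actual stack: the source URL, the field names, and the SQLite target (a stand-in for a cloud data warehouse such as Snowflake) are assumptions made for the example.

# Minimal ETL sketch (illustrative only): extract records from a hypothetical
# REST endpoint, apply a small transformation, and load them into a SQLite
# table standing in for the cloud data warehouse (e.g. Snowflake).
import json
import sqlite3
import urllib.request

SOURCE_URL = "https://api.example.com/faculty"   # hypothetical source API
WAREHOUSE_DB = "warehouse.db"                    # stand-in for the warehouse


def extract(url: str) -> list[dict]:
    """Pull raw JSON records from the source system."""
    with urllib.request.urlopen(url) as resp:
        return json.loads(resp.read())


def transform(records: list[dict]) -> list[tuple]:
    """Normalize field names and drop incomplete rows."""
    rows = []
    for rec in records:
        faculty_id = rec.get("id")
        name = (rec.get("name") or "").strip().title()
        dept = (rec.get("department") or "UNKNOWN").upper()
        if faculty_id and name:
            rows.append((faculty_id, name, dept))
    return rows


def load(rows: list[tuple], db_path: str) -> None:
    """Write transformed rows into the target table."""
    with sqlite3.connect(db_path) as conn:
        conn.execute(
            "CREATE TABLE IF NOT EXISTS faculty "
            "(faculty_id INTEGER PRIMARY KEY, name TEXT, department TEXT)"
        )
        conn.executemany(
            "INSERT OR REPLACE INTO faculty VALUES (?, ?, ?)", rows
        )


if __name__ == "__main__":
    load(transform(extract(SOURCE_URL)), WAREHOUSE_DB)

In practice, each stage would be replaced by the client's tooling (an SSIS package or Azure pipeline for orchestration, T-SQL or Snowflake SQL for the load), but the extract/transform/load separation stays the same.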

Requirements for Data Engineer:

  1. 3-4+ years of ETL experience
  2. Bachelor’s Degree in Engineering or equivalent
  3. Experience with at least one ETL (Extract-Transform-Load) tool (SSIS, Informatica, Talend, etc.)
  4. Microsoft SQL Server and T-SQL knowledge (query performance tuning, index maintenance, etc.), as well as an understanding of database structure
  5. Experience with a cloud data platform (Snowflake preferred)
  6. Basic programming skills
  7. Experience with Azure
  8. Comfortable with .NET and C# (highly preferred)
  9. Highly autonomous

No sponsorship at this time.
