Data Engineer

Bellevue, WA. Posted 8 months ago

Posted By: Sam Aspire (Contact Privacy Inc. Customer 1241774648)
Duration: 6 Months
Experience: 4 - 15 Years
Work Permit: Any work permit encouraged to apply

Job Description

Description: Data Engineer

Location: Bellevue, WA

Duration: 6 months +

Expedited interview schedule.


Does empowering teams to make data-driven decisions excite you? Do you wake up in the morning wondering what possibilities could be unlocked with more data? We are looking for a seasoned data engineer to join the team. Data Engineering focuses on enabling fast, accurate, and reliable access to data. We build data pipelines, manage a data warehouse, and support the production use of our data. We advocate for good data practices and make sure that our business users are able to make sound data-driven decisions.

Job Functions / Responsibilities

  • Provide data engineering on modern, cloud-based and legacy data processing technology stacks.
  • Build data pipelines, data validation frameworks, and job schedules, with an emphasis on automation and scale.
  • Contribute to overall architecture, framework, and design patterns to store and process high data volumes.
  • Ensure product and technical features are delivered to spec and on time.
  • Design and implement features in collaboration with product owners, reporting analysts / data analysts, and business partners within an Agile / Scrum methodology.
  • Proactively support product health by building solutions that are automated, scalable, and sustainable; be relentlessly focused on minimizing defects and technical debt.

Qualifications and Skills

  • 5+ years of experience in large-scale software development with emphasis on data analytics and high-volume data processing.
  • 3+ years of experience in data engineering development.
  • 2+ years of experience implementing scalable data architectures.
  • 2+ years of experience with AWS and related services (e.g., EC2, S3, DynamoDB, Elasticsearch, SQS, SNS, Lambda, Airflow, Snowflake).
  • Experience with data-centric programming languages (e.g., Python, Go, Ruby, JavaScript, Scala).
  • Proficiency with ETL tools and techniques.
  • Knowledge of and experience with RDBMS platforms, such as MS SQL Server, Oracle, DB2, IMS, MySQL, Postgres, SAP HANA, and Teradata.
  • Experience participating in projects in a highly collaborative, multi-disciplinary team environment.
Key Skills

Data Engineering, Data Processing, Data Analytics, Data Warehouse, Data Validation Frameworks, Data Architectures, ETL Tools, Automation, Design Patterns, Software Development, AWS and related services (EC2, DynamoDB, Elasticsearch, Airflow, Snowflake), Programming Languages (Python, Ruby, JavaScript, Scala), RDBMS Platforms (MS SQL Server, Oracle, DB2, IMS, MySQL, Postgres, SAP HANA), Agile/Scrum Methodology, Collaboration with Analysts and Business Partners
